The Challenge of Benefit-Cost Analysis As Applied to Online Safety & Digital Privacy
Adam Thierer, Senior Research Fellow, Mercatus Center at George Mason University
January 17, 2012
Presentation before George Mason University Law & Economics Center conference on Privacy, Regulation, & Antitrust
2
Purpose of Talk / Paper
• Explore challenges of applying benefit-cost analysis (BCA) to privacy & online safety – in many ways, they are really the same debate
• Discuss the particular problem of defining harm (and, correspondingly, the benefits of regulation)
• Outline the range of costs that must be considered (even if they prove difficult to quantify)
• Focus on the many alternatives to regulation
• Explain how the toolbox of solutions we apply to online safety can work for privacy
Prefatory Note: Strict Rights Approach Limits BCA
• If privacy & safety are “dignity” rights, then they trump all other considerations & BCA goes out the window
• But at least here in the U.S. the calculus has been more utilitarian in character
• Better to think about privacy & safety the way we think about happiness: You have the right to pursue it, not so much a strict right to it
• So, BCA remains important
The Nuts & Bolts of RIA / BCA
The Triumph of Benefit-Cost Analysis
“It is not exactly news that we live in an era of polarized politics. But Republicans and Democrats have come to agree on one issue: the essential need for cost- benefit analysis in the regulatory process. In fact, cost-benefit analysis has become part of the informal constitution of the U.S. regulatory state. This is an extraordinary development.”
- Cass Sunstein (2012)
The Basics of RIA / BCA
• Since the early 1980s, regulatory impact assessments (RIAs) have been required
• Executive Order 12291 (Reagan)
• Executive Order 12866 (Clinton)
• Executive Orders 13563 & 13610 (Obama)
• OMB Circular A-4
– 3 core elements of an RIA…
RIA Step #1: Statement of need for the regulatory action
• a clear explanation of need for the regulatory action
• “Agencies should explain whether the action is intended to address a market failure or to promote some other goal, such as improving governmental processes, protecting privacy, or combating discrimination.”
RIA Step #2: Identify range of regulatory approaches
• agencies must describe “range of alternative regulatory approaches, including the option of not regulating.”
• “should consider flexible approaches that reduce burdens and maintain freedom of choice”
• “should specify performance objectives, rather than specifying the behavior or manner of compliance that regulated entities must adopt.”
RIA Step #3: Conduct the Benefit-Cost Analysis
• estimates the benefits & costs associated with each alternative approach.
• costs should be quantified and monetized to the extent possible
• when quantification of a particular benefit or cost is not possible, it should be described qualitatively.
• where relevant, consider “values such as equity, human dignity, fairness, potential distributive impacts, privacy, and personal freedom.”
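As a rough illustration of the net-benefit comparison these three RIA steps add up to, here is a minimal sketch. All numbers and names are hypothetical; the 7% discount rate is one of the two defaults OMB Circular A-4 recommends.

```python
# Illustrative benefit-cost comparison across regulatory alternatives.
# Hypothetical figures only; the structure follows the A-4 net-benefit test.

def net_present_value(flows, discount_rate=0.07):
    """Discount annual net benefits (benefits - costs) to the present.

    flows[0] is the year-0 amount and is not discounted.
    """
    return sum(f / (1 + discount_rate) ** t for t, f in enumerate(flows))

# Each alternative gets annual (benefit - cost) estimates, in $M.
# "No regulation" is the baseline option Step #2 requires agencies to keep.
alternatives = {
    "no regulation": [0, 0, 0],
    "flexible / performance standard": [-5, 10, 12],
    "prescriptive mandate": [-20, 15, 15],
}

# Rank the alternatives by net present value, highest first.
ranked = sorted(alternatives.items(),
                key=lambda kv: net_present_value(kv[1]),
                reverse=True)
for name, flows in ranked:
    print(f"{name}: NPV = {net_present_value(flows):.1f}")
```

Under these made-up numbers the flexible, performance-based option ranks first, which is the point of Step #2's instruction to weigh alternatives (including not regulating) side by side. Unquantifiable values such as dignity or privacy stay outside the arithmetic and must be described qualitatively, as Step #3 notes.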
Possible Privacy Regs to Consider
• “Do Not Track” mandate
• Mandatory default switches
– Opt-in mandates
• Bans on specific practices
– 3rd-party data sharing / aggregation
– Deep packet inspection
• Targeted privacy laws
– COPPA
Note: Much harder to do BCA when these things are being “nudged” instead of formally mandated.
The Problem of Defining the Problem (and how that complicates the question of the benefits of regulation)
Privacy & Safety Metrics = A Nightmare
There are no good, widely agreed upon metrics by which to measure online safety & digital privacy “harms.”
How Privacy & Safety Harms are Described in Popular Culture

Online Safety “Harms”
• Offensive
• Smutty
• Filthy
• Hurtful / Hateful

Privacy “Harms”
• Invasive
• Manipulative
• Annoying
• Creepy
** None of these things are even remotely quantifiable **
How Privacy & Safety Harms are Described in the Literature
• Tangible vs. Intangible
• Objective vs. Subjective
• Direct vs. Indirect
• Monetary vs. Non-Monetary
• Quantifiable vs. Unquantifiable
• Physical vs. Psychological
In a perfect world…
(2x2 grid: Intangible Harm vs. Tangible Harm)
Monetizable Harm = easier case
Non-Monetary Harm = hard case
                     Intangible                        Tangible
Monetizable Harm     Data Breach                       Invasion / Trespass
Non-Monetary Harm    “Creepiness” / “Offensiveness”    Stalking
and sometimes that works …
… but most of the time it doesn’t.
The Key (very, very hard) Question
To what extent are harms that are purely psychological in character really harms at all?
• Clearly, some can be
– Incessant digital harassment / cyber-bullying
• But most are not harmful in a traditional sense
– If “creepiness” is an actionable harm, then the Net as we know it would have to be shut down
The hopeless subjectivity of it all…
• One person’s “creepy” is another person’s “killer app.”
• Worse yet, even our own actions don’t match up with our professed values!
• Stated preferences vs. revealed preferences problem
What’s Your Privacy Indifference Curve Look Like?
[Figure: indifference curves trading off Privacy against Sociability and/or Services, with endpoints ranging from “privacy unconcerned” to “privacy fundamentalist.” Many people who say they are near the fundamentalist end actually seem to be near the unconcerned end.]
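The indifference-curve framing can be written out explicitly. The functional form below is a standard textbook illustration, not anything from the talk:

```latex
% Illustrative only: utility over privacy p and sociability/services s,
% with preference weight \alpha on privacy.
U(p, s) = p^{\alpha}\, s^{\,1-\alpha}, \qquad 0 < \alpha < 1
% An indifference curve is the locus of (p, s) pairs with constant utility k:
s = k^{\frac{1}{1-\alpha}}\, p^{-\frac{\alpha}{1-\alpha}}
% A "privacy fundamentalist" has \alpha near 1 (demands large service gains
% to give up any privacy); the "privacy unconcerned" has \alpha near 0.
```

In these terms, the privacy paradox that follows reads as people's revealed α being much lower than their stated α.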
The Privacy Paradox
“Ask 100 people if they care [about privacy] and 85 will say yes. Ask those same 100 people if they’ll give you a DNA sample just to get a free Big Mac, and 85 will say yes.”
- Austin Hill, 2002
What about “peace of mind” arguments?
• “peace of mind” / “user trust” = primary benefit?
– i.e., more privacy & safety regs make consumers more comfortable getting online
– Analogy: post-9/11 security mandates
• But they had the opposite impact! (see J. Law & Econ. 2007)
• Problem with “peace of mind” = again, how to prove it
– More people online than ever
• Plus, it could cut the other way (i.e., “free” services drive subscribership / usage)
• Natural experiments? (US vs. EU?)
– Problem = many confounding variables
A Value Exchange without Formal Contracting
• Popular belief = If you’re not buying the product, you are the product.
• Only partially correct.
• Reality = You can be both part of the value exchange & the recipient of the benefits of that value exchange.
• Past examples = traditional broadcast & newspaper media advertising
The Current Value Exchange
• Almost all online media / services are driven by a simple quid pro quo…
– Data collection / targeted advertising in exchange for “free” (or cheap) content and services
– But it is hard to precisely quantify this value exchange
Market Failure = Lack of Options?
• Why no “Facebook Private” or “YouTube Family Friendly”?
• Actually, those sites do exist (by other names & from other vendors), but there is very little demand for them
+ (as summarized later) lots of PETs exist
+ no requirement you use any of these services (Facebook isn’t a forced labor camp!)
How do we factor social adaptation into BCA?
• People adapt! (sometimes quicker than we imagine)
• Today’s “technopanic” is tomorrow’s widely accepted technology / business practice
– Examples: photography, caller ID, Gmail
• 2 general lessons:
1. Cautions against knee-jerk regulatory responses
2. BCA should account for rapid social / cultural adjustment to new info-tech
When All Else Fails…Claim False Consciousness as Harm!
• Manipulation! Mind control!
Siva Vaidhyanathan:
– consumers are being tricked by the “smokescreen” of “free” online services and “freedom of choice.”
– “We are conditioned to believe that having more choices… is the very essence of human freedom. But meaningful freedom implies real control over the conditions of one’s life.”
• Such “people are sheep” arguments can’t stand in a free society (or be used in BCA)
Costs of Privacy Regulation (Lost Outputs & Forgone Opportunities)
What Data Collection Has Enabled
• Transport yourself back a decade and consider how far we’ve come
• Things that used to cost $$ and were capacity-capped:
– Email
– Data & document storage
– Photo hosting
– Mapping services
– Security / anti-virus software
– Online bulletin boards / hobby pages
– Most online news
The Magic of Data & Advertising
• Lowers Price: None of those services cost us a dime now
• Expands Quantity: There are more of those services today
• Improves Quality: All of those services are more diverse or innovative than before
Data collection & advertising made it all possible= a huge boon for consumer welfare.
Possible Regulatory Costs BCA Must Consider
• Will regulation tip the balance between business models?
• Instead of ad- & data-supported online sites and services, we could get…
1) More intrusive but untargeted ads? (banner, pop-up, etc.)
2) Pay-per-use or subscription-based services?
3) Restricted output / lower-quality sites?
4) A mix of all of the above?
• This has both micro and macro consequences
– For industry: If not targeted data collection / advertising, what will fund online content and services in the future?
– For consumers: If those other methods fail to work = less content & services, or more expensive content & services
• As always, there is no free lunch
Some Good Studies Showing Trade-offs
• Beales
– targeted advertising fetches 2.68 times the price of run-of-network advertising
• Tucker & Goldfarb
– EU Privacy Directive = 65% decrease in advertising effectiveness in Europe
– Because regulation decreases ad effectiveness, “this may change the number and types of businesses sustained by the advertising-supporting Internet.”
WTP Literature is Very Thin
• Perhaps not surprising, since
– People aren’t paying anything in the real world!
– But polls / surveys make crappy proxies
– Stated preferences vs. revealed preferences problem
• WTP study by CMU (2007)
– price matters a lot, but some consumers are willing to pay more for privacy when better informed
ENISA Study
• European Network & Information Security Agency (ENISA), “Study on Monetizing Privacy” (Feb. 2012)
– Pointed out the lack of real-world experiments as a major problem
• “a large share of literature is devoted to [surveys]” & “economic experiments that implement real purchase transactions are rather scarce”
• “there are no works in economics that combine theoretical and experimental methods for the analysis of the interplay of privacy concerns, product personalization and competition.”
ENISA Results
• combined lab & field experiments (cell phone number for a movie ticket discount)
Results:
• Price matters!
– Most consumers bought from the “privacy-invasive” provider if its price was lower (by 50 Euro cents)
• Privacy matters (at least a little)!
– 29% would pay extra to avoid giving up a cell number; 9% would pay more to avoid giving up an email address.
– And fully 80% preferred the privacy-friendly option if there was no price difference at all.
In sum, the problem we face (re: determining costs / WTP) …
“Empirical research on the subject is still in its infancy. Most studies ask for personal opinion, rather than measure the digital choices people make, and even there, the results usually find a gap between what people say and what they do about their privacy online.”
- Somini Sengupta, NYT (3/19/12)
>> BCA has to account for this & try to better measure real-world choices & trade-offs, not polls & surveys.
Other Costs / Lost Outputs to Consider
• International competitiveness
– Has privacy regulation limited Europe’s online market?
– Could regulation here diminish the U.S. competitive advantage relative to the world?
• Market structure / competition
– Privacy mandates could lead to industry consolidation
• Speech considerations
– Many online safety & privacy regulations raise thorny free speech / 1st Amendment issues
– Privacy & online safety regs could limit the aggregate amount of speech produced
Alternatives to Regulation
Less Restrictive Alternatives to Regulation
• Education
• Empowerment
• (Targeted) Enforcement Efforts
• Evolving Norms / Social Adaptation

This is the model we use today for online safety concerns.
– Why not apply it to privacy concerns as well?
– BCA must at least take these alternatives into account
Education / Awareness-Building
• Education & awareness campaigns at all levels (from privacy advocates, govt, industry, educators, etc.)
• Encouraging better online “netiquette” and “data hygiene”
• Push for better transparency across the board
– Better notice & labeling
– Need more watch-dogging of privacy promises made by companies
• Govt can play a key role here (ex: PSAs, help sites)
FTC’s YouTube page
OnGuardOnline
Contributors: FTC, DHS, DoC, DoE, DOJ, CFTC, CFPB, FCC, FDIC, IRS, State Dept.
Best Practice Guidelines
Query: Are such “non-law law” nudges & agency threats legal / sensible? (see forthcoming Randy Picker paper)
Empowerment / Privacy-Enhancing Technologies (PETs)
= Helping users help themselves
• User “self-help” tools are multiplying (next slide)
• Industry self-regulation
– More cross-industry collaboration on privacy programs
– Better notice
– Best practices & smarter defaults
– More and better tools to respond to new developments and needs
Other Tools
• AdBlockPlus
• Ghostery
• NoScript
• Cookie Monster
• Better Privacy
• No More Cookies
• CCleaner
• Flush
• Priveazy
• Privacyfix
• PrivacyWatch

Ad Preference Managers
• Google, MS, Yahoo
• DuckDuckGo

Private Browsing Mode
• Google “Incognito”
• IE “InPrivate Browsing”
• Firefox “Private Browsing”
• Safari “Private Browsing”

Do Not Track
• Now in all major browsers
• + DNT add-ons from others
McAfee Privacy Notice Cartoons
The AdBlock Plus Story
• Most-demanded browser add-on in history
• roughly 175 million people downloaded the Firefox version (as of Oct. 2012)
• Shows a clear demand for PETs
• also shows that there are powerful ways for privacy-sensitive users to handle the problem
• (Implications for the contracting debate?)
Enforcement (Targeted)
• Holding companies to the promises they make
– stepped-up FTC Sec. 5 enforcement
• Demand better notice & transparency
• Mandatory disclosure of data breaches
• Targeted regulation of sensitive data, but with flexibility
Miscellaneous Closing Thoughts
The Legal Standard That Should Govern for Both Safety & Privacy
• In 2000, the Sup. Ct. struck down a requirement that cable companies “fully scramble” video signals that include sexually explicit content (U.S. v. Playboy)
– “Simply put, targeted blocking is less restrictive than banning, and the Government cannot ban speech if targeted blocking is a feasible and effective means of furthering its compelling interests.”
– “It is no response that voluntary blocking requires a consumer to take action, or may be inconvenient, or may not go perfectly every time. A court should not assume a plausible, less restrictive alternative would be ineffective; and a court should not presume parents, given full information, will fail to act.”
In other words…
• If effective privacy-enhancing tools & options exist, they must be factored into the BCA process
• Weighs heavily against use of preemptive regulation, especially in the absence of more concrete harms
• But other govt action still possible
Other Govt Options
• Privacy awareness-building (PSAs + websites)
• Digital literacy & labeling programs
– nutritional labeling model
• Govt-created privacy & safety apps?
• Govt-created / underwritten search engine?
– PBS model for online media / services
• FTC oversight
– Sec. 5 enforcement
– best practice guidance?
How is the Govt Doing On This Front?
• Not enough focus on BCA in privacy reports, consent decrees, or COPPA revisions
– Regulation treated as a costless exercise
• Strong focus on best practices / nudging industry (but still little discussion of costs)
• Good awareness-building efforts: best-practice reports, online advice (“OnGuardOnline”, PSA efforts)
• No govt-provided privacy apps / tools
Going Forward… Is Formal Contracting Possible?
• Could users contract with sites for privacy / PII “rights”?
• If users started using more PETs & actively thwarted data collection / advertising, then perhaps some sort of Coasean bargaining would develop
– Ex: if 25% were using DNT & AdBlockPlus = tipping point?
• But is it possible without formal “propertization” of PII?
– Doubtful that more sophisticated contracting schemes develop without formal propertization
– Licensing likely to have the same problems / limitations as seen in the copyright context
• Transaction costs matter! (could hinder positive developments)