Neil Richards, Professor of Law, Washington University
(1)
Privacy is dead.
1970: Newsweek cover story: Is privacy dead? People have been declaring privacy dead for a
while. Privacy law is booming, as is
debate on appropriate collection, use, and disclosure of information. Privacy still exists: no cameras in bathroom
(and expectation is so deep that we feel no need to disclose that there are no
cameras in the bathroom). Need for rules
regulating what happens after info is captured is all the more important.
HIPAA/FCRA govern use of information.
Transparency makes consumer credit work for consumers as well as
agencies. Trade secret law is a law of
privacy. Data brokers like being able to operate in private and resist calls
for transparency. Facebook is an enemy
of privacy but likes its own: Richards visited and was required to sign an NDA
saying he couldn’t describe anything that happened during his visit; FB tried that
with a press conference too. FB believes
in privacy for itself, as does the NSA.
The content of the rules is more important than ever, and we
need to decide what rules to have before it’s too late.
(2)
People don’t care about privacy
If people didn’t care, would you know who Edward Snowden
was? Pew study shows people do care—when they can figure it out, they use
adblockers, Tor, pseudonyms, give false info.
Privacy paradox: the idea that we say we like privacy but just go ahead
and sell our info. Dan Solove: the
failure of privacy self-management. FB
provides “tools” to control privacy, but they’re too hard to use. Even privacy experts aren’t good at this—there are too many settings to control. If you read all the privacy notices you encounter in a year, you’d need 76 workdays. Putting the burden on users to read, adjust, tweak, and opt out does not work—choices and time are limited.
(3)
Young people don’t care about privacy
Actually, they do. They see the benefits of connecting. They
care about the immediate privacy threats to them, which are not the NSA, and not
even FB. It’s their parents and teachers.
FB activity is declining because old people are now there. It’s not that FB is uncool. It’s that when your parents are on FB you
watch what you do. Privacy purges upon
college graduation: graduates decide the pictures they’d taken of everything shouldn’t be on their
timelines any more. Also, like older
people, young people face limited privacy choices—you have to go where the
people are; having less money gives them more limited choices. Reverse privacy paradox: if we don’t care
about privacy, why do we talk about it so much?
Privacy is the shorthand for anxiety about radical transformations in
info tech we’re witnessing and how those affect our lives.
(4)
If you’ve got nothing to hide you’ve got nothing to
fear.
Posner said privacy is just a right to conceal discreditable facts about ourselves, a fraud on the market for humanity. But we all have something to hide. Not just toilet activities or private
parts. Tyler Clementi killed himself
after his sexual activities were secretly recorded and shared.
People behave differently when they believe they’re under
surveillance, or even when they’re merely in front of a picture of eyes. Is the solution
for all of us to watch each other? No. Privacy in social space allows us to
figure out who we are, play with identity as Tushnet discussed.
Privacy has power effects: information can be power
over. NSA collecting information on
people viewed as pro-radical Islam, creating a chart of the porn they liked,
planning to threaten disclosure to silence them. The FBI did exactly the same thing to MLK Jr.,
believing him a Communist agent.
Target knows you’re pregnant: the intent wasn’t to be
creepy; it was to get power, here commercial power.
(5)
Privacy is bad for business
User data is valuable.
If this is just a transaction, we’re paying for FB without dollars, and that’s
misleading. Is privacy a tax on
profitability? Well, is having to pay
wages to your employees a tax on profitability?
It’s a cost/input. Privacy is
ultimately good for businesses, which depend on trust. Trust/confidentiality can garner a
competitive advantage.
How we frame/talk about privacy matters. Whether we speak of free services, Big Brother, oil, or the death of privacy matters for diagnosis and solutions.
Moderator: Fred
Vars, Associate Professor of Law, The University of Alabama
Qs: is that what we really mean when we say privacy is dead?
Don’t we mean that the scope of available info about individuals is much
greater, even if rules still exist? Would
a more satisfactory response to the myth be that the percentage of total
information that’s private is smaller but not because privacy is dying but
rather because the amount of public information has skyrocketed?
Richards: goes to discourse.
Labeling and framing: when it ceases to be private, information does not
necessarily become public. A lot of
baggage comes with the term “public.” Info
is shifting from the superprivate state to the less superprivate, but that
doesn’t mean it’s all out there and unregulable. Managing flows becomes more important.
Sarat: Contrast Haggerty’s point: privacy for excretory
functions remains. But the social domain
has dramatically shifted. Just because
it’s not fully public doesn’t mean it’s private (against the gov’t,
advertisers, etc.).
Real point of disagreement is whether rules still matter a
lot. Haggerty says no. You say yes.
Slippage between language of secrecy and privacy—privacy is
even more a matter of power.
Richards: as a sociologist, Haggerty’s goal is to explain, describe, and diagnose complex social phenomena.
As lawyers, our job is to fix things and talk about rules and
prescriptions. Haggerty is right about
the mean shift from relatively private to relatively public, and the risk that
it will become dangerously public. Also
agrees that legal rules/privacy rules can end up becoming problems themselves,
making it easier to give up privacy—FISA court gives stamp of democratic
approval. But sometimes legal rules have
unintended effects, and even fail; where he departs is in the hope that we can have
good rules that actually work and don’t become privacy theater.
Q: seems like the only people who really care about privacy
are those who want to live off the grid—no phone, no connection to public
services. Tech is too pervasive for
privacy to exist. Security breach at Target = no privacy.
Richards: we continue to use credit cards because the
benefits outweigh the costs, but the costs are still there. Humans are wired to
be intellectually lazy, and existing concepts are easily blurred. Need better
encryption, among other things. But clients still care about confidences—they want
their information kept private. (We act
as if the NSA isn’t listening, I would say!)
Regulation can prevent certain uses. Cultivate ethical sensibility among
info tech engineers. When doctors started to be able to cure, we had ethical
regulations about their responsibilities; when lawyers could change legal
status, they developed ethics; software engineering needs constraints too, through
a professional ethic governing the use of great power.
My Q is about the last myth: Silicon Valley libertarianism;
why doesn’t market reasoning completely dispose of your responses—we’ve
bargained/there’s consent; minimum wage laws are bad; if privacy is good for business we will adopt it precisely
to that extent, and no more.
Richards: My audience is not so much the Randians as the
non-true believers, as well as consumers.
There are different kinds of privacy/rules. Some are at times flatly
opposed to profitability. “You may not
have a free service” would make things less profitable. Other protections
allowing users real control, or regulating the way ads are delivered, or regulating
how information is safeguarded, are essential. Even where there are actual
tradeoffs, by looking at the rules in the broader context you can see the
complexity of the problem and that we’re looking for decisions about when we’re
allowing information to flow unimpeded.
He’s less concerned with the particular rules that come out of the ethical conversation than with having a conversation about what the ethical rules should be, as we did about what the rules for workplace safety should be.
Austin: to what extent is privacy a real live analytic tool
any more? Yes, we need rules, but does it make a difference to call them
privacy rules as opposed to information rules?
Confidentiality won’t get us very far; it’s narrow.
A: He doesn’t like the term privacy. We need to work on the
intermediate concepts. People keep looking for the harm rather than figuring out
rules. He has tried to pick some concepts and start from there: informational
privacy/privacy in the formation of beliefs and political commitments—the goal
is to defend kinds of information rules. In security, we may want to rely on
economics—our interest in credit card security is more in not losing money. In
surveillance, we need to look at balance between political liberty, public
safety, and risks of corruption from unreviewable security apparatuses. Just not privacy as secrets or on/off
binaries.