Panel discussion on Julie E. Cohen’s great new book, Configuring the Networked Self: Law, Code, and the Play of Everyday Practice.
Danielle Citron, U. Md.: Concurring Opinions will have an online symposium on the book this coming March. The book helps us understand the importance of architecture
to human development. Networks know who we are and sort, categorize, and make
decisions for and about us. Arbiters of access to knowledge, jobs, etc. Pervade
our daily lives. Search engines highlight things they think are relevant to us;
companies give us social media influence scores and sell those to advertisers;
automated systems count and miscount votes, and remove voters from rolls;
determine how much Medicaid etc. will be paid for a person; flag individuals as
potential terrorists/threats. These systems have tremendous impact on the play
of everyday life/creativity, but there's a huge info imbalance. To them, users
are open books; to us, they're black boxes. We can't demand that private parties or
the government reveal what they know, making it difficult to protest.
Examples: Advertising/marketing companies mine info to find
valuable customers. Ads and news can be
tailored to demographics and interests.
Someone who’s unemployed may see ads for payday loans, fast food,
for-profit schools; in ad terms, she’s known as “waste” and so the news she
sees will be tailored to that—military recruitment, vocational schools. Architecture influences culture.
Md. started surveilling protest groups against the Iraq war.
Physical surveillance and also data mining to identify activists as
“terrorists,” including 2 Catholic nuns and a man running for Democratic
office. Normally you’d never know you were identified as such, but you would
lose jobs/wouldn’t be able to travel.
However, ACLU filed open government request; after a big fight,
determined that 53 people were designated terrorists. The explanation was that the automated software
only offered that choice; there was no option for "extremist," and "terrorist"
seemed close enough.
Public benefits systems: increasingly automated. States have programmed hundreds of legal rules
incorrectly—denying Medicaid to breast cancer patients based on income
allocations not required by state law; denying food stamps based on drug
convictions contrary to federal law.
System hasn't gotten better despite high-profile litigation and a change
of vendors. One girl died when she was erroneously
kicked off the system.
Sen & Nussbaum’s capability approach is important.
Automated systems impact our core capabilities.
Cohen develops this by showing how the systems interfere with play,
creativity, space to breathe. Activists:
felt watched and didn’t go to meetings or didn’t say what they thought.
Does Cohen overemphasize creativity when more pressing
capabilities should be in the foreground? Systems can deprive people of
necessities of survival, like health care.
Creative activity is hard if you’re starving or if you can’t get a job
because you’ve been marked as a terrorist. Social mobility—ability to give
credit to talent—is key to creativity, but systems stereotype/pigeonhole people
to keep them impoverished. Next steps: sort out how different systems affect
different capabilities. Objectification:
whether systems fail to treat us as ends in ourselves.
Daniel Solove, GW: Broad theme of Cohen’s work: how to carve
out appropriate space for intellectual creativity? One of few scholars to explore privacy and
creativity together in their nuances. Copyright and privacy both concern
control over information; tension because scholars who argue for limits on
copyright are often arguing for more protection for privacy—less control/more
control over information. Is there a
coherent way to argue for less copyright/more privacy? Cohen’s work establishes the normative
foundations for that. A set of ways to
allow creativity and the development of the self.
We need to consume in order to create. Also need breathing
space to create without someone watching us all the time. Cohen uses Sen &
Nussbaum’s concept of flourishing: freedom that transcends negative liberty,
includes access to real opportunities.
Introduce chance into our controlled, networked world. Copyright: strict control over information
can impede our ability to create in the ways we want to create. Privacy:
growing surveillance threatens to put us under control that will make it hard
for us to create and flourish in ways outside the norm. Privacy is often treated as a second-class
right; hard to give it the status of a fundamental right because fundamental
rights tend to be simple and stable; privacy is nuanced, culturally and
historically contingent; amorphous.
Privacy is a critical right, though, not just to individual flourishing
but to society because our own intellectual development depends on the
development of others. Stunting
creativity of others stunts our creativity.
Surveillance can chill eccentric behavior; it can dull
us. Removes interesting eccentricities
that are key to great aspects of creativity. Has subtle effects we might not
notice: expecting to be under surveillance, we might not even realize we’re
holding back.
Self-exposure: not a nirvana of selfhood. Consequences of
self-exposure not felt equally for everyone in society: race, gender. Self-exposure is not pure: sites subtly
manipulate, shape and control our own expression—site architecture pushes us
towards particular ways of being and makes it hard to understand its effects. We experience the consequences; the sites
don’t.
Next steps: answer more how one is to weigh privacy and
integrate it into the matrix of other values. Cohen criticizes instrumentally
trading privacy off against other interests. But if it's not instrumental and not
something we should leave people just to choose to give up, then do we risk
paternalism in telling them not to self-expose?
Law can’t be neutral; will shape architecture. But where do we draw the line? Some
creativity can be harmful to other people and society; some self-expression
harms the expression of others, e.g., hate speech.
Me: Sickness kept me (relatively) brief.
There’s a tendency in law generally to oppose the real with
the culturally constructed, and treat the former as unchangeable and the latter
as not very important. In fact culture can be more powerful and constraining to
imagining potential change, which is why 1960s Star Trek has recognizable (if
bigger) computers and communicators that work a lot like our phones, but racial
and gender assumptions that are now quite hard to sympathize with. Cohen
challenges us to imagine better: understand culture’s power and make policies
that both acknowledge and attempt to work with that power.
Popular legal imagination: privacy is a featureless goo,
copyright is crystal-edged property.
These are both unworkable and misdescriptive without an account of how
people make themselves and each other in light of their environments.
Past week: US v. Jones, majority used property concepts in
the guise of trespass to avoid problems of what privacy means. I’m most interested in the Alito concurrence,
though, which tries to go beyond property but has a silence at its heart. The concurrence has what Scalia, in a worse
mood, might have called a “sweet mystery of life” section; it recites the law
about reasonable expectation of privacy, talks in general about expectations,
then jumps to “therefore, this was unreasonable.” Illustrates the difficulty courts have thinking
about privacy in context.
Previous week: Justice Ginsburg used property concepts to
say that the public domain was worthless to the public because unowned; the
Bible and Shakespeare are not yours and
therefore you lack a First Amendment interest in using them. They are debris,
not the culture you breathe in and breathe out, transformed. This is again a
failure of understanding how creativity is lived by people.
So, of course, I ask how we can bridge this enormous gap between
theory and practice.
Cohen: we come to the law with bodies and histories; that
means information and information rights affect us in ways that law often
abstracts from. People feel a false and
unwarranted confidence that they are unchanged by surveillance and copyright law. Yes, we need ultimate answers; the way we've
gone so far is to assume we can rationalize our way through the hard
decisions. We need to start making the
hard choices so people can make themselves into critical citizens.
Q: how to deal with the urge to share, experienced as
empowering? It’s neither inherently good
nor bad, nor do networks inherently inhibit human flourishing as Citron
suggested.
Citron: she agrees there are stories of fighting back, like
Hollaback against harassment. We need to think about management, and about how
networks do undermine capabilities in ways that could be fixed.
Cohen: we tend to uncritically celebrate freedom and
delegate implementation to the technologist; or we get really technocratic and
start dictating as if we could fix everything if we had enough info. Those are both faulty ways of thinking: tech
is empowering but also dangerous. Think about sensitizing designers to
non-neutrality of tech.
Heidi Li Feldman: is privacy any more slippery than any
other fundamental right? Maybe it’s more
salient because of tech that we don’t understand what we’re trying to protect.
Cohen: autonomy is a crutch; assumes a fixed self that
privacy shelters from the world. But the self is in motion; privacy provides
breathing space for the process of changing ourselves.
Solove: sees privacy as an umbrella term; has identified 16
different things under the rubric. One of the challenges is that it’s
culturally and historically contingent.
2 comments:
My tech policy seminar students will blog Julie's book for two weeks starting Feb 9. We are currently doing Merges's Justifying Intellectual Property: http://picker.typepad.com/picker_seminar/
I'm delighted to hear that, Randy!