Discussant: Molly Land: categorizing rationales states offer
for censorship. Prior normative
commitments, which often don’t leave us with much room to negotiate and push us
to extremes. Five categories: preventing individual harm; preventing tangible
social harm (military secrets); preventing inchoate harm to the social
fabric/deeply held values (hate speech/Holocaust denial); protecting something
larger than the state (blasphemy); protecting widgets/economic values
(copyright). How do we distinguish censorship of hate speech from censorship of
political dissent? Bambauer’s past work: we should have a framework for
evaluating censorship that focuses on process, not on the reasons for the
censorship. At the very least we should
hold states to the consequences of their own rationales, and if there’s
contradiction then we have an additional tool to attack the state’s actions.
Her comments: additional rationales. Speech that is critical of the gov’t pure and
simple; political organizing that might threaten the state; inappropriateness—gay
rights/family planning. The categories chosen almost presuppose a
well-intentioned state.
Possible responses: you are emphasizing proffered reasons
rather than real reasons. State might be
self-serving but offer more palatable rationale in public. Use that rationale
instrumentally to hold the state to its own reasons? Another possible response: maybe these
self-serving rationales fall into the harm to society/social fabric categories.
But if so, the categories may be so broad as to deprive them of any power to
hold the state to account. If your argument is that the state has to accurately
report on the threat, or articulate what beliefs and values ought to be
enduring, the categories may lack rationales for determining validity. What
counts as a threat? Without that, there’s
no traction for limiting outlier repressive regimes. Categories one and four do
offer some constraints—articulate the harm and show that censorship addresses
the harm. These harms are more definable.
Maybe we can’t assess the validity of the claims—do we have
to be value-neutral? Uses Russia as an
example, targeting political dissent with other rationales. Do we simply defer
to the state's claim that these are valid rationales?
Domestically, most of our rules are one and four, and that
does offer a helpful framework. Takes us to first principles: is this a harm we
want to prevent, and do these measures work?
But it’s hard to figure out how to challenge claims by a state that censorship
prevents the overthrow of the government.
Q: continuum? Profaning the sacred can work a harm on every
member of the affected group.
A: could do both—saps the strength of the thing that links
them together.
Jack Balkin: You could map censorship for sociological
reasons; because you want normative judgments; because you want to know the
most effective techniques. What is the
purpose of the typology?
A: starts with the first. Censorship is widespread.
Balkin: if sociological, ask what work is being done for
society by the particular techniques of censorship? What values is society
attempting to protect or defend? Is the state the central actor in censorship
or one among many?
A: I think the project is looking at state action. Private censorship: don’t yet have a useful
way to deal with that. Not sure that the
First Amendment should apply to Google.
Balkin: Althusser talks about schools, the church,
markets/capitalist system. Censorship as a practice does particular work—shoring
up religious belief, protecting relations between the sexes, securing property,
legitimating/justifying the use of power both public and private. In those
categories the public/private line will sometimes be salient and sometimes not.
When a Republican congressman insists on no sex education, that can be
understood as action by the state or by a group using the state for a particular purpose.
A: the state as one mechanism of effectuating particular goals. Are functions and values different here?
Copyright is instrumental—we want more stuff—but that may not be a value, whereas natural rights claims do
the work.
Balkin: control over culture—the work copyright does is to
allow particular people to control what things mean in society and also to make
money. Critiques of copyright often
focus on control over cultural meaning more than money. Interesting connections between copyright and
blasphemy: certain things shouldn’t be said so as not to undermine particular
kinds of meanings (that’s TM dilution!).
Even if the actor is the state, others’ goals are served.
Robert Post posits democracy, community and management as
the key axes—we can imagine censorship that defends any of these. This is a sociological division of life into
boxes.
We could understand Google as censoring for its own
purposes, or as a pawn to serve the interests of various states. You might see
both of these—Google wants a quiet moneymaking life and thus lets itself be
used.
A: history of censorship is a larger project.
David Thaw: have to take the claim on its own terms, and
also investigate what it is doing: empirically or analytically, is there a
mismatch between the claim and the solution?
A: it might be both.
Censorship for copyright is both to get more movies and to control who
gets to say what.
Robert Post: private power has to come into it. MPAA couldn’t get SOPA/PIPA, so it pressured
ISPs into six strikes through private agreement. ICANN; other institutions coming in to take
over the roles that state actors used to fill.
A: Agreed, but this is the difficulty of being an American
lawyer. Free speech is directed at gov’t initially. Then we have to decide what
restraints on private actors are appropriate.
Land: another thing to think about is that different
perspectives will categorize harms in different ways: the European perspective is that
hate speech is category one: it harms specific individuals.
Nonstate actors: this is the approach people have taken to
nonstate actors—we want transparency and we want to hold them to the promises
they make/claims about their goals they make.
That seems like all we have with respect to private actors. Applying
this to state actors seems like a new move; perhaps normative judgments can be
made about state actors that we can’t make with private actors. Functional analysis may be all we can do for
Google, but not all we can do for states.
Balkin: might be interested in technologies/techniques of
what’s generally regarded as censorship (attempts to prevent people from
expressing opinions or making claims in public sphere), or might expand
concepts of censorship—subsidization, public education, scientific research.
Post: if you start with Foucault, information
flows/censorship are everywhere.
Balkin: but that’s not where he’s starting. Paradigm: gov’t beats you up for your speech,
and expands from there.
A: particularly interested in what happens to broadcast: interdiction
because the gov't stops you from broadcasting your message and substitutes a
cartoon instead, even if it allows you to use a public square.
Post: what’s the state of nature? Can’t use unmediated
communication as a baseline. You have to start with values if you go that way:
the idea of censorship may look meaningless from that perspective where flows
of information are always shaped.
Thaw: who I get to talk to because of where the roads go is
affected by gov’t decisions. Shaping the
flows of information: there is no such thing as unmediated communication. Definition of censorship: introducing another
variable after whatever the starting point is.
The paper could be about what happens after the starting point. Not sure whether that works, but could
explore.
Post: but why privilege the starting point if it could be
different? If Google could start in one
configuration or another, why privilege the one it chose? (The discussion has in general been going
back and forth between gov’t and private actors deciding to interrupt
information flows that would otherwise occur, given the background conditions
already in place.)
Thaw: not so much privileging but a way of deciding what to
talk about.
Post: but we want to talk about some things separately in a
way that’s normatively loaded, which “censorship” is.