Section 230 – Nurturing Innovation or Fostering
Unaccountability? DOJ Workshop
These are copied from my handwritten notes, so will likely
be terser than usual.
Introduction of the Attorney General
The Honorable Christopher Wray, Director, Federal Bureau of
Investigation
Tech is critical for law enforcement. Tech facilitates speech
& enriches lives & poses serious dangers. As our use increases, so does
criminals’. Extremists, drugs, child solicitation. Like much infrastructure,
the internet is largely in private hands, leaving vital public safety in their
hands. Will they guard elections against foreign influence? [I’m
not optimistic.] Will they identify child victims? Can have entrepreneurial
internet and safety.
Welcome
The Honorable William P. Barr, Attorney General
DOJ’s interest came out of review of market leading online
platforms. Antitrust is critical, but not all concerns raised fall within
antitrust. Need for enforcement to keep up with changing tech. Internet changed
since 1996, when immunity was seen as nurturing nascent tech. Not underdog
upstarts any more; 230’s immunity may no longer be necessary in current form.
Platform size has left consumers with fewer options: relevant for safety and
for those whose speech has been banned [because the platforms deem it unsafe].
Big platforms often mondetize through targeted ads, creating financial incentives
for distribution rather than for what’s best for users. 230 immunity is also
implicated by concentration. Substance has also changed. Platforms have
sophisticated algorithms, moderation. Blurs line between hosting and promoting.
No one—including drafters—could have imagined, and courts have stretched 230 beyond
its intent and purpose, beyond defamation to product sales to terrorism to
child exploitation, even when sites solicited illegal content, shared in its
proceeds, or helped perpetrators hide.
Also matters that the rest of the CDA was struck down: unbalanced regime
of immunity without corresponding protection for minors on the internet. Not here to advocate a position, just
concerned and looking to discuss. [Said w/a straight face, if you’re
wondering.]
(1) Civil tort law can be important to law enforcement,
which is necessarily limited. Civil liability produces industry wide pressure
and incentives. Congress, in Antiterrorism Act, provided for civil redress on
top of criminal. Judicial construction diminished the reach of this tool. (2) Broad
immunity is a challenge for FBI in civil enforcement that doesn’t raise same concerns
as mass tort liability. Questionable whether 230 should apply to the federal
gov’t [civilly]. (3) Lawless spaces online: concerned that services can block
access to law enforcement and prevent victims from civil recovery, with no
legal recourse. Purposely blind themselves and law enforcement to illegal conduct = no
incentives for safety for children. Goal of firms is profit, goal of gov’t is to
protect society. Free market is good for prices, but gov’t must act for good of
society at large. We must shape incentives for companies to shape a safer environment.
Question whether incentives need to be recalibrated, though must recognize 230’s
benefits too.
Panel 1: Litigating Section 230
The history, evolution, and current application of Section
230 in private litigation.
Moderator: Claire McCusker Murray, Principal Deputy
Associate Attorney General
Q: History?
Professor Jeff Kosseff, United States Naval Academy: Disclaimer:
views are his own. Misinformation in debate over lack of factual record.
Development out of bookstore cases prosecuted for distributing obscene
material. SCt said that ordinance can’t be strict liability, but didn’t clearly
establish what the scienter standard could be. Reason to know standard existed
in lower courts. Worked for 30 years or so until early online services.
Compuserve found not liable because did little monitoring; Prodigy was found
liable because it moderated other content. Perverse incentive not to moderate;
concern that children would access porn.
Early on it wasn’t clear whether distributor liability would still be
available after 230 or whether distributor liability was a special flavor of
publisher liability.
Patrick Carome, Partner, WilmerHale: Zeran was a garden variety
230 case, but it was the first. Zeran was the subject of a cruel hoax. Zeran’s
theory: negligence/his communications put AOL on notice. Ruling: distributor
liability is a subset of publisher liability. Absent 230, 1A would be the main
defense. Platforms would still probably win most cases. Smith v. California:
free of liability absent specific knowledge of content, which would create
strong incentive to avoid becoming aware of problems. W/o 230 platforms would
be discouraged from self-moderation and they’d respond to heckler’s veto; would
not have successful, vibrant internet. Would discourage new entrants; need it
for new companies to get off ground.
Professor Benjamin Zipursky, Fordham University School of Law:
Zeran itself ok, subsequent decisions too far. American system: normally dealing
with state tort law, not just defamation, before we go to 230/1A. Common law of
torts, not just negligence, distinguishes bringing about harm from not stopping
others from harming. Misfeasance/nonfeasance distinction. But for causation is
not enough. For defamation, publication is normally an act. NYT prints copies.
Failing to force person to leave party before he commits slander is not
slander. Failing to throw out copies of the NYT is not defamation.
But there are exceptions: schools, landlords, mall owners
have been held liable for nonfeasance. Far less clear that common law of libel
has those exceptions as general negligence does, and not clear that they
survived NYT v. Sullivan if they did. There
are a few cases/it’s a teeny part of the law. Owner of wall (bathroom stall) on
which defamatory message is placed may have duty to remove it. No court willing
to say that a wire carrier like AT&T can be treated as publisher, even with
notice. Not inconsistent with Kosseff’s account, but different.
In 90s, scholars began to speculate re: internet. Tort
scholars/cts were skeptical of the inaction/action distinction and interested
in extending liability to deep pockets. Unsurprising to see expansion in
liability; even dicta in Compuserve said online libraries might be liable with
notice. Prodigy drew on these theories of negligence to find duty to act; one
who’s undertaken to protect has such a duty because it is then not just nonfeasance.
Internet industry sensibly went to DC for help so they could continue to
screen.
Punchline: state legislatures across the country faced an
analogous problem with negligence for decades. Misfeasance/nonfeasance
distinction tells people that strangers have no duty to rescue. But if you
undertake to stop and then things go badly, law imposes liability. Every state
legislature has rejected those incentives by creating Good Samaritan laws. CDA 230 is also a Good Samaritan law.
[Zipursky’s account helped me see something that was
previously not as evident to me: The Good Samaritan-relevant behavior of a platform
is meaningfully different from the targets of those laws about physical injury
liability, because it is general rather than specific. Based on the Yahoo case,
we know that making a specific promise to a user is still enforceable despite
230; the argument for negligence/design liability was not “you stopped to help
me and then hurt me,” but “you stopped to help others and not me, proving that you
also should have stopped to help me”/ “you were capable of ordering your activities
so that you could have stopped to help me but you didn’t.” Good Samaritan
protection wasn’t necessary to protect helpful passersby from the latter
scenarios because passersby didn’t encounter so many of those situations as to
form a pattern, and victims just wouldn’t have had access to that information
about prior behavior/policies around rescue, even if it existed. In this
context, Good Samaritan and product design considerations are not
distinguishable.]
(c)(2) isn’t actually bothering most people [just you wait];
(c)(1) does. Problem is that there was no baseline for liability for platforms,
no clear rule about what happens if you own the virtual wall. Implications: (1) Zeran is correctly decided.
(2) This isn’t really an immunity. (3) If a platform actually says it likes a
comment, that’s an affirmative act to project something and there should be a
distinction. The rejection of active/passive was a mistake. [Which means that having something in search
results at all should lead to liability?]
(4) This was mostly about defamation, not clear how rest of common law should
be applied/what state tort law could do: 230 cut off development of common law.
Carrie Goldberg, Owner, C. A. Goldberg, PLLC: Current scope
limitless. Zeran interpreted 230 extravagantly—it has eaten tort law. Case she brought
against Grindr, man victimized by ex’s impersonation—thousands of men sent to
his home/job because of Grindr. Flagged the account for Grindr 50 times.
Services just aren’t moderating—they see 230 as a pass to take no action. Also
goes past publication. We sued for injunction/product liability; if they couldn’t
stop an abusive user from using the app for meetings that use geolocation, then
it’s a dangerous product. Foreseeable that product would be used by predators. Grindr
said it didn’t have tech to exclude users. Big issue: judge plays computer
scientist on MTD.
Annie McAdams, Founder, Annie McAdams PC: Lead counsel in
cases in multiple states on product liability claims. Our cases have horrible
facts. Got involved in sex trafficking investigation. Tech plays a role: meet
trafficker on social media, was sold on website, sometimes even on social
media. “Good Samaritan” sites process their credit cards, help them reach out.
Sued FB, IG; still pending in Harris County. Sued FB in another state court.
Still fighting about Zeran. FB doesn’t
want to talk about FOSTA/SESTA. Law has been pulled away from defamation using
language from a few cases to support theories about “publisher.” Knowingly facilitating/refusing to take down
harassing content. Waiting on Ct of Appeals in Texas; Tex SCt ruled in their
favor about staying the case. Courts are embracing our interpretation of Zeran.
Salesforce case in Texas was consolidated in California; in process of appealing
in California.
If Congress wanted immunity, could have said torts generally,
not publisher, which is from defamation law not from Good Samaritan law.
Jane Doe v. Mailchimp: pending in Atlanta federal court. We
were excited to see DOJ seize Backpage but another US company assisted a
Backpage clone in Amsterdam.
Carome: Complaint on expansion beyond defamation is mistaken:
Congress intended breadth. Didn’t say defamation; wrote specific exceptions
about IP etc that wouldn’t have been necessary if it had been a defamation law.
Needs to be broad to avoid heckler’s veto/deterrent to responsible self-regulation.
Problem here is extraordinary volume of content. Kozinski talked about saving platform
from 10,000 duck bites; almost all these cases would fail under normal law.
Terrorism Act cases for example: no causation, actually decided on that ground
and not on 230. Victims of terrorism are
victims, but not victims of platforms.
Not speaking for clients, but sees immense efforts to deal with
problematic content. Google has over 10,000 employees. FB is moderating even
more but will always be imperfect b/c volume is far more than firehose. Need
incentives and policies that leave space for responsible moderation and not
destruction by duck bites. 230 does make it easy to win cases that would
ultimately be won, but only more expensively.
230 puts focus on wrongdoers in Goldberg’s case: the ex is the person
who needs to be jailed.
Kosseff: based on research with members, staffers, industry,
civil liberties groups: they knew it was going to be broad. No evidence it was limited
to defamation. 2d case argued was over a child porn video marketed on AOL. Some
of this discussion: “platforms” is often shorthand for YT, FB, Twitter, but
many other platforms are smaller and differently moderated. Changes are easier
for big companies to comply with; they can influence legislation so that (only)
they can comply.
Zipursky: Even though publisher or speaker suggests basic
concern with libel, agrees with K that it’s not realistic to understand 230 as
purely about defamation. Compromise? Our tort law generally doesn’t want to
impose huge liability on those who could do more to protect but don’t, even on
big companies. But not willing to throw up hands at outliers—something to protect
against physical injury. [But who, and how? Hindsight is always 20-20 but most
of the people who sound bad online are false positives. It’s easy to say “stop this one guy from creating
an account” but you can’t do that without filtering all accounts.]
Q: what changes do you see in tech and how does that change
statutory terms?
McAdams: broad statements about impossibility of moderation,
10,000 duck bites—there’s no data supporting this that wasn’t paid for by big tech. Who
should be responsible for public health crisis? Traffickers and johns can be
sent to jail, but what about companies that knowingly benefit from this
behavior? May not need much legislative change given her cases. [Big lawyer
energy! Clearly a very effective trial lawyer; I mean that completely sincerely
while disagreeing vigorously with her factual claims about the ease of moderation/costs
of litigation for small platforms and her substantive arguments.]
Goldberg: Criminal justice system is a monopoly. It’s tort
that empowers individuals to get justice for harm caused. When platform
facilitates 1200 men to come & harass and platform does nothing, that’s an
access to justice issue. Not about speech, but about conduct. It’s gone too
far. Need injunctive relief for emergencies. Limit 230 to publication torts
like obscenity and defamation. Needs to
be affirmative defense. Plaintiffs need to be able to sue when companies violate
their own TOS. Grindr said it could exclude users but didn’t have the
tech. Exception for federal crimes is a
misnomer: these companies don’t get criminally prosecuted.
Carome: 230 isn’t just for big tech. 1000s of websites
couldn’t exist. If you want to lock in incumbents, strip 230 away. What’s allowed
on street corners is everything 1A allows: a lot of awful stuff. Platforms
screen a lot of that out. 230 provides freedom to do that.
Zipursky: caution required. Don’t go too crazy about
liability. Don’t abandon possibility of better middle path.
Kosseff: 230 was for user empowerment, market based
decisions about moderation. Is that working? If not, what is the alternative?
Too much, too little moderation: how do we get consensus? Is there a better
system?