Monday, February 24, 2020

DOJ 230 workshop part 3


Panel 3: Imagining the Alternative

The implications of Section 230 and proposed changes for competition, investment, and speech.

Moderator: Ryan Shores, Associate Deputy Attorney General

Professor Eric Goldman, Santa Clara University: (c)(1) means no liability for third-party content. The difference between first- and third-party content isn't always clear. (c)(2) protects good-faith filtering, and (c)(2)(B) also helps providers of filtering tools. Exclusions: IP, federal criminal law, federal privacy law, FOSTA sex trafficking. No prerequisites for immunity, unlike the DMCA; no scienter required for (c)(1). Not claim-specific unless excepted. Common-law exceptions: (1) Roommates: when sites encourage/require provision of illegal content. (2) Failure to warn? (3) Promissory estoppel. (4) Anticompetitive animus.

Neil Chilson, Senior Research Fellow, Charles Koch Institute: Taxonomy of possible regimes: what type of bad thing are we concerned about? Is it illegal already, or should it be? Who should be held liable: the person doing the act, or the person providing the tools? In what situations: strict liability, participation in creation, knowledge, unreasonableness? Can you get immunity back by taking action, e.g. by takedown after notice? Concerns about the incentives created. How do we protect speech/public participation? Other countries don't have the 1A. Over-removal: the ideal outcome is sorting legal from illegal, but it's hard to align incentives to do that. Who makes the decision about legit speech remaining? Can companies decide for themselves to remove legal speech? Does our approach disadvantage specific business models? What effects on legal certainty are there?

Possible legislative alternatives: (1) an exemptions approach, like the PLAN Act focusing on home-sharing sites; (2) bargaining-chip proposals: keep 230 if you do X, e.g. Hawley's proposal for politically neutral content moderation, or EARN IT's commission to define X.

David Chavern, President, News Media Alliance: 230 was designed to nurture a new industry; it became a distortion: it punishes folks who are willing to take responsibility for their content. News publishers' responsibility for content wasn't a hindrance to our growth; we were pretty good at it [but see: Alabama in the civil rights era]. 230 means our content is subject to extreme editorial control by the major platform cos. Google News: someone has decided to surface different content for you than for me. Their business value is algorithmic judgments; they should be responsible for their judgments. They also make decisions about reach: a small slander w/no impact could reach 10 people or 10 million, and they should be responsible for that. Anonymity: a design factor that prevents going after a speaker. If you're a journalist, part of your job is being abused online w/no redress, esp. if you're a female journalist. Need incentives for quality, investment in quality content. Zuckerberg says FB is b/t a newspaper and a telecom pipe, but it can't be neither. Not impressed by the billions of pieces of content: they built it; that's their problem.

Julie Samuels, Executive Director, Tech:NYC: As we think about the landscape, think through the lens of smaller cos. Need to incentivize competition; 230 is crucial for that. The printing press allowed one-to-many communication, and we're in another fundamental shift moment, to many-to-many. Worried that we think we can put the genie back in the bottle. It's hard if certain industries don't work like they used to, but that can be ok.

Goldman: Elevate existing benefits, even if there are also costs. It is a balancing; easy to overlook benefits. Millennials don't know what they have: don't take for granted what the internet provides. The benefits haven't changed; we didn't know what tech could do when 230 was enacted, and we still don't know what it can do now. 230 preserves the freedom to see where we can go. Solves the moderator's dilemma: that if you try to moderate and fail, you'll be liable for having tried. 230 still lowers barriers to entry. The baseline is not "can we eliminate all online harms." Internet as mirror: people are awful to each other all the time. Might be able to find ways to make us kinder: Nextdoor is trying algorithms to suggest kindness.

Chilson: Conservative principle of individual responsibility, not tool responsibility: the normal way we do things in the US. Tort law generally favors punishing actors over intermediaries: authors, not bookstores; social media users, not platforms. It's unusual to hold one person responsible for the acts of others; we need a good reason to do that. 230 doesn't immunize first-party content, just as newspapers are liable for their own content; Google is liable for its own content; they just do different things. Services connect people on an unprecedented scale. Participation in a group for people parenting a child with clubfoot: b/c FB didn't have to vet any post, that group exists and is greatly beneficial to participants. You can't build a business model around that group alone, but you can build FB.

Pam Dixon, Executive Director, World Privacy Forum: Promoting voluntary consensus standards. Just finished a multiyear study on FERPA, which has lessons learned. Striking that this area suffers from (1) a lack of systems thinking and (2) a lack of research on fact patterns. Systems thinking: people called in w/privacy harms in about 3-4 categories, including (1) victims of domestic violence/rape, fleeing/trying to stay alive; (2) people with genetically based illness. It is rare to find a situation with one platform/issue; need systems analysis: public records, health records, educational records, other platforms. The lack of fact patterning is a problem. OECD principles on AI: we all learned that we were correct in our own way. Disagreement is ok, but can we find consensus? Individuals and organizations can lose trust in systems; platforms can lose trust in gov't. It's in our interest to solve trust problems. Voluntary consensus standards as a solution: not self-regulation. What if a more formal process allowed all stakeholders, not just the big guys, to find consensus on a discrete, observable, solvable problem? The ability exists under OMB rules; the FDA has recognized it for medical devices.

Q: Some proposals have carveouts for small & medium entities. OK?

Samuels: Size carveouts are worrisome. Small isn't automatically good. Swiss cheese approach. Small startups have big legal costs for handling all kinds of issues; 230 is good at the pleading stage by making MTDs relatively cheap; otherwise survival becomes difficult. Compare the patent troll problem: a cottage industry of suing SMEs.

Chavern: we’re the only business mentioned in the 1A. Incremental approach is justified. A few platforms matter more to society. Not a lot of search, or social media, startups. Great scale = great responsibility. Not irrational to start there.

Chilson: Threshold concern: the current antitrust investigation is about the search/social media kill zone. If you have a threshold at which content moderation becomes required, then the only safe way to cross that threshold will be to get acquired. That's not good. The big players are younger than many in this room; they can come and go if the competitive environment doesn't cement their market power into place.

Dixon: Carveouts have unintended consequences. Right now there's no unitary privacy analysis applied to carveouts; we should do that. Voluntary standards can ID all stakeholders & discuss better solutions. The standard would be there if you want to adopt it, not if you don't.

Goldman: There are small companies in the top 15 services, like Craigslist, Wikipedia, Reddit. Some large cos have a small UGC presence. Easy to wrongly trip a threshold. The concept is great; translating it is hard.

Q: Effects on speech?

Chavern: Many complaints are about speech we don't like, not all of it illegal. Freedom of speech isn't freedom of reach. There's no inherent problem with asking companies to be accountable for the act of deciding what to disseminate. They're deciding what you get to see; they should be accountable for that like a publisher. Weird that they get immunity for commercial decisions that help their product. Unsustainable.

Samuels: That looks like a fundamentally different internet experience. [Consider if you, an individual, had to have your posts go through FB's libel review before they'd post.] Social networks would be total chaos without moderation, etc. Real impact on users. Social movements and connections happen now in incredible ways. Need to talk about the end-user experience.

Goldman: 230 can be part of the solution to how we interact as humans; it enables the development of better tools, and it enabled services' taking action on Gab. Users on Gab, however, did have chilled conversations as a result; this is not free. 230 enables a diversity of editorial practices, not all like traditional media's. Finding communities that understand one another.

Dixon: Points to the need for additional research and fact patterning. Predictive speech is a coming issue.

