Monday, February 24, 2020

DOJ 230 workshop part 2


Panel 2: Addressing Illicit Activity Online

Whether Section 230 encourages platforms to address online harms, such as child exploitation, revenge porn, and terrorism, or discourages them from doing so, and its impact on law enforcement.

Moderator: The Honorable Beth A. Williams, Assistant Attorney General, Office of Legal Policy

Yiota Souras, Senior Vice President and General Counsel, National Center for Missing and Exploited Children: One main program is the CyberTipline, a reporting mechanism for the public and ISPs to report suspected child sexual exploitation. We analyze reports and make them available to law enforcement. Reports include CSE, trafficking, enticement, and molestation; the largest category is CSAM. Tremendous growth in reports: in 2019, just under 17 million, w/over 69 million files including video and images. Continues to grow. Many involve preverbal children as well as younger teens.

Professor Mary Anne Franks, University of Miami: Cyber Civil Rights Initiative: aimed at protecting vulnerable populations from online exploitation and harm to women/sexual minorities/racial minorities; civil rights as they relate to tech. On nonconsensual pornography, active in (1) legislation where needed, (2) working with tech cos on policies, (3) general social awareness. For all tech's good, we have to be attentive to social media amplifying abuse and civil rights violations: bad actors, bystanders, accomplices, and those who profit from bads and hide under the shield. Model statute issue: faced pushback from tech & civil liberties groups. Many states take this issue seriously, thanks to brave victims; rapid development in state law. Now up to 46 states & DC with restrictions. That's not solving the problem: in many states the law is too narrow. Many states require personal intent to harm the victim, which is not how the internet works. The average revenge porn site owner doesn't intend to harm any given person; he just doesn't care, and is interested in profits/voyeurism. 79% of cases don't involve personal intent to harm.

230 is the other big problem: it trumps state criminal law. The only way to maneuver around it is a federal criminal law on nonconsensual porn. We've introduced a bill; it hasn't been voted on yet.

Q: re reposting as harm to victims.

Franks: That's one of the most severe aspects of the attack: infinite replicability. Much material is initially obtained nonconsensually, via assault or secret recording, or distributed without consent. It's a harm each time. Consider what happens when a search on one's name reveals all the porn. 230 isn't fulfilling its goals for good samaritans: it doesn't distinguish between helpers, bystanders, and thieves. Intermediaries solicit, encourage, and amplify violations. Also domestic terrorism, misogyny, disinformation: harm to democracy and erosion of shared responsibility for terrible actions.

Q: FOSTA/SESTA tried to address this for CSE. Impact?

Franks: We don't see the impact b/c we deal with adult victims; not trafficking but privacy. Piecemeal tinkering on one bad form isn't the best way to reform: it makes the law unwieldy and sets up a hierarchy of harms. Sex trafficking isn't the only bad.

Souras: We've seen Backpage go down, which overlapped w/enactment of FOSTA/SESTA. Immense disruption in the market for child sex trafficking, which continues. The feds did move against Backpage, and no single co has risen up to fill that lucrative gap. We'd love to see more federal action, but there is deterrence.

The Honorable Doug Peterson, Attorney General of Nebraska: On trafficking online, federal prosecutors were very active in Nebraska; we developed a state law. No revenge porn prosecutions yet, but we can see issues with drug sales and fraud, where we're limited by 230. The Nat'l Ass'n of AGs' proposal: allow states and territories to prosecute, just like the feds. A simple solution. The acceleration of online crimes is significant, especially for young people targeted by apps. The feds require a certain threshold; we need to get aiders/abettors.

Q: challenges to law enforcement?

Peterson: Some platforms offer good cooperation. In a murder case involving Tinder, the platform was v. helpful. Google & others have informed us and allowed prosecution, esp. for child porn. Enabled more thorough investigation.

Matt Schruers, President, Computer & Communications Industry Association: What platforms are doing: over 100,000 people focused on trust and safety. Large services have elaborate & sophisticated tech tools, frequently made available to others. Industry participates with NCMEC and other private sector initiatives, making tens of millions of reports to law enforcement. More investment can and should be made, and not by industry alone. Many cases industry refers to law enforcement don't result in action: fewer than 1,500 cases do.

Q: why do companies report?

Schruers: No one wants their service to be used for illegal activity, regardless of the law. There are bad actors, but a number of cases illustrate that services that solicit/participate in unlawful content lack 230 protection.

Q: what about bad samaritans who don’t report their knowledge: should industry set standards?

Schruers: There's a role for best practices, much of which is happening now. Don't generalize from a few bad actors.

Q: does 230 mean companies aren’t obligated to remove harmful content?

Souras: Most companies have a separate reporting obligation for CSAM. But a co can choose to moderate or not; it's protected if it moderates, and it can moderate sporadically. The incentive promise has become aspirational. There are cos that are partners and do tremendous work, but others look the other way recklessly.

Q: when did industry recognize existing problem and what did it do?

Professor Kate Klonick, St. John's University: Doesn't represent any co. Her work doesn't focus predominantly on illegal content but on harmful content/violations of community standards. There's a huge difference between the top 3 cos and many sites discussed today; different incentives to keep up/take down. FB etc. seek to make their platforms what people want to see over breakfast. Many incentives to remove bad content: economic harms from bad media coverage, users, and advertisers who don't want ads to run against CSAM or revenge porn. We're in a techlash in which it's easy to gang up on platforms. Since 2008 FB has been very robust on systems & processes to avoid these harms. Not all tech/platforms are the same.

Peterson: The AG of Pa. had the Tree of Life mass shooting; the D was using Gab before he struck. The AG looked at Gab's engagement, but PayPal and GoDaddy reacted quickly; the industry response was so fast there was nothing to go after.

Schruers: 230 protects those decisions by service providers. Undermine that and there's no incentive to cut bad actors off.

Franks: Distinguish b/t (c)(1) and (c)(2). [Of course, if you only had (2), then any failure could be held against you if you were correct once before.] No incentive to act like good samaritans. They only grew a conscience after public pressure in response to victims. It could have been avoided in the first place if design had been less negligent. Why should any of us be at the mercy of corporations to see whether firearms are sold to a mass shooter? (c)(1) doesn't do anything to encourage cos to do better. Google is not a clean, well-lit place, nor is Twitter, if you've been attacked. Some people have always had privacy, free speech, and the ability to make money; but civil rights is about who's been left out. It's descriptively not true that the internet is by and large a good place.

Q: Klonick says economic incentives align with moderation for some. What to do about other companies where there’s a market for revenge porn and CSAM?

Klonick: Agree w/the SHIELD Act: there are things to be done with regulation and companies. This is a norm-setting period: we're figuring out what to make of what's happening. Tech moves forward and our expectations change again. Concern over acting quickly; it's hard to know the ramifications.

Q: does 230 address safety?

Schruers: These trust and safety programs are not new. More can & should be done. Industry is prepared to engage w/law enforcement; these efforts predate the recent bad press and are part of doing business. There are a few bad actors, who aren't entitled to 230, which creates exactly the right incentives by allowing policing w/o fear of liability. (c)(1) does create issues when content is not taken down, but if it were gone, there'd be nothing but takedowns, suppressing marginal voices and unpopular views. We see this in other jurisdictions with no protection for lawful but unpopular viewpoints. This requires balancing; there will be missed calls.

Q: what does "more can & should be done" mean?

Schruers: There's an asymmetry between reports & prosecutions; new tools are being developed and shared. Industry is engaging w/IGOs around the world, including OECD cooperation to measure and respond to problems.

Q: CSAM reports grew a lot last year. How is there still so much?

Souras: There is tremendous work being done by the largest companies, typically the best screeners & reporters. Once we drop off the top 4-6 companies, there are 1000s of platforms around the world: chat, filesharing. One problem: there is no level set. Moderation is helpful but completely voluntary, and many choose not to screen. The larger companies are also inconsistent over time/across platforms and lack transparency. When we talk about 100,000 duck bites, there's a harmed person behind every one of those cases, even if each is also a business cost.

Q: Is automation/AI the answer? Small business burdens?

Souras: We have supported tests of AI/ML. We are far away from AI eliminating the proliferation.

Q: why so far away? Zuckerberg says 5-10 years.

Franks: There will always be promises around the corner. Human judgment is required; we have to stop the illusion of control from tech tools. These problems are structural/design problems. Whether cos recognized the problem 10 years ago or now, this is the world 230 built. Do we think we're living in the best possible world? Only people who aren't sent death/rape threats can speak freely, because laws don't stop threats and abuse from happening. Imagine any other industry killing people w/toxic products getting away w/it and promising to fix it later. FB Live was used to livestream murders and rapes. Zuckerberg didn't think it would be misused; that's unacceptable as an answer. The industry has been treated like the gun industry: immune from all harm caused. How long will we allow this? Don't look to tech for how serious the problem is. Industry keeps promising tools, but law is about changing human behavior for good. We've seen that the status quo has failed.

Klonick: The internet is everything that makes you mad about humanity. Zuckerberg didn't murder or rape anyone. He created transparency, so now we see how terrible we all are, and now you want tech cos to clean it up for you. Tech cos don't make murder a product; they surface actions that have already taken place.

Schruers: Role of tech: sometimes held out as perfectible, but it's not a cure-all for humans. It's a journey, not a destination; ML/AI is being deployed as we speak. These tools have false positives and false negatives. This requires both tech and people.
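[To make the false positive/false negative tradeoff concrete, here's a minimal sketch with made-up scores; the classifier outputs, threshold values, and data are all hypothetical, not any company's real system:]

```python
# Hypothetical moderation classifier outputs: (score, actually_abusive).
# Lowering the removal threshold misses less abuse but takes down more
# lawful posts; raising it does the reverse. No setting zeroes out both.
POSTS = [
    (0.95, True), (0.80, True), (0.62, False), (0.55, True),
    (0.40, False), (0.35, True), (0.20, False), (0.05, False),
]

def error_rates(threshold: float) -> tuple[int, int]:
    """Return (lawful posts removed, abusive posts missed) at a threshold."""
    false_pos = sum(1 for s, bad in POSTS if s >= threshold and not bad)
    false_neg = sum(1 for s, bad in POSTS if s < threshold and bad)
    return false_pos, false_neg

for t in (0.3, 0.5, 0.7):
    fp, fn = error_rates(t)
    print(f"threshold {t}: {fp} lawful removed, {fn} abusive missed")
```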

Peterson: Talk is cheap; deeds are precious. The Mississippi AG's concerns about prescription drugs, for which he sent Google CIDs, were rejected, and Google went immediately to 230. The message to AGs: you won't see behind our walls. Tired of good intentions; would prefer cooperation.

Q: carveouts for federal prosecution?

Peterson: We work w/DOJ a lot; we complement each other. We can deal with smaller operations where DOJ may not have bandwidth. [Smaller operations … like Google?] The request to add states/territories to the exclusion is important b/c a lot of these are small operators. [There's a lot of slippage here: is there a website that is just one guy trafficking that isn't also a content provider?]

Franks: No one is saying Zuckerberg is responsible for murder, but there is accomplice/collective liability. [So FB is responsible for murder?] Intermediaries aren't directly causing the harm, but they're promoting, facilitating, and profiting from it. Collective responsibility: it takes a village to harass, cause a mass shooting, or use revenge porn. There's no need for a complete difference from real-world rules.

Q: Encryption and CSAM: even if services don’t want it, they can’t see it.

Schruers: The volume of reports shows that's not the case. These aren't the only threats: beyond problematic content, fraud, crime, and foreign adversaries mean that other tech tools are required, one of which is encryption. Safe communication protects user info: the 82d Airborne, deployed amid tensions with Iran, is using the E2E app Signal, widely used for secure communications, because overseas communication networks could be penetrated and transmissions b/t gov't devices aren't secure. Encryption has a variety of lawful purposes: protestors, jurisdictions w/problems w/the rule of law. Balancing needs to be done, but encryption is a critical tool.
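[A minimal sketch of what E2E means for screening, using the PyNaCl library as an illustrative stand-in (Signal's actual protocol is far more elaborate, adding ratcheting for forward secrecy): only the endpoints hold the keys, so a platform relaying the message sees only ciphertext and has nothing to scan.]

```python
# Minimal end-to-end encryption sketch using PyNaCl (pip install pynacl).
# Illustrative only; the point is that the relay never sees plaintext.
from nacl.public import PrivateKey, Box

alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob's public key; only Bob's private key can decrypt.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at 6")

# A platform relaying `ciphertext` sees only random-looking bytes, so it
# cannot screen the content; any detection must happen at the endpoints.
receiving_box = Box(bob_key, alice_key.public_key)
assert receiving_box.decrypt(ciphertext) == b"meet at 6"
```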

Q: FB Messenger could hide millions of reports.

Souras: E2E is necessary for some things but there has to be a balance. 17 million reports: if we were in E2E environment for Messenger we’d lose 12 million reports—children raped, abused, enticed undetected. There has to be a compromise w/encryption rollout, or we lose 12 million children. [Each report apparently reflects a different child. It is clearly correct to say that encryption can be used for bad things as well as good. But the whole day I never heard anyone explain what the balance would be if we have to balance: do we only allow people we trust to use encryption? How does that work, especially given what we know about how trust can be abused? Do we only allow financial services to use encryption? How does that work? I don’t know whether encryption does more harm than good or how you’d even weigh the bads against the goods. But “there must be a balance” is not a plan.]

Klonick: PhotoDNA worked for a while; deplatforming means that groups move and get smaller and narrower. Encryption does allow that. Autocrats have learned to use platforms for surveillance and harm, and E2E helps protect against that too. We need to think about the full ramifications.
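[For context, a rough sketch of the hash-matching idea behind PhotoDNA. The real algorithm is proprietary; this toy "average hash" is only a stand-in for the core technique: reduce an image to a compact fingerprint that survives resizing and recompression, then compare uploads against a database of known-bad fingerprints. E2E encryption defeats this kind of server-side matching because the server never sees the image.]

```python
# Toy perceptual-hash matcher in the spirit of PhotoDNA (whose real,
# proprietary algorithm is far more robust). Requires Pillow.
from PIL import Image

def average_hash(img: Image.Image, size: int = 8) -> int:
    """Shrink to size x size grayscale; each bit = pixel above the mean."""
    pixels = list(img.convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (p > mean)
    return bits

def hamming(a: int, b: int) -> int:
    """Count of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Demo with a synthetic stand-in image: a "known" image and a rescaled
# copy, the kind of trivial edit that defeats exact cryptographic hashes.
known = Image.radial_gradient("L")      # placeholder "known" image
reupload = known.resize((128, 128))     # rescaled re-upload

known_hashes = {average_hash(known)}    # the hash database
flagged = any(hamming(average_hash(reupload), k) <= 5 for k in known_hashes)
print("flagged:", flagged)              # True: fingerprint survives resizing
```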

Q: should 230 be amended?

Klonick: 230 works as intended. It was not just for startups: it was explicitly for telecoms, libraries, and public schools. Nor was encryption not contemplated: 1996 was mid-Crypto Wars I. Lots of evidentiary sources exist outside of encryption, and these are critical tools against other, equally serious threats. It would be a mistake to amend.

Peterson: Our proposal is simple: give us ability to support criminal laws.

Schruers: 230 doesn't prevent law enforcement action by states against the underlying actors; it prevents action against ISPs for third-party content. If ISPs are direct actors, states can go after them too. It's fundamentally an interstate commerce protection: services should be dealt w/at the federal level. If the answer is resources, provide more federal resources.

Peterson: Let us go after bad actors aiding/abetting criminal acts to clean up the industry instead of waiting for the industry to clean itself up.

