Friday, September 13, 2019

9th Circuit drives big hole through 230(c)(2) immunity


Enigma Software Group USA, LLC v. Malwarebytes, Inc., --- F.3d ----, 2019 WL 4315152, No. 17-17351 (9th Cir. Sept. 12, 2019)

Section 230(c)(2) “immunizes computer-software providers from liability for actions taken to help users block certain types of unwanted, online material,” including sex, violence, and material that is “otherwise objectionable.” “We have previously recognized that the provision establishes a subjective standard whereby internet users and software providers decide what online material is objectionable.”

The parties compete in the market for software that helps internet users filter unwanted content from their computers. Enigma alleged that Malwarebytes violated the Lanham Act and New York state law by configuring its software to block users from accessing Enigma’s software in order to divert Enigma’s customers. The district court found this covered by 230(c)(2), but because the parties are competitors, the majority (over a dissent) disagreed.  “Otherwise objectionable” isn’t broad enough to encompass an anticompetitive motive.  Malwarebytes argued that its reasons were legitimate, but Enigma’s allegations of anticompetitive animus were sufficient to avoid a motion to dismiss.

Eric Goldman is gonna hate that.

The court also, correctly, held that the mere fact that Enigma’s claim arose under the Lanham Act didn’t bring it within §230’s exception for “any law pertaining to intellectual property.” The Lanham Act covers both trademark and false advertising claims; the former fall within the IP exception, the latter don’t.

In its recitation of the legislative history and caselaw, the majority drops a line that is going to prove particularly destructive of (c)(2) immunity: “What is clear to us from the statutory language, history and case law is that the criteria for blocking online material must be based on the characteristics of the online material, i.e. its content, and not on the identity of the entity that produced it.”  (What happens when a provider says “this entity has produced objectionable content in the past and we are therefore going to screen material from this entity”?  Does the majority really mean that screening has to be applied on an item-by-item basis?  Does that mean you can’t block an entire website, perhaps even for things that are explicitly listed in (c)(2) like violent content, unless each page has objectionable content, since blocking an entire website focuses on the entity?  Honestly, I can see a case for that rule—but it seems like something we should talk about.)  Where the OSP at issue is a host, however, the identity of the entity that produced the content is a classic publisher consideration and (c)(1) immunity should be unaffected.  Eric Goldman has identified a shift from §230(c)(2) to (c)(1) in many situations where (c)(2) could in theory apply; this language will only harden that shift.

Facts: Malwarebytes software searches for what it calls Potentially Unwanted Programs (PUPs), including software that contains “obtrusive, misleading, or deceptive advertisements, branding or search practices.” If the user tries to download a program that Malwarebytes has determined to be a PUP, a pop-up alert warns the user of a security risk and advises the user to stop the download and block the potentially threatening content. “In their first eight years as competitors, neither Enigma nor Malwarebytes flagged the other’s software as threatening or unwanted. In late 2016, however, Malwarebytes revised its PUP-detection criteria to include any program that, according to Malwarebytes, users did not seem to like…. Malwarebytes’s software immediately began flagging Enigma’s most popular programs—RegHunter and SpyHunter—as PUPs.” Enigma alleged that its programs are “legitimate” and “highly regarded,” that they “pose no security threat,” and that it has lost customers and goodwill from Malwarebytes’s deceptive practices.

Judge Fisher’s concurring opinion in the 9th Circuit’s previous case considering §230(c)(2), Zango, warned that extending immunity beyond the facts of that case could “pose serious problems,” allowing a content provider to “block content for anticompetitive purposes or merely at its malicious whim.” District courts have disagreed on whether Malwarebytes can be sued for its blocking and how expansive Zango is. Allowing Malwarebytes to block based on “anticompetitive” motives would be “contrary to CDA’s history and purpose,” which included an express congressional aim “to preserve the vibrant and competitive free market that presently exists for the Internet and other interactive computer services” and to “remove disincentives for the development and utilization of blocking and filtering technologies.”

The point was to help consumers, who “must trust that the provider will block material consistent with that user’s desires. Users would not reasonably anticipate providers blocking valuable online content in order to stifle competition.” Immunizing anticompetitive blocking would therefore [?] also conflict with the express policy of “removing disincentives for the utilization of blocking and filtering technologies.”

However, “otherwise objectionable” was broader than the rest of the categories in the statutory list: “obscene, lewd, lascivious, filthy, excessively violent, harassing or otherwise objectionable.” Thus, the majority rejected Enigma’s argument that, because its software contains none of those kinds of content, Malwarebytes definitively couldn’t claim immunity for blocking it. Under ejusdem generis, when a generic term follows specific terms, “the generic term should be construed to reference subjects akin to those with the specific enumeration.” But the specific categories listed in § 230(c)(2) “vary greatly: Material that is lewd or lascivious is not necessarily similar to material that is violent, or material that is harassing. If the enumerated categories are not similar, they provide little or no assistance in interpreting the more general category.”  Anyway, even if ejusdem generis did apply, Enigma’s interpretation failed. Congress identified “harassing” as one of the problematic categories, and spam, malware and adware are close enough. [So if Malwarebytes succeeds in showing that it reasonably categorized Enigma’s software as such, that’s enough to win.] But the majority wasn’t making a final ruling on the relationship between “otherwise objectionable” and the other listed categories. It’s merely that “if a provider’s basis for objecting to and seeking to block materials is because those materials benefit a competitor, the objection would not fall within any category listed in the statute and the immunity would not apply.” Key takeaway: now we fight about what else is like “anticompetitive” and thus not legitimately “otherwise objectionable,” since the majority has left the issue open (except to the extent you think identity v. content is the holding).

Malwarebytes argued that it had legitimate reasons for its acts and that Enigma’s programs, SpyHunter and RegHunter, use “deceptive tactics” to scare users into believing that they have to download Enigma’s programs to prevent their computers from being infected. This is a factual dispute.

Judge Rawlinson dissented. The CDA is broadly worded; Congress hasn’t acted to clarify it; and the statute should be applied according to its provisions. “[N]othing in the statutory provisions or our majority opinion in Zango supports” limiting (c)(2) when the parties are competitors. “The majority’s real complaint is not that the district court construed the statute too broadly, but that the statute is written too broadly. However, that defect, if it is a defect, is one beyond our authority to correct.” The dissent pointed out that, although the parties in Zango weren’t direct competitors, the plaintiff asserted similar anticompetitive effects, but that didn’t matter there.

1 comment:

  1. “What is clear to us from the statutory language, history and case law is that the criteria for blocking online material must be based on the characteristics of the online material, i.e. its content, and not on the identity of the entity that produced it.”

    Seems like that's counting on "harassing" to pick up all of the spammers, astroturfers, duplicate-content posters and others of that ilk whose content is moderated out based - at least primarily - on their identities.
