The most typical case implicating Section 230 of the Communications Decency Act (CDA) involves a provider that hosts content and a third-party plaintiff seeking to have that content removed. Last month, in a less typical case, a New York magistrate judge dismissed, with prejudice, discrimination and related claims against video-sharing website Vimeo, Inc. (“Vimeo”) based on Vimeo’s termination of a user account for posting objectionable videos. (Domen v. Vimeo, Inc., No. 19-08418 (S.D.N.Y. Jan. 15, 2020)).
Vimeo’s platform terms prohibit, among other things, content that “[c]ontains hateful, defamatory, or discriminatory content or incites hatred against any individual or group.” The terms also reference Vimeo’s Guidelines, which state that moderators will generally remove, among other things, videos that promote sexual orientation change efforts. Vimeo deemed the videos at issue to promote sexual orientation change efforts.
The plaintiff, the founder of a religious organization, challenged Vimeo’s decision to remove his account, alleging that Vimeo censored plaintiff’s videos and violated New York and California anti-discrimination statutes. The court found that Vimeo was immune from plaintiff’s claims under two aspects of CDA immunity: the most commonly pleaded, § 230(c)(1), which provides immunity for “online publishers” of third-party content, and § 230(c)(2), the “Good Samaritan” screening provision, which immunizes providers for good faith actions to police objectionable content. The Vimeo court’s application of both provisions of the CDA is important for online providers that want to regulate third-party content without fear of liability.
Publisher Immunity
In dismissing the claims based on § 230(c)(1), the New York court agreed with precedent from other circuits holding that users who seek to impose liability on online platforms for terminating their accounts are treating those platforms as a “publisher” of third-party content. Thus, the court found that Vimeo was acting as a “publisher” when it removed plaintiff’s content from its platform: “[S]ection 230 ‘bars lawsuits seeking to hold a service provider liable for its exercise of a publisher’s traditional editorial functions—such as deciding whether to publish, withdraw, postpone or alter content.’”
Good Samaritan Immunity
Section 230(c)(2), the Good Samaritan provision, allows online providers to self-regulate third-party content without fear of liability. Section 230(c)(2)(A) grants immunity to interactive computer service providers that act in good faith to “restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected.” As the court noted, the section does not require that the material actually be objectionable; rather, it affords protection for blocking material that the provider or user considers to be “objectionable.” Section 230(c)(2)(A) does, however, require that providers act in “good faith” in screening such content. Accordingly, the court held that the Good Samaritan provision of the CDA independently shielded Vimeo from liability for actions it voluntarily took to restrict access to plaintiff’s materials that it found objectionable, and the court rejected as unfounded plaintiff’s allegations that Vimeo acted in bad faith in applying its Guidelines to remove the videos.
It should be noted that the Vimeo holding is the second interpretation of the “Good Samaritan” provision in the past few months. On December 31, 2019, the Ninth Circuit issued an amended opinion in Enigma Software Group USA, LLC v. Malwarebytes, Inc., No. 17-17351 (9th Cir. Dec. 31, 2019), a case involving competing providers of filtering software. In that case, Enigma Software Group USA, LLC (“Enigma”) claimed that Malwarebytes, Inc. (“Malwarebytes”) configured its software to block users from accessing Enigma’s software in order to divert Enigma’s customers. Malwarebytes countered that its software legitimately classified some of Enigma’s offerings as “potentially unwanted programs” and that its classification of Enigma’s software as “objectionable” was protected by the second prong of the CDA’s Good Samaritan provision, § 230(c)(2)(B). Specifically, § 230(c)(2)(B) states: “No provider or user of an interactive computer service shall be held liable on account of […] any action taken to enable or make available to information content providers or others the technical means to restrict access to [objectionable] material…” In reversing the lower court’s dismissal of the claims under the CDA, the Ninth Circuit held that “the phrase ‘otherwise objectionable’ does not include software that the provider finds objectionable for anticompetitive reasons.” The Ninth Circuit concluded that providers have broad immunity for filtering actions under the Good Samaritan provision, § 230(c)(2)(B), but not “unfettered discretion to declare online content ‘objectionable.’”
In light of the intense scrutiny the CDA is facing from the media and Congress, Section 230(c)(2) may offer social media platforms an avenue to increase their filtering of third-party content without waiving any of their protections under the CDA or setting a bad precedent. Perhaps stepped-up enforcement of their applicable terms of use, together with more vigorous review and removal procedures – all protected under the CDA – will help reduce the amount of harmful, objectionable, and deceptive content currently available on social media platforms.