YouTube Protected by CDA Immunity over Claims That It Provided Material Support to Terrorists
Tuesday, November 14, 2017

Following the reasoning of several past decisions, a California district court dismissed claims against Google under the Anti-Terrorism Act (ATA), 18 U.S.C. § 2333, for allegedly providing "material support" to ISIS by allowing terrorists to use YouTube (at least temporarily, before known accounts were terminated) as a tool to facilitate recruitment and commit terrorism. (Gonzalez v. Google, Inc., 2017 WL 4773366 (N.D. Cal. Oct. 23, 2017)). The court rejected the plaintiffs' arguments that Google provided the terrorists with material support by allowing them to sign up for accounts (or regenerate shuttered accounts) and by allegedly serving targeted ads alongside the videos they posted. It ruled that even careful pleading cannot change the fact that, in substance, the plaintiffs' attempt to hold Google liable as a publisher of the terrorists' detestable content was barred by Section 230 of the Communications Decency Act ("CDA Section 230" or "CDA").

Congress provided broad immunity under Section 230 to online service providers for claims stemming from third-party content appearing on or through the provider's platform ("No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."). Generally speaking, courts have construed the immunity provisions of Section 230 broadly in cases arising from the publication of user-generated content. In short, Section 230(c)(1) protects: (a) a provider or user of an interactive computer service (b) that the plaintiff seeks to treat as a publisher or speaker (c) of information provided by another information content provider.

The plaintiffs, family members of a victim of the 2015 Paris attacks, sought to hold Google liable under the ATA for providing material support to ISIS via its YouTube platform. Even though Google regularly suspends or blocks extremist content, the plaintiffs claimed that Google did not make "substantial or sustained efforts to ensure that ISIS would not re-establish the accounts using new identifiers." Google moved to dismiss all claims on the ground that CDA Section 230 bars any claim that seeks to hold an online service provider liable for injuries allegedly resulting from its hosting of third-party material. According to Google, the plaintiffs' theory that it did not do enough to remove extremist content and associated accounts necessarily targeted Google's "traditional editorial functions," such as deciding whether to publish, withdraw, postpone, or alter content. In response, the plaintiffs contended that Google was not entitled to CDA immunity because the statute does not apply to extraterritorial acts and because, based upon certain of its actions, Google should be considered the creator of the videos at issue.

Ultimately, though highly sympathetic to the plaintiffs, the court dismissed the complaint, with leave to amend, ruling that the claims were barred by CDA Section 230. The court first rejected the argument that the CDA does not apply extraterritorially to international events, finding that the scope of CDA immunity is determined not by the location of the relevant events but by where redress is sought (i.e., U.S. courts).

The court also found that Google was protected by CDA immunity because the plaintiffs' claims implicated the defendant's role, broadly speaking, as the publisher or speaker of ISIS's YouTube videos. In an attempt to bypass the CDA, the plaintiffs argued that their claims were based not on the content of the videos or the failure to remove them, but on Google's provision of YouTube accounts to ISIS. The court found that such a theory seeks to penalize Google for its decision to permit third parties to post content, which treats Google as the publisher of the extremist videos. The court also held that seeking to impose liability on Google for allowing users to recreate accounts that Google had already disabled (or for failing to adopt a strategy to defeat tactics such as account reconstitution and bulk friend/follow requests) likewise treats Google as the publisher of the content, because Google's choices as to who may use its platform are inherently bound up in its decisions as to what may be said on its platform. As echoed in precedent, decisions relating to the monitoring, screening, and deletion of content are actions quintessentially related to a publisher's role.

Lastly, the court could find "no support in the case law" for the plaintiffs' argument that Google acts as an "information content provider" by allegedly placing targeted ads next to the videos at issue. In the court's reasoning, "Google's provision of neutral tools, including targeted advertising, does not equate to content development under Section 230," because, as currently alleged, the tools did not encourage the posting of unlawful material or "materially contribute" to the content's alleged unlawfulness.

Following the ruling, the plaintiffs filed a third amended complaint in an attempt to plead a cognizable claim that could survive a CDA defense.

The presence of content that promotes terrorism on social media sites has stirred up a host of legal, moral, and technological issues for social media platforms. In recent years, after calls for greater action to combat the spread of online terrorist videos and social media accounts, the major social media services redoubled efforts to take down terror-related content on their platforms. For example, Google announced this past summer its continued commitment to improving its systems to more quickly detect and remove violent extremist content on YouTube, and other social media services have removed hundreds of thousands of terrorist accounts through improved automated systems and the hiring of more human reviewers. Yet pressure continues from governments, advertisers, and others to flag extremist content more rapidly, preclude reposting of the same offensive content, and otherwise foil efforts by terrorist groups to use social media as a propaganda tool. On the legal front, with members of Congress threatening more regulation of the major social media platforms, there is clearly momentum behind an effort to make social media platforms more responsible for making this type of content available. That momentum is evidenced by S.1693 (commonly known as SESTA), which cleared the Senate Commerce Committee last week and would limit CDA immunity for online services that knowingly host third-party content related to sex trafficking. However, unless there is a legislative amendment to Section 230 or a fundamental change in judicial interpretation of the statute, it seems that social media platforms will remain protected by CDA immunity from lawsuits such as Gonzalez.

We will keep you posted on the status of SESTA and any other developments that address this issue.
