The Communications Decency Act and the DOJ’s Proposed Solution: No Easy Answers
Friday, June 19, 2020

Section 230 of the Communications Decency Act (“CDA”), 47 U.S.C. §230, enacted in 1996, is often cited as the most important law supporting the Internet, e-commerce and the online economy. Yet, it continues to be subject to intense criticism, including from politicians on both sides of the aisle. Many argue that the CDA has been applied in situations far beyond the original intent of Congress when the statute was enacted. Critics point to the role the CDA has played in protecting purveyors of hate speech, revenge porn, defamation, disinformation and other objectionable content.

Critics of the CDA raise valid concerns.  But what is the right way to address them? One must remember that for organizations that operate websites, mobile apps, social media networks, corporate networks and other online services, the CDA’s protections are extremely important.  Many of those businesses could be impaired if they were subject to liability (or the threat of liability) for objectionable third party content residing on their systems.

The criticism surrounding the CDA reached a fever pitch on May 28, 2020, when the President weighed in on the issue by signing an Executive Order attempting to curtail legal protections under Section 230. While the Executive Order was roundly labeled political theater – and is currently being challenged in court as unconstitutional – it notably directed the Justice Department to submit draft proposed legislation (i.e., a CDA reform bill) to accomplish the policy objectives of the Order. This week, on June 17, 2020, the DOJ announced “that the time is ripe to realign the scope of Section 230 with the realities of the modern internet” and released a document with its recommendations for legislative reform of Section 230. This comes on the heels of a recent initiative by several GOP lawmakers to introduce their own version of a reform bill.

In its recommendations, the DOJ stated that, in its view, the benefits of CDA immunity to the online ecosystem should not come at the cost of enabling serious offenses while leaving affected individuals without a more substantial form of redress. The DOJ believes that Section 230 should distinguish between its “core immunity” for defamation-type torts and claims against providers that knowingly host, or are willfully blind to, third-party content that facilitates federal criminal activity.

The DOJ’s 28-page recommendations outline several proposals for reforms that, in its view, provide “incentives for online platforms to address illicit material on their services, while continuing to foster innovation and free speech.”  Some highlights include the following:

  • Enact “Bad Samaritan” Carve-Outs. The DOJ suggests that the CDA should not be extended to so-called “bad actors.”  This would include explicit carve-outs for civil claims concerning “egregious conduct,” such as child exploitation, terrorism and cyber-stalking, or for instances where a platform purposefully facilitates or solicits third-party content or activity that would violate federal criminal law. In particular, the recommendations propose that online platforms should not be protected by the CDA in cases where the platform “had actual knowledge or notice that the third party content at issue violated federal criminal law or where the platform was provided with a court judgment that content is unlawful in any respect.”  To assist providers with gaining knowledge of illegal content, the DOJ recommends that providers be required to offer users a tool to flag such content. The proposal also suggests that the law be amended to expressly state that the CDA would not prevent the federal government from pursuing civil enforcement actions against platforms regarding unlawful content. Of course, as noted by the DOJ, the CDA currently precludes immunity in a federal criminal enforcement action.
  • Narrow Good Samaritan Protections. This proposal would narrow the Good Samaritan protections for online providers under 47 U.S.C. §230(c)(2) by eliminating the broad catchall in the statute that affords providers leeway in filtering decisions to screen out “objectionable content” and replacing it with examples of content that can be screened out with impunity, namely content that is “unlawful” or “promotes terrorism.” The proposed change would also define “good faith” within §230(c)(2) to limit content moderation decisions to those made in accordance with a platform’s online terms of service, and would expressly provide that content removal done in accordance with those terms would not render a platform the publisher or speaker of third party content.
  • Preserve Antitrust Claims. The proposal also suggests that the Good Samaritan provisions of the CDA, which protect providers from suits over the filtering of objectionable content and over the offering of filtering tools to others, should not be interpreted as barring antitrust-related claims.

The Difficulty of Striking the Proper Balance

It remains to be seen how much traction the DOJ’s proposal gets in Congress, particularly in an election year and with legislators currently focused on the virus response and the economic slowdown. Certainly, the most ambitious attacks on CDA immunity appear, at first glance, counterproductive when it comes to “cleaning up the internet” and would not be widely embraced. However, a bipartisan group of legislators could perhaps coalesce around a narrow set of limitations to CDA immunity that root out troubling content circulated online without otherwise affecting a dynamic online realm. Even this modest goal can be difficult to achieve, as it is not always apparent which providers are the “bad actors” cited by the DOJ.

So, where should the line be drawn?  The passage of FOSTA drew one line, and the DOJ’s proposal urges that additional areas of undesirable third party content should also be excised from the CDA’s protection. Are we comfortable with the fact that the vibrancy of the internet may be diminished, that “good” platforms that foster the online marketplace of ideas would likely be adversely impacted and possibly forced out of business, and that the online experience will be more limited?  Are we comfortable with a legislative definition of “good” and “bad” content? Or, rather, for the greater good of our society and to sustain a greater exchange of ideas online, are we willing to maintain the CDA’s broad immunity, even knowing that occasionally those of questionable ethics or moral standing will benefit?

Some Illustrations of the Challenge

A review of the decades of CDA decisions clearly illustrates the problem we face in navigating this issue, as broad immunity has been afforded to both the vile and the virtuous.  Indeed, as the DOJ states, under the current expansive interpretation of Section 230, even websites designed to promote or facilitate illegal conduct can still enjoy the protections of Section 230 immunity (citing both the Armslist case, where the Wisconsin Supreme Court ruled that the CDA barred tort claims against a classified advertising website where an individual purchased what turned out to be a murder weapon from a private seller, and the Sixth Circuit’s TheDirty ruling, which found that the CDA protected a trashy gossip site from defamation claims).  But these cases do not tell the entire story, for online platforms do not always fall into easy categories. For example, a review of two recent cases, one applying the CDA in another matter involving TheDirty and the other applying the CDA to protect a #MeToo era whisper network, illustrates the difficulty in drawing that line.

TheDirty is a website that has served as the poster child for what some people view as the CDA’s overbroad application to protect less-than-honorable conduct by service providers. We last wrote about the site with respect to the decision cited by the DOJ in its recommendations. In that 2014 opinion, the Sixth Circuit ruled that even though the gossip site selected and edited user-generated posts for publication and added non-defamatory, albeit sophomoric, comments following each post, the site was protected by CDA immunity because it was neither the creator nor the developer of the challenged defamatory posts and did not materially contribute to the defamatory nature of the user postings. This past April, an Arizona district court dismissed another lawsuit against the operator of TheDirty for hosting an allegedly defamatory third party posting. (Laake v. Dirty World LLC, No. 19-05444 (D. Ariz. Apr. 14, 2020)). In that case, the plaintiff brought defamation and copyright claims against the website operator, Dirty World, LLC (“Dirty World”), for displaying an anonymous post that contained false claims about the plaintiff alongside the plaintiff’s own acting headshot, which the anonymous poster had apparently screen-grabbed from the web. The complaint argued that Dirty World “specifically encourages” users to upload “dirt” and unfounded accusations, and speculated that perhaps Dirty World was the actual author of the post.

The content on TheDirty can be characterized as offensive and misogynistic, appealing to a particular audience, and as TheDirty has proven, a site of its nature can use the CDA to shield itself from civil liability for the third party content that it hosts. Simply put, as an interactive computer service displaying third party content, it was entitled to the protection of CDA Section 230. But if Congress decided to carve providers like TheDirty out of the CDA, the amendment could sweep away protection for users and operators of other social media or user communication platforms, including sites that are the antithesis of TheDirty.  For example, in one recent case, Comyack v. Giannella, No. SOM-L-1356-19 (N.J. Superior Ct., Somerset Cty, Apr. 21, 2020), the court dismissed defamation claims against members of a women’s network that disseminated and republished third party reports about the plaintiff-bartender’s allegedly predatory behavior that had previously been posted on social media and an industry website/bulletin board. Among other things, the court applied CDA Section 230 to dismiss the defamation claims, finding that the statute barred such claims because the network contained mere republications of the original non-party posts (with limited or no commentary) and the members of the women’s network could be considered “users” of an interactive computer service.  The court rejected the plaintiff’s argument that the network members should not be protected by the CDA, concluding that the fact that the defendants “reposted messages originated by anonymous users or users whose identities were known to them does not affect the broad immunity granted to them by the CDA.”

As these cases illustrate, how will the judgment call be made as to what is “good” content and what is “bad” content?

* * *

These are just a few of the many CDA cases decided in recent years, but they illustrate how difficult it is to surgically amend the CDA to eliminate “bad” content while maintaining robust protection for the beneficial internet.

Whatever the approach, it will not be easy. Congress has lately had trouble reaching consensus on technology-related and privacy-related legislation, so the DOJ’s CDA reform proposal may not advance toward any meaningful debate.  The drumbeat calling for some regulation of the excesses of the internet has been steady for several years, however, so it’s not entirely improbable that a narrow CDA reform bill could see support (recall, Congress overwhelmingly passed FOSTA in 2018). Still, akin to the First Amendment, regulating freedom can be a messy affair, and the carve-outs for unwanted content that the DOJ proposes, if not carefully drafted, could have far-reaching effects. Other proposals heard over the years for a content takedown regime modeled on the DMCA safe harbors could be misused.  Remembering the adage that “bad facts make bad law,” we think it is important for the dialogue on amending the CDA to be robust and to take into consideration all of the viewpoints at issue, rather than be a knee-jerk political reaction to any particular distasteful application of the law.

One thing is certain. The CDA – intended to serve as a shield against liability – has been the source of countless legal disputes over the years in often-doomed attempts to pierce that shield.  If that shield is ultimately chipped away by Congress, the litigation surrounding the CDA – including both frivolous, new-fangled attempts at eroding what’s left of the immunity, as well as meritorious actions against no-longer-protected platforms – will ultimately mushroom.

As a result, the only certain result of any amendment to the CDA will be more litigation.
