The appetite for acquisitions and investment in online businesses has never been stronger, and many of the most attractive opportunities are businesses that host, manage and leverage user-generated content. These businesses often rely on the immunities offered by Section 230 of the Communications Decency Act, 47 U.S.C. §230 (“Section 230” or the “CDA”), to protect them from liability associated with receiving, editing, posting or removing such content. Investors, in turn, have relied on the existence of robust immunity under the CDA in evaluating the risk associated with investments in such businesses. That reliance has seemed reasonable: following the landmark 1997 Zeran decision, courts have for more than two decades fairly consistently interpreted Section 230 to provide broad immunity for online providers of all types.
In the last five years or so, however, the CDA’s bipartisan critics have become more full-throated in decrying the presence of hate speech, revenge porn, defamation, disinformation and other objectionable content on online platforms. The issue has been building throughout the past year and reached a fever pitch in the weeks leading up to the election. The government’s zeal for “reining in social media” and pursuing reforms to Section 230, again on a bipartisan basis, has come through loud and clear (even if the justifications for reform differ along party lines). While we cannot predict exactly what form these reforms will take, the political winds suggest that, regardless of the administration in charge, change is afoot for the CDA in late 2020 or 2021. Operating businesses should take note, and investors should keep this in mind when conducting diligence reviews of potential investments.
Let’s recap the latest developments:
- In late July, the Commerce Department submitted a petition requesting that the FCC write rules to limit the scope of CDA immunity and potentially place additional compliance requirements on many providers that host third-party content. On October 15, 2020, FCC Chairman Pai issued a statement indicating that he intends to move forward with a rulemaking to clarify certain aspects of the CDA, namely the relationship between the better-known §230(c)(1) “publisher” immunity for hosting third-party content and the less-utilized §230(c)(2) “Good Samaritan” immunity for filtering objectionable content, as well as when content removals are done in “good faith.”
This is an interesting development, as the language of Section 230 does not contemplate FCC rulemaking, and none had been undertaken in the nearly 25 years since the statute’s passage. However, in the week following Chairman Pai’s statement, the FCC General Counsel posted a statement that, in his opinion, the FCC has the authority to interpret all provisions of the Communications Act, including amendments such as Section 230. Of course, as a practical matter, whether any such rules are ever promulgated may depend on the results of the election (or even the makeup of the next Congress in January, when legislators could attempt to deploy the Congressional Review Act, 5 U.S.C. §801, a tool Congress can use to overturn certain federal agency actions). Given the bipartisan interest in scaling back the CDA, it is possible that even if there is a change in administration, the FCC rulemaking could still proceed or serve as a model for legislative amendments to the CDA.
- On October 28, 2020, the Senate Committee on Commerce, Science and Transportation held a hearing questioning the major social media CEOs about, among other things, Section 230 reform, the accountability of the big technology companies, the flagging and restriction of certain election-related content and disinformation that violates platform content policies, and the state of local journalism in the online space. This comes on the heels of a July 29, 2020 hearing before the House Subcommittee on Antitrust, Commercial, and Administrative Law about online platforms and market power. If anything, the latest hearing highlighted that both parties have some appetite for CDA reform (and that passage of a limited reform bill is more likely than it has ever been), but, partisan rhetoric aside, it is not clear exactly what concrete policy changes would garner a consensus in Congress.
- Speaking of legislative efforts at reform, there is currently a stack of CDA reform bills in Congress, including, most notably, a proposal submitted by the DOJ last month. On October 20, 2020, two Democratic members of Congress added to the pile with the “Protecting Americans from Dangerous Algorithms Act,” which would compel large platforms to make changes to limit the algorithmic amplification of harmful, radicalizing content that leads to violence.
- Even the Supreme Court has entered the debate. On October 13, 2020, the Court denied a cert. petition to review a Ninth Circuit decision that had derived an implied exception to CDA Section 230(c)(2)(B) “Good Samaritan” immunity for blocking or filtering decisions alleged to be “driven by anticompetitive animus.” While agreeing with the Court’s decision to deny cert., Justice Thomas issued a statement explaining why, “in an appropriate case,” the Court should consider whether the text of the CDA “aligns with the current state of immunity enjoyed by Internet platforms,” even suggesting that courts interpreting the CDA over the years may have been “reading extra immunity into statutes where it does not belong.”
The continuing attention to the CDA this year has come from all corners of the map, and we are hard-pressed to predict what lies ahead with respect to the current or any future administration’s efforts to limit the scope of the CDA. Yet, skimming the legislative and regulatory proposals and reading the tea leaves in Congress, there appears to be a focus on greater transparency in moderation decisions, more “due process”-like user rights regarding content removal and account termination decisions, more emphasis on terms of service surrounding the handling of user content, and a push by some to trim the immunity around the edges, or cut even deeper.
What does this mean for online businesses and potential investors in such businesses? Depending on the scope of changes to the CDA, the risk profile of companies that rely on third-party content may increase. Despite nearly twenty-five years of cases construing CDA immunity broadly, a rethinking of the law in Washington could remake the liability calculus (or at least create more administrative burdens and make it more difficult to quickly dismiss meritless lawsuits), threatening the vibrancy and viability of certain online services. One thing is certain: with likely hundreds of plaintiffs who have been frustrated by CDA immunity over the years, the lawsuits will flood in once that immunity is scaled back.
Businesses and investors should follow these developments closely. The long-held assumption that the CDA will always be there to shield against liability for third-party content can no longer be taken for granted. Online businesses, and the investors in them, take heed: it is not too early to start thinking about risk mitigation strategies should the scope of the immunity change.