Public’s M&A Comments Hold Clues for Agency Guidelines
Friday, May 19, 2023

1. Introduction

On January 18, 2022, the FTC and DOJ (the “agencies”) issued a joint request for information (RFI) asking for comments on how they should update their approach to merger enforcement. As with past RFIs, a variety of academics, attorneys, and other antitrust practitioners submitted detailed comments based on their experience and expertise in the field.[1] In this case, however, the general public also took an interest: over 5,000 comments were submitted,[2] compared with only 74 during the 2020 public comment period for the Vertical Merger Guidelines.

In a speech on September 13th at the Georgetown Antitrust Law Symposium, Assistant Attorney General Jonathan Kanter pointed to the large number of comments as an indicator of significant public interest in the process. He further noted their origins: from “workers and consumers and entrepreneurs who have no idea what double marginalization is, or how many assumptions you have to credit in order to conclude it would be eliminated by a merger.”[3]

Kanter’s characterization comes with an important caveat: the agencies did not publicly post many of the more than 5,000 comments, leaving open the possibility that the unposted submissions included duplicates or irrelevant material. But a large share of the comments that were posted do appear to have come from individuals who are not involved in antitrust policy or litigation. Of the 1,906 comments posted, 1,688 (89%) came from such individuals.

Although Kanter indicated that the agencies have read all of these comments and plan to pay attention to them when drafting new Guidelines, it would not be easy to determine what the comments collectively convey simply by reading them. Because their authors are largely not antitrust practitioners, the comments do not necessarily propose specific policy changes or recommendations. And given their sheer number, it would be difficult to identify themes just by reading through each comment in turn.

Fortunately, a variety of standard and state-of-the-art natural language processing (NLP) tools now exist that can help summarize these comments. NLP tools, in this context, can be used to sift through a large number of unstructured documents (e.g., memos, reports, or anything where relevant information is not clearly marked and categorized) to identify major themes. Standard NLP methods, such as Latent Dirichlet Allocation (LDA), look at how frequently words appear together in documents (e.g., “doctor,” “private,” and “equity”) to give the analyst a sense of the topics that may be important. In this article, we use LDA alongside newer methods based on Bidirectional Encoder Representations from Transformers (BERT), which take the additional step of looking at a word’s context in the document so that different uses of the word can be discerned (e.g., “doctor” can refer to a medical doctor, but also to the holder of a PhD or to the act of altering something).
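To make the topic-modeling step concrete, the sketch below fits a simple LDA model to a handful of invented comment texts using scikit-learn and prints the top words per topic. The sample comments, the number of topics, and the parameter choices are illustrative assumptions, not the actual pipeline or corpus used in this analysis.

```python
# A minimal sketch (not the authors' actual pipeline) of LDA topic modeling
# with scikit-learn. The sample comments and the number of topics are
# illustrative assumptions.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

comments = [
    "Hospital mergers have raised prices for patients in my area.",
    "Private equity firms are buying up physician practices.",
    "Google search results keep getting worse and more ad-heavy.",
    "Farmers have almost no choice of processors for their crops.",
]

# Build a document-term matrix of word counts.
vectorizer = CountVectorizer(stop_words="english")
dtm = vectorizer.fit_transform(comments)

# Fit a two-topic LDA model and print the top words for each topic.
lda = LatentDirichletAllocation(n_components=2, random_state=0)
lda.fit(dtm)

terms = vectorizer.get_feature_names_out()
for k, weights in enumerate(lda.components_):
    top_words = [terms[i] for i in weights.argsort()[::-1][:5]]
    print(f"Topic {k}: {', '.join(top_words)}")
```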

NLP tools can also analyze the “sentiment” of a block of text, determining whether it is generally positive or negative.[4] Using sentiment analysis tools, we found that less than 2% of the 1,688 general public comments had a positive tone. We further found broad consensus in favor of increased enforcement: only one of the 1,688 general public comments called for weaker enforcement.
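As an illustration of this kind of sentiment scoring, the sketch below runs a pretrained sentiment classifier from the Hugging Face transformers library over a few sample comments and computes the share labeled positive. The article does not specify which sentiment tool was used, so this is only one plausible setup, and the sample texts are invented.

```python
# A minimal sketch of sentiment classification with a pretrained model via the
# Hugging Face transformers pipeline. This illustrates the idea; it is not
# necessarily the tool used in the analysis described above.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default English model

comments = [
    "Mergers have destroyed competition and raised prices for everyone.",
    "I support the agencies' effort to update and strengthen the guidelines.",
    "Consolidation has left my town with one hospital and one grocery chain.",
]

results = classifier(comments)
positive_share = sum(r["label"] == "POSITIVE" for r in results) / len(results)
print(f"Share of comments labeled positive: {positive_share:.0%}")
```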

Beyond this general sentiment, our analysis revealed several common themes. First, a significant number (though not a majority) of the comments were driven by two mass campaigns, including one that offered a template and assistance with submission. Second, while many of the comments were fairly generic, invoking monopolies, mergers, and similar terms, several industries attracted particular interest: large numbers of comments addressed healthcare and large tech companies, with smaller but significant numbers addressing agriculture and internet service providers (ISPs).

Overall, while the general comments often discussed concerns in layman’s terms, many of the issues they raise, such as inflated prices, lack of consumer choice, lack of innovation, and depressed wages, tie into economic concepts that the Guidelines have long recognized as relevant to merger enforcement. These comments, naturally, rarely addressed specific technical debates about enforcement, such as the importance of structural presumptions, but recurring examples of specific industries and types of potential harm can shed light on how the commenters might broadly feel about these issues.

2. Mass Comment Campaigns

We found that roughly 37% of the 1,688 general public comments were generated by just two mass campaigns: one by the American Economic Liberties Project (AELP), an organization founded in 2020 to push for greater antitrust enforcement; and one by the American College of Emergency Medicine. Our programmatic text analysis identified 458 comments, or about 27% of all comments from the general public,[5] that appear to have been based on a common template offered on an AELP website, which also provided an easy process for submitting comments.[6] This percentage could be even larger among the full set of over 5,000 submissions: it is possible that the agencies did not post multiple submissions that exactly duplicated the template. Text analysis also identified 164 comments, or approximately 10% of the total, that appear to have been driven by a campaign by the American College of Emergency Medicine. These share template-like language patterns, but it is unclear whether a public online template for this campaign still exists.
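One simple way to flag template-based submissions programmatically is to measure how similar each comment is to a known campaign template, for example via TF-IDF cosine similarity. The sketch below shows that idea; the template text, sample comments, and 0.8 similarity threshold are illustrative assumptions, not the actual criteria used in this analysis.

```python
# A minimal sketch of flagging template-based comments by TF-IDF cosine
# similarity to a known campaign template. The template, comments, and
# threshold are illustrative assumptions.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

template = (
    "I urge the FTC and DOJ to strengthen the merger guidelines and block "
    "harmful consolidation across the economy."
)
comments = [
    "I urge the FTC and DOJ to strengthen the merger guidelines and block harmful consolidation.",
    "Hospital consolidation in my town has made emergency care unaffordable.",
]

vectorizer = TfidfVectorizer(stop_words="english")
matrix = vectorizer.fit_transform([template] + comments)

# Similarity of each comment to the template; values near 1 suggest a copy.
similarities = cosine_similarity(matrix[0], matrix[1:]).ravel()
flagged = [c for c, s in zip(comments, similarities) if s > 0.8]
print(f"{len(flagged)} of {len(comments)} comments closely match the template")
```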

The success of these mass campaigns may indicate that many people share the concerns they are intended to highlight. However, there may have been fewer submissions addressing these concerns absent the ease of submitting a template-based comment. While we do not treat them differently in our analysis, this is a useful caveat to keep in mind.

3. Detailed Analysis of Recurring Topics

Outside of the mass comment campaigns, the comments varied considerably in what they discussed and how they discussed it—leaving the agencies a daunting task, if they want to extract any general lessons on commenters’ concerns. Fortunately, the task of reviewing and finding patterns in the comments can be aided substantially by both standard and state-of-the-art text analysis tools.

To identify patterns, we applied both an “unsupervised” approach, in which we provided no input to the model other than the comments themselves, and a “semi-supervised” approach, in which we provided suggestions of likely topics as a starting point. The resulting groups allowed us to identify common themes across many comments, as well as topics that were rarely, if ever, discussed. As we explain below, these methods do not fully replace manual review—they are not precise enough to generate specific antitrust-related insights on their own. But they have significant advantages over an initial review by a human reader, who would need to continuously assign and reassign comments to groups in an attempt to find patterns systematically.
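The semi-supervised step can be illustrated with a very simple version of the idea: the analyst supplies seed keywords for candidate topics, and each comment is provisionally assigned to the topic whose seeds it matches most often. The seed lists and sample comments below are illustrative assumptions; dedicated guided topic-modeling tools refine this idea considerably.

```python
# A minimal sketch of the "semi-supervised" idea: analyst-supplied seed
# keywords per candidate topic, with each comment assigned to the topic whose
# seeds appear most often. Seed lists and comments are illustrative.
import re

seed_topics = {
    "healthcare": {"patient", "hospital", "physician", "medicine"},
    "big tech": {"google", "search", "ad", "platform"},
    "agriculture": {"farm", "farmer", "processor", "crop"},
}

comments = [
    "Hospital mergers have left patients with one choice and higher bills.",
    "Google keeps buying ad platforms and squeezing out competitors.",
]

for comment in comments:
    words = set(re.findall(r"[a-z]+", comment.lower()))
    scores = {topic: len(words & seeds) for topic, seeds in seed_topics.items()}
    best_topic = max(scores, key=scores.get)
    print(f"{best_topic}: {comment}")
```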

Our analysis revealed that healthcare was by far the most dominant topic. Nearly 25% of comments fell into a healthcare-related group, with representative words such as “patient,” “physician,” “medicine,” and “hospital.” While this may have been driven in part by the mass campaign, that campaign accounts for less than half of the healthcare comments. Further analysis of this group showed that “private equity” was mentioned in 52% of these healthcare-related comments, including the comments from the emergency physician mass campaign discussed above. Manual review suggests that commenters attributed a number of problems to private equity, including lower quality and reduced access to care. Some commenters additionally expressed the belief that acquisitions by private equity firms may escape notice because PE firms are not obviously direct competitors in the healthcare space.

The healthcare-related group was so large and distinctive that it had a substantial impact on the algorithmically generated topics. To gain further insight, we set the healthcare comments aside and re-applied the algorithms to the remaining comments. While the largest group remained fairly generic (representative words such as “mergers,” “corporate,” “monopoly,” and “economy”), some smaller distinct groups also emerged.

We identified two groups of comments focused on large tech companies such as Google. While at first glance these groups appear to discuss similar issues, one group commonly included the word “search” and the other commonly included the word “ad.” Manual review showed that the “search” commenters believe that large tech companies’ dominance of search engines has lowered the quality of search results. For example, several comments expressed the belief that Google’s search algorithm downgraded certain websites so that Google could more effectively monetize its search results. The “ad” commenters, by contrast, expressed general concerns over the consolidation of media, including the consolidation of advertising choices.

Other groups of comments focused on specific industries, notably ISPs and agriculture/farming. While the algorithmic results did not offer additional insight, a manual review of the comments showed concerns over perceived market concentration. For ISPs, several commenters expressed concerns over price and service quality that they believed stemmed from market concentration. Commenters also expressed frustration over the lack of choice. For agriculture, commenters expressed the belief that concentration among processors was squeezing farmers—a notion also expressed in comments from several agriculture industry producer groups.

The topic modeling methods we employed are designed to focus on keywords that distinguish topics from one another, which can leave keywords shared across topics out of view. To address this, we directly searched the comments’ text for specific words that would indicate a topic was discussed. For example, 407 comments (24%) mentioned wages (specifically the word “wage”), even though none of the topic modeling groups clearly focused on labor issues. A number of these comments, including those based on the AELP mass campaign template, linked low wages to increasing market concentration. Other comments in this group did not make this connection, but noted that wages were not keeping up with prices. Similarly, 187 comments (11%) expressed concerns that consolidation was reducing innovation—though many of these were also based on the AELP template.
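The direct keyword search described above amounts to counting comments that contain a given term. The sketch below does this for “wage” using a word-boundary regular expression; the sample comments are invented for illustration.

```python
# A minimal sketch of the direct keyword search: count how many comments
# mention "wage" or "wages". The sample comments are illustrative.
import re

comments = [
    "Consolidation has kept wages flat while prices keep rising.",
    "Fewer employers in town means a lower wage for everyone.",
    "Mergers reduce innovation and consumer choice.",
]

pattern = re.compile(r"\bwages?\b", re.IGNORECASE)
mentioning = [c for c in comments if pattern.search(c)]
print(f"{len(mentioning)} of {len(comments)} comments ({len(mentioning) / len(comments):.0%}) mention wages")
```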

4. Conclusion

A number of responses to the RFI from antitrust practitioners noted that the RFI itself suggested the agencies may already be leaning towards stronger enforcement. Programmatic tools show that the general public comments also lean in that direction. In some cases, large numbers of commenters even had specific items they would like to see from the agencies. For example, commenters had significant concerns about concentration in the healthcare industry, with private equity proving particularly salient (though this was aided by a mass campaign). Smaller numbers of general public commenters also expressed concerns over the influence of large tech companies or over concentration in specific industries, such as ISPs.

Beyond the large number of healthcare comments and the much smaller sets of comments addressing specific industries, there remained a large group expressing a general sense that large companies or concentrated industries lead to higher prices and worse service, drive smaller companies out of business, or wield too much power in society. The agencies may find it difficult to develop specific policy changes based on these comments.

This article was first published in Law360.  The views expressed herein do not necessarily represent the views of Cornerstone Research.  Authorized republication by Cornerstone Research. 


[1] We summarized many of these in a previous Law360 article, entitled “A Look At Public’s Divergent Views on New Merger Guidelines”.

[4] In general, sentiment analysis is the process of using computational tools to classify different pieces of text into positive, neutral, or negative sentiment. See, e.g., “Systematic reviews in sentiment analysis: a tertiary study,” SpringerLink.

[5] Because the agencies filtered out duplicate comments before posting, comments beyond the first exactly following the template are not analyzed. Thus, these statistics may undercount the total number of template-based comments.

[6] https://secure.everyaction.com/FNcO8Kqp_kGsVUT1sYEC1g2
