On March 20, 2025, U.S. District Judge Sharon Johnson Coleman approved a nationwide class-action settlement that grants class members a 23% equity stake in Clearview AI. The settlement, outlined in a 36-page order, is a first-of-its-kind resolution to privacy litigation involving alleged violations of Illinois’ Biometric Information Privacy Act (BIPA) and other state laws.
The case centered on Clearview AI’s collection and use of biometric data. The company, a U.S.-based facial recognition startup, built a database of more than 60 billion facial images by scraping photographs from social media platforms, news websites, Venmo, and other publicly accessible online sources. Its software enables law enforcement agencies to upload a photo of an individual and receive potential identity matches, along with links to the original source pages.
Plaintiffs alleged that Clearview collected and used these images without notice or consent, in violation of BIPA’s restrictions on the collection, use, and storage of biometric identifiers.
The Privacy Backlash
In early 2020, a New York Times exposé by Kashmir Hill revealed Clearview’s activities, dubbing it “the secretive company that might end privacy as we know it.” Following publication, multiple privacy class actions were filed against the company in state and federal courts, including one by the ACLU.
One of the primary statutes used to target Clearview was Illinois’ Biometric Information Privacy Act (BIPA), the nation’s strongest biometric privacy law, which protects Illinois residents from the unauthorized collection, use, and storage of their biometric data. It also includes a private right of action, with statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation. More than 1,500 lawsuits have been filed under BIPA since 2018.
By January 2021, eleven class actions from Illinois, California, New York, and Virginia had been consolidated into a multidistrict litigation (MDL) in the Northern District of Illinois before Judge Sharon Johnson Coleman. The consolidated suit spanned sixteen claims in total, seven of them brought under BIPA on behalf of the Illinois subclass.
The Clearview Settlement: Ownership as Restitution
After more than four years of litigation, Clearview and the plaintiffs reached an unusual, first-of-its-kind settlement agreement in June 2024. Rather than a traditional cash payout, the deal granted the class a 23% ownership stake in Clearview AI, a move the court called “fair, reasonable, and adequate” and “significant compared to similar BIPA settlements.”
This equity-based remedy breaks from class action convention. It was shaped by financial and legal realities: Clearview, still a private startup, lacks the capital or insurance coverage to fund the kind of massive payout typical in privacy cases. With a potential class consisting of anyone whose face appeared online (a number that could run into the hundreds of millions), the company faced theoretical damages in the billions. Simply put, a conventional settlement could have bankrupted Clearview before the case ever reached trial.
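A back-of-the-envelope sketch shows why. BIPA’s statutory damages tiers are the $1,000 and $5,000 figures noted above; the subclass size below is an illustrative assumption, not a figure from the litigation:

```python
# Back-of-the-envelope BIPA exposure using the statute's damages tiers.
# The subclass size is an illustrative assumption, not a litigation figure.
NEGLIGENT_DAMAGES = 1_000   # per negligent violation (740 ILCS 14/20)
RECKLESS_DAMAGES = 5_000    # per intentional or reckless violation

assumed_illinois_subclass = 5_000_000  # hypothetical count of Illinois claimants

low_exposure = assumed_illinois_subclass * NEGLIGENT_DAMAGES
high_exposure = assumed_illinois_subclass * RECKLESS_DAMAGES

print(f"Low-end exposure:  ${low_exposure:,}")   # $5,000,000,000
print(f"High-end exposure: ${high_exposure:,}")  # $25,000,000,000
```

Even under these modest assumptions, exposure runs well into the billions, dwarfing any plausible valuation of the company itself.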
Lead counsel Jon Loevy described the situation bluntly in the plaintiffs’ motion: “Clearview and the class members were trapped together on a sinking ship.” The solution? Transfer part of the ship to the plaintiffs.
How the Class Will Get Paid
The 23% stake in Clearview is valued at approximately $51.75 million, based on an agreed valuation of $225 million for the company as of January 2024. But that value is contingent: it only becomes real if the class can monetize the stake. The settlement outlines three pathways to do so, sketched with rough numbers after the list:
- IPO or sale: If Clearview goes public or is acquired, the class’s 23% equity converts into cash based on the valuation at the time.
- Revenue sharing: If no liquidity event occurs in the near term, a court-appointed monitor can require Clearview to contribute 17% of its revenue over a two-year period into a settlement fund.
- Class-initiated sale: The class may also choose to sell its 23% interest independently. If a buyer purchases the stake, the resulting proceeds would be distributed to class members.
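For a concrete sense of the numbers, here is a rough sketch of the payout arithmetic. Only the $225 million valuation, the 23% stake, and the 17% revenue share come from the settlement; the exit valuation and annual revenue figures are placeholder assumptions:

```python
# Settlement payout arithmetic, using the figures reported in the approval order.
COMPANY_VALUATION = 225_000_000   # agreed valuation as of January 2024
CLASS_EQUITY_SHARE = 0.23         # the class's ownership stake

stake_value = CLASS_EQUITY_SHARE * COMPANY_VALUATION
print(f"Equity stake value: ${stake_value:,.0f}")  # $51,750,000

# Pathway 1: IPO or sale -- the stake converts at the valuation at exit.
def stake_at_exit(exit_valuation: float) -> float:
    """Hypothetical cash-out if Clearview is sold or goes public."""
    return CLASS_EQUITY_SHARE * exit_valuation

print(f"At a hypothetical $500M exit: ${stake_at_exit(500_000_000):,.0f}")  # $115,000,000

# Pathway 2: revenue sharing -- 17% of revenue over a two-year period.
# The annual revenue figure is an assumption; Clearview's actuals are not public.
REVENUE_SHARE = 0.17
assumed_annual_revenue = 20_000_000
fund_contribution = REVENUE_SHARE * assumed_annual_revenue * 2
print(f"Two-year revenue contribution: ${fund_contribution:,.0f}")  # $6,800,000

# Pathway 3: a class-initiated sale simply realizes the stake at whatever
# price a willing buyer pays for the 23% interest.
```

The sketch makes the core feature of the deal plain: under every pathway, the class’s recovery scales with Clearview’s future value or revenue, which is precisely what makes the remedy contingent.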
This unique settlement raises questions about privacy, restitution, and accountability. Can a company accused of violating millions of individuals’ privacy provide meaningful compensation by offering partial ownership instead of cash? What does it mean for victims of a privacy breach to hold a financial stake in the continued success of the very entity that misused their data? Does this arrangement create a perverse incentive where those harmed must now hope for the prosperity of the violator in order to be made whole? And what precedent does this set for future data privacy class actions, especially when defendants lack the funds to pay traditional damages?
No Injunctive Relief or Admission of Wrongdoing
The settlement does not require Clearview to stop collecting biometric data or admit to wrongdoing. That became a central point of contention for objectors, including the attorneys general of 22 states and the District of Columbia, who filed a joint amicus brief opposing the deal.
The AGs argued that the settlement failed to stop Clearview from continuing to gather and monetize people’s facial data without consent. In their view, allowing the company to avoid injunctive relief while offering victims speculative compensation tied to future profits undermines the very purpose of privacy law. “This is a highly consequential settlement that would profoundly impact nearly every American’s ability to protect their privacy,” the brief states. “It demands thorough and searching judicial scrutiny.”
They also criticized the proposed attorneys’ fees, which would allow class counsel to collect up to 39.1% of the $51.75 million value, roughly $20 million, if the class ultimately monetizes its stake. The AGs called the figure excessive, particularly given that the class would see no immediate cash payout.
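The arithmetic behind the fee objection is simple; a quick check of the reported figures:

```python
# Checking the fee figures the attorneys general objected to.
stake_value = 51_750_000   # 23% of the $225M agreed valuation
fee_cap = 0.391            # maximum share payable to class counsel

max_fee = fee_cap * stake_value
net_to_class = stake_value - max_fee
print(f"Maximum attorneys' fee: ${max_fee:,.0f}")      # $20,234,250
print(f"Net to the class:       ${net_to_class:,.0f}")  # $31,515,750
```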
Judge Coleman rejected those objections. On injunctive relief, she reasoned that only the Illinois subclass had a strong claim for an injunction, and that the restrictions already imposed on Clearview’s operations by its 2022 settlement with the ACLU would make additional orders for that subclass duplicative. She noted that BIPA does not operate extraterritorially and that the claims under other states’ laws did not provide a strong basis for injunctive relief. Emphasizing that “the essence of settlement is compromise,” Coleman concluded the agreement should not be rejected simply for failing to deliver a complete victory. On fees, Coleman found the proposed structure acceptable, noting that awards of 30-40% are common in the Seventh Circuit for complex, multi-year litigation.
The Influence on Privacy Litigation
First, the Clearview case shines a spotlight on the need for biometric privacy protections in more states. Illinois remains one of the few states with a law as tough as BIPA and a private right of action. As the ACLU’s counsel said after the 2022 settlement with Clearview, other states should “follow Illinois’ lead in enacting strong biometric privacy laws.” While states such as Texas and Washington have biometric privacy statutes, enforcement is left to state officials and lacks the punch of private lawsuits. The fact that an Illinois law forced Clearview, a New York-based startup, to curb its practices nationwide and compensate people across the country is significant. It underscores how state privacy laws can become de facto national standards when companies operate across borders.
Second, the settlement strengthens BIPA’s deterrent effect: companies can no longer assume that insolvency shields them from liability. The prospect of giving up ownership may prove a powerful motivator for compliance from the outset. And despite the controversy around this type of settlement, the court found it fair, signaling openness to creative remedies in privacy litigation. Clearview’s equity-based settlement reflects a pragmatic compromise: partial ownership where cash payouts are not feasible.
Third, the concept of using ownership stakes as a remedy could inspire creative thinking in other privacy or tech-related class actions. Data privacy litigation is a growing field, from lawsuits over social media data sharing, to data breach cases, to claims under video privacy and consumer protection laws. Traditionally, these cases settle for cash payments, credit monitoring services, or practice changes. Now, attorneys might ask: if the defendant is a cash-poor but potentially valuable startup, can we negotiate an equity fund for the class? We might imagine, for example, a future settlement in which users of a scandal-hit app receive a small stock interest in the company as part of the resolution. Such arrangements will be case-specific, and they raise novel issues, such as how to manage and eventually liquidate stock held on behalf of a large class. But Clearview’s precedent shows the option is not off the table, and its settlement provides a blueprint with its mechanisms for selling or cashing out the stake.
There are also cautionary lessons. Linking compensation to a company’s future profits can be fraught. In Clearview’s case, class members will recover money only if Clearview succeeds well enough to trigger an IPO, sale, or substantial revenue. Essentially, the victims’ payout is underwritten by the continued operation of the privacy-invading business (albeit one still bound by the restrictions of its 2022 ACLU settlement). This dynamic could create moral hazards or conflicts: regulators and advocates might worry that such settlements indirectly incentivize a company to keep exploiting data so the class’s stake gains value. Future courts and parties will have to balance these concerns. In some instances, a partial equity deal might be paired with more direct relief or stronger prospective safeguards to avoid the perception that privacy rights are being “paid off” with speculative future gains.
From a national perspective, this case underscores the growing interplay between state-level privacy laws and nationwide impacts. It is a reminder that the United States lacks a comprehensive federal biometric privacy law, so state laws like BIPA are filling the void. Large tech companies and data aggregators should heed the warning: even one strict state law can create nationwide exposure if data on that state’s residents is involved.