Court Slams Lawyers for AI-Generated Fake Citations
Friday, April 25, 2025

A federal judge in Colorado has issued a scathing order that should serve as a wake-up call for attorneys who use frontier generative artificial intelligence (Gen AI) models in legal research. On April 23, Judge Nina Y. Wang of the District of Colorado issued an Order to Show Cause in Coomer v. Lindell that exposes the dangers of unverified AI use in litigation.

The Case and the Famous Defendant

The defamation lawsuit involves plaintiff Eric Coomer, a former Dominion Voting Systems executive, and defendant Mike Lindell, the well-known CEO of MyPillow. The case has gained significant attention not only for the high-profile parties involved but also for becoming a flashing-neon cautionary tale of consequential Gen AI misuse.

"Cases That Simply Do Not Exist"

Judge Wang identified "nearly thirty defective citations" in a brief submitted by Lindell's attorneys, and these were not minor errors. The court found:

  • Citations to cases that "do not exist"
  • Legal principles attributed to decisions that contain no such language
  • Cases from one jurisdiction falsely labeled as being from another
  • Misquotes of actual legal authorities

One particularly egregious example involved a citation to "Perkins v. Fed. Fruit & Produce Co., 945 F.3d 1242, 1251 (10th Cir. 2019)"—a completely fabricated case. The court noted that while a similarly named case exists in a different form, the Gen AI tool had essentially cobbled together a fictional citation by merging elements from entirely different cases.

The Reluctant Admission

When confronted about these errors at a hearing, defense attorney Christopher Kachouroff initially deflected, suggesting the filing was a "draft pleading" or blaming a colleague for failing to perform a citation check. Only when Judge Wang asked him directly whether AI had generated the content did Kachouroff admit to using Gen AI.

Even more damning, Kachouroff "admitted that he failed to cite check the authority in the Opposition after such use before filing it with the Court—despite understanding his obligations under Rule 11." The court expressed open skepticism about his claim that he had personally drafted the brief before using AI, noting "the pervasiveness of the errors."

Lessons for Legal Practice

AI verification isn't optional—it's a professional obligation. While AI tools can enhance efficiency, they require human oversight. The ethical foundations of legal practice remain unchanged: attorneys must verify information presented to the court, regardless of its source.

As Gen AI continues its rapid integration into legal practice, the Coomer case serves as a stark reminder that technology cannot replace professional judgment. Case citations must be verified, quotations must be confirmed, and legal principles must be substantiated against primary sources. While the legal profession has always adapted to new technologies, core professional responsibilities have not changed. In the AI era, those obligations demand more vigilance than ever.

As Judge Wang's order put it: "Not until this Court asked Mr. Kachouroff directly whether the Opposition was the product of generative artificial intelligence did Mr. Kachouroff admit that he did, in fact, use generative artificial intelligence."