Fifth Circuit Cracks Down on “AI Hallucinations,” Saying Lawyers Can’t Plead Ignorance Anymore

A U.S. federal appeals court has sanctioned a lawyer for filing a legal brief riddled with AI-generated errors, warning that fabricated citations and “hallucinated” material in court filings remain a growing problem. A three-judge panel of the U.S. Court of Appeals for the Fifth Circuit ordered attorney Heather Hersh to pay $2,500 after concluding she used artificial intelligence to draft much of a brief and failed to verify whether the output was accurate. The panel said the continuing appearance of AI-driven mistakes in litigation “shows no sign of abating,” despite nearly three years of high-profile warnings and incidents since 2023.

The sanctions stem from an appeal connected to earlier penalties a Texas federal judge imposed on Shawn Jaffer and his firm (formerly known as Jaffer & Associates) in a lawsuit alleging violations of the Fair Credit Reporting Act by a lender and a credit-reporting agency. The trial judge had ordered Jaffer and his firm to pay $23,000 in attorneys’ fees after finding he did not conduct even a minimal investigation before filing the case. The Fifth Circuit later overturned that underlying sanctions order. But before doing so, it scrutinized Hersh’s appellate brief and identified 21 instances of fabricated quotations or serious misrepresentations of law or fact—prompting a show-cause order demanding an explanation.

Judge Jennifer Walker Elrod described Hersh’s response as “disappointing.” Hersh initially suggested she had relied on publicly available versions of cases and pointed to errors allegedly originating from major legal databases. The court rejected those explanations as “not credible” and said her response was misleading in several respects. Elrod added that Hersh admitted using AI only after being directly asked, and said the court likely would have imposed a lesser penalty if she had accepted responsibility sooner and been more forthcoming. Instead, the opinion said, when confronted with a serious ethical lapse, Hersh “misled, evaded, and violated her duties as an officer of this court.”

The decision places Hersh’s filing in a wider pattern that courts say is undermining trust in legal submissions. Elrod cited a database maintained by Damien Charlotin tracking AI hallucinations in U.S. court filings, which listed 239 such cases as of the Wednesday the opinion issued. The court also noted it had considered adopting a first-of-its-kind appellate rule regulating generative AI use by lawyers, but decided in 2024 that existing professional and ethical rules were sufficient. The opinion’s bottom line is blunt: if ignorance of the risks of using generative AI to draft a brief without checking the citations was ever an excuse, it “certainly” is not anymore.

The case is Fletcher v. Experian Info Solutions, and the ruling sends a clear message to the legal profession: AI can assist with drafting, but responsibility for accuracy still rests entirely with the lawyer who signs and files the document.
