MTC: AI Hallucinated Cases Are Now Shaping Court Decisions - What Every Lawyer, Legal Professional and Judge Must Know in 2025!
Artificial intelligence has transformed legal research, but a dangerous threat has now reached the judge's chambers: hallucinated case law. On June 30, 2025, the Georgia Court of Appeals delivered a landmark ruling in Shahid v. Esaam that should serve as a wake-up call to every member of the legal profession: AI hallucinations are no longer just embarrassing mistakes; they are actively influencing court decisions and undermining the integrity of our judicial system.
The Georgia Court of Appeals Ruling: A Watershed Moment
The Shahid v. Esaam decision represents the first documented case in which a trial court's order was based entirely on non-existent case law, likely generated by AI tools. The Georgia Court of Appeals found that the trial court's order denying a motion to reopen a divorce case relied upon two fictitious cases, and that the appellee's brief contained an astounding 11 bogus citations out of 15 total. The court imposed a $2,500 penalty on appellee's attorney Diana Lynch, the maximum allowed under GA Court of Appeals Rule 7(e)(2), and vacated the trial court's order entirely.
What makes this case particularly alarming is not just the volume of fabricated citations, but the fact that the trial court adopted these AI-generated hallucinations wholesale, without verification. The court specifically referenced Chief Justice John Roberts' 2023 warning that "any use of AI requires caution and humility."
The Explosive Growth of AI Hallucination Cases
The Shahid case is far from isolated. Legal researcher Damien Charlotin has compiled a comprehensive database tracking over 120 cases worldwide where courts have identified AI-generated hallucinations in legal filings. The data reveals an alarming acceleration: while there were only 10 cases documented in 2023, that number jumped to 37 in 2024, and an astounding 73 cases have already been reported in just the first five months of 2025.
Perhaps most concerning is the shift in responsibility. In 2023, seven of the ten documented hallucination cases were filed by pro se litigants, with only three attributed to lawyers. By May 2025, however, legal professionals were found to be at fault in at least 13 of the 23 cases in which AI errors were discovered. This trend indicates that trained attorneys, who should know better, are increasingly being taken in by AI's convincing fabrications.
High-Profile Cases and Escalating Sanctions
Always check your research - you don’t want to get in trouble with your client, the judge or the bar!
The crisis has intensified with high-profile sanctions. In May 2025, a special master in California imposed a staggering $31,100 sanction against the law firms K&L Gates and Ellis George for what he termed a "collective debacle" involving AI-generated research. The attorneys had used multiple AI tools, including CoCounsel, Westlaw Precision, and Google Gemini, to generate a brief; roughly nine of its 27 legal citations proved to be incorrect.
Even more concerning was the February 2025 case involving Morgan & Morgan—the largest personal injury firm in the United States—where attorneys were sanctioned for a motion citing eight nonexistent cases. The firm subsequently issued an urgent warning to its more than 1,000 lawyers that using fabricated AI information could result in termination.
The Tech-Savvy Lawyer.Page: Years of Warnings
The risks of AI hallucinations in legal practice have been documented extensively by legal technology experts. I've been sounding the alarm about these issues for years at The Tech-Savvy Lawyer.Page Blog and Podcast. In the post "Why Are Lawyers Still Failing at AI Legal Research? The Alarming Rise of AI Hallucinations in Courtrooms," I detailed how even advanced legal AI platforms can generate plausible but fake authorities.
My coverage has also included reviews of specific platforms, such as the November 2024 analysis "Lexis+ AI™️ Falls Short for Legal Research," which documented how even purpose-built legal AI tools can cite non-existent legislation. The message has been consistent and clear: AI is a collaborator, not an infallible expert.
International Recognition of the Crisis
The problem has gained international attention, with the London High Court issuing a stark warning in June 2025 that attorneys who use AI to cite non-existent cases could face contempt of court charges or even criminal prosecution. Justice Victoria Sharp warned that "in the most severe instances, intentionally submitting false information to the court with the aim of obstructing the course of justice constitutes the common law criminal offense of perverting the course of justice."
The Path Forward: Critical Safeguards
Based on extensive research and mounting evidence, several key recommendations emerge for legal professionals:
For Individual Lawyers:
Lawyers need to be diligent and make sure their case citations are not only accurate but real!
Never use general-purpose AI tools like ChatGPT for legal research without extensive verification
Implement mandatory verification protocols for all AI-generated content (see the sketch following this list for one possible starting point)
Obtain specialized training on AI limitations and best practices
Consider using only specialized legal AI platforms with built-in verification mechanisms
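
For lawyers who want to operationalize the verification protocols recommended above, an automated first pass can flag citations that deserve extra scrutiny before any human review. Below is a minimal sketch in Python, assuming the Free Law Project's free CourtListener citation-lookup REST endpoint and the response fields shown; confirm both against the current API documentation before relying on it, and treat every citation, flagged or not, as unverified until you have read the opinion itself.

```python
import requests

# Assumed endpoint: CourtListener's citation-lookup API (Free Law
# Project). The URL, payload, and response fields used below are based
# on its published v3 REST API and should be verified before use.
API_URL = "https://www.courtlistener.com/api/rest/v3/citation-lookup/"


def find_suspect_citations(brief_text: str, api_token: str) -> list[dict]:
    """Send the brief's text to the API, which extracts citations and
    reports whether each resolves to a real opinion in its database."""
    response = requests.post(
        API_URL,
        data={"text": brief_text},
        headers={"Authorization": f"Token {api_token}"},  # token assumed
        timeout=30,
    )
    response.raise_for_status()
    suspect = []
    for result in response.json():
        # An HTTP-style status of 200 is assumed to mean "found";
        # anything else (e.g., 404) is a red flag worth hand-checking,
        # though not by itself proof of fabrication.
        if result.get("status") != 200:
            suspect.append(result)
    return suspect


if __name__ == "__main__":
    with open("draft_brief.txt", encoding="utf-8") as f:
        draft = f.read()
    for hit in find_suspect_citations(draft, api_token="YOUR_TOKEN"):
        print("VERIFY BY HAND:", hit.get("citation"))
```

Remember that a clean automated pass is a floor, not a ceiling: a citation can be real while the proposition it supposedly supports is not, so nothing replaces pulling and reading the cited opinions.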
For Courts:
Implement consistent disclosure requirements for AI use in court filings
Develop verification procedures for detecting potential AI hallucinations
Train judges and court staff to recognize the hallmarks of AI-generated content
FINAL THOUGHTS
The legal profession is at a crossroads. AI can enhance efficiency, but unchecked use can undermine the integrity of the justice system. The solution is not to abandon AI, but to use it wisely with appropriate oversight and verification. The warnings from The Tech-Savvy Lawyer.Page and other experts have proven prescient—the question now is whether the profession will heed these warnings before the crisis deepens further.
MTC
Happy Lawyering!