The Georgia Supreme Court suspended a prosecutor for submitting legal filings containing fabricated and inaccurate case citations generated by artificial intelligence software. The court found that multiple citations in the filings either did not exist or failed to support the legal propositions they purported to establish.
The court characterized the conduct as falling "far beneath the conduct we expect" from members of the legal profession. The suspension represents a direct disciplinary response to the misuse of AI tools in legal practice without adequate verification of their output.
This case underscores a growing problem in legal practice. AI language models like ChatGPT and similar tools frequently generate plausible-sounding but entirely fictitious case citations, a phenomenon known as "hallucination." Lawyers who rely on these systems without independently verifying citations risk submitting false information to courts.
The prosecutor's conduct violated fundamental professional responsibility rules requiring attorneys to present truthful and accurate information to courts. Georgia's Rules of Professional Conduct demand candor toward the tribunal, and attorneys remain responsible for verifying the accuracy and relevance of every authority they cite. Automated tools cannot replace human judgment and verification in legal research.
Courts nationwide have begun addressing similar AI-related misconduct. In New York, federal judge P. Kevin Castel sanctioned attorneys in Mata v. Avianca, Inc. for relying on ChatGPT to generate entirely fictitious case citations in a brief. The case prompted widespread warnings from bar associations and courts about the dangers of unverified AI assistance.
For prosecutors specifically, the stakes are higher. Prosecutorial ethics rules impose a duty to seek justice, not merely to win cases. Submitting fictitious authorities undermines the integrity of judicial proceedings and erodes public confidence in the criminal justice system.
The suspension sends a clear message to practitioners. AI tools can assist legal research, but only when their output undergoes rigorous independent verification. Courts expect attorneys to understand the limitations of the technologies they employ.
