Russell v. Mills, 50 Fla. L. Weekly D2609 (Fla. 2d DCA Dec. 10, 2025):
In a case litigated between an attorney and a pro se litigant, the attorney filed an answer brief containing 3 case citations. Only 2 were actual cases published in the Southern Reporter, and both were misquoted in the answer brief (including quoted text attributed to one case that actually appeared in another). The 3rd citation was completely fabricated.
Because the 3rd case citation appeared to have been “hallucinated” by generative artificial intelligence, as the court noted, and because the other two case citations were misquoted, the court issued an order to show cause requiring appellee’s counsel to explain how the citations and quotations were generated and to show cause why sanctions should not be imposed.
In her response, the attorney stated that the 3 case citations were obtained through computer-generated searches and acknowledged that she failed to fully vet the results. In substance, counsel advised the court that her computer-generated searches misstated the law but that she did not intend to mislead the court.
The court observed that it is seeing this problem with increasing frequency, and that this was the 2nd case in which it referred an attorney to the Florida Bar for similarly relying on generative artificial intelligence without verifying the cited authorities. The court reminded us that ethical requirements are not excused simply because a computer program generates faulty or misleading legal analysis. Attorneys have a fundamental duty to read the authorities they cite in appellate briefs and other court filings to confirm they support the propositions for which they are cited. When an attorney “fundamentally” abdicates that duty to the court, the court said, it has a duty to refer the matter to the Florida Bar. The court noted that using generative artificial intelligence is acceptable, but the attorney remains responsible for the work product it generates.
