Estranged Trump Attorney Michael Cohen Unwittingly Supplied AI-Generated Bogus Legal Citations for Court Filing

By Ryan Morgan
December 29, 2023
Former President Donald Trump's former attorney Michael Cohen looks on in court during a break in the former president's fraud trial in New York on Oct. 25, 2023. (Timothy A. Clary/AFP via Getty Images)

Michael Cohen, former President Donald Trump’s former attorney turned critic, admitted in court on Friday to inadvertently relying on fake legal citations generated by Google’s artificial intelligence chatbot service.

Mr. Cohen served time in prison after pleading guilty in 2018 to federal charges of tax evasion, making false financial statements, and making improper campaign contributions. Later that same year, he pleaded guilty to lying in congressional testimony. He claimed he made the improper campaign contributions and false statements to Congress to help conceal an alleged affair between President Trump and an adult film star, as well as efforts the Trump Organization had made to develop property in Moscow, Russia, a project that never came to fruition.

In recent months, Mr. Cohen had sought to have his period of supervised release terminated early. However, the effort hit a roadblock earlier this month after legal filings entered on Mr. Cohen’s behalf appeared to cite nonexistent court cases. U.S. District Judge Jesse Furman of the Southern District of New York subsequently ordered Mr. Cohen’s attorney, David M. Schwartz, to account for the counterfeit legal citations and threatened to impose sanctions.

In a Dec. 28 response statement, Mr. Cohen’s legal team admitted citing fake legal cases but said Mr. Cohen inadvertently supplied the fake citations while seeking to assist his attorney.

“Looking to assist his attorney, Mr. Cohen conducted open-source research for cases that reflected what he anecdotally knew to be true. Specifically, he sought case law to illustrate that terminating his supervised release under these circumstances would be in line with how other courts have handled similar requests. Unfortunately, the citations that Mr. Cohen found online refer to nonexistent cases, and Mr. Schwartz submitted them unchecked,” read the statement submitted by one of Mr. Cohen’s other attorneys, E. Danya Perry.

According to the statement, Mr. Cohen had unknowingly used Google Bard, the search giant’s generative artificial intelligence chatbot.

Mr. Cohen had previous experience finding case citations as a practicing attorney but had been disbarred following his criminal conviction.

“The invalid citations at issue—and many others that Mr. Cohen found but were not used in the motion—were produced by Google Bard, which Mr. Cohen misunderstood to be a supercharged search engine, not a generative AI service like Chat-GPT. Mr. Cohen had used Google Bard to successfully identify accurate information in other contexts before and did not appreciate its unreliability as a tool for legal research,” Ms. Perry’s Dec. 28 statement reads. “Like most lay clients, Mr. Cohen does not have access to Westlaw or other standard legal research tools to verify any citations he finds online. Instead, he trusted his attorney to verify them on his behalf.”

Generative AI chatbots like ChatGPT and Google Bard are built on large language models, which are trained on vast amounts of real written text and use it to generate seemingly authentic human conversation. While chatbot services can provide reliable information in some instances, they can also produce what some AI researchers call "hallucinations." In one common form of hallucination, a chatbot asked to provide a citation will generate its best approximation of what such a citation might look like rather than an authentic example.

Ms. Perry further explained that a miscommunication among her, Mr. Schwartz, and Mr. Cohen led to confusion about who was handling certain aspects of Mr. Cohen's case. As a result, she said, Mr. Schwartz believed she had supplied the case citations, not realizing that they actually came from Mr. Cohen's open-source research using Google Bard.

In discussing possible sanctions earlier this month, the judge noted that it was the second time this year that a judge in Manhattan federal court had confronted lawyers over fake citations generated by artificial intelligence. Two lawyers in an unrelated case were fined $5,000 for citing bogus cases that ChatGPT invented.

The Associated Press contributed to this article.
