According to CNN, Schwartz, a veteran New York lawyer, cited cases supplied by the AI chatbot ChatGPT in court filings, but many of the cited cases turned out to be fabricated. Schwartz has admitted that he failed to verify the information and has apologized, and he now faces possible punishment.

The lawyer involved, Schwartz, has reportedly practiced law in New York for 30 years. His client, Mata, sued Avianca, a Colombian airline, claiming that a cart struck and injured his knee during a flight in August 2019; the airline asked the Manhattan federal court to dismiss the case on the grounds that the statute of limitations had expired.

Photo caption: ChatGPT is an artificial intelligence chatbot developed by OpenAI. Photo/Visual China

Schwartz cited six similar favorable rulings in his court filings. In April, however, the airline's lawyers wrote to the judge questioning the authenticity of the material Schwartz had submitted. Judge Kevin Castel of the Southern District of New York wrote in an order in early May that the rulings and quotations in six of the cited cases appeared to be false.

Schwartz also admitted in a May 5 affidavit that he had used ChatGPT to gather the cases. He said he had asked the chatbot whether the cases were authentic, and it repeatedly assured him that they were genuine and could be found in a "trusted legal database."

Schwartz said he had never used ChatGPT as a source of legal research prior to this case, and so he "didn't know that its content might be fake."

Schwartz said he regretted supplementing his legal research with artificial intelligence and would never do so again without verifying the authenticity of its content.

Judge Castel called the situation "unprecedented" and ordered a June 6 hearing on possible sanctions against Schwartz.