US Lawyer Faces Legal Trouble for Citing Fictitious Case Generated by ChatGPT

May 29, 2023 #Avianca #ChatGPT

LI Network

Published on: 29 May 2023 at 11:35 IST

Roberto Mata filed a legal complaint against Avianca, an airline company, alleging that he suffered an injury when a metal serving cart struck his knee during a flight to New York’s Kennedy International Airport.

Bart Banino, a lawyer at Condon & Forsyth, the aviation-law firm representing Avianca, said the firm could tell that the cases cited in the brief were fictitious and suspected that a chatbot had been involved in drafting it.

When Avianca asked a federal judge in Manhattan to dismiss the case, Mata’s attorneys strongly objected. They submitted a 10-page brief citing what appeared to be numerous relevant court decisions, including Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines, and Varghese v. China Southern Airlines. There was one problem: no one involved, including the airline’s lawyers and the judge, could find the decisions or the quotations attributed to them.

The reason was that ChatGPT had invented all of it.

Steven A. Schwartz of Levidow, Levidow & Oberman, the lawyer who prepared the brief, admitted to the court in an affidavit that he had relied on an artificial intelligence program for his legal research and acknowledged that the source had turned out to be unreliable. Schwartz, who has practiced law in New York for thirty years, said he had no intention of deceiving the court or the airline.

Schwartz said he had never used ChatGPT before and was unaware that it could produce false content. He had even asked the program to verify that the cited cases were genuine, and it confirmed their existence. Schwartz expressed deep regret for relying on ChatGPT and vowed never to do so again without absolute verification.

In response to this unprecedented situation, Judge P. Kevin Castel scheduled a hearing for June 8 to consider potential penalties or sanctions, describing the submission as a legal document filled with fabricated judicial decisions, quotes, and internal citations.

As artificial intelligence becomes increasingly prevalent, concerns are growing that computers will replace not just human interaction but also human labor. Knowledge workers, including lawyers, are weighing both the value and the dangers of AI software like ChatGPT, and the importance of verifying the information it provides.

Stephen Gillers, a legal ethics professor at New York University School of Law, said the legal community is currently debating how to avoid exactly the scenario described in this case. Simply taking AI output and copying it into court filings, he noted, is not enough.

The real-life case of Roberto Mata v Avianca illustrates that white-collar professions may still have some time before being completely taken over by robots. Mata’s incident occurred on Avianca Flight 670 from El Salvador to New York on August 27, 2019, when an airline employee accidentally struck him with a serving cart, leading to his lawsuit.

Avianca responded to the lawsuit by filing papers seeking its dismissal due to the expiration of the statute of limitations. In a subsequent brief filed in March, Mata’s lawyers argued for the continuation of the lawsuit, supporting their position with references and quotes from several court decisions that were later discovered to be nonexistent.

Avianca’s lawyers wrote to Judge Castel that they were unable to find the cases cited in the brief, including Varghese v. China Southern Airlines and Zicherman v. Korean Air Lines Co. Ltd. They also could not locate the purported 2008 decision of the US Court of Appeals for the 11th Circuit that was quoted within Varghese.

In response, Judge Castel instructed Mata’s attorneys to provide copies of the opinions mentioned in their brief. The lawyers submitted a compilation of eight supposed opinions, including information about the issuing courts, judges, docket numbers, and dates. However, Avianca’s lawyers informed the judge that they could not find any of those opinions on court dockets or legal databases.

Neither Schwartz nor Peter LoDuca, another lawyer from the firm whose name appeared on the brief, responded to requests for comment.
