Lawyers fined for filing bogus case law created by ChatGPT
Date: 2025-04-07 05:57:08
A federal judge on Thursday imposed $5,000 fines on two lawyers and a law firm in an unprecedented instance in which ChatGPT was blamed for their submission of fictitious legal research in an aviation injury claim.
Judge P. Kevin Castel said they acted in bad faith. But he credited their apologies and the remedial steps they had taken in explaining why harsher sanctions were not necessary to ensure that they, or others, won't again let artificial intelligence tools prompt them to produce fake legal history in their arguments.
"Technological advances are commonplace and there is nothing inherently improper about using a reliable artificial intelligence tool for assistance," Castel wrote. "But existing rules impose a gatekeeping role on attorneys to ensure the accuracy of their filings."
A Texas judge earlier this month ordered attorneys to attest that they would not use ChatGPT or other generative artificial intelligence technology to write legal briefs because the AI tool can invent facts.
The judge said the lawyers and their firm, Levidow, Levidow & Oberman, P.C., "abandoned their responsibilities when they submitted non-existent judicial opinions with fake quotes and citations created by the artificial intelligence tool ChatGPT, then continued to stand by the fake opinions after judicial orders called their existence into question."
In a statement, the law firm said it would comply with Castel's order, but added: "We respectfully disagree with the finding that anyone at our firm acted in bad faith. We have already apologized to the Court and our client. We continue to believe that in the face of what even the Court acknowledged was an unprecedented situation, we made a good faith mistake in failing to believe that a piece of technology could be making up cases out of whole cloth."
The firm said it was considering whether to appeal.
Bogus cases
Castel said the bad faith resulted from the attorneys' failure to respond properly to the judge and their legal adversaries after it was noticed that six legal cases cited in support of their March 1 written arguments did not exist.
The judge cited "shifting and contradictory explanations" offered by attorney Steven A. Schwartz. He said attorney Peter LoDuca lied about being on vacation and was dishonest about confirming the truth of statements submitted to Castel.
At a hearing earlier this month, Schwartz said he used the artificial intelligence-powered chatbot to help him find legal precedents supporting a client's case against the Colombian airline Avianca for an injury incurred on a 2019 flight.
Microsoft has invested some $1 billion in OpenAI, the company behind ChatGPT.
The chatbot, which generates essay-like answers to prompts from users, suggested several cases involving aviation mishaps that Schwartz hadn't been able to find through usual methods used at his law firm. Several of those cases weren't real, misidentified judges or involved airlines that didn't exist.
The made-up decisions included cases titled Martinez v. Delta Air Lines, Zicherman v. Korean Air Lines and Varghese v. China Southern Airlines.
The judge said one of the fake decisions generated by the chatbot "has some traits that are superficially consistent with actual judicial decisions," but he said other portions contained "gibberish" and were "nonsensical."
In a separate written opinion, the judge tossed out the underlying aviation claim, saying the statute of limitations had expired.
Lawyers for Schwartz and LoDuca did not immediately respond to a request for comment.