Lawyers Blame ChatGPT for Tricking Them Into Citing Bogus Case Law
NEW YORK (AP) -- Two apologetic lawyers responding to an angry judge blamed ChatGPT Thursday for tricking them into including fictitious legal research in a court filing.
Attorneys Steven A. Schwartz and Peter LoDuca are facing possible punishment over a filing in a lawsuit against an airline that included references to past court cases that Schwartz thought were real, but were actually invented by the artificial intelligence-powered chatbot.
Schwartz explained that he used the groundbreaking program as he hunted for legal precedents supporting a client's case against the Colombian airline Avianca for an injury incurred on a 2019 flight.
The chatbot, which has fascinated the world with its production of essay-like answers to prompts from users, suggested several cases involving aviation mishaps that Schwartz hadn't been able to find through usual methods used at his law firm.
The problem was, several of those cases weren't real or involved airlines that didn't exist.
Schwartz told Judge P. Kevin Castel he was "operating under a misconception ... that this website was obtaining these cases from some source I did not have access to."