
Judge dismisses class action lawsuit after attorney cites AI-generated fake precedents

2025-05-28 09:20:00

By Lital Dubrovitsky

Court dismisses $1 billion class action against Israeli health funds after attorney cited fake AI-generated precedents; lawyer deemed unfit to lead the case and fined for procedural misconduct, marking a significant precedent for AI use in legal proceedings

On Monday, the Tel Aviv District Court ordered the dismissal of a request to approve a class action lawsuit valued at approximately $1 billion against Israeli health funds. This ruling followed revelations that the attorney representing the association that filed the request had relied on an artificial intelligence tool, citing non-existent legal precedents.

Additionally, Judge Ilan Daphni ordered the attorney to pay $1,320 in expenses to the Economic Health Fund, which had sought the dismissal of the lawsuit, as well as $1,320 to the state treasury. The judge also accepted the argument made by Clalit Health Services, represented by attorneys Shai Tamar and Adi Erman from the Lipa Meir & Co law firm, that the attorney for the association was unfit to act as a representative in this legal proceeding.

The request for approval of the class action lawsuit was filed by an association against Israel's health funds, alleging a shortage of community mental health services and unreasonably poor availability. It claimed that insured individuals were forced to endure unreasonable delays in receiving the mental health services to which they were entitled. The health funds denied these claims.

During the proceedings, Clalit Health Services argued in its motion to dismiss the class action that the association and its attorney, who sought to file a lawsuit for no less than $1 billion against all Israeli health funds, had apparently relied blindly on an AI tool and failed to conduct even minimal checks of the legal documents they submitted. The health fund further argued that such severe conduct should not be tolerated, and the request for approval should be dismissed, with a determination that the attorney was unfit to manage such a proceeding.

In response, the association’s attorney argued that the error was corrected immediately, there was no attempt to evade responsibility, and the mistake stemmed from good-faith reliance on a recognized legal database without realizing it might contain erroneous references. However, the judge ruled in favor of dismissing the request for the class action lawsuit and imposing personal expenses on the attorney who submitted it.

(Image caption: Artificial intelligence in medicine. Photo: Shutterstock)

Recently, the Supreme Court issued two rulings outlining how to handle situations similar to this case.

The attorney who filed the class action request responded: "I deeply regret that all the affected individuals may not receive compensation due to an error in relying on a legal database as an ostensibly reliable source for legal precedents. I am sorry that procedure has overshadowed substance, but I have no choice but to accept the ruling."

The attorneys representing Clalit Health Services stated: "The court has established clear norms regarding the procedural conduct expected of an attorney who seeks to serve as a representative in a class action lawsuit. The ruling indicates that an attorney who blindly relies on AI tools and includes references to non-existent legal precedents and statutory provisions in their legal documents is simply unfit to act as a representative plaintiff in a class action proceeding."


Summary

Tel Aviv District Court dismissed a $1 billion class action lawsuit against Israeli health funds after discovering the attorney used an AI tool that cited fake legal precedents. The judge fined the attorney for procedural misconduct and deemed them unfit to lead the case, setting a significant precedent for AI use in legal proceedings. Clalit Health Services argued that relying blindly on AI without verification is unacceptable, leading to the dismissal of the lawsuit request.
