Interview The founder of an AI startup who attempted to use an artificially generated avatar to argue his case in court has been scolded by a judge for the stunt.
The avatar – its appearance and voice created by software – appeared on behalf of Jerome Dewald, the plaintiff in an employment dispute with insurance firm MassMutual Metro New York, at a March 26 hearing before New York's supreme court appellate division.
During oral arguments, Dewald asked for a video depicting a man in a V-neck sweater to be played to the five-judge panel. The video opened: "Now may it please the court, I come here today a humble pro se before a panel of five distinguished justices…" A pro se being someone representing themselves in court.
Gen–errr–ative AI ... The moment in Dewald's hearing when his AI-generated avatar was played to the court, as seen in the proceedings' live-stream
Confused by the unknown speaker, one of the judges, Associate Justice Sallie Manzanet-Daniels, immediately interrupted to ask who was addressing the court. "Is this … hold on? Is that counsel for the case?"
"That? I generated that," replied Dewald, who was physically sitting before the panel of judges in the hearing.
"I'm sorry?" the judge said.
"I generated that," Dewald reiterated. "That is not a real person."
"Okay," the judge snapped. "It would have been nice to know that when you made your application. You did not tell me that, sir."
That application was Dewald's request to play a video arguing his case; according to him, a medical condition had left the entrepreneur unable to easily address the court at length in person. The panel was not expecting a computer-imagined person to show up, however.
"You have appeared before this court and been able to testify verbally in the past," Judge Manzanet-Daniels continued. "You have gone to my clerk's office and held verbal conversations with our staff for over 30 minutes.
"I don't appreciate being misled. So either you are suffering from an ailment that prevents you from being able to articulate or you don't. You are not going to use this courtroom as a launch for your business, sir. If you want to have oral argument time you may stand up and give it to me."
In an interview with The Register this week, Dewald said: "I asked the court for permission in advance and they gave it to me. So they were not unprepared to have the presentation. They were unprepared to see an artificially generated image."
The ailment the judge referred to is Dewald's bout with throat cancer 25 years ago. "Extended speaking is problematic for me," he explained. "I mean, I can go through the different things that happened, but that was part of the reason that they agreed to let me do the presentation."
I asked the court for permission in advance, and they gave it to me. So they were not unprepared to have the presentation. They were unprepared to see an artificially generated image
Dewald, who operates a startup called Pro Se Pro that aims to help unrepresented litigants navigate the US legal system without hiring lawyers, had planned to use an AI service called Tavus to create a realistic video avatar of himself to read his argument to the court.
"I did get a permission in advance," he claimed. "I intended to use my own replica that would have been an image of me talking. But the technology is fairly new. I had never made a replica before of myself or anybody."
Dewald explained that creating an avatar to appear in court involves providing Tavus with a two-to-four-minute video of the subject talking plus a one-minute segment showing the subject standing still. That material is used to generate the subject's digital replica, a process that takes about two to four hours. In the end, though, he used a default avatar called Jim rather than one of himself.
"On my basic plan, I only get three replicas a month to generate," Dewald said. "So I was trying to be conservative. I tried one. It failed after about six hours. I tried another one. It failed after about eight hours. And by the time we were getting ready for the hearing I still didn't have my own replica. So I just used one of their stock replicas, that big, beautiful hunk of a guy that they call Jim."
Not real ... Jim, one of the default Tavus-generated avatars, in a demonstration video, the kind used by Dewald for his hearing
"Jim" only got a few words out at the hearing before being cut off.
"So you can see the judge was upset, she was really upset in the beginning," said Dewald, who ended up addressing the court himself. "And then when I started giving my presentation, as poorly as I did, she seemed to become much more sympathetic. The look on her face was more like, 'Well I'm sorry I chewed you out so badly.'"
While there have been several instances of attorneys being chided by judges for filing court documents containing AI-generated inaccuracies, Dewald believes the judge's ire in this matter stemmed from the surprise of an unknown person appearing in the video presentation.
Asked about whether the court's reaction gave him pause about the viability of AI applications in legal matters, Dewald said, "I don't know, but the technology has changed so quickly. You know my site, we get a fair number of views but we really don't get much business out of it."
Dewald said he'd been unable to develop his AI legal business due to lack of funding and other concerns, so it had remained untended for about a year.
"In the artificial intelligence world, a year is like an eon," he said. "When I put it up, we're still working on the level of ChatGPT-3.5. Our centerpiece was a scenario analyzer. That is an AI piece that interviews a pro se [litigant] and then gives some advice. I would argue it's not legal advice, but you can argue what you want.
"And that piece worked kind of OK. It involved a tech stack that had some Amazon pieces in it that have now been deprecated. And the whole landscape has changed so much that that site needs to be rebuilt with agentic AI now, because you can just do so much more."
Dewald downplayed the judge's admonition not to use the court as a venue to promote his biz. "There was nothing there that was promoting any business that I have," he said.
I think the courts eye [AI use] very skeptically
Asked about whether AI should be accepted in courtrooms, Dewald said: "I think the courts eye it very skeptically. With respect to the replica and the presentation that I did, there could be no hallucinations in that unless there were hallucinations of the script that I gave them to read. I did use a generative AI to draft the script, but I also checked it very thoroughly. I've been doing this for a long time."
Dewald has a background in engineering and computer science, and is not a lawyer. He said he was admitted to law school in New York in the 1970s but never attended. He also said he recently sat a law school admission test, is a member of some bar associations, and follows the evolving use of AI in the law.
Citing a panel discussion with several New York justices about a year ago, he said the recommendation was that the use of AI should be disclosed to one's opponent and the court.
"I've been doing that for over a year," said Dewald. "I'm not sure how useful it is because in some respects, full and open disclosure, on the other side of the coin, I think it tends to be discriminatory sometimes. It tends to prejudice readers against you because there's such a negative view of hallucinations and AI."
He said hallucinations represent a real problem for AI, along with misstating actual citations, and misinterpreting the premise of a case. He added he's very thorough when checking the accuracy of AI output.
Dewald pointed to a recent American Bar Association seminar, Navigating Artificial Intelligence in the Judiciary, that covered guidelines for the responsible use of AI tools by judicial officers.
AI actually tends to empower unrepresented litigants, gives them a voice that they wouldn't normally have in the courtroom
The seminar attempts to grapple with common concerns about AI, such as model bias, hallucination, and confidentiality. It also addresses the pushing of legal norms' boundaries, as when AI models are used to generate output outside the court record that may be put forth as a sort of uncredentialed expert witness.
Courts have also taken to using AI. As the seminar notes, Arizona courts have deployed digital avatars to summarize decisions.
Dewald reckons AI can help pro se litigants, and "actually tends to empower unrepresented litigants, as it gives them a voice that they wouldn't normally have in the courtroom."
He added he has already filed an apology with the court, saying it was "a mistake not to be fully transparent" and not to warn the justices his argument would be presented by an avatar. ®