An Oregon attorney accused of relying on the totally plausible — and often totally erroneous — output of so-called artificial intelligence was slapped with a fine by the Oregon Court of Appeals on Wednesday.
The appellate court determined that Portland civil attorney Gabriel A. Watson filed briefs citing two made-up cases and used a fabricated quote that was attributed to a real piece of case law.
In a first for Oregon, the Court of Appeals ordered Watson to pay $2,000 to the state judicial department, charging him $500 for each baloney citation and $1,000 for the bogus quote.
“Although artificial intelligence programs may seem to offer a shortcut for a busy attorney in an individual case, at present, they may create a long cut to justice,” Chief Judge Erin Lagesen wrote, calling it a “very grave situation.”
The errors were discovered by Watson’s legal opponent, former state lawmaker and retired attorney Charles Ringo.

Ringo, representing himself, sued architectural designer Jennifer Cohoon in 2023, claiming her firm had created faulty plans for remodeling a duplex he owns in Bend.
An arbitrator sided with Cohoon in January and ordered Ringo to pay $1,200 plus $15,000 in fees to Watson, her attorney.
Ringo appealed and the case went haywire in May, when Watson filed the bunk-filled brief with the appellate court.
Ringo said he spent several hours chewing over Watson’s document, eventually making a trip to the Bend library to check legal databases and confirm his suspicions that Watson’s arguments were bolstered by fake decisions in prior cases that never happened.
“I had to consider whether maybe there was just an innocent mistake in terms of the name of the case or the case citation numbers,” he said. “You have to check all sorts of variations to make sure that, no, this just doesn’t exist.”
Watson, for his part, tried to explain the error by saying that his assistant had mistakenly filed a “draft/placeholder” brief.
He later acknowledged and apologized for the apparently AI-generated errors, asking the court not to sanction him.
“As a solo practitioner, with a heavy case load, and a desire to fight for justice for all clients, there is an inherent risk of becoming overwhelmed,” he wrote. “The temptation of relying on technology to support these well-intentioned goals is strong.”
But the court was having none of it.
Lagesen, the judge, said Watson hadn’t provided a “clear explanation” of how the error occurred and that each AI-fabricated filing costs the judicial system time and money to untangle.
Legal precedent is the backbone of the law, Lagesen said, but artificial intelligence is a machine built on the probable order of words, not the truth itself.
AI mistakes are sometimes dubbed “hallucinations.” But Lagesen rejected that term.
“Artificial intelligence is not perceiving nonexistent law as the result of a disorder,” she wrote. “Rather, it is generating nonexistent law in accordance with its design.”
Watson didn’t respond to requests for comment. Cohoon learned about the matter from a reporter and declined to comment.
Oregon federal judges have encountered AI errors in at least two cases so far, The Oregonian/OregonLive previously reported. U.S. District Judge Michael Simon declined to impose sanctions against attorneys for Green Building Initiative on Nov. 12, ruling that he was “satisfied with the remedial actions already taken.”
U.S. Magistrate Judge Mark Clarke has not yet ruled on a similar matter in Medford.