By Sukey Lewis
In its analysis of the bill, the Assembly Committee on Privacy and Consumer Protection raised concerns that third-party tech companies could access and profit from sensitive police materials, potentially compromising privacy. KQED’s reporting was also cited in the analysis.
“There is potential for a race to the bottom, where sensitive body-worn camera data could be repurposed to train other technologies, including facial recognition systems or other surveillance tools,” the report said.
The committee added language to ensure that AI vendors cannot sell or misuse any personal information contained in bodycam footage or reports.
Opponents of the bill include the California Police Chiefs Association and the Police Officers Research Association of California, a police union advocacy and lobbying group. Neither responded to KQED’s request for comment, but PORAC submitted a statement to the legislature.
“The bill raises serious concerns about unintended consequences that would undermine officer integrity, impose significant administrative burdens, and introduce unnecessary legal vulnerabilities,” the statement said.
It said that the mandatory page disclosures “imply to the public, courts, or defense attorneys that such reports are inherently less reliable or credible” and that defense attorneys “might argue that AI introduced errors or biases, casting doubt on the officer’s account, regardless of the officer’s oversight or edits.”
While the tools promise to save officers time so they can spend more of it on the street, PORAC said the disclosure mandates would undermine those potential benefits and increase administrative work.
The bill is currently with the Assembly Appropriations Committee. If approved there, it will go to a floor vote later this month.
The bill is narrow by design, Chatfield said. It doesn’t prohibit the use of AI or dictate which programs can be used.
“All we’re saying is you have to be transparent about it,” she said. “That’s it.”