San Francisco artist Tiana Oreglia has been closely watching how transparency rules surrounding generative artificial intelligence unfold.
“There’s a lot of caginess about what’s in those datasets” built by generative AI companies, said Oreglia, 31, who works as a freelance concept artist for video game companies.
In the past, OpenAI and other AI companies have given vague answers about where they get the data used to develop their systems, known as models. Oreglia said she wants to tell them, “Hey, my work is actually there, and I don’t want it to be there.”
Generative AI systems learn from training data, such as images, text, video and audio, to identify patterns that they recombine when prompted to generate new text, images, video and audio. Many creative professionals, including visual artists, writers, actors, singers and musicians, are concerned that companies are feeding existing creative work into data troves and using generative AI to produce content based on their originals, without credit, compensation or permission.
Oreglia is a member of the Concept Art Association, a Los Angeles-based nonprofit advocacy organization representing artists who design the visual elements of movies, video games and animation. The group supported California’s generative AI dataset disclosure law, which Gov. Gavin Newsom signed last fall and which goes into effect Jan. 1, 2026.
The legislation mandating generative AI companies to acknowledge copyrighted material in training data is “solid concrete action” to bring some transparency to “an unscrupulous, opaque and predatory model for data acquisition,” the Concept Art Association stated in support of the disclosure law.
Nicole Hendrix, co-founder of the association, said a lot of creatives “feel violated — because you basically had your portfolio stolen and then used against you and replacing you” with generative AI.
According to think tank CVL Economics, by 2026, generative AI could negatively affect 62,000 of California’s entertainment jobs — some would be consolidated, some would be replaced and some would be eliminated. The association’s leaders say strong legislation can protect the livelihood of creative workers.
In February, state Assemblywoman Rebecca Bauer-Kahan introduced a generative AI copyright transparency bill that would complement the state’s existing dataset disclosure law. The new bill would require generative AI companies to report to copyright owners how their copyrighted materials are being used. The association also supports the bill.
Jennifer King is a privacy and data policy fellow at the Stanford Institute for Human-Centered Artificial Intelligence. She said that California, as a base for both prominent creative and tech sectors, is in a unique position to shape AI policy.
“We’re going to see states taking a lot more action than the federal government on AI,” she said.
San Francisco-based OpenAI recently asked the Trump administration to advance a federal AI strategy to help companies avoid “overly burdensome state laws” and allow companies to continue to collect copyrighted material. TechNet, a national association of technology leaders that counts Google, Apple, Amazon, Meta and OpenAI among its members, also recently suggested that the federal government should “impose a moratorium on state legislation” until national standards were adopted.
There is “no consensus about at what level of generality or specificity the law requires” for California, said Pamela Samuelson, a professor at UC Berkeley Law. The European Union has a similar training data disclosure rule in its EU Artificial Intelligence Act, and “Gen AI developers are working toward a consensus about what level of disclosure is required,” she said.
Income ‘dried up’
Oreglia, a character designer for the video game Voodoo Detective, suspected she had already lost income due to the proliferation of generative AI, but she didn’t know how much. She said the calculation is difficult for younger or less experienced artists who don’t have as many established client relationships, whereas more experienced creatives might have a clearer sense when they lose work to generative AI.

[Photo: Yuri Nagano / San Francisco Public Press. Tiana Oreglia, a character designer for the video game Voodoo Detective, says she suspects she has already lost income due to the proliferation of generative AI, but she doesn’t know how much.]

Veteran artists, such as Michigan-based Reid Southen and Nashville, Tenn.-based Kelly McKernan, have been vocal about their financial losses. McKernan said that “life-sustaining gigs have all but dried up” because of generative AI, and she has taken part-time jobs to pay her bills.
Southen, an artist for major Hollywood films such as “The Hunger Games,” said at a Los Angeles conference last fall that his 2023 income was cut by almost half amid the proliferation of generative AI. His income had been “fairly steady” leading up to 2023, he said, before the explosive rise of tools from generative AI companies such as OpenAI and Midjourney.
“AI is really eating people’s lunches now,” he said.
L.A.-based Andrew Leung, a veteran artist who has worked on big-budget films like Marvel’s “Black Panther: Wakanda Forever,” said he has also seen less work due to generative AI. In 2023, Leung worked on a project where he was promised eight weeks of work, only to see the scope cut by half “because the art director had decided to use a lot of AI,” he said. Last fall, Leung returned to a film job with full union benefits to support his family, which includes two children.
Oreglia has spoken out about the effects of generative AI on her art community. She testified at a California State Assembly hearing last year and shared statistics on how more than three-quarters of artists and designers in the U.S. are freelance. Generative AI, she said, is “taking over more and more work, especially from freelance artists, because we’re easy to cut out.”
Oreglia said she has proof that AI companies are taking her work without her permission. She discovered artwork from her high school years in datasets used to train generative AI models. Beyond Oreglia’s artwork, the dataset included “all sorts of personal data, along with more nefarious things like child abuse material,” she said. She said she has reached out to around a dozen state senators, including her own, Scott Wiener, and has demanded better protection for artists.
In addition to possible copyright, privacy and labor violations, she points to less obvious harms, such as AI companies generating copycat imitations of artists’ original works and flooding the internet with “a slog of material.” The AI mimics hurt creative professionals, Oreglia said, because their original art is now much harder to find.
What’s next?
Oreglia and other creatives are watching how generative AI companies comply with rules on disclosing training data sources. The office of the law’s lead sponsor, state Assemblywoman Jacqui Irwin, has indicated that enforcement would be based on the state’s Unfair Competition Law, which includes civil penalties of up to $2,500 per violation.
Matthew Butterick is an attorney in copyright infringement cases brought by creatives against companies that use data for AI training without owners’ consent. One of the cases is a class action against Stability AI, Midjourney and other generative AI companies. Plaintiffs include San Francisco-based artist Karla Ortiz, who designed the title character for Marvel’s 2016 film “Doctor Strange.”
“We certainly take the view in all of our cases that the infringement is happening right away when these companies are amassing these huge, essentially libraries,” Butterick said, adding that the repositories hold “millions or billions of other people’s work.” The class action is being heard in the U.S. District Court for the Northern District of California in San Francisco.
OpenAI and Midjourney, which are based in San Francisco, and Stability AI, which is based in London, did not respond to requests for comment.
Others are finding ways to use technology to protect creative work. Ben Zhao, a computer science professor at the University of Chicago, and his research lab of doctoral students have developed Glaze, a cloaking tool that adds a disruptive layer, imperceptible to the human eye, to digitized images, preventing generative AI from mimicking the original artwork. Zhao’s team has also created Nightshade, a companion tool that alters images so they “poison,” or corrupt, models trained on them. Both Glaze and Nightshade have been downloaded by creatives millions of times since 2023.
If enough artists use such protective tools, making it difficult for generative AI companies to take their work without permission, the industry might be pushed to license work from creative artists. Generative AI companies are “going to keep getting more issues in their models, and it’s going to increase their production costs and training and testing and so presumably, I think they’re going to start pushing for more licensing deals,” Zhao said.
Some changes are already taking hold. Generative AI is eliminating a large number of entry-level roles and cutting off the nurturing of up-and-coming creatives, said Hendrix of the Concept Art Association. The next generation will have fewer master-level artists, and “it’s a very shortsighted thing to do” to keep feeding generative AI with stolen creative work, she said.