By Katie Bente
WEST PALM BEACH, Fla. (CBS12) — Florida’s new law criminalizing nonconsensual deepfake pornography took effect Oct. 1, marking one of the strongest state responses yet to the rise of artificial intelligence in online abuse.
A deepfake is an image, video or audio clip that has been digitally altered — often using artificial intelligence — to make it appear that someone said or did something they never did. In the case of “deepfake porn,” an innocent photo can be manipulated to strip away clothing or place a person’s face onto explicit material.
The measure, House Bill 757, makes it a felony to create, share or even possess AI-generated sexually explicit images of someone without consent. It also allows victims to sue for damages and requires platforms to create takedown systems by the end of 2025.
The law builds on the federal Take It Down Act, signed in May, which requires websites and apps nationwide to remove intimate images — real or AI-generated — within 48 hours of a victim’s request. Florida’s statute goes further, extending criminal liability to creating and even possessing such images and exposing offenders to arrest, prosecution and prison time under state law.
At just 14 years old, Elliston Berry woke up to a nightmare.
The Texas freshman’s phone was buzzing with messages. Friends warned her that nude photos of her were circulating around school. But they weren’t real.
A classmate had pulled an innocent photo from her Instagram account, run it through an app and used artificial intelligence to digitally strip away her clothes.
“This was not a great way to wake up, and I was truly disgusted,” Berry told CBS12 News. “My brain went to so many different areas. I had so many questions, but I was just speechless.”
Berry, now 16, says she felt her innocence was erased “pixel by pixel.” She remembers the shame of having to tell her mom, and the humiliation of convincing classmates the photos weren’t real.
“I felt like I had to prove my point to everyone who had seen these images that they were fake, but I couldn’t get any words out,” she said.
Her mother, Anna McAdams, said schools and police were unprepared to respond.
“The school wouldn’t really help us, and then our local law enforcement didn’t know what to do with it,” McAdams said. “It was kind of like we were just running in circles.”
Parents tried contacting Snapchat, where some of the images were shared, but received no response. Even a warrant, McAdams said, was treated as “good faith only.”
The classmate who created the deepfake was eventually identified and charged as a juvenile. But McAdams said the damage was already done.
With few options, she took her daughter’s story public. They met with lawmakers in Washington, including Sen. Ted Cruz, and with First Lady Melania Trump, who highlighted Elliston’s case to raise awareness. Their push helped shape the national Take It Down Act.
What Florida’s law does
HB 757 created a new section of state law, Section 800.045, specifically targeting AI-altered sexual images.
The law makes it a third-degree felony to create, solicit or knowingly possess such images, punishable by up to five years in prison.
Possession with intent to distribute is treated more harshly as a second-degree felony, carrying up to 15 years.
Victims are also given the right to sue, with damages starting at $10,000 per violation plus attorney’s fees.
Each image is considered a separate offense, meaning multiple charges can stack quickly.
The measure requires websites and apps to create reporting systems by Dec. 31, 2025. Once notified, platforms must remove flagged images within 48 hours and attempt to scrub copies. Companies that act in good faith are shielded from liability, even if a takedown later proves mistaken.
Gov. Ron DeSantis signed the bill on May 27, calling it a necessary protection for children in the digital age.
Cases like Berry’s reflect a wider trend. Monitoring groups report nonconsensual deepfake cases surged nearly 400% in 2024, with most victims being women and minors.
“It takes a toll on your mental health,” Berry said. “It gives you fear about the future. I was scared about colleges, jobs, and how people would look at me if those images stayed online.”
McAdams said prevention must start at home.
“You’ve got to be proactive as a parent,” she said. “Go ahead and have those conversations. It could happen to a child, a teenager, or an adult.”
Florida is one of more than a dozen states that have passed legislation to combat deepfake abuse. The Take It Down Act established a national baseline for removal, but states like Florida are going further by attaching their own criminal penalties, including for creation and possession.
Legal experts say the combination of federal and state laws finally gives victims both a clear path to removal and a way to punish perpetrators.
Experts recommend victims take immediate steps.
Preserve evidence by saving screenshots, copying links and recording usernames, dates and times. Report the content directly to the platform; under federal law, companies must remove nonconsensual intimate images within 48 hours of receiving a valid notice.
Victims should also file a police report.
In Florida, authorities say it is important to reference Section 800.045, which makes deepfake pornography a felony.
In addition to criminal charges, victims may pursue a civil lawsuit to seek damages and a court order requiring removal of the images.