
UF researchers: AI websites can produce explicit images of humans quickly and cheaply

2025-05-25 20:20:21

作者:Drew Dixon

Researchers say it may be time to consider serious regulation of AI websites that could be used to humiliate people with manufactured nude images.

Artificial intelligence (AI) sounds as if it would be a great enhancement to human life. But University of Florida (UF) researchers are warning that AI technology may become more intrusive than many people bargained for, and it can get explicit.

Professor Kevin Butler, a researcher in the University of Florida Department of Computer and Information Science and Engineering, says AI can be used to generate sexually explicit images of people without their consent, according to a UF news release.

Butler, along with other researchers from Georgetown University and the University of Washington, has been investigating a growing class of cyber tools that let users generate realistic nude images from uploaded photos of a real person. The tools require little skill and cost little to use, in an era with very little regulation covering the technology.

The research project is funded in part by the National Science Foundation.

“Anybody can do this,” said Butler, director of the Florida Institute for Cybersecurity Research. “It’s done on the web, often anonymously, and there’s no meaningful enforcement of age or consent.”

The research team has even coined a term to describe the non-consensual generation of nude images of real people. They call the activity SNEACI, which stands for synthetic non-consensual explicit AI-created imagery. They pronounce the acronym "sneaky."

“SNEACI really typifies the fact that a lot of these are made without the knowledge of the potential victim and often in very sneaky ways,” said Patrick Traynor, Professor and Associate Chair of Research in UF’s Department of Computer and Information Science and Engineering and co-author of a research paper with Butler.

The research project analyzed 20 of what are known as "nudification" AI websites. Researchers found the platforms allowed image uploads, and body and clothing components could be shaped and posed into explicit pictures within seconds.

Some may point out that tools such as Photoshop can also be used for this kind of image manipulation. But the AI technology is far more immediate and involves little expense.

“Photoshop requires skill, time and money,” Butler said in a news release. “These AI application websites are fast, cheap — from free to as little as six cents per image — and don’t require any expertise.”

Of the roughly 20 sites the researchers examined, few imposed any minimum age, and those that did had no real safeguards to verify users' ages.

“These images aren’t just created for amusement,” Butler said. “They’re used to embarrass, humiliate and even extort victims. The mental health toll can be devastating.”

Butler said the researchers concluded it may be time to seriously consider regulating such AI websites.

“There’s an emerging conversation in the machine learning community about whether some of these tools should be restricted,” Butler said. “We need to rethink how open-source technologies are shared and used.”


Drew Dixon

Drew Dixon is a journalist of 40 years who has reported in print and broadcast throughout Florida, starting in Ohio in the 1980s. He is also an adjunct professor of philosophy and ethics at three colleges, Jacksonville University, University of North Florida and Florida State College at Jacksonville. You can reach him at [email protected].


Summary

Researchers from the University of Florida, along with colleagues from Georgetown University and the University of Washington, are warning about the misuse of AI technology to generate realistic nude images without consent. This capability, termed SNEACI (synthetic non-consensual explicit AI-created imagery), is easily accessible on various websites that require minimal skill and cost. The researchers argue for stricter regulation due to the potential for harassment, humiliation, and extortion, highlighting the psychological impact on victims. They call for a reevaluation of how open-source technologies are shared and used to prevent such abusive applications.