AI companions are dangerous and not meant for minors

By Dallas Morning News Editorial | Aug. 23, 2025 | Updated 1:30 a.m. CDT | 3 min. read

Be wary of the rise of AI companions. Teenagers, and plenty of adults too, are quickly bonding with new digital friends created by a largely unregulated industry growing at a rapid pace. What could go wrong?

Not surprisingly, there are plenty of red flags, according to a recent Common Sense Media survey of teenagers. There is enough in this study to send shivers down the spines of parents, teachers and policymakers. We agree with the study’s overall recommendation: No one under 18 should be using AI companions.

Teenagers can text or talk with these digital creations and ask for advice. A staggering 31% said their conversations with AI companions were “as satisfying or more satisfying” than talking with real friends. While only 23% said they trust AI’s advice, younger teens (13-14) tended to be more trusting. Another concern: 12% are already using AI companions for emotional or mental health support. More than two-thirds of teenagers have used AI companions and over half of those are regular users.

Earlier this week, Attorney General Ken Paxton opened an investigation into artificial intelligence chatbot platforms, including Meta AI Studio and Character.AI. Paxton is looking into possible deceptive trade practices since some of these platforms are being marketed as mental health tools, according to a statement.

Paxton’s office is already investigating Character.AI and 14 other tech companies for potential violations of the Securing Children Online Through Parental Empowerment Act — known as the SCOPE Act — and the Texas Data Privacy and Security Act. The SCOPE Act requires companies to provide parents with tools to manage and control their children’s privacy settings, and it limits the collection of minors’ data. The TDPSA imposes notice and consent requirements on companies that collect children’s data, including artificial intelligence products.

These tools, however, are limited. There is some promise in the App Store Accountability Act, recently signed into state law. This statute requires Google and Apple to implement age verification to prevent minors from downloading apps, which should be the bare minimum, but minors can still find ways to access these platforms.

That act is a start, but we would like to see more responsibility placed on actual platforms and apps rather than stores that distribute them.

Tech companies, which usually fight any attempt at legislation, should at least build more robust safety features and crisis intervention systems into their products, and should not allow these digital creations to pose as mental health professionals.

Character.AI, for instance, has a user-created bot called Psychologist that is in high demand among young users. Meta AI Studio doesn’t offer therapy bots, but according to TechCrunch, children can still use its chatbot for therapeutic purposes.

The use of AI companions is rightly raising serious questions. The mother of a 14-year-old who became obsessed with a Character.AI bot and took his own life is suing the company.

Social media companies are usually protected by Section 230 of the Communications Decency Act, a 1996 federal law that protects platforms from liability for what their users post. We don’t think this law should extend to AI platforms that are serving up content they generate.

In the end, we are playing catch-up with whatever legal avenues are available. This technology is evolving so fast that Big Tech needs to take responsibility and not only tout the benefits, but also admit the potential dangers.

Dallas Morning News editorials are written by the paper's Editorial Board and serve as the voice and view of the paper. The board considers a broad range of topics and is overseen by the Editorial Page Editor.
