Author: Barbara Wabara
The chatbot that lets you have surprisingly realistic conversations with AI-generated personas of athletes, celebrities and politicians has had its fair share of controversies. We walk you through what you need to know.
There are a lot of AI chatbots out there, but Character.AI is an artificial intelligence chatbot with a difference. Unlike traditional chatbots like ChatGPT, Claude and Gemini, which are primarily for productivity, research and answering questions, Character.AI was built for entertainment, role-playing, companionship and interactive storytelling. You can even use Character.AI to practice a new language and play games.
The brainchild of its namesake company, Character.AI was launched in November 2021 by former Google engineers Noam Shazeer and Daniel De Freitas. The service ran in beta until September 2024; today you're directed to the website or to mobile apps on iOS and Android, while community forums have moved to Discord.
According to Character.AI, it now supports more than 20 million monthly active users. With nearly 200 million total visits split roughly evenly between male and female visitors, it is most popular with 18- to 24-year-olds.
Character.AI allows you to talk to AI versions of your favorite celebrities, sports stars and world leaders, and because it can hold surprisingly realistic conversations, it is very popular among those looking for engaging dialogue rather than straightforward answers. Another interesting feature is that multiple users can interact with these characters simultaneously, creating a more communal experience.
But it's not without its controversies. Character.AI has been hit with multiple lawsuits about the content it has generated, especially with younger users. There are also concerns about AI chat addiction and unauthorized depictions of real people -- though it does have a disclaimer that you're talking to an AI bot, not a real person, with fictional responses.
"I think AI-driven chatbots like Character.AI will only grow in popularity, especially if we see better regulation and ethical oversight," Fang Liu, professor at the University of Notre Dame and co-editor-in-chief of ACM Transactions on Probabilistic Machine Learning, told CNET. "I don't think AI will replace real human relationships, but it could polarize social behavior."
I'll dive into all that and everything else you need to know about Character.AI.
Character.AI runs on large language models (LLMs). It uses natural language processing (NLP) and deep learning to create more natural and engaging conversations. Like other AI chatbots, it's trained on massive amounts of text data, which lets it recognize context and predict responses.
Unlike most chatbots, which often come across as overly agreeable and apologetic, Character.AI lets you design characters that adjust their tone based on what you say, and can even sound sassy, like Librarian Linda in the example below.
Technically, you can prompt tools like ChatGPT to adopt specific personalities, too, but Character.AI's responses sound more relatable than generic chatbot replies. For example, it can detect emotions in your input and respond in an appropriate emotional tone. It maintains short-term context within a conversation, making exchanges feel fluid, though it doesn't retain long-term memory.
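To see what "short-term context without long-term memory" means in practice, here's a toy sketch: a persona prompt plus a rolling window of recent turns that an LLM would see when generating its next reply. This is purely illustrative, not Character.AI's actual implementation, and the class and field names are invented for this example.

```python
from collections import deque

class PersonaChat:
    """Toy model of a persona chatbot with short-term memory only."""

    def __init__(self, persona, max_turns=5):
        self.persona = persona                  # persona description prepended to every prompt
        self.history = deque(maxlen=max_turns)  # rolling window acts as short-term context

    def build_prompt(self, user_message):
        """Assemble the text an LLM would see when generating the next reply."""
        self.history.append(("user", user_message))
        lines = [f"[persona] {self.persona}"]
        lines += [f"[{role}] {text}" for role, text in self.history]
        return "\n".join(lines)

    def record_reply(self, reply):
        self.history.append(("bot", reply))

chat = PersonaChat("Librarian Linda: sassy, book-obsessed", max_turns=3)
prompt = chat.build_prompt("Recommend me a mystery novel.")
chat.record_reply("Only if you promise to return it on time.")
```

Because the deque has a fixed length, older turns silently fall out of the window as the conversation continues, which is why a long chat can feel fluid in the moment yet "forget" things said much earlier.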
From historical figures and celebrities to fictional characters and user-created personas, there's a character for everyone to talk to. Because of its popularity among younger crowds, many characters come from anime and gaming communities.
One of the most popular characters is Gojo Satoru from Jujutsu Kaisen, with over 785 million chats. Celebrities like Beyoncé and Elon Musk, along with athletes such as Cristiano Ronaldo and Lionel Messi, have several character chatbots, each with a few million chats. Meanwhile, Kylie Jenner, South Korean boy band BTS and Harry Styles have multiple bots, each accumulating tens of millions of chats.
Certain celebrity characters, like Taylor Swift, have reportedly been removed because of terms of service violations. If none of the existing characters are to your liking, you can easily create your own and define their personality, appearance and even voice.
Start with choosing a name that forms the foundation of your character's identity. Next, you'll craft an opening greeting to set the tone for how your character interacts. You can customize it further by selecting a voice, adjusting the conversation style to be more casual or formal and fine-tuning its tone to match its personality.
Privacy settings let you decide who can interact with your character, whether you keep it private, share it via a link or make it fully public. Once your character is live, you can chat with it, tweak its responses and refine its personality over time to make it feel even more real.
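The creation steps above boil down to a handful of settings. As a minimal sketch, they could be modeled like this; the field names and defaults are hypothetical and don't reflect Character.AI's real schema.

```python
from dataclasses import dataclass

# Hypothetical field names for illustration only;
# Character.AI's actual character schema is not public.
@dataclass
class CharacterDefinition:
    name: str                    # foundation of the character's identity
    greeting: str                # opening message that sets the tone
    voice: str = "default"       # selected voice
    style: str = "casual"        # casual or formal conversation style
    visibility: str = "private"  # private, link-only or public

linda = CharacterDefinition(
    name="Librarian Linda",
    greeting="Shhh! What do you want?",
    style="casual",
    visibility="public",
)
```

Treating visibility as just another field mirrors how the privacy settings work: a character starts private and only becomes shareable or public when you change that one switch.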
Compared with other AI companion chatbots on the market, Character.AI is most similar to tools like Chai and Replika.
While Character.AI allows you to switch between multiple characters, Replika focuses on building an emotional bond with a single AI personality. Chai focuses on casual AI conversations with various premade characters, but lacks the deep customization and personality-driven responses that Character.AI offers.
Character.AI's basic version is free, and a premium subscription called Character.AI+ costs $10 per month. Subscribers can skip queues for faster access during peak demand and get early access to new features, plus a dedicated community forum.
Like any AI tool, Character.AI has limitations. It can sometimes generate inaccurate information, called hallucinations, or responses that don't quite align with a character's intended persona. It's important to remember that every chat has a disclaimer that these are AI bots, not real people, and their responses should be taken as fiction.
While these interactions can be entertaining, experts worry about their impact on social behavior, especially among younger users. "If we don't address the root cause, we'll keep seeing users, especially teens, forming emotional dependencies on AI rather than real human connections," Brandon Purcell, vice president and principal analyst at Forrester Research, told CNET.
AI companionship may blur the lines between reality and artificial relationships, raising concerns about anthropomorphism, the phenomenon in which people attribute human emotions and consciousness to AI, form deep emotional attachments and sometimes treat these interactions as genuine relationships.
Some users, particularly teenagers, also report AI chat addiction, spending excessive time chatting with AI rather than engaging with people in person. "Regulation alone won't solve AI addiction. The real issue is the epidemic of loneliness driving people to these tools in the first place," Purcell warned. The company has introduced new safety measures to address concerns about teen usage; Character.AI requires users to be at least 13 years old, or 16 in the EU.
Moderation challenges have also led to legal issues because much of its content is user-generated. The company has faced multiple lawsuits after reports surfaced of AI-generated characters providing sexual content to minors and encouraging violence and self-harm, including a case linked to a teen's death by suicide.
Chelsea Harrison, head of communications at Character.AI, told CNET the company could not comment on pending litigation but said, "Character.AI takes safety on our platform seriously and our goal is to provide a creative space that is both engaging and safe. We are always working toward achieving that balance." The company also announced support for The Inspired Internet Pledge, created by the Digital Wellness Lab, and has partnered with ConnectSafely, an organization focused on online safety, privacy, security and digital wellness.
Another controversy is the creation of AI characters based on real people, including deceased individuals and celebrities, sometimes with extreme or misleading attributes that could potentially harm their reputations. However, unauthorized depictions of real people remain a gray area. While copyright law protects fictional characters, it doesn't cover an individual's speech or personality. Publicity rights prevent commercial misuse of a celebrity's likeness, but Section 230 of the Communications Decency Act largely exempts platforms like Character.AI from liability for user-created content.
The company's terms of service prohibit unauthorized use of real people's likenesses and voices, but enforcement depends mainly on user reporting rather than proactive moderation. In contrast, companies like Meta have partnered with celebrities who explicitly authorize using their likeness in creating generative AI versions for interactive purposes.
In late January, the company made several announcements. It launched a Games beta, introducing three interactive challenges with AI characters. For a limited time, you can play Speakeasy, where you must get a character to guess a secret word without using any forbidden words from the game card. War of Words is a game where you compete against a character in creative contests judged by a referee. Match Me if You Can will test how well you know your character's preferences. Along with the games, Muted Words (web version only) lets you block specific words from appearing in chats. That is a handy feature, especially for younger users.
Artificial intelligence is evolving faster than any other technology in history, and tools like Character.AI are continually improving. As these platforms advance, they'll open up even more ways to interact with digital personalities. For now, Character.AI is a fun way to experiment with different characters, get creative and see what AI can do beyond just boosting your productivity.
If you're struggling with negative thoughts or suicidal feelings, resources are available to help. In the US, call the 988 Suicide & Crisis Lifeline at 988. In the UK, call the National Suicide Prevention Helpline UK at 0800 689 5652. In Australia, call Lifeline at 13 11 14. You can also find help at these 13 suicide and crisis intervention hotlines.