Character.AI ends open-ended chatbot conversations for minors

Teens are attempting to determine where they fit in a world that is changing faster than it did for any generation before them. They are overemotional, overstimulated, and chronically online. And now artificial intelligence companies have handed them chatbots designed to never stop talking. The results have been disastrous.

One company that understands these impacts is Character.AI, an AI-powered role-playing startup whose platform has been linked to at least two teenagers dying by suicide after long conversations with its AI chatbots. Now Character.AI is making changes to its platform to protect teens and children, changes that could hurt the startup’s financial results.


“The first thing we decided to do as Character.AI is to remove the ability for users under 18 to engage in any open AI chats on our platform,” Karandeep Anand, CEO of Character.AI, told TechCrunch.

Open-ended conversation refers to the unrestricted back-and-forth that takes place when users send a message to the chatbot and it responds with follow-up questions, which experts say are designed to keep users engaged. Anand argues that this sort of interaction – in which the AI acts as a conversation partner or friend rather than a creative tool – is not only dangerous for children, but also conflicts with the company’s vision.

The startup is attempting to move from “AI companion” to “role-playing platform.” Instead of talking to an AI friend, teens will use prompts to create stories or generate visuals together. In other words, the goal is to shift engagement from conversation to creation.

Character.AI will phase out chatbot access for teens by November 25, starting with a two-hour daily limit that will gradually decrease until it reaches zero. To ensure the ban applies to users under 18, the platform will deploy an in-house age verification tool that analyzes user behavior, alongside third-party tools such as Persona. If those tools fail, Character.AI will use facial recognition and ID verification to confirm age, Anand said.

This move follows other teen safety measures Character.AI has implemented, including the introduction of a parental insights tool, filtered characters, limited romantic conversations, and time-spent notifications. Anand told TechCrunch that those changes cost the company a large portion of its under-18 user base, and he expects these new changes to be equally unpopular.


“It’s safe to assume that many of our teenage users will be disappointed… so we expect continued churn,” Anand said. “It’s hard to speculate – will everyone leave completely, or will some move on to the new experiences we’ve been building over the last almost seven months?”

As part of Character.AI’s push to rework the platform from a chat-centric app into a “full-fledged content-driven social platform,” the startup recently launched several new entertainment features.

In June, Character.AI rolled out AvatarFX, a video generation model that transforms images into animated videos; Scenes, interactive, pre-populated stories where users can step into the narrative with their favorite characters; and Streams, a feature that enables dynamic interactions between any two characters. In August, Character.AI launched Social Channel, a feed where users can share the characters, scenes, videos, and other content they create on the platform.

In a statement addressed to users under 18, Character.AI apologized for the changes.

“We know that most of you use Character.AI to spark your creativity in a way that is within the boundaries of our content policies,” the statement reads. “We do not take this step of removing open character chat lightly, but we believe it is the right decision given the emerging questions about how teens are and should be interacting with new technology.”

“We will not close the app to people under 18,” Anand said. “We’re closing open chats to people under 18 only because we hope that users under 18 migrate to these other experiences, and that these experiences will get better over time. That’s why we’re focusing on AI games, AI short films, and AI storytelling in general. That’s a big bet we’re making on bringing back people under 18 if they leave.”

Anand acknowledged that some teenagers may be eager to use other artificial intelligence platforms, such as OpenAI, that allow them to have open conversations with chatbots. OpenAI also came under fire recently after a teenager took his own life following long conversations with ChatGPT.

“I really hope that we are leading the way by setting the standard in the industry that for people under 18, open chats are probably not the right path or product to offer,” Anand said. “I think the compromise is the right one for us. I have a six-year-old and I want to make sure she grows up in a very safe environment with artificial intelligence in a responsible way.”

Character.AI is making these decisions before regulators force its hand. On Tuesday, Sens. Josh Hawley (R-MO) and Richard Blumenthal (D-CT) said they would introduce legislation banning AI chatbots from being made available to minors, following complaints from parents who said the products encouraged their children to engage in sexual conversations, self-harm, and suicide. Earlier this month, California became the first state to regulate AI companion chatbots, holding companies liable if their chatbots fail to meet legal safety standards.

In addition to these platform changes, Character.AI said it would establish and fund the AI Safety Lab, an independent nonprofit organization dedicated to developing safety solutions for future AI-powered entertainment features.

“There’s a lot of work going on in the industry on coding and development and other use cases,” Anand said. “We don’t think enough work has been done yet on AI powering entertainment, and safety will be very important here.”
