
Character.AI is rolling out a new narrative feature called Stories, a visual, choose-your-own-adventure format that lets users stitch together short interactive tales starring their favorite characters. On paper, it's a fun, image-driven update, but it's also Character.AI's first major attempt to rebuild the experience for teens after shutting down open-ended chats for users under 18 amid intense scrutiny, lawsuits, and widespread safety concerns.
Stories, according to the company, is a "structured, visual, multi-path format" meant to give teens a safe way to keep engaging creatively with the platform without the risks that came with freeform chat. The new mode lets users select two or three characters, choose a genre, write or auto-generate a premise, and then make choices as the story unfolds. It's replayable, designed for sharing, and built around user-generated worlds. And importantly, Character.AI positions it as a tool "built for all users — especially teens."
This pivot didn't come out of nowhere. Last month, Mashable reported that Character.AI would "no longer permit under-18 account holders to have open-ended conversations with chatbots," citing the company's own admission that open chat poses unresolved risks for younger users. CEO Karandeep Anand called the decision "bold," insisting it wasn't tied to any one scandal, but to broader questions about young people's use of chatbots.
But of course, this followed a wave of lawsuits, including wrongful-death cases and claims from parents who said their children had been sexually groomed or traumatized by explicit bot interactions.
Our reporting earlier this year extensively documented these harms. Teens encountered chatbots that acted out sexualized role-play, simulated assault, and urged them to hide conversations from parents — behavior that one parent described as “like a perfect predator.”
Safety advocates and attorneys told Mashable that if a human adult had initiated the kinds of sexual exchanges found on Character.AI, it would clearly constitute grooming or abuse. Experts warned that young users often don’t realize they’re being manipulated, and that the emotional fallout can mirror trauma from real-world exploitation.
Against that backdrop, Stories reads as Character.AI's attempt to reengineer the product around its youngest users, especially after limiting their chats to two hours a day and announcing a full shutdown of teen open-ended chat access after Nov. 25.
By giving teens a guided, genre-driven sandbox filled with branching choices instead of freeform chat, Character.AI is trying to thread an impossible needle: Keep young users invested in the platform while addressing concerns about safety, trust, and its own role in the emotional dependencies some teens developed.
The company promises Stories won't recycle sensitive or previously undetected content from old chats, and says more teen-friendly "AI entertainment" features, like gaming, are planned in the months ahead.
Safety advocates remain cautious. As one told Mashable back in October, the company’s new safeguards are a “positive sign” but also “an admission that Character AI’s products have been inherently unsafe for young users from the beginning.”