Character AI Introduces Parental Supervision Tools to Enhance Teen Safety

Following a string of lawsuits and criticism over allegedly failing to protect its underage users from harm, Character AI, the startup that lets users create AI characters and talk to them over calls and texts, said on Tuesday that it is rolling out a new set of parental supervision tools to improve safety for its teenage users.

Image: Character AI

Character AI will now give parents and guardians a summary of their teens’ activity on the app via a weekly email. According to the company, the email will show the average time a child spends on the app and on the web, the time spent talking to each character, and the top characters the teen interacted with during the week.

The startup says this data is intended to give parents insight into their teens’ engagement habits on the platform. It specified that parents do not get direct access to the chats themselves.

Following the lawsuits, the startup last year added safety measures such as a dedicated model for users under 18, time-spent notifications, and disclaimers reminding users that they are chatting with AI-powered characters. The company also blocked sensitive content on both input and output by creating new classifiers for teens.

Earlier this year, the startup filed a motion to dismiss a lawsuit that alleged the company had played a part in a teen’s suicide.
