Character AI last fall cut teenagers off from its artificial intelligence chatbots, resulting in “tearful goodbyes to AI companions,” the Wall Street Journal reported.
Jeremy Straub
A 14-year-old's suicide after months of using the role-playing app drew significant attention to the potential harm that unsupervised access to chatbots can have on young people. Some called for access to be limited to adults, and Character AI responded.
While the death of any young person is tragic, restricting AI is not a good solution. Instead, we should educate young people about AI chatbots and provide ways for anyone — including chatbots and their developers — to call for help and ensure that resources are in place to respond.
Age-restricting AI poses several challenges. The first is that it is likely to be ineffective. Young people have shown incredible tenacity in bypassing age-restriction systems, whether by using a fake ID at a bar or a parent's credit card and ID to access online content. Worse, adults-only access rules may allow system operators to presume that minors are not using their services, relieving them of responsibility for those who slip through.
As alcohol use shows, an age restriction may not have the desired effect. Studies show benefits from a minimum drinking age of 21, but it has not eliminated underage use because, as teens say, “drinking is fun.” A study of Florida's drinking-age change found that it didn't decrease alcohol use; instead, it changed how and where alcohol was consumed.
Restricting similarly “fun” chatbots and AI tools may turn access into an aspirational goal, much as some teens treat fake IDs as a rite of passage.
AI age restrictions may also reduce system operators' concern with, and implementation of, mechanisms to aid minors, because operators can argue such use is illegal. Restricting access also denies teenagers the positive aspects of AI systems. And delaying AI use until 18 may do little beyond shifting the same challenges a few years later, to a time when young adults have less of a support system to help them address any negative consequences.
Education, not regulation, should be prioritized.
Understanding what AI is and how it works is part of coming of age in the 2020s. Students need to learn about AI in the same way they are taught about online misinformation. This education should cover how AI can help students in formal education and beyond. They must also learn its limitations, such as AI hallucinations.
These topics can be covered at age-appropriate levels throughout K-12 education and, for those who pursue it, collegiate and career education. Libraries, senior centers and other public education providers can reach those outside formal schooling.
In addition, the government must play a role in solving this issue. A national, easy-to-use system for reporting concerns is needed, one that limits liability risk for those who report. The system should leverage cellular and internet providers to quickly locate and help a person potentially in distress, whether a teenager, a young adult or someone older.
Everything provided to this system should be treated as confidential, to encourage use and complete disclosure while maintaining users' privacy. A mental-health first responder can then assess the situation and take appropriate action.
The capability for intervention already exists, but its resources are limited and not yet linked to chatbots and other AI systems, as they should be.
AI is poised to help society tremendously by improving medicine, helping the paralyzed, combating crime, improving crops and producing food. Education can help young people learn to use it effectively, rather than regulation attempting, ineffectively, to proscribe its use. The deployment of a national mental-health “call for help” capability can turn AI systems into good Samaritans that identify when their users may need help and connect them with immediate support.
Straub is the director of the North Dakota State University Cybersecurity Institute, a Challey Institute senior faculty fellow and an associate professor in computer science. He wrote this for InsideSources.com.

