OpenAI says ChatGPT might require age verification someday


Amid a torrent of controversy surrounding ChatGPT and teenage mental health, OpenAI says it’s working on new safeguards for the future.

CEO Sam Altman wrote in a short and somewhat vague company blog post this week that the firm is working on an automatic age-detection feature that would, in theory, intelligently place users under the age of 18 into a restricted version of ChatGPT. As part of that system, adults may eventually need to provide some kind of proof of age to use the unrestricted version of the chatbot. It should be noted, however, that Altman provided very few specifics, including a timeline for when this might roll out.

The announcement came as the parents of Adam Raine, a 16-year-old whose death by suicide was allegedly assisted by ChatGPT, spoke to Congress about potentially regulating the chatbot. Raine’s family filed a wrongful death lawsuit against OpenAI in late August.

Around the same time, the company confirmed that parental controls will come to ChatGPT in late September. These will include the ability for parents to link their accounts to their children’s and restrict access to the app. Parents will also receive notifications when ChatGPT detects distress on the part of an underage user, and the company may notify law enforcement if a parent can’t be reached.
