Roblox’s New Safety Features Include an AI That Will Guess Your Age Based on a Video Selfie

Roblox has announced a new range of safety features directed specifically at teenagers ages 13-17, including a new age estimation technology that uses AI to guess a user’s age based on a video selfie they submit.
Today’s announcement details several new features that Roblox says will improve teen and child safety on its platform. At the core of the announcement are features specifically for teens ages 13-17, giving them more freedom on the platform than younger children but still less than adults. Teens will be able to designate “trusted connections” on Roblox, with whom they can chat on the platform without filters. Per Roblox, the goal is to keep and better monitor teens’ conversations on its own platform rather than see them lured to third-party services where unmonitored conversations could become inappropriate.
Trusted connections are intended to be set between users who know one another well, and if a teen intends to set someone 18+ as a trusted connection, they can only do so using a QR code scanner or a contact importer.
In the past, Roblox has relied on the submission of a government ID to verify that users are 13+ or 18+ and unlock certain platform chat features. It is now implementing an alternative verification method: users can submit a “video selfie” to Roblox, and an AI will estimate whether the person in question is 13+ by analyzing the footage against “a large, diverse dataset.” Google began testing a similar feature earlier this year, and Meta did so the year prior.
In addition to these changes, Roblox is also adding new tools such as online status controls, a do not disturb mode, and controls for parents who have linked their accounts to a teen’s account.
Roblox has long been in an uncomfortable spotlight regarding its handling of children’s safety. In 2018, it made headlines when a mother reported that her seven-year-old daughter’s Roblox character was violently sexually assaulted by other players in-game, and separately a six-year-old girl playing Roblox was reportedly invited into a “sex room.” In 2021, People Make Games published a report on the ways in which Roblox’s business model allegedly exploits child labor. In 2022, Roblox faced a San Francisco lawsuit accusing it of enabling the financial and sexual exploitation of a 10-year-old girl. In 2023, it was sued both for allegedly facilitating “an illegal gambling ecosystem” and, more generally, for lax child safety protocols that allegedly led to financial loss and children’s exposure to adult content. Just last year, Bloomberg published a damning report highlighting the prevalence of child predators on the platform. That same year, the company said it had reported over 13,000 incidents of child exploitation to the National Center for Missing and Exploited Children during 2023, reports that resulted in the arrest of 24 individuals who allegedly preyed on children through the game.
“Safety has always been foundational to everything we do at Roblox,” said Roblox chief safety officer Matt Kaufman in a statement accompanying today’s feature news. “Our goal is to lead the world in safety and civility for online gaming. We are dedicated to supporting experiences that are both deeply engaging and empowering for players of all ages, while continuously innovating how users connect and interact.”
Rebekah Valentine is a senior reporter for IGN. You can find her posting on BlueSky @duckvalentine.bsky.social. Got a story tip? Send it to rvalentine@ign.com.