Roblox is rolling out a global age verification system that uses an AI-powered selfie scan to better manage interactions between adults and minors.
By the end of 2025, Roblox will require every user who wants access to communication features to complete an age check. The process combines AI-driven facial analysis, ID verification, and, for minors, parental consent. The aim is to determine users' real ages accurately and to limit communication between adults and children.
This marks a shift from the current system, which only asks for a birthdate at account creation. With the new approach, adults won't be able to freely contact children on the platform unless they're connected in real life.
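Roblox has not published implementation details, but the flow it describes can be pictured as a simple decision: take the strongest available age signal and, for minors, require parental consent before unlocking communication. The TypeScript sketch below is purely illustrative; every type, field, and step is an assumption rather than Roblox's actual code.

```typescript
// A minimal, hypothetical sketch of the age-check flow described above.
// All names and structures are assumptions, not Roblox's implementation.

type AgeCheckInput = {
  estimatedAge: number | null;   // age estimate from the AI selfie scan, if one was taken
  idVerifiedAge: number | null;  // age confirmed via ID verification, if provided
  parentalConsent: boolean;      // consent flag from a linked parent account (relevant for minors)
};

type AgeCheckResult = {
  verifiedAge: number | null;
  communicationUnlocked: boolean;
};

function resolveAgeCheck(input: AgeCheckInput): AgeCheckResult {
  // Prefer the ID-verified age and fall back to the selfie-based estimate.
  const verifiedAge = input.idVerifiedAge ?? input.estimatedAge;

  if (verifiedAge === null) {
    // No usable age signal: communication features stay locked.
    return { verifiedAge: null, communicationUnlocked: false };
  }

  // Minors additionally need parental consent before communication is unlocked.
  const isMinor = verifiedAge < 18;
  return { verifiedAge, communicationUnlocked: isMinor ? input.parentalConsent : true };
}

// Example: a 15-year-old whose selfie scan succeeded and whose parent consented.
console.log(resolveAgeCheck({ estimatedAge: 15, idVerifiedAge: null, parentalConsent: true }));
```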
Age verification to enforce communication rules
The move is part of a broader push to tighten oversight of user behavior. Roblox is deploying new systems to restrict communication across age groups. The company says its selfie-based age estimation is more reliable than a manually entered birthdate.
Text chats are already fully monitored and filtered on Roblox, and image sharing isn't allowed. Kids under 13 are blocked from using voice chat or sending private messages.
Parents can use a control menu to choose which features their children can access.
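Taken together, these rules amount to a per-user permission check. The sketch below shows one way such a check could look; the age cutoffs of 13 and 18 come from the rules above, while all names and data structures are hypothetical.

```typescript
// A hypothetical sketch of the communication rules described above.
// The age cutoffs (13 and 18) come from the article; every name here is illustrative.

type Account = {
  id: string;
  age: number;                        // verified age after the age check
  realLifeConnections: Set<string>;   // IDs of users confirmed as real-life contacts
  parentAllowsVoiceChat: boolean;     // toggle from the parental control menu
};

function canSendPrivateMessage(sender: Account, recipient: Account): boolean {
  // Users under 13 are blocked from sending private messages.
  if (sender.age < 13) return false;

  // Across the adult/minor boundary, contact requires a mutual real-life connection.
  const crossesBoundary = (sender.age >= 18) !== (recipient.age >= 18);
  if (crossesBoundary) {
    return sender.realLifeConnections.has(recipient.id) &&
           recipient.realLifeConnections.has(sender.id);
  }
  return true;
}

function canUseVoiceChat(user: Account): boolean {
  if (user.age < 13) return false;                       // blocked outright for under-13s
  if (user.age < 18) return user.parentAllowsVoiceChat;  // parental toggle for other minors
  return true;
}
```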
New rules and AI monitoring
In recent months, Roblox has raised age limits for certain content. Experiences labeled "Restricted" are now available only to users 18 and older, and content without an age rating is blocked. Social spaces with private rooms face tighter restrictions.
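In effect, access reduces to a gate on an experience's rating label and the viewer's verified age. The sketch below is an illustrative assumption: only the "Restricted" label is taken from Roblox's terminology, and the rest is invented for the example.

```typescript
// A hypothetical sketch of the content-rating gate described above.
// Only the "Restricted" label comes from the article; other labels and names are invented.

type ExperienceRating = "Restricted" | "Rated" | "Unrated";

function canAccessExperience(viewerAge: number, rating: ExperienceRating): boolean {
  if (rating === "Unrated") return false;               // content without an age rating is blocked
  if (rating === "Restricted") return viewerAge >= 18;  // "Restricted" experiences are 18+ only
  return true;                                          // other rated content follows its own age guidance
}
```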
Roblox says it has added more than 100 new safety features since early 2025. These include an AI system called Roblox Sentinel that detects possible sexual exploitation of minors, improved voice filters, and tools that automatically shut down suspicious servers. Another system analyzes avatars and gameplay to identify rule violations.