Roblox Announces Major Safety Overhaul with Age-Based Accounts for Children and Teens
Popular social gaming platform Roblox has unveiled significant new safety measures, including age-based accounts for children and teenagers, in response to growing concerns over child grooming and inappropriate content. The platform, which hosts a vast catalogue of user-created games and claims an estimated 380 million users globally, will roll out the changes from June, alongside expanded parental controls designed for users under the age of 16.
New Account Types and Enhanced Parental Controls
The two new account categories are Roblox Kids, tailored for children aged between five and eight, and Roblox Select, aimed at those aged nine to 15. Founder and CEO David Baszucki emphasized in an online announcement that these accounts will "more closely align content access, communication settings, and parental controls with a user's age." He added, "We're also establishing an ongoing selection process for games available to users under 16."
Key features of the new system include:
- Roblox Kids accounts: Users will only have access to games with minimal or mild content maturity labels. The chat function will be disabled by default, and a distinct background color will indicate the account type across the app.
- Roblox Select accounts: These users can access games rated with moderate content maturity, also featuring a unique background color for easy identification.
- Automatic progression: As users age, they will automatically transition through the age-based accounts, ensuring continuous age-appropriate settings.
Additionally, parents will gain greater control, with options to block specific games or approve new ones, empowering families to manage their children's gaming experiences more effectively.
Government Concerns and Regulatory Compliance
The safety changes come after Communications Minister Anika Wells met with Roblox representatives in February, expressing alarm over reports of graphic and gratuitous media, including sexual and suicide-related content, affecting children on the platform. In a letter prior to the meeting, Wells stated, "Even more disturbing are ongoing reports and concerns about children being approached and groomed by predators, who actively seek to exploit their curiosity and innocence."
While Roblox is not explicitly covered by the Australian government's social media ban for under-16s, which began in December 2025, the platform faces new regulatory pressures. Under recent legislation, gaming services can be fined up to $49.5 million by the eSafety Commissioner for non-compliance. New codes targeting age-restricted material like pornography and self-harm, effective from March 9, will also apply to Roblox, alongside requirements to combat grooming and sexual extortion.
Future Steps and Content Rating Transition
To further enhance safety, Roblox plans to transition to using the Australian Classification Board for assigning content ratings to games later in 2026. This move aims to help families more easily identify age-appropriate content, integrating age checks, account-level defaults, content ratings, ongoing moderation, and expanded parental controls into a unified framework for younger users.
For users who have not completed an age check, access will be restricted to games rated minimal or mild, with all communication features unavailable. Once an age check is completed, users will automatically be placed into the appropriate age-based account, streamlining the safety process.
These comprehensive updates reflect Roblox's commitment to addressing child safety concerns head-on, as the platform continues to evolve in a rapidly changing digital landscape.