The popular online gaming platform Roblox will introduce new communication restrictions next month, designed to prevent children from interacting with adult strangers. The move comes as the company faces multiple lawsuits alleging its platform has been exploited by predators targeting young users.
Beginning in December, Roblox will use facial age estimation technology to sort users into specific age groups and limit communication accordingly. The system will initially roll out in Australia, New Zealand, and the Netherlands before expanding globally in January.
Users will be grouped into six categories: under nine, nine to twelve, thirteen to fifteen, sixteen to seventeen, eighteen to twenty, and twenty-one and over. Children will only be able to chat with others in their own or nearby age groups. For instance, a twelve-year-old user would be able to chat with users aged fifteen and under but not with anyone sixteen or older.
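Roblox has not published implementation details, but the grouping rule reads like a band lookup followed by an adjacency check. The Python sketch below is a rough illustration only: the band boundaries, the function names, and the "adjacent band" allowance are assumptions inferred from the article's example, not Roblox's actual system.

```python
from bisect import bisect_left

# Hypothetical age bands mirroring the six groups described above.
# Upper edges of the first five bands: under 9, 9-12, 13-15, 16-17, 18-20; anything higher is 21+.
AGE_BAND_BOUNDS = [8, 12, 15, 17, 20]
AGE_BAND_LABELS = ["under 9", "9-12", "13-15", "16-17", "18-20", "21+"]


def age_band(age: int) -> int:
    """Return the index of the band an age falls into (0 = youngest)."""
    return bisect_left(AGE_BAND_BOUNDS, age)


def can_chat(age_a: int, age_b: int, max_band_gap: int = 1) -> bool:
    """Allow chat only when the two users' bands are within max_band_gap of each other.

    A gap of 1 is an assumed rule that reproduces the article's example:
    a 12-year-old (band 9-12) can reach the adjacent 13-15 band,
    but not anyone sixteen or older.
    """
    return abs(age_band(age_a) - age_band(age_b)) <= max_band_gap


if __name__ == "__main__":
    print(AGE_BAND_LABELS[age_band(12)])  # "9-12"
    print(can_chat(12, 15))               # True: adjacent bands
    print(can_chat(12, 16))               # False: two bands apart
    print(can_chat(12, 30))               # False: adults kept separate
```

In this toy model, loosening or tightening the restriction is just a matter of changing `max_band_gap`; the real system presumably layers additional checks (parental controls, verified connections) on top of any such grouping.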
Company officials described the approach as similar to school grade groupings, where elementary, middle, and high school students typically interact within their own age ranges. The facial data used for age estimation will not be stored, according to platform representatives.
The safety initiative follows legal complaints filed in multiple US district courts alleging inadequate protection of minor users. Recent lawsuits describe instances where predators allegedly posed as children to establish relationships with young users, eventually coercing them into sharing explicit content.
A platform safety executive stated the new measures aim to build user confidence in the platform’s communication features. “We see this as an opportunity to enhance trust among our users regarding who they’re interacting with in these games,” the official commented.
Roblox maintains that its existing safety policies are already stricter than those of many competing platforms, including restrictions on image sharing and filters designed to block the exchange of personal information. Company representatives acknowledged that no safety system is perfect but emphasized ongoing efforts to improve platform protections.
Child safety advocates have responded cautiously to the announcement, noting that while the measures represent progress, the platform has faced criticism for being slow to address predatory behavior. The founder of one children’s digital rights organization expressed hope that the new system would indeed establish higher safety standards for the gaming industry.
The gaming platform, which reports approximately 150 million daily users, has seen rapid growth in recent years, particularly during the pandemic period. Its most popular games include viral hits that have drawn large numbers of young players, along with safety concerns from parents and regulators.