Roblox, the blockbuster gaming platform with over 85 million daily users, is facing serious backlash after a shocking new investigation revealed disturbing risks to young children. The research, led by digital behavior experts at Revealing Reality, has exposed a troubling disconnect between the platform’s child-friendly branding and the actual experiences of its youngest users.
The findings are as alarming as they are detailed: children as young as five were found to be interacting with adults, accessing sexually suggestive environments, and witnessing inappropriate behavior that many parents would consider traumatizing for a child.
“We found something deeply disturbing,” said Damon De Ionno, research director at Revealing Reality. “There’s a troubling disconnect between Roblox’s child-friendly appearance and the reality of what children experience on the platform.”
What’s Really Happening Inside Roblox?
Describing itself as “the ultimate virtual universe,” Roblox is home to millions of user-generated games and virtual experiences. But according to the report, these experiences can range from innocent to outright explicit—and safety controls are falling short.
To test the platform’s boundaries, researchers created several accounts registered as users aged five, nine, ten, thirteen, and over forty. Although the accounts interacted only with one another during the study, the results were deeply unsettling.
Although Roblox rolled out new parental controls just last week, the report concluded these tools are “limited in their effectiveness” and warned of significant ongoing risks. In one instance, the avatar registered as a ten-year-old entered a virtual hotel room where avatars were lying on top of one another in sexually suggestive poses. Another environment featured avatars urinating in public bathrooms while wearing fetish-themed accessories.
Even more concerning, researchers using an adult-registered account were able to solicit the five-year-old avatar’s Snapchat details with lightly coded language—raising red flags about grooming and predatory behavior on the platform.
Voice Chat and AI Moderation Aren’t Enough
The report also shines a light on Roblox’s voice chat feature, which is supposedly restricted to users aged 13 and older with verified phone numbers. While the company says voice chat is moderated in real time by AI, researchers still heard avatars simulating sexual activity, complete with slurping, kissing, and grunting noises.
“Children can still chat with strangers not on their friends list,” said De Ionno. “With 6 million experiences on the platform—often with inaccurate descriptions and ratings—how can parents be expected to moderate?”