As a parent, I’ve seen how quickly children latch onto new games. Roblox, which began life as a hobby project by two American engineers, has turned into a global playground. At its start, the founders David Baszucki and Erik Cassel wanted to give people a space to build and share virtual worlds. In 2004 they called their idea “eBlocks,” then “DynaBlocks,” and eventually settled on “Roblox” – a blend of “robots” and “blocks”. By September 2006 the platform was open to anyone, and its early versions were fairly simple.
The founders were inspired by educational simulation software, and they hoped that players would not just consume games but create them.
That simple idea grew very quickly. Roblox now describes itself as a “virtual universe” with millions of games and experiences. In 2024 the platform recorded more than 85 million daily active users, roughly 40 percent of whom were under the age of thirteen.
For children, the appeal is clear: you can customise an avatar, build a town, explore imaginary spaces and even design a game that others can play. Many creators have earned real‑world money from games they built, thanks to a developer programme introduced in 2013. This mix of play and entrepreneurship has attracted teenagers and adults too.
Yet there’s a darker side to this colourful world. A 2025 investigation by digital‑behaviour experts found a “troubling disconnect” between Roblox’s friendly appearance and the reality of what children experience. Researchers created test accounts registered to children aged five, nine and ten. Even when those accounts interacted only with one another, they still stumbled upon highly suggestive environments: avatars gyrating on beds and bathrooms where characters could choose fetish accessories.
They overheard other players describing sexual activity while using the platform’s voice chat. In one example, an adult account was easily able to ask a test account registered as a five‑year‑old for their Snapchat details.
The same investigation noted that safety controls were limited and that children could communicate with adults despite a November 2024 update that supposedly restricted messaging for under‑13s to public broadcasts. This lax age verification leaves room for grooming. ParentsTogether, a US advocacy group, warned that adults are inviting children into private rooms, asking for personal contact details and exposing them to games that simulate sex acts, Nazi content and school shootings. In 2023 alone, Roblox reported more than 13,000 instances of child exploitation and received over 1,300 law‑enforcement requests. The same group cited a case in California where an 11‑year‑old girl was allegedly groomed and sexually assaulted after meeting a stranger through Roblox.
These are not the only concerns. Many games on Roblox encourage players to buy “Robux” to unlock items or abilities. ParentsTogether has noted that the monetisation model can feel like gambling – children spend to progress and can be scammed out of their Robux. Some games glorify harmful themes, and watchdog groups have accused Roblox of allowing racist, homophobic or antisemitic content. Cybersecurity company Malwarebytes put it bluntly, pointing out that the chief executive of Roblox told the BBC that if parents aren’t comfortable, they should not let their children use the platform. The same piece warned that children risk encountering inappropriate content, online predators and scams that trick them into sharing personal information.
Roblox has responded with safety measures. In November 2024 it introduced a system that lets parents use their own device to set time limits, lock down private messaging and choose the maturity level of content. It also introduced a default setting that stops users under thirteen from sending direct messages unless a parent changes it. In November 2025 the company announced that it will begin facial age checks and age‑based chat groups, so minors can only chat with users in the same age bracket. Under‑9 players will have chat turned off by default. These features are being rolled out globally, starting in late 2025. The company says it invests in advanced filters, uses AI to moderate voice chat and works with law enforcement.
Do these steps solve the problem? The Guardian’s investigation suggested that the new controls were still easy to bypass. With more than six million experiences, often with inaccurate ratings, parents struggle to keep up. Nigerian families face extra hurdles: internet access can be patchy, parents may be less familiar with online games and local regulation lags behind. The global nature of Roblox means that predators can connect across borders, so any weak link matters.
What should we do? First, be present. Play with your child or watch them play. This helps you understand the games they enjoy and spot anything worrying. Second, use the parental controls already available. Set time limits, restrict or turn off chat, and select the appropriate content rating. Third, make sure your child knows never to share personal information such as their name, address, school or social‑media profiles. Teach them to ignore links that promise free Robux or ask them to leave the platform. If someone invites them to chat on another app or asks for pictures, they should tell you immediately. Fourth, keep devices up to date with security software and supervise any in‑game purchases.
Finally, trust your instincts. If you feel uneasy about Roblox, there are many other ways your child can be creative and social. Encourage outdoor play, board games and storytelling. As our kids grow up in an interconnected world, the line between fun and danger can be thin. By staying informed and involved, we can let our children explore their imaginations while keeping them safe.
Michael Irene, CIPM, CIPP/E, is a data and information governance practitioner based in London, United Kingdom. He is also a Fellow of the Higher Education Academy, UK, and can be reached via moshoke@yahoo.com; twitter: @moshoke