Roblox Steps Up Safety: No Messaging for Young Children
- Roblox restricts private messaging for users under 13, requiring parental authorisation.
- Parents can remotely monitor accounts, including screen time, buddy lists, and game activity.
- New maturity ratings classify game content, complying with the UK’s Online Safety Act to protect young users.
Roblox, known for its interactive virtual worlds, has strengthened safeguards to enhance the safety of its younger users. The platform has updated its messaging policies, now restricting users under 13 from sending direct messages unless a verified parent or guardian provides approval. In addition, new parental controls allow parents to monitor friend lists, set daily playtime limits, and oversee their child’s gaming activity remotely.
Managing Safety Concerns on a Popular Platform
According to Ofcom research, Roblox is the leading gaming platform for eight- to 12-year-olds in the UK, increasing pressure on the company to improve safety for younger users. The new safety measures, which started rolling out this week, will be fully implemented by March 2025. Under the revised guidelines, young players can still engage in public chats within games, but private messaging will require parental approval.
Enhanced Parental Controls and Age Verification
Matt Kaufman, Roblox’s Chief Safety Officer, reiterated the platform’s strong commitment to safety, noting that more than 10% of the company’s workforce is dedicated to safety initiatives. He added that the platform’s approach to user protection will continue to evolve alongside the service itself.
Parents can now unlock enhanced parental controls by verifying their identity and age through a government-issued ID or credit card. Kaufman also recommended that parents input accurate age information when creating their child’s account to ensure they can fully access these new safety features.
Revamped Content Maturity Guidelines
The update also introduces clearer content maturity guidelines. Roblox will replace age-based recommendations with content labels that describe the nature of a game’s content. These labels range from “minimal” (which may include occasional mild violence) to “restricted” (featuring intense violence and mature themes).
Users under nine will have access only to “minimal” or “mild” content by default; parental approval is required to access “moderate” games. Restricted content will remain blocked until users turn 17 and verify their age.
Remote Parental Monitoring Features
Roblox now offers enhanced remote parental controls. Parents can link their accounts to their child’s, adjust settings, track screen time, and view their child’s friend list, all from their own devices. As children grow, the platform will automatically update safeguards, with parents receiving a 30-day notice before any changes take effect.
Responding to Critiques of Previous Safety Measures
These updates come after growing criticism of Roblox’s earlier safety measures. Concerns have been raised about the platform’s vulnerability to cyber predators, as well as its history of child exploitation lawsuits. A recent Bloomberg investigation revealed how predators might exploit the site, leading to renewed demands for stronger protections.
Industry Responses and Compliance with UK Regulations
Industry experts and child safety advocates have largely welcomed the changes as a positive move forward. Richard Collard from the NSPCC praised the reforms but urged Roblox to focus on robust age verification to ensure the effectiveness of these safeguards. Matthew Johnson of MediaSmarts noted that many of these features are already standard on other child-focused platforms, raising questions about the delay in their implementation.
These updates come as platforms in the UK gear up to meet the new requirements under the Online Safety Act. Ofcom, the agency overseeing the law’s enforcement, has warned businesses that failure to protect young users could lead to sanctions. Roblox’s new measures aim to align with these regulations while addressing ongoing concerns about child safety on the platform.