The $49.5M Warning: Australia’s Quest to Hold Roblox Legally Accountable for User Safety
The Roblox Crisis
Recently, the Australian communications minister Anika Wells raised concerns regarding claims of child exploitation on Roblox.1 Reportedly, both Ms. Wells and eSafety Commissioner Julie Inman Grant requested a discussion with the game platform about reports of alleged child grooming and exposure to harmful material.2 This is not the first child-safety controversy Roblox has come under fire for.
The harmful material in question in the current situation allegedly consists of sexually explicit content.3 More specifically, in her letter to Roblox, Ms. Wells emphasised two reports alleging the platform was being used by predators to groom children and expose them to sexually explicit material and even content relating to suicide.4 In the letter, Wells states: “Even more disturbing are ongoing reports and concerns about children being approached and groomed by predators, who actively seek to exploit their curiosity and innocence… This is untenable, and these issues are of deep concern to many Australian parents and carers”.5
Guardian Australia conducted an undercover investigation in November 2025, with horrific results, reporting that:
“while playing as an eight-year-old girl, the reporter was given a sexualised avatar, cyberbullied, aggressively killed, sexually assaulted, and shat on – all with parental control settings in place”.6
Ms. Wells has asked Roblox to clarify what measures it has put in place to protect users, especially children, from harm, and has also asked the Classification Board to review Roblox’s current PG rating.7 In a statement, Ms. Wells said: “The safety of children online is non-negotiable… The reports we’ve been hearing about children being exposed to graphic content on Roblox, and predators actively using the platform to groom young people are horrendous. Something must be done — now… These sorts of harms show why we need a digital duty of care, which will place the onus on digital platforms to proactively keep their users, particularly children, safe”.8
Roblox has previously held consultations with various stakeholders regarding user safety. In September 2025, Roblox consulted with the eSafety Commissioner and made commitments to ensure the platform was compliant with the Online Safety Act.9 These commitments include, among others: making accounts of users aged under 16 private by default; introducing tools to prevent adult users from contacting under-16s without parental consent; restricting voice chat between adults and 13- to 15-year-olds; and prohibiting the function entirely for under-13s.10
Ms. Inman Grant has now informed Roblox that the regulator will be testing its compliance with the commitments it has made.11 Depending on the outcome of those tests, further action may be taken under the Online Safety Act, meaning Roblox could face penalties of up to AUD 49.5 million.12
The Online Safety Act appears to be the main legal basis for scrutiny of Roblox. New codes under the Act, taking effect from 9 March, focus on age-restricted material such as sexually explicit content and self-harm.13 Whilst Australia has introduced a social media ban for minors under 16, gaming sites like Roblox are not affected.14 Ms. Inman Grant explained that although Roblox has chat features, it is not a social media platform and is thus exempt.15
The reports are awful, but may not come as a surprise to Roblox, which has seemingly put considerable effort into mitigating the issue. In the last few years, it has added multiple safety features to the platform, for example introducing a ‘Sensitive Issues’ content tag for experiences “primarily themed on a sensitive social, political, or religious issue”,16 which will be inaccessible to players under 13 without parental consent.17 In November 2024, Roblox also made direct messaging in platform chats inaccessible to children under 13 years old.18
Most recently, Roblox began requiring users to verify their age before they can access the chat function.19 The platform claimed that with this change, “Roblox becomes the first large online gaming platform to require age checks for users of all ages to access chat”,20 terming it “… an important investment in user safety and enables age-appropriate communication…”.21
Nevertheless, given the new reports and investigations, it seems Roblox still has work to do when it comes to truly protecting users, particularly children, on its platform. Roblox has recently responded to the investigations, stating that it will work closely with Australian law enforcement to support them, while maintaining that the game has “robust safety policies and processes to help protect users that go beyond many other platforms, and advanced safeguards that monitor for harmful content and communications”.22
The Way Forward
The current situation sheds light on an important problem the author identifies with regard to the regulation of video games and online platforms. The Australian government is now demanding ‘urgent’ action, yet it already has ‘hard laws’ in place, such as the Online Safety Act, to rely on. It nonetheless took the surfacing of horrific reports to trigger the threat of fines, suggesting that even ‘hard laws’ fail if they are reactive rather than structurally preventative.
The Australian government took a significant step in the regulation of social media with a complete ban for minors under 16, and the author suggests it do something similar for gaming, especially for platforms like Roblox, where the experience is chat-heavy and social in nature. Although such gaming platforms are not classified as ‘social media’ for regulatory purposes, the risks that arise on them are extremely similar and should arguably be treated the same. Indeed, the author argues that the dangers of social media are mainstream and widely known among regulators and even parents, whereas the dangers found in games such as Roblox often go under the radar.
Hence, the author suggests that policymakers in Australia, and in other jurisdictions as well, should look toward a gaming-specific hard-law framework. Such a framework would move beyond vague guidelines and commitments and instead mandate measures such as strict age-verification tokens, a ban on the use of direct messaging and voice chat by, for example, under-14s, and regular disclosure and reporting obligations for platforms.
By codifying these specific technical requirements into ‘hard law’, regulators can move away from the current approach of responding to horrific headlines and instead enforce a preventative ‘safety by design’ standard.
Takeaways
The current Roblox crisis is a signal that empty commitments and guidelines are no longer sufficient. For the sake of the esports and gaming community, the takeaway is clear: the industry must stop treating user safety as a recommendation and start treating it as a regulated, enforceable obligation. If platforms cannot prove they are safe for their users, they should face not just fines, but potentially the same stop-supply notices issued for defective goods.
For example, in Australia, under section 122 of the Australian Consumer Law, the Commonwealth Minister can issue a recall notice if goods ‘will or may cause injury’ or do not comply with a safety standard.23 The author believes the same logic could, and arguably should, apply to gaming platforms as well. If a game like Roblox does not comply with safety standards, such as the ‘hard law’ the author suggests implementing, or presents a risk of harm to consumers, as the current situation demonstrates, the consequences should be as grave as those for a defective product.
1. Vikki Blake, ‘Australian government calls for action from Roblox on “untenable” child safety concerns’ (GamesIndustry.biz, 9 February 2026) https://www.gamesindustry.biz/australian-government-calls-for-action-from-roblox-on-untenable-child-safety-concerns accessed 14 February 2026.
2. ibid.
3. Holly Tregenza, ‘Roblox on notice after “disturbing” reports of child grooming in Australia’ (ABC News, 10 February 2026) https://www.abc.net.au/news/2026-02-10/commonwealth-roblox-reports-of-child-grooming/106323242 accessed 15 February 2026.
4. ibid.
5. Vikki Blake (n 1).
6. Vikki Blake (n 1).
7. Holly Tregenza (n 3).
8. Holly Tregenza (n 3).
9. eSafety Commissioner, ‘Roblox commits to lift game to protect kids from online grooming under Australia’s world-leading online safety codes and standards’ (Media Release, 15 September 2025) https://www.esafety.gov.au/newsroom/media-releases/roblox-commits-to-lift-game-to-protect-kids-from-online-grooming-under-australias-world-leading-online-safety-codes-and-standards accessed 14 February 2026.
10. Holly Tregenza (n 3).
11. Holly Tregenza (n 3).
12. Holly Tregenza (n 3).
13. Holly Tregenza (n 3).
14. Vikki Blake (n 1).
15. AAP, ‘eSafety commissioner says Roblox age restrictions come after negotiations, fine threats’ (SBS News, 19 November 2025) https://www.sbs.com.au/news/article/roblox-to-introduce-age-guessing-tech-to-avoid-looming-social-media-ban-for-kids/8kgad5lie accessed 15 February 2026.
16. Vic Hood, ‘Roblox to introduce new “Sensitive Issues” content descriptor’ (GamesIndustry.biz, 7 August 2025) https://www.gamesindustry.biz/roblox-to-introduce-new-sensitive-issues-content-descriptor accessed 15 February 2026.
17. ibid.
18. Sophie McEvoy, ‘Roblox adds further changes to safety systems and parental controls’ (GamesIndustry.biz, 18 November 2024) https://www.gamesindustry.biz/roblox-adds-further-changes-to-safety-systems-and-parental-controls accessed 15 February 2026.
19. Vikki Blake, ‘Roblox rolls out global age checks for any user wanting to use its chat feature’ (GamesIndustry.biz, 7 January 2026) https://www.gamesindustry.biz/roblox-rolls-out-global-age-checks-for-any-user-wanting-to-use-its-chat-feature accessed 15 February 2026.
20. ibid.
21. ibid.
22. Vikki Blake (n 1).
23. Competition and Consumer Act 2010 (Cth) sch 2 s 122.