Compliance & Regulatory
New Age Assurance Deadlines: the ICO and Ofcom Are Ending Self-Declaration in 2026
The ICO’s Open Letter
Thus far, many platforms in the UK have set a minimum user age of 13, yet rely primarily on self-declaration to enforce that threshold.1 This ‘self-declaration’ approach raised alarm at the Information Commissioner’s Office (ICO), which published an open letter highlighting its concerns. In the letter, the ICO argues that self-declaration is easily circumvented, exposing under-13s to risks such as the unlawful collection of their personal data.2
In place of self-declaration, the ICO calls on platforms to “…make use of the viable technology that is now readily available to enforce their own minimum ages and prevent these children from accessing their services”.3 The technology the ICO refers to includes, inter alia, facial age estimation, digital ID, and one-time photo matching.4
The ICO has made it clear that it expects platforms to implement effective age gates, explicitly stating in the open letter: “If your service is not suitable for children under a minimum age set out in your terms of service, you should therefore prevent access to children under your minimum age by implementing an effective age gate”.5
From a legal standpoint, the ICO’s position is that where a platform sets a minimum age, any processing of the personal data of children below that age lacks a lawful basis; platforms must therefore implement effective age gates to ensure legal compliance.6 At the same time, the ICO stresses that any age assurance solution must itself fully comply with data protection law: “including requirements for lawfulness, fairness, proportionality, security, data minimisation and transparency – particularly in communicating with children in an age-appropriate manner”.7
The ICO expects platforms to act immediately and to cooperate fully over the coming months.8 It has also signalled that further regulatory action may follow should platforms fail to meet its expectations.9 The ICO’s CEO, Paul Arnold, stated: “Our message to platforms is simple: act today to keep children safe online… There’s now modern technology at your fingertips, so there is no excuse not to have effective age assurance measures in place… Platforms need to be ready to demonstrate what they’re doing to keep underage children out and safeguard those children that are old enough to access their services”.10
The open letter represents the next phase of the ICO’s Children’s Code Strategy, which aims to push platforms to improve their children’s privacy protections and to accurately identify which users are children, so that those users receive the safeguards to which they are entitled.11
The ICO has already taken enforcement action against platforms it identified as failing to adequately implement age assurance measures and unlawfully processing children’s data, most notably a USD 19.4 million fine issued to Reddit.12
Joint Action with Ofcom
The ICO also works closely with Ofcom, which is responsible for enforcing the Online Safety Act, to align expectations around age assurance and child protection.13 The two regulators are expected to publish a joint statement in March 2026 outlining the main areas of interaction between online safety and data protection in relation to age assurance.14
Ofcom has also released an individual statement warning sites and platforms that it is actively investigating continued failings and will take enforcement action.15 Ofcom acknowledges some welcome progress, but argues that the industry is not currently doing enough on child protection, which has eroded parents’ trust in tech firms.16
To regain that trust, and better protect children, Ofcom has issued four demands for further action. Firstly, it expects tech companies to implement effective minimum-age policies: Ofcom’s research shows that current enforcement is extremely weak, with 72% of children aged 8-12 able to access their sites and apps.17 Secondly, it expects failsafe grooming protections, i.e. strict controls preventing strangers from contacting children.18
Thirdly, Ofcom argues that algorithms are children’s main pathway to harm online, and it therefore expects safer feeds for children.19 To achieve this, it will issue information requests to platforms to assess their systems.20 Lastly, Ofcom demands an end to product testing on children, along with risk assessments of significant updates before deployment.21
Ofcom has set a deadline of 30 April for tech platforms to report on the actions they have taken, and in May it will publish a report on how companies have responded.22 Based on that report, Ofcom will assess the need for, and announce, any next steps for regulatory action.23
In the meantime, Ofcom has, like the ICO, already begun enforcement. Most recently, it fined 4chan GBP 450,000 for failing to comply with age-check requirements under the Online Safety Act.24 The fine signals Ofcom’s willingness to penalise platforms that fail to implement effective age gates.
Furthermore, the regulator called out Roblox, demanding highly effective age checks to prevent grooming and exposure to harmful content.25 This is not the first time Roblox has faced such scrutiny: the Australian Communications Minister recently raised concerns over child exploitation in the game.
Regulatory Grey Areas: What Does This Mean for Games Companies?
While the ICO’s and Ofcom’s statements and demands seem positive from a user-protection perspective, they have also drawn criticism, particularly from lawyers. Max Navarro, a video games lawyer at Wiggin, posted his thoughts on LinkedIn, raising concerns about what is being demanded of games companies.
Navarro’s central concern is that the regulatory framework surrounding age assurance is extremely difficult for games companies to navigate: “The intersection between the age assurance requirements under data protection law, the Children’s Code, and the Online Safety Act is already complex to navigate – even for well-resourced games companies”.26 Moreover, Navarro explains that if a platform sets a minimum age in its Terms of Service, regulators now expect technological enforcement, not just a legal disclaimer.27
All of this is exacerbated by the fact that regulators are often asking for measures that are not explicitly required by law. Navarro gives the example of Ofcom’s statement, which acknowledges: “Despite not being explicitly required by the Online Safety Act, we are calling on platforms to do this, using highly effective age assurance”.28 This creates a grey area for companies’ legal departments and reduces overall legal certainty.
Navarro concludes by noting that age assurance requirements are nuanced, and that compliance is not as simple as locking a game behind an age gate.29
Takeaways and Possible Implications
The current regulatory climate signals a watershed moment for the gaming industry. Max Navarro’s assessment is accurate and important, as the overlapping requirements do create a grey area of legal uncertainty. Nevertheless, complexity cannot be an excuse for inertia. For too long, the industry has hidden behind ‘self-declaration’, an approach that has proved ineffective.
The author believes that the ICO’s and Ofcom’s demands are crucial: game developers must move beyond simply updating their Terms of Service and start integrating robust technological solutions, such as the digital IDs suggested by the ICO. While the lack of certainty is a valid criticism that should be addressed in due course, child safety must remain the priority.
The urgency of these measures is underscored by the harrowing findings of a November 2025 Guardian Australia investigation, which claimed that: “while playing as an eight-year-old girl, the reporter was given a sexualised avatar, cyberbullied, aggressively killed, sexually assaulted, and shat on – all with parental control settings in place”.30 Findings such as these demonstrate the child-safety risks plaguing games, and any regulatory measure should therefore be welcomed.
The author argues that the ICO’s and Ofcom’s demands are necessary, yet stricter, mandatory standards should be introduced in the future to ensure legally binding ‘safety by design’.
1. Simmons & Simmons, ‘ICO urges tech firms to improve age checks and protect children’s data’ (19 June 2024) https://www.simmons-simmons.com/en/publications/cmmnmnstp0050ufws8n1fubol/ico-urges-tech-firms-to-improve-age-checks-and-protect-children-s-data accessed 19 March 2026.
2. ibid.
3. Information Commissioner’s Office, ‘Open letter issued to tech firms to strengthen age checks and protect children’s data’ (11 March 2026) https://ico.org.uk/about-the-ico/media-centre/news-and-blogs/2026/03/open-letter-issued-to-tech-firms-to-strengthen-age-checks-and-protect-children-s-data/ accessed 19 March 2026.
4. Simmons & Simmons (n 1).
5. Information Commissioner’s Office (n 3).
6. Simmons & Simmons (n 1).
7. Simmons & Simmons (n 1).
8. Simmons & Simmons (n 1).
9. Simmons & Simmons (n 1).
10. Chris Burt, ‘UK ICO wants platforms to go further on age assurance’ (Biometric Update, 12 March 2026) https://www.biometricupdate.com/202603/uk-ico-wants-platforms-to-go-further-on-age-assurance accessed 19 March 2026.
11. ibid.
12. ibid.
13. ibid.
14. Lewis Silkin, ‘Tech companies operating in the UK told to make sure their age assurance works’ (12 March 2026) https://www.lewissilkin.com/insights/2026/03/12/tech-companies-operating-in-the-uk-told-to-make-sure-their-age-assurance-works-102mmpw accessed 19 March 2026.
15. Ofcom, ‘Keep underage children off your platforms Ofcom tells tech firms’ (Wired-Gov, 12 March 2026) https://www.wired-gov.net/wg/news.nsf/articles/Keep+underage+children+off+your+platforms+Ofcom+tells+tech+firms+12032026153000?open accessed 19 March 2026.
16. ibid.
17. Lewis Silkin (n 14).
18. Lewis Silkin (n 14).
19. Lewis Silkin (n 14).
20. Lewis Silkin (n 14).
21. Lewis Silkin (n 14).
22. Lewis Silkin (n 14).
23. Lewis Silkin (n 14).
24. Tom Bristow, ‘UK online safety regulator fines 4chan for not doing age checks’ (Politico, 12 March 2026) https://www.politico.eu/article/uk-online-safety-regulator-fines-4chan-for-not-doing-age-checks/ accessed 19 March 2026.
25. Nnamdi Ndife, ‘UK regulators demand tougher age checks from Meta, TikTok, YouTube, Snapchat and Roblox as Online Safety Act enforcement intensifies’ (Tekedia, 12 March 2026) https://www.tekedia.com/uk-regulators-demand-tougher-age-checks-from-meta-tiktok-youtube-snapchat-and-roblox-as-online-safety-act-enforcement-intensifies/ accessed 19 March 2026.
26. Max Navarro, ‘UK age assurance updates you may have missed’ (LinkedIn, 16 March 2026) https://www.linkedin.com/posts/max-navarro-782b0273_uk-age-assurance-updates-you-may-activity-7440076076925517825-aHTY accessed 19 March 2026.
27. ibid.
28. ibid.
29. ibid.
30. Vikki Blake, ‘Australian government calls for action from Roblox on “untenable” child safety concerns’ (GamesIndustry.biz, 9 February 2026) https://www.gamesindustry.biz/australian-government-calls-for-action-from-roblox-on-untenable-child-safety-concerns accessed 19 March 2026.