An analysis of the ICO’s Reddit decision and Ofcom’s comparable powers under the Online Safety Act
In February 2026 the Information Commissioner’s Office (ICO) announced that it had fined Reddit, Inc nearly £14.5 million for failing to implement age verification and to carry out a data protection impact assessment (DPIA) in relation to the risks of children accessing harmful content. It has recently published the full monetary penalty notice, setting out its reasons for that decision. Broader developments across the UK, EU and worldwide show that children’s use of social media is an issue of increasing global concern.
Key points
The ICO’s Reddit decision highlights some key points:
- The application of the ICO’s Age Appropriate Design Code (more commonly referred to as the “Children’s Code”) is not limited to websites and apps specifically directed at children, but extends to those likely to be accessed by a child. The Data Protection Act 2018 provides that the ICO must take the Children’s Code into account when considering whether an information society service (ISS) provider has complied with its obligations under the UK GDPR.
- This is an area of regulation where there is considerable potential overlap between the application of data protection law and the Online Safety Act.
- The ICO’s stated position is that, whilst platform moderation can have some effect in reducing the potential for harm to children, it is largely an after-the-event approach that means potentially harmful content is available for a period of time until it is taken down. Accordingly, use of moderators does not relieve ISS providers of their obligations under data protection law.
- Reddit could have been subject to enforcement under the Online Safety Act, with Ofcom’s enforcement powers extending beyond those of the ICO, including the ability to seek court orders to block access to the platform in the UK.
Background
In 2021 the ICO conducted a risk assessment to identify ISS providers that could be considered higher risk because of provisional indications of non-conformance with the Children’s Code. The ICO identified that Reddit’s website did not have any form of age-gating and had been classified as “poor” in terms of children’s safety in a guide produced by O2 and the NSPCC. It opened an investigation into Reddit, which found that:
- Reddit’s user agreement and privacy policy stated that its platform (i.e. its website and app) was not to be used by children under 13 years old. Users were asked to declare whether they were over 18 to access content flagged as “NSFW” (not safe for work), but (until July 2025) Reddit did not use any form of age assurance.
- Reddit asked users whether they consented to their personal data being processed via advertising and analytics cookies, but did not ensure that any such consent purportedly given by a child under 13 had been given or authorised by the holder of parental responsibility over the child, as required by Article 8 of the UK GDPR.
- Until January 2025 Reddit did not conduct any assessment of the risks of processing children’s personal data, even though it offered its platform to children aged 13-17. The ICO found that there was a high risk of children being exposed to harmful content, including pornography and content relating to suicide, self-harm, substance abuse and eating disorders.
- Reddit processed children’s data in ways including transferring it outside the UK (including to the USA), using it to determine the content of the children’s feeds, monitoring and analysing their usage and activities (including what they did on third-party advertiser websites), and sharing it with third parties, including for advertising purposes.
ICO decision
The ICO decided that Reddit had breached the following provisions of the UK GDPR:
Article 5(1)(a) – the lawfulness, fairness and transparency principle
Reddit did not have a lawful basis to process the children’s personal data. It could not rely on consent, for reasons explained below. Reddit argued that it could rely on legitimate interests, but the ICO took the view that it could not satisfy the required balancing test, highlighting that Article 6(1)(f) provides that the interests of children must be given particular protection. Reddit also argued that it could rely on contractual necessity, but again the ICO rejected this on the basis that, because Reddit did not offer its platform to children under 13, there was no offer that those children could accept in order to form a valid contract.
In addition, Reddit’s privacy notice was incomplete and unclear, in breach of the transparency principle. The lack of clarity was particularly significant because it meant that children would not be able to understand the notice.
Article 6 – Lawful basis and Article 8 – Conditions applicable to child's consent in relation to information society services
Reddit did not obtain valid consent to use the personal data of children under 13. In particular, it failed to comply with the Article 8 requirement to ensure that consent is given or authorised by the holder of parental responsibility over the child.
Article 35 – Data protection impact assessment
The ICO found that during the period to which the fine relates, Reddit did not conduct a DPIA to identify the risks to children and how to mitigate them. Because it was processing children’s data, this was a requirement under the UK GDPR, the Children’s Code, and the ICO and European Data Protection Board (EDPB) guidance on DPIAs.
The ICO specifically referred to the obligations contained within the Online Safety Act (OSA) to conduct a children’s access assessment, children’s risk assessment and illegal content risk assessment, illustrating the close links and potential overlap between the two regulatory regimes.
Other relevant ICO enforcement action and public statements
In February 2026 the ICO fined the owner of image sharing and hosting platform Imgur £247,590 for failing to use children’s personal information lawfully. The ICO found that Imgur breached the law by:
- Failing to implement any age verification measures.
- Processing the personal information of children under 13 without parental consent or any other lawful basis.
- Failing to carry out a DPIA to identify and reduce privacy risks to children.
In March 2026 the ICO published an open letter to social media and video sharing platforms operating in the UK, calling on them to strengthen age assurance measures so young children cannot access services that are not designed for them. The open letter sets out the ICO's expectations that platforms with a minimum age must move beyond relying on children to self-declare their ages and use appropriate age verification technology. The ICO announced that it had also written directly to platforms, including Snapchat, Facebook, Instagram, YouTube and X to ask them to demonstrate how their age assurance measures meet these expectations.
Also in March, the ICO published a joint statement with Ofcom to clarify how UK online services can meet their obligations under the Online Safety Act (OSA) and data protection law when implementing age assurance. Both Ofcom and the ICO play key roles in protecting children online, with Ofcom focusing on online safety (content and platform risks) and the ICO on data protection (data harms).
Their joint statement sets out a “common approach to age assurance” that is risk-based, flexible, tech-neutral, and future-proof, ensuring consistent regulatory expectations for services implementing age assurance.
Both regulators require services to implement age assurance measures that are effective, proportionate, and compliant with their respective regimes. Ofcom mandates “highly effective age assurance” (HEAA) for certain types of content and services, such as pornography and other primary priority content, while the ICO requires robust age assurance to prevent unlawful processing of children’s data and to comply with the Children’s Code.
Importantly, the statement clarifies that while the OSA does not require all services to use HEAA to prevent minors from accessing their platforms, if a service does not implement HEAA, it must assume that minors are using the service. This triggers an obligation to conduct child safety risk assessments and implement appropriate protections for all children.
Ofcom has made clear that these steps are necessary to demonstrate compliance with the duty to protect children from harmful content under the OSA, and failure to take them may result in investigation or enforcement action. In addition, both regulators have made it clear that where platforms impose a contractual age limit, HEAA measures must be in place to prevent those under that limit from accessing the site, both to prevent unlawful processing and to ensure online safety.
Both Ofcom and the ICO have the power to investigate, enforce, and require remediation where services fail to implement adequate age assurance or protections for children, whether the harm arises from content or data processing.
It is worth noting in this context that both Ofcom and the ICO are members of the Digital Regulation Cooperation Forum (DRCF), a body that brings together key UK regulators and provides a mechanism for them to collaborate on areas where their respective regulatory remits overlap. The publication of the joint statement is a clear indication of how Ofcom and the ICO are working together on areas of mutual interest, particularly in relation to ensuring the safety of children online.
Ofcom’s regulatory remit
Ofcom, as the designated regulator under the OSA, mandates HEAA for certain types of content and services. If a platform does not implement HEAA, it must assume minors are using the service, triggering obligations to conduct child safety risk assessments and implement appropriate protections for children.
Reddit’s failure to implement robust age verification, allowing children under 13 to join and access harmful content, would likely also constitute a breach of its safety duties under the OSA. It is interesting to note that Ofcom’s enforcement powers under the OSA are broader than those available to the ICO: for example, Ofcom can apply to the courts for service restriction orders, requiring ISPs to block access to non-compliant platforms in the UK, a power not available to the ICO.
Therefore, Reddit’s conduct as set out in the ICO decision (in particular the weak age checks, lack of risk assessments, and exposure of children to harmful content) could equally have triggered enforcement under the OSA by Ofcom, potentially resulting in sanctions such as fines or even court-ordered blocking of access in the UK.
Given the overlapping remits of the two regulators, with Ofcom focusing on content and platform safety and the ICO on data protection and privacy harms, there is a strong emphasis on regulatory cooperation. Their joint statement on age assurance underscores a coordinated approach, with both regulators requiring services to implement effective and proportionate age assurance measures and clarifying that enforcement action will be coordinated where appropriate. In practice, this means that decisions regarding which regulator is best placed to address a particular breach (such as weak age checks or unlawful processing of children’s data) are likely to be discussed and agreed between Ofcom and the ICO, ensuring that enforcement is both targeted and effective.
In this instance, the primary breaches concerned unlawful processing of children’s personal data, including failures in age verification, parental consent and risk assessment. Children under the age of 13 were able to access the site, resulting in processing that contravened the UK’s digital age of consent, which is currently set at 13. It is therefore understandable that enforcement action against Reddit was pursued by the ICO, given that these failings fall squarely within the ICO’s statutory remit under the UK GDPR and the Children’s Code. While Ofcom’s role under the OSA focuses primarily on content and platform safety, the ICO remains the specialist regulator for data protection harms, making it the logical lead authority in this case.
Although Ofcom could, in principle, have initiated parallel enforcement under the OSA, especially given the overlap in age assurance requirements, good regulatory practice and public law principles of fairness and proportionality favour coordinated action to avoid duplication and the risk of 'double punishment'. The ICO and Ofcom, both members of the Digital Regulation Cooperation Forum, have signalled a commitment to collaboration and proportionality, ensuring that their respective enforcement action is targeted and effective.
In reality, our view is that dual enforcement by both the ICO and Ofcom at the same time in relation to the same facts is unlikely, although it remains legally possible where breaches engage both regimes. In this instance, the ICO’s particular areas of focus and regulatory priorities made it best placed to address Reddit’s data protection failures. However, it will be interesting to observe how enforcement develops in future cases, and whether the ICO and Ofcom will in due course adopt a dual approach, with both regulators taking action together or in parallel.
Parallel EU development: European Commission investigation into Snapchat's compliance with the Digital Services Act’s child protection rules
In March 2026 the European Commission, which is the EU institution responsible for enforcing the EU Digital Services Act (DSA) against very large online platforms (VLOPs), announced that it had opened an investigation into Snapchat's potential breaches of the DSA, including insufficient age assurance measures (i.e. reliance on self-declaration), as well as other concerns about failures to protect minors from harmful messages and other content.
Parallel international development: global privacy sweep
In March 2026 the Global Privacy Enforcement Network (GPEN) published a report on the results of its 2025 sweep which examined almost 900 websites and apps used by children. The GPEN is an informal network of the privacy enforcement authorities of OECD (Organisation for Economic Cooperation and Development) member countries, and its tasks include joint enforcement initiatives, such as this sweep. The sweep found that for 72% of websites and apps reviewed, participants were able to circumvent age assurance measures, most often where self-declaration was used. The report also flags concerns about the collection of children’s data, a lack of information about protective controls and privacy practices tailored to children, absence of an accessible way to delete accounts, and access to inappropriate content and high-risk design features.