The internet was not built with children in mind. For decades, platforms designed for adults became default spaces for minors, with little regulatory pressure to do anything about it. The UK Online Safety Act 2023 is one of the most significant attempts to change that.
Enacted in October 2023, with provisions brought into force in phases through secondary legislation and Ofcom Codes of Practice, the Act aims to protect users from illegal content and to give children additional protection against harmful content. To that end, it requires regulated online services and search engines to implement certain measures and processes, such as age assurance, age-appropriate access controls, and the removal of illegal content.
If you operate an online service that enables users to generate or share content, the Online Safety Act may apply to you. Continue reading to find out how you can achieve compliance.
Does the Online Safety Act 2023 apply to your organization?
While the word “online” in the legislation’s name might suggest that it applies to all online services, the Online Safety Act’s scope is limited to certain internet services.
The Online Safety Act applies to an online service only if both of the following conditions are met:
Condition 1: The online business is a user-to-user service or a search service
User-to-user services: internet services through which users can generate, post, or share content that other users of the service may encounter. Social media platforms, dating apps, and online forums fall under this category and will be subject to the Online Safety Act if the second condition is also met.
Search services: internet services that include a search engine, such as Google.
Condition 2: The service has links with the United Kingdom
The Act applies to a user-to-user service or a search service only if the service has links with the United Kingdom. A significant number of UK users, or the UK being one of the service’s target markets, will generally be enough to establish those links.
Even if neither circumstance is present, an online service might still be subject to the Act if it is capable of being used by UK individuals and there is a material risk of significant harm to them. Put simply, a non-UK website, mobile app, or online platform may fall within the scope of the Online Safety Act if it targets the UK market, has a significant number of UK users, or is accessible to UK users and presents a material risk of significant harm.
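To make the two-part test concrete, here is a minimal sketch of the applicability logic in Python. It is illustrative only: the `ServiceProfile` shape is our own, and the Act does not define numeric thresholds for a “significant” UK user base, so a real assessment requires legal judgment rather than a boolean checklist.

```python
from dataclasses import dataclass

@dataclass
class ServiceProfile:
    # Condition 1: service type
    is_user_to_user: bool       # users can generate, post, or share content others may encounter
    is_search_service: bool     # the service includes a search engine
    # Condition 2: links with the United Kingdom
    significant_uk_users: bool  # a significant number of UK users
    targets_uk_market: bool     # the UK is one of the service's target markets
    uk_use_with_material_risk: bool  # usable by UK individuals, with a material
                                     # risk of significant harm to them

def in_scope(s: ServiceProfile) -> bool:
    """Rough first-pass test mirroring the Act's two cumulative conditions."""
    condition_1 = s.is_user_to_user or s.is_search_service
    condition_2 = (
        s.significant_uk_users
        or s.targets_uk_market
        or s.uk_use_with_material_risk
    )
    return condition_1 and condition_2

# Example: a non-UK forum with a large UK audience is likely in scope
forum = ServiceProfile(True, False, True, False, False)
print(in_scope(forum))  # True
```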

How to determine which specific obligations apply to a regulated service
Your Online Safety Act compliance journey starts with determining the specific obligations that apply to you. This is because the Act imposes different obligations on different categories of regulated services depending on whether a service is accessed by children, the size of its UK user base, and whether it displays certain content, such as pornographic content.
When it comes to determining the applicable obligations, organizations can consider four distinct categories of obligations depending on their status under the Act:
- Key obligations that apply to Part 3 Services: If an organization provides a regulated user-to-user or search service, it will be subject to the primary obligations under Part 3 of the Act. These include illegal content risk assessments, safety duties, children’s access assessments, content reporting, and other obligations.
- Additional obligations and liabilities that apply to services that are likely to be accessed by children: If an in-scope service is likely to be accessed by children, additional obligations will apply to it, including specific child safety duties, safety-by-design requirements, and a children’s risk assessment.
- Obligations on Category 1 Services: If a service is classified as a “Category 1 Service” under the threshold regulations, it must meet stricter requirements, such as a duty to prevent fraudulent advertising and to offer users an identity verification option. For example, a user-to-user service with more than 34 million monthly active UK users that uses a content recommender system may be designated a Category 1 Service.
- Obligations on Category 2A and 2B Services: If a service is classified as a “Category 2A” (search) or “Category 2B” (user-to-user) service, it will have to comply with additional duties under the Act, including publishing annual transparency reports and, for Category 2A services, preventing fraudulent advertising.
Knowing which category applies to your organization shapes everything that follows: the assessments you need to conduct, the safeguards you need to implement, and the timeline you're working against.
What are the key obligations of Part 3 Services under the new law?
As explained above, an organization's obligations and liabilities under the Online Safety Act 2023 will depend on its classification under the Act.
In this section, we will outline the key obligations that apply to all regulated services under the Act.
Illegal content risk assessment and illegal content duties
Regulated internet services subject to the new Act are required to carry out and document an illegal content risk assessment.
When conducting this assessment, service providers should assess the risk of harm arising from priority offenses and other illegal content. Priority offenses include threats to kill, offenses related to child sexual abuse material, encouraging or assisting self-harm, and the unlawful supply of drugs, as set out in the Act’s schedules.
This assessment will help service providers understand what types of illegal content service users might encounter and what risks to users' physical and psychological safety are involved in accessing and using the services. Identifying these risks and illegal content will enable service providers to implement appropriate measures to mitigate risks and to prevent the display, use, and dissemination of illegal content to users.
For example, service providers can fulfill their duties by implementing the measures outlined in Ofcom’s Illegal Content Codes of Practice for user-to-user services. These measures include content moderation and hash matching to detect and remove child sexual abuse material.
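As a simplified illustration of hash matching, the sketch below hashes each upload and checks it against a list of known hashes. This is not how production systems work end to end: real deployments typically use perceptual hashing against curated databases maintained by bodies such as the IWF, so that near-duplicates are also caught. The `KNOWN_HASHES` set here is a placeholder.

```python
import hashlib

# Placeholder for a curated database of hashes of known illegal material.
# For demonstration, it contains the SHA-256 of empty bytes.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_content(file_bytes: bytes) -> bool:
    """Exact-match check: hash the upload and look it up in the database."""
    digest = hashlib.sha256(file_bytes).hexdigest()
    return digest in KNOWN_HASHES

def handle_upload(file_bytes: bytes) -> str:
    """Block and escalate matching uploads; accept everything else."""
    if matches_known_content(file_bytes):
        return "blocked_and_escalated"  # remove and report per moderation policy
    return "accepted"

print(handle_upload(b""))  # "blocked_and_escalated" (hits the placeholder hash)
```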
Safety duties
Under the Act, service providers must implement appropriate measures to prevent the display, use, or dissemination of illegal content on their services so that users are not exposed to it. These measures must also be designed to minimize the length of time illegal content remains accessible on the services and to ensure it is removed swiftly once identified.
To fulfill this duty, service providers should implement content moderation and complaint procedures to detect and remove illegal content as quickly as possible.
Establishing a complaints procedure
The Online Safety Act requires service providers to establish and implement a complaints procedure that allows users to submit complaints related to alleged illegal content, service providers’ breaches of the Act (such as their illegal content duties), and service providers’ actions (such as the removal of content).
Service providers must ensure that the complaints procedure is easy to use and transparent.
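As an illustrative sketch only (the Act does not prescribe any particular intake design), here is one way to model the three broad grounds of complaint described above; the category labels and review queues are hypothetical.

```python
from enum import Enum

class ComplaintType(Enum):
    # The three broad grounds of complaint named above (labels are our own)
    ILLEGAL_CONTENT = "alleged illegal content"
    PROVIDER_BREACH = "alleged breach of the provider's duties under the Act"
    PROVIDER_ACTION = "provider action, e.g. removal of the user's content"

def route_complaint(complaint_type: ComplaintType) -> str:
    """Illustrative routing: each ground of complaint gets a dedicated queue."""
    queues = {
        ComplaintType.ILLEGAL_CONTENT: "trust-and-safety",
        ComplaintType.PROVIDER_BREACH: "compliance",
        ComplaintType.PROVIDER_ACTION: "appeals",
    }
    return queues[complaint_type]

print(route_complaint(ComplaintType.PROVIDER_ACTION))  # "appeals"
```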
Payment of annual fees
Ofcom may require service providers subject to the Online Safety Act 2023 to pay an annual fee.
Children’s access assessment
Service providers must conduct a “children’s access assessment” to assess if child users might access and use the regulated services falling under the scope of the new Act. If the assessment concludes that the services are likely to be accessed by children, the service provider must comply with additional requirements applicable to child-accessed services.
Additional obligations for regulated services likely to be accessed by children
In the previous section, we explained the key requirements that apply to all user-to-user and search services regulated by the Online Safety Act.
If such a service is likely to be accessed by children, it will have to comply with more stringent requirements in addition to the ones we described above.
Children’s risk assessment
Providers of child-accessed services must complete and document a children’s risk assessment, which should address the risk of child users encountering harmful content such as online pornography, the level of risk of harm posed to children, and the risk mitigation measures implemented by the service provider.
Child safety duties/child safety by design
If you operate a child-accessible regulated service, you are under a duty to implement child safety measures to mitigate the risks of harm to children that you identified in the children’s risk assessment.
For instance, service providers must use proportionate mechanisms such as age assurance controls to prevent child users from encountering primary priority content such as pornography, content that encourages suicide, or content that promotes self-injury.
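To show where such a mechanism might sit in a request flow, here is a minimal sketch of an age-assurance gate in front of primary priority content. The `AgeAssuranceResult` shape and content labels are hypothetical, and a simple boolean gate like this does not by itself satisfy Ofcom’s expectation that the underlying age check be “highly effective”.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical labels for primary priority content categories
PRIMARY_PRIORITY = {"pornography", "suicide_promotion", "self_harm_promotion"}

@dataclass
class AgeAssuranceResult:
    # Output of an upstream age verification or estimation step (assumed shape)
    is_adult: bool
    method: str  # e.g. "photo_id_match", "facial_age_estimation"

def can_serve(content_labels: set, age_check: Optional[AgeAssuranceResult]) -> bool:
    """Block primary priority content unless the user has passed age assurance."""
    if content_labels & PRIMARY_PRIORITY:
        return age_check is not None and age_check.is_adult
    return True

# A user who has not completed age assurance cannot be served pornographic content
print(can_serve({"pornography"}, None))                                        # False
print(can_serve({"pornography"}, AgeAssuranceResult(True, "photo_id_match")))  # True
```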
Terms of service
Providers of child-accessible services must update their terms of service to address how they prevent child users from encountering harmful content. For example, the terms should explain how children are protected from harmful content, including any age assurance measures used, such as credit card checks or photo ID matching.
What are the regulatory fines and criminal sanctions for non-compliance with the Online Safety Act 2023?
Ignoring the requirements of the Online Safety Act may result in both monetary penalties and criminal sanctions:
- Penalties: If you fail to comply with one or more of your obligations under the Online Safety Act, Ofcom may impose a monetary penalty of up to £18 million or 10% of your annual global turnover (qualifying worldwide revenue), whichever is greater (see the short worked example after this list).
- Criminal liability: The new Act also sets out criminal liability for senior managers of organizations. For example, if the organization gives false information to Ofcom, destroys information, or fails to comply with Ofcom information notices, the named senior manager may face criminal liability for such acts or omissions.
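The “whichever is greater” cap is simple arithmetic; the revenue figures below are made up for illustration:

```python
def max_penalty(qualifying_worldwide_revenue: float) -> float:
    """Greater of the £18m flat cap or 10% of qualifying worldwide revenue."""
    return max(18_000_000, 0.10 * qualifying_worldwide_revenue)

print(max_penalty(500_000_000))  # 50000000.0 — 10% of £500m exceeds the £18m floor
print(max_penalty(100_000_000))  # 18000000 — 10% would be £10m, so the £18m floor applies
```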
Alongside monetary penalties and criminal liability, the new Act grants Ofcom additional enforcement powers. For example, Ofcom may apply to court to restrict access to an internet service, request information, or conduct audits.
Practical tips for organizations subject to the UK Online Safety Act 2023
The Online Safety Act 2023 represents a meaningful shift in how the UK expects platforms to treat their youngest users. More than a legal checkbox, compliance with the Act is an acknowledgment that children interact with online services differently, and that those services have a responsibility to reflect that.
If you are an internet service provider operating in the UK, start with the following questions:
- Does your service fall under the scope of regulated services under the Online Safety Act 2023?
- If so, can your services be accessed by children, or could they be classed as a Category 1, 2A, or 2B service? If either applies, you need to consider the additional, more stringent obligations you must comply with.
Once you determine your obligations, you can begin to implement appropriate measures and processes to achieve compliance. These may include conducting an illegal-content risk assessment, implementing safety-by-design measures, and considering age-assurance technologies (such as age verification or age estimation) to prevent children from accessing harmful content.
For organizations navigating these obligations, getting consent and data practices right for child-accessible services is increasingly central to the work. At Didomi, we help organizations build privacy frameworks that are robust enough to meet evolving regulatory demands. If you're thinking through your approach, our team is happy to help.
{{talk-to-an-expert}}
Frequently asked questions (FAQs)
Does the Online Safety Act apply to my app or service?
It depends on what your app or service does. If it allows users to generate, post, or share content that other users can see, it likely qualifies as a user-to-user service under the Act. That includes social apps, dating apps, gaming platforms, and online forums.
If your app also has a significant UK user base or targets the UK market, the Act almost certainly applies to you. The safest first step is to conduct an applicability assessment.
What is age assurance under the Online Safety Act?
Age assurance is the umbrella term the Act uses for mechanisms that verify or estimate whether a user is a child. It covers two main approaches: age verification (confirming a user's age through a document or credential check) and age estimation (inferring age from behavioral or biometric signals).
Services likely to be accessed by children are required to implement age assurance controls to prevent minors from encountering harmful content such as pornography or content that promotes self-harm.
What is the difference between age verification and age estimation?
Age verification confirms a user's age through a direct check, such as a credit card, photo ID, or third-party identity service. Age estimation infers whether a user is likely to be a child based on indirect signals, such as facial analysis or behavioral patterns, without collecting identity documents.
Both approaches fall under the broader category of age assurance. Ofcom's guidance allows services to use either method, provided it is "highly effective" at preventing children from accessing harmful content.
When do Online Safety Act duties come into force?
The Act is being implemented in phases. Illegal content duties became enforceable on 17 March 2025, meaning platforms must already have completed their illegal content risk assessments and have appropriate safety measures in place. Child safety duties came into force on 25 July 2025. The categorization of services as Category 1, 2A, or 2B under the threshold regulations came into effect on 27 February 2025.
Enforcement is already underway, though some elements, including the publication of the categorization register for Category 1, 2A, and 2B services, are still rolling out through 2026.
What is a children's access assessment?
A children's access assessment is a mandatory evaluation that service providers must carry out to determine whether children are likely to access their platform.
If the assessment concludes that child users can access the service, the provider becomes subject to a stricter set of obligations, including a children's risk assessment, child-safety-by-design requirements, and the implementation of age-assurance controls. The assessment must be documented and kept up to date.
Does the Online Safety Act apply to services based outside the UK?
Yes, it can. The Act applies to any user-to-user service or search service that has a significant number of UK users, targets the UK market, or is usable by UK individuals and presents a material risk of significant harm to them. This means non-UK businesses, including US and EU-based platforms, may still fall within scope.
If you have meaningful UK traffic, an applicability assessment is essential regardless of where your service is based.
How does the Online Safety Act relate to other UK privacy laws?
The Online Safety Act 2023 sits alongside the UK's existing privacy framework. Organizations subject to the Act must still comply separately with the UK GDPR, the Privacy and Electronic Communications Regulations (PECR), and the Data (Use and Access) Act 2025 when processing personal data, including data collected through age assurance mechanisms.
For a full breakdown of how UK privacy laws fit together, read our guide to UK privacy laws.