Introduction and a Warning About Illegal Harm
The UK Online Safety Act is a transformative piece of legislation designed to make online spaces safer for users. Platforms that fail to comply risk hefty fines, reputational damage, and, in extreme cases, criminal liability. Illegal harm online, including child sexual abuse material (CSAM), terrorist content, and hate speech, is now under tighter scrutiny.
The regulations are international in their reach and apply to online services that have links with the UK, regardless of where the provider of the service is based or registered.
TL;DR – The deadline for completing a risk assessment has already passed, but you should still complete one immediately: if your business provides an online service (a service made available over the internet), such as a website or an app, the Act may apply. The Act introduces new regulations for online service providers to help keep people in the UK – especially children – safe from illegal and harmful content online.
Contents
- Introduction and a Warning About Illegal Harm
- Why Compliance with The Online Safety Act Matters
- Who the UK Online Safety Act Applies To
- User-to-user services
- Search services
- Video-sharing platforms
- Services with pornographic content
- Key Compliance Deadlines and Enforcement Timelines
- Understanding the Core Requirements of the Online Safety Act 2023
- Defining Illegal Harm Using the Act’s Terminology
- The 17 kinds of priority illegal content defined by the Online Safety Act 2023
- Other illegal content (including non-priority offences) noted in the Online Safety Act 2023
- A Summary of the Risk Assessment Duties
- How to Conduct Your Risk Assessment Using Ofcom’s Four-Step Methodology
- What Evidence to Use to Ensure Accuracy
- When to Carry Out a New Assessment
- Your Legal Obligations
- How to Conduct a Compliance Audit for the Online Safety Act
- Pre-Emptive Protection vs. Moderation: What’s Good Enough?
- Detecting, Removing, and Reporting Illegal Content
- Age Verification and Parental Controls: What’s Required?
- Transparency and Reporting: What Must Be Disclosed to Ofcom?
- User Safety Measures
- Proactive Steps to Stay Compliant
- Training Your Team: Educating Staff on Compliance Best Practices
- Algorithmic Accountability: Managing Risks in Automated Systems
- Moderation and Content Governance: Best Practices for Compliance
- The Role of Digital Services in Ensuring Platform Safety
- Data Privacy and Security in Online Safety Compliance
- Penalties for Non-Compliance: What’s at Stake?
- The Role of Regulators: What to Expect from Ofcom and Enforcement Agencies
- Third-Party Tools and Solutions to Support Compliance
- How to Develop a Long-Term Compliance Strategy
- Future Changes and Evolving Regulations: Staying Ahead of Updates
- Conclusion: Immediate Actions to Ensure Compliance Today
- Sources of this information
Why Compliance with The Online Safety Act Matters
This legislation was enacted to reduce the proliferation of harmful content, enhance platform accountability, and protect vulnerable users. It imposes strict requirements on service providers to assess and mitigate the risks associated with online harm. Failure to comply could result in enforcement action by Ofcom, the regulator tasked with overseeing adherence to the Act.
Who the UK Online Safety Act Applies To
The Act affects businesses both inside and outside the UK if they provide services accessible to UK users. The level of obligation varies depending on the service type and the potential risks posed to users.
The regulations apply to a broad spectrum of online services:
User-to-user services
User-to-user services allow people to generate and share content for other people to see.
They include:
- social media
- video-sharing
- private messaging
- online marketplaces
- dating services
- review services
- file- and audio-sharing
- discussion forums
- information-sharing
- gaming
Search services
A search service allows you to search more than one website or database for information or content.
There are two main types of search service: general search services and vertical search services.
- General search services allow you to search content from across the web.
- Vertical search services allow you to search for specific products or services offered by different companies, such as flights, credit cards or insurance.
Video-sharing platforms
Video-sharing platforms (VSPs) are online services that allow users to upload and share videos with other people. Most VSPs, like YouTube and Instagram, will have to follow the new online safety rules, and will have the same duties as other user-to-user services. But some VSPs (those established in the UK) are already bound by separate rules. These include Twitch, TikTok and Snapchat.
There are specific legal criteria to determine whether a service has the required links to the UK to be a UK-regulated VSP. If a service meets these criteria, it must notify Ofcom, which maintains a list of notified platforms to which these rules apply.
Services with pornographic content
These are online services where the provider publishes or displays pornographic content. They also include services that allow users to upload and share pornographic content which can be viewed by other users of the service. These services could be user-to-user sites and their apps, or video-sharing platforms.
Key Compliance Deadlines and Enforcement Timelines
Organisations must prepare for enforcement as Ofcom introduces compliance expectations in phases. The first steps include submitting risk assessments and implementing safety policies. Delays could result in warnings, fines, or more severe regulatory intervention.
| Date | Milestone | Action required |
| --- | --- | --- |
| March 2025 | Illegal Harms Codes of Practice come into force | Carry out an illegal content risk assessment. Carry out a children's access assessment. Publishers of pornographic content must implement highly effective age assurance immediately. |
| April 2025 | First version of the Protection of Children Codes of Practice published | Carry out a children's risk assessment. |
| July 2025 | Protection of Children Codes of Practice come into force | Specific services must disclose their risk assessments to Ofcom. |
Understanding the Core Requirements of the Online Safety Act 2023
Defining Illegal Harm Using the Act’s Terminology
The Act identifies and categorises harmful content into 17 kinds of priority illegal content and other illegal content (including non-priority offences). Platforms must actively prevent, detect, and remove material that falls under these definitions.
The 17 kinds of priority illegal content defined by the Online Safety Act 2023
- Terrorism: Content promoting or inciting terrorist activities.
- Harassment, stalking, threats, and abuse offences: Material involving harassment, stalking, threats, or abuse.
- Coercive and controlling behaviour: Content depicting or encouraging coercive control in relationships.
- Hate offences: Material inciting hatred against individuals or groups based on protected characteristics.
- Intimate image abuse: Sharing private sexual images without consent, often referred to as "revenge pornography."
- Extreme pornography: Content depicting extreme sexual acts that are illegal under UK law.
- Child sexual exploitation and abuse: Material involving the sexual exploitation or abuse of children.
- Sexual exploitation of adults: Content depicting or promoting the sexual exploitation of adults.
- Unlawful immigration: Material facilitating or promoting illegal immigration activities.
- Human trafficking: Content related to the illegal trade of humans for exploitation or commercial gain.
- Fraud and financial offences: Material promoting or facilitating fraudulent financial activities.
- Proceeds of crime: Content concerning the handling or laundering of illegally obtained money.
- Assisting or encouraging suicide: Material that encourages or assists individuals in committing suicide.
- Drugs and psychoactive substances: Content promoting the sale or use of illegal drugs and substances.
- Weapons offences (knives, firearms, and other weapons): Material promoting the illegal possession or use of weapons.
- Foreign interference: Content involving foreign entities interfering in domestic affairs.
- Animal welfare: Material depicting or promoting cruelty towards animals.
Other illegal content (including non-priority offences) noted in the Online Safety Act 2023
All service providers must consider whether there is a risk of harm from other illegal content appearing on their service and, if so, what the magnitude of this risk is. Some of this illegal content is described in the Register of Risks as ‘non-priority illegal content’, but it may also be appropriate to consider other offences depending on the circumstances of your service and the evidence you hold. In addition to considering each of the 17 kinds of priority illegal content, you should consider whether you have evidence or reason to believe that other types of illegal harm that aren’t listed as priority offences in the Act are likely to occur on your service. If you do have evidence that a particular kind of non-priority illegal content is likely to occur, you should include it in your risk assessment.
A Summary of the Risk Assessment Duties
All applicable services must conduct a comprehensive risk assessment to evaluate potential exposure to harmful content. This involves assessing the likelihood of illegal harm appearing on the platform and taking proactive steps to mitigate these risks.
How to Conduct Your Risk Assessment Using Ofcom’s Four-Step Methodology
- Identify risks: Determine areas where illegal or harmful content may emerge.
- Evaluate severity: Assess the impact such content may have on users.
- Mitigate risks: Implement safeguards to reduce exposure.
- Review periodically: Update the assessment as risks evolve (a simple risk-register sketch covering these four steps follows below).
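By way of illustration, the sketch below (in Python) shows one way a small service might keep a risk register that follows these four steps. The likelihood and impact scales, field names, and scoring are our own assumptions for the example; Ofcom's guidance does not prescribe a particular data model or numeric scoring method.

```python
from dataclasses import dataclass, field
from datetime import date

# Illustrative scales only; Ofcom does not prescribe a numeric scoring model.
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}
IMPACT = {"low": 1, "medium": 2, "high": 3}

@dataclass
class RiskEntry:
    harm: str                  # e.g. one of the 17 kinds of priority illegal content
    likelihood: str            # "low" | "medium" | "high"
    impact: str                # "low" | "medium" | "high"
    mitigations: list[str] = field(default_factory=list)
    last_reviewed: date = field(default_factory=date.today)

    @property
    def score(self) -> int:
        # Step 2 (evaluate severity): a simple likelihood x impact product.
        return LIKELIHOOD[self.likelihood] * IMPACT[self.impact]

@dataclass
class RiskAssessment:
    service_name: str
    entries: list[RiskEntry] = field(default_factory=list)

    def identify(self, harm: str, likelihood: str, impact: str) -> RiskEntry:
        # Step 1 (identify risks): add a harm to the register.
        entry = RiskEntry(harm, likelihood, impact)
        self.entries.append(entry)
        return entry

    def mitigate(self, harm: str, measure: str) -> None:
        # Step 3 (mitigate risks): record the safeguard taken against the harm.
        for entry in self.entries:
            if entry.harm == harm:
                entry.mitigations.append(measure)

    def due_for_review(self, today: date, max_age_days: int = 365) -> list[RiskEntry]:
        # Step 4 (review periodically): flag entries older than the review cycle.
        return [e for e in self.entries if (today - e.last_reviewed).days > max_age_days]

# Example usage
assessment = RiskAssessment("example-forum")
assessment.identify("Terrorism", "low", "high")
assessment.mitigate("Terrorism", "keyword and hash-matching filters")
highest = max(assessment.entries, key=lambda e: e.score)
print(highest.harm, highest.score)
```

Keeping the register in a structured form like this also makes it easy to attach the evidence and mitigation records that Ofcom expects you to be able to produce.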
What Evidence to Use to Ensure Accuracy
A robust risk assessment relies on incident reports, content moderation data, user complaints, and expert analysis. Companies should document risk mitigation efforts for regulatory scrutiny.
When to Carry Out a New Assessment
Risk assessments should be conducted regularly, especially after major platform changes, updates to the Online Safety Act, or emerging threats detected within the digital landscape.
Your Legal Obligations
How to Conduct a Compliance Audit for the Online Safety Act
Regular audits help ensure policies, reporting mechanisms, and moderation practices align with legal requirements. An audit should review data security, risk assessments, and compliance documentation.
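As a rough illustration, a script like the one below can track whether evidence has been gathered for each audit area. The audit areas listed are drawn from this article's suggestions, not from a formal Ofcom checklist, and should be adapted to your own service.

```python
from datetime import date

# Illustrative audit areas; extend or rename to match your own service.
AUDIT_AREAS = [
    "illegal content risk assessment",
    "children's access assessment",
    "reporting and complaints mechanisms",
    "moderation policies and enforcement records",
    "data security controls",
    "compliance documentation and ownership",
]

def audit_status(evidence: dict[str, list[str]]) -> list[tuple[str, str]]:
    """Return (area, status) pairs; an area passes only if evidence is recorded."""
    return [
        (area, "evidence attached" if evidence.get(area) else "GAP - needs follow-up")
        for area in AUDIT_AREAS
    ]

if __name__ == "__main__":
    evidence = {
        "illegal content risk assessment": ["risk_register_2025.xlsx"],
        "moderation policies and enforcement records": ["community_guidelines_v4.pdf"],
    }
    print(f"Audit run {date.today()}")
    for area, status in audit_status(evidence):
        print(f"- {area}: {status}")
```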
Pre-Emptive Protection vs. Moderation: What’s Good Enough?
While platforms are not always required to prevent harm pre-emptively, they must demonstrate adequate moderation and risk-mitigation efforts. Proactive monitoring may be necessary for high-risk content.
Detecting, Removing, and Reporting Illegal Content
Platforms must implement systems to swiftly identify and remove prohibited content while maintaining reporting mechanisms for authorities like Ofcom or law enforcement.
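The sketch below illustrates one possible triage flow for flagged content: automatic removal of high-confidence matches, a human review queue for borderline cases, and a record of every decision. The thresholds and the remove_content, send_to_human_review and queue_external_report functions are hypothetical placeholders; in practice, reporting of some material (for example child sexual exploitation and abuse content) follows specific legal channels rather than a generic integration.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
import uuid

@dataclass
class ModerationDecision:
    content_id: str
    category: str            # e.g. "terrorism", "fraud", "intimate image abuse"
    action: str              # "removed", "restricted", or "no action"
    report_ref: str | None   # reference for any report queued to an external body
    decided_at: str

def handle_flagged_content(content_id: str, category: str, confidence: float) -> ModerationDecision:
    """Very simplified triage of a flagged item.

    The confidence thresholds are arbitrary examples; a real system would tune
    them per harm type and keep the full decision trail for Ofcom scrutiny.
    """
    now = datetime.now(timezone.utc).isoformat()
    if confidence >= 0.9:
        remove_content(content_id)                                 # hypothetical takedown call
        report_ref = queue_external_report(content_id, category)   # hypothetical reporting hook
        return ModerationDecision(content_id, category, "removed", report_ref, now)
    if confidence >= 0.6:
        send_to_human_review(content_id, category)                 # hypothetical review queue
        return ModerationDecision(content_id, category, "restricted", None, now)
    return ModerationDecision(content_id, category, "no action", None, now)

# Hypothetical integration points -- replace with your own systems.
def remove_content(content_id: str) -> None: ...
def send_to_human_review(content_id: str, category: str) -> None: ...
def queue_external_report(content_id: str, category: str) -> str:
    return f"report-{uuid.uuid4()}"
```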
Age Verification and Parental Controls: What’s Required?
Certain platforms must enforce age verification to prevent children from accessing inappropriate content. Parental controls should be made available to empower guardians.
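The minimal sketch below shows how access to age-restricted areas might be gated behind an age-assurance result. The check_with_provider function is a placeholder for an integration with an accredited age-assurance vendor; Ofcom expects "highly effective age assurance" (such as photo-ID matching or facial age estimation), not self-declared dates of birth.

```python
def check_with_provider(user_id: str) -> bool:
    """Placeholder for a call to an accredited age-assurance vendor."""
    raise NotImplementedError

def can_access_adult_section(user_id: str, verified_cache: dict[str, bool]) -> bool:
    # Cache successful verifications so users are not re-checked on every request.
    if user_id in verified_cache:
        return verified_cache[user_id]
    result = check_with_provider(user_id)
    verified_cache[user_id] = result
    return result
```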
Transparency and Reporting: What Must Be Disclosed to Ofcom?
Platforms must submit compliance reports detailing risk assessments, content moderation statistics, and transparency reports outlining their approach to tackling online harm.
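As an illustration, the snippet below aggregates a moderation log into the kind of headline figures a transparency report might contain. The field names and categories are assumptions for the example; the information Ofcom actually requires will be set out in transparency notices issued to categorised services.

```python
from collections import Counter

def summarise_moderation_log(decisions: list[dict]) -> dict:
    """Aggregate moderation decisions by harm category and action for a reporting period."""
    by_category = Counter(d["category"] for d in decisions)
    by_action = Counter(d["action"] for d in decisions)
    return {
        "total_actioned": sum(1 for d in decisions if d["action"] != "no action"),
        "by_category": dict(by_category),
        "by_action": dict(by_action),
    }
```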
User Safety Measures
Proactive Steps to Stay Compliant
Implement automated content detection tools, human moderation teams, and clear user policies to ensure compliance.
Training Your Team: Educating Staff on Compliance Best Practices
Compliance isn’t just a technical issue—it requires ongoing staff training on moderation techniques, risk assessment, and crisis response.
Algorithmic Accountability: Managing Risks in Automated Systems
AI-driven moderation tools must be transparent and auditable to prevent bias or regulatory violations.
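One practical way to make automated decisions auditable is to log every decision with the model version and a hash of the content, as in the sketch below. This is a minimal illustration, not a prescribed format; a content hash is used so decisions can be reviewed later without unnecessarily retaining the underlying material.

```python
import hashlib
import json
from datetime import datetime, timezone
from typing import TextIO

def log_automated_decision(log_file: TextIO, model_version: str, content: str,
                           label: str, confidence: float) -> None:
    """Append one JSON audit record per automated moderation decision."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "model_version": model_version,                    # which model made the call
        "content_sha256": hashlib.sha256(content.encode("utf-8")).hexdigest(),
        "label": label,                                    # e.g. "hate offence"
        "confidence": round(confidence, 3),
    }
    log_file.write(json.dumps(record) + "\n")
```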
Moderation and Content Governance: Best Practices for Compliance
Establish clear community guidelines, enforce penalties for violations, and maintain an appeals process for content moderation decisions.
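A simple way to keep appeal handling consistent is to model it as a small state machine, as sketched below. The states and transitions are illustrative assumptions; your own appeals policy may need more stages, such as escalation to a senior reviewer.

```python
from enum import Enum

class AppealState(Enum):
    SUBMITTED = "submitted"
    UNDER_REVIEW = "under_review"
    UPHELD = "upheld"          # original moderation decision stands
    OVERTURNED = "overturned"  # content reinstated or penalty reversed

ALLOWED_TRANSITIONS = {
    AppealState.SUBMITTED: {AppealState.UNDER_REVIEW},
    AppealState.UNDER_REVIEW: {AppealState.UPHELD, AppealState.OVERTURNED},
    AppealState.UPHELD: set(),
    AppealState.OVERTURNED: set(),
}

def advance_appeal(current: AppealState, target: AppealState) -> AppealState:
    """Move an appeal to a new state, rejecting invalid jumps
    (e.g. closing an appeal that was never reviewed)."""
    if target not in ALLOWED_TRANSITIONS[current]:
        raise ValueError(f"Cannot move appeal from {current.value} to {target.value}")
    return target
```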
The Role of Digital Services in Ensuring Platform Safety
Third-party security tools, user reporting systems, and partnerships with law enforcement enhance compliance efforts.
Data Privacy and Security in Online Safety Compliance
Compliance measures must not infringe on user privacy rights. Secure data storage and encryption protocols are essential.
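As a minimal illustration of encryption at rest, the sketch below uses the third-party Python cryptography package to encrypt a piece of stored moderation evidence. Key management (rotation, access control, use of a key management service) is the harder part in practice and is out of scope here.

```python
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, load this from a key management service
fernet = Fernet(key)

evidence = b"user report #1042: flagged message content"
token = fernet.encrypt(evidence)   # ciphertext that is safe to store at rest
original = fernet.decrypt(token)   # only holders of the key can read it back
assert original == evidence
```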
Penalties for Non-Compliance: What’s at Stake?
Fines for non-compliance can reach £18 million or 10% of global turnover, whichever is higher. Repeat offenders may face criminal liability or service restrictions.
The Role of Regulators: What to Expect from Ofcom and Enforcement Agencies
Ofcom holds investigative and enforcement powers, issuing compliance notices, fines, and injunctions for non-compliance. Platforms must be prepared for regulatory scrutiny.
Third-Party Tools and Solutions to Support Compliance
Companies can utilise third-party moderation software, AI-driven risk detection, and legal advisory services to strengthen compliance efforts.
How to Develop a Long-Term Compliance Strategy
Long-term compliance requires continuous monitoring, periodic audits, adaptation to legislative updates, and a designated compliance officer overseeing regulatory adherence.
Future Changes and Evolving Regulations: Staying Ahead of Updates
The Online Safety Act will evolve as new threats emerge. Staying proactive with legal updates and industry best practices ensures continued compliance.
Conclusion: Immediate Actions to Ensure Compliance Today
Organisations must conduct a risk assessment, implement safety measures, train staff, and develop clear content policies without delay. Ensuring compliance now will mitigate regulatory risks and protect users in an increasingly scrutinised digital landscape.
Sources of this information
We have reviewed Ofcom's guidance for service providers on complying with their illegal content risk assessment duties under the Act. See the Ofcom website.
Online platforms are required to conduct thorough risk assessments to identify and mitigate the presence of both priority and non-priority illegal content, ensuring compliance with the Act and fostering a safer online environment for all users. See Ofcom Guidelines (pdf)