COMPLY WITH ONLINE SAFETY

ONLINE SAFETY ACT

 

The UK's Online Safety Act (OSA), enacted on 26 October 2023, governs user-generated online content, with an emphasis on regulating illegal material and content harmful to children. The implementation timeline shifted slightly during 2024, but Ofcom pressed ahead with building an operational framework, running numerous consultations and publishing draft codes and guidance, particularly in the latter part of the year.

 

COMPLIANCE AND ENFORCEMENT

 

Compliance preparations accelerated significantly toward the end of 2024, culminating on 16 December when Ofcom published its Statement on protecting people from illegal harms online. The statement set out the Illegal Harms Codes and accompanying guidance, specifying the obligations that user-to-user and search service providers must meet to fulfil their illegal harms safety duties under the OSA. Earlier, on 20 November, the government released its draft Statement of Strategic Priorities (SSP), which Ofcom must consider when performing its regulatory functions, including enforcement. Ofcom is required to report to the Secretary of State on the action it has taken in line with these priorities, and the reports will inform government measures on online safety. The priorities will be finalised, with input from online safety experts and campaigners, before Ofcom begins enforcing the OSA's initial safety duties.

 

AGE ASSURANCE AND CHILDREN'S ACCESS

 

On 16 January 2025, Ofcom published its Age Assurance and Children's Access Statement under the OSA, alongside the following:

  • Guidance on highly effective age assurance and other Part 5 duties - aimed at pornography providers, to support compliance with section 81 of the OSA, in particular the use of age verification or estimation (collectively, age assurance) to ensure that children are not normally able to encounter pornographic content.

  • Guidance on highly effective age assurance (HEAA) - intended for user-to-user and search services, to help them implement HEAA measures in line with their OSA obligations; only services that do so are likely to be able to demonstrate that children cannot access their platforms.

  • Children's Access Assessments Guidance - designed to assist user-to-user and search services in meeting the requirement to assess whether children can access their services.

 

Relevant obligations are summarised in Ofcom's quick guides, which are helpful starting points.

 

FURTHER LEGISLATION

 

Legislation was also introduced, and in some cases finalised, in December 2024. The Online Safety Act 2023 (Commencement No. 4) Regulations 2024 were made on 10 December 2024, bringing into force sections of the OSA relating to pornography providers from 17 January 2025. The draft Online Safety Act 2023 (Category 1, Category 2A and Category 2B Threshold Conditions) Regulations 2025 were laid before Parliament in December 2024. These regulations are significant as they will define the thresholds at which services become categorised and therefore subject to additional obligations under the OSA. The proposed thresholds have been revised since Ofcom published its initial views in October 2024 and are now as follows:

 

Category 1 - Thresholds are met by a regulated user-to-user service where, in respect of the user-to-user part of the service, it:

 

  • has an average number of monthly active UK users exceeding 34 million and uses a content recommender system, or

  • has an average number of monthly active UK users exceeding 7 million, uses a content recommender system, and provides functionality that allows users to forward or share regulated user-generated content with other users of the service.

 

Category 2A - Thresholds are met by a search engine of a regulated search service or a combined service where it has an average number of monthly active UK users exceeding 7 million and is not a search service that:

 

  • only enables a user to search selected websites or databases related to a specific topic, theme, or genre of search content, and

  • is facilitated through an arrangement between the provider of a regulated search service or combined service and one or more entities, relying on an application programming interface or other technical means to present search results to users.

 

Category 2B - Threshold conditions are met by a regulated user-to-user service where, in respect of the user-to-user part of that service, it has an average number of monthly active UK users exceeding 3 million and provides functionality that allows users to send direct messages to other users of the same service, with the design ensuring that messages cannot be encountered by any other users unless further action is taken by the sender or recipient.

 

The draft regulations also clarify the definition of a recommender system and outline the methodology for determining the number of active UK users. The register of categorised services will be published in summer 2025, with proposals on additional duties for categorised services expected no later than early 2026.
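As an illustration only, the draft threshold conditions described above can be sketched as a simple rule check. This is a hypothetical sketch: the parameter names are invented for this example, it simplifies the statutory wording (for instance, the "vertical search" exclusion and the direct-messaging design condition are reduced to single flags), and the draft regulations, not this code, are authoritative.

```python
# Illustrative sketch of the draft OSA categorisation thresholds.
# All names are invented for this example; the draft regulations are authoritative.

def categorise(monthly_uk_users: int,
               has_recommender: bool = False,
               allows_forwarding: bool = False,
               allows_direct_messages: bool = False,
               is_search_engine: bool = False,
               is_vertical_api_search: bool = False) -> set:
    """Return the set of categories whose draft threshold conditions are met."""
    categories = set()
    M = 1_000_000

    # Category 1: user-to-user services with a content recommender system.
    if has_recommender and monthly_uk_users > 34 * M:
        categories.add("Category 1")
    elif has_recommender and allows_forwarding and monthly_uk_users > 7 * M:
        categories.add("Category 1")

    # Category 2A: search engines, excluding limited topic-specific,
    # API-facilitated ("vertical") search services.
    if is_search_engine and monthly_uk_users > 7 * M and not is_vertical_api_search:
        categories.add("Category 2A")

    # Category 2B: user-to-user services offering private direct messaging.
    if allows_direct_messages and monthly_uk_users > 3 * M:
        categories.add("Category 2B")

    return categories
```

Note that a service can meet more than one set of threshold conditions at once; for example, a large user-to-user service with a recommender system, sharing functionality and direct messaging could fall within both Category 1 and Category 2B.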

 

WHAT THIS MEANS


  • Illegal harms duties - In-scope providers must assess the risk of illegal harms on their services by 16 March 2025. The Codes of Practice are expected to be finalised by 17 March 2025, when Ofcom will begin enforcement. Providers must follow the Codes or implement equally effective measures to protect users from illegal content. If using alternative measures, services must prove their effectiveness. Ofcom has stated it is "ready to take enforcement action if providers do not act promptly to address the risk on their services."

  • Children's access assessments - All in-scope user-to-user and search services (Part 3 services) must assess whether children are likely to access their service or parts of it by 16 April 2025, unless using HEAA. Ofcom expects most services will conclude that children can access them, requiring compliance with the children’s risk assessment and safety duties.

  • Measures to protect children on social media - Ofcom will publish its Protection of Children Codes and risk assessment guidance in April 2025. Services accessible to children must conduct a children’s risk assessment within three months (by July 2025) and implement protective measures as outlined in the Codes.

  • Services that allow pornography - Part 5 services publishing their own pornographic content, including some Generative AI tools, must implement age checks and HEAA by 17 January 2025. Part 3 services hosting user-generated pornographic content must have HEAA in place by July 2025.

 

VERY GOOD POWERS

 

The government has said it will keep the new online safety rules under review, particularly in relation to social media platforms and children. Appearing on Sunday with Laura Kuenssberg on 12 January 2025, Peter Kyle, Secretary of State for Science, Innovation and Technology, described the Online Safety Act as an inherited landscape "where we have seen a very uneven, unsatisfactory legislative settlement". He stopped short of committing to changing it or publishing further online safety legislation, but said he was open-minded on the subject. He did, however, say that the OSA contained some "very good powers" which he plans to use to tackle online safety concerns.

 

RESPONSE FROM CYBER LONDON

 

Cyber London’s research arm, COSPI – the Centre for Online Safety, Safeguarding, Privacy and Identity – was co-founded in 2024 at City St George's, University of London by Professor Muttukrishnan Rajarajan, also one of our directors, and Carrie Myers, a Professor of Criminology and Victimology. COSPI is a pioneering research centre that brings a unique interdisciplinary approach to exploring and understanding digital human security, with the aim of making the next-generation internet a safer and more secure platform. Its research culture actively fosters collaborative work that bridges traditional knowledge silos, building a more integrated understanding of the complex interactions between the socio-technical aspects of digital human security. As COSPI functions as a bridge and a catalyst for innovative research in the online safety, online harm, identity fraud and information security sectors, it will be actively researching the ramifications of OSA compliance and enforcement in the UK.

 



Cyber London is the recognised Cyber Cluster for London, supported by DSIT and UKC3.

