
The Online Safety Act: Introduction of new regulations by Ofcom for service providers

Introduction

There has been much discussion of the Online Safety Bill over the last year, not least because of its controversial reception among the Big Tech companies. The Bill received Royal Assent on 26 October 2023 and is now the Online Safety Act (the “Act”).

The Act is a new set of laws designed to protect online users by placing a duty of care on service providers, such as social media and tech companies, to take greater responsibility for their users’ safety on their platforms. One of its key aims is to make it more difficult for those under the age of 18 to access harmful content, such as pornography and content that promotes suicide or eating disorders, whilst also giving adults greater control over the content they interact with online.

The Act requires service providers to take responsibility for the safety of their users. This includes enforcing minimum age requirements, publishing risk assessments and removing illegal content, such as content promoting self-harm and non-consensual intimate deepfake images. Platforms will also need to consider how their services are designed so as to reduce the likelihood of their being used for criminal activity in the first place.

The Big Tech companies have raised concerns about the Act’s compliance requirements. For example, the Act requires encrypted messaging apps such as WhatsApp to identify child sexual abuse content, whether communicated publicly or privately. WhatsApp claims this undermines its ability to provide end-to-end encryption, which has been a major selling point among consumers concerned about their privacy.

Who the Online Safety Act applies to

The Act is aimed at a wide range of online ‘user-to-user services’ and online search services, so it extends well beyond the larger social media companies such as Facebook and Instagram. Subject to a few limited exceptions (such as internal message boards and intranets), it covers essentially any service that allows users to upload and/or share content that others can access (either publicly or privately). The service must also have ‘links to the UK’, which is widely defined in the Act and includes services:

i) with a significant number of UK users;

ii) where UK users are one of the service provider’s target markets; or

iii) that are capable of being accessed by users in the UK and ‘there are reasonable grounds to believe that there is a material risk of significant harm to individuals’ in the UK.

Impact

The introduction of the Act could have significant implications for service providers, such as:

  • Compliance may require a significant use of resources for the relevant organisations.
  • Relevant service providers will need to implement a rigorous internal governance process for risk-management purposes.
  • Service providers will also be required to take a proactive approach to protecting their users. This will include verifying user identities where appropriate and screening for illegal content. Additional training will be necessary to ensure staff are aware of the potential risks.
  • Organisations should also be mindful of data protection regulations and update their terms and conditions, privacy policies and cookie policies, including their acceptable use policies, so that effective protections are in place.
  • Service providers will also be expected to take any complaints about online abuse or inappropriate content seriously and to report serious breaches to the relevant authorities promptly.

Ofcom’s role

The Office of Communications (“Ofcom”) is responsible for implementing and overseeing the new regulatory regime. If online service providers do not comply with the new rules set by the Act, Ofcom can fine them up to £18 million or 10% of their global annual revenue, whichever is greater.

Ofcom is required to consult widely before exercising its functions under the Act. In the short term, Ofcom will:

  • publish and consult on draft codes of practice and guidance to address illegal online content;
  • create ‘risk profiles’ of different service providers to identify those likely to be higher risk; and
  • draft guidance on identifying illegal content and on record keeping, and on how Ofcom will approach enforcement of the duties imposed on service providers.

It is expected that additional duties will be applied to higher risk services based on thresholds such as user numbers, functionalities and features.

Next Steps

The Act itself does not set out the detailed practical steps that online service providers will be expected to take to comply; that is instead the task of Ofcom to implement and regulate. Ofcom launched its consultation process on 9 November 2023, covering illegal harms including child sexual abuse material, terrorist content and fraud.

Service providers are encouraged to engage with Ofcom’s consultation, which is intended to shape the regulatory regime by tailoring the rules so that they are proportionate and appropriately risk-based. Our specialist solicitors can help you navigate the complexities of the Online Safety Act and provide you with the legal support you need.

For further information and guidance, please contact 0800 652 8025, or request a consultation.

Your key contact

Sarah Coe

Partner

Bristol and London
Sarah is a Partner in the Corporate Commercial team, specialising in commercial, intellectual property, and technology law. Her work covers the full range of contractual and e-commerce issues relevant to businesses seeking to enter new markets or exploit new technology.
