
The Online Safety Act 2023 (the OSA) represents a landmark shift in the regulation of online spaces, ushering in a new era of accountability and safety in the digital world. Enacted with the primary aim of protecting both children and adults from harmful online content and activities, the OSA imposes a wide array of obligations on companies operating in the digital sphere. It seeks to mitigate the risks associated with illegal and harmful content, ensuring that online platforms are safer and more transparent for users across the UK. In this article, we delve into the scope of the OSA and identify who is subject to its provisions.

What is the Online Safety Act?

The OSA is a comprehensive piece of legislation designed to make the internet a safer place for users, particularly vulnerable groups such as children. It introduces new duties for digital service providers, mandating them to take proactive steps to prevent the dissemination of illegal content and to protect users from exposure to harmful material. Central to the Act is the role of Ofcom, the UK’s communications regulator, which has been tasked with enforcing these new rules and holding companies accountable for non-compliance.

When Does the OSA Apply?

The OSA casts a wide net in terms of its applicability. It primarily applies to two types of services:

  1. User-to-User Services: These are platforms where users can interact with each other, such as social media networks, video-sharing platforms, online forums, and messaging services. The OSA covers any service that allows user-generated content, including comments, posts, and shared files. This means that popular platforms like Facebook, Twitter, YouTube, and WhatsApp, among others, are directly impacted by the OSA.
  2. Search Services: The Act also applies to search engines that help users find information online. This includes major players like Google and Bing, which must now ensure that their services do not inadvertently direct users to illegal or harmful content.

Does the OSA Apply to Companies Outside the UK?

One of the most significant aspects of the OSA is its extraterritorial reach. The Act applies not only to companies based in the UK but also to those outside the country if they provide services accessible to UK users. This includes platforms that have a significant number of UK users, target the UK market, or pose a material risk of significant harm to UK users.

For example, a social media platform based in the United States but with a large user base in the UK would need to comply with the OSA’s requirements. Similarly, a search engine that is accessible in the UK and could potentially expose users to harmful content would also fall under the OSA’s jurisdiction. This ensures that international companies operating in the UK digital market are held to the same standards as domestic ones.

Key Duties Imposed by the OSA

The OSA imposes several key duties on service providers, tailored to the size and nature of the platform. These include:

  1. Illegal Content: All platforms must take robust measures to prevent illegal content from appearing on their services. This includes content related to child sexual abuse, terrorism, extreme violence, and other criminal activities. Companies are required to put in place systems that identify, remove, and prevent the reappearance of such content.
  2. Harmful Content for Children: Protecting children is a central focus of the OSA. Platforms likely to be accessed by children must implement stringent measures to prevent them from encountering harmful content. This includes pornography, content promoting self-harm or suicide, and material encouraging dangerous stunts. Companies must also ensure that age restrictions are enforced consistently across their services.
  3. Transparency and User Control: The OSA demands greater transparency from platforms about the kinds of content they allow and how they enforce their rules. Major platforms must provide adult users with tools to control the content they see and filter out harmful material, including hate speech and content that promotes eating disorders or self-harm.

OSA Enforcement and Penalties

Ofcom, as the regulator, has been given significant powers to enforce the OSA. It can impose fines of up to £18 million or 10% of a company’s global turnover, whichever is higher, for non-compliance. Additionally, criminal action can be taken against senior managers of companies that fail to meet their obligations, particularly in relation to protecting children. In the most serious cases, Ofcom can apply to the courts for business disruption measures, which require payment providers, advertisers and internet service providers to withdraw their services from a non-compliant platform, effectively blocking it from being accessed in the UK.

Conclusion

The Online Safety Act 2023 is a powerful tool in the fight to make the internet a safer space for all users, particularly the most vulnerable. Its broad scope and strict enforcement mechanisms mean that companies providing digital services to UK users must now operate with a higher degree of responsibility and transparency. As the digital landscape continues to evolve, the OSA will play a critical role in shaping a safer, more secure online environment.

At Nath Solicitors, we specialise in providing expert legal counsel on all aspects of harmful online content. If you need advice on complying with the Online Safety Act or any other legal matter, contact us on 0203 983 8278, or get in touch with Shubha Nath at enquiries@nathsolicitors.co.uk or shubha@nathsolicitors.co.uk.


Copyright. Nath Solicitors Limited. Registered in England and Wales. Company Number: 08724944. VAT number: 207490711. Office Located at: 35 Berkeley Square, London, W1J 5BF. Nath Solicitors Limited is authorised and regulated by the Solicitors Regulation Authority. Registration number 608014.