The often-controversial UK Online Safety Act (the OSA) has finally become law, having received Royal Assent yesterday (see the government announcement, “UK children and adults to be safer online as world-leading bill becomes law”, www.gov.uk). This heralds the end of the era in which user generated content was largely self-regulated by technology platforms, whether large or small.
The OSA will impose new duties on all providers who host “user generated content” (i.e. services which allow users to post their own content and/or to interact with other users) and on internet search engines. The OSA has extraterritorial effect, meaning it applies not only to providers based in the UK but also to overseas providers whose services have a significant number of UK users (or for whom the UK is a target market), as well as to any services which are capable of being accessed by individuals in the UK and present a material risk of significant harm to individuals in the UK.
However, the OSA does exempt some services from its scope, of particular note pure email or SMS services and so-called “below the line” or limited functionality services, where users are only able to post comments or reviews of content published by the provider itself (as opposed to by other users), such as the ability to comment under online news stories published on a news website.
The OSA takes a tiered approach to regulation: a base level of new duties applies to all in-scope providers (estimated by government to number around 25,000), with additional requirements for what is expected to be a relatively small number of larger providers (or providers of especially risky services). These will be categorised as “Category 1”, “2A” or “2B” under criteria to be set out in secondary legislation following a consultation process by Ofcom, in its capacity as the official regulator under the OSA. However, it is safe to assume that all of the well-known social media platforms will fall into one of these categories.
New duties applicable to all in-scope providers will include undertaking assessments of the risk of harm that users may face as a result of illegal content on their service (with additional requirements if that service is likely to be accessed by children), and implementing policies to mitigate those risks, including by providing users with an easy way to report illegal content or content likely to be harmful to children. However, these new duties will not have legal effect until Ofcom has completed a consultation process on them and published guidance and codes of practice which have then received parliamentary approval.
Based on Ofcom guidance issued yesterday (https://www.ofcom.org.uk/online-safety/information-for-industry/roadmap-to-regulation), it appears that the service provider categorisation process will not be completed until the end of next year at the earliest, with draft proposals regarding the additional requirements for these larger providers to follow in early 2025. This is because Ofcom has chosen to prioritise its duty to consult on and publish the guidance and codes of practice for the new duties applying to all in-scope providers mentioned above, which must then be approved by parliament before those duties take legal effect. Ofcom is planning to move relatively quickly on this, saying it will publish draft guidance and codes early next month, with a view to submitting its final proposals for parliamentary approval in autumn 2024.
At this stage, the key action for any provider with an online presence in the UK is to assess whether they are likely to fall within the scope of the OSA and, if so, to begin preparing for their new duties in good time before they come into force. In-scope providers may also wish to participate in the consultation processes to be run by Ofcom, with a view to influencing the shape of those duties.