UK regulator Ofcom has laid out its plans for how it intends to implement the UK’s Online Harms Act.
The UK’s disastrous Online Harms Bill is now law. It passed a final vote and received Royal Assent last month. The new law will crack down on effective encryption and put people’s lives at greater risk, rip people’s personal privacy to shreds with age verification requirements, and a whole lot more. Insultingly, it is all being done in the name of safety.
Following the bill’s passage, National Crime Agency director Graeme Biggar said in a rant that platforms’ efforts to secure people’s communications and privacy are akin to turning a blind eye to child abuse. It was a rant so unhinged, it made Facebook seem like a respectable protector of people’s personal information, which is quite an accomplishment for all the wrong reasons.
Regardless of the anti-privacy stances of some officials, implementation of this travesty of a law is currently in the works, courtesy of the UK regulator, Ofcom (Office of Communications). In a recently published post, the regulator says that this will all be handled in three phases:
We are moving quickly to implement the new rules
Ofcom will give guidance and set out codes of practice on how in-scope companies can comply with their duties, in three phases, as set out in the Act.
Phase one: illegal harms duties
We will publish draft codes and guidance on these duties on 9 November 2023, including:
- analysis of the causes and impacts of online harm, to support services in carrying out their risk assessments;
- draft guidance on a recommended process for assessing risk;
- draft codes of practice, setting out what services can do to mitigate the risk of harm; and
- draft guidelines on Ofcom’s approach to enforcement.
We will consult on these documents, and plan to publish a statement on our final decisions in Autumn 2024. The codes of practice will then be submitted to the Secretary of State for Science, Innovation and Technology, and subject to their approval, laid before Parliament.
Phase two: child safety, pornography and the protection of women and girls
Child protection duties will be set out in two parts. First, online pornography services and other interested stakeholders will be able to read and respond to our draft guidance on age assurance from December 2023. This will be relevant to all services in scope of Part 5 of the Online Safety Act.
Secondly, regulated services and other interested stakeholders will be able to read and respond to draft codes of practice relating to protection of children, in Spring 2024.
Alongside this, we expect to consult on:
- analysis of the causes and impacts of online harm to children; and
- draft risk assessment guidance focusing on children’s harms.
We expect to publish draft guidance on protecting women and girls by Spring 2025, when we will have finalised our codes of practice on protection of children.
Phase three: transparency, user empowerment, and other duties on categorised services
A small proportion of regulated services will be designated Category 1, 2A or 2B services if they meet certain thresholds set out in secondary legislation to be made by Government. Our final stage of implementation focuses on additional requirements that fall only on these categorised services. Those requirements include duties to:
- produce transparency reports;
- provide user empowerment tools;
- operate in line with terms of service;
- protect certain types of journalistic content; and
- prevent fraudulent advertising.
We now plan to issue a call for evidence regarding our approach to these duties in early 2024 and a consultation on draft transparency guidance in mid 2024.
So, a good chunk of how Ofcom views enforcing the Online Harms law, specifically as it relates to “harmful” content, will be coming out next week. That will undoubtedly offer quite a few hints as to where the regulator is with that aspect of the new law. Next month, it looks like the issues surrounding age verification and related elements will be tackled. Then, the rest will seemingly be dealt with in Spring of next year.
What strikes me is just how compressed this schedule is. I mean, we are dealing with a law that is pretty much impossible to implement from a technical standpoint, and Ofcom says it is releasing draft regulations for two major elements in under two months. It suggests that these regulations are being rushed, which is generally a bad sign for those hoping that a regulator would get something like this right. An issue such as “awful but lawful” content is filled with nuance that would normally take years to sort out. Yet, here the regulator is saying that it can solve every problem regarding people saying mean things on the internet within weeks. That should raise a whole lot of red flags for where this is heading.
The Online Harms Act was long suspected of being poorly thought out. Critics have warned that it is going to cause considerably more harm than good. The rushed schedule published by Ofcom will no doubt only serve as additional evidence that this is exactly the case, even from an enforcement perspective.
Drew Wilson on Twitter: @icecube85 and Facebook.