Why the Online Safety Act is even worse than you think
Matthew Lesh on the authoritarian horrors still to come from the UK's new censorship regime.

The ‘age verification’ phase of the UK’s Online Safety Act has been in force for just over two weeks, and already vast swathes of the internet have gone dark to the majority of users. Although the new rules are supposed to protect children from ‘adult’ material, in practice, a great deal of obviously harmless content is being censored. Worse still, this is only the second phase of the act’s implementation, with many more rules, duties and restrictions on the way.
Matthew Lesh – country manager at Freshwater Strategy – sat down with Fraser Myers for the latest episode of the spiked podcast to discuss the Online Safety Act’s terrifying implications for free speech and privacy. What follows is an edited version of the conversation. You can watch the full episode here.
Fraser Myers: What kind of content have you seen being censored that obviously shouldn’t be?
Matthew Lesh: The requirement to remove material which could be considered ‘psychologically harmful’ to children has led to the restriction of vast swathes of content – and not just pornography. This is very much what the act intends. If you read Ofcom’s guidance, it says to remove anything that could be considered, say, ‘violent’. So we’ve seen a lot of material removed in relation to the wars in Ukraine and Gaza – a lot of which is certainly horrific, but also educational and informative. We’ve seen Katie Lam, a Tory MP, have her parliamentary speech about the rape gangs removed from social media, even though she was quoting from court transcripts. Users of X will have been met repeatedly with the warning: ‘Due to laws in your country, this material has been hidden until we estimate your age.’
Defenders of the Online Safety Act have tried to say this is down to bugs or that something has gone wrong with how it’s being implemented. But the reality is that this is exactly what the act implies and what Ofcom expects of the tech companies.
Myers: The penalties levied on companies that don’t comply are extraordinary. When threatened with vast fines, they were always likely to censor first and ask questions later, weren’t they?
Lesh: Indeed, we’re looking at fines of up to 10 per cent of annual global revenue for tech companies that break the rules. That is billions of pounds in some cases.
The act gives companies an extensive set of ‘safety’ duties – an expectation that they’re going to prevent people from accessing certain types of content – and if they fail in those duties, they will be punished severely. Some tech execs might even face criminal sanctions and jail time if they’re unresponsive to Ofcom’s orders. So of course, the incentive is to act in a conservative way, and remove lots of content rather than risk removing too little.
Now, whatever you think of Elon Musk, he has generally tried to oppose government diktats to remove content. But in this case, I think X is acting in the most prudent, legally justifiable way – which is by hiding a lot of content from British users that might be considered harmful to children. This is regardless of whether the content is obviously educational – and even despite the fact that in the not-too-distant future, 16- and 17-year-olds are meant to be able to vote. This is the great irony, of course. We’re telling these teenagers that they can’t have access to the material on which they might wish to base their political decisions. It is fundamentally undemocratic.
Myers: All of this is happening under phase two of the implementation of the act. Could you tell us what is still to come?
Lesh: Something that’s not particularly understood even by advocates of the Online Safety Act is that it explicitly envisages quite an expansive regime of censorship for adults.
What we’re seeing now is the age-verification phase. Those who are unwilling to prove their identity lose access to information. Freshwater did a recent poll on how people are going to respond to age-gating, and about one in four said they would just exit the website. That’s one in four people who will not access that information.
Even if the remaining 75 per cent ultimately comply, they are still only going to receive a censored version of the internet, because the duties in the act that relate to ‘illegal content’ are very widely drawn. It expects companies to remove anything that they have ‘reasonable grounds to infer… is illegal’. Now I’m not a lawyer, but the lawyers I speak to say that’s hogwash. This simply is not a reasonable legal standard.
This definition would result in a lot of content being censored: automatically, at speed, usually using AI systems. So at a base level, we’re just going to see a lot more censorship of digital platforms in an effort to comply with these laws.
Myers: So even if you’ve submitted your ID, proving you’re an adult, you’re still only going to get a ‘kid’s version’ of the internet?
Lesh: Yes, it will create a more restricted version of the internet for everyone.
The act also creates a lot of room for political interventions. There are ‘emergency powers’ built in, too. So if there is, say, another pandemic or a heavily contested election, the culture secretary can direct Ofcom to put pressure on tech companies to censor certain content. The practical implication is that, in a crisis, Ofcom is just going to be constantly pushed to intervene.
Another big concern is what the Online Safety Act means for private messaging. All the same duties apply to messaging platforms. Ofcom even has the power to force platforms like WhatsApp, Signal and Telegram to install software for the purposes of preventing people from being able to share child-exploitation material. On the face of it, this sounds completely reasonable. The problem is that this would effectively bring an end to the end-to-end encryption of private messages. These would all have to be checked against some kind of central database. WhatsApp has been quite clear that it would exit the UK market if the government asked it to start spying on its users in this way. I think the fact that Ofcom has that power, even if it’s dormant for now, causes some quite serious problems for how we communicate.
Matthew Lesh was talking to Fraser Myers for the latest episode of the spiked podcast.