This is state censorship of the internet

UK government plans to tackle ‘online harms’ pose a huge threat to free speech.

Andrew Tettenborn


The UK government has unveiled its proposals to tackle so-called ‘online harms’. It wants to regulate social media through Ofcom, which currently regulates the media and the telecoms industry.

Under the proposals, Ofcom will be empowered to ensure that tech firms adopt a ‘duty of care’ towards users, especially children. This is to protect users, first, from illegal content, such as child pornography, which Ofcom will require tech firms to remove; and second, from harmful but legal content. In the second case, Ofcom will require tech firms to be upfront about what behaviour is acceptable and unacceptable on their sites, in the shape of transparently enforced terms and conditions. So, if a social-media platform states that promoting self-harm is unacceptable, Ofcom is empowered to ensure that stipulation is enforced. In addition, all companies ‘will need to ensure a higher level of protection for children, and take reasonable steps to protect them from inappropriate or harmful content’.

According to at least one report, executives at companies that fail to comply with Ofcom’s demands could face substantial fines or even prison sentences.

Full details about the legislation and the powers it entails will be released this spring. But make no mistake: even as it stands this plan is a serious threat to internet freedom.

For one thing, these proposals don’t just encompass the internet’s social-media behemoths, such as Facebook. Ofcom’s writ will run to all sites that provide services allowing the sharing of user-generated content or user interactions. That means if you run a pressure group, or a political website, and publish material or comments from users, then you are potentially in Ofcom’s crosshairs.

What’s more, quite apart from demanding that tech firms take down illegal material, Ofcom will require all sites featuring user-generated content to ensure their own terms and conditions are enforced. That is quite a burden. First, all sites will be forced to draft terms and conditions, and conceive of ‘thresholds for harmful but legal content’. They will then also have to come up with processes and systems to deal with complaints and allow for redress. And then they will have to take responsibility for enforcing the terms and conditions or face the potential wrath of Ofcom.

Empowering Ofcom to enforce sites’ own regulation of ‘harmful but legal content’ could be disastrous. And you can bet that there will be plenty of people and pressure groups itching to use this new state power to suppress discussions they would rather not see take place.

Yes, the plan states that ‘safeguards for freedom of expression have been built in throughout the framework’. Hence the freedom to publish harmful but legal content – as long as it’s clearly permitted in a platform’s terms and conditions. But unfortunately, even this freedom is qualified by the imperative to respect the ‘rights of children’, and the corresponding demand that companies ensure there is ‘a higher level of protection for children’. From this, it could follow that there will be removal-of-content orders aimed at legal discussions of, for example, the morality of suicide, or anti-vaccination, because they are deemed too harmful to children.

Besides, the line between legal and illegal speech is pretty fluid anyway. Despite former policeman Harry Miller’s minor victory over an over-intrusive Humberside Police last week, the catch-all prohibition in section 127 of the Communications Act 2003 on ‘grossly offensive’ material online is open to interpretation. It still means that any pungent or forceful statement that happens to annoy some interest group or other could give Ofcom reason to think it criminal and demand removal.

For all home secretary Priti Patel’s talk of needing to tame the Wild West of the internet in order to protect our children, it is clear what we have here: a plan for worryingly sweeping restrictions on what we can say, or allow others to say, online – not to mention an enormous increase in bureaucrats’ power to snoop.

It is not even clear that any of this will be very effective. Even Ofcom accepts that it can only realistically intervene in sites based in the UK. Depending on how the government responds to criticisms already made of its proposals, its plans may merely prompt controversial sites to move abroad, perhaps to some convenient offshore jurisdiction like the Isle of Man. If that happens, there will be precious little Ofcom can do about them, even if what they say is truly criminal. Even by Ofcom’s curious standards, that would be a spectacular own goal.

Andrew Tettenborn is a professor of commercial law and a former Cambridge admissions officer.


