
Yvette Cooper’s war on online privacy

Apple is being asked to open up its users to unfettered surveillance.

Freddie Attenborough



The UK could be about to single-handedly dismantle online privacy for the world. Last month, Keir Starmer’s Labour government demanded that Apple break its encryption and allow law enforcement to access users’ iCloud data. The demand, issued under the sweeping surveillance powers of the Investigatory Powers Act 2016 (dubbed the ‘Snooper’s Charter’ by civil-liberties groups), has raised serious concerns over privacy and free speech.

As first reported by the Washington Post earlier this month, UK home secretary Yvette Cooper issued the legally binding order in the form of a Technical Capability Notice (TCN). Unlike a traditional data request, a TCN is not a demand for access to specific user information. Instead, it forces a company to create the capability for future government access. In this case, it could mean dismantling end-to-end encryption – the process by which your messages, photos, files and personal data are ‘scrambled’ so that no one other than you can access them – a protection that even Apple itself cannot currently bypass.
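To make the principle concrete, the short sketch below uses the open-source Python ‘cryptography’ library to show why end-to-end encryption matters: data is scrambled with a key that exists only on the user’s device, so the company storing it sees nothing but ciphertext. It is purely illustrative – Apple’s iCloud protections use their own key system, not this code.

# A minimal sketch of the principle behind end-to-end encryption, for illustration only.
# It is not Apple's implementation; it simply shows data encrypted with a device-held key.
from cryptography.fernet import Fernet

# The key is generated and kept on the user's device; the service provider never sees it.
device_key = Fernet.generate_key()
cipher = Fernet(device_key)

# Data is encrypted ('scrambled') before it ever leaves the device...
ciphertext = cipher.encrypt(b"Private message, photo or file bytes")

# ...so the server only ever stores ciphertext it cannot read.
# Only the holder of device_key can recover the original data.
assert cipher.decrypt(ciphertext) == b"Private message, photo or file bytes"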

The most chilling part of this story isn’t that the UK government instantly gains access to iCloud data worldwide – it doesn’t. The real issue is that the government is forcing Apple to break encryption pre-emptively. Instead of asking for access when necessary and legally warranted, the government is attempting to establish permanent access to encrypted data. The infrastructure for surveillance will exist, regardless of whether it is ever needed. Apple is expected to challenge the TCN through legal channels, but UK law requires compliance, even while an appeal is ongoing. The appeal process could take years.

The implications of this extend far beyond Britain. If Apple introduces a backdoor for the UK authorities, it is unlikely to end there. Authoritarian regimes will be quick to demand the same access. Once the technology exists, Apple will no longer be able to claim it cannot decrypt user data. It will merely be a question of who gets access. Governments in China, Russia, Iran and elsewhere will have a powerful new argument: if Britain gets access to the ‘backdoor’, why can’t they?

Apple has built its reputation on protecting user privacy and has long resisted government demands for backdoor access. In a submission to the UK parliament last year, the US tech giant warned that British surveillance powers could force it to withdraw encryption protections from the UK market entirely. It described end-to-end encryption as ‘an invaluable protection for journalists, human-rights activists and diplomats who may be targeted by malicious actors’.

Sadly, this isn’t the first time the UK has sought to undermine privacy under the guise of security. Just a few years ago, while the Online Safety Act 2023 was still passing through parliament, it contained proposals for client-side scanning – a surveillance method that would have forced tech companies to scan private messages on users’ devices before they were encrypted. It would have given governments direct access to those conversations, fundamentally altering the security model of encrypted communications.
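By way of illustration only, the hypothetical sketch below (in Python) shows the basic logic of client-side scanning: content is checked against a watchlist on the device before it is ever encrypted. Real proposals, such as Apple’s later-abandoned system, relied on perceptual image hashing rather than the simple cryptographic hash comparison used here.

# A hypothetical sketch of client-side scanning in principle: the device checks
# content against an externally supplied watchlist *before* encryption.
import hashlib

# A hash watchlist supplied by an outside authority (hypothetical values).
WATCHLIST = {
    hashlib.sha256(b"known prohibited content").hexdigest(),
}

def scan_before_encryption(plaintext: bytes) -> bool:
    """Return True if the content matches the watchlist and would be flagged."""
    return hashlib.sha256(plaintext).hexdigest() in WATCHLIST

message = b"an ordinary private message"
if scan_before_encryption(message):
    print("Match found: message flagged before encryption")  # the surveillance step
else:
    print("No match: message proceeds to encryption as normal")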

The proposal triggered widespread alarm. Messaging app Signal warned it would ‘100 per cent walk’ from the UK rather than comply. ‘Encryption is either protecting everyone, or it is broken for everyone’, declared the company’s president, Meredith Whittaker.

WhatsApp made similar threats, arguing that weakening encryption in one jurisdiction undermines security for all users globally. Critics of client-side scanning pointed out that such systems could be exploited not only by governments, but also by hackers. Even the UK’s own Information Commissioner’s Office acknowledged that encryption enhances child safety by reducing risks such as blackmail and exploitation.

In 2021, Apple itself abandoned an attempt to introduce client-side-scanning software after 14 leading computer scientists published a devastating critique of the practice. Their paper, ‘Bugs in our pockets’, identified 15 ways the technology could be exploited – by governments, malicious actors, rogue employees and even the criminals it was meant to target.

Authorities often claim they only seek access for legitimate purposes, but history tells a different story. Once a surveillance system is built, it is almost always repurposed. Encryption backdoors, like the Clipper Chip in the 1990s and Dual EC DRBG in the 2000s and 2010s, have repeatedly led to catastrophic security failures. Weakening encryption never stays limited to its original purpose. Sooner or later, it gets abused.

Whatever happens next, the UK has just fired the starting gun on a battle that extends far beyond one company or one country. If encryption backdoors become the norm, the ability to communicate securely, without fear of surveillance or reprisal, will be permanently undermined.

This isn’t just about privacy – it is also about the very future of digital freedom.

Freddie Attenborough is the digital communications director of the Free Speech Union.
