Why you shouldn’t weep over WannaCry
The hacking of the NHS was bad, but not that bad.
As the whole world knows by now, if you’re still running Windows XP, Windows Server 2003 or Windows 8, and you haven’t downloaded Microsoft’s freshly issued software patch to protect yourself, your PC could be hit by a malicious piece of code called WannaCry. Blackmailers have targeted users of these systems, locked up (‘encrypted’) their computers, and demanded $300 from each in return for a promise – and it’s only a promise – to unlock them. In the UK, the NHS was hit.
The sums asked for and the amounts illegally collected so far are piffling. But while encrypting ransomware has been growing since 2013, this attack was on a new scale: there have been at least 200,000 infections, reaching 150 countries. That’s why we need to get the clearest possible perspective on them. Here are five reasons we shouldn’t weep over WannaCry.
Alarmist reactions only embolden hackers
The disreputable state of much of the world’s IT gives hackers confidence, and it’s no accident that countries such as Russia, China, India and Indonesia, where outdated systems are very prevalent, were among the hardest hit. But a second factor also gives hackers confidence: exaggerated fears. ‘The first death directly attributable to a cyber-attack suddenly seems possible’, said the Financial Times of the hacking of NHS computers, as if the 2015 hacking of Ashley Madison, an adult dating site, had not prompted suicides in Canada and Texas. This kind of worst-case-scenario alarmism can only encourage yet more spotty teenage males to try their hand, in the privacy of their own bedrooms, at cyber-tricks.
Like a good health or nuclear panic, the media love a cyber-story. BBC coverage continuously adopted scary metaphors from the worlds of disease and radiation, talking of infection and fallout (not to be outdone, Wired talked of meltdown). But of course it wasn’t just the media that overdid things. When a ‘second spike’ of attacks on the NHS failed to emerge on Monday 15 May, no less an authority than the National Crime Agency chipped in. Echoing George W Bush’s notorious defence secretary Donald Rumsfeld (‘absence of evidence is not evidence of absence’), the NCA discovered that the non-appearance of a second strike ‘doesn’t mean there won’t be one’.
In fact, WannaCry hit only around 47 of England’s more than 200 NHS trusts. Even the tawdry NHS has computer back-ups. Power stations, utilities, banks, television, mobile phones? They weren’t hit, even if the Spanish telecommunications firm Telefonica and the US delivery service FedEx were.
WannaCry is bad news, but it doesn’t deserve hysteria. It locked up data; it didn’t destroy data.
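The locked-not-destroyed distinction can be made concrete. The toy sketch below uses a simple XOR stream cipher purely for illustration (real ransomware like WannaCry uses AES for files plus RSA for the file keys; the variable names and sample data here are invented for the example): the victim’s bytes are all still on disk, just unreadable without the attacker’s key.

```python
# Toy illustration (NOT real cryptography): ransomware encrypts data
# rather than erasing it, so recovery is a key problem, not a data-loss
# problem. A repeating-key XOR stands in for the real AES/RSA scheme.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte with a repeating key; applying it twice restores the input."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

original = b"patient records"        # hypothetical victim data
key = b"attacker-held secret"       # hypothetical attacker key

locked = xor_cipher(original, key)  # what the victim is left with
assert locked != original                    # unreadable without the key
assert xor_cipher(locked, key) == original   # the data was never destroyed
```

The same reversibility is why off-site back-ups defeat this class of attack: the defender can simply restore the plaintext without ever needing the attacker’s key.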
Live by the sword, die by the sword
It isn’t at all necessary to be anti-American or pro-Putin to agree with the Russian president that malware created by intelligence agencies can backfire on its creators. In the case of WannaCry, America’s National Security Agency is strongly implicated as an unwitting facilitator of the hackers.
The NSA, seeking to attack its enemies, was responsible for first finding a vulnerability in Microsoft’s operating system, then making tools to take advantage of it. Next, the NSA failed to secure its own arsenal: the tools were leaked online by a group calling itself the Shadow Brokers, allowing hackers to meld them with existing malware and do their worst.
Clearly the NSA is far from secure. As the Washington Post rightly observed, WannaCry ‘is the story of nearly all weapons technology: eventually, it will get out. And it will fall into the wrong hands.’ Nor is it just a one-off. Brad Smith, Microsoft’s president and chief legal officer, was only being accurate when, referring to software systems, he denounced what he called ‘the stockpiling of vulnerabilities by governments’.
We’ve just got to learn to expect this. But we must also learn to place no confidence in security agencies – neither to cease their development of cyber-weapons, nor to police the internet on our behalf.
Knowledge economy? You’re kidding
WannaCry confirms a second adage from the military world: accurate knowledge about a likely threat not only has to be acquired, it has to be understood, passed on and acted upon. As UK defence secretary Michael Fallon was forced to admit in relation to complacent NHS managers, ‘We warned them, and they were warned again in the spring’. But nothing happened.
Even before today’s cyber era, stupid leaders ignored good warnings. In 1941, everyone from Winston Churchill to Soviet intelligence agents in Berlin and Tokyo warned Joe Stalin of Hitler’s likely invasion of the Soviet Union. He paid no attention. In 2001, in the weeks before al-Qaeda’s attacks on New York City, FBI chiefs in Washington reputedly disregarded the desire of Bureau agents in Minneapolis to intensify investigations of a 9/11 conspirator. Wearing a ratty old t-shirt and a baseball cap, Zacarias Moussaoui had been reported paying out thousands of dollars to spend 12 hours in a 747 flight simulator. But nobody at the FBI’s HQ saw any reason to act.
For all the hi-tech mystery that surrounds cyber-security, it remains a human question. But what WannaCry underlines is how, when economists and Silicon Valley go on about us all entering a knowledge economy, they’re missing the point. Even where it exists, and with capitalism it often doesn’t, knowledge, like innovation, has to be successfully diffused and applied. If NHS chiefs are perpetually distracted by government reorganisations, as they are, it’s no wonder they can’t take in the latest mind-numbing memo about IT.
Incompetent elites fail to invest, and fight among themselves
It’s true that WannaCry has exposed the myopia of health secretary Jeremy Hunt, as well as a British investment crisis that, even if it isn’t by any means the NHS’s only problem, is certainly one of them.
But what the WannaCry ‘virus’ has also shown is that factionalism in high places isn’t just a problem besetting Donald Trump’s chaotic White House. It’s everywhere. In his remarks about the warnings given to the NHS, Fallon was really trying to pass the buck to Hunt. Similarly, relations between Silicon Valley and the US intelligence services are very fractious, as Brad Smith’s blasting of the NSA’s leak shows: it was, he wrote, like the Pentagon having ‘some of its Tomahawk missiles stolen’.
Of course, there’s more that binds Silicon Valley to the state than that which divides them. Still: expect more recriminations about cyber-security – and more elite instability as a result.
Fight back with a new cyber-industry
When IT commentators are not forecasting doom over cyber matters, they wax lyrical about how artificial intelligence is already a wonderful thing. In fact we don’t at all have artificial intelligence today: we have clever software, but nothing that can emulate all the faculties that make us human. Nevertheless, there are advances happening in the less overblown field of machine learning (ML), pioneered by people like Demis Hassabis, CEO of the London company DeepMind (now part of the Google empire), and by the US cyber-security specialist CrowdStrike.
With ML, computers can be ‘trained’ not just to recognise the signatures of threats seen before, but to anticipate what a new threat will do and what it has in common with known ones. Naturally, ML has to work alongside, not replace, human analysts and conventional security information and event management (SIEM) systems. But given the scale of today’s threats, Britain and other countries urgently need to build up a new sector of wealth creation, to join electric cars and others: cyber-security based on ML.
That sector would be highly automated; but demand is likely to be such that jobs there will be durable. The WannaCry cyber-cloud, then, could yet have a silver lining.
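The kind of training described above can be sketched in miniature. The example below is a nearest-centroid classifier over hypothetical behaviour features; the feature choices (file writes per second, distinct hosts contacted, entropy of written data) and all the numbers are illustrative assumptions, not real telemetry, and production systems use far richer models alongside SIEM tooling.

```python
# Minimal sketch of ML-based threat detection: label a new behaviour
# vector by its distance to the average ("centroid") of each class of
# labelled training examples. All features and data are hypothetical.
import math

def centroid(rows):
    """Column-wise mean of a list of equal-length feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# Labelled examples: [file writes/sec, hosts contacted, write entropy 0-8]
benign = [[2, 1, 3.1], [3, 2, 2.8], [1, 1, 3.5]]
ransomware = [[120, 40, 7.9], [90, 35, 7.6], [150, 60, 7.8]]

centroids = {"benign": centroid(benign), "ransomware": centroid(ransomware)}

def classify(sample):
    """Assign the label of the nearest class centroid."""
    return min(centroids, key=lambda label: distance(sample, centroids[label]))

print(classify([110, 50, 7.7]))  # → ransomware
print(classify([2, 1, 3.0]))     # → benign
```

The design point is the one the article makes: the model generalises from examples it was trained on to behaviour it has never seen, rather than matching a fixed list of known signatures.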