Facial recognition: Britain faces a dystopian future

Automated facial recognition is a grave threat to privacy and the presumption of innocence.

George Harrison

Automated facial recognition (AFR) is the state’s latest, and most invasive, surveillance technology.

Since 2015, three police forces – South Wales, the Metropolitan and Leicestershire – have made use of AFR in controversial live trials. Now, South Wales Police have been taken to court by office worker Ed Bridges, who started a crowdfunding campaign after he felt his privacy had been violated by AFR.

Bridges’ legal challenge has been backed by civil-rights organisation Liberty, which argues that the indiscriminate deployment of AFR is equivalent to taking DNA samples or fingerprints without consent. According to Liberty, there are no legal grounds for scanning thousands of innocent people in this way. It also claims the technology discriminates against black people, whose faces are disproportionately flagged by mistake, meaning they are more likely to be stopped by police unfairly.

In London, AFR has been put on hold while the Metropolitan Police carries out a review. The Met is also facing a legal challenge of its own from Big Brother Watch, another civil-liberties group. Director Silkie Carlo, a vocal critic of AFR since its inception, told spiked: ‘People are right to be concerned when they can see us moving towards a police state. The result of this technology is that the normal relationship between innocence and suspicion has been inverted.’

One camera, placed in a busy, inner-city location, can scan the faces of up to 18,000 pedestrians per minute, automatically logging the features of anyone unlucky enough to walk past. A computer immediately checks these faces against a database of wanted mugs and lets nearby officers know if there’s a match.
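
To make that matching step concrete, here is a purely illustrative Python sketch – not any police force's actual system – of how a scanned face might be checked against a watchlist. It assumes each face has already been reduced to a numeric 'embedding' vector; the threshold, names and data below are invented for illustration.

# A purely illustrative sketch of the matching step described above,
# not any police force's actual system. It assumes each scanned face has
# already been reduced to a numeric 'embedding' vector, and flags a match
# when that vector is close enough to one on a watchlist.
from typing import Dict, Optional

import numpy as np

SIMILARITY_THRESHOLD = 0.92  # hypothetical tuning parameter


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def check_against_watchlist(face: np.ndarray, watchlist: Dict[str, np.ndarray]) -> Optional[str]:
    """Return the watchlist identity that best matches the scanned face,
    or None if no candidate clears the similarity threshold."""
    best_name, best_score = None, SIMILARITY_THRESHOLD
    for name, reference in watchlist.items():
        score = cosine_similarity(face, reference)
        if score > best_score:
            best_name, best_score = name, score
    return best_name


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake 128-dimensional embeddings standing in for real face data.
    watchlist = {"suspect_a": rng.normal(size=128), "suspect_b": rng.normal(size=128)}
    passer_by = rng.normal(size=128)  # an unrelated face: should not match
    lookalike = watchlist["suspect_a"] + rng.normal(scale=0.05, size=128)  # near-match
    print(check_against_watchlist(passer_by, watchlist))   # expected: None
    print(check_against_watchlist(lookalike, watchlist))   # expected: suspect_a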

I have previously warned on spiked against the illiberal use of this technology, and the flaws inherent in AFR policing have since become even clearer. Around 50 deployments have taken place so far in Wales alone, including during the Champions League final in Cardiff in June 2017. On that occasion, the cameras flagged 2,470 people as potential matches with wanted criminals – 92 per cent of them wrongly.

The trials do not exactly inspire confidence in the accuracy of this technology. But even if AFR worked perfectly, its use would still violate our right to privacy and turn us all into suspects. In previous live AFR trials, it was unclear what would happen to members of the public who refused to be scanned. Well, now we know: anyone who doesn’t consent to being turned into a walking ID card will be treated like a criminal.

During a trial in east London earlier this month, a man spotted the police cameras and pulled up his jumper to cover his face. Officers ordered him to uncover it and took a picture of him anyway. When he protested – pointing out that he hadn’t committed a crime – he was fined for disorderly behaviour. Is this a future anyone wants? A country where you can’t even walk down the road without being digitally violated, only to be mobbed by officers and fined if you complain? The presumption of innocence – arguably the key tenet of our legal system – is being thrown out of the window.

And there are still burning questions the police need to answer. We do not know, for example, what would happen to a Muslim woman wearing a veil who walks past a facial-recognition camera and refuses to take it off. Although the Home Office insists that refusing to be scanned is not a crime – or grounds for suspicion on its own – it is clear from the latest trial that officers do not take kindly to citizens dodging their cameras.

Even more alarming is the fact that this technology has been deployed with no real oversight from politicians and no public consultation. As Big Brother Watch puts it: ‘This is the police making up their own rules as they go along, and we shouldn’t stand for it.’ These automated facial-recognition checkpoints have been allowed to spring up largely without our knowledge or consent.

Meanwhile, the government has largely kept quiet on the subject. The Home Office’s Biometrics Strategy outlines the benefits of AFR, and clearly views it as a useful aid to policing. It also suggests that AFR could eventually be used at UK borders to check travellers’ faces against watchlists. It says that police forces could benefit from a ‘common facial matching service’ which would harmonise the use of AFR across law-enforcement bodies and make it even more powerful.

Civil-rights groups also fear that Britain’s vast network of CCTV cameras could eventually be fitted with AFR technology – not an immediate prospect, but a real one. Britain has the most CCTV cameras per head of any country in the world. Fitting them with AFR would create a surveillance state more powerful than any in history.

Thankfully, owing to pressure from campaigners, the wide-scale use of this technology in the UK is likely to be stalled – for now, at least. The Metropolitan Police will announce its decision on AFR in June. If the force decides to keep using it, a further legal challenge is likely, slowing deployment once again. But there is a lot resting on this. If the use of AFR isn’t stopped, we will be swiftly frogmarched into a future where we can’t even pop to the shops without a CCTV camera scanning our face. Essentially, we’d be living with biometric checkpoints on every street corner.

The word dystopian is often overused, but it is a perfectly apt description of Britain’s future if the use of AFR is allowed to spread unchecked.

George Harrison is a writer. He tweets at @George_Haz

Picture by: Getty.
