The psychobabble behind the ‘AI is racist’ claim

James Woudhuysen


Astonishing news is in. Apparently, artificial intelligence can be bigoted, too.

The claim comes from three Princeton computer scientists, who took Global Vectors for Word Representation (GloVe) – a popular algorithm that learns from unlabelled text – and set it crawling through billions of words on the internet, assessing the meaning of each word statistically by checking which other words appeared near it. They found that GloVe made the same iffy associations between words that human beings do. Or, as Wired put it: ‘Just like humans, artificial intelligence can be sexist and racist.’
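The basic mechanism is simple enough. Here is a minimal sketch – with a made-up one-line corpus and a tiny context window, nothing like the billions of words or the actual GloVe training procedure – of the co-occurrence counting that such methods build on:

```python
from collections import Counter

# Hypothetical toy corpus; GloVe itself is trained on billions of web words.
corpus = "the nurse helped the patient while the engineer fixed the machine".split()

WINDOW = 2  # words within this distance of each other count as co-occurring
cooc = Counter()
for i, word in enumerate(corpus):
    for j in range(max(0, i - WINDOW), min(len(corpus), i + WINDOW + 1)):
        if i != j:
            cooc[(word, corpus[j])] += 1

# Words that often appear near the same neighbours end up, after training,
# with similar vectors - that is all 'meaning' amounts to here.
print(cooc[("nurse", "helped")])
```

Whatever regularities the text contains – including its stereotypes – are baked into those counts before any 'learning' happens.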

GloVe linked words such as ‘woman’ or ‘girl’ with the arts, rather than with mathematics. It linked African-American names with unpleasant words. From this, the IT scholars concluded that ‘machine learning absorbs stereotyped biases as easily as any other’. IT that can learn, understand and produce language, they wrote, will acquire ‘cultural associations, some of which can be objectionable’.
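The kind of association the researchers measured can be illustrated with a toy calculation. The vectors below are invented three-dimensional stand-ins (real GloVe embeddings have hundreds of dimensions), and the score is a simplified version of the word-association test: how much closer, by cosine similarity, a target word sits to one attribute than to another.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical toy vectors, chosen only to illustrate the arithmetic.
vectors = {
    "woman": [0.9, 0.1, 0.2],
    "man":   [0.1, 0.9, 0.2],
    "arts":  [0.8, 0.2, 0.1],
    "maths": [0.2, 0.8, 0.1],
}

def association(target, attr_a, attr_b):
    # Positive score: the target word sits closer to attr_a than to attr_b.
    return cosine(vectors[target], vectors[attr_a]) - cosine(vectors[target], vectors[attr_b])

print(association("woman", "arts", "maths"))  # positive: leans towards 'arts'
```

With vectors like these, 'woman' scores positive (closer to 'arts') and 'man' negative – which is, in miniature, the pattern the Princeton team reported finding in the real embeddings.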

For them, it’s a worry that companies may screen job applicants using software that has come to ‘imbibe cultural stereotypes’. But this denudes sexism and racism of their real meaning. For them, these phenomena have no ‘explicit’ roots in the structures, economy, policy or institutions of the US – in labour utilisation or immigration control, for example. ‘Before providing an explicit or institutional explanation’ for why individuals make ‘prejudiced decisions’, the authors intone, ‘one must show that it was not a simple outcome of unthinking reproduction of statistical regularities absorbed with language. Similarly, before positing complex models for how stereotyped attitudes perpetuate… we must check whether simply learning language is sufficient to explain (some of) the observed transmission of prejudice.’

In other words, these experts believe sexism and racism are simply a result of unconscious – ‘implicit’ – prejudice. This is a common view, which underpins anti-bias training in universities and workplaces. But it is highly problematic: it trivialises oppression where it does still exist. From this perspective, progress lies not in taking on social structures that hold certain people back, but in improving our language and continually ‘updating’ AI in line with this.

But the way in which implicit bias is measured in both humans and machines is highly questionable. The Implicit Association Test (IAT), currently used to measure a subject’s sexism, racism and much else besides, was born out of the work of Anthony Greenwald and Mahzarin Banaji. In 1995, they argued that attitudes and stereotypes were activated unconsciously and automatically. And in 1998, Greenwald and two of his colleagues at the University of Washington put it to the test. Using just 17 Korean-American students, they found that the group took longer to respond favourably to a Japanese name than to a Korean name. The IAT is, to put it plainly, incredibly reductionist.

The Princeton scholars used the IAT to measure the prejudice of AI. Indeed, they believe their results ‘add to the credence of the IAT by replicating its results in such a different setting’ – namely, that of clever bots.

Might biased AI be a problem in future? Perhaps. But biased psychobabble is a real danger in the present.

James Woudhuysen is visiting professor of forecasting and innovation at London South Bank University. He is also editor of Big Potatoes: the London Manifesto for Innovation.
