In defence of rational man
Behavioural economics adds a pseudo-scientific sheen to anti-democracy.
An individual’s stance on politics largely depends on his view of the capacity of humans to exercise reason. It is no exaggeration to say that the ideas of freedom and democracy are premised on the notion that people can act rationally. Think of any political decision of any importance. Should Britain be in the European Union? Who should be president of the United States? What economic policies would serve the public best? The case for opening these questions up to public debate, and putting them to the vote, is premised on the notion of human rationality.
Of course, even today, anti-democrats are generally a little wary of attacking democracy or human rationality directly. Instead they typically work on the implicit assumption that only the technocratic elite is fully capable of rational decision-making. The rest of us are seen as impaired in our ability to act rationally. They think we are a bit stupid but they are often guarded about saying so in public.
This is where the field of ‘behavioural economics’ comes in. Contrary to the label, it has little to do with economics – at least in the conventional sense of the study of how production and consumption are organised. It is better seen as an ostensibly scientific assault on the rational capacity of human beings. The main reason ‘economics’ is included as part of the name is that the discipline is conventionally seen as embodying rationality. Mainstream economics is presented by the critics as exemplifying what they see as the erroneous assumption that people tend to behave rationally.
Behavioural economics goes far beyond discussions of how to understand the material world. It is premised on the idea that the psychological evidence shows that humans have severe cognitive limitations. As a result, it assumes that people are prone to making serious mistakes.
Michael Lewis’s The Undoing Project provides a useful account of how these ideas developed. His great gift is to outline difficult ideas in an accessible way. Many of his books, often on relatively obscure financial topics, have become huge bestsellers. Some of them, such as Moneyball and The Big Short, have also provided the basis for Hollywood blockbusters.
The unlikely precursors to what has since become known as behavioural economics were two Israeli psychologists who started work in the 1950s. Daniel Kahneman, who was to win the Nobel Prize for economics in 2002, spent most of his childhood years in France. He probably only escaped death at the hands of the Nazis because his father worked as a chemist for L’Oreal, the giant French cosmetics company. Kahneman senior was allowed to live as his work was deemed useful to the war effort.
Amos Tversky, Kahneman’s professional collaborator, was born in what was then British mandate Palestine but later became the state of Israel. He was far more of an extrovert than Kahneman, but they shared a common interest in human psychology. Tversky died in 1996 so, although he was well known in professional circles by then, he did not enjoy the huge public recognition Kahneman has had in recent years.
Although Michael Lewis does not labour the point, he does a good job of describing the peculiar circumstances in which Kahneman and Tversky developed their ideas. They were working shortly after the Holocaust and in the midst of a nation that was in a permanent state of war. In such circumstances it is not surprising they developed a keen interest in how people think and that their conclusions were often negative ones.
A large part of Kahneman and Tversky’s working method was to shut their office door behind them before obsessively debating psychological phenomena. In Lewis’s telling they were as close as, if not closer than, a married couple. Another important part of their work was quizzing people on various puzzles they posed to them. The main conclusion they reached was that people have systematic cognitive biases. They don’t just make mistakes; their errors are also predictable and often point in a particular direction.
Their first big idea became known as the availability heuristic. This is essentially a mental shortcut that relies on the examples that come most readily to mind. For example, they asked a group of students in Oregon about the frequency of letters in English words (excluding words with fewer than three letters). Typically the students said they thought the letter ‘k’ was twice as likely to appear as the first letter of an English word as the third letter. But in reality the opposite is true: ‘k’ appears roughly twice as often in the third position as in the first.
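The comparison behind the experiment can be sketched in a few lines of code. Note that the word list below is a toy sample chosen for illustration, not Kahneman and Tversky’s actual corpus; the point is only to show what is being counted.

```python
# Illustrative sketch (toy word list, not the original corpus): count how
# often 'k' appears as the first versus the third letter of each word,
# ignoring words shorter than three letters, as in the original question.
words = [
    "make", "take", "like", "bike", "lake", "cake", "joke", "wake",
    "ask", "work", "know", "keep", "back", "week", "book", "think",
]

first = sum(1 for w in words if len(w) >= 3 and w[0] == "k")
third = sum(1 for w in words if len(w) >= 3 and w[2] == "k")

print(first, third)  # prints: 2 9
```

In this (deliberately chosen) sample, third-position ‘k’ outnumbers first-position ‘k’, mirroring the pattern in real English text. The availability claim is that words like ‘know’ come to mind more easily than words like ‘make’, so people’s frequency estimates run the other way.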
Kahneman and Tversky proposed that people’s memories were skewed because it was easier to remember words with k as the first letter than as the third letter. The mental shortcuts people used to make everyday decisions were prone to error in such circumstances.
Another example is known as the anchoring and adjustment heuristic. Kahneman and Tversky took two groups of high-school students and asked them to estimate the answer to a maths question within five seconds. The first group was asked to estimate: 8x7x6x5x4x3x2x1. The second group was asked to estimate: 1x2x3x4x5x6x7x8. A moment’s reflection should indicate the answers to the two questions are identical (40,320). But the first group’s median answer was 2,250 and the second group’s median answer was 512. The first group had used eight as a mental starting point and the second group had used one.
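The arithmetic behind the anchoring example is easy to verify: both sequences are the same product, 8! = 40,320, yet the reported median guesses differed by a factor of more than four depending on which number came first.

```python
# Checking the arithmetic in the anchoring experiment: multiplying the
# numbers 1..8 gives the same result in either order, 8! = 40,320.
from functools import reduce
from operator import mul

descending = [8, 7, 6, 5, 4, 3, 2, 1]  # as shown to the first group
ascending = descending[::-1]            # as shown to the second group

product_desc = reduce(mul, descending)
product_asc = reduce(mul, ascending)
assert product_desc == product_asc == 40320

# Median five-second estimates reported for the two groups:
median_desc_estimate = 2250  # group shown 8x7x6x5x4x3x2x1
median_asc_estimate = 512    # group shown 1x2x3x4x5x6x7x8

# Both groups underestimated badly, and the group anchored on the
# larger starting number guessed more than four times higher.
print(product_desc)  # prints: 40320
```

Both medians fall far short of the true answer, which is itself a second, separate bias: people adjust insufficiently from whatever anchor they start with.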
At first sight, such examples might appear a world away from politics, but that depends on what kinds of conclusions are drawn from them. There is nothing inherently wrong with investigating cognitive biases of this type. On the contrary, it is a perfectly reasonable sphere for psychologists to investigate. But when such cases are used to give scientific credence to the idea that humans are irrational, that is a different matter.
Even Kahneman, who, unlike some of his colleagues, dislikes describing human behaviour as irrational, is prone to such extrapolation. For instance, in Thinking, Fast and Slow (2011), his international bestseller, he uses the behaviour of taxi drivers to illustrate his point (he also used the example in a BBC Horizon programme). He says taxi drivers typically make the mistake of giving themselves a daily earnings target. As a result, they work for too long when it is sunny (when earnings are typically lower) and not long enough when it is rainy (when earnings are typically higher).
But it is always best to be wary when experts make sweeping statements about how ordinary people run their everyday lives. There can be all sorts of reasons people act in the way they do. Taxi drivers, for example, might have weekly bills to pay whatever the weather. And they may not have savings that can see them through lean periods. Many drivers reject the premise that earnings are better during rainy times (I know because I have been asking them whenever I have had the opportunity). Some claim that passengers take shorter journeys when it is raining, and that people go out less in the rain than when it is sunny.
Human behaviour tends to be far more complex than the simple lab experiments described above. Self-proclaimed experts are much too quick to assume that people are acting against their own interests.
This is even more the case in complicated political decisions such as Brexit. What matters to technocrats and technocratically inclined politicians – in the case of the European Union (EU) the ability to make key decisions away from public scrutiny – is not the same as what matters to most people. When fans of the EU describe Brexit supporters as irrational in such circumstances, it is essentially a form of defamation. In reality, those voting Brexit just used different – although perfectly rational – criteria when deciding how to vote in the referendum. Such disagreements on principle should constitute the essence of political debate.
In any case, the claim that a premise of rationality assumes that people must be perfect calculating machines is absurd. No serious political thinker would argue that anyone is immune to error. The key point is that people are capable of understanding and articulating their interests in rational terms.
Political institutions should be organised in such a way that they facilitate rational decision-making. That means encouraging free and frank debate. Such discussion allows people to reflect on their decisions in a serious way. They also help people to consider what impact decisions might have on society as a whole, rather than just seeing them in narrow personal terms.
The weakness of The Undoing Project is that it fails to see behavioural economics in its broader political context. Elite disdain for the rational capacities of the public goes back at least as far as Plato’s Republic. Behavioural economics has gained such popularity in establishment circles because it gives a pseudo-scientific sheen to anti-democratic ideas.
The Undoing Project: A Friendship that Changed the World, by Michael Lewis, is published by Allen Lane.