Let’s fight back against the new Model Army

Like voodoo forecasts, computer models of climate change are being used to stifle political discussion and resign man to his Fate.

Various Authors

Everybody models nowadays. Nothing gives a new forecast, policy or strategy more weight than knowing that it is, in some way, the product of a computer model. The world’s top capitalists use models to perform ‘what if?’ exercises on crises and disasters, and to simulate future business growth. Governments bow down to models on all sorts of issues: right now, Britain’s chief scientist is modelling the future of UK obesity. Yet computer models are not all they’re cracked up to be. They remain based on a host of untested assumptions; worse, they tend to reduce human beings simply to the role of passive victims – helpless spectators in front of unfolding Great Events.

In no other arena has modelling gained so much kudos as in climate change. Back in 2005, spiked forecast that forecasting itself was due for a boom, because of the growing sense of uncertainty suffered by government and business (see All eyes on the future, by James Woudhuysen). This prediction has proved right, but the trend has been reinforced by the way in which models of climate change have become the gold standard upon which all decision-making must be based.

The rise of models has coincided with the evaporation of the concept of human agency, of human beings consciously gaining and applying new insights through struggle. While we’re supposed to realise that climate change demands the most profound spiritual and lifestyle revolution for each and every person on the planet, in computer models of the future we are consigned to a fate that is pretty much pre-ordained. Such a view demeans the capabilities of people, distorts policy, and is also simply unrealistic. In the real world, human beings do not wait for things just to happen to them; we react, adapt and innovate around problems as they arise.

Just a third of a century ago, when politics actually meant something, highly regarded analysts derided vapid computerisations of the future. When the Club of Rome published its epoch-making bestseller The Limits to Growth in 1972, reaction was sharp. Christopher Freeman, then director of the Science Policy Research Unit (SPRU) at the University of Sussex and doyen of the world’s technology policy gurus, satirised the Club’s approach as ‘Malthus with a computer’. The Fall of Man into a world of depleted resources, it was felt, could not be verified by the movement of electrons and punched cards around an IBM mainframe (1).
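Freeman’s jibe is easy to reconstruct. A toy version of the Limits to Growth logic – exponential demand drawn down from a fixed resource stock – shows how the date of ‘collapse’ is fixed entirely by the assumed growth rate. The figures below are invented for illustration; this is a sketch of the method, not the Club of Rome’s actual World3 model:

```python
# A toy 'Malthus with a computer': exponential demand drawn from a fixed
# resource stock. The stock, demand and growth rates are invented numbers,
# chosen purely to illustrate the method.

def years_until_depletion(stock, demand, growth_rate):
    """Count the years before cumulative demand exhausts the stock."""
    year = 0
    while stock > 0:
        stock -= demand
        demand *= 1 + growth_rate   # the exponential-growth assumption
        year += 1
    return year

# The 'forecast' is entirely determined by the assumed growth rate:
for rate in (0.01, 0.03, 0.05):
    print(rate, years_until_depletion(stock=1000, demand=10, growth_rate=rate))
```

Change the assumed rate and the forecast of doom moves by decades; the computer adds nothing that was not already in the Malthusian premise.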

Things have changed. In October 2006, when Sir Nicholas Stern published his 700-page UK Treasury report on the economics of climate change, he referred more than 500 times to models of climate change and its monetary cost; models of hydrology, crop growth, risk and uncertainty; and models of innovation, technology and energy. Yet rapture, not criticism, was the main reaction to his argument (2).

Why have models taken on such importance in policymaking today? Whatever happened to the healthy scepticism that accompanied the portentous conclusions of models in the past?

During the 1950s and 1960s, the Pentagon notoriously corralled emerging, mathematics-based disciplines – cybernetics, game theory – into the cause of the Cold War. Particularly after the development of the integrated circuit in 1958, computers were also used, in practice and in propaganda, to lend a veneer of respectability to the campaign. Since those years, the prestige of IT has grown. Today’s Unbearable Rightness of Foreseeing, then, is the product both of climate catastrophism and of revived chutzpah on the part of those promoting IT.

Even before the dot.com boom of the late 1990s, Shoshana Zuboff’s seminal In the Age of the Smart Machine claimed that IT didn’t just automate industrial processes, but gave rise to new insights ‘into functional relationships, states, conditions, trends, likely developments and underlying causes’ (3). Today, with the rise of the supposedly all-conquering technologies of Web 2.0 and their dramatic use in Barack Obama’s campaign for the Democratic Party nomination, only a few detractors are on hand to sniff about IT’s deficiencies.

As solutions to the problem of seeing into the future and providing a vision for it, IT-based methods have gained in credibility. At the same time, the past 20 years have also seen the decline of politics as a vehicle for change. In place of the clashing ideas of left and right have come neutralising, anaesthetic diagnoses and cures. We had think tanks and audits in the 1980s, going on to balanced scorecards and Key Performance Indicators in the 1990s. Until the demise of his government-by-sofa, former UK prime minister Tony Blair had Lord Birt, ex-boss of the BBC and all-round management Dalek, to perform ‘blue skies’ thinking for him on everything from transport to the prison system.

Alongside this managerial approach to every political issue, a New Scientism has sprung up in relation to global warming, converting questions of economic and technological development into matters of physics or climatology – perfect for number-crunching modellers. The main thing about this approach is that it looks hip, modern, cool and unanswerable. But if more people now turn to expose the emperor’s new clothes, the profound fatalism that informs the modeller’s prognoses about the future will finally come out.

For proof, let’s look at the two most recent summaries for policymakers produced by the Intergovernmental Panel on Climate Change (IPCC). In a previous article, we took up the ideas in the summary produced by IPCC Working Group I, Climate Change 2007: The Physical Science Basis (see A man-made morality tale, by James Woudhuysen and Joe Kaplinsky). Now whatever its faults, this summary at least confined itself to the physical science of climate change. But the IPCC’s subsequently published Working Group II summary, on the impacts of climate change, adapting to it, and mankind’s vulnerability to it, is a very odd document. So is the Working Group III summary on how to mitigate climate change. Each of these summaries is, in fact, an eclectic mish-mash of monolithic computer simulations. Each uses computer models to predict social phenomena – developments quite different from those covered by climatology (4).

The Working Group II summary runs to 22 pages. Interestingly enough, however, it does not get around to how human beings might respond to the impacts of climate change until page 17. Most of the summary is devoted to repeating the forecasts of climate science. We learn that sea defences might be a good idea; but then again ‘altered food and recreational choices’, ‘altered farm practices’ and more regulation are also part of the IPCC’s oh-so-scientific approach.

The bad news, according to Working Group II, is that climate change itself can slow progress toward sustainable development. We learn, too, that there are ‘formidable environmental, economic, informational, social, attitudinal and behavioural barriers to implementation of adaptation’; indeed for developing countries, a principal barrier blocking adaptation to climate change is – wait for it! – the fact that they have yet to build the capacity to adjust to climate change. How brilliant is that?

The Working Group II report puts forward six Emissions Scenarios, each describing a different possible world of the future. The ‘A2 storyline and scenario family’, for example, projects rapid population growth resulting in problems of food supply, coastal flooding and water scarcity for particularly large numbers of people. Here, more population means that the effects of global warming will hit more people.

These kinds of banalities have nothing to do with climate science. They are waves of the arm about the economics, psychology and fertility of the future. Broadly, the suggestion is that there is little that the world can do to adapt to climate change.

A similar insouciance marks the Working Group III summary on mitigation, which runs to 35 pages. Economic and political assumptions are there, yet precisely what these are is never made clear, even in the fuller versions of the reports available to date. For example, we are reassured to learn that, by 2030, average CO2 emissions in the Third World are projected to remain substantially lower (2.8-5.1 tonnes of CO2 per head) than those in First World regions (9.6-15.1 tonnes). But where is the natural science in that ‘projection’?
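There is none, of course: such a projection is just multiplication applied to assumed inputs. A minimal sketch makes the point – the per-head figures are the ranges quoted above, while the 2030 population figures below are invented purely for illustration:

```python
# Total emissions = population x emissions per head: the 'projection' is
# just this arithmetic applied to assumed inputs. The per-head figures are
# the ranges quoted in the text; the 2030 population figures are invented
# here for illustration only.

def total_gt_co2(population_bn, tonnes_per_head):
    """Total emissions in gigatonnes of CO2 (billions of people x tonnes each)."""
    return population_bn * tonnes_per_head

# Vary only the assumed Third World population, holding per-head emissions
# at the quoted low and high ends of the range:
for pop_bn in (6.5, 7.5, 8.5):
    low = total_gt_co2(pop_bn, 2.8)
    high = total_gt_co2(pop_bn, 5.1)
    print(f"{pop_bn}bn people: {low:.1f}-{high:.1f} GtCO2")
```

Shift the demographic assumption and the total shifts in proportion. The ‘projection’ is demography and economics all the way down, not physics.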

Working Group III is adamant that adjustments to its projections – changing its assumptions about population, say, or using market exchange rates rather than purchasing power parities to compare the GDPs of different countries – conveniently make no difference to the level of greenhouse gas emissions it projects for 2030. On the other hand, we are made to understand that most models of mitigation assume ‘universal emissions trading… transparent markets, no transaction costs, and thus perfect implementation of mitigation measures throughout the twenty-first century’. These are quite extraordinary assumptions to make.

To conclude, about the only time the IPCC’s Working Group III admits the case for human agency is when, on page 16, it acknowledges that the macroeconomic costs of mitigation might be lower if the human species were to engage in technological change. But it is quick to admonish: ‘However, this may require higher upfront investment in order to achieve costs reductions (sic) thereafter.’

Instead of raising technology to a higher level, the IPCC seems to prefer that motorists adopt what it calls an ‘efficient driving style’. So while technologies to save the planet are held to be a bit expensive, we’re told that changes in lifestyle and behaviour, by contrast, can mitigate climate change ‘across all sectors’.

Such a view fits in nicely with the low horizons of modern politics. With the IPCC, the modern computer modeller’s work is complete. The conclusions are already there in the premises; but their presentation as the product of cold, logical number-crunching ensures that this work will brook no counter-argument.

But there is a counter-argument. We can uphold humanity’s talent for taking the future into its own hands. And we can mount our own, humanistic critique of voodoo forecasts. Computer models of the future are both products and producers of political muddle. It’s time they were held up to the light, then given the searing interrogation they deserve.

James Woudhuysen is professor of forecasting and innovation at De Montfort University, Leicester.

Joe Kaplinsky is a science writer.

Previously on spiked

Woudhuysen and Kaplinsky suggested that the IPCC’s fourth report was a man-made morality tale. John Brignell argued that anything today could be blamed on climate change. Tony Gilland revealed the political roots of the IPCC. Or read more at spiked issue Environment.

(1) See The Limits to Growth, Donella H Meadows, Dennis L Meadows, Jørgen Randers and William W Behrens, New American Library, 1972; and ‘Malthus with a computer’, Christopher Freeman, in Thinking About the Future: a critique of the limits to growth, HSD Cole, Christopher Freeman, Marie Jahoda and KLR Pavitt (eds), Chatto & Windus for Sussex University Press, 1973

(2) Stern Review on the Economics of Climate Change, HM Treasury, October 2006

(3) In the Age of the Smart Machine, Shoshana Zuboff, Heinemann, 1988

(4) Climate Change 2007 – Impacts, Adaptation and Vulnerability: Summary for Policymakers, 13 April 2007; Climate Change 2007 – Mitigation of Climate Change: Summary for Policymakers, 4 May 2007