spiked-risk debates

Sponsored by the Wellcome Trust

(This debate is closed and is a read-only archive.)
Are we too risk averse?
Taking sides
'In most instances it is the society that has to face the consequences of the hazards, and the industry will take most of the benefits.'
Arpad Pusztai
Norwegian Institute of Gene Ecology
I was struck by a number of interesting points about this spiked-debate so far. The first and most startling was that many respondents were not clear about the definition of hazards, risks and risk assessment - which was not helped by the red herring of fear thrown in by people such as John Ryan, director of the bionanotechnology IRC at Oxford University.

In fact, most scientists and technologists know what the hazards are for a particular new product. In many instances these can be determined and even measured. However, the probability of these hazards happening is more difficult to forecast, particularly when probability in most instances is based on assumptions about which people can sharply differ. In addition, only some of the many possible consequences of these hazards can be reliably assessed, and so there will be widely differing views. It is quite difficult to predict the unknown, particularly because this depends on whose crystal ball it is being viewed through. It appears that what industry may regard as an unnecessary hindrance to progress, others in society may view as an important safety measure.

A good example of the confusion can be found in Helene Guldberg's lead piece, when she states that 'science is by its very nature risky: experimentation involves investigating the unknown'. This can be misunderstood, because it is not the experimentation (usually done within the confines of a laboratory) that is risky, but the unforeseen and possibly dangerous consequences of transferring research results into practical applications. Quite rightly, governments have enacted laws regulating and controlling this transfer, particularly when the new products or technologies could have major, potentially harmful effects on public health and the environment.

When Guldberg describes all the wonderful inventions, such as antibiotics and in vitro fertilisation, which, according to her, could never have come about if we had not taken risks, she misses one crucial point: the law stipulates that these inventions first had to be rigorously tested, in animal and then in human clinical trials, before most people would ever hear about them. Sean Davidson makes a good point about the opposite case, when breakthroughs in fact backfired.

Although zero risk does not exist, as we are constantly and painfully reminded by cases such as the thalidomide catastrophe, this should not absolve us from doing everything possible to eliminate or minimise risk. People are in fact less risk averse when the assessment of a new product is carried out by a well-trusted public regulatory body with a good track record, such as the Food and Drug Administration (FDA) in the USA. People are quite ready to accept even drugs produced by recombinant genetic technology, such as insulin, because it is known that these first had to be subjected to human clinical trials. In contrast, people have a negative attitude to GM foods because they know that, unlike with new drugs, the FDA does not require rigorous testing of them.

Unfortunately, while in most instances it is the society that has to face the consequences of the hazards, the industry will take most of the benefits. In this light it was interesting that those contributing to this spiked-debate from industry were much keener on the notion that, for progress's sake, 'we' ought to take more risks, while others from the general public and from the academic community, particularly the social sciences, were less convinced.

Another major point missed by some respondents was connected with the question of trust in the regulators and regulation. It serves no purpose - and may even be counterproductive - for the industry to lament the irrational and emotional responses from the public or even from 'a nervous and insecure elite', if this is meant to replace the proper but costly testing of new products with bland assurances about safety. Based on previous experiences of disasters, people nowadays have a sophisticated understanding of what the risks are. Condescending and patronising phrases from the industrial and political establishments, such as that according to 'sound scientific evidence' people have nothing to fear, will not allay public fears.

The public understands perfectly that lack of evidence of harm is no evidence of safety, particularly when people see no sign of any intention to carry out proper, transparent safety studies on new products, with results confirmed by independent scientists. The contrast with the past, when publicly funded research institutes were regarded as public watchdogs, is quite revealing. When, in the 1960s, a new director took over the scientific direction of the Norwegian Institute of Gene Ecology, his first words to his staff were: we are not to accept any direct money from industry, so as not to jeopardise our independence in the eyes of the public. Trust is a very special commodity and scientists must earn it; it cannot be bought by expensive advertisements.

People are more likely to accept risks when they feel that they are in the driving seat. Forcing them to accept something that they do not want is futile, as the GM food fiasco clearly shows. It is an open secret that even now, about 10 years after the introduction of GM, there are only about a dozen or so safety studies published in peer-reviewed science journals. Exhortations by the industry and labelling sceptical scientists as Luddites will not change that. Tony Juniper from Friends of the Earth makes a good point when he asks the question: who decides what risk is acceptable for whom?

It was also quite revealing that even someone like Gill Samuels, the science policy supremo of Pfizer, misunderstood the application of the precautionary principle in therapeutic decision-making. When it is explained to them, most people will understand what risks are inherent in the medical treatment proposed for them, even when the outcome is uncertain. The point is that the risk will be taken by the patient and not by the doctor, and when the patient trusts the doctor, he or she is more likely to make the decision to face the risk.

Julian Little from Bayer CropScience UK argues for weighting the evidence to be put to the public. This patronising attitude is not likely to get him far in persuading people to take risks because they, quite rightly in my opinion, will ask the obvious question: who is going to do the weighting? I have a feeling that in his opinion this ought to be done by the industry's representatives because they know the facts.

I tend to agree with Adrian Holme that there is no simple answer to the question of whether we are more or less risk averse than previous societies. The final conclusion must be that our answer depends mainly on which side of society we come from. For industry, the best solution would be a public that implicitly trusts it and willingly takes all the risks inherent in our technologically speeded-up age. On the other side, there will always be a minority of rigidly conservative-minded people who object to practically any innovation, but in a democratic society they may not be decisive.

However, if and when industry and its regulators behave honestly and openly, and do not just appear to go through the motions of establishing which are the best and safest products and technologies that truly advance people's interests and wellbeing, they will find that most people are willing to take risks for the common good. As someone who has always been interested in history, I personally do not believe that our society is now more risk averse than any that preceded it.

Árpád Pusztai works at GenØk, the Norwegian Institute of Gene Ecology.
