
Evidence, oversight and regulation – not to be messed with?

Updated Tuesday, 5th March 2013

What have we learnt from the organisational and managerial failures of the Space Shuttle Challenger disaster?


[Image: Ice on the launch tower hours before the Challenger launch. Credit: NASA, under Creative Commons licence]

'What is science? Science is a way to teach how something gets to be known. In as much as anything can be known, because nothing is known absolutely. It’s how to handle doubt and uncertainty. Science teaches us what the rules of evidence are. We mess with that at our peril.'

These words are spoken by the actor William Hurt (playing Richard Feynman) to a lecture hall of students in the drama The Challenger – intercut with footage of the ill-fated launch of the space shuttle Challenger. 

Whether or not they are entirely attributable to Feynman is not something I’ve checked. But that’s not particularly important, because they certainly capture Feynman’s approach to science. We witness this through the role he plays in the Rogers Commission, set up to investigate the Challenger disaster.

As the dramatised events of the film illustrate – and as the writer of The Challenger no doubt intended – Feynman’s opening remarks take on a particular power and significance as the work of the Commission and the story behind the Challenger accident unfold.

'Safety culture'  

Science and engineering clearly occupy centre stage when investigating and explaining a key failure (the O-rings) that led to the Challenger accident. But there were many other contributory factors that lay outside the normally accepted realms of engineering and science.

Some of the key issues were organisational and managerial in nature. We could also refer to them as institutional and systemic. That is, they were part and parcel of NASA as an organisation and were embedded in the systems, practices and culture of the organisation and its relationship with the wider environment in which it operated – especially with Morton Thiokol, the contractor responsible for the O-ring design and application.

Feynman in particular was highly critical of NASA’s 'safety culture'. Indeed, he produced his appendix to the Rogers Commission report because he believed this particular aspect of the disaster had not been sufficiently recognised by the Commission.

The Challenger drama rightly focuses on how Feynman’s approach to science plays out in this regard: how NASA dealt with doubt and uncertainty – and thus risk – and the resulting estimates of the probability of failure. We are left in no doubt that ‘we mess with the evidence at our peril’.
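As an aside, Feynman’s appendix famously contrasted management’s estimate of the probability of losing a shuttle (around 1 in 100,000 per flight) with working engineers’ estimates closer to 1 in 100. Here is a purely illustrative sketch of how those two figures compound over a programme of 100 flights; the numbers are the quoted estimates, not a model of the actual shuttle programme:

```python
# Purely illustrative: how a per-launch failure probability compounds
# over a programme of many flights. The two figures are the contrasting
# estimates quoted in Feynman's appendix to the Rogers Commission report.

def prob_at_least_one_failure(p_per_launch: float, n_launches: int) -> float:
    """Probability of at least one failure in n independent launches."""
    return 1.0 - (1.0 - p_per_launch) ** n_launches

for label, p in [("management (1 in 100,000)", 1e-5),
                 ("engineers (1 in 100)", 1e-2)]:
    risk = prob_at_least_one_failure(p, 100)
    print(f"{label}: ~{risk:.1%} chance of losing a vehicle in 100 flights")

# management (1 in 100,000): ~0.1% chance of losing a vehicle in 100 flights
# engineers (1 in 100): ~63.4% chance of losing a vehicle in 100 flights
```

The assumption of independent flights is of course a simplification, but it makes the point: a small disagreement about per-flight risk becomes an enormous disagreement about programme-level risk.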

Pressure to 'deliver'

A lot of time, money and effort clearly went into investigating the Challenger disaster. And yet in 2003 the Columbia space shuttle was lost, killing all seven crew members. Sadly, the investigation that followed once again focused on attitudes to safety within NASA, flawed decision-making and institutional failure, concluding that too few lessons had been learnt from the Challenger disaster.

It is, of course, highly improbable that NASA’s staff and management intended that their actions would lead to the deaths of more than a dozen people. Nevertheless, that was the outcome.

To my mind, one of the key underlying reasons for this – and one largely ignored by the inquiries – is the extent to which NASA personnel – and, in the case of Challenger, Morton Thiokol personnel too – were under intense pressure to 'deliver'.

By this I mean getting something done, or producing some output or result, by a particular time, because failure to do so would almost certainly have serious consequences for their organisation.

In short, people were responding to a real or perceived 'threat' which had organisational – and therefore, by definition, personal – ramifications (e.g. loss of job, income, etc.). In the case of Morton Thiokol it was the potential loss of a very large contract. In the case of NASA it appears to have been the potential loss of the space shuttle programme.

Failures, oversight and regulation

Faced with such threats, many people – probably the vast majority – inevitably feel they have little choice but to respond in a certain way: 'cutting corners', 'turning a blind eye', 'looking the other way' and 'covering up' are all behaviours that many of us will be familiar with, even if not directly.

For reasons that are pretty obvious, this type of behaviour and the risks and failures that inevitably go with it are found most frequently where large sums of money are at stake: a government IT system that fails to operate as planned and is years behind schedule; a new aircraft that’s significantly over budget and behind its delivery schedule; an oil rig that fails to start producing when expected; railway repair and safety procedures that aren’t carried out properly; and NHS hospitals where the death rate is higher than the accepted norm. 

In all these cases, and many more, important aspects of the findings of the inquiries into the Challenger and Columbia disasters still have currency today. The one I’d highlight here is the crucial importance of truly independent and well-resourced systems for regulation and oversight. This applies to almost any field of human activity, as recent and not-so-recent events in areas as diverse as banking and cycling have demonstrated.

Unfortunately, in the UK and elsewhere this course of action is anathema to many people, who instead reduce all such suggestions to arguments about 'too much red tape', or something similar. And yet the evidence of many costly events – in money, lives and livelihoods – stretching back over many, many decades is that independent regulation and oversight, combined with transparency and openness, provide essential protection against potentially catastrophic and costly events such as Challenger. As Feynman might have said, 'we mess with that evidence at our peril'.

For even more... 

  • Take our quiz, based on a fictional investigation that is inspired by The Challenger drama, to find out how you would deal with a number of difficult scenarios in the world of technology management.
  • Try our free course extract on Evaluating Technology 
 
