
Engineering: The nature of problems

7.3 Ethics and safety

A practising engineer makes ethical decisions, with moral and physical implications of varying magnitudes, on a daily basis. Examples of ethical dilemmas are limitless: they range from the engineer who takes home the odd pen, file or discarded paper 'for the children', to the engineer who signs off a project without checking the details, and so fails to spot a simple order-of-magnitude arithmetic error. The implications of either may be negligible – the cost of the pen is more than compensated for in unpaid overtime, or the error accidentally increases the factor of safety – or catastrophic – the discarded piece of paper carries sensitive industrial information that ends up with a major competitor, or the arithmetic error decreases the factor of safety and a component fails in use at the cost of human life. On the occasions when the ramifications of our decisions are not apparent to anyone else, ethics are a matter of personal conscience. When the ripples of our actions spread out and cause damage or injury, however, we are legally responsible for the result. Very often, the difference between the two is a matter of luck.
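The factor of safety mentioned above is simply the ratio of what a component can withstand to the load it is expected to carry, so even a misplaced decimal point propagates straight into it. A minimal sketch of that arithmetic (the values are hypothetical, not from the course):

```python
def factor_of_safety(capacity_kn: float, load_kn: float) -> float:
    """Factor of safety = capacity / expected service load."""
    return capacity_kn / load_kn

# Hypothetical figures: a component rated to 500 kN, expected to carry 200 kN.
capacity = 500.0
true_load = 200.0

correct = factor_of_safety(capacity, true_load)           # 2.5
# A slip that understates the load tenfold makes the design look
# ten times safer on paper than it really is...
understated = factor_of_safety(capacity, true_load / 10)  # 25.0
# ...while the opposite slip flags an apparent failure, so the design
# is beefed up and the real margin accidentally increases.
overstated = factor_of_safety(capacity, true_load * 10)   # 0.25
```

The point is not the arithmetic itself but that a single unchecked digit shifts the result by an order of magnitude in either direction.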

The very nature of engineering implies that safety must be a primary issue. Even the most remote of robots will have some human interface somewhere along the line, and most engineering design, whether industrial or domestic, requires direct contact at one or more levels. Ethics and safety are often closely interwoven – our responsibility for safety in design is as much moral as it is professional – and there are safety practices to be observed at every stage of the design process.

Much of what we know now has been learned from bitter experience but, remarkably, evidence suggests that we are still inclined to become complacent over long periods of technological success, leading us to narrower margins of safety and, ultimately, repeated disaster. Consider what you know of the most publicised engineering disasters of the last century, and how safety was compromised in each case. Often these great catastrophes result from some very minor error, and not from the billion-to-one technological misfortune we might prefer to believe – see Table 6.

Table 6 Causes of some notable engineering disasters
Date | Disaster | Fundamental cause
2000 | Concorde: fire and crash, shortly after take-off (113 dead) | Debris on the runway, and a fuel tank susceptible to damage from it
1986 | Chernobyl: meltdown of nuclear reactor core, and large-scale radioactive contamination | Safety procedures ignored, and design flaws
1986 | Challenger Space Shuttle: exploded 73 seconds into flight (7 dead) | Design flaw in the O-ring seals on the booster engines
1981 | Hyatt Regency Hotel: suspended walkway collapsed over a dance floor (114 dead) | Design change and failure to anticipate overload
1979 | Three Mile Island: 51 per cent meltdown of nuclear reactor core | Incorrect procedures
1940 | Tacoma Narrows Bridge: bridge collapsed | Unexpected wind-induced vibrations

The study summarised in Table 7 investigated 800 cases of structural failure, worth millions of pounds in total, in which 504 people died and 592 were injured. Where engineers were to blame, the study categorised the causes of failure (and hence the breaches in safety).

Table 7 Causes of failure, where engineers were to blame*
Insufficient knowledge | 68%
Underestimation of influence | 16%
Ignorance, carelessness or negligence | 14%
Forgetfulness, error | 13%
Relying on others without sufficient control | 9%
Objectively unknown situation | 7%
Imprecise definition of responsibilities | 1%
Choice of bad quality | 1%
Other | 3%
Source: Swiss Federal Institute of Technology

*Note that the percentages add up to more than 100 – some failures were attributed to more than one cause.
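That note can be checked directly: summing the Table 7 percentages gives 132, confirming that a number of failures carried more than one attributed cause. A quick sketch using the figures above:

```python
# Percentages from Table 7 (causes of failure where engineers were to blame).
causes = {
    "Insufficient knowledge": 68,
    "Underestimation of influence": 16,
    "Ignorance, carelessness or negligence": 14,
    "Forgetfulness, error": 13,
    "Relying on others without sufficient control": 9,
    "Objectively unknown situation": 7,
    "Imprecise definition of responsibilities": 1,
    "Choice of bad quality": 1,
    "Other": 3,
}

total = sum(causes.values())  # 132 – exceeds 100, as the note explains
```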

You can see that in fully 68 per cent of cases, 'insufficient knowledge' on the part of the engineers was a contributing factor. Again, this has to be an ethics issue – can we really accept that all these engineers were so lacking in self-awareness that they genuinely believed in their own abilities, or were some of them simply not brave enough to admit they were out of their depth at the time? The lesson is clear. You do not need to store everything you study in a photographic memory, but it is essential to remember that, as a professional engineer, you are accountable for your actions; and this includes recognising when you need to bring in expertise from a colleague or from external sources.