7.3 Ethics and safety
A practising engineer makes ethical decisions, with moral and physical implications of varying magnitudes, on a daily basis. Examples of ethical dilemmas are limitless, ranging from the engineer who takes home the odd pen, file or sheet of discarded paper 'for the children', to the engineer who signs off a project without checking the details and so misses a simple order-of-magnitude arithmetic error. The implications of either may be negligible – the cost of the pen is more than compensated for by unpaid overtime, or the error happens to increase the factor of safety – or catastrophic, as when a discarded sheet of paper carrying sensitive industrial information ends up with a major competitor, or an arithmetic error decreases the factor of safety and a component fails in use at the cost of human life.

On the occasions when the ramifications of our decisions are not apparent to anyone else, ethics are a matter of personal conscience. However, when the ripples of our actions spread out and cause damage or injury, we are legally responsible for the result. Very often, the difference between the two is a matter of luck.
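The factor of safety makes the point concrete. Taking the standard definition (assumed here for illustration – the figures below are invented, not drawn from any case in this section):

$$ n = \frac{\text{load at failure}}{\text{expected working load}} $$

A designer sizing a component for $n = 3$ against a working load misrecorded as 2 kN, when the true load is 20 kN, produces a part that fails at $3 \times 2 = 6$ kN – an actual factor of safety of $6/20 = 0.3$, and failure in ordinary service. Had the decimal point slipped the other way, the only penalty would be a part ten times stronger, and heavier, than it needs to be.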
The very nature of engineering implies that safety must be a primary issue. Even the most remote of robots will have some human interface somewhere along the line, and most engineering design, whether industrial or domestic, requires direct contact at one or more levels. Ethics and safety are often closely interwoven – our responsibility for safety in design is as much moral as it is professional – and there are safety practices to be observed at every stage of the design process.
Much of what we know now has been learned from bitter experience but, amazingly, evidence suggests that we are still inclined to become complacent over long periods of technological triumph, leading us to narrower margins of safety and, ultimately, to repeated disaster. Consider what you know of the most publicised engineering disasters of the last century, and how safety was compromised in each case. Often these great catastrophes are the result of some very minor error, not the billion-to-one technological misfortune we might prefer to believe – see Table 6.
Table 6: Some widely publicised engineering disasters and their fundamental causes

Date | Disaster | Fundamental cause |
---|---|---|
2000 | Concorde: fire and crash shortly after take-off (113 dead) | Debris on the runway, and a fuel tank vulnerable to damage from such debris |
1986 | Chernobyl: meltdown of nuclear reactor core, and large-scale radioactive contamination | Safety procedures ignored, and design flaws |
1986 | Challenger Space Shuttle: exploded 73 seconds into flight (7 dead) | Design flaw in the O-ring seals on the solid rocket boosters |
1981 | Hyatt Regency Hotel: suspended walkways collapsed over a dance floor (114 dead) | Design change and failure to anticipate overload |
1979 | Three Mile Island: 51 per cent meltdown of nuclear reactor core | Incorrect procedures |
1940 | Tacoma Narrows Bridge: bridge collapsed | Unexpected wind-induced vibrations |
The study summarised in Table 7 investigated 800 cases of structural failure, involving millions of pounds' worth of damage, in which 504 people died and 592 were injured. For the failures in which engineers were at fault, the study categorised the causes (and hence the breaches of safety) as follows.
Table 7: Causes of structural failure where engineers were at fault

Cause of failure | Proportion of cases* |
---|---|
Insufficient knowledge | 68% |
Underestimation of influence | 16% |
Ignorance, carelessness or negligence | 14% |
Forgetfulness, error | 13% |
Relying on others without sufficient control | 9% |
Objectively unknown situation | 7% |
Imprecise definition of responsibilities | 1% |
Choice of bad quality | 1% |
Other | 3% |
*Note that the percentages add up to more than 100 – some failures were attributed to more than one cause.
You can see that in a whopping 68 per cent of cases, 'insufficient knowledge' on the part of the engineers was a contributing factor. Again, this has to be an ethics issue: can we really accept that all these engineers were so lacking in self-awareness that they genuinely believed in their own abilities, or were some of them simply not brave enough to admit they were out of their depth at the time? The lesson is clear. You do not need a photographic memory for everything you study, but it is essential to remember that, as a professional engineer, you are accountable for your actions – and this includes recognising when you need to bring in expertise from a colleague or from external sources.