I cannot remember how I found this website, but I remember taking my time reading “How Challenger Exploded, and Other Lessons in the Catastrophes We Make” on the Motherboard site. I wanted to know why the NASA Space Shuttle ‘Challenger’ exploded: the 1986 explosion was big news, but a good, reasonably detailed explanation of the cause of that accident never made the headlines, at least not here in Belgium. In summary: the rubber O-rings used as seals on the rocket boosters were not meant to be used at the (unusual) below-zero temperature measured at the launch site. Some of the engineers had brought the risk to NASA’s attention, but it was discounted in the decision to go for launch.
Why was the risk discounted by an organisation that had (and has) a reputation for high-quality engineering and a prudent approach to deciding whether to launch a vehicle into space? The particular reasons for the failure to correctly evaluate this specific risk aren’t entirely known, but one thing stands out: the potential impact of the risk was never made sufficiently clear to the decision-makers.
The author of the article makes clear that NASA is not the only organisation that fails to correctly evaluate the consequences of specific risks. The automotive industry is also discussed, as is the financial crisis of 2008, which showed that “insurance” isn’t going to provide an answer to all potential problems!
What every organisation needs is a culture where risks can be discussed more or less openly:
Despite its shortcomings, NASA’s Space Shuttle left a positive legacy for spaceflight and for everyone. Part of that was the lesson that not making mistakes requires the brute force of computers, torrents of data, and an understanding of laws, both the physics and the government kind. It demands focus. But it also needs human doubt and dissent.
A mechanism for “human doubt and dissent” should be ‘built’ into enterprises, though I haven’t seen many examples in my admittedly limited experience. And don’t forget: such a mechanism probably needs to be ‘cultural’ rather than ‘formal’, and ‘culture’ is hard to engineer…
There is a fine example of what is needed right at the end of a one-and-a-half-hour documentary on YouTube (part 1 and part 2) about the 1994 South Canyon Fire on Storm King Mountain in Colorado. I do hope Prineville Hotshot Tom Shepard is speaking for all firefighter crews working today and in the future:
If [any member of the crew sees] something they don’t understand or see something that they think is out of whack, they’re encouraged, today, to speak up, ask about it. [pause] Ya know, it takes twenty people to run a hotshot crew. Not just one; it takes twenty, and every voice is important. There just could be something that someone misses along the way that the newest recruit notices and brings to the supervisor’s attention and ends up saving some lives.
“Every voice is important.” Every voice in an organisation should count, and that means everyone has to listen as well.