At that particular moment, like many other people, I was glued to my television set watching this historic launch. Like many others, I stared at the screen perplexed by the ball of smoke, vapour and fire, wondering what had happened, then realising that the worst had happened. Over the years, I have thought many times about the crew, hoping they were spared the agony of being cognisant of their fate as they fell back to Earth and to their deaths. How many pictures have I seen over the years of a plane crash caught in mid-air, realising that at that particular moment people are alive and screaming, or gritting their teeth as they hang on with some shred of hope that they will be spared from that inevitable meeting with destiny?
Beyond the tragedy, I have thought a great deal about everything surrounding the accident itself, and about how such a large organisation, supposedly well planned and with the best project management in the world, could make such a mistake. The Presidential Commission on the Space Shuttle Challenger Accident, or the Rogers Commission, named after its chair, investigated the disaster and arrived at some damning criticisms of NASA and its partners.
While the commission identified the cause of the accident, the failure of an O-ring, it also uncovered questionable management practices. The flaw in the design of the O-rings had been known within engineering circles since 1977, but the flaw, or the gravity of the flaw, was never discussed outside of the normal reporting channels. As such, the flaw became an "acceptable flight risk" and, in the end, the decision to launch the Challenger was based on incomplete and sometimes misleading information. I quote from the Wikipedia article:
One of the commission’s most well-known members was theoretical physicist Richard Feynman. During a televised hearing, he famously demonstrated how the O-rings became less resilient and subject to seal failures at ice-cold temperatures by immersing a sample of the material in a glass of ice water. He was so critical of flaws in NASA’s "safety culture" that he threatened to remove his name from the report unless it included his personal observations on the reliability of the shuttle, which appeared as Appendix F. In the appendix, he argued that the estimates of reliability offered by NASA management were wildly unrealistic, differing as much as a thousandfold from the estimates of working engineers. "For a successful technology," he concluded, "reality must take precedence over public relations, for nature cannot be fooled."
This condemnation of NASA, and of an environment which put public relations ahead of safety, is one of the most telling criticisms of people working together. Every organisation faces the problem of maintaining public confidence. Not only must the group deliver the goods, it must ensure it delivers goods which work and which are safe. Unfortunately, the public is fickle and is easily swayed by sound bites and tidbits of information which may offer incomplete and uncorroborated evidence. Nevertheless, the public can react quickly not to a real problem or threat, but to a perceived one. It comes down to the sad fact that the majority of us never bother to properly verify "the facts" before we make a decision.
However, as the shuttle disaster so clearly demonstrated, those who are at a distance from the epicentre of any problem may not fully appreciate its magnitude. In fact, their entire decision-making process may be based on something which has absolutely nothing to do with safety. Years ago, I saw a PBS special on the Challenger disaster which pointed out how continued funding for NASA had somehow become directly tied to the successful launches of the shuttles. There was a threat of funding being cut, or at least the perception of one, which could have inadvertently put more pressure on the organisation to succeed no matter what. If a manager who is not an engineer is standing there weighing the options of funding being cut against an O-ring failing, there is an excellent chance he does not have a clue what an O-ring is. However, I bet he easily understands what funding being cut means.
I have half jokingly, half seriously painted the scenario of a boss somewhere looking at these two issues. "If I launch, I get funding but I risk having the O-ring fail. If I don’t launch, I risk having my funding cut and I could be out of a job. What the heck is an O-ring anyway? I want to keep my job." What happens? 73 seconds later he’s standing in front of a television set like the rest of us, thinking to himself, "Oh, that’s what an O-ring is for!"
For me, what is so fascinating about my little scenario is that I have actually lived through this. Okay, the results haven’t been as dramatic as a loss of life, and certainly nothing as spectacular as being televised in front of the entire world, but the central idea of mismanagement is very much true. I suspect this is a far more common occurrence in companies than we would care to admit.
In my own experience, I have watched people in positions of authority make decisions based completely on what amounts to absurd reasoning. How many times have I had a superior tell me that I don’t understand funding as I try to impress upon them the importance of an O-ring? It falls on deaf ears. However, because nobody is killed and because it’s not televised, most of the time these issues fall by the wayside and go unnoticed by the higher-ups. What the Rogers Commission found, that public relations comes before safety, seems to be a cultural phenomenon which may be a natural characteristic of any group of people working together. The only difference between NASA and similar incidents in any number of other organisations is the number of people killed and the televised coverage.
Today is the 25th anniversary of the Challenger Space Shuttle disaster. This was a tragedy. However, it is even more tragic when we realise that it was preventable. Yes, going into space is a complicated business fraught with danger. As a spokesman for NASA jokingly explained at a press conference, "Let’s not forget, this is rocket science."
The old saying goes, "ignorance is bliss", but sticking your head in the sand like an ostrich is not going to solve your problems. Even if nothing goes wrong today, sooner or later something is going to come back and bite you on the posterior.
According to Wikipedia:
The Challenger accident has frequently been used as a case study in the study of subjects such as engineering safety, the ethics of whistle-blowing, communications, group decision-making, and the dangers of groupthink. It is part of the required readings for engineers seeking a professional license in Canada and other countries.
Ah, "groupthink". What a term; what a concept. As defined in Wikipedia:
Groupthink is a type of thought within a deeply cohesive in-group whose members try to minimize conflict and reach consensus without critically testing, analyzing, and evaluating ideas.
I can’t help thinking of this demotivational poster: