Notes on the Challenger Disaster



Was it a communication problem?

Probably not. The engineers at Thiokol noticed the problem and communicated it to top management, and NASA was also aware of it. The disaster had more to do with decision-making under uncertainty and with responding to stakeholders.


What were some of the reasons for making a poor decision?

Previous successes led to overconfidence and to the belief that observed problems, such as O-ring blow-by, were not serious (since blow-by had happened many times without serious consequences).

Decision-makers looked for proof that the flight would fail rather than proof that it would succeed (i.e., success had become the default expectation).

They concentrated on analyzing only the flights where something had gone wrong, instead of all flights, which hid the relationship between low launch temperatures and O-ring damage (see the sketch after this list).

Pressures from the White House.

Thiokol wanted to be on good terms with NASA because of upcoming booster rocket contracts.

The desire to avoid appearing incompetent -- there had been several delays and cancellations already, and NASA, under severe budget constraints, was concerned about losing funding.

NASA couldn't delay much longer without canceling the mission (other launches were scheduled), which would have meant losing the teacher-in-space photo opportunity.

The notion that you cannot control all the risks.

Managers' attitudes towards engineers: they are perfectionists, they are not risk-takers, they always want to collect more data, and they do not see the big picture. Their input is important, but if you listen to them too much, you will never do anything.

Astronauts were not part of the decision process, so loss of life did not loom particularly large in decision-makers' minds.
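
The selection-bias point above can be illustrated with a small sketch. The numbers below are simulated and purely hypothetical -- they are not the actual shuttle flight record. The simulation assumes that the chance of an incident rises as launch temperature drops, then compares what an analyst sees when looking only at incident flights versus at all flights.

import random

random.seed(0)

# Simulated, purely hypothetical flights -- NOT the actual shuttle data.
# Assumption: incident probability rises as launch temperature drops.
def simulate_flights(n=200):
    flights = []
    for _ in range(n):
        temp = random.uniform(50, 85)             # launch temperature, deg F
        p_incident = max(0.05, (75 - temp) / 40)  # colder -> more likely blow-by
        flights.append((temp, random.random() < p_incident))
    return flights

flights = simulate_flights()

# View 1: only the flights that had an incident (what the decision-makers studied).
# Incidents occur across the whole temperature range, so temperature looks irrelevant.
incident_temps = [t for t, bad in flights if bad]
print("incident flights span %.0f to %.0f deg F" % (min(incident_temps), max(incident_temps)))

# View 2: all flights -- the incident *rate* by temperature band tells a different story.
for lo, hi in [(50, 60), (60, 70), (70, 85)]:
    band = [bad for t, bad in flights if lo <= t < hi]
    print("%d-%d F: %d flights, incident rate %.0f%%" % (lo, hi, len(band), 100 * sum(band) / len(band)))

Looking only at the incident flights throws away the comparison group, which is exactly where the temperature signal lives.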


The Need for a Decision-Making System

Bounded rationality. Human beings are rational only within limits. Rationality means considering all possible courses of action and choosing the one with the highest expected value (probability of success times payoff). But humans cannot consider all possible courses of action, nor calculate the probabilities of all outcomes. Our minds rely on relatively simple heuristics and are subject to biases. For example, when testing hypotheses, people have a strong tendency to seek confirmatory evidence and often do not even know where to look for potentially disconfirmatory evidence.
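
A minimal sketch of the expected-value idea just described, using invented numbers (the probabilities and payoffs below are illustrative assumptions, not estimates from the actual launch decision):

# Expected value = probability of success * payoff + probability of failure * loss.
# All numbers are invented for illustration only.
options = {
    "launch now":   {"p_success": 0.97,  "payoff": 100, "loss": -10000},
    "delay launch": {"p_success": 0.999, "payoff": 90,  "loss": -10000},
}

for name, o in options.items():
    ev = o["p_success"] * o["payoff"] + (1 - o["p_success"]) * o["loss"]
    print("%s: expected value = %+.1f" % (name, ev))

Even this toy calculation shows the difficulty: the answer turns entirely on probabilities and payoffs that no one in the room actually knows, which is exactly where heuristics and biases take over.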

So we create systems to help us make the right decisions. For example, we divide up problems into smaller chunks and assign each chunk to a different person. In a way, the purpose of organizations is to overcome the limited cognitive capacity of human beings.

Groupthink. Small groups of homogeneous managers, such as a small group of white males from similar backgrounds who have worked together for years, can develop groupthink: an artificial kind of consensus in which they see the world exactly the same way and do not allow innovations or contrary evidence to really enter their consciousness. They tend to devalue the opinions of people who are different from themselves (such as engineers); this is related to homophily, the tendency to prefer and associate with those similar to oneself. Groupthink is what allows things like the Bay of Pigs invasion to seem like a good idea when any outsider could see that it was not.

The new system. Since the disaster, NASA has constructed a pre-launch system that actively gathers input from diverse groups, including the astronaut crew, engineers, and operations crews, and does not allow that input to be easily overridden. The final decision to launch is now made by a single person who is an astronaut; being an astronaut makes it easier for that person to keep the astronauts' best interests in mind, rather than the budget, the politics, public opinion, scientific needs, and so on.