The Space Review, in association with SpaceNews

Columbia debris in KSC hangar
Debris recovered from the space shuttle Columbia is stored in a hangar at the Kennedy Space Center for analysis. (credit: NASA/KSC)

The dangers of “creeping determinism”

It seems so obvious now, many people observing the Columbia investigation are saying. Foam from the external tank hit the leading edge of the left wing during launch, causing one of the reinforced carbon-carbon panels there to either fall off or become so damaged that it could not prevent hot plasma from getting through 16 days later during reentry. That damage eventually led to the structural failure of the wing and the loss of the orbiter. The images and the paper trail of memos and emails all seem to show concern among engineers that such an incident during the launch could have caused an accident just like the one that befell Columbia on February 1.

It all seems so obvious now. Or is it?

That’s the danger of drawing too many conclusions too early in the investigation of the Columbia tragedy. Emboldened by the limited evidence released to date—from email exchanges among engineers to limited-resolution video of the shuttle’s launch—everyone from independent analysts to journalists to members of Congress is ready to claim they know the sequence of events that led to the shuttle’s demise. More importantly, they are ready to cast blame at people and institutions within NASA that they believe are at fault for the accident.

Creeping determinism refers to “the sense that grows upon us, in retrospect, that what has happened was actually inevitable.”

While in retrospect what happened to Columbia may seem obvious, it was not necessarily obvious to those involved as it happened. As engineers and managers reviewed the data available to them during the flight, they made decisions based not only on that data, but also on the experience drawn from previous flights and their knowledge of, and confidence in, shuttle hardware and the analysis tools available to them. In retrospect they may have been wrong, but at the time they had no reason to believe they were making a grievous error.

In an essay in the March 10 issue of The New Yorker, Malcolm Gladwell gives this phenomenon a name: “creeping determinism”. The term, actually coined three decades ago by psychologist Baruch Fischhoff, refers to “the sense that grows upon us, in retrospect, that what has happened was actually inevitable”, Gladwell writes. In other words, creeping determinism is when we apply a filter to the past to produce a series of events that seems obvious in hindsight, tossing out everything else that kept those events from being obvious as they happened. It’s connecting the dots on an otherwise blank piece of paper, having erased the previous scatter of dots from which no discernible pattern could emerge.

A prime example of creeping determinism, outlined in Gladwell’s essay, is the series of events that led up to the invasion of Israel at the start of the Yom Kippur War in 1973. In hindsight, those events—from the mobilization of troops in Egypt and Syria to the evacuation of the families of Soviet advisors in those nations—appear as obvious signs of an impending invasion that the Israeli government ignored. However, Gladwell points out, that analysis is only true in hindsight: it ignores the fact that there were many other events in the preceding two years that could also have been interpreted as evidence of an impending war, except that hostilities never started. The Egyptian army, for example, mobilized for war 19 times from January to October 1973 without going into battle. To Israel, the events that led to the beginning of the Yom Kippur War appeared no different from those previous false alarms until it was too late.

Hindsight has provided stark reminders that some of those decisions made during the flight were wrong. However, it does not mean that those errors were as obvious then as they are now.

Gladwell uses the concept of creeping determinism to anchor a critical review of intelligence reform in the US in the wake of the September 11 terrorist attacks. The same concept can also be used to more carefully evaluate the events surrounding the loss of Columbia. What seems obvious now, after the tragic accident, clearly was not obvious to the shuttle engineers and managers during the flight. They saw the video of foam falling off the external tank and striking the shuttle, but they were aware of events on previous flights where foam fell off and caused no appreciable damage. Moreover, a software package that in the past had overestimated the amount of damage to tiles caused by an impact showed no indication that any foam striking the shuttle as observed could have caused serious damage. The seemingly damning email chatter among engineers is not that unusual, NASA officials say, although without looking at email exchanges from past flights it is difficult to judge that claim. Yes, NASA officials turned down an opportunity to have spy satellites or ground-based telescopes observe the damage, but past experience—such as on STS-95, when telescopes were called on to check the shuttle Discovery after its drag chute door fell off during launch—led them to conclude that the images were of limited value.

Hindsight has provided stark reminders that some of the decisions made during the flight were wrong. However, that does not mean those errors were as obvious then as they are now. The events that led to the loss of Columbia were complex, perhaps involving a series of circumstances and interactions not previously foreseen by NASA, invalidating the experience and expertise that had safely managed past missions. In our zeal to find out what went wrong, how it can be fixed, and upon whom blame should be placed, we should remember the lesson of creeping determinism: what seems obvious now may not always have been so clear.