The Space Review, in association with SpaceNews

Mars Observer
Mars Observer was one of three NASA Mars missions lost in the 1990s because of technical errors, and not as part of a broader conspiracy.

The dark side of space disaster theories


Mars Climate Orbiter (1999)

Dark Mission’s treatment of the Mars Climate Orbiter failure in 1999 (pp. 313–315) is erroneous, misleading, and, unintentionally, deliciously ironic.

All observers are free to offer their own hypotheses about the causes of space disasters. But their fundamental credibility must be founded on an adequate understanding and accurate recounting of the official theories that they want to supplant. Here, the authors fail spectacularly, again.

“According to a terse press release, the spacecraft was thrown off course when one navigational team in Colorado and the other at JPL used two separate measurement systems (metric and imperial) in key navigational calculations.” Dark Mission called this “highly implausible and bizarre”, and added: “The notion that this error could have been induced from the beginning of the mission and gone unnoticed is ridiculous… [T]o anyone with the slightest understanding of measurement systems and orbital mechanics, this statement… is ludicrous… That is why NASA’s explanation is so unbelievable.”

But the proximate cause of the navigation error was exactly as NASA admitted—a units foul-up. Dark Mission’s problem is that the authors completely misunderstood the NASA description, a description verified by independent investigators with private access to the space navigation team. While denouncing the official version of the accident as “ludicrous”, they themselves never understood it, and so grossly misreported it in their book.


In reality (and interested readers can search the Internet for both official and unofficial contemporary stories with full details), the units problem was isolated to one data table that specified expected minor course disturbances from routine attitude control (pointing control) maneuvers during the coast periods. As small thrusters were fired on autopilot to correct for the gradual buildup of small orientation errors, the thrusters did not impart purely rotational impulses because of the way they were attached to the probe’s exterior. Instead, they induced small propulsive forces as well.

Based on a reference table, an adjustment for these accumulated disturbances (which were too small to be detected by the probe’s on-board accelerometers) could be calculated and factored into the periodic course corrections.

The table of expected accumulation of propulsive disturbances based on the amount of orientation control firings was the document that was in the “wrong” units—and the units were not explicitly labeled. So when this “fudge factor” was calculated from the known number of orientation thruster firings (with a calculation of the expected course disturbance caused by them), the factor was off by roughly four and a half, the ratio between pound-force and newtons. This led to an unexpected and persistent trajectory “disturbance” that was noticed, wondered about, and periodically corrected for during the trans-Mars cruise.
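The mechanism is easy to demonstrate in a few lines. The sketch below is illustrative only: the function names and impulse figures are invented for the example, and the only real quantity is the physical conversion of 1 pound-force to 4.44822 newtons. It shows how an unlabeled number, written in one unit system and read in another, silently understates the modeled impulse:

```python
# Illustrative sketch of an MCO-style units mismatch.
# Only the conversion 1 lbf = 4.44822 N is a real constant;
# the thrust and burn figures are invented for demonstration.

LBF_TO_N = 4.44822  # newtons per pound-force

def ground_table_impulse(burn_seconds, thrust_lbf):
    """Ground software tabulates impulse in pound-force-seconds."""
    return thrust_lbf * burn_seconds

def navigation_model(impulse):
    """Navigation software assumes the table entry is in
    newton-seconds. No unit label travels with the number --
    that is the silent bug."""
    return impulse  # interpreted as N*s regardless of true units

true_impulse_lbfs = ground_table_impulse(2.0, 0.9)   # 1.8 lbf*s
true_impulse_Ns = true_impulse_lbfs * LBF_TO_N       # actual N*s

modeled_Ns = navigation_model(true_impulse_lbfs)     # read as "N*s"
error_ratio = true_impulse_Ns / modeled_Ns           # ~4.45

print(f"modeled: {modeled_Ns:.2f} N*s, "
      f"actual: {true_impulse_Ns:.2f} N*s, "
      f"understated by {error_ratio:.2f}x")
```

Every hand-off of such an unlabeled value repeats the same understatement, which is why the discrepancy accumulated steadily over the months of cruise instead of announcing itself at once.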

All tracking and course correction maneuvers were calculated exclusively in metric, so the flight path was under close observation and control. Dark Mission’s allegation of a wildly-off-course spacecraft based on its authors’ misunderstanding of the fundamental error is entirely illusory. Their mocking rebuttal of the units error cause is a joke on themselves, since they never knew enough about the probe’s navigation error to fairly judge its cause.

Approaching Mars, the disturbances grew. The uncertainties proved too high, and the error factor was in the wrong direction. So when the probe skimmed around the backside of Mars to fire its braking engine it was a few hundred kilometers too low—and so it hit the upper atmosphere, as last-minute tracking plots indicated it would, too late for any course change.

It was easy—and self-serving—for NASA flacks to blame low-level engineers making a conversion error. But the blame lay elsewhere. The debacle’s root cause was a series of management flaws—including the refusal to respond to navigators’ concerns until they could “prove the probe was off course”—and these were ultimately responsible for not reacting correctly to the suspicious disturbances in time.

A workable safety strategy is to develop a system that expects human errors and has defense in depth to detect, identify, and counteract them. It is not enough to simply assume, as NASA did in this case, that people won’t make mistakes. And it was dishonest for NASA, in the immediate aftermath of the crash, to blame the disaster on a human mistake, and not on an imprudent NASA management philosophy of assuming no mistakes (and hence, eliminating double-checks).
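One concrete form of such defense in depth—a minimal sketch for illustration, not anything NASA actually flew—is to make every value crossing a team boundary carry its units, so that a mismatch raises an error immediately instead of being silently absorbed into a calculation:

```python
# Minimal sketch of a units-carrying value type: combining
# incompatible units fails loudly instead of corrupting the result.
from dataclasses import dataclass

@dataclass(frozen=True)
class Quantity:
    value: float
    unit: str  # e.g. "N*s" or "lbf*s"

    def __add__(self, other):
        if self.unit != other.unit:
            raise ValueError(
                f"unit mismatch: {self.unit} vs {other.unit}")
        return Quantity(self.value + other.value, self.unit)

a = Quantity(3.0, "N*s")    # one team's contribution
b = Quantity(1.8, "lbf*s")  # another team's, in different units

try:
    total = a + b           # the MCO-style hand-off
except ValueError as err:
    print("caught:", err)   # the mistake surfaces at once
```

The point is not this particular implementation but the principle: the system, not the people, is what catches the inevitable human error.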

In hindsight, it is an instructive example of a bad attitude toward safety and reliability that had infected NASA culture in the 1990s, when, under Dan Goldin’s leadership and shaped to a still-undetermined degree by White House pressure, politics superseded safety as the primary motivator. The spread of that attitude led directly to the bad decisions that later destroyed Columbia and killed its seven astronauts.

That hideous cost underscores the importance of correctly understanding space disasters, learning their lessons (often lessons already learned once, but forgotten), and responding correctly and constructively to them. And this suggests that bizarre conspiracy theories with false causes and phantasmagorical explanations (as in Dark Mission) aren’t merely entertaining: in the worst case they can be distracting, misleading, and ultimately, costly. And that’s much worse than merely “ludicrous”.

The Hoagland “theory” of the failures

Focusing attention on these garbled and nonsensical Hoagland/Bara criticisms of the “establishment explanations” of these Mars probe failures distracts from actually considering what alternative scenarios Dark Mission advocates. They are not worthy of the appellation “theories” since they can’t be verified or falsified, but as imaginative suggestions, they are very revealing of the thought processes behind them and similar fringe disaster scenarios.

Essentially, they argue that the accidents were all staged to prevent the need for public release of photographic evidence the probes were sure to gather concerning alien ruins on Mars (as if individual images couldn’t just as easily be made to disappear as an entire spacecraft). Either the probes were turned off entirely, or their data was switched to some alternate top secret military control network that was developed for such a purpose.

Specifically, Dark Mission argues that the Mars Observer mission was terminated by decision of NASA based on a televised debate with a NASA scientist that Hoagland claims he won. Within hours of the debate, word was released that contact with the probe had been lost.

“In hindsight, it isn’t difficult to figure out what actually happened,” states Dark Mission (p. 87). “After other high NASA officials (and their bosses) watched [the scientist’s] lame spin control fail miserably—and on live television—NASA went to Plan B. They either pulled the plug on the mission outright—out of fear of what uncensored images would reveal—or NASA simply took the entire mission ‘black’.”


This postulated causal chain doesn’t even pass the simple timeline check. The debate, on a Sunday morning, occurred half a day after the probe had failed to resume communications—a development that was announced later on Sunday. Dark Mission would have the probe actually perform the “maneuver” (the pyro firing) correctly, come back on the air and transmit routine data for many hours, and then have NASA officials suddenly order everybody involved to pretend they had not received the expected signals. All the recordings of the previous twelve hours then needed to be erased, and a false series of bogus uplink communications commands needed to be thrown together—and everybody sworn to testify to this charade before the investigation board.

And all for what? Mars has in the past decade been swarmed by spacecraft from the US, Europe, Japan (unsuccessful, and understandably so), and, soon, Russia again. Their images have exceeded the quality of those that would have been obtained by the doomed Mars Observer. How were they then allowed to succeed?

But even without an analysis of the proposed true cause, a careful assessment of the misinterpretations and misrepresentations in the book’s treatment of well-investigated spacecraft disasters provides a strong argument that these alternate ideas are not useful. Certainly one owes it to Murphy, the god of things going wrong in space, to look at any suggestion—and just as certainly, after looking at them, scenarios such as these can be ruled out.