The Vision for Space Exploration and the retirement of the Baby Boomers (part 3)
Build an industry, not a program
by Charles Miller and Jeff Foust
|History demonstrates that the “Big Government Program” mindset almost always leads to a strategy of “Pick the Solution”, which often leads to failure.|
In the 1970s, NASA set out to design and build a “National Space Transportation System”, better known as the Space Shuttle. In 1971, NASA told the White House Office of Management and Budget that the Shuttle would fly 50 times per year at a cost of $10.5 million per flight, delivering 29,500 kilograms (65,000 pounds) per launch. That works out to roughly $162 per pound (FY1971) or $650 per pound (2008 dollars).1 America spent $30 billion (2008 dollars) to develop the Shuttle, and it never came close to the promised costs. In fact, the Shuttle is over an order of magnitude more expensive, as well as an order of magnitude less reliable, than was promised.
In 1984, this nation initiated a second major effort to achieve CRATS, which started out as a classified DARPA project called “Copper Canyon.” The project became the X-30 National Aerospace Plane (NASP) program after President Reagan proposed an “Orient Express” in his February 1986 State of the Union address. NASP was extremely challenging technically, requiring breakthroughs in at least six distinct technologies. In a December 3, 1992 report to Congress, the General Accounting Office (GAO) reported that the NASP program cost had grown by more than a factor of four, from the original 1986 estimate of $3.3 billion in 1992 dollars ($4.8 billion, 2008 dollars) to as much as $14.6 billion ($21.3 billion, 2008 dollars). After ten years of activity, the NASP X-30 program was cancelled in 1994.
We did not wait long after NASP to start the next effort to achieve CRATS—almost certainly because of the public support and excitement generated by the DC-X program. In January 1995, NASA announced the X-33 program to considerable public enthusiasm. However, NASA chose by far the most technically risky of the three leading concepts, rather than the one most likely to succeed at demonstrating an operational RLV capability. This proved to be a mistake, as neither NASA nor Lockheed Martin was willing to pay for the cost overruns created by the ensuing X-33 technical problems. In an August 1999 report to Congress, the GAO estimated that the total cost to US taxpayers for the X-33 was $1.23 billion ($1.5 billion in 2008 dollars). The X-33 RLV program was shut down in 2001.
Seven years later, we have not yet (officially) tried again. This is not because advocates of RLVs have stopped trying. From 2002–04, starting right after the X-33 was cancelled, NASA and the DoD combined forces in the form of the National Aerospace Initiative (NAI). Most of the NAI history is not public, so we talked to several sources in the initiative. One source reports that the NAI tried to meet all the launch requirements of both the DoD and NASA, thereby repeating part of the history of the Shuttle, and the total cost estimates approached $50 billion. This source reports that the NAI collapsed from the sticker shock. A second source had a slightly different but consistent perspective, stating that the cost of the NAI was so large it forced an either/or choice between the NAI and the VSE, and the NAI collapsed after the President picked the VSE.
|It is not what you know that will get you. It is not even what you know that you don’t know. It is what you don’t know that you don’t know.|
The reason that the Shuttle, NASP, X-33, and the NAI failed to achieve the goal of CRATS is not that the people managing these programs were not smart. They were very smart; in many cases, they were chosen precisely because they were among the best and brightest people we knew on the subject of space transportation, and they knew a great deal about it.
We assert that the fundamental reason for the repeated failure of these programs has to do with the limits of knowledge, and the blind spots that we all have.
It is not what you know that will get you. It is not even what you know that you don’t know. It is what you don’t know that you don’t know. This is a mental blind spot created by the fundamental nature of knowledge.
We focus on what we know. We look at facts based on our previous experience. We create mental models based on what we know. The curse is that the smarter you are, and the more you know, the more likely you are to become confident, and then arrogant, about what you know. When you become confident about your knowledge, the blind spot increases in size. When you become arrogant, you stop listening, and your blind spot grows larger still. In reality, there is not much difference, if any, between confidence and arrogance.
In The Black Swan: The Impact of the Highly Improbable, Nassim Nicholas Taleb writes:
“Let us examine what I call epistemic arrogance, literally, our hubris concerning the limits of knowledge. Episteme is a Greek word that refers to knowledge; giving a Greek name to an abstract concept makes it sound important. True, our knowledge does grow, but it is threatened by greater increases in confidence, which make our increase in knowledge at the same time an increase in confusion, ignorance, and conceit.”
Smart, highly educated, and accomplished people often start thinking they are smarter than almost everybody else. We often focus on what we know that others do not know. The authors of this essay are not immune; we too are often blinded by our arrogance. This arrogance is a disease: it creates blind spots, which stop each and every one of us from thinking.
These blind spots help explain some of the history of attempts to achieve cheap and reliable access to space. We have now had three major program failures in a row. In each case, the lead government agency picked what it thought was “The Solution”. After each failure, they figured out the specific problem with the latest attempt, decided what the right solution to that specific problem was, and then picked somebody they knew was “best and brightest” to try again.
Do you see the blind spots?
Everybody agrees that achieving CRATS is critical for the nation. Everybody agrees that we should marshal the efforts of the best and brightest people in the nation to tackle this challenge. One problem is that we don’t know who the best and brightest people really are. A second problem is that we really can’t know what the best solution is when we develop a plan to attack the problem. A third problem is that we get too specific with lessons learned.
Many very smart people have studied the failures of the Shuttle, NASP, and X-33 to provide lessons learned for the next attempt to organize a government program to build RLVs. Many papers, and even a few books, have discussed this subject. The managers of NASP learned from the failures of the Shuttle, and the managers of the X-33 program learned from the failures of both the Shuttle and NASP. We do not dispute the precise technical lessons each new generation of engineers learned from the previous failure. However, we do suggest that they have a tendency to focus on the wrong lessons.
In The Black Swan, Taleb writes:
Learning to Learn
Another related human impediment comes from excessive focus on what we do know: we tend to learn the precise, not the general.
What did people learn from the 9/11 episode? Did they learn that some events, owing to their dynamics, stand largely outside the realm of the predictable? No. Did they learn the built-in defect of conventional wisdom? No. What did they figure out? They learned precise rules for avoiding Islamic prototerrorists and tall buildings… The story of the Maginot Line shows how we are conditioned to be specific. The French, after the Great War, built a wall along the previous German invasion route to prevent reinvasion—Hitler just (almost) effortlessly went around it. The French had been excellent students of history; they just learned with too much precision.
We do not spontaneously learn that we don’t learn that we don’t learn.
Those who promote a pure laissez-faire approach have just as much epistemic arrogance as those who promote yet another big government program. Both sides tend to ignore the hard empirical data that is inconsistent with their views.
|We do not dispute the precise technical lessons each new generation of engineers learned from the previous failure. However, we do suggest that they have a tendency to focus on the wrong lessons.|
The empirical data is clear: the laissez-faire “just leave us alone” approach has not succeeded at producing CRATS either. Over the last three decades, a series of companies, with names like Earth/Space, Inc., Pacific American, Kistler, Pioneer Rocketplane, Kelly Space & Technology, Rotary Rocket, and many others, have tried to privately finance fully reusable launch vehicles. Many of these companies had brilliant concepts, led by brilliant engineers, but the size of the investment required to complete their systems proved to be beyond the reach of any of them.