The next best thing
by Dan Lester
|We’re not talking about “virtual” presence, any more than you’re “virtually” talking to your friend on the telephone. There is nothing “virtual” about a telephone conversation.|
But we’ve reached a curious point in our history, where our technology now allows us to experience distant venues through electromechanical surrogates. This technology provides us with keen vision, precision mobility, and a measure of dexterity that approaches that of our hands and arms. Should hearing, smell, and touch be of interest, we could do that too. The idea of exercising our senses through surrogates is nothing new. We’ve been using telephones widely for more than a century, using an induction coil attached to a diaphragm as a remote surrogate for our eardrum and middle ear. Vidicons long ago put our eyes in faraway places. The early implementations of these surrogates often required, of course, having a person on the other end to enable the surrogate, holding the telephone up to their head, and moving the vidicon camera this way and that.

While we’ve sent surrogates for our eyes to distant parts of the solar system, the Mars rovers now exercise our mobility, and to some extent our dexterity, in a gravity field on distant soil. Thanks to the RAT tools on Spirit and Opportunity, and now the ChemCam laser on Curiosity, we’re leaving marks inscribed in rocks on Mars, and these vehicles certainly leave their tread marks, if not boot prints, in the soil.
It’s true that Lewis and Clark hadn’t looked down on the Louisiana Territory from orbit. They didn’t because they couldn’t. When you get right down to it, if they could have done that, they would have. In fact, if Thomas Jefferson had that capability, his Corps of Discovery might have been ensconced in a control room hunched over display terminals instead of hauling gunpowder and cartography equipment, and they would have gone home every day when their shift was done.
The progress of electromechanical surrogates, which we abbreviate with the term “telerobots”, has been startlingly rapid. In the last decade we’ve seen these surrogates extend both our awareness and our manipulative abilities into the ocean depths and even inside human bodies through telerobotic surgery. These are places that we’d otherwise think of as being visited by small numbers of people encased in heavy pressure vessels, and even by Raquel Welch, in her completely fictitious Fantastic Voyage. It is reasonable to think that these surrogates will eventually relay the complete senses and dexterity, and provide the mobility, of at least a spacesuited human. We’re not talking about “virtual” presence, any more than you’re “virtually” talking to your friend on the telephone. There is nothing “virtual” about a telephone conversation. Defining “presence” as where your cognition is, rather than where your body is, we’re talking about real presence through surrogates. The idea of “telepresence”, which used to be considered somewhat technologically fantastical, is now becoming wholly credible. Isn’t it time for our perception of exploration to graduate from its historical underpinnings of dirt in boots and mature with our technology?
But there is a problem. The distances over which we want to exercise these surrogates impose a time delay on their control. For the Moon, that two-way time delay is at least 2.6 seconds, and for Mars it is far longer: 8 to 40 minutes. The lure of personal experience is in many ways defeated by these delays, which render operation through surrogates in real time a decidedly local enterprise. These time delays, absolutely dictated by the speed of light, are what we call “latency”, and they fundamentally constrain earthbound humans in using these surrogates. These delays are, at minimum, what we routinely endure in “experiencing” Mars through our rover surrogates. What kind of personal experience has you turning your head, and then waiting 40 minutes to see the view? Is experiencing distant space destinations through electromechanical surrogates really possible?
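The geometry behind these numbers is simple to sketch. The round trip is just twice the separation distance divided by the speed of light; the exact Mars figure depends on where the two planets are in their orbits, so the distances below are approximate published values used purely for illustration:

```python
# Back-of-the-envelope light-time latency for telerobotic control.
# Distances are approximate mean/extreme values, used for illustration only.

C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def round_trip_delay_s(distance_km: float) -> float:
    """Minimum two-way (command out, response back) latency in seconds."""
    return 2.0 * distance_km / C_KM_PER_S

# Earth-Moon mean distance: ~384,400 km
print(f"Moon: {round_trip_delay_s(384_400):.2f} s")

# Earth-Mars separation varies from roughly 54.6 million to 401 million km
print(f"Mars (closest):  {round_trip_delay_s(54.6e6) / 60:.1f} min")
print(f"Mars (farthest): {round_trip_delay_s(401e6) / 60:.1f} min")
```

The lunar round trip comes out to about 2.6 seconds, and the Mars round trip spans roughly 6 to 45 minutes at the orbital extremes, bracketing the operational figures quoted above. No engineering can reduce these floors; only moving the operator closer can.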
|So, one might say, if we’re sending astronauts 99% of the way to the surface of Mars, why don’t we just send them down to the surface? Perhaps because we don’t need to.|
It isn’t that easy to do so from Earth. But perhaps it is possible if we can get people close enough to those destinations. NASA has recently been thinking about strategies for on-orbit telerobotics, which would have astronauts travel close to a distant site but not descend into its gravity well; they might, instead, remain in orbit around the site. Their lives in orbit would, in many respects, benefit from our vast experience with the International Space Station. Their exposure to space radiation would be higher than for the ISS, but not necessarily much higher than if they were on the surface of a planet, such as Mars, with its thin atmosphere and weak magnetic field. From their high perch, they could control surrogates in near-real time at many different surface locations, quite unlike the capabilities of astronauts who would land at one place on the surface. In the very near term, we’re thinking of doing that at the Moon, from Earth-Moon L1 (near side) or L2 (far side). While the latency advantages would be far smaller than for Mars, the concept of operations of on-orbit telerobotic control would be exercised.
So, one might say, if we’re sending astronauts 99% of the way to the surface of Mars, why don’t we just send them down to the surface? Perhaps because we don’t need to. Perhaps because surface operations add a thick layer of additional expense, complexity, and risk to a human trip to Mars. Perhaps because the astronauts can actually cover more ground from their high perch. Perhaps because planetary protection makes human visits to the surface problematical. Perhaps because even for resource development, we don’t need astronauts sitting in bulldozers or wrestling with shovels and pickaxes. So while they might like to get dirt in their own boots, these on-orbit astronauts could get dirt in a lot of surrogate boots all over the planet, and even shake the dirt out of them in real time.
My colleagues and I held a symposium at NASA Goddard Space Flight Center a few months ago to assess the promise of real-time robotic surrogates for exploring distant destinations. See http://telerobotics.gsfc.nasa.gov/, where videos and slide sets from the plenary presentations are posted. This symposium was attended by almost a hundred members of the planetary science, robotics, and human spaceflight communities, as well as representatives of terrestrial commercial telepresence activities, such as surgery, mining, undersea operations, and inter-office cooperation. Participants came from NASA, the European Space Agency, the Canadian Space Agency, and the Japan Aerospace Exploration Agency, as well as industry and academia. This diverse group was there to think about putting human “presence” at places where it was hard, or at least really inconvenient, to put humans, and to report on that to NASA. Stay tuned.
So with our Earth-controlled Mars rovers and orbiters reaping a new wealth of science, and in the face of serious funding challenges for space endeavors of all kinds, maybe the new next best thing for planetary exploration is on-orbit telerobotics and exploration telepresence: putting real-time human cognition on a planetary surface without quite putting people all the way there. To the extent you eventually want to get dirt between your toes (perhaps a tray full of regolith in a surface habitat would let you do that?) as opposed to dirt in some boots, this strategy may help pave the way.