Zoë, a robot configured to create geologic maps, explores the Mojave Desert near Amboy Crater, which is seen in the background on the right.

Henry Bortman (Astrobiology Magazine)

On a dry lakebed in the Mojave Desert, a small experimental rover named Zoë wanders back and forth between dusty clay sediments and black fields of basaltic lava, belched out during eruptions that formed the nearby cinder cone, Amboy Crater.

Atop a small rise in the landscape, in the shade of a six-foot-square canopy, a quartet of researchers sit in folding chairs, portable computers on their laps, alternately scanning screens of status data transmitted by Zoë, discussing the robot's behavior, and writing on-the-spot patches to its control software.

Zoë carries with it, on a hard disk, a crude map of the region it's exploring, based on data collected by ASTER, an infrared spectrometer onboard NASA's Terra Earth-observing satellite. Zoë's goal is to produce a more-detailed map that clearly defines the boundaries between areas of the terrain dominated by clay and those dominated by basalt. It would be a relatively simple matter for a field geologist to do this work. But what the Carnegie Mellon University researchers who are part of this Science on the Fly experiment want to know is: can a robot do it, too?

To members of the CMU team, it's obvious that the crumbly brown stuff is clay and the hard black stuff is basalt. Color and texture make the distinction clear in an instant. The researchers have purposely picked a simple landscape for their test, dominated by two distinct rock types. Many geologic settings are more complex, but they wanted to give Zoë a relatively clear task on its first full field trial. If Zoë can learn to perform its task here, one day a more-refined version of its software could be used on a robotic mission to Mars, where geologists are not likely to be wandering about any time soon.

But unlike humans, Zoë can't easily tell basalt from clay just by looking at color and texture. Instead, the robot relies primarily on spectral data, which can be ambiguous. Every mineral, such as quartz or mica, absorbs light at certain frequencies and reflects it at others. This absorption-and-reflectance pattern, known as a spectral signature, distinguishes one mineral from another. A graph of this spectral response looks like a series of peaks and valleys. Rocks, such as volcanic basalts or clay sediments, are mixtures of minerals, so their spectral signatures are mixtures as well. This can make identification more difficult, because different combinations of minerals can produce similar spectral signatures. So Zoë focuses on a handful of spectral frequencies, many of them at infrared wavelengths, at which basalt and clay are clearly distinct.
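The idea can be sketched in a few lines of code. The snippet below is only an illustration, not Zoë's actual software: the wavelengths, reference signatures, and distance threshold are invented, but it shows how a pixel's reflectance at a handful of well-chosen bands can be matched against two reference spectra, and flagged as unknown when neither is a good fit.

```python
import numpy as np

# Hypothetical reference reflectances at a few diagnostic wavelengths. The numbers
# are invented for illustration; real bands and signatures would come from field
# spectra and ASTER data.
BANDS_NM = [850, 1650, 2200]          # a handful of near/shortwave-infrared bands
REFERENCE = {
    "class_A": np.array([0.08, 0.10, 0.09]),   # dark, flat spectrum (basalt-like)
    "class_B": np.array([0.35, 0.45, 0.30]),   # brighter, with an absorption dip (clay-like)
}

def classify_pixel(reflectance, max_distance=0.15):
    """Assign a pixel's spectrum to the nearest reference signature.

    Returns (label, confidence); 'unknown' if no reference is close enough.
    """
    spectrum = np.asarray(reflectance, dtype=float)
    distances = {name: np.linalg.norm(spectrum - ref) for name, ref in REFERENCE.items()}
    best = min(distances, key=distances.get)
    if distances[best] > max_distance:
        return "unknown", 0.0
    confidence = 1.0 - distances[best] / max_distance
    return best, confidence

print(classify_pixel([0.09, 0.11, 0.10]))   # close to class_A, so high confidence
```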

On each of its traverses, Zoë attempts to build a detailed map of an area ranging from 30,000 to 75,000 square meters (about 7.5 to 18.5 acres). The robot can't cover the entire area; rather, it winds along a narrow path of its own choosing, up to a kilometer (six-tenths of a mile) long, moving about one meter and capturing one spectral image every second. Each time it captures an image, it rebuilds its map. It analyzes every pixel, identifying it as basalt, clay, or still unknown, and assigning a confidence level to its classification. Then it decides where to go next. (Zoë doesn't actually distinguish between "basalt" and "clay"; it knows only that they are two different classes of rocks, with different spectral signatures. It needs a geologist to decide what types of rock they are.)
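Concretely, one can picture the map as a grid with a class label and a confidence value in every cell, revised each second as new images arrive. The sketch below is a hypothetical version of that bookkeeping; the grid, labels, and update rule are assumptions for illustration, not the mission code.

```python
import numpy as np

UNKNOWN, CLASS_A, CLASS_B = 0, 1, 2

class TerrainMap:
    """A minimal per-cell map: one class label and one confidence value per grid cell."""

    def __init__(self, rows, cols):
        self.labels = np.full((rows, cols), UNKNOWN, dtype=np.int8)
        self.confidence = np.zeros((rows, cols))

    def update(self, cell, label, confidence):
        """Overwrite a cell only if the new observation is more confident."""
        r, c = cell
        if confidence > self.confidence[r, c]:
            self.labels[r, c] = label
            self.confidence[r, c] = confidence

    def most_uncertain_cell(self):
        """The cell the map knows least about -- a natural candidate for the next target."""
        return np.unravel_index(np.argmin(self.confidence), self.confidence.shape)
```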

The hard part is choosing the most effective path, the one that will yield the greatest amount of information. Ideally, Zoë wants to go directly to the place it knows least about. But there are obstacles. For one thing, the most ambiguous part of the map may be far away, so the robot needs to figure out how to do useful science on the way there.

CMU roboticist David Wettergreen explained the problem by analogy to a football field. Imagine, he said, you had just stepped onto one corner of the field and you wanted to know as much as possible about the entire field. "If you were unconstrained by time and space, you might hop to the far corner. And then, having seen that, you might hop to the 50-yard-line. But that's not a practical way to explore. I might know the least about that far corner, but I have to take one step. So how to take that one step is the problem that we're trying to solve here with the robot."
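That one-step problem can be caricatured as a greedy search: score each neighboring cell by how much still-uncertain terrain lies in its direction, then take the single step with the best score. The sketch below is a toy version of that idea, with an invented scoring rule, not the planner the CMU team actually fielded.

```python
import numpy as np

def choose_next_step(position, confidence_map, lookahead=10):
    """Pick the adjacent cell whose direction points toward the most unknown terrain.

    The score for each direction is the summed uncertainty (1 - confidence) along a
    short ray of `lookahead` cells, so a step can be useful even when the least-known
    region is far away.
    """
    rows, cols = confidence_map.shape
    best_step, best_score = None, -np.inf
    for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
        r, c = position[0] + dr, position[1] + dc
        if not (0 <= r < rows and 0 <= c < cols):
            continue
        score = 0.0
        for step in range(lookahead):
            rr, cc = r + dr * step, c + dc * step
            if 0 <= rr < rows and 0 <= cc < cols:
                score += 1.0 - confidence_map[rr, cc]
        if score > best_score:
            best_step, best_score = (r, c), score
    return best_step
```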

Zoë also has to avoid obstacles, such as steep hills — and bushes. When the CMU team scouted the Amboy Crater area, there were some creosote bushes scattered about, but they were dry and leafless — from Zoë's point of view, too insubstantial to pay attention to. Then it rained, and the bushes leafed out, making them more formidable obstacles, which Zoë had to be taught to drive around.

Wettergreen is pleased with the success of the Science on the Fly field test. In an email after the field work wrapped up, he wrote that, though a great deal of development remained to be done before software like Zoë's could be used to guide a rover on Mars, the "system demonstrated many of the 'common sense' intuitions that field geologists take for granted: spatial awareness, the ability to correlate surface and orbital views, [and] a tendency to seek novel data."

One aspect of Zoë's software, however, is closer to being ready for prime time: the use of far-field sensing — looking off into the distance, rather than looking just a few feet ahead — to help figure out how to avoid obstacles. Far-field sensing may seem like a no-brainer to us — we do it all the time — but it actually involves quite a bit of brainpower. That's what makes it challenging for a robot. Nevertheless, says Wettergreen, he and his colleagues now "have proven the concept." What remains is "to test, test, test."

The Science on the Fly project was funded by NASA's ASTEP (Astrobiology Science and Technology for Exploring Planets) program.
