Wed. Nov 30th, 2022
Block Island, the largest meteorite ever found on Mars and one of many identified by the Mars Exploration Rovers.

NASA’s Opportunity Mars rover accomplished many great things in its more than a decade of service, but it initially rolled 600 feet past one of the mission’s greatest discoveries: the Block Island meteorite. About 67 centimeters across, the meteorite was a telltale sign that the Martian atmosphere had once been much thicker, thick enough to slow a rock flying at a dizzying 2 km/s so that it didn’t disintegrate on impact. A thicker atmosphere could mean a milder climate, possibly one able to support liquid water on the surface, and maybe even life.

Still, we only know about the Block Island meteorite because someone on Opportunity’s science team manually spotted an unusual shape in low-resolution thumbnails of the images and decided it was worth backtracking a few days’ drive to investigate further. Instead of the rover purposefully heading for the rock from the start, the team barely saw its greatest triumph in the rear-view mirror. “It was almost a miss,” said Mark Woods, head of autonomy and robotics at SciSys, a space exploration IT solutions company that works for the European Space Agency (ESA) among others.

Opportunity, of course, made this near-miss maneuver all the way back in July 2009. If NASA were to attempt a similar mission today, as the agency plans to do with its Mars 2020 rover (the ESA has similar ambitions with its ExoMars rover), modern scientists would have a particularly notable advantage that has emerged since then.

“The rover lacked intelligence,” Woods says bluntly. “It could drive right past a Martian, and we wouldn’t even know it.”


AEGIS and OASIS

The discovery of Block Island was nearly missed because, by 2009, the software and hardware responsible for a rover’s mobility had advanced much faster than its communication bandwidth. The Sojourner rover traveled about 100 meters over its roughly three-month mission in 1997; Opportunity, by contrast, set a record by covering about 220 meters in a single Martian day. Unfortunately, the amount of data both rovers could send back to Earth remained about the same. The more ground the rovers could cover, the more science they were likely to miss along the way.

Therefore, not long after Sojourner, the team at NASA’s Jet Propulsion Laboratory began thinking about making rover software autonomous to some extent. The idea was to design algorithms that would recognize interesting phenomena in the rover’s surroundings as it drove and either notify the science team on Earth to request instructions or investigate those phenomena right away. At the time, that was quite a challenge given the limited computing power the rovers could muster: for Opportunity, the JPL team had to figure out how to run advanced artificial-intelligence software on the rover’s BAE RAD6000 processor, clocked at just 25 MHz.

“[It was] much slower than a typical smartphone. We’re talking about pre-Pentium processor performance,” Woods told Ars. Still, JPL made it happen. After years of development, the Autonomous Exploration for Gathering Enhanced Science (AEGIS) software was successfully uploaded to the Opportunity rover in 2010. Its successor, Curiosity, received the AEGIS update a few years later.

AEGIS is a relatively simple system. At its most basic level, the goal is to point a camera at an interesting rock and take measurements. An algorithm called Rockster finds rocks in images by looking for shapes predefined by the science team on the ground. The team turns AEGIS on and off remotely, depending on how much energy the rover has left and how computationally intensive its other tasks are.
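Rockster itself is an edge- and contour-based detector running on flight hardware; as a loose illustration of the general idea only (not JPL’s actual algorithm), the sketch below flood-fills connected bright regions in a tiny grayscale frame and keeps only blobs whose size falls within a band the ground team could preconfigure. All thresholds and the toy image are invented for illustration.

```python
from collections import deque

def find_rock_candidates(image, threshold=128, min_area=3, max_area=50):
    """Label connected bright regions and keep those in a plausible size band.

    image: 2D list of grayscale values (0-255). Returns a list of
    (area, pixel_list) tuples, one per candidate "rock".
    """
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    candidates = []
    for r in range(rows):
        for c in range(cols):
            if seen[r][c] or image[r][c] < threshold:
                continue
            # Flood-fill one connected bright blob (4-connectivity).
            blob, queue = [], deque([(r, c)])
            seen[r][c] = True
            while queue:
                y, x = queue.popleft()
                blob.append((y, x))
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and not seen[ny][nx] and image[ny][nx] >= threshold):
                        seen[ny][nx] = True
                        queue.append((ny, nx))
            if min_area <= len(blob) <= max_area:
                candidates.append((len(blob), blob))
    return candidates

# Toy "image": one three-pixel bright blob and one isolated bright pixel.
frame = [
    [0,   0,   0,   0,   0],
    [0, 200, 210,   0,   0],
    [0, 205,   0,   0, 255],
    [0,   0,   0,   0,   0],
]
print([area for area, _ in find_rock_candidates(frame)])  # → [3]
```

The single-pixel blob is rejected by `min_area`, mirroring how a shape filter lets the ground team tune out noise and pebbles before the rover spends energy on follow-up imaging.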

This software is still in use today and has proven effective enough to be included in NASA’s upcoming Mars 2020 rover. But as it works now, AEGIS is primarily a time-saving tool. There is normally about a 20-minute one-way communication delay between Earth and Mars. With AEGIS disabled, if a team on Earth commands the rover to advance a few meters, the command takes 20 minutes to arrive. Once the rover has moved those few meters, it sends images of its surroundings back to Earth: another 20 minutes. When the images reach Earth, the science team looks for promising targets and sends commands back to the rover to take high-resolution images. Twenty minutes more. It all seems painfully slow, because it is. But with AEGIS enabled, the rover analyzes rocks itself on reaching its destination, takes pictures, and immediately sends them back to Earth. Each time that happens, it saves about an hour. Combined with automatic obstacle-avoidance systems, that’s about all the autonomy planetary rovers have today.
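The bookkeeping in that loop is simple enough to sketch. Counting only the light-travel legs (and assuming the article’s rough 20-minute one-way delay), a ground-in-the-loop target needs four legs before high-resolution images arrive, while AEGIS needs two; ground-team turnaround between legs pushes the real-world savings toward the hour cited above.

```python
ONE_WAY_DELAY_MIN = 20  # approximate one-way Earth-Mars light delay

# Ground-in-the-loop: drive command up, thumbnail images down,
# follow-up imaging command up, high-res images down.
ground_loop_legs = 4

# With AEGIS: drive command up, rover self-selects targets,
# high-res images come straight down.
aegis_legs = 2

saved = (ground_loop_legs - aegis_legs) * ONE_WAY_DELAY_MIN
print(f"Light-delay time saved per target: {saved} minutes")  # → 40 minutes
```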

But future planetary rovers are about to get much smarter. AEGIS was originally designed as part of a more advanced system called OASIS (Onboard Autonomous Science Investigation System). In OASIS, the intelligence comes from the interplay between two main modules: an automatic science-gathering component and a scheduling system called CASPER (Continuous Activity Scheduling Planning Execution and Replanning), which is also responsible for allocating available resources among all the tasks on the agenda. This autonomous ecosystem all starts with an image.

“When building rovers’ AI, we focused primarily on vision,” Woods says. “When you think about energy budget, it’s the cheapest way to collect data, because rovers use their cameras all the time for navigation.” Each image goes first to the feature-detection component. Algorithms recognize the horizon line, then proceed to identify rocks on the ground (AEGIS handles that) and clouds in the sky. If rocks or clouds have an unusual size or shape, the system marks them as promising and plans additional data-collection steps, such as driving closer to the target or using more precise instruments. Those steps go into a planning module that estimates how much time and energy it would take to complete them. If a target seems genuinely novel, CASPER is more likely to pursue it; if not, the planner kills the task and moves on to more important goals.
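CASPER’s real planner is far more sophisticated (it continuously replans as conditions change), but the resource-weighing it performs can be caricatured as a greedy selection: rank candidate observations by novelty, then admit each one only while the energy and time budgets hold out. Every name and number below is illustrative, not from the flight system.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    name: str
    novelty: float      # score from the feature-detection stage
    energy_wh: float    # estimated energy cost, watt-hours
    minutes: float      # estimated time cost

def plan(candidates, energy_budget_wh, time_budget_min):
    """Greedy sketch of resource-aware scheduling: most novel first,
    skipping anything the remaining budgets cannot cover."""
    schedule = []
    for obs in sorted(candidates, key=lambda o: o.novelty, reverse=True):
        if obs.energy_wh <= energy_budget_wh and obs.minutes <= time_budget_min:
            schedule.append(obs.name)
            energy_budget_wh -= obs.energy_wh
            time_budget_min -= obs.minutes
    return schedule

targets = [
    Observation("odd-shaped rock", novelty=0.9, energy_wh=12, minutes=30),
    Observation("routine outcrop", novelty=0.2, energy_wh=5, minutes=10),
    Observation("unusual cloud", novelty=0.7, energy_wh=20, minutes=45),
]
print(plan(targets, energy_budget_wh=25, time_budget_min=60))
# → ['odd-shaped rock', 'routine outcrop']
```

The highly novel rock is scheduled first; the cloud, though interesting, no longer fits the remaining energy budget and is dropped, exactly the kind of trade-off the article describes CASPER making.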
