A small computer vision lab on Mars

To avoid the delays associated with transferring data between Earth and Mars, NASA scientists have developed software that allows the rovers roaming Mars to analyze data on the fly. The Jet Propulsion Laboratory (JPL) in Pasadena, CA built a platform called the Onboard Autonomous Science Investigation System (OASIS). OASIS allows a rover to perform image and data analysis and react to the results without human intervention.

To supplement the efforts of OASIS, Autonomous Exploration for Gathering Increased Science (AEGIS) was uploaded to Opportunity in December 2009. Prior to the introduction of AEGIS, images of the Martian surface were transmitted to Earth, analyzed, and then translated into geological targets, which were uploaded back to the rover. The time spent on data transfer and the near-archaic processing speeds imposed by the limitations of space-qualified hardware meant that this entire process took days to accomplish. Often, by the time the operations team received the images, the rover had already moved far beyond the location. In short, there was no way to react in real time.

Since AEGIS was deployed, Opportunity has been able to perform onboard reasoning. The seven-step process of determining what may be “interesting” begins with AEGIS prioritizing targets within a wide-field image. Rocks within the field are evaluated against the target parameters sent to the rover by the craft’s sequencing team at JPL, using attributes such as brightness, shape, and size. Objects whose attributes are deemed “interesting” are inspected further, as are objects found in close proximity to them; a rough sketch of this kind of attribute-based ranking follows.
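To make the idea concrete, here is a minimal sketch of attribute-based target prioritization of the kind described above. The attribute names, weights, and scoring scheme are illustrative assumptions, not the actual AEGIS implementation.

```python
# Illustrative sketch: rank candidate rocks against uplinked target parameters.
# Attribute names, weights, and the scoring rule are assumptions for this example.
from dataclasses import dataclass

@dataclass
class Candidate:
    brightness: float  # mean pixel intensity, normalized to [0, 1]
    size: float        # apparent area in the image, normalized to [0, 1]
    elongation: float  # simple shape cue: major/minor axis ratio

def score(c: Candidate, prefs: dict) -> float:
    """Weighted distance to the operator-specified target profile (lower is better)."""
    return sum(p["weight"] * abs(getattr(c, name) - p["target"])
               for name, p in prefs.items())

# Target parameters of the sort a sequencing team might uplink (placeholder values).
prefs = {
    "brightness": {"target": 0.8, "weight": 1.0},
    "size":       {"target": 0.5, "weight": 0.5},
    "elongation": {"target": 1.0, "weight": 0.3},
}

candidates = [
    Candidate(brightness=0.75, size=0.40, elongation=1.2),
    Candidate(brightness=0.30, size=0.60, elongation=2.0),
]

# The best-scoring rock would be selected for follow-up imaging.
ranked = sorted(candidates, key=lambda c: score(c, prefs))
print(ranked[0])
```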

AEGIS collects 13-filter, panoramic color images of a range of terrains, including rock outcrops, crater ejecta, boulders, and cobbles. The color images reveal the physical, mineralogical, and photometric properties of the surface of the Red Planet and aid scientists and researchers in determining its geologic and aqueous history.
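One common way multi-filter imagery exposes mineralogical contrast is through band ratios between filters. The sketch below is only an illustration of that general idea; the bands and threshold are arbitrary placeholders, not the rover's actual filter set or science pipeline.

```python
# Toy band-ratio example on synthetic data; wavelengths and threshold are placeholders.
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for two co-registered filter images (e.g., a near-infrared and a visible band).
band_nir = rng.uniform(0.1, 1.0, size=(64, 64))
band_vis = rng.uniform(0.1, 1.0, size=(64, 64))

# Ratioing suppresses overall illumination differences and highlights spectral variation.
ratio = band_nir / np.clip(band_vis, 1e-6, None)
interesting = ratio > 1.5  # placeholder threshold flagging spectrally distinct pixels
print(f"{interesting.mean():.1%} of pixels flagged for closer inspection")
```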

Developing vision software for use in space applications has proven to be simpler than developing similar software for use on our home planet. The barren environment found on Mars reduces the distractions and variables that often impede computer vision. The supervisor of the computer vision group at JPL, Larry Matthies, notes that the planet is basically desert and devoid of weather; the only shadow that exists is the one cast by the rover itself. Even with this practically variable-free environment, research performed in support of the Mars missions has contributed to breakthroughs in stereo vision and visual odometry. Previously, extracting 3D information was considered very expensive and error-prone: it relied on high-contrast points in images, such as corners and edges, which led to sparse 3D reconstructions. Matthies has since developed a fast algorithm that produces dense, highly reliable 3D information from a large number of points within the image, and it is being used by the Curiosity rover.
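For readers unfamiliar with the sparse-versus-dense distinction, the following sketch uses OpenCV's semi-global block matcher to compute a disparity value at nearly every pixel of a rectified stereo pair, rather than only at a handful of corners. This is not JPL's onboard algorithm, and the file names and calibration numbers are made-up placeholders.

```python
# Dense stereo illustration with OpenCV SGBM; inputs and calibration are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)    # assumed rectified stereo pair
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,   # must be a multiple of 16
    blockSize=5,
)
# OpenCV returns disparities in fixed point with 4 fractional bits.
disparity = stereo.compute(left, right).astype(np.float32) / 16.0

# Depth from disparity: Z = f * B / d (focal length in pixels, baseline in meters).
focal_px, baseline_m = 700.0, 0.30   # placeholder calibration values
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = focal_px * baseline_m / disparity[valid]
print("median depth (m):", np.median(depth[valid]))
```

The practical payoff of a dense method is that almost every pixel contributes a range estimate, which is what makes per-pixel terrain maps for hazard avoidance and target selection feasible.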

Avi Valentine is a contributing writer for Computer Vision Online.

Photo credit: NASA

Via: ACM
