SBIR-STTR Award

Advanced Bayesian Methods for Lunar Surface Navigation
Award last edited on: 7/10/2020

Sponsored Program
SBIR
Awarding Agency
NASA : GRC
Total Award Amount
$699,936
Award Phase
2
Solicitation Topic Code
O4.03
Principal Investigator
Julian Center

Company Information

Autonomous Exploration Inc

385 High Plain Road
Andover, MA 01810
   (978) 683-0290
   jcenter@ieee.org
   www.autonomous-exploration.com
Location: Single
Congr. District: 03
County: Essex

Phase I

Contract Number: ----------
Start Date: ----    Completed: ----
Phase I year
2010
Phase I Amount
$99,937
The key innovation of this project will be the application of advanced Bayesian methods to integrate real-time dense stereo vision and high-speed optical flow with an Inertial Measurement Unit (IMU) to produce a highly accurate planetary rover navigation system. The software developed in this project will leverage current computing technology to implement advanced Visual Odometry (VO) methods that will accurately track much faster rover movements. Our fully Bayesian approach to VO will utilize more information from the images than previous methods are capable of using. Our Bayesian VO does not explicitly select features to track. Instead, it implicitly determines what can be learned from each image pixel and weights the information accordingly. This means that our approach can work with images that have no distinct corners, which can be a significant advantage with low-contrast images from permanently shadowed areas. We expect that the error characteristics of the visual processing will be complementary to the error characteristics of a low-cost IMU. Therefore, the combination of the two should provide highly accurate navigation.
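The per-pixel weighting idea described above can be illustrated with a toy example. The sketch below estimates a global image shift by accumulating a brightness-constancy constraint from every pixel, weighted by an assumed noise variance, rather than selecting corner features; the function name, image size, and noise value are illustrative assumptions, not the awardee's actual software.

```python
import numpy as np

def estimate_shift(img1, img2, noise_var=1e-2):
    """Estimate a global (dx, dy) shift between two images.

    Every pixel contributes the constraint gx*dx + gy*dy = -gt,
    weighted by the inverse of an assumed noise variance -- no
    explicit feature selection. Pixels on edges (not just corners)
    still add information along their gradient direction.
    """
    gy, gx = np.gradient(img1)          # spatial gradients at every pixel
    gt = img2 - img1                    # temporal gradient
    w = 1.0 / noise_var
    # Accumulate the weighted normal equations over all pixels.
    A = w * np.array([[np.sum(gx * gx), np.sum(gx * gy)],
                      [np.sum(gx * gy), np.sum(gy * gy)]])
    b = -w * np.array([np.sum(gx * gt), np.sum(gy * gt)])
    return np.linalg.solve(A, b)

# A smooth Gaussian blob has no corners at all, yet the shift is recovered.
y, x = np.mgrid[0:32, 0:32].astype(float)
blob = lambda cx, cy: np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / 20.0)
dx, dy = estimate_shift(blob(15.0, 15.0), blob(15.2, 15.1))
```

Because the constraints are pooled over the whole image, a featureless but textured scene still yields a well-conditioned estimate, which is the claimed advantage in permanently shadowed, low-contrast imagery.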

Phase II

Contract Number: ----------
Start Date: ----    Completed: ----
Phase II year
2011
Phase II Amount
$599,999
The key innovation of this project is the application of advanced Bayesian methods to integrate real-time dense stereo vision and high-speed optical flow with an Inertial Measurement Unit (IMU) to produce a highly accurate planetary rover navigation system. The software developed in this project leverages current computing technology to implement advanced Visual Odometry (VO) methods that will accurately track much faster rover movements. Our fully Bayesian approach to VO utilizes more information from the images than previous methods are capable of using. Our Bayesian VO does not explicitly select features to track. Instead, it implicitly determines what can be learned from each image pixel and weights the information accordingly. This means that our approach can work with images that have no distinct corners, which can be a significant advantage with low-contrast images from permanently shadowed areas. We have shown that the error characteristics of the visual processing are complementary to the error characteristics of a low-cost IMU. Therefore, the combination of the two can provide highly accurate navigation.
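The complementary-error claim can be shown with a toy scalar fusion example: IMU dead reckoning drifts without bound because of sensor bias, while vision-derived fixes are noisy but bounded, so a simple Kalman filter combining the two keeps the error small. All numbers (bias, noise levels, step count) are assumptions for illustration, not results from the project.

```python
import numpy as np

rng = np.random.default_rng(1)
dt, steps = 0.1, 300
true_pos = np.cumsum(np.full(steps, 1.0) * dt)   # rover moving at 1 m/s

# IMU-only dead reckoning: a 5 cm/s velocity bias -> drift grows with time.
imu_vel = 1.0 + 0.05
imu_pos = np.cumsum(np.full(steps, imu_vel) * dt)

# Scalar Kalman filter: propagate with the biased IMU increment,
# correct with a noisy vision-derived position fix each frame.
vo_sigma = 0.10                                  # bounded visual noise (m)
x, P = 0.0, 1.0                                  # state estimate, variance
Q, R = (0.05 * dt) ** 2, vo_sigma ** 2           # assumed noise levels
fused = []
for k in range(steps):
    x += imu_vel * dt                            # predict (IMU, biased)
    P += Q
    z = true_pos[k] + rng.normal(0.0, vo_sigma)  # visual measurement
    K = P / (P + R)                              # Kalman gain
    x += K * (z - x)                             # update
    P *= (1.0 - K)
    fused.append(x)

imu_err = abs(imu_pos[-1] - true_pos[-1])        # grows linearly with time
fused_err = abs(fused[-1] - true_pos[-1])        # stays near vo_sigma
```

After 30 seconds the IMU-only error has grown to 1.5 m, while the fused error remains at the decimeter level, illustrating why the two error characteristics are complementary.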