SBIR-STTR Award

Real-time Hazard Detection via Deep Learning
Award last edited on: 1/23/2023

Sponsored Program
SBIR
Awarding Agency
NASA : GSFC
Total Award Amount
$874,900
Award Phase
2
Solicitation Topic Code
H9.03
Principal Investigator
Kori Macdonald

Company Information

Astrobotic Technology Inc

1016 North Lincoln Avenue
Pittsburgh, PA 15233
   (412) 682-3282
   contact@astrobotic.com
   www.astrobotic.com
Location: Single
Congr. District: 18
County: Allegheny

Phase I

Contract Number: 80NSSC21C0093
Start Date: 5/12/2021    Completed: 11/19/2021
Phase I year
2021
Phase I Amount
$124,996
On-board hazard detection is critical to the success of landed missions, as available orbiter data does not capture the lunar terrain at a resolution that enables identification of potentially mission-threatening rocks and craters at the centimeter scale. Current state-of-the-art hazard detection technologies typically use LiDAR data to address low and variable illumination conditions during landing operations; however, including image data can produce a hazard detection solution that updates more frequently and at higher resolution. The proposed work applies a deep learning approach to this problem, as the highly parallelizable nature of learning-based computations naturally extends to hardware acceleration, providing the additional computational power needed to compute and combine hazard maps across both LiDAR and camera data. The output of this development will be a demonstration of the feasibility and performance of a deep learning-based hazard detection system that leverages both LiDAR and image data to achieve mission-speed performance on path-to-flight hardware. The proposing team is currently developing a LiDAR-based hazard detection module for Astrobotic’s Griffin Mission One, which will deliver NASA’s VIPER rover to the lunar south pole, planned for late 2023. Techniques developed in the proposed work will benefit from the V&V infrastructure developed for this and future missions. Additionally, Astrobotic will leverage the LunaRay Suite, which can generate and verify accurate terrain data, including terrain models, photometrically accurate image data, and simulated LiDAR data at input locations, times, and viewing positions. As such, a large and widely varied training dataset will be produced, enabling the training of a robust network. By providing a robustly trained solution on relevant hardware, the proposing team seeks to drive forward the market of applied deep learning technologies in the space industry.
Potential NASA Applications (Limit 1500 characters, approximately 150 words): As landing precision requirements continue to grow with increasingly complex mission scenarios, customers will look to a flexible solution that uses as much data as possible to produce an accurate result. Astrobotic’s own participation in NASA’s CLPS program will provide an internal customer, enabling demonstration of this technology on a landed mission. With flight heritage and demonstrated successes, this system will be a strong sensor option for future missions through the CLPS and Artemis programs.

Potential Non-NASA Applications (Limit 1500 characters, approximately 150 words): The ability of an airborne system to track objects in real time may interest the DOD for gathering intelligence and ensuring troop safety in uncertain environments. The DOD may also be interested in a hazard detection system for missions landing in uncertain areas. Hardware acceleration for deep learning would find a host of applications, such as in the autonomous vehicle sector.

Duration: 6

Phase II

Contract Number: 80NSSC22CA108
Start Date: 6/13/2022    Completed: 6/12/2024
Phase II year
2022
Phase II Amount
$749,904
HazNet is a robust hazard detection solution that leverages deep learning and hardware acceleration to achieve mission-speed performance on path-to-flight hardware. The HazNet solution seeks to maximize data use while maintaining flexibility by leveraging the independent strengths of LiDAR and camera data to produce a single hazard map. Flexibility is maintained by using two independent convolutional neural networks, one for LiDAR data and one for image data, whose outputs are combined into an existing hazard map to improve knowledge, resolve unknown regions, and increase hazard map resolution. This method de-risks the transition from traditional hazard detection to deep learning-based algorithms by using well-proven rather than experimental models to identify hazards, and it improves upon traditional methods by exploiting the strengths of complementary sensors, enabled by hardware acceleration. Astrobotic proposes the development of a prototype sensor package for HazNet while further advancing its hazard detection models and techniques. This Phase II effort will entail five major tasks: working with NASA and Astrobotic stakeholders to develop a reference mission and associated requirements; advancing the Phase I models to incorporate uncertainty and combine hazard map outputs; developing and testing a custom sensor package; demonstrating the system in a series of relevant simulations; and performing a final technology demonstration across a descent-like scenario in a lunar-relevant environment.

Potential NASA Applications (Limit 1500 characters, approximately 150 words): As NASA targets increasingly complex and challenging landing scenarios on the Moon, asteroids, Mars, icy moons, and beyond, the Agency and its commercial contractors will be looking for flexible systems like HazNet that use as much data as possible and as little hardware as possible to produce accurate landing solutions. HazNet will be a valuable tool not only for future CLPS missions but also for NASA’s forthcoming Artemis landings, which also target rugged polar sites.

Potential Non-NASA Applications (Limit 1500 characters, approximately 150 words): HazNet’s real-time map generation would help the Department of Defense (DOD) identify and track hazards in real time. It would also benefit the Department of the Interior (DOI) and the Department of Agriculture (USDA), as terrain models aid in understanding agricultural needs, the geological impacts of climate change, and forests, oceans, and remote regions.

Duration: 24
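The Phase II abstract describes combining the outputs of two per-sensor networks into an existing hazard map, filling regions that one sensor cannot see. As an illustration only, that fusion step could be sketched as a confidence-weighted average; every name and detail below is a hypothetical assumption for explanation, not Astrobotic's actual implementation.

```python
# Hypothetical sketch of HazNet-style hazard-map fusion. Two per-sensor hazard
# maps (one from a LiDAR network, one from an image network) are merged with a
# prior map by confidence-weighted averaging, so that regions unknown to one
# sensor (marked NaN) are resolved by the other. All names are illustrative.
import numpy as np

def fuse_hazard_maps(prior, lidar_map, image_map, lidar_conf, image_conf):
    """Each map is an HxW array of hazard probabilities in [0, 1].
    *_conf holds per-pixel confidences; NaN map entries get zero weight."""
    maps = np.stack([prior, lidar_map, image_map])
    weights = np.stack([np.full(prior.shape, 0.5), lidar_conf, image_conf])
    weights = np.where(np.isnan(maps), 0.0, weights)  # ignore unknown pixels
    maps = np.nan_to_num(maps)                        # NaN -> 0 (zero-weighted)
    total = weights.sum(axis=0)
    return (weights * maps).sum(axis=0) / np.maximum(total, 1e-9)
```

In this sketch, a pixel the LiDAR network could not observe is filled from the prior and the image network alone, which matches the abstract's description of resolving unknown regions while increasing map resolution.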