SBIR-STTR Award

Visual Relative Navigation
Award last edited on: 3/17/2021

Sponsored Program
STTR
Awarding Agency
DOD : DARPA
Total Award Amount
$725,000
Award Phase
2
Solicitation Topic Code
ST18C-006
Principal Investigator
Timothy E Fair

Company Information

Toyon Research Corporation

6800 Cortona Drive
Goleta, CA 93117
   (805) 968-6787
   toyoninfo@toyon.com
   www.toyon.com

Research Institution

California Institute of Technology

Phase I

Contract Number: W911NF19C0030
Start Date: 3/28/2019    Completed: 1/27/2020
Phase I year
2019
Phase I Amount
$225,000
As unmanned aircraft systems (UAS) become more prevalent, there is an increasing desire to automate UAS navigation and control. To perform a wider variety of missions, future UASs must be capable of autonomous relative navigation. Current technologies rely heavily on GPS measurements, which are undesirable because GPS signals may be unavailable in many DoD applications. Active sensing technologies are also undesirable due to their increased size, weight, and power (SWaP) and the need to limit emissions and communications for covert operations. Therefore, a system is needed to provide visual relative navigation (VRN). Toyon and Caltech propose developing a Visual Relative Navigation via Intelligent Ephemeral Relationships (VRNIER) system consisting of passive optical sensors and the processing needed to convert the resulting images into accurate six-degree-of-freedom relative estimates. The approach combines computer vision and deep learning for estimating relative platform relationships with online integrity modeling to enable automated VRN. All of the key algorithms will be developed and demonstrated with simulated and surrogate data in Phase I, creating a low-risk path toward demonstrating a prototype system in Phase II. The work leverages Toyon’s extensive history in automated image processing for navigation and other applications.
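The VRNIER algorithms themselves are not public, but the abstract's core idea, recovering a six-degree-of-freedom relative pose from passive imagery alone, rests on classical epipolar geometry: the essential matrix E = [t]×R between two camera views encodes the relative rotation R and the translation direction t (scale is unobservable from bearings alone). A minimal sketch of that recovery step, using only NumPy and hypothetical pose values for illustration:

```python
# Illustrative sketch only -- not the VRNIER system. Shows the textbook
# SVD decomposition of an essential matrix E = [t]_x R into relative
# rotation candidates and a translation direction (up to sign and scale).
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def decompose_essential(E):
    """Return the four (R, t) candidates for an essential matrix.

    Standard recipe: E = U diag(1,1,0) V^T, with R in {U W V^T, U W^T V^T}
    and t = +-U[:, 2]; the correct candidate is normally selected by a
    cheirality (points-in-front-of-both-cameras) check, omitted here.
    """
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations (determinant +1) on both factors.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    R1, R2 = U @ W @ Vt, U @ W.T @ Vt
    t = U[:, 2]
    return [(R1, t), (R1, -t), (R2, t), (R2, -t)]

# Hypothetical ground-truth relative pose between two camera frames.
angle = np.deg2rad(10.0)
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle), np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, 0.2, 0.0])
t_true /= np.linalg.norm(t_true)

E = skew(t_true) @ R_true
candidates = decompose_essential(E)
best_R_err = min(np.linalg.norm(R - R_true) for R, _ in candidates)
```

In practice E is estimated from noisy feature correspondences (e.g. a five-point solver inside RANSAC), which is where the abstract's learned components and online integrity modeling would plausibly improve robustness over this geometric core.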

Phase II

Contract Number: W912CG-20-C-0016
Start Date: 4/22/2020    Completed: 4/27/2021
Phase II year
2020
Phase II Amount
$500,000
As unmanned aircraft systems (UAS) become more prevalent, there is an increasing desire to automate UAS navigation and control. To perform a wider variety of missions, future UASs must be capable of autonomous relative navigation. Current technologies rely heavily on GPS measurements, which are undesirable because GPS signals may be unavailable in many DoD applications. Active sensing technologies are also undesirable due to their increased size, weight, and power (SWaP) and the need to limit emissions and communications for covert operations. Therefore, a system is needed to provide visual relative navigation (VRN). Toyon and Caltech propose developing a Visual Relative Navigation via Intelligent Ephemeral Relationships (VRNIER) system consisting of passive optical sensors and the processing needed to convert the resulting images into accurate six-degree-of-freedom relative estimates. The approach combines computer vision and deep learning for estimating relative platform relationships with online integrity modeling to enable automated VRN. All of the key algorithms were developed and demonstrated with simulated and surrogate data in Phase I, creating a low-risk path toward demonstrating a prototype system in Phase II. The work leverages Toyon’s extensive history in automated image processing for navigation and other applications.