
Awards Registry

Visual Relative Navigation
Profile last edited on: 3/17/2021

Program
STTR
Agency
DARPA
Total Award Amount
$725,000
Award Phase
2
Principal Investigator
Timothy E Fair

Location Information

Toyon Research Corporation

6800 Cortona Drive
Goleta, CA 93117
   (805) 968-6787
   toyoninfo@toyon.com
   www.toyon.com
Multiple Locations:   
Congressional District:   24
County:   Santa Barbara

Phase I

Phase I year
2019
Phase I Amount
$225,000
As unmanned aircraft systems (UAS) become more prevalent, there is an increasing desire to automate UAS navigation and control. To perform a wider variety of missions, future UASs must be capable of autonomous relative navigation. Current technologies rely heavily on GPS measurements, which are undesirable since GPS signals may be unavailable in many DoD applications. Active sensing technologies are also undesirable due to their increased size, weight, and power (SWaP) and the desire to limit emissions and communications for covert operations. Therefore, a system is needed to provide visual relative navigation (VRN). Toyon and Caltech propose developing a Visual Relative Navigation via Intelligent Ephemeral Relationships (VRNIER) system consisting of passive optical sensors and the processing needed to convert their imagery into accurate six-degree-of-freedom relative estimates. The approach combines computer vision and deep learning for estimating relative platform relationships with online integrity modeling to enable automated VRN. All of the key algorithms will be developed and demonstrated with simulated and surrogate data in Phase I to create a low-risk path for demonstrating a prototype system in Phase II. The work leverages Toyon’s extensive history in automated image processing for navigation and other applications.
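
The abstract's "six-degree-of-freedom relative estimates" refer to recovering the full relative rotation and translation between platforms from passive imagery. As a rough illustration of the classical computer-vision building block involved (not the VRNIER pipeline itself, which is not public), the Python sketch below recovers relative pose between two calibrated views with OpenCV; the intrinsics matrix K is an assumed placeholder, and monocular translation is recoverable only up to scale.

    # Minimal two-view relative pose sketch (illustrative only; not VRNIER).
    import cv2
    import numpy as np

    K = np.array([[800.0, 0.0, 320.0],   # assumed pinhole intrinsics
                  [0.0, 800.0, 240.0],
                  [0.0,   0.0,   1.0]])

    def relative_pose(img1, img2):
        """Estimate rotation R and unit-scale translation t from img1 to img2."""
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        # Brute-force Hamming matching of binary ORB descriptors.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = matcher.match(des1, des2)

        pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

        # RANSAC rejects outlier matches; the essential matrix encodes the
        # epipolar geometry between the two calibrated views.
        E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                       prob=0.999, threshold=1.0)
        # Decompose E into rotation (3 DoF) and translation direction (3 DoF).
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
        return R, t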

Phase II

Phase II year
2020 (last award dollars: 2020)
Phase II Amount
$500,000
As unmanned aircraft systems (UAS) become more prevalent, there is an increasing desire to automate UAS navigation and control. To perform a wider variety of missions, future UASs must be capable of autonomous relative navigation. Current technologies rely heavily on GPS measurements, which are undesirable since GPS signals may be unavailable in many DoD applications. Active sensing technologies are also undesirable due to their increased size, weight, and power (SWaP) and the desire to limit emissions and communications for covert operations. Therefore, a system is needed to provide visual relative navigation (VRN). Toyon and Caltech propose developing a Visual Relative Navigation via Intelligent Ephemeral Relationships (VRNIER) system consisting of passive optical sensors and the processing needed to convert their imagery into accurate six-degree-of-freedom relative estimates. The approach combines computer vision and deep learning for estimating relative platform relationships with online integrity modeling to enable automated VRN. All of the key algorithms were developed and demonstrated with simulated and surrogate data in Phase I, creating a low-risk path for demonstrating a prototype system in Phase II. The work leverages Toyon’s extensive history in automated image processing for navigation and other applications.
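
The "online integrity modeling" mentioned above is, generically, a check that a visual measurement is statistically consistent before it is fused into the navigation solution. The sketch below shows one common generic form of such a check, a chi-square innovation gate; the residual, covariance, and threshold values are illustrative assumptions, not VRNIER parameters.

    # Generic residual-based integrity gate (illustrative only; not VRNIER).
    import numpy as np
    from scipy.stats import chi2

    def innovation_gate(residual, S, alpha=0.001):
        """Return True if the normalized innovation passes a chi-square gate.

        residual : innovation vector (measurement minus prediction)
        S        : innovation covariance
        alpha    : allowed false-alarm probability for the gate
        """
        d2 = float(residual.T @ np.linalg.inv(S) @ residual)  # squared Mahalanobis distance
        gate = chi2.ppf(1.0 - alpha, df=residual.size)
        return d2 <= gate  # True: measurement consistent, safe to fuse

    # Example: a 6-DoF pose residual checked before fusing into the navigator.
    r = np.array([0.02, -0.01, 0.03, 0.001, -0.002, 0.0005])
    S = np.diag([0.01] * 3 + [0.0004] * 3)
    print("consistent" if innovation_gate(r, S) else "reject measurement")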