SBIR-STTR Award

Smart Interaction Device for VTOL UAS
Award last edited on: 5/20/2019

Sponsored Program
SBIR
Awarding Agency
DOD : Navy
Total Award Amount
$895,457
Award Phase
2
Solicitation Topic Code
N111-070
Principal Investigator
Glenn Taylor

Company Information

Soar Technology Inc (AKA: SoarTech)

3600 Green Court Suite 600
Ann Arbor, MI 48105
   (734) 627-8072
   info@soartech.com
   www.soartech.com
Location: Multiple
Congr. District: 12
County: Washtenaw

Phase I

Contract Number: N00014-11-M-0228
Start Date: 5/9/2011    Completed: 9/15/2012
Phase I year
2011
Phase I Amount
$149,851
Current Unmanned Air Systems (UAS) require a great deal of specialized training and experience to use effectively. Furthermore, their use is dominated by a paradigm of control in which a highly trained operator gives very precise, low-level commands to a UAS. As UAS become more autonomous, they will require less direct control but more input from a range of users, including high-level direction and other kinds of mission-relevant information. However, current operator control units (OCUs) remain stuck in the paradigm of explicit control. To take advantage of increasing autonomy and to broaden the range of users, new kinds of robust, natural interfaces, including multiple modes of input and output, must be developed that allow users with various skill levels to interact with UAS in a variety of environments. SoarTech proposes to leverage prior work on multi-modal interfaces for interacting with autonomous systems. We will extend this work to the VTOL UAS domain, making it robust to a range of user skill levels and experience, including multiple redundant forms of input and output appropriate to the range of environments in which forward-deployed units need to interact with autonomous UAS.

Benefit:
A system for robust, natural interaction with autonomous UAS platforms has the potential to revolutionize how UAS are deployed and used. Current UAS require specialized training and a great deal of experience to use effectively, and the operator must learn to conform to the UAS's particular language for control. Instead, the system should conform to how the user wants to (or is able to) interact, in natural ways that may include multiple forms of input and output. Our proposed Smart Interaction Device (SID) will help make UAS more accessible to a range of users, including enlisted warfighters with limited training. Furthermore, SID can be used remotely (e.g., at a combat outpost) or onboard a UAS, where a warfighter acts as an onboard teammate interacting from within the platform. Beyond UAS applications in resupplying or rescuing forward-deployed units, SID can provide the basis for robust, natural interaction with a wide variety of unmanned vehicles across a variety of domains. As a knowledge-based system, SID has a core set of capabilities for robust interaction but is agnostic as to the particular vehicle it is controlling. System-specific plug-ins make it possible for SID to interact with any kind of UxV, or even with simulated versions of those platforms, as sketched below. By making interaction with a single UxV easier, it will be possible for users to interact with multiple UxVs simultaneously, as if they were teammates rather than remote-controlled vehicles that need constant attention. SID could also be adapted to interaction with a range of autonomous systems, such as computer-driven avatars in simulation-based training or commercial gaming environments.
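
To make the plug-in idea concrete, the following is a minimal illustrative sketch (in Python) of a vehicle-agnostic core talking to platform-specific adapters. All names here (UxVAdapter, SimulatedVTOLAdapter, send_waypoint) are hypothetical assumptions for illustration, not SoarTech's actual SID implementation; the sketch only shows the general pattern the benefit statement describes.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Optional


@dataclass
class Waypoint:
    lat: float
    lon: float
    alt_m: float


class UxVAdapter(ABC):
    """Plug-in boundary: the core issues platform-neutral commands,
    and each adapter translates them into one vehicle's native protocol.
    (Hypothetical interface, not SID's real API.)"""

    @abstractmethod
    def send_waypoint(self, wp: Waypoint) -> None: ...

    @abstractmethod
    def report_status(self) -> str: ...


class SimulatedVTOLAdapter(UxVAdapter):
    """Adapter for a simulated platform; an adapter for a real UAS
    would speak that vehicle's datalink protocol instead."""

    def __init__(self) -> None:
        self._last_wp: Optional[Waypoint] = None

    def send_waypoint(self, wp: Waypoint) -> None:
        self._last_wp = wp
        print(f"[sim] flying to ({wp.lat:.4f}, {wp.lon:.4f}) at {wp.alt_m} m")

    def report_status(self) -> str:
        return "en route" if self._last_wp else "idle"


# The interaction core stays unchanged whichever adapter is plugged in.
vehicle: UxVAdapter = SimulatedVTOLAdapter()
vehicle.send_waypoint(Waypoint(42.2808, -83.7430, 120.0))
print(vehicle.report_status())
```

The design point is that the interaction core depends only on the abstract adapter interface, so supporting a new vehicle, or a simulated stand-in, means writing one new plug-in rather than changing the core.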

Keywords:
Intelligent User Interface, Autonomous UAS, Natural Dialogue, Robust Interaction

Phase II

Contract Number: N00014-12-C-0362
Start Date: 7/18/2012    Completed: 1/31/2014
Phase II year
2012
Phase II Amount
$745,606
Current Unmanned Air Systems (UAS) require a great deal of specialized training and experience to use effectively. Furthermore, their use is dominated by a paradigm of control in which a highly trained operator gives very precise, low-level commands to a UAS. As UAS become more autonomous, they will require less direct control but more input from a range of users, including high-level direction and other kinds of mission-relevant information. However, current operator control units (OCUs) remain stuck in the paradigm of explicit control. To take advantage of increasing autonomy and to broaden access to a wider range of users, new kinds of robust, natural interfaces, including multiple modes of input and output, must be developed that allow users with various skill levels to interact with UAS in a variety of environments. SoarTech proposes to leverage our prior work on multi-modal interfaces for interacting with autonomous systems. We will extend this work to the VTOL UAS domain, making it robust to a range of user skill levels and experience, including multiple redundant forms of input and output appropriate to the range of environments in which forward-deployed units need to interact with autonomous UAS.

Benefit:
A system for robust, natural interaction with autonomous UAS platforms has the potential to revolutionize how UAS are deployed and used. Current UAS require specialized training and a great deal of experience to use effectively, and the operator must learn to conform to the UAS's particular language for control. Instead, the system should conform to how the user wants to (or is able to) interact, in natural ways that may include multiple forms of input and output. Our proposed Smart Interaction Device (SID) will help make UAS more accessible to a range of users, including enlisted warfighters with limited training. SID can be used remotely (e.g., at a combat outpost) or onboard a UAS, where a warfighter acts as an onboard teammate interacting from within the platform. Beyond UAS applications in resupplying or rescuing forward-deployed units, SID can provide the basis for robust, natural interaction with a wide variety of unmanned vehicles across a variety of domains. As a knowledge-based system, SID has a core set of capabilities for robust interaction but is agnostic both as to the particular vehicle it is controlling and as to the particular input devices, as sketched below. System-specific plug-ins make it possible for SID to interact with any kind of UxV, or even with simulated versions of those platforms. By making interaction with a single UxV easier, it will be possible for users to interact with multiple UxVs simultaneously, as if they were teammates rather than remote-controlled vehicles that need constant attention. SID could also be adapted to interaction with a range of autonomous systems, such as computer-driven avatars in simulation-based training or commercial gaming environments.
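
The Phase II emphasis on input-device agnosticism can be illustrated the same way: each modality (speech, typed text, gesture, and so on) maps raw input to a shared intent representation that the dialogue core consumes uniformly. This is a hypothetical Python sketch; the Intent structure, the channel functions, and the toy keyword matching are illustrative assumptions, not SID's actual design.

```python
from dataclasses import dataclass, field
from typing import Dict


@dataclass
class Intent:
    action: str                         # e.g. "goto", "report"
    args: Dict[str, str] = field(default_factory=dict)
    source: str = "unknown"             # which modality produced it


def speech_channel(utterance: str) -> Intent:
    """Stand-in for a real speech recognizer and parser."""
    action = "goto" if "go to" in utterance.lower() else "report"
    return Intent(action=action, args={"raw": utterance}, source="speech")


def text_channel(command: str) -> Intent:
    """Typed commands, e.g. 'goto LZ-2' -> action plus argument."""
    verb, _, rest = command.partition(" ")
    return Intent(action=verb, args={"target": rest}, source="text")


def dispatch(intent: Intent) -> None:
    """The core handles every Intent the same way, regardless of source."""
    print(f"{intent.source} -> {intent.action} {intent.args}")


for intent in (speech_channel("Go to the landing zone"),
               text_channel("goto LZ-2")):
    dispatch(intent)
```

Because the core only ever sees Intent objects, redundant modalities could be added or swapped per environment (e.g., speech at a quiet outpost, typed commands where speech is impractical) without touching the mission logic, which is one way to read the abstract's "multiple redundant forms of input and output."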

Keywords:
Autonomous UAS, multi-modal interaction, Robust Interaction, Intelligent User Interface, Natural Dialogue