SBIR-STTR Award

Dynamic Robust Hand Model for Gesture Intent Recognition
Award last edited on: 6/19/2018

Sponsored Program
STTR
Awarding Agency
NSF
Total Award Amount
$1,316,828
Award Phase
2
Solicitation Topic Code
IT
Principal Investigator
Raja Jasti

Company Information

ZeroUI Inc

33 West San Carlos Street Suite 1080
San Jose, CA 95110
   (408) 863-0555
   info@zeroui.com
   www.zeroui.com

Research Institution

Purdue University

Phase I

Contract Number: 1549864
Start Date: 1/1/2016    Completed: 12/31/2016
Phase I year
2016
Phase I Amount
$225,000
The broader impact/commercial potential of this project stems from addressing the important hand-gesture-based input challenges of the VR and AR industries, which are expected to grow to $150B by 2020. Piper Jaffray identifies VR as the next mega trend, estimates the VR market to be worth more than $60B by 2025, and highlights new market opportunities for peripheral devices that bring hands and feet into VR. This technology, if successful in mitigating the high technical risks, represents a major leap in the state of the art in 3D hand models for gesture recognition and has the potential to become the industry standard for AR, VR, and 3D applications. Our company will commercialize the project by licensing this technology as a hand model SDK to AR/VR and 3D camera device makers and application developers, bringing highly interactive VR/AR and 3D gesture applications to gaming, entertainment, education, healthcare, design, architecture, and manufacturing.

This Small Business Technology Transfer Research (STTR) Phase I project develops a breakthrough innovation in 3D hand gesture intent recognition that works robustly across different 3D cameras, orientations, positions, and occlusions. It addresses a key challenge in gesture recognition while enabling natural spatial interactions for Virtual and Augmented Reality (VR/AR) and many other applications enabled by 3D depth cameras. It solves the following key challenges faced by existing academic and commercial hand models, and involves very high technical risks: 1) robustness under heavy occlusions; 2) invariance to viewpoint changes; 3) low computational training and tracking complexity; and 4) discrimination of frequent gesture/micro-gesture sequences. We will tackle these by developing a novel dynamic, robust hand-tracking model inspired by a machine learning technique that is not commonly used by the computer vision community. We will achieve this through the following objectives: 1) hand pose hypothesis generation using trained classifiers; 2) hand model fitting using joint matrix factorization and completion; and 3) a user study and evaluation of the hand model.
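The abstract names joint matrix factorization and completion as the hand-model-fitting technique but does not publish the algorithm itself. As a purely illustrative sketch, the Python snippet below shows one generic form of low-rank matrix completion (iterative truncated-SVD imputation) applied to a matrix of hand-joint coordinates with occluded entries; the function name, matrix shapes, and rank are hypothetical assumptions, not the project's actual method.

    import numpy as np

    def complete_joint_matrix(J, mask, rank=3, n_iters=200):
        """Fill occluded hand-joint coordinates by low-rank completion.

        J    : (n_frames, n_joints * 3) matrix of stacked 3D joint
               coordinates; entries where mask is False are unobserved.
        mask : boolean array, True where a coordinate was observed.
        rank : assumed rank of the underlying trajectory matrix (low,
               because joint motions are strongly correlated).
        """
        X = np.where(mask, J, 0.0)  # start with missing entries at zero
        for _ in range(n_iters):
            # Project onto the set of rank-`rank` matrices via truncated SVD.
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            X_low = (U[:, :rank] * s[:rank]) @ Vt[:rank]
            # Keep observed entries fixed; update only the occluded ones.
            X = np.where(mask, J, X_low)
        return X

    # Toy usage: 20 frames of 21 joints (63 coordinates), ~30% occluded.
    rng = np.random.default_rng(0)
    truth = rng.standard_normal((20, 3)) @ rng.standard_normal((3, 63))
    mask = rng.random(truth.shape) > 0.3
    recovered = complete_joint_matrix(np.where(mask, truth, 0.0), mask)

Under the low-rank assumption, occluded coordinates are recoverable because the few underlying motion components that generate all joint trajectories can be estimated from the visible entries alone.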

Phase II

Contract Number: 1738888
Start Date: 9/1/2017    Completed: 2/29/2020
Phase II year
2017
(last award dollars: 2020)
Phase II Amount
$1,091,828

The broader impact/commercial potential of this project stems from addressing the important hand-gesture-based input challenges of VR/AR (expected to grow to $150B by 2020), Robotics, and IoT ($135B by 2019 and $1.7T by 2020, respectively). This technology, if successful in mitigating the high technical risks, represents a major leap in the state of the art in 3D hand models for gesture recognition and has the potential to become the industry standard for AR, VR, Robotics, and IoT applications, with broad societal impact in education, medicine, and healthcare. Its broader impact is further amplified by its potential to serve the needs of the disabled community, improving their quality of life by being better able to communicate, learn, and adapt to their interaction needs.

This Small Business Technology Transfer (STTR) Phase II project aims to significantly advance current 3D hand gesture recognition technology by developing a dynamic hand-tracking model for gesture intent recognition that is robust against occlusion and tolerant of variations in camera orientation and position. This research will result in a transformative leap beyond the current state of academic and commercial hand models, overcoming key technical hurdles that have so far proven difficult to surmount. It solves the following key challenges and involves very high technical risks: 1) robust hand tracking while holding objects; 2) robust tangible interactions using objects without any fiducial markers; and 3) a low-profile hand wearable for touch interaction detection. This Phase II project will achieve these objectives through 1) data acquisition and hand-object pose estimation, 2) understanding user intents with enhanced tangible interactions, and 3) system validation and user testing.