SBIR-STTR Award

DC-ATR: Distributed Co-operative ATR Using Low Resolution Mobile Sensors
Award last edited on: 4/13/2009

Sponsored Program
STTR
Awarding Agency
DOD : Navy
Total Award Amount
$802,214
Award Phase
2
Solicitation Topic Code
N07-T024
Principal Investigator
Carol Cheung

Company Information

iRobot Corporation (AKA: IS Robotics Inc; IS Robotics Corporation)

8 Crosby Drive
Bedford, MA 01730
   (781) 430-3000
   info@irobot.com
   www.irobot.com

Research Institution

Georgia Institute of Technology

Phase I

Contract Number: N00014-07-M-0435
Start Date: 7/20/2007    Completed: 5/20/2008
Phase I year
2007
Phase I Amount
$69,997
We propose to leverage recent advances in computationally efficient, accurate object detection and recognition from EO imagery to develop a novel multi-sensor distributed ATR capability. The proliferation of UGVs and UAVs presents a unique opportunity for large-scale surveillance and monitoring. However, weight and power constraints require that a distributed ATR system be both extremely efficient, to avoid wasting power, and extremely effective at integrating information from multiple camera viewpoints, to overcome the limitations of inherently low-resolution sensors. There is a need for a cooperative ATR approach in which individual mobile sensor platforms extract potential targets and a subsequent fusion stage pools information across multiple camera viewpoints to produce an accurate, high-resolution ATR decision.

Keywords:
Unattended Sensors, Embedded Algorithms, Distributed Co-operative ATR, Sensor Correlation, Low-Resolution Low-Cost Sensors, Small Unmanned Vehicles, Swarm
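The two-stage idea in the Phase I abstract, per-platform target extraction followed by a fusion stage that pools evidence across viewpoints, can be sketched as follows. This is a minimal illustrative assumption, not the awarded system's design: each sensor reports per-candidate probabilities, and the fusion stage combines them by summing log-odds (a naive-Bayes pool under a uniform prior). The function names and data shapes are hypothetical.

```python
# Hypothetical sketch of cooperative ATR fusion: each low-resolution
# sensor reports {candidate_id: probability}, and a fusion stage pools
# the reports across camera viewpoints via log-odds summation.
import math

def fuse_detections(reports):
    """Pool per-sensor detection probabilities into one fused decision.

    reports: one dict per sensor, mapping candidate target id to that
             sensor's probability that the candidate is a real target.
    Returns: dict mapping target id to the fused probability.
    """
    log_odds = {}
    for sensor_report in reports:
        for target, p in sensor_report.items():
            p = min(max(p, 1e-6), 1 - 1e-6)  # clamp to avoid log(0)
            log_odds[target] = log_odds.get(target, 0.0) + math.log(p / (1 - p))
    # Convert summed log-odds back to probabilities with the sigmoid.
    return {t: 1.0 / (1.0 + math.exp(-lo)) for t, lo in log_odds.items()}

# Three individually uncertain viewpoints agree on one candidate:
fused = fuse_detections([
    {"vehicle_1": 0.6, "clutter_7": 0.4},
    {"vehicle_1": 0.7},
    {"vehicle_1": 0.65, "clutter_7": 0.3},
])
```

Pooling this way illustrates the abstract's point: no single low-resolution sensor is confident, but the fused confidence for the consistently detected candidate rises well above any individual report, while the clutter candidate is suppressed.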

Phase II

Contract Number: N00014-09-C-0101
Start Date: 10/31/2008    Completed: 4/30/2010
Phase II year
2009
Phase II Amount
$732,217
The iRobot team proposes to develop a deployable pedestrian detection module that analyzes a monocular video stream and identifies every visible pedestrian. To address the pedestrian detection problem, we will employ data-driven approaches that recognize the specific appearance and motion patterns generated by moving humans. We will also develop detection algorithms that leverage the decentralized use of multiple mobile sensor platforms to improve detection accuracy. To reduce false positives while maintaining an acceptably high detection rate, we will explore mechanisms for exploiting scene context, as well as methods to obtain useful context. Our primary Phase II technical objectives are to: develop a pedestrian detection algorithm that performs competitively with other published pedestrian detection algorithms; explore mechanisms for integrating vehicle telemetry and other contextual information with the detection results; evaluate the ability of our integrated detection solution to approach the performance of existing compute-intensive pedestrian detection systems; operate the pedestrian detection algorithm onboard a tactical UGV platform without significant sensor or hardware modifications; and evaluate the algorithm's performance onboard a tactical UGV platform under urban outdoor conditions involving groups of five or more targets.

Keywords:
UGV, Embedded Algorithms, Automatic Target Recognition, Distributed ATR, Pedestrian Detection, Packb
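The Phase II abstract's idea of exploiting scene context to cut false positives can be sketched with a simple height-consistency check. This is an illustrative stand-in, assuming a flat-ground model in which a pedestrian's apparent height grows linearly with image row below the horizon; the function names, the horizon row, and the scale factor are all hypothetical, not parameters of the awarded system.

```python
# Illustrative context filter (not the awarded system): discount a
# detector's raw score when the detection's bounding-box height
# disagrees with the pedestrian height expected at that image row,
# a simple proxy for telemetry-derived ground-plane context.

def expected_height(bottom_row, horizon_row=100.0, scale=0.8):
    """Assumed flat-ground model: apparent pedestrian height (pixels)
    grows linearly with distance below the horizon row."""
    return max(0.0, (bottom_row - horizon_row) * scale)

def context_score(raw_score, box_bottom_row, box_height):
    """Scale the raw detector score by a height-consistency term in [0, 1]."""
    exp_h = expected_height(box_bottom_row)
    if exp_h == 0.0:
        return 0.0  # box bottom at or above the horizon: implausible
    consistency = min(box_height, exp_h) / max(box_height, exp_h)
    return raw_score * consistency

# A detection near the expected size keeps most of its score,
# while an off-scale false alarm at the same row is suppressed:
good = context_score(0.9, box_bottom_row=300, box_height=150)
bad = context_score(0.9, box_bottom_row=300, box_height=20)
```

Even this crude geometric prior shows the mechanism the abstract describes: context never raises a score, it only suppresses detections that are implausible for the scene, trading a small hit on true positives for a large reduction in false alarms.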