SBIR-STTR Award

Real-time Overlay of Map Features onto a Video Feed
Award last edited on: 2/20/2015

Sponsored Program
SBIR
Awarding Agency
DOD : AF
Total Award Amount
$842,551
Award Phase
2
Solicitation Topic Code
AF093-206
Principal Investigator
W Andrew Scanlon

Company Information

ObjectVideo Inc (AKA: Diamondback Systems Inc, ObjectVideo, Diamondback Vision)

11600 Sunrise Valley Drive Suite 210
Reston, VA 20191
   (571) 327-3673
   info@objectvideo.com
   www.objectvideo.com
Location: Multiple
Congr. District: 11
County: Fairfax

Phase I

Contract Number: ----------
Start Date: ----    Completed: ----
Phase I year
2010
Phase I Amount
$99,936
This Small Business Innovation Research Phase I project will develop a system to overlay geo-referenced informational data on UAV video in real time. The proposed system will provide enhanced situational awareness and decision-making capabilities to remote users by accurately overlaying the information of their choice. The key innovations in this effort include i) feature matching and filtering techniques to reduce error in sensor data, ii) GPU processing for real-time performance with COTS hardware, iii) an API that facilitates future expansion of the system to arbitrary data types and standards such as KML and Shapefiles, iv) an intuitive and non-invasive user interface enabling users to select and customize the data sets to overlay, and v) a secondary map display that provides additional context by placing the geo-registered video onto a wider map view. The Phase I effort will include concept development, implementation of the proposed enabling technologies, a proof-of-concept demonstration, and analysis of operational feasibility.
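
To make the core overlay computation concrete, the following minimal Python sketch shows how a single geo-referenced point could be projected into the video frame from the platform's position and orientation metadata using a pinhole camera model. The function names, frame conventions, and numbers are illustrative assumptions, not details taken from the proposal.

import numpy as np

def enu_offset(sensor_llh, point_llh):
    """East-North-Up offset (metres) of a geo-point from the sensor,
    using a flat-earth approximation adequate for short slant ranges."""
    lat0, lon0, alt0 = sensor_llh
    lat, lon, alt = point_llh
    r = 6378137.0  # WGS-84 equatorial radius, metres
    east = np.radians(lon - lon0) * r * np.cos(np.radians(lat0))
    north = np.radians(lat - lat0) * r
    return np.array([east, north, alt - alt0])

def ned_to_body(yaw, pitch, roll):
    """Rotation taking North-East-Down vectors into the body frame
    (x forward along the boresight, y right, z down); angles in radians."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    r_yaw = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
    r_pitch = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    r_roll = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
    return r_roll @ r_pitch @ r_yaw

def project(point_enu, yaw, pitch, roll, focal_px, cx, cy):
    """Pixel coordinates of an ENU offset, or None if behind the camera.
    Camera frame: x right, y down, z forward along the body x axis."""
    ned = np.array([point_enu[1], point_enu[0], -point_enu[2]])  # ENU -> NED
    body = ned_to_body(yaw, pitch, roll) @ ned
    cam = np.array([body[1], body[2], body[0]])  # body axes -> camera axes
    if cam[2] <= 1e-6:
        return None
    return cx + focal_px * cam[0] / cam[2], cy + focal_px * cam[1] / cam[2]

# Example: a point 200 m north of the sensor, seen from 1,000 m altitude with
# the sensor pitched straight down; it lands near (640, 160) in a 1280x720 frame.
offset = enu_offset((38.9500, -77.3500, 1000.0), (38.9518, -77.3500, 0.0))
print(project(offset, yaw=0.0, pitch=np.radians(-90), roll=0.0,
              focal_px=1000.0, cx=640.0, cy=360.0))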

Benefit:

Unmanned Aerial Vehicles (UAVs) are a critical component of Intelligence, Surveillance and Reconnaissance (ISR) capabilities for all branches of the armed forces, but the limited viewing angle and resolution of typical UAV video can constrain the ability of users on the ground to act. The proposed system would increase the situational awareness of users by providing important contextual and targeted information, overlaid onto the video in a clear and accurate manner, in real time.

The enabling technologies proposed have the following benefits, within this project and in future efforts:

  • Sensor data error mitigation: The project will develop a technique that filters sensor position and orientation metadata and matches the video to reference imagery to reduce error. This will improve the accuracy of the generated overlay and also yield more accurate data about the sensor and platform, which has many other applications (a minimal filtering sketch follows this list).
  • Real-time GPU processing with COTS hardware: The project will use common GPU hardware to improve overlay quality and performance while freeing CPU resources and keeping size and cost low.
  • Software Development Kit (SDK) for Data Source Ingestion: The system will use a plug-in architecture to enable importing new data sources without modifying the system. An easy-to-use SDK will aid the process of creating new plug-ins and allow users to import arbitrary sources in the future (see the plug-in interface sketch after this list).
  • Intuitive User Interface: The system will incorporate interface design elements from several well-known applications, such as Google Earth, Google Maps, NASA WorldWind, and TiVo, to reduce the end-user's learning curve by providing controls they already know. Additionally, interaction with overlay elements will use tool tips and text balloons to display additional textual detail that would otherwise clutter the video and be difficult to read on moving imagery.
  • Secondary Map Display: An optional map interface will be available to allow users to see additional contextual information, choose information for overlay, and view the geo-registered UAV video in a wider context on the map.
  • Integrated Video and Data Management System (VDMS): ObjectVideo’s VDMS, created to meet the needs of the ARGUS-IS project, will be leveraged for this effort. The system includes integrated management and storage of metadata alongside video, as well as a rich player interface with metadata support, TiVo-style review controls, and measurement tools for geo-registered video. The VDMS will be expanded to support the proposed overlay information, including controlled airspace symbology, restricted airspace boundaries, FAA centers, air corridors with altitude limits, and SAR and IR data.
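
The sensor data error mitigation bullet above describes filtering the position and orientation metadata and matching the video against reference imagery. Below is a minimal sketch of that idea: noisy per-frame pose metadata is smoothed over time, and a position fix obtained from image-to-reference registration (the matching itself is not shown) nudges the estimate back toward truth. The class, field names, and weights are illustrative assumptions; a production system would more likely use a Kalman-style filter.

from dataclasses import dataclass

@dataclass
class SensorPose:
    lat: float    # degrees
    lon: float    # degrees
    alt: float    # metres
    yaw: float    # radians
    pitch: float  # radians
    roll: float   # radians

class PoseSmoother:
    """Exponentially smooths pose metadata and blends in registration fixes.
    Angle wrap-around at +/-pi is ignored here for brevity."""

    FIELDS = ("lat", "lon", "alt", "yaw", "pitch", "roll")

    def __init__(self, alpha=0.3, fix_weight=0.5):
        self.alpha = alpha            # weight given to each new measurement
        self.fix_weight = fix_weight  # trust placed in an image-based fix
        self.state = None

    def update(self, measured):
        """Fold one frame's noisy metadata into the running estimate."""
        if self.state is None:
            self.state = measured
            return self.state
        a = self.alpha
        blended = [(1 - a) * getattr(self.state, f) + a * getattr(measured, f)
                   for f in self.FIELDS]
        self.state = SensorPose(*blended)
        return self.state

    def apply_registration_fix(self, lat_fix, lon_fix):
        """Blend in a position correction derived from matching the video
        frame against geo-referenced reference imagery (matching omitted)."""
        w = self.fix_weight
        self.state.lat = (1 - w) * self.state.lat + w * lat_fix
        self.state.lon = (1 - w) * self.state.lon + w * lon_fix

# Example: feed two noisy measurements, then apply an image-based fix.
smoother = PoseSmoother()
smoother.update(SensorPose(38.9500, -77.3500, 1000.0, 0.00, -1.57, 0.0))
smoother.update(SensorPose(38.9502, -77.3498, 1004.0, 0.02, -1.55, 0.0))
smoother.apply_registration_fix(38.9501, -77.3499)
print(smoother.state)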
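
The SDK bullet above describes a plug-in architecture for ingesting new data sources. One possible shape for such a plug-in interface is sketched below; the class and method names are assumed for illustration and do not reflect the actual ObjectVideo SDK.

from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class OverlayFeature:
    """One geo-referenced item to draw: a labelled point, or the vertices of
    a polygon such as a restricted-airspace boundary."""
    name: str
    geometry: list                       # [(lat, lon), ...]
    attributes: dict = field(default_factory=dict)

class OverlayDataSource(ABC):
    """Base class each ingestion plug-in implements. The host application
    discovers plug-ins, asks which formats they handle, and requests the
    features that fall inside the sensor's current footprint."""

    @abstractmethod
    def formats(self):
        """File extensions this plug-in can read (e.g. '.kml', '.shp')."""

    @abstractmethod
    def load(self, path):
        """Parse the source file and index its features."""

    @abstractmethod
    def query(self, lat_min, lon_min, lat_max, lon_max):
        """Return the OverlayFeatures intersecting a bounding box."""

class PlaceNamePlugin(OverlayDataSource):
    """Toy plug-in that reads 'name,lat,lon' lines from a CSV file."""

    def __init__(self):
        self._features = []

    def formats(self):
        return [".csv"]

    def load(self, path):
        with open(path) as fh:
            for line in fh:
                name, lat, lon = line.strip().split(",")
                self._features.append(
                    OverlayFeature(name, [(float(lat), float(lon))]))

    def query(self, lat_min, lon_min, lat_max, lon_max):
        return [f for f in self._features
                if lat_min <= f.geometry[0][0] <= lat_max
                and lon_min <= f.geometry[0][1] <= lon_max]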


Keywords:
Intelligence, Surveillance and Reconnaissance (ISR), Real-Time Video Overlay, MPEG-4, UAV, UGS

Phase II

Contract Number: ----------
Start Date: ----    Completed: ----
Phase II year
2013
Phase II Amount
$742,615
Current video feeds from airborne sensors such as Predator, ARGUS-IS, ARGUS-IR, Gorgon Stare, and others excel at providing high-resolution imagery from a bird's-eye vantage point. While those pixels give the analyst an eye in the sky, today's systems rely on the analyst to interpret the scene without the benefit of additional context that can be obtained from readily available data types, such as terrain and elevation information, roadways, points of interest, controlled airspace symbology, restricted airspace boundaries, FAA centers, radar, and LIDAR data. Our goal under this proposed effort is to augment the user experience by adding geo-registered layers from these information sources. In Phase I of this project, ObjectVideo demonstrated prototypes of all the necessary components and core technologies of a viable real-time overlay system. For Phase II, ObjectVideo will focus on refining and extending the core technologies and integrating these components into an end-to-end overlay system. ObjectVideo will also implement novel features, including the use of external reference data for improved overlay accuracy, optional rendering of occluded overlay elements, and an expanded context view providing greater situational awareness of the regions surrounding the UAV's field of view. The system will provide users with critical single-click, single-glance information on any client capable of displaying video.
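
One Phase II feature mentioned above is optional rendering of occluded overlay elements. A simple way to decide whether a geo-point is hidden by terrain is to march along the sensor-to-point sight line and compare each sample's height against a digital elevation model. The sketch below assumes a caller-supplied elevation function and a local East-North-Up frame; it illustrates the general idea only and implies nothing about ObjectVideo's actual terrain handling.

import numpy as np

def is_occluded(sensor_enu, point_enu, elevation_at, step_m=30.0):
    """True if terrain rises above the sensor-to-point sight line.

    sensor_enu, point_enu : (east, north, up) metres in a shared local frame
    elevation_at          : callable (east, north) -> terrain height (metres)
    """
    sensor = np.asarray(sensor_enu, dtype=float)
    point = np.asarray(point_enu, dtype=float)
    span = np.linalg.norm(point - sensor)
    n_steps = max(int(span / step_m), 1)
    for i in range(1, n_steps):          # skip the endpoints themselves
        sample = sensor + (point - sensor) * (i / n_steps)
        if elevation_at(sample[0], sample[1]) > sample[2]:
            return True                  # terrain blocks the sight line
    return False

# Example with a synthetic 80 m ridge midway along the path: a sensor only
# 60 m up cannot see past the ridge, so the point is reported as occluded.
ridge = lambda e, n: 80.0 if 400 <= n <= 600 else 0.0
print(is_occluded((0, 0, 60), (0, 1000, 0), ridge))   # True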

Benefit:
Unmanned Aerial Vehicles (UAVs) are a critical component of Intelligence, Surveillance and Reconnaissance (ISR) capabilities for all branches of the armed forces, but the limited viewing angle and resolution of typical UAV video can constrain the ability of users on the ground to act. The proposed system would increase the situational awareness of users by providing important contextual and targeted information, overlaid onto the video in a clear and accurate manner, in real time.

The enabling technologies proposed have the following benefits, within this project and in future efforts:

  • Performance and Scalability: The system takes full advantage of COTS GPU hardware to render multiple overlay streams in real time, and the system architecture allows scaling in several dimensions.
  • Accuracy: Advanced computer-vision techniques correct for common errors in sensor metadata and yield far more accurate overlay results than a naïve approach.
  • Low Cost: The system runs on COTS PC and GPU hardware.
  • Flexibility: The system can be configured as a scalable client-server architecture or deployed on a single laptop for stand-alone use. The standards-based client-server architecture allows interoperability with a wide variety of client platforms, including mobile devices.
  • Extensible via SDK: Overlay and VDMS SDKs allow developers to add data types, create custom overlays, or incorporate the system into new applications.
  • Compatibility: The standards-based approach allows interoperability with a wide variety of data and applications.
  • Intuitive User Interface: The interface is easily used by novice users.

The completed system will be an inexpensive, high-performance overlay system that provides valuable situational awareness to consumers of UAV video feeds. Potential commercial and military applications include:

  • Augmented reality
  • Airborne video exploitation
  • Soldier helmet-mounted, smart-phone, and vehicle-mounted cameras

Keywords:
Real-time video overlay, GPU, UAV, GIS overlay, metadata correction, MPEG-4, ISR