SBIR-STTR Award

Situational Awareness in Autonomous Agriculture
Award last edited on: 2/25/2019

Sponsored Program
SBIR
Awarding Agency
NSF
Total Award Amount
$225,000
Award Phase
1
Solicitation Topic Code
EW
Principal Investigator
Jianfei Chen

Company Information

Aware Vehicles Inc

1224 West 62nd Street Suite 2
Kansas City, MO 64113
(816) 844-9649
info@awarevehicles.com
www.awarevehicles.com
Location: Single
Congr. District: 05
County: Jackson

Phase I

Contract Number: 1818982
Start Date: 6/15/2018    Completed: 2/28/2019
Phase I year
2018
Phase I Amount
$225,000
The broader impact/commercial potential of this project is to enable real-time situational awareness in autonomous vehicles. With the global population expected to reach 9 billion by 2050, and with climate uncertainty raising concern over the resources allocated to farming, precision agriculture has become a key means of increasing agricultural productivity and efficiency. The proposed system, which integrates aerial and terrain robotics to provide high-throughput crop imaging, will push precision agriculture to its next evolutionary stage: fully autonomous agriculture. Through the R&D efforts in this project, the integrated aerial-terrain robotics and high-throughput imaging system will be ready for commercialization under the proposed sustainable business model and will impact farming industries globally. Moreover, the system prototype, enabled by smart and mobile docking, can be readily adapted to other industries where geospatially large-scale sensing and analytics are in demand, such as transportation network monitoring, civil infrastructure and urban monitoring, logistics and freight management, and monitoring of environmental hazards. The proposed invention and its future robotic products are expected to impact all these sectors by automating high-dimensional data collection and real-time analytics. This Small Business Innovation Research Phase I project provides a technology leap that furnishes state-of-the-art terrain robotics with long-range, real-time situational awareness, including pre-operation reconnaissance and post-operation evaluation. A smart docking platform will be developed that provides a continuous energy supply, serves as an ad-hoc computing engine, and enables high-throughput imaging of single plants or plant groups. No such systematically coupled docking-imaging-computing platform exists to date.
The docking will be realized through three independent mechanisms: kinetic sensing and calculation, low-cost stereo vision, and radar ranging. Real-time positioning algorithms based on these three mechanisms will be developed through a fail-safe data fusion process. The aerial imaging drone, charged by the docking platform, can perform two imaging modes: conventional terrain/field mapping or the novel 4-dimensional (4D) reconstruction proposed in this project. The reconstruction algorithms will be developed by fusing the stereo data with the hyperspectral data, producing first-of-its-kind 4D spatial-spectral models for high-throughput phenotyping. This award reflects NSF's statutory mission and has been deemed worthy of support through evaluation using the Foundation's intellectual merit and broader impacts review criteria.
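The abstract does not specify the fail-safe fusion algorithm. As a purely hypothetical sketch, one common pattern for fusing three independent positioning estimates (here standing in for the kinetic, stereo-vision, and radar mechanisms) is inverse-variance weighting with an outlier-rejection step; all function names, thresholds, and sensor variances below are illustrative assumptions, not part of the proposal.

```python
import statistics

def fuse_estimates(readings, variances, outlier_threshold=1.0):
    """Fuse independent position estimates by inverse-variance weighting.

    Fail-safe step: any reading deviating from the median by more than
    `outlier_threshold` (e.g. a failed sensor) is discarded before fusion.
    `readings` and `variances` are parallel lists, one entry per sensor.
    """
    med = statistics.median(readings)
    kept = [(r, v) for r, v in zip(readings, variances)
            if abs(r - med) <= outlier_threshold]
    if not kept:  # all sensors disagree wildly: fall back to the median
        return med
    weights = [1.0 / v for _, v in kept]
    return sum(r * w for (r, _), w in zip(kept, weights)) / sum(weights)

# Illustration: a faulty radar reading of 5.0 m is rejected, and the two
# remaining estimates are averaged with weights favoring the lower variance.
position = fuse_estimates([2.0, 2.1, 5.0], [0.04, 0.09, 0.25])
```

Inverse-variance weighting is the minimum-variance combination for independent Gaussian estimates; the median-based rejection keeps a single failed mechanism from corrupting the fused position.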

Phase II

Contract Number: ----------
Start Date: 00/00/00    Completed: 00/00/00
Phase II year
----
Phase II Amount
----