
Awards Registry

Bounding generalization risk for Deep Neural Networks
Profile last edited on: 3/11/2021

Program
STTR
Agency
NGA
Total Award Amount
$99,475
Award Phase
1
Principal Investigator
Gabriel Perdue
Activity Indicator

Location Information

Euler Scientific

2014 Calle Buena Ventura
Oceanside, CA 92056
   (310) 869-1967
   N/A
   euler-sci.com
Multiple Locations:   
Congressional District:   49
County:   San Diego

Phase I

Phase I year
2020
Phase I Amount
$99,475
Deep Neural Networks have become ubiquitous in the modern analysis of voluminous datasets with geometric symmetries. In the field of Particle Physics, experiments such as DUNE require the detection of particle signatures interacting within the detector, with analyses of over a billion 3D event images per channel each year and typical setups containing over 150,000 channels. In an analogously data-intensive field, satellites continually produce datasets requiring the detection of millions of objects per 1,000 sq km over the full surface of the Earth. Understanding the uncertainty induced by the underlying machine learning algorithm is important to such analyses. This error has not been incorporated into analyses in a fundamental way; it is currently addressed only through sophisticated and costly empirical studies. We will develop theoretical bounds on this error utilizing Fourier analysis (Xu, Zhang, Luo, Xiao, & Ma, 2019) and will build upon the a priori generalization bound established for shallow networks (Xu, Zhang, Zhang, & Zhao, 2019) by considering deep Rectified Linear Unit (ReLU) neural networks of minimal width and disparate test and train domains. We will then work on extending our bounds to simple Deep Convolutional Neural Networks, to simple empirical studies on disparate test and train domains, and to empirical studies for object detection.
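The Fourier-analysis approach cited above (Xu et al., 2019) rests on the empirically observed "F-Principle": networks trained by gradient descent fit the low-frequency content of a target before its high-frequency content. The following is a minimal sketch of that phenomenon only, not the proposed bounds; the toy target, network size, learning rate, and NumPy-only training loop are all illustrative assumptions.

```python
import numpy as np

# Toy F-Principle demonstration: a one-hidden-layer ReLU network,
# trained by full-batch gradient descent on a two-frequency target,
# reduces its error at the low frequency (k=1) faster than at the
# high frequency (k=5).  All hyperparameters are illustrative.
rng = np.random.default_rng(0)
n, hidden = 200, 64
x = np.linspace(-np.pi, np.pi, n).reshape(-1, 1)
y = np.sin(x) + np.sin(5 * x)          # target: frequencies k=1 and k=5

W1 = rng.normal(size=(1, hidden))
b1 = rng.normal(size=hidden)           # spread ReLU kinks across the domain
W2 = rng.normal(size=(hidden, 1)) / np.sqrt(hidden)
b2 = np.zeros(1)

def sine_coeff(f, k):
    """Discrete estimate of the Fourier sine coefficient b_k on [-pi, pi]."""
    return 2.0 * np.mean(f.ravel() * np.sin(k * x.ravel()))

lr = 0.01
for _ in range(5000):
    z = x @ W1 + b1
    h = np.maximum(z, 0.0)             # ReLU activation
    pred = h @ W2 + b2
    g = 2.0 * (pred - y) / n           # d(MSE)/d(pred)
    gW2 = h.T @ g
    gb2 = g.sum(axis=0)
    gh = (g @ W2.T) * (z > 0)          # backprop through the ReLU
    gW1 = x.T @ gh
    gb1 = gh.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Remaining error in each Fourier mode of the fitted function.
err1 = abs(sine_coeff(pred, 1) - sine_coeff(y, 1))
err5 = abs(sine_coeff(pred, 5) - sine_coeff(y, 5))
print(f"error at k=1: {err1:.3f}, error at k=5: {err5:.3f}")
```

After training, the residual at k=1 is smaller than at k=5, consistent with the spectral bias that the proposed bounds would formalize for deep ReLU networks of minimal width.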

Phase II

Phase II year
---
Phase II Amount
---