SBIR-STTR Award

Automated Factor-Based Sensitivity Analysis
Award last edited on: 9/8/22

Sponsored Program
SBIR
Awarding Agency
DOD : MDA
Total Award Amount
$1,664,681
Award Phase
2
Solicitation Topic Code
MDA20-001
Principal Investigator
Bret Kragel

Company Information

Numerica Corporation (AKA: Numerica Inc)

5042 Technology Parkway Suite 100
Fort Collins, CO 80528
   (970) 461-2000
   info@numerica.us
   www.numerica.us
Location: Multiple
Congr. District: 02
County: Larimer

Phase I

Contract Number: HQ0860-21-C-7028
Start Date: 12/28/20    Completed: 6/30/21
Phase I year
2021
Phase I Amount
$154,918
Numerica proposes a novel approach to automated factor-based sensitivity analysis in high-dimensional search spaces, leveraging recent advances in physics and machine learning that utilize low-rank tensor models and adaptive sampling techniques to overcome the curse of dimensionality for many problems. Based on this technology, we propose an automated software analysis tool that exercises the algorithm under test (e.g., a mission-critical BMDS algorithm such as AEGIS FOM) by searching the potentially huge space of input parameters (model/algorithm parameters, scenario parameters, Monte Carlo realizations, code modifications, etc.) to analyze a specified set of quantities of interest (e.g., to identify optimal performance configurations, sensitivities to input data, performance boundaries, behavioral trends, potential bugs, etc.). To allow deeper insight into the internal details of the algorithm under test, we also propose to incorporate outputs of code execution analysis tools, such as code coverage and profiling tools, in addition to the algorithm's native outputs. Approved for Public Release | 20-MDA-10643 (3 Dec 20)

Phase II

Contract Number: HQ0860-22-C-7103
Start Date: 3/3/22    Completed: 3/1/24
Phase II year
2022
Phase II Amount
$1,509,763
Numerica proposes a novel approach to automated factor-based sensitivity analysis in high-dimensional search spaces that leverages recent advances in physics and machine learning, utilizing low-rank tensor models and adaptive sampling techniques to overcome the curse of dimensionality for many problems. Based on this technology, we propose to develop an Automated Sensitivity Analysis Product that exercises the algorithm under test (e.g., a mission-critical MDS algorithm such as AEGIS FOM) by searching the potentially huge space of input parameters (model/algorithm parameters, scenario parameters, Monte Carlo realizations, code modifications, etc.) to analyze a specified set of quantities of interest (e.g., to identify optimal performance configurations, sensitivities to input data, performance boundaries, behavioral trends, potential bugs, etc.). To allow deeper insight into the internal details of the algorithm under test, we also propose to incorporate outputs of code execution analysis tools, such as code coverage and profiling tools, in addition to the algorithm's native outputs. Approved for Public Release | 22-MDA-11102 (22 Mar 22)
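The abstracts do not disclose the internals of Numerica's method, but the general idea of factor-based sensitivity analysis can be illustrated with a standard variance-based (Sobol) approach: sample the input space, exercise a black-box model, and estimate how much of the output variance each input factor explains. The sketch below is a minimal, hypothetical example using a toy model and a simple pick-freeze estimator; it is not Numerica's algorithm, and all function and variable names are illustrative only.

```python
import random

def model(x):
    # Hypothetical black-box "algorithm under test": output depends
    # strongly on x[0], weakly on x[1], and not at all on x[2].
    return 4.0 * x[0] + 0.2 * x[1] + 0.0 * x[2]

def sobol_first_order(model, dim, n=20000, seed=1):
    """Estimate first-order Sobol indices with a pick-freeze estimator."""
    rng = random.Random(seed)
    # Two independent sample matrices over the unit hypercube.
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    fA = [model(a) for a in A]
    mean = sum(fA) / n
    var = sum((y - mean) ** 2 for y in fA) / n
    indices = []
    for i in range(dim):
        # Evaluate on B with column i "frozen" to the value from A.
        fABi = [model(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        cov = sum(ya * yab for ya, yab in zip(fA, fABi)) / n - mean ** 2
        indices.append(cov / var)  # fraction of variance explained by factor i
    return indices

S = sobol_first_order(model, dim=3)
# S[0] should dominate, S[1] should be small, S[2] near zero.
```

In a tool like the one proposed, the toy `model` would be replaced by a wrapper that runs the actual algorithm under test over its parameter space and returns the quantities of interest; low-rank tensor models and adaptive sampling would then reduce the number of such (expensive) evaluations needed in high dimensions.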