SBIR-STTR Award

Automatic Feature Evaluator (AFE)
Award last edited on: 4/11/2014

Sponsored Program
SBIR
Awarding Agency
DOD : Navy
Total Award Amount
$2,012,191
Award Phase
2
Solicitation Topic Code
N02-111
Principal Investigator
Clifford V Comisky

Company Information

Kab Laboratories Inc

1110 Rosecrans Street Suite 203
San Diego, CA 92106
   (619) 523-1763
   info@kablab.com
   www.kablab.com
Location: Multiple
Congr. District: 52
County: San Diego

Phase I

Contract Number: N68786-02-C-7022
Start Date: 8/15/2002    Completed: 2/15/2003
Phase I year
2002
Phase I Amount
$68,757
This proposal attacks the Navy clustering problem by first dividing the reported features into two classes: primary features (those intended to be useful) and secondary features (those that are unintentionally useful). Subject matter experts will then explain how they have used the secondary features to form initial clusters of primary features. An expert system based upon these human experts will be developed and iteratively combined with statistics such as a modified Bayesian information criterion to estimate the number of clusters, eigenvectors to estimate the number of dimensions, and a modified F-ratio to estimate the strength of each feature. The result will be an estimate of which features to use, and how to use them, for the new class, and will provide an initial set of clusters. This development would have very broad applicability to commercial systems that must operate in real time on inputs that vary in type and quality.
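The award abstract does not publish the proposal's actual algorithms, but the three statistics it names have standard counterparts. The sketch below is an illustrative approximation, assuming scikit-learn: BIC over candidate cluster counts (here 2 through 6), covariance eigenvalues to estimate the effective dimensionality, and an F-ratio to rank feature strength. The synthetic data, the 95% variance cutoff, and the candidate range are all assumptions, not details from the proposal.

```python
import numpy as np
from sklearn.datasets import make_blobs
from sklearn.mixture import GaussianMixture
from sklearn.feature_selection import f_classif

# Synthetic "reports": samples drawn from 3 clusters in a 5-feature space
X, _ = make_blobs(n_samples=300, centers=3, n_features=5, random_state=0)

# 1. Bayesian information criterion to estimate the number of clusters:
#    fit a Gaussian mixture for each candidate count and keep the minimum
bics = {k: GaussianMixture(n_components=k, random_state=0).fit(X).bic(X)
        for k in range(2, 7)}
n_clusters = min(bics, key=bics.get)

# 2. Eigenvalues of the data covariance matrix to estimate the effective
#    number of dimensions (components explaining 95% of the variance)
eigvals = np.sort(np.linalg.eigvalsh(np.cov(X.T)))[::-1]
explained = np.cumsum(eigvals) / eigvals.sum()
n_dims = int(np.searchsorted(explained, 0.95) + 1)

# 3. F-ratio (between- vs. within-cluster variance) per feature, using the
#    cluster assignments, as a proxy for each feature's strength
labels = GaussianMixture(n_components=n_clusters, random_state=0).fit_predict(X)
f_scores, _ = f_classif(X, labels)

print(n_clusters, n_dims, f_scores.round(1))
```

In this scheme a large F-score marks a feature whose values separate the clusters well, which is one plausible reading of the proposal's "strength of each feature."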

Phase II

Contract Number: N00039-03-C-7063
Start Date: 4/27/2011    Completed: 10/27/2012
Phase II year
2003
(last award dollars: 2011)
Phase II Amount
$1,943,434

The Automatic Feature Evaluator (AFE) Phase II program will develop a demonstration capability to show how data with differing, missing, and corrupted attributes can be assembled into a decision-making process. The effort will address three areas of concern: clustering, the formation of initial groupings (clusters) of measurements, each representing an object; classification, the subsequent evolution of the set of clusters as new reports come in and are assigned to clusters; and maintenance, the routine and non-routine analysis of the cluster space to detect and correct problems. Although the algorithms are in general statistical in nature, they do not assume any particular distribution of the reported elements, and they can deal with measurements that are non-ideal in other ways as well: elements that are discrete or even non-numeric; reports that contain missing data, outliers, or gross errors; and multi-modal distributions whose underlying form changes over time. Some of these issues are addressed on the basis of knowledge of the reports and their content, but most are addressed in general terms.

Benefit
This technology solves the problem of deriving grouping and classification solutions from very different data inputs.

Keywords
Bayes Information Criterion, Clustering, Classification, selection set rule
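The abstract does not give the classification step's algorithm. As a minimal sketch of the kind of distribution-free assignment it describes, the hypothetical `assign` function below places a new report into the nearest cluster using only its observed attributes (NaN marks missing data) and rejects gross outliers rather than forcing an assignment; the cluster summaries and the 3-sigma threshold are invented for illustration.

```python
import numpy as np

# Hypothetical cluster summaries: per-cluster attribute means and
# per-attribute standard deviations (maintained as reports accumulate)
means = np.array([[0.0, 0.0, 0.0],
                  [5.0, 5.0, 5.0]])
stds = np.ones_like(means)

def assign(report, threshold=3.0):
    """Assign a report to the nearest cluster, or return None if it is
    an outlier for every cluster. np.nan marks a missing attribute."""
    obs = ~np.isnan(report)                     # attributes actually present
    z = (report[obs] - means[:, obs]) / stds[:, obs]
    d = np.sqrt((z ** 2).mean(axis=1))          # RMS normalized distance
    best = int(np.argmin(d))
    return best if d[best] <= threshold else None

print(assign(np.array([0.2, np.nan, -0.1])))    # near cluster 0, one value missing
print(assign(np.array([100.0, 100.0, 100.0])))  # gross error: rejected
```

Averaging the normalized distance over only the observed attributes keeps reports with missing fields comparable to complete ones, and the threshold routes gross errors to the maintenance stage instead of corrupting a cluster.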