SBIR-STTR Award

Providing Tools for Richer eLearning Assessment
Award last edited on: 1/17/2014

Sponsored Program
SBIR
Awarding Agency
NSF
Total Award Amount
$600,000
Award Phase
2
Solicitation Topic Code
-----

Principal Investigator
Linda J Chaput

Company Information

Agile Mind (AKA: Agile Mind Educational Holdings Inc)

4101 William D. Tate Avenue Suite 220
Grapevine, TX 76051
   (817) 329-2223
   N/A
   www.thinkfive.com, www.agilemind.com
Location: Multiple
Congr. District: 24
County: Tarrant

Phase I

Contract Number: ----------
Start Date: ----    Completed: ----
Phase I year
2004
Phase I Amount
$100,000
This Small Business Innovation Research (SBIR) Phase I project will study the feasibility of creating test construction tools that allow school educators to conveniently produce and deliver tests, ranging from informal assessments of mastery that can be given and taken on the fly to tests that benchmark the progress of instruction against goals. The key innovations are (1) the capability to define answer analyses for stored question items, so that the test constructor knows in advance what the test can report about what test-takers likely know and do not know, and (2) the capability to represent question items in a form in which actual experience can be used to improve the assessment corpus.

The objective is to move beyond the current broadly accepted applications, which consist entirely of multiple-choice questions, to tests that include varied and even game-like question types incorporating motivational and pedagogically effective feedback; that is, question types that teach while they assess. These question types, such as drag-and-drop, matching, fill-in-the-blank sentences, and table builders, may have multiple correct answers, lack broadly agreed-upon wrong-answer distractors, and typically require more experience to define what errors imply about the test-takers' knowledge and skills. The aim of the project is not to compete with high-stakes tests; rather, it is a first step in determining whether a strategy focused on improving "low-stakes" assessments has merit commercially as well as intellectually.

Multiple-choice and constructed-answer exams have long proven to be highly efficient tools for state and national high-stakes testing. A limitation of multiple-choice questions, however, is that many do not assess what students know but only what students demonstrate they know, and certain types of students typically perform better than others on multiple-choice tests. In a period of heightened accountability, the difficulty of designing fair test items that can withstand legal challenge has made multiple-choice, by consensus, the only efficient, reliable form of high-stakes assessment in states representing most of the school-age population. Because many students whose learning is measured almost solely by multiple-choice tests are not served well by them, a significant contribution to exploratory learning can be made by increasing what learners can experience: by making assessments more intrinsically interesting and by improving the kinds of formative feedback available to students, teachers, and administrators.
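
To make the first innovation concrete, the minimal sketch below shows one way an answer analysis might be attached to a stored question item so that what the item can report is known up front. All names and structures here (AnswerRule, QuestionItem, the diagnosis strings) are illustrative assumptions, not Agile Mind's actual design; Python is used only for exposition.

    from dataclasses import dataclass, field
    from typing import Optional

    @dataclass
    class AnswerRule:
        # One anticipated response pattern and the meaning assigned to it.
        matches: set            # responses covered by this rule
        credit: float           # 1.0 correct, 0.5 partially correct, 0.0 incorrect
        diagnosis: str          # what the response implies the test-taker knows

    @dataclass
    class QuestionItem:
        # A stored item whose answer analysis is defined in advance, so the
        # test constructor knows what the item can report before it is used.
        prompt: str
        item_type: str          # e.g. "drag-and-drop", "matching", "table builder"
        rules: list = field(default_factory=list)
        tallies: dict = field(default_factory=dict)  # response counts from real use

        def analyze(self, response: str) -> Optional[AnswerRule]:
            # Tally every response so actual experience can later be used
            # to refine the rules and improve the assessment corpus.
            self.tallies[response] = self.tallies.get(response, 0) + 1
            for rule in self.rules:
                if response in rule.matches:
                    return rule
            return None  # unanticipated response: a candidate for a new rule

The point of the sketch is the second innovation as much as the first: an unanticipated response returns no rule rather than a generic "wrong," and the tallies record what test-takers actually did, which is what would let field experience feed back into the item bank.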

Phase II

Contract Number: ----------
Start Date: ----    Completed: ----
Phase II year
2006
Phase II Amount
$500,000
This Small Business Innovation Research (SBIR) Phase II project will study effective models for carrying out assessments employing challenging, puzzle-like questions that incorporate distractor analyses in which meaning is assigned to complex responses. Such distractor analyses apply where the test-taker can give alternative correct, partially correct, or incorrect answers. Metadata and distractor analyses will be combined to provide in-depth reports on student test performance. This new rule-based approach to distractor analysis addresses a significant challenge: including engaging problems in assessments of student progress in quantitative courses such as Algebra and Geometry. The research will further develop the question authoring and test construction tools.
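
As a hedged illustration of what such a rule-based distractor analysis might look like for a quantitative item, the sketch below assigns meaning (credit plus a diagnostic tag) to alternative responses to a two-step equation and folds the results, via item metadata, into a per-skill report. The item, rules, skill tags, and diagnoses are invented for exposition and do not reflect the project's actual rule language.

    from collections import defaultdict

    # Rules for one hypothetical Algebra item: "Solve 2x + 6 = 10."  Each rule
    # pairs a predicate over the structured response with the meaning assigned
    # to it: full, partial, or no credit, plus a diagnosis.
    RULES = [
        (lambda r: r == {"x": 2}, (1.0, "solves the two-step equation correctly")),
        (lambda r: r == {"x": 4}, (0.5, "subtracts 6 but stops before dividing by 2")),
        (lambda r: r == {"x": 8}, (0.0, "adds 6 to both sides instead of subtracting")),
    ]

    # Item metadata: the skills each item is tagged with.
    ITEM_SKILLS = {"item-17": ["two-step equations", "inverse operations"]}

    def analyze(response):
        # Apply the first matching rule; unmatched responses are flagged so
        # empirical evidence can grow the rule set over time.
        for predicate, meaning in RULES:
            if predicate(response):
                return meaning
        return (0.0, "unclassified response (candidate for a new rule)")

    def report(results):
        # Combine item metadata with the distractor analyses to produce an
        # in-depth, per-skill view of student performance.
        by_skill = defaultdict(list)
        for item_id, response in results:
            credit, diagnosis = analyze(response)
            for skill in ITEM_SKILLS.get(item_id, []):
                by_skill[skill].append((credit, diagnosis))
        return dict(by_skill)

    # A response of x = 4 earns partial credit on both tagged skills.
    print(report([("item-17", {"x": 4})]))

Under these assumptions, the report distinguishes a student who stopped one step short from one who chose the wrong inverse operation, which is exactly the kind of meaning a single right/wrong multiple-choice score cannot carry.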

As a consequence of this work, educators using these new technologies will be able to move beyond online testing based solely on multiple-choice, single-answer questions, which are known to be unmotivating for many students. The goals are twofold: to provide varied, interesting, and even game-like learning interactions that incorporate motivational and pedagogically valuable feedback; and to do so in a form in which empirical evidence can be used to improve the assessment corpus, both the metadata and the rules used to define distractor analyses, especially where the items are novel question types.