Fusion - a multimodal interaction framework for task-oriented and object-oriented interfaces
by Scott Halstvedt for Natural User Interface Group
A multimodal interface offers the user a simultaneous combination of different interaction modes (e.g., keyboard, speech, touch, pen) to increase usability and interaction bandwidth. This project delivers three products: a framework for a modular multimodal interface (initially supporting touch and speech only), an engine that generates a speech grammar from a model of the application's objects and functions, and a demonstration application that shows the concept in action with CCA/Sphinx and a traditional mouse.
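
To illustrate the grammar-generation idea, here is a minimal sketch in Python. It assumes a deliberately simple application model (a list of command verbs and a list of object names, both hypothetical) and emits a JSGF grammar of the kind Sphinx can consume; the actual engine's model and output format may differ.

    # Minimal sketch: generate a JSGF speech grammar from an application model.
    # The model shape (verbs + objects) is an assumption for illustration only.

    def generate_jsgf(app_name, verbs, objects):
        """Emit a JSGF grammar pairing each command verb with each object."""
        verb_rule = " | ".join(verbs)
        object_rule = " | ".join(objects)
        return (
            f"#JSGF V1.0;\n"
            f"grammar {app_name};\n"
            f"<verb> = {verb_rule};\n"
            f"<object> = {object_rule};\n"
            f"public <command> = <verb> <object>;\n"
        )

    if __name__ == "__main__":
        # Hypothetical model: a photo viewer with three actions and two object types
        print(generate_jsgf("photoViewer",
                            ["open", "close", "rotate"],
                            ["photo", "album"]))

Running this prints a grammar accepting commands such as "open photo" or "rotate album", which a recognizer loading the grammar would then constrain its hypotheses to.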