Natural User Interface Group
Web Page: http://wiki.nuigroup.com/Google_Summer_of_Code_2009
Mailing List: http://nuigroup.com/forums
Natural User Interface Group, or NUI Group, is an interactive media group researching and creating open source machine sensing techniques to benefit artistic and educational applications. As a global community nearing 5000 members, we offer a unique collaborative environment for developers who are interested in learning and sharing emerging HCI (Human Computer Interaction) methods and concepts. These include topics such as voice/handwriting/gesture recognition, touch computing, computer vision, and information visualization. Our members consist of students, researchers, software engineers, and interaction and user interface designers, all working together towards a better understanding of HCI. Our current focus is "Open Source Interface", which is dedicated to accelerating the development of existing hardware and sensing solutions, allowing us to find the cheapest and most effective ways to construct our input devices. One very important aspect of our community is creating and using open standards that allow development and innovation to flourish. For example, we use the TUIO protocol, the standard for tangible object communication. Another crucial standard that must be created in an open environment is a gesture standard, which would allow fluid interaction across devices. 
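To make the TUIO idea concrete, here is a minimal sketch of how a client might maintain touch state from TUIO 2Dcur-style "alive"/"set" messages. This assumes the OSC layer has already decoded each message into a plain tuple; the function name and tuple layout are our own illustration, not part of any NUI Group library.

```python
# Illustrative sketch: tracking 2D cursors from already-decoded TUIO messages.
# Message tuples loosely mirror the TUIO "2Dcur" profile: ("alive", ids...)
# lists live session IDs, ("set", sid, x, y) adds or moves a touch point.

def update_cursors(cursors, messages):
    """Apply one bundle of decoded /tuio/2Dcur messages to a cursor dict.

    cursors:  {session_id: (x, y)} with normalized coordinates
    messages: list of tuples as described above
    """
    for msg in messages:
        if msg[0] == "set":
            _, sid, x, y = msg
            cursors[sid] = (x, y)          # add or move a touch point
        elif msg[0] == "alive":
            alive = set(msg[1:])
            for sid in list(cursors):      # drop touches no longer alive
                if sid not in alive:
                    del cursors[sid]
        # frame-sequence ("fseq") messages are ignored in this sketch
    return cursors

cursors = {}
update_cursors(cursors, [("alive", 1, 2), ("set", 1, 0.25, 0.5), ("set", 2, 0.75, 0.5)])
update_cursors(cursors, [("alive", 2), ("set", 2, 0.8, 0.55)])
```

Because TUIO sends a full "alive" list every frame, lost touches are detected by set difference rather than by explicit "remove" messages, which keeps clients robust against dropped UDP packets.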
We currently maintain a set of open source projects, including:
- T-Beta – cross-platform computer vision framework (GSoC 2008 project)
- Touchlib – the first open source library for multi-touch screen operation, working under Linux and Windows
- Touché – multi-touch application suite for Mac OS X
- BBTouch – multi-touch framework for Mac OS X
- PyMT – multi-touch library and set of applications written in Python
- OpenTouch – multi-touch library for Mac OS X (WinLibre GSoC 2007 Best Success)
- TouchAPI – a library which allows rapid prototyping of NUI client applications in Adobe Flash/Flex/AIR or Silverlight/WPF
- QMTSim – TUIO simulator written in C++ using the Qt library (GSoC 2008 project)
- TUIO Simulator – Java TUIO protocol simulator that lets you test your multi-touch applications without a hardware input device (written by the reacTIVision project team)
Beyond the projects above, we also look forward to working with other open source projects such as reacTIVision, libavg, opentable, and many more that are widely used and developed by NUI Group members. For example, we have recently started working on applications for the iPhone/iPod Touch using the iPhone SDK and hope to see innovative project proposals from the community. We believe that an open community is more powerful than money or technology.
Projects
- An OpenFrameworks Toolkit for Developing Multi-Touch Musical Interfaces A musical toolkit is an ideal candidate to test the viability of creating a platform based on OpenFrameworks for developing an accessible and extensible musical interface builder. The thrust of this project will focus on the research and development of an OF addon for users to build OpenGL-powered interfaces for musical (and media) applications. This project should demonstrate the feasibility of future interface solutions built on top of OpenFrameworks.
- Component-Based HCI UI Framework and High-Level HCI Library for Java I propose the development of a component-based UI framework for Java. This framework will allow Java developers to create interfaces for their Java applications that support a variety of advanced forms of HCI. Developers instantiate UI components and register component listeners to hook into their application code. The components of the framework will already know how to respond to a variety of HCI actions. The framework will simultaneously expose a component-based system for defining new gestures.
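The listener pattern this proposal describes can be sketched briefly. The class and method names below are hypothetical illustrations (in Python rather than the proposed Java, for consistency with the other sketches here): the component owns gesture recognition, and application code only registers callbacks.

```python
# Illustrative sketch (all names hypothetical): a UI component that already
# understands touch input and forwards high-level gestures to listeners,
# so application code never handles raw touch events itself.

class Button:
    def __init__(self):
        self._listeners = []

    def add_gesture_listener(self, callback):
        """Register application code; recognition stays inside the component."""
        self._listeners.append(callback)

    def on_gesture(self, gesture_name):
        # A real framework would classify raw touch input into gestures here;
        # this sketch simply dispatches an already-named gesture to listeners.
        for cb in self._listeners:
            cb(gesture_name)

events = []
btn = Button()
btn.add_gesture_listener(events.append)
btn.on_gesture("tap")
btn.on_gesture("long_press")
```

The point of the design is the separation: new gestures can be taught to the component layer once, and every application using the framework gains them without code changes.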
- Creating Models for Learning and Recognizing Multitouch Gestures This project is aimed at creating a language- and framework-independent gesture recognition toolkit that takes OSC messages formatted according to the TUIO specification as input and outputs recognized gestures via the OSC protocol. I will use the gesture recognition toolkit AMELiA to describe models specifically for the domain of multitouch gestures. This project will enable multitouch application developers to easily define a gesture and utilize it within their application, creating more engaging experiences.
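As a toy stand-in for the model-driven recognizer proposed above (not AMELiA itself), the simplest possible gesture classifier just compares a stroke's endpoints. The function name and labels below are our own illustration, using TUIO's convention that y grows downward.

```python
# Illustrative sketch: a trivial direction-based classifier, standing in
# for a real model-driven recognizer. Coordinates are normalized [0, 1],
# with y increasing downward as in TUIO.

def classify_swipe(points):
    """Classify a stroke (list of (x, y) points) as a cardinal swipe."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) >= abs(dy):
        return "swipe_right" if dx > 0 else "swipe_left"
    return "swipe_down" if dy > 0 else "swipe_up"

classify_swipe([(0.1, 0.5), (0.4, 0.52), (0.9, 0.5)])  # a left-to-right stroke
```

A real toolkit would instead fit statistical models over whole trajectories, but the input/output contract is the same: stroke points in, a named gesture out over OSC.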
- MPX X.Org TUIO Driver With the creation of Multi-Pointer X (MPX), it is now possible to build multi-pointer-native applications. To utilize these features with the currently existing TUIO trackers, I will write an xorg-input-tuio driver to receive TUIO events, analyze them, and pass them along as MPX blob events (XBlobEvent). I will also create both CLI and GUI applications to manage and configure the driver and connected TUIO trackers.
- Multitouch Extension Module for Google SketchUp An extension module that would allow Google SketchUp to receive TUIO and hence allow manipulation via multitouch gestures. Adding multitouch ability to an easy-to-use CAD/modeling application like Google SketchUp would not only make the design process more intuitive but also greatly improve usability. An additional bimanual gesture mode, allowing an IR pen and a hand to be used simultaneously, has also been proposed as an add-on.
- NUIPaint: Exploring and developing new multitouch interactions by creating new widgets for a concrete graphics application Digital photo editing and drawing are multimedia tasks that require many user interactions to achieve a given result. This proposal is for a multi-touch based photo manipulation and drawing application that would take advantage of the faster, fewer, and more intuitive inputs of a multi-touch surface, thus enhancing the user's productivity. The focus is to explore new interactions and develop UI widgets for the PyMT framework that would be reusable in other applications.
- Recognition, Tracking and Association of Hands, Fingers, and Blobs: A Tbeta Upgrade This proposal aims to provide a plugin for Tbeta that would allow recognition and tracking of hands, fingers, and blobs, and, most importantly, associate them to determine whether they belong to each other. This information would then be provided in the TUIO string, permitting application developers to make use of the new data in a wide range of applications. The project thus aims to augment the opportunities provided by the already impressive Tbeta.
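One simple way to picture the association step proposed above is nearest-centroid assignment: each detected finger point is attached to the closest hand blob. This is only an illustrative baseline under assumed normalized coordinates, not the plugin's actual algorithm.

```python
# Illustrative sketch: associate finger points with hand blobs by nearest
# hand centroid. Coordinates are assumed normalized; a real tracker would
# also use temporal continuity and blob shape, not just distance.
import math

def associate(fingers, hands):
    """Map each finger (x, y) to the index of the nearest hand centroid."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return {f: min(range(len(hands)), key=lambda i: dist(f, hands[i]))
            for f in fingers}

hands = [(0.2, 0.5), (0.8, 0.5)]                      # two hand centroids
fingers = [(0.15, 0.4), (0.25, 0.55), (0.85, 0.45)]   # three fingertips
groups = associate(fingers, hands)
```

Grouping fingers by hand is what enables the richer TUIO payload the proposal describes, e.g. distinguishing a two-handed pinch from two one-finger touches by different hands.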