Sunday, December 11, 2011

Gesture and Tactile Interfaces: Applications in Mobile Computing and American Sign Language


Google Tech Talk, June 16, 2010. Presented by Thad Starner.

Our explorations in gesture recognition and tactile interfaces for wearable computing have revealed some surprising applications for mobile computing and the Deaf community. Mobile Music Touch, a glove with embedded vibrators, allows users to learn to play piano melodies while performing other tasks, such as reading this abstract. The Textile Interface Swatchbook demonstrates seven functional GUI-like interface widgets rendered on fabric with embroidered conductive thread. BuzzWear uses electro- and vibro-tactile displays on a wristband to present incoming alerts and to provide feedback for gesture input using proximity sensors. The MAGIC toolkit addresses the false-triggering problem that limits the use of gestures to initiate interactions with motion-sensor-equipped mobile devices. CopyCat is an educational game that uses our computer-vision-based American Sign Language recognition system to help young deaf children acquire language skills. SmartSign teaches parents of deaf infants sign language through lessons delivered on a mobile phone. Finally, BrainSign attempts to recognize sign language gestures directly from brain signals in the motor cortex.

Thad Starner is an Associate Professor at Georgia Institute of Technology's School of Interactive Computing. Thad was perhaps the first to integrate a wearable computer into his everyday life as an intelligent personal assistant. Starner's work as a PhD student would ...
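To make the false-triggering problem concrete: everyday motion (walking, reaching) constantly excites a device's accelerometer, so a naive gesture trigger fires accidentally. One common mitigation, sketched below, is to require a deliberate, sustained high-energy burst before treating motion as intentional input. This is only an illustrative sketch, not the MAGIC toolkit's actual approach or API; the function name and thresholds are hypothetical.

```python
def detect_activation(samples, energy_threshold=2.5, min_burst_len=5):
    """Crude false-trigger guard for a motion-sensing device (hypothetical).

    `samples` is a sequence of accelerometer magnitudes in g. Return True
    only if there is a run of at least `min_burst_len` consecutive readings
    above `energy_threshold` -- a deliberate shake, which ordinary movement
    (typically ~1 g plus small fluctuations) rarely produces.
    """
    run = 0
    for mag in samples:
        if mag > energy_threshold:
            run += 1
            if run >= min_burst_len:
                return True
        else:
            run = 0  # burst must be uninterrupted
    return False

# A vigorous shake sustains high magnitudes and triggers; walking does not.
print(detect_activation([3.0, 3.1, 2.9, 3.2, 3.0, 2.8]))  # → True
print(detect_activation([1.1, 1.2, 0.9, 1.0, 1.1, 1.2]))  # → False
```

Systems like MAGIC go further by testing candidate gestures against recordings of everyday motion to estimate how often each would false-trigger; the threshold-and-dwell idea above is just the simplest version of that trade-off between responsiveness and accidental activation.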
















