The Future of Video Game Input: Muscle Sensors

A muscle-computer interface allows interaction with a computer without touching a keyboard, mouse or other input device. In tests, a "gesture recognizer" learned strumming and fretting movements so that a person could play air guitar. (Image credit: Saponas et al. / University of Washington, Microsoft, University of Toronto)

Motion control and multi-touch have become common in devices ranging from Nintendo's Wii to Apple's iPhone. But a muscle-sensing system could someday allow gamers to play air "Guitar Hero" without a controller, or help harried parents with full hands open car doors.

Electromyography (EMG) sensors can decode muscle signals from the skin's surface as a person performs certain gestures. Researchers attached such sensors to users' forearms and built a gesture recognition library by monitoring the muscle signals associated with each gesture. The project emerged as a collaboration among Microsoft, the University of Washington in Seattle and the University of Toronto in Canada.
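As a rough illustration of how such a pipeline might work (this is a sketch, not the researchers' published code), one could window the multichannel EMG stream, compute a simple amplitude feature per channel, and train an off-the-shelf classifier on labeled gesture examples. The feature choice, function names and classifier below are all illustrative assumptions.

```python
# Illustrative sketch only -- not the actual system. Assumes windowed,
# multichannel EMG recordings and uses a simple RMS amplitude feature
# fed to a support vector machine classifier.
import numpy as np
from sklearn.svm import SVC

def rms_features(window):
    """Root-mean-square amplitude per EMG channel.

    window: array of shape (n_samples, n_channels).
    Returns a feature vector of shape (n_channels,).
    """
    return np.sqrt(np.mean(window ** 2, axis=0))

def build_gesture_recognizer(labeled_windows):
    """Train a classifier from (window, gesture_label) pairs."""
    X = np.array([rms_features(w) for w, _ in labeled_windows])
    y = np.array([label for _, label in labeled_windows])
    clf = SVC(kernel="rbf")  # the real system's model may differ
    return clf.fit(X, y)

# Usage: recognizer.predict([rms_features(new_window)]) returns a
# gesture label such as "strum" or "fret" for a fresh EMG window.
```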

The possibilities seem nearly endless when the muscle-sensing system is combined with existing gadgets. Joggers could switch songs on an MP3 player with a few quick hand gestures without breaking stride, and people with full hands might simply squeeze whatever they're holding to pop the car trunk or unlock the doors.

The system could also add extra interactive possibilities for future motion control in video games, such as Microsoft's camera-based Project Natal.


Researchers have similarly used the muscle-sensing system to add user interface features to Microsoft Surface, a multi-touch tabletop device. A pressure-sensitive painting application let Surface users "draw" on the tabletop, with color saturation increasing the harder they pressed. Users could even "pick up" virtual photos with a pinching and lifting motion, then "throw" the pictures back onto the table.
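One plausible way a pressure-to-saturation mapping could be implemented (purely a sketch; the demo's internals weren't described in the article) is to normalize an estimated press force and scale the color's saturation channel by it. Every name and parameter here is hypothetical.

```python
# Hypothetical sketch of mapping estimated press pressure to color
# saturation; the actual Surface demo's mapping was not published.
import colorsys

def pressure_to_color(base_rgb, pressure, max_pressure=1.0):
    """Scale the saturation of base_rgb by normalized pressure.

    base_rgb: (r, g, b) floats in [0, 1].
    pressure: estimated press force; harder press -> richer color.
    """
    h, l, s = colorsys.rgb_to_hls(*base_rgb)
    norm = max(0.0, min(pressure / max_pressure, 1.0))  # clamp to [0, 1]
    return colorsys.hls_to_rgb(h, l, s * norm)
```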

EMG sensors have previously helped clinical researchers assess muscles during rehabilitation, and have also enabled prosthetic users to directly control their artificial limbs. But the new interface research promises to put the technology in the hands of far more consumers, if it proves flexible enough.

The University of Washington group will present a paper on its latest work at the Interactive Tabletops and Surfaces 2009 conference, held in Banff, Canada, in late November.

This article was provided by TopTenREVIEWS.

Jeremy Hsu
Jeremy has written for publications such as Popular Science, Scientific American Mind and Reader's Digest Asia. He obtained his master's degree in science journalism from New York University, and completed his undergraduate education in the history and sociology of science at the University of Pennsylvania.