Touchscreens may be popular both in science fiction and real life as the symbol of next-gen technology, but an innovation called Skinput suggests the true interface of the future might be us.
Microsoft and Carnegie Mellon University unveiled Skinput recently, showing how it can turn your own body into a touchscreen interface.
Skinput uses a series of sensors to track where a user taps on their arm. Previous attempts at projected interfaces relied on motion tracking to determine where a person tapped.
Skinput uses a different and novel technique: It "listens" to the vibrations in your body.
Tapping on different parts of your arm creates different kinds of vibrations depending on the amount and shape of bones, tendons and muscle in that specific area. Skinput sensors can track those vibrations using an armband and discern where the user tapped.
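The classification step described above can be sketched in code. This is a hypothetical illustration, not the researchers' actual system: it assumes each tap has already been reduced to a small feature vector (for example, vibration energy in a few frequency bands) and uses a simple nearest-centroid classifier to map new taps to trained arm locations.

```python
# Hypothetical sketch of Skinput-style tap classification.
# Assumes a tap's vibration signature is pre-processed into a feature
# vector; a nearest-centroid classifier then picks the closest trained
# arm location. Feature values and labels below are invented.
import math
from collections import defaultdict

def train(samples):
    """samples: list of (feature_vector, location_label) pairs.
    Returns a dict mapping each label to its mean feature vector."""
    sums = {}
    counts = defaultdict(int)
    for features, label in samples:
        if label not in sums:
            sums[label] = [0.0] * len(features)
        sums[label] = [s + f for s, f in zip(sums[label], features)]
        counts[label] += 1
    return {label: [s / counts[label] for s in vec]
            for label, vec in sums.items()}

def classify(centroids, features):
    """Return the label whose centroid is nearest (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(centroids[label], features))

# Toy training data: made-up band-energy features for three tap locations.
training = [
    ([0.9, 0.1, 0.2], "forearm"), ([0.8, 0.2, 0.1], "forearm"),
    ([0.2, 0.9, 0.3], "wrist"),   ([0.3, 0.8, 0.2], "wrist"),
    ([0.1, 0.2, 0.9], "palm"),    ([0.2, 0.3, 0.8], "palm"),
]
centroids = train(training)
print(classify(centroids, [0.85, 0.15, 0.15]))  # forearm
```

The "retraining" Harrison mentions later in the article corresponds to re-running a training step like this with fresh samples when accuracy drifts.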
"Accuracy is already good, in the high 90s percent accuracy for finger input," said project team member Chris Harrison, from Carnegie Mellon's Human-Computer Interaction Institute.
"The arm band is a crude prototype," Harrison said. "The next generation could be made considerably smaller – likely easily fitting into a wristwatch."
From there it's fairly simple to associate those tappable areas with different commands in an interface, just as different keystrokes and mouse clicks perform different functions on a computer.
When coupled with a small projector, Skinput can simulate a menu interface like the ones used in other kinds of electronics. Tapping on different areas of the arm and hand allows users to scroll through menus and select options.
Skinput could also be used without a visual interface. For instance, with an MP3 player one doesn't need a visual menu to stop, pause, play, advance to the next track or change the volume. Different areas on the arm and fingers simulate common commands for these tasks, and a user could tap them without even needing to look.
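The eyes-free MP3 scenario amounts to a simple dispatch table from classified tap locations to player commands. The mapping below is invented for illustration; the actual Skinput prototype's command layout is not specified in the article.

```python
# Hypothetical tap-location-to-command mapping for eyes-free MP3 control.
# Location names and command assignments are assumptions, not the
# researchers' actual layout.
COMMANDS = {
    "thumb": "play_pause",
    "index_finger": "next_track",
    "middle_finger": "previous_track",
    "upper_forearm": "volume_up",
    "lower_forearm": "volume_down",
}

def handle_tap(location):
    """Dispatch a classified tap location to a player command."""
    command = COMMANDS.get(location)
    if command is None:
        return "ignored"  # unrecognized location: do nothing
    return command

print(handle_tap("thumb"))  # play_pause
```

In a full system, the output of a classifier like the one sketched earlier would feed this dispatch step directly.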
Skinput is the product of a collaboration between Carnegie Mellon's Harrison and Desney Tan and Dan Morris of Microsoft Research. For now, Skinput is only a proof-of-concept for alternate ways to interface with electronics, but the team isn't ruling out that it could become a commercial product someday.
Harrison also pointed out that the next generation of miniature projectors will be small enough to fit in a wristwatch, making Skinput a complete and portable system that could be hooked up to any compatible electronics no matter where the user goes.
Besides being bulky, the prototype has a few other kinks that need to be worked out. For instance, over time the accuracy of interpreting where the user taps can degrade.
"We (the researchers) have worn it for extended periods of time," Harrison told TechNewsDaily. "But it does occasionally need to be retrained. As we collect more data, and make the machine learning classifiers more robust, this problem will hopefully reduce."
Skinput and similar sensor devices developed by the team could have applications beyond simple menu screens. Tan recently demoed a Skinput-like interface that allowed him to play Guitar Hero, a popular music game, without the requisite plastic guitar controller. The results were still a little crude, but impressive because they demonstrated the viability of game controllers that don't require physical controls.
This is especially relevant given Project Natal, the motion-sensing gaming technology Microsoft is developing, which has been gathering a lot of attention. Despite working in vastly different ways, both systems focus on letting users play games with their own bodies, without the need for accessories and game controllers.
