Friday, March 5, 2010

Skinput Turns Body Parts into a Touchscreen

Devices with significant computational power and capabilities can now be easily carried on our bodies. However, their small size typically leads to limited interaction space (e.g., diminutive screens, buttons, and jog wheels) and consequently diminishes their usability and functionality.
Microsoft and Carnegie Mellon University recently unveiled Skinput, showing how it can turn your own body into a touchscreen interface. Skinput uses a series of sensors to track where a user taps on their arm. Previous attempts at projected interfaces relied on motion tracking to determine where a person tapped.
Skinput takes a different and novel approach: it "listens" to vibrations traveling through your body. Tapping different parts of the arm produces distinct vibration signatures, depending on the density and arrangement of bone, tendon, and muscle in that area. An armband of sensors picks up these vibrations and discerns where the user tapped.
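To make the idea concrete, here is a minimal, hypothetical sketch in Python of how tap-location classification from vibration signals could work: summarize each tap's sensor waveform as energy in a few frequency bands, then match it against per-location templates learned from labeled example taps. The sample rate, band count, and nearest-centroid matching here are illustrative assumptions, not the actual Skinput pipeline, which uses its own sensors, features, and classifier.

```python
import numpy as np


def band_energies(waveform, sample_rate=5500, n_bands=8):
    """Summarize one tap's vibration waveform as normalized energy per frequency band.

    The sample rate and band count are illustrative placeholders, not Skinput's values.
    """
    spectrum = np.abs(np.fft.rfft(waveform)) ** 2
    freqs = np.fft.rfftfreq(len(waveform), d=1.0 / sample_rate)
    # Sum spectral power into n_bands equally spaced frequency bins.
    feats, _ = np.histogram(freqs, bins=n_bands, weights=spectrum)
    return feats / (feats.sum() + 1e-12)  # normalize so overall tap loudness matters less


class TapLocationClassifier:
    """Nearest-centroid classifier over per-location feature templates.

    A deliberately simple stand-in for the machine-learning classifier a real
    bio-acoustic sensing system would train on labeled taps.
    """

    def __init__(self):
        self.centroids = {}  # location label -> mean feature vector

    def fit(self, labeled_taps):
        """labeled_taps: iterable of (location_label, waveform) pairs."""
        grouped = {}
        for label, waveform in labeled_taps:
            grouped.setdefault(label, []).append(band_energies(np.asarray(waveform, dtype=float)))
        self.centroids = {label: np.mean(feats, axis=0) for label, feats in grouped.items()}

    def predict(self, waveform):
        """Return the trained location whose template is closest to this tap's features."""
        feats = band_energies(np.asarray(waveform, dtype=float))
        return min(self.centroids, key=lambda label: np.linalg.norm(feats - self.centroids[label]))
```

Trained on a handful of labeled taps per location (say, "forearm", "wrist", "palm"), `predict()` returns the closest-matching location for a new tap; an actual system would rely on richer features and a more capable classifier, but the same listen-then-classify structure applies.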
