There is a really interesting write-up of several MSR projects on the topic of natural user interfaces over on the Research.Microsoft.com site.
One project I’m very interested in is led by Desney Tan, a senior researcher in the Visualization and Interaction for Business and Entertainment (VIBE) group. Tan is working on ‘mobile natural user interfaces’, which he demonstrated with his project ‘NUI with Physiological Sensing’.
The demo has two parts. One utilizes electromyography—the sensing of electrical muscle activity—to infer finger gestures. The other uses bio-acoustic sensors that detect energy transmissions through the body, turning the human body itself into a tap-based input device. Both can be activated by wearing a simple armband on the upper forearm, which sends the signals wirelessly to a computing device.
There are several other projects worth taking a look at. However, if you feel like a spud walking down the high street with your mobile stuck out in front of you as it gives you directions, I’m not sure these projects will reduce the level of embarrassment.