The TechFest preview included visits to a handful of booths set up in Microsoft’s on-campus conference center, showing research done in Redmond and at labs in Europe and Asia.
One focused on prototype input systems worn on the arm, including one called “Skinput” that projected virtual buttons onto the skin, letting a computer be controlled with taps along an arm or even on the side of a coffee cup held in the hand.
Microsoft researcher Desney Tan demonstrated a system using armbands with muscle sensors that worked as computer interfaces. Instead of clicking a button or touching a screen, users moved their fingers and hands; the bands sensed the resulting muscle activity and translated it into input.
Tan demonstrated the system by playing “air guitar” to control the “Guitar Hero” game.
There was no word on when the arm-worn devices might become real products, but Rick Rashid, senior vice president of research, noted that this sort of prototype demonstration leads to products such as the Xbox Natal controller.
Other demonstrations included a system for realistically rendering brush strokes on a computer, using sensors in a pad on which the user “paints,” and a gyroscopic “cloud mouse” used for navigating 3D interfaces that may be used to present rich data from online “cloud” services.
Researchers also showed a phone system that transcribes and translates voice conversations in real time. The transcription feature is coming soon to Microsoft’s Exchange messaging system, but the translation feature is still being developed.
A remote datacenter handles the translation and transcription, so the service works with almost any device, and it continuously trains itself to better understand individual users.
Here’s Tan demonstrating the “Always Available Input With Muscle Computer Interfaces”: