In the last week or two there have been several interesting new links about NUI research and innovations. I thought I'd share them all together in hopes that they are interesting to you as well.
Researchers at Lancaster University are correlating the timing of cell phone accelerometer data with touch data on a touch table to uniquely identify users. When a user touches the surface with the phone, the system can create user-specific reactions, such as pulling data from that phone rather than from another person's phone. Of course, the phone needs to run some software and already be paired via Bluetooth or Wi-Fi, but this is an interesting example of sensor fusion.
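The core of the idea is simple: the table's touch-down timestamp should line up with a spike in the paired phone's accelerometer stream. Here's a minimal sketch of that matching step, assuming each phone streams timestamps of accelerometer "bump" events; the function names and the tolerance value are my own illustrative assumptions, not the researchers' implementation.

```python
TOLERANCE_S = 0.05  # assumed max clock skew/latency between phone and table


def identify_user(touch_time, phone_bumps, tolerance=TOLERANCE_S):
    """Return the phone whose accelerometer bump best matches the touch.

    touch_time  -- timestamp (seconds) of a touch-down on the table
    phone_bumps -- dict mapping phone id -> list of bump timestamps
    """
    best_phone, best_delta = None, tolerance
    for phone_id, bumps in phone_bumps.items():
        for t in bumps:
            delta = abs(t - touch_time)
            if delta <= best_delta:
                best_phone, best_delta = phone_id, delta
    return best_phone  # None if no phone bumped within tolerance


# Example: one phone registered a bump 10 ms before the touch-down.
bumps = {"alice_phone": [12.34], "bob_phone": [9.80]}
print(identify_user(12.35, bumps))  # -> alice_phone
```

In practice the hard part is the clock synchronization and latency compensation between the phone and the table, which is exactly what makes the timing correlation interesting.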
Ina Fried has a good write-up about a new Microsoft Research project by Andy Wilson, one of the founders of the Surface Computing group at Microsoft, and his team. They've taken Surface Computing a step further: using ceiling-mounted projectors and depth cameras (perhaps borrowing a bit from Kinect technology), they have turned an entire room into a Surface. A standard wooden table-top becomes a touch-sensitive Surface.
The interesting thing is that the system can detect presence and coordinate interactions between the interactive zones. The video shows them picking up a video from the table, carrying a representative dot across the room, and dropping it onto a wall to play the video. They also show a menu system created in a column of air, where you select an item by hovering your hand at the right height; the menu items are projected onto your hand.
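The hover menu boils down to mapping the hand's height inside the air column to a menu slot. Here's a toy sketch of that mapping, assuming the depth camera reports the hand's height in meters; the menu items, column bounds, and function names are all my assumptions for illustration, not details from the project.

```python
MENU_ITEMS = ["Play", "Pause", "Stop", "Share"]  # assumed example items
COLUMN_BOTTOM = 0.8   # meters above the floor where the menu column starts
COLUMN_TOP = 1.6      # meters where it ends


def item_under_hand(hand_height):
    """Map a hand height inside the air column to the menu item to project."""
    if not (COLUMN_BOTTOM <= hand_height < COLUMN_TOP):
        return None  # hand is outside the menu column
    # Normalize height into [0, 1), then quantize into equal slots.
    fraction = (hand_height - COLUMN_BOTTOM) / (COLUMN_TOP - COLUMN_BOTTOM)
    return MENU_ITEMS[int(fraction * len(MENU_ITEMS))]


print(item_under_hand(0.9))  # low in the column -> "Play"
```

The projector would then render the returned item onto the hand itself, which is what makes the demo feel so different from a screen-based menu.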
The interface and interaction techniques are still clunky and unrefined, but this is a research project and they're exploring new capabilities. I think about these types of future interactions a lot. We're going to see more and more of these integrated interactions, reacting to presence as well as touch, in the future.
Mary Jo Foley updates us about another Microsoft Research project, Manual Deskterity. A video of Manual Deskterity was released earlier this year and now the team has published a paper with more details of their findings and user research. The project focuses on multi-modal and bi-manual (two-handed) interactions.
They use an IR light pen and a Microsoft Surface, but the techniques are applicable to any type of pen-and-touch system. (Mary points out a connection to the leaked Courier videos from last year and the InkSeine project.) Note that most commercial tablet PCs available today that support both stylus and touch only allow one at a time; you'll need special drivers to do both at once.
Bill Buxton, "NUI - What's in a name?"
Bill Buxton recently gave another presentation about NUI, this time at the Microsoft Development Center in Copenhagen. The presentation runs about 60 minutes, with another 30 minutes of Q&A. It combines some of his previous talks from this year with a few new things. I'd encourage you to set aside some time to watch it, even if you've already seen everything else he's ever presented. He has a great way of breaking down difficult-to-understand topics and making you see things in a new way.