The post below is cross-posted from the Kinect for Windows Developer blog. The introduction is written by Ben Lower of the Kinect for Windows team and the body is written by me (Josh).
The following blog post was guest-authored by K4W MVP Joshua Blake. Josh is the Technical Director of the InfoStrat Advanced Technology Group in Washington, D.C., where he and his team work on cutting-edge Kinect and NUI projects for their clients. You can find him on Twitter @joshblake or at his blog, http://nui.joshland.org.
Josh recently recorded several videos for our Kinect for Windows Developer Center and will be contributing three posts this month to the blog.
I've been doing full-time natural user interface (NUI) design and development since 2008, starting with multi-touch apps for the original Microsoft Surface (now called “PixelSense”) and most recently creating touch-free apps using Kinect. Over this time, I have learned a great deal about what it takes to create great natural user interfaces, regardless of the input or output device.
One of the easiest ways to get involved with natural user interfaces is by learning to create applications for the Kinect for Windows sensor, which has an important role to play in the NUI revolution. It is inexpensive enough to be affordable to almost any developer, yet it allows our computers to see, hear, and understand the real world much as we do. It isn't enough to just mash up new sensors with existing software, though. In order to reach the true potential of the Kinect, we need to learn what makes a user interface truly ‘natural’.
The Kinect for Windows team generously offered to record several videos of me sharing my thoughts on natural user interface design and Kinect development. Today, you can watch the first three of these videos on the Kinect for Windows Developer Center.
Introduction to Natural User Interfaces and Kinect
In this video, I present the most important ideas and concepts that every natural user interface designer or developer must know, with concrete examples drawn from Kinect development. This video covers: what natural user interfaces are, what ideas to consider when designing a natural user interface, and the difference between gestures and manipulations.
Kinect PowerPoint Control
This pair of videos covers my Kinect PowerPoint Control sample project. The “Design” video quickly demonstrates the features of the application, and the “Code Walkthrough” video explains the most important parts of the code. The project source code is available under an open source license at https://kinectpowerpoint.codeplex.com.
I use this app all the time to control my PowerPoint presentations (such as the Intro to NUI video above) with Kinect. The app demonstrates the bare minimum code required to do simple custom gesture recognition using Kinect skeleton data, and how to respond to basic voice commands using Kinect speech recognition. I have found that many Kinect developers have trouble getting started with gesture recognition, so the features in the sample are deliberately minimal so that the code is easy to read and learn from.
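To make the pattern concrete, here is a minimal, self-contained sketch of the two techniques the sample demonstrates: checking joint positions from skeleton frames for a simple gesture, and loading a small grammar of voice commands. This is not the sample's actual code; it assumes the Kinect for Windows SDK v1 managed API and the Microsoft.Speech runtime with the Kinect language pack installed, and the "hand above head" gesture, command phrases, and confidence threshold are illustrative placeholders.

```csharp
// Minimal sketch (not the sample's exact code): detect a toy "hand raised"
// gesture from skeleton data and listen for two hypothetical voice commands.
using System;
using System.Linq;
using Microsoft.Kinect;
using Microsoft.Speech.AudioFormat;
using Microsoft.Speech.Recognition;

class MinimalKinectCommands
{
    static void Main()
    {
        // Use the first sensor that is plugged in and ready.
        KinectSensor sensor = KinectSensor.KinectSensors
            .FirstOrDefault(s => s.Status == KinectStatus.Connected);
        if (sensor == null) { Console.WriteLine("No Kinect found."); return; }

        sensor.SkeletonStream.Enable();
        sensor.SkeletonFrameReady += OnSkeletonFrameReady;
        sensor.Start();

        // Find the Kinect-specific speech recognizer, if installed.
        RecognizerInfo recognizer = SpeechRecognitionEngine.InstalledRecognizers()
            .FirstOrDefault(r =>
            {
                string value;
                r.AdditionalInfo.TryGetValue("Kinect", out value);
                return "True".Equals(value, StringComparison.OrdinalIgnoreCase);
            });

        if (recognizer != null)
        {
            var engine = new SpeechRecognitionEngine(recognizer.Id);

            // Hypothetical command phrases; the grammar constrains what
            // the recognizer will listen for.
            var commands = new Choices("next slide", "previous slide");
            var builder = new GrammarBuilder { Culture = recognizer.Culture };
            builder.Append(commands);
            engine.LoadGrammar(new Grammar(builder));

            engine.SpeechRecognized += (s, args) =>
            {
                if (args.Result.Confidence > 0.7) // reject low-confidence matches
                    Console.WriteLine("Heard: " + args.Result.Text);
            };

            // Feed the sensor's 16 kHz, 16-bit mono PCM audio to the recognizer.
            engine.SetInputToAudioStream(
                sensor.AudioSource.Start(),
                new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));
            engine.RecognizeAsync(RecognizeMode.Multiple);
        }

        Console.ReadLine(); // keep running until Enter is pressed
        sensor.Stop();
    }

    static void OnSkeletonFrameReady(object sender, SkeletonFrameReadyEventArgs e)
    {
        using (SkeletonFrame frame = e.OpenSkeletonFrame())
        {
            if (frame == null) return;

            var skeletons = new Skeleton[frame.SkeletonArrayLength];
            frame.CopySkeletonDataTo(skeletons);

            // Use the first actively tracked skeleton.
            Skeleton skeleton = skeletons.FirstOrDefault(
                s => s.TrackingState == SkeletonTrackingState.Tracked);
            if (skeleton == null) return;

            SkeletonPoint head = skeleton.Joints[JointType.Head].Position;
            SkeletonPoint hand = skeleton.Joints[JointType.HandRight].Position;

            // Toy "gesture": right hand above the head. A real app would
            // debounce this so it fires once per raise, not once per frame.
            if (hand.Y > head.Y)
                Console.WriteLine("Right hand raised");
        }
    }
}
```

The real sample goes on to translate detections like these into PowerPoint navigation; the sketch above shows only the detection half, which is the part most developers get stuck on.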
Keep watching this blog next week for part two, where I will share more videos showing advanced features of the Kinect SDK. I will introduce two more of my sample Kinect projects, including a completely new, previously unpublished sample that uses Kinect Fusion for object scanning.
-Josh
@joshblake | joshb@infostrat.com | mobile +1 (703) 946-7176 | http://nui.joshland.org