Thursday, January 20, 2011

Natural User Interfaces in .NET book update

Today Manning pushed out a new update to my book’s MEAP (early access eBook). This is the first update in quite a while because I’ve been secretly updating chapters and writing new chapters that include NDA content.

Note: please read through to the end of this post for a 50% off coupon for my book good until February 20, 2011.

The secret NDA content is about the future of the Surface platform and Surface SDK. Now that Surface 2.0 has been announced, I can start talking about it and pushing out the non-NDA content. This update includes a revised table of contents and a new title, Natural User Interfaces in .NET, to reflect the reorganized content and focus. Here is the new cover:


Based upon reader and reviewer feedback and the announcements of Surface 2.0 and Kinect, we decided to focus the book on the WPF 4 platform with Surface SDK 2.0. Together, WPF 4 and Surface SDK 2.0 form the most advanced NUI platform available, targeting a wide variety of devices, form factors, and NUI input technologies. Surface SDK 2.0 is the successor to both the Surface Toolkit and Surface SDK 1.0 and will run on any Windows 7 machine, including desktops, tablets, slates, and Surface 2.0 units.

I’ve also decided to include Kinect content, because I have determined that we can use the same WPF 4 and Surface SDK 2.0 platform to target depth-sensing camera interfaces as well. In early experiments, I have injected hand tracking and hand pose information into the WPF 4 touch stack and can control any WPF 4 touch interface with my hands. Using this unified NUI platform, you can write a single interface that works with all Windows 7 multi-touch devices, Surface 2.0, and Kinect and similar depth sensors.
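To give a flavor of how this kind of injection can work, here is a minimal sketch that promotes hand positions into the WPF 4 touch stack by subclassing `TouchDevice`. This is my illustration of the general technique, not the book's actual code: the hand-tracker wiring (where the `Point` values come from) is assumed, and the `HandDown`/`HandMove`/`HandUp` method names are hypothetical, not part of any SDK.

```csharp
using System.Windows;
using System.Windows.Input;
using System.Windows.Media;

// Sketch: a custom TouchDevice that feeds hand positions from a
// depth sensor into WPF 4's touch stack. Once ReportDown/ReportMove
// fire, existing touch handlers and manipulation processors respond
// as if a finger were on the screen.
public class HandTouchDevice : TouchDevice
{
    private Point position;

    public HandTouchDevice(int handId) : base(handId) { }

    // Call when a tracked hand first appears over the window.
    public void HandDown(Point screenPosition, Visual rootVisual)
    {
        position = screenPosition;
        SetActiveSource(PresentationSource.FromVisual(rootVisual));
        Activate();    // registers this device with the touch stack
        ReportDown();  // raises TouchDown on the element under the point
    }

    // Call every frame while the hand remains tracked.
    public void HandMove(Point screenPosition)
    {
        position = screenPosition;
        ReportMove();  // raises TouchMove; drives manipulations
    }

    // Call when the hand is lifted or tracking is lost.
    public void HandUp()
    {
        ReportUp();
        Deactivate();
    }

    public override TouchPoint GetTouchPoint(IInputElement relativeTo)
    {
        Point point = position;
        if (relativeTo != null)
        {
            point = ActiveSource.RootVisual
                .TransformToDescendant((Visual)relativeTo)
                .Transform(position);
        }
        return new TouchPoint(this, point,
            new Rect(point, new Size(1.0, 1.0)), TouchAction.Move);
    }

    public override TouchPointCollection GetIntermediateTouchPoints(
        IInputElement relativeTo)
    {
        // No intermediate points between frames in this simple sketch.
        return new TouchPointCollection();
    }
}
```

The key idea is that WPF's touch stack is extensible at the device level, so any input source that can produce 2D contact points can drive unmodified touch interfaces.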

Of course, you will want to design the interface differently based upon which input device your application is using. For example, Surface lets you detect tagged objects and finger orientation, while hand-tracking depth sensors can give you extra information about hand poses and gestures. That’s why I’ve organized the book into three parts:

Part 1: Introducing NUI Concepts
What developers need to know about NUI design

Part 2: Learning WPF Touch and Surface SDK
A unified NUI platform

Part 3: Building Great Experiences
How to take advantage of the unique capabilities of your input devices

I still have a few more chapters to complete, but the content-complete milestone is in sight. I’ve been getting great feedback from my MEAP subscribers and continue to make improvements based upon it. (Anyone who pre-orders the eBook or print book is also subscribed to MEAP updates and can read the completed chapters as they become ready.)

If you haven’t pre-ordered my book yet, you can buy the eBook or print book (and get the MEAP for free) for 50% off until February 20, 2011. All you need to do is enter coupon code multinet50 during checkout. If you buy it directly after clicking this link, Manning will give me a couple of extra bucks, which I would appreciate. You can also download a free sample chapter and the sample code from that link.


  1. I was hoping that the Kinect content would be included in the new MEAP update. If you're writing in chapter order then it looks like you have yet to get to that chapter (Chapter #13, I'm guessing).

  2. Mike,
    I'm still doing some research and organizing the Kinect content, but I'll push it to readers as soon as possible. It will be in Chapters 11 and 13.

    Is there anything specific you’d be interested in seeing regarding Kinect/motion tracking?

