Thursday, September 15, 2011

Windows 8 on Microsoft Surface 1.0 (video)

I've been neglecting this blog a bit since I'm trying to focus on finishing my book, but I had to break my self-imposed radio silence to share this news:

I installed Windows 8 on Microsoft Surface 1.0 hardware, and it works! By "it works", I mean the Microsoft Surface 1.0 shell and applications run as expected on Windows 8, and immersive-mode applications and regular desktop applications can be controlled using the Surface vision system touch input.

Windows 8 Start screen working on Microsoft Surface 1.0
I tweeted the above photo while videotaping proof of this technical hackery. The video is embedded below:

Yes, this is the Windows 8 that was released publicly only a little over a day ago. I did not have early access, so this is another 24-hour hack for me, like Kinductor from the Kinect Code Camp, though it didn't take me a full 24 hours of straight effort.

How did I do it?
Since Surface 1.0 is intended to run on Vista, the fact that any of this works is a testament to the backwards-compatibility story of Windows 8. In short, I shrank the Surface's Vista partition, then installed Win 8 from a USB key to the new partition (make it 40 GB minimum if you want to develop on it). The installer automatically set up dual booting, so I can boot back to Vista for the "correct" Surface 1.0 environment if I like.
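
If you'd rather script the repartitioning step, the shrink can be done with diskpart from an elevated command prompt. The volume number and size below are illustrative only; check the `list volume` output on your own machine before shrinking anything.

```
rem Illustrative diskpart script -- run with: diskpart /s shrink.txt
list volume
rem Select the Vista system volume (the number varies by machine)
select volume 1
rem Free roughly 40 GB for the new Windows 8 partition
shrink desired=40960
```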

The tricky part was getting the Surface SDK (the full version that ships on Surface, not the workstation edition) to install, which required editing the MSI launch conditions with Orca, and then getting the Surface vision inputs to drive the Windows 8 UIs. For the latter I used MultiTouchVista, which has a component that can connect to SurfaceInput.exe as a client and push the touch data to an HID touch input driver. Windows 8 has no idea it isn't talking to a capacitive touchscreen. (This technique also works for combining Windows 7 with Surface.) The Win8 UI also isn't aware of tags or blobs, but if you want to write a Surface application that uses those, just use the Surface SDK! Some other time I'll post a deep technical walkthrough, but this thread should be sufficient to get you started.
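
To make the plumbing concrete, here is a hypothetical sketch (Python, purely illustrative; this is not MultiTouchVista's actual API) of the kind of translation the bridge performs: a normalized Surface vision-system contact becomes an HID-style digitizer report in desktop pixel coordinates.

```python
# Hypothetical sketch of bridging Surface contacts to HID-style touch
# reports. All names here are invented for illustration.

def surface_to_hid(contact_id, x_norm, y_norm, state, screen_w=1024, screen_h=768):
    """Map a normalized Surface contact (0.0-1.0) to an HID digitizer report."""
    return {
        "id": contact_id,                 # tracking ID, stable across frames
        "x": round(x_norm * screen_w),    # pixel X on the Win8 desktop
        "y": round(y_norm * screen_h),    # pixel Y on the Win8 desktop
        "tip": state in ("down", "move"), # tip switch: finger in contact
    }
```

The real driver chain does much more (report descriptors, contact counts, frame timing), but conceptually each Surface contact frame is repackaged like this before the HID driver hands it to the Windows touch stack.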

All of the Surface hardware drivers and Surface applications run great. The immersive-mode touch responsiveness is a bit flaky, as you can see in the video, but that could be any number of things, from the MultiTouchVista plug-ins to the Win8 touch stack to the Start screen itself. The Win8 touch visualizations (trails and auras) always worked fine, so I'm leaning towards the problem being in the Win8 UI layer. This is a pre-beta build, though, so that's entirely understandable.

I do need to say that this configuration is completely unsupported and would probably violate the warranty if it hadn't already expired for this particular machine. It also has nothing to do with Surface 2.0. You won't ever see Microsoft ship a configuration like this for general use, since running three shells (Surface, Immersive, and Desktop) side by side would be entirely confusing for real users. In the video I was basically in the equivalent of Surface administrator mode, manually running SurfaceShell.exe and then switching ad hoc into immersive mode, which is like a full-screen Start menu. There are too many competing metaphors and modes for this configuration to be usable by the general public.

Does Immersive mode make the Surface shell obsolete?
No way. It makes the Surface shell even more important. My initial impression from actually using the Win8 immersive mode on Surface is that it really doesn't work for a large, horizontal, multi-user form factor. For one, everything is oriented in a single direction; for another, I have to reach too far to get to certain interface elements across the table. That's the opposite of the ergonomic intent of immersive mode, which was designed for smaller screens and slates. It might do better on a vertical, wall-mounted form factor. There is definitely a place for both immersive mode and the Surface shell, since they are each designed to do different things in different scenarios. This is especially true if we account for the improved Surface shell coming soon with the Surface 2.0 hardware.

I think we'll find there is a large difference between designing an immersive-mode application to run on a slate or PC and be used by one person, and designing Microsoft Surface applications to be used by multiple people in a 360-degree interface.

What Surface and Win8 can learn from each other
Even though the designs for each of these modalities need to be different, there is a lot they can learn from each other. I really like swiping in from the edge to open the charms and app bars, and I could see that being incorporated into the version of Surface after 2.0. Windows 8 isn't perfect, though: as soon as there is an extra finger somewhere in immersive mode, the whole interaction breaks down.

It may just be the developer bits, but many if not most of the Win8 sample games were almost unusable as Surface applications. My three-year-old tried to play a few with me, but we kept getting in each other's way since the games were designed for a single orientation. Worse, my touches would block her from interacting with the game, and vice versa. Some of the interns who developed those samples apparently did not account for multiple types of input (mouse and touch), or even multi-touch at all, indicating they were still thinking about interactions from a single-point, mouse-driven point of view.

I know that slates and PCs with a single user may not need to plan for a full 360-degree interface, but Windows 8 developers need to learn the lessons that Microsoft Surface developers learned years ago about allowing multiple inputs and multiple touches, and letting the user recover from mental mistakes and accidental touches.

Watching the Windows 8 unveiling during the Build Day 1 keynote was somewhat amazing for me, because all of a sudden NUI design and development skills (my focus since 2008) have gone from a small niche to a core requirement for the Windows platform. Anyone interested in building truly great Windows 8 applications needs to take an active interest in learning from Surface developers. I expect some of my colleagues in the Surface community will come up with the most amazing Windows 8 applications in no time at all. (Feel free to send those apps to me for testing on my FrankenSurface Windows 8 machine!)

Natural User Interfaces in .NET
While at first I was nervous that the Windows 8 announcements would make my almost-complete book a waste, now that I think about it, things are still pretty good. If you want to use the premier technology for creating Natural User Interfaces, then .NET and WPF are still the way to go. You can't target Surface or Kinect in immersive mode, and the immersive-mode interface isn't well suited for multiple people. You also have to consider that Windows 8 is still six months to a year away, but you can develop NUIs for Windows 7 multitouch, Surface, and Kinect today with WPF. You can even give them a Metro look and feel.

If this interests you and you haven't already, take a look at the free sample chapter of my book, and if you like it, pre-order it and get the early access PDF today with all of the completed chapters.

Monday, April 18, 2011

Official Kinect SDK to be Open Source

Microsoft had previously announced its plans to make the Kinect for Windows SDK available this spring, initially for non-commercial use and later for commercial use. At the MIX11 conference this past week, during the day 2 keynote (starting at the 1:40:10 mark of the video), they reiterated this plan and gave a few more details, including support for C++, C#, and VB development. They did not mention it explicitly, but they used WPF during the live coding demonstration.

On day 3, several Kinect and Microsoft Research (MSR) team members presented a panel Q&A session. At 26:05 of the Kinect panel video, I introduced myself as the founder of the OpenKinect community and asked whether they had considered embracing the open source Kinect movement started by this community, which has led to hundreds of Kinect hacks around the world, by open sourcing their code.

To my surprise, Dr. Kristin Tolle, Director of MSR’s Natural User Interface team, responded that they will be publishing the Kinect SDK as open source! She referenced the earlier discussion about embracing and enabling the community to take what Microsoft has created and use it in new ways, and mentioned that MSR has a history of releasing open source SDKs.

When I returned to my seat, another MSR researcher approached me and suggested that Kristin may have misunderstood my question or misspoken; however, I followed up with Kristin afterwards and reconfirmed that she meant open source in the sense that the public can modify and recompile the sources as well as share the modifications. She also emphasized that they want to enable people to share modifications and enhancements.

This is great news!

It is not clear yet what type of license they will use. If the license allows modification and redistribution but is, at least initially, limited to non-commercial use, then it would not be a standard open source license according to the Open Source Definition. Looking at other projects published by MSR, I could see them choosing the MSR-LA as used by the Singularity experimental OS.

I’ll restate here that Microsoft has on multiple occasions confirmed that a commercial license will be available at a date after the initial non-commercial release. The ideal scenario would be if they switched from MSR-LA to a true open source license and set up a collaborative project between MSR, the Kinect team, and the open community, styled after the successful NuGet project, which uses the Apache 2 license and includes Microsoft and non-Microsoft contributors.

Considering the reaction of the other MSR researcher to this open source statement, there may be some internal confusion about whether this will actually happen. I think the best thing we as a community can do is express support for Microsoft open sourcing the Kinect for Windows SDK (including the skeleton tracking and training components), even if they use a non-commercial license initially, and keep the pressure on them to build a better relationship with the open community and transition to a NuGet-style collaboration with a liberal license such as Apache 2.

Update 4/18 7:34pm: Mary Jo Foley picked up this story, but the Microsoft spokesperson she talked to denied that the Kinect SDK will be open source. As she notes, Microsoft has pulled 180s before regarding Kinect: after spokespeople were initially hostile to the idea of Kinect hacking, Xbox executives later embraced the idea of people using Kinect for non-gaming purposes on the PC. Let’s hope Microsoft stays open to this idea.

Monday, April 4, 2011

NUI at MIX11

It's only a week away from MIX11 and I'm very excited. NUI will be represented even more than last year. Here’s a rundown of all the NUI events I’m aware of:

Monday, April 11
Ballroom D
Open Source Fest
A showcase of 52 open source projects, including InfoStrat.MotionFx, our project for integrating Kinect with WPF 4 Touch.
Tuesday, April 12
Lagoon H
A Whole NUI World: Microsoft Surface 2 and Windows Touch
by Luis Cabrera of the Surface team. Luis will get everyone up to speed about developing with Surface SDK for Surface 2 and Windows 7 touch devices.
Tuesday, April 12
Lagoon H
The Microsoft Surface MVPs Present: Natural User Interfaces, Today and Tomorrow; An Interactive Discussion and Demonstration
by me and the rest of the Surface MVPs. We will have several cool demos for Surface 2 and Kinect.

(This is the open call session that I submitted and the community voted in. Thanks!)
Tuesday, April 12
Lobby in front of The Commons
NUI MIXup at MIX11
NUI MIXup is back! RSVP at the link above. Last year’s NUI MIXup was a great success. Frank La Vigne, a Microsoft Public Sector Developer Evangelist, will be sponsoring drinks and appetizers for the group. I hope to see you there!
Wednesday, April 13
Lagoon H
Audio for Kinect: From Idea to "Xbox, Play!"
by Ivan Tashev of Microsoft Research, the man behind Kinect’s audio processing algorithms.
Thursday, April 14
Lagoon H
Interactive Panel: Kinect and Natural User Interfaces (NUI)
A panel featuring luminaries from Microsoft Research talking about Kinect
April 12-14
Breakers L
UX Lightning Sessions
Three UX Lightning sessions, each with four speakers. Not purely NUI, but probably still of interest to most NUI enthusiasts.
April 12-14
Shorelines A
The Connect Lounge
Microsoft Research and Coding4Fun will be showing off some cool NUI and Kinect projects.

I can’t say where yet, but InfoStrat will also have a presence with some of our demos. I’m also hearing rumors of some things that might be announced during the keynotes, but we’ll just have to wait and see.

If you’re looking for me at MIX11, the best way is to check my Twitter stream @joshblake or send me a mention. You can also email me at joshblake at gmail com.


Thursday, January 27, 2011

MIX11 Open Call Surface and NUI Sessions

MIX is a great conference that focuses on development, design, and upcoming technology trends. Last year, MIX started an “Open Call for Content” where the general public was invited to submit sessions and then vote to decide which sessions would be given actual spots during the conference. The Open Call sessions can be about any topic the community desires to hear, and they augment the Microsoft sessions that make up the meat of the conference.

This year Open Call is back with a 228% increase in proposals! Out of 384 entries, they posted 207 sessions. I read through every single one yesterday, expecting to see some great sessions and some mediocre ones, but to my surprise almost every single session sounds fantastic. MIX11 is going to have some great content.

Last year, I was fortunate enough to present one of the 12 MIX10 Open Call sessions: “Developing Natural User Interfaces with Microsoft Silverlight and WPF 4 Touch [and Windows Phone 7]”. (Read my MIX10 recap.) My session was one of three that focused on NUI, multi-touch, or Surface topics.


This year there are 9 entries about Surface or NUI and I’m hoping several will be picked. I’m organizing a panel session featuring the Microsoft Surface MVPs, some interesting discussion, and some cool demos. You can vote for up to 10 sessions, so please consider including the Surface MVP panel and any of the other NUI sessions that sound interesting to you in your voting.

The Microsoft Surface MVPs present: Natural User Interfaces, Today and Tomorrow; an interactive discussion and demonstration
Joshua Blake; Neil Roodyn; Dennis Vroegop; Rick Barraza; Bart Roozendaal; Josh Santangelo; Nicolas Calvi

The Natural User Interface (NUI) is a hot topic that generates a lot of excitement, but there are only a handful of companies doing real innovation with NUIs, and most of the practical experience in the NUI style of design and development is limited to a small number of experts. The Microsoft Surface MVPs are a subset of these experts who have extensive real-world experience with Microsoft Surface and other NUI devices. This session is a panel featuring the Microsoft Surface MVPs and an unfiltered discussion with each other and the audience about the state of the art in NUI design and development. We will share our experiences and ideas, discuss what we think NUI will look like in the near future, and back up our statements with cutting-edge demonstrations prepared by the panelists involving combinations of Microsoft Surface 2.0, Kinect, and Windows Phone 7.

[Vote for this session here!]

I also want to highlight my InfoStrat colleague Josh Wall’s entry, which should be very interesting.

Microsoft Surface at the Smithsonian – 30 seconds to Magic
Josh Wall

How do you design a Microsoft Surface application that creates a magical experience in less than thirty seconds, uses real-world physical objects, and is both fun and educational for kids? That is quite a task, but oh yeah – you need to build seven of these magical applications that are distinct yet connected by a common theme. This was the challenge presented to InfoStrat, a Surface Strategic Partner in Washington DC, by the Smithsonian Institution. This session will explore the design and development process that went into building the innovative physical object interactions for a suite of Microsoft Surface applications used in the “Wonder of Light: Touch and Learn!” exhibit at the Smithsonian Institution in Washington, DC. Josh Wall, director of the InfoStrat Advanced Technology Group, will demonstrate and analyze five different types of physical object interactions that take advantage of the unique vision system in Microsoft Surface. He will also discuss lessons learned from the experience, including designing Surface applications for kids, when to use a linear or non-linear task path, and how to apply Natural User Interface (NUI) concepts to interactive museum exhibits.

[Vote for this session here!]

Here are the other NUI sessions:

Microsoft Surface v2 – designing for the new form factor
Josh Santangelo

Building Really Social Software
Neil Roodyn

Wave, Touch, Pen, Speech, Mouse and Keyboard
Neil Roodyn

How to build a great Microsoft Surface application
Neil Roodyn

How NUI can help to perform complex tasks, like flying a helicopter
Bart Roozendaal

Understanding the Metro design language – from Zune to mobile to Microsoft Surface and beyond
Nathan Moody

[Update 1/27 9:40am: I overlooked the following session! Sorry Davide!]

Silverlight, Windows Phone 7, Multi-touch and Natural User Interfaces
Davide Zordan

There are also many other interesting topics, but I hope you’ll help vote in the NUI and Surface sessions. The MIX content team takes into account the popularity of various topics. (For example, last year they accepted 12 Open Call sessions when they were originally planning only 10.)

Voting is open until midnight PST on February 4th, so lock in your votes now!

P.S. Here is a really nifty pivot viewer for visualizing and filtering the Open Call entries:

Thursday, January 20, 2011

Natural User Interfaces in .NET book update

Today Manning pushed out a new update to my book’s MEAP (early access Ebook). This is the first update in quite a while, and that’s because I’ve been secretly updating chapters and writing new chapters that include NDA content.

Note: please read through to the end of this post for a 50% off coupon for my book good until February 20, 2011.

The secret NDA content is about the future of the Surface platform and Surface SDK. Now that Surface 2.0 has been announced, I can start talking about it and pushing out the non-NDA content. You’ll notice in this update an updated table of contents and a new title, Natural User Interfaces in .NET, to reflect the reorganized content and focus. Here is the new cover:


Based upon reader and reviewer feedback and the announcements of Surface 2.0 and Kinect, we decided to focus the book on the WPF 4 platform with Surface SDK 2.0. The combination of WPF 4 and Surface SDK 2.0 is the most advanced NUI platform available, targeting a wide variety of devices, form factors, and NUI input technologies. Surface SDK 2.0 is the successor to both Surface Toolkit and Surface SDK 1.0 and will run on any Windows 7 machine, including desktops, tablets, slates, and Surface 2.0 units.

I’ve also decided to include Kinect content because I have determined that we can use the same WPF 4 and Surface SDK 2.0 platform to target depth sensing camera interfaces as well. In early experiments, I have injected hand tracking and hand pose information into the WPF 4 touch stack and can control any WPF 4 touch interface with my hands. Using this unified NUI platform, you can write a single interface that users can use with all of the Windows 7 multi-touch devices, Surface 2.0, as well as Kinect and similar depth sensors.

Of course, you are going to want to design the interface differently based upon which input device your application is using. For example, Surface lets you detect tagged objects and finger orientation and hand tracking depth sensors can give you extra information on hand poses and gestures. That’s why I’ve organized the book into the three parts:

Part 1: Introducing NUI Concepts
What developers need to know about NUI design

Part 2: Learning WPF Touch and Surface SDK
A unified NUI platform

Part 3: Building great experiences
How to take advantage of the unique capabilities of your input devices

I still have a few more chapters to complete, but the content complete milestone is in sight. I’ve been getting great feedback from my MEAP subscribers and continue to make improvements based upon feedback. (Anyone who pre-orders the eBook or print book is also subscribed to MEAP updates and can read the completed chapters as they become ready.)

If you haven’t pre-ordered my book yet, you can buy the eBook or print book (and get the MEAP for free) for 50% off until February 20, 2011. All you need to do is enter coupon code multinet50 during checkout. If you buy it directly after clicking this link, Manning will give me a couple extra bucks, which I would appreciate. You can also download a free sample chapter and the sample code from that link.

Tuesday, January 18, 2011

OpenKinect: Creating next-generation interfaces, Jan 20, Arlington, VA

This Thursday, January 20 at 7pm I'll be presenting at the CapArea.NET Silverlight SIG in Arlington, VA. If you're in the area, please come by.

Navy League Building
First Floor Conference Room
2300 Wilson Blvd
Arlington, VA
[Bing Maps | Google Maps] The building is Metro accessible (Court House station).

Free street parking after 6pm.

Presentation Title:
OpenKinect: Creating next-generation interfaces

Presentation Description:
Kinect is awesome. Kinect on your PC is even more awesome. An international community of developers creating free, open source drivers and software for using Kinect with your PC is the most awesomest of all. That is what the OpenKinect Community is about. In this session, Joshua Blake, founder of the OpenKinect Community, will tell the story of how the community worked together to open up Kinect, demonstrate Kinect working on a PC and some videos of what other people have created so far, and show you how you can start programming with Kinect and create next-generation natural user interfaces.

Friday, January 14, 2011

Kinect hand tracking with WPF 4 and Bing Maps 3-D

I’ve been playing around with Kinect and PrimeSensor in WPF for a bit and have achieved a small technical success that I wanted to share in a new video from InfoStrat. Using the same techniques that allow us to use WPF 4 on Surface 1.0, we can now use depth camera hand tracking to control multi-touch applications.

Here is a very rough proof-of-concept where I’m controlling the InfoStrat.VE WPF 4 multi-touch control using a depth camera.

Bing Maps controlled with Kinect 3D-sensing technology

This is just a multi-touch application; I added one line of code to feed the hand tracking into the WPF 4 touch stack. I also display outlines of the tracked hands to provide better feedback about what is going on. In this video I used OpenNI and NITE from PrimeSense.

The tracked hands can participate in all of the multi-touch manipulations and gestures that you’ve already written for your touch application. You can even interact using hand tracking and touch at the same time in the same window. The code that enables this is part of our internal InfoStrat.MotionFx project and will eventually be open sourced, but it needs a bit more work to be practical.

One enhancement we’re planning is hand pose extraction, including palm orientation and finger positions. This would allow you to use hand poses and hand gestures to control whether a hand is “touching” the screen, instead of the current technique, which considers a hand “touching” if it is more than a certain distance from the shoulder. Knowing the hand pose would also enable new types of interactions, just as finger orientation, blob size, and tagged objects enable new interactions on Surface.
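
The current heuristic is simple enough to sketch in a few lines. This is an illustrative Python version, and the threshold value here is an assumption for the example, not the actual number we use:

```python
import math

# A hand counts as "touching" when it is extended beyond a threshold
# distance from the shoulder. Points are (x, y, z) in millimeters, as
# typically reported by the skeleton tracker.
ARM_EXTENSION_THRESHOLD_MM = 450  # assumed value; tune per user and setup

def is_touching(hand, shoulder, threshold=ARM_EXTENSION_THRESHOLD_MM):
    """Return True when the hand is far enough from the shoulder to count as a touch."""
    return math.dist(hand, shoulder) >= threshold
```

Hand-pose extraction would replace this distance test with a pose classifier, so a closed fist or a pinch could mean “touch” regardless of how far the arm is extended.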

In the end, we will want to design interfaces that use motion tracking to take advantage of the unique capabilities of that modality to create truly natural interactions. (Pinch-to-zoom with large arm movements is not the most natural interaction.) What I’ve shown above, though, is that it is feasible to use the WPF 4 Touch stack and Surface SDK as the unified platform for both multi-touch and motion tracking modalities.

Thursday, January 6, 2011

Microsoft Surface 2.0 Data Visualization Controls by InfoStrat

On Wednesday at CES2011, Microsoft finally announced the next version of Microsoft Surface. It takes advantage of a new technology called PixelSense where IR sensing pixels are interwoven with display pixels right in the LCD display itself. The computer specifications were updated and form factor went through a major overhaul, making Surface 2.0 one sexy table computer:


My employer, InfoStrat, is a member of the Surface Technology Adoption Program (TAP), so we’ve had access to early SDK builds and have been giving feedback to the Surface team since May 2010. (Serious feedback… at the last TAP event I won a backpack full of swag for finding and submitting the most bugs during the TAP. Don’t worry, the bugs were all addressed.)

InfoStrat and other Surface TAP members have also been working on applications using Surface SDK 2.0. If you were watching the Microsoft CES2011 keynote (skip to 44:15), then you saw one of those applications. Unfortunately there wasn’t time to feature all of the partner applications during the keynote, but that doesn’t mean that we can’t show them to you anyway!

Today InfoStrat announced that we will be open sourcing several of the WPF 4 data visualization controls we designed for Surface SDK 2.0. This includes the DeepZoom and Pivot controls I’ve been hinting at on Twitter, as well as a few others. Unfortunately, we cannot publish the controls or code quite yet; we have to wait until Surface SDK 2.0 becomes available to the public.

We’ve been working on these controls for several months, so I’m very excited to share a video of them in use in our Surface 2.0 application, InfoStrat Show for Microsoft Surface. This application is designed to take advantage of the unique capabilities of Microsoft Surface to enable natural, engaging presentations and small-group collaboration. This video is just a sneak peek at the controls; we’ll be sharing more as we get closer to the code release. Below the video is InfoStrat’s press release about the controls.

Microsoft Surface 2.0 Data Visualization Controls by InfoStrat. Also shown are the updated visual styles of the Surface SDK 2.0 controls.

For Immediate Release

9 a.m. PST

January 6, 2011

InfoStrat Releases Next-Generation Data Visualization Controls for Microsoft Surface 2.0

Washington DC – January 6, 2011 – InfoStrat today announced plans to support Microsoft Surface 2.0 by releasing a control suite that accelerates the development of next-generation multi-touch data visualizations. The controls will be made available as open source software at no charge in the first half of 2011.

This data visualization control suite provides multi-touch versions of the following controls:

§ Deep Zoom multi-resolution image control that allows high performance display of very high-resolution imagery

§ PowerPoint Viewer which enables slide decks to be arranged and presented using multi-touch

§ Pivot Viewer chart control that allows dynamic sorting and categorization of data

§ Physics Canvas which provides an infinite, dynamic canvas for viewing and organizing content

Other features of the controls:

§ Works on both Microsoft Surface and Microsoft Windows 7 with touch

§ A single application built with the data visualization framework can support multiple hardware form factors including: horizontal multi-touch tables, tablets, and large format vertical touch screens

§ Innovative object recognition to enable rapid data manipulations (only on Microsoft Surface)

Watch a sneak preview of the control suite on YouTube:

InfoStrat is a member of Microsoft’s Technology Adoption Program (TAP) for Microsoft Surface. As a Microsoft Surface 2.0 TAP member, InfoStrat receives early access to hardware and software, allowing InfoStrat to gain expertise and influence the development of the product before it is released to the public.

In 2008, InfoStrat solved the problem of using Bing Maps 3D on Microsoft Surface in a way that performed well and was WPF-friendly. InfoStrat open-sourced the solution as a reusable control for the WPF and Surface community. Since then, the control has received over 120,000 page views and over 8,200 downloads, and has been featured in many of our own applications. This control, known as InfoStrat.VE, has become one of the most popular controls for building mapping applications on Microsoft Surface.

“We are proud to be part of the Microsoft Surface development community,” said Jim Townsend, president of InfoStrat, “and excited about the possibilities of Microsoft’s new version of Surface.”

Microsoft Surface provides a new way to experience and use information and digital content, engaging the senses, improving collaboration and empowering people to interact. Microsoft Surface is at the forefront of developing software and hardware that uses vision-based technology to fundamentally change the way people use computing devices. More information can be found at

Information Strategies ("InfoStrat") is an award-winning Microsoft Gold Certified Partner and a Microsoft Surface Strategic Partner and member of the Technology Adopter Program.

For more information, press only:

Josh Wall, InfoStrat, (202) 364-8822 ext. 202,