Sunday, March 28, 2010

MIX10 NUI session sample code

I promised that I would provide the source code for the TouchGrid and ScatterTouch samples for WPF, Silverlight for web, and Silverlight for Windows Phone 7 from my MIX10 session. I provided a zip file with the sources to the people running the MIX website after my session, but it still hasn't been posted, so I'll just post it here on my blog.

http://www.handsonnui.com/files/MIX10_NUI_EX18.zip

Important: I cannot redistribute the Microsoft Surface Manipulations and Inertia Sample for Microsoft Silverlight files, so you will have to download that yourself and add a reference to System.Windows.Input.Manipulations to the SL_ScatterTouch and WP7_ScatterTouch projects. See the readme.txt for more details.

Of course, if you want to run the Windows Phone 7 samples you'll need the Windows Phone 7 SDK.
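If you haven't worked with that assembly before, here's a rough idea of the pattern it supports once the reference is in place: you feed touch points into a ManipulationProcessor2D and it raises events carrying the combined translation, scale, and rotation deltas. This is just an illustrative sketch, not code from the samples -- the class and member names around the processor are placeholders; see the ScatterTouch projects in the zip for the real wiring.

```csharp
using System;
using System.Collections.Generic;
using System.Windows.Input.Manipulations;

// Illustrative only: feeds touch points into a ManipulationProcessor2D and
// reacts to the translate/scale/rotate deltas it reports.
public class ScatterItemManipulation
{
    private readonly ManipulationProcessor2D processor =
        new ManipulationProcessor2D(Manipulations2D.Translate |
                                    Manipulations2D.Scale |
                                    Manipulations2D.Rotate);

    public ScatterItemManipulation()
    {
        processor.Delta += OnDelta;
    }

    // Call this whenever the touch points change. Timestamps are in
    // 100-nanosecond ticks, so DateTime.UtcNow.Ticks works.
    public void ProcessTouches(IEnumerable<Manipulator2D> manipulators)
    {
        processor.ProcessManipulators(DateTime.UtcNow.Ticks, manipulators);
    }

    private void OnDelta(object sender, Manipulation2DDeltaEventArgs e)
    {
        // e.Delta describes the change since the last event:
        // TranslationX/Y in pixels, ScaleX/Y as factors, Rotation in radians.
        // Apply it to the item's render transform here.
    }
}
```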

Have fun!

Wednesday, March 24, 2010

MIX10 NUI Recap

Well, I'm pretty much recovered from MIX10 (March 15-17 in Las Vegas) by now. The whole experience was a blast. Here are the highlights for me:

Windows Phone 7 Series - Finally we learn the full scoop on the WP7 development story. Very excited about the Silverlight and XNA focus. I'm also relieved to learn that the WP7 flavor of Silverlight has manipulation processors built in. It isn't quite the same as the WPF manipulations or even the Silverlight manipulations sample (no rotation support), but it makes multi-touch development feasible.
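For anyone curious what that built-in support looks like, here is a minimal sketch of the Silverlight for WP7 manipulation events as I understand them: a UIElement raises ManipulationStarted, ManipulationDelta, and ManipulationCompleted, and the delta carries translation and scale but, as noted, no rotation. The element name and transform below are placeholders for illustration, not code from my samples.

```csharp
using System.Windows.Input;
using System.Windows.Media;
using Microsoft.Phone.Controls;

public partial class MainPage : PhoneApplicationPage
{
    public MainPage()
    {
        InitializeComponent();
        // "card" is a hypothetical element defined in XAML whose
        // RenderTransform is a CompositeTransform.
        card.ManipulationDelta += OnCardManipulationDelta;
    }

    private void OnCardManipulationDelta(object sender, ManipulationDeltaEventArgs e)
    {
        var transform = (CompositeTransform)card.RenderTransform;

        // Pan: accumulate the per-event translation delta.
        transform.TranslateX += e.DeltaManipulation.Translation.X;
        transform.TranslateY += e.DeltaManipulation.Translation.Y;

        // Pinch: the scale factors are zero while only one finger is down,
        // so only apply them when they are meaningful.
        if (e.DeltaManipulation.Scale.X > 0 && e.DeltaManipulation.Scale.Y > 0)
        {
            transform.ScaleX *= e.DeltaManipulation.Scale.X;
            transform.ScaleY *= e.DeltaManipulation.Scale.Y;
        }

        // Unlike the WPF 4 manipulations, no rotation delta is reported here.
        e.Handled = true;
    }
}
```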

NUI MIXup - I organized a get-together for NUI and multi-touch enthusiasts Monday night. Marc Schweigert (@devkeydet) was able to get sponsorship from his team for drinks and appetizers for the group. It was well attended, with about 25 or 30 people. We even had Dr. Neil and Bill Buxton (yes, keynote favorite Bill Buxton!) hang out! After we chatted for about an hour in one of the conference spaces, we moved down to the Border Grill, where Marc treated us to drinks and a light dinner. See some pictures below, thanks to fellow presenter and NUI enthusiast Mario Meir-Huber (@mario_mh):


Bill Buxton, Ben Reierson (@seraph_ben), and Dr. Neil chatting about Windows Phone 7 design.


The whole group of us heading towards dinner. Marc leads the charge on the left, and I'm in the middle with the black shirt chatting with James Chittenden (@okayjames).

Just about the highlight of my whole week was the opportunity to chat with Bill for over an hour at dinner, along with several others who attended. It was a humbling experience.

NUI Development session - I didn't go to many sessions in person (they're available for download) because I spent a lot of time Monday and Tuesday putting the final touches on my presentation for Wednesday. I prepared a custom multi-touch presentation tool called NaturalShow that takes the place of PowerPoint. Even Tuesday evening I was sweating some last-minute changes in the application, trying to maximize its performance. I also have to thank Josh Wall for listening to at least two dry runs and giving me great advice and feedback.

You can watch or download my presentation at the MIX10 site.

The presentation itself went pretty smoothly. The session had great attendance (144+) and the audience seemed engaged. My demos worked as expected, and NaturalShow didn't crash or otherwise behave abnormally. I included a hashtag, #MIX10NUI, so that Twitter-inclined people could follow the conversation. One attendee posted this photo mid-presentation:

Taken from about 1/4 of the way back. The smudges around the word "design" are auras of my fingers starting to zoom in.

I have to admit that part of the reason I threw out the #MIX10NUI hashtag was so I could easily go back and read real-time feedback. When I did, I was blown away by the things people said. There were two fair complaints (I took a shortcut when I described Occam's Razor, and I forgot to zoom in on code even though I had practiced doing that), but overall everyone said great things. At the risk of seeming self-focused for a moment, here are two of the most superlative comments that really made my day:

My session was also mentioned in several other people's conference recap blog posts: Luke Wroblewski, Cory Plotts, Roland Weigelt, Ben Reierson.

I guess the point here is that MIX10 was very much about networking and making new acquaintances, as well as meeting online friends for the first time. I'm still pumped by getting to meet so many awesome people, including the ones mentioned above, Steve Dawson, Pete Brown, Laurent Bugnion, Sean Getery and his crew, and many others I know I'm missing. Many times it was just a challenge to associate someone's real name with their Twitter handle.

After the conference itself died down, Josh Wall and I went out to dinner at Hard Rock Cafe, checked out how they were using Microsoft Surface and a custom touch wall, and chatted about how we were going to apply the loads of information and ideas we gathered at MIX10 to improve InfoStrat.

This touch wall at Hard Rock let you explore memorabilia and was very popular.

Josh Wall testing the limits of interaction design at the Hard Rock Surface deployment.

This girl played with a bar-embedded Surface for over an hour.

Thank you so much to all the people I met and those who attended my session. I hope that we will continue our conversations throughout the year, and I can't wait for MIX11!

Saturday, March 13, 2010

NUI at MIX10

I'm headed to Las Vegas tomorrow for the MIX10 conference. I'm excited about a lot of the topics and news that will be discussed. It will be great to finally meet many of the people I've only talked to online, and to make some new connections as well.

If you'll be at MIX10, I'd love to meet you! Here are two places I'll definitely be:

NUI MIXup, Monday the 15th, 7pm, The Commons
I'm organizing a tweetup at MIX10 (hence, MIXup) for NUI and multi-touch enthusiasts. Come hang out and chat with others about our shared interests. Depending upon how many people come, we'll chat for a bit then head to a bar or restaurant. Marc Schweigert will be buying a round or two.

Developing Natural User Interfaces session, Wednesday the 17th, 10:30am, Lagoon F
This is my session! I'm very very excited about it (and thank you again to those who voted for me in the Open Call). I've created a custom multi-touch presentation application that takes the place of PowerPoint so I can walk the walk as I discuss how to create NUIs and multi-touch applications.

I'm available on twitter @joshblake or by email joshblake at gmail dot com. Send me a note if you'll be coming.

See you there!

Monday, March 1, 2010

What is the natural user interface? (book excerpt)

[Update 4/6/2010: Based upon feedback and careful thought, I have slightly modified my preferred definition. See my new post.]

Recently, there has been some discussion on establishing a definition for the term "natural user interface". My friend Richard Monson-Haefel (who just signed with O'Reilly on an iPad SDK book, congrats!) went through several iterations of a definition on his blog and ended up with this:
"A Natural User Interface is a human-computer interface that models interactions between people and the natural environment."

Wikipedia also has a paragraph describing natural user interfaces as invisible interfaces that lack a keyboard and mouse, but it does not offer a really concise definition. Ron George was a major contributor to the NUI Wikipedia article. The first sentence says, in part, that a natural user interface is:

"...a user interface that is effectively invisible, or becomes invisible with successive learned interactions, to its users."

NUIGroup.com also has a wiki page with a lot of descriptive language on the natural user interface, but no concise definition.

As you may have heard, I'm writing a book, Multitouch on Windows. A key part of my approach is teaching readers not just the APIs but also the new ways of thinking required for creating natural user interfaces. My first chapter is titled "The natural user interface revolution" (appropriate, since it was also the title of my first blog post), so right up front I had to tackle the problem of defining the natural user interface for my readers in a concise yet comprehensive way.

I took into account both Richard's and the Wikipedia article's approaches, but I was not satisfied with either. I think Richard is on the right track, but the way he phrases it seems limiting. Whether or not he intended it this way, modeling interactions between people, and between people and the natural environment, implies rather literal interface metaphors in which NUI interactions simulate real-world interactions, but there is no reason why this should be so. Wikipedia's description talks about invisible interfaces, but to a layperson this does not make sense and requires additional explanation of what an invisible interface means.

Now, I don't necessarily disagree with how Richard and the Wikipedia article describe NUI. NUI does have something to do with how people interact with the environment, and NUIs do seem to be invisible, but why are these descriptions true? To help figure this out, I turned to Bill Buxton's presentation in January, where he talked about natural user interfaces. I took detailed notes, and one particular thing that he said really resonated with me:
An interface is natural if it "exploits skills that we have acquired through a lifetime of living in the world."

I used that definition to write a section in chapter 1 on what "natural" means, and then derived my own definition. Below is an excerpt from chapter 1 of my book where I present my definition for natural user interface.



There are several different ways to define the natural user interface. The easiest way to understand the natural user interface is to compare it to other types of interfaces such as the graphical user interface (GUI) and the command line interface (CLI). In order to do that, let's reveal the definition of NUI that I like to use.
A natural user interface is a user interface designed to use natural human behaviors for interacting directly with content.
There are three important things that this definition tells us about natural user interfaces.

NUIs are designed

First, this definition tells us that natural user interfaces are designed, which means they require forethought and specific planning efforts in advance. Special care is required to make sure NUI interactions are appropriate for the user, the content, and the context. Nothing about NUIs should be thrown together or assembled haphazardly. We should acknowledge the role that designers have to play in creating NUI style interactions and make sure that the design process is given just as much priority as development.

NUIs use natural human behaviors

Second, the phrase "designed to use natural human behaviors" tells us that the primary way humans interact with NUI is through our natural behaviors such as touching, gesturing, and talking, as well as other behaviors we have practiced for years and are innately skilled at. This is in contrast to GUI, which is described as using windows, menus, and icons for output and a pointing device such as a mouse for input, or the CLI, which is described as having text output and text input using a keyboard.

At first glance, the primary difference between these definitions is the input modality -- keyboard versus mouse versus touch. There is another subtle yet important difference: CLI and GUI are defined explicitly in terms of the input device, while NUI is defined in terms of the interaction style. Any type of interface technology can be used with NUI as long as the style of interaction focuses on natural human behaviors.

NUIs have direct interaction with content

Finally, think again about GUI, which by definition uses windows, menus, and icons as the primary interface elements. In contrast, the phrase "interacting directly with content" tells us that the focus of the interactions is on the content and directly interacting with it. This doesn't mean that the interface cannot have controls such as buttons or checkboxes when necessary. It only means that the controls should be secondary to the content, and direct manipulation of the content should be the primary interaction method.
Excerpted from Multitouch on Windows by Joshua Blake
Chapter 1, "The natural user interface revolution"


I think this definition is very powerful. It gets right to the core of what makes natural user interfaces so natural in a way that does not restrict the definition to a particular input technology or interaction pattern. It also supports the points of view presented by Richard and on Wikipedia, but in a more general way.

By talking about directly interacting with content, we establish that content interaction should be primary and artificial interface elements should be secondary and used only when necessary. This is an easier way to say the interface is invisible. 

By framing the definition around natural human behaviors, we can talk about reusable patterns of behavior derived from human-human and human-environment interaction without implying we should model the interface after specific interactions. We can apply natural behaviors by reusing existing skills, which is what Bill Buxton was talking about. In the chapter, I spend a lot of time discussing these skills and how to apply them.

If you would like to read more on this, the entire chapter 1 is available for free download from Manning, where you can also pre-order the MEAP and read chapters as I write them.