Tuesday, April 6, 2010

Why won't WPF controls work with touch manipulations?

I recently tweeted asking for some scenarios using WPF Touch, and I got an email from Josh Santangelo (@endquote) with a few interesting touch challenges. I created some sample code and sent him back solutions to his challenges, but I figured they would be useful for the community as well.

In this blog, I'll answer the question: "Why won't WPF controls work with manipulations?" (This is my own phrasing, not Josh Santangelo's.)

Problem: 
You have a container with manipulations enabled (something like a ScatterView), and this container holds some standard WPF controls such as buttons or checkboxes. You can use touch to manipulate the container, and you can use the mouse to click the controls, but touch mysteriously doesn't affect the controls. (The mouse also can't manipulate the container, but that is another post.)

Reason:
When you touch the screen, WPF generates touch events, such as TouchDown, TouchMove, and TouchUp. These are all routed events, which means that the PreviewTouchDown event is fired first on the root of the visual tree, then on the next element down the hierarchy, then the next, all the way down to the source element the touch occurred over, as long as the event is not handled along the way.

Once it reaches the source element, the TouchDown event is fired, starting from the source element and proceeding up the visual tree to the root. If at any point a visual element sets e.Handled = true, the event propagation stops.

On the other hand, if the event propagation reaches all the way up to the root unhandled, then the touch event is promoted to a mouse event. At this point, PreviewMouseDown and MouseDown are fired down and up the visual tree.
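If you want to watch this flow yourself, here's a quick diagnostic sketch (mine, not part of the sample code): hook the four events at the window root with handledEventsToo set to true, and the order shows up in the Output window.

    // Diagnostic sketch: log the routed event order at the window root.
    // Requires: using System.Diagnostics; using System.Windows.Input;
    public MainWindow()
    {
        InitializeComponent();

        AddHandler(PreviewTouchDownEvent,
            new EventHandler<TouchEventArgs>((s, args) => Debug.WriteLine("PreviewTouchDown")), true);
        AddHandler(TouchDownEvent,
            new EventHandler<TouchEventArgs>((s, args) => Debug.WriteLine("TouchDown")), true);
        AddHandler(PreviewMouseDownEvent,
            new MouseButtonEventHandler((s, args) => Debug.WriteLine("PreviewMouseDown")), true);
        AddHandler(MouseDownEvent,
            new MouseButtonEventHandler((s, args) => Debug.WriteLine("MouseDown")), true);
    }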

In our case, when you use touch and there is no manipulation, the touch events go unhandled, so the mouse events are fired; the button handles those and raises your Click event handler. WPF controls such as Button listen only for mouse events (the exception is ScrollViewer when its PanningMode is set; see the sketch after figure 1). This process is illustrated in figure 1.

Figure 1. The touch event flow with no manipulations. The touch events go unhandled, so WPF promotes them to their mouse equivalents. The button is listening for mouse events, handles them, and then raises your Click event handler.
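As an aside, the ScrollViewer exception is opt-in. A minimal sketch (the content inside is just filler):

    <ScrollViewer PanningMode="Both">
        <!-- With PanningMode set, ScrollViewer handles the touch events
             itself and pans, instead of waiting for mouse promotion. -->
        <StackPanel>
            <TextBlock Text="Item 1" />
            <TextBlock Text="Item 2" />
        </StackPanel>
    </ScrollViewer>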

When you do have manipulations enabled in the visual tree, something different happens, as illustrated in figure 2.

Figure 2. When the Border has IsManipulationEnabled = true, the manipulation processor handles the TouchDown event and captures the touch device. All subsequent touch events go directly to the Border, and the manipulation processor handles the rest.

In the case where the Border has a manipulation, the touch events are never promoted to mouse events, so the button has no idea what is going on. Button didn't get the memo.
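To make the scenario concrete, here's a minimal sketch of the kind of markup that gets you into this state (the handler names are mine, not from Josh's sample):

    <!-- The Border owns the manipulation, so its manipulation processor
         captures every touch before mouse promotion can happen. -->
    <Border IsManipulationEnabled="True"
            ManipulationDelta="border_ManipulationDelta"
            Background="Transparent">
        <Button Content="Unreachable by touch"
                Click="button_Click" />
    </Border>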

Solution:
Of course, it doesn't have to be this way. Sometime soon, the Surface Toolkit for Windows Touch will be released, and you'll be able to use SurfaceButton and the other controls that are designed for touch and handle the touch events themselves. Figure 3 shows a list of the WPF controls the Surface Toolkit optimizes for touch.

Figure 3. Surface Toolkit for Windows Touch offers touch-optimized controls that can replace most of the common WPF controls. This is only a partial list of what the Surface Toolkit offers.
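Swapping them in should be close to a drop-in replacement. A hypothetical sketch (TouchDemo.MainWindow is a made-up class name, and I'm assuming the toolkit keeps the XML namespace the Surface SDK uses):

    <Window x:Class="TouchDemo.MainWindow"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            xmlns:s="http://schemas.microsoft.com/surface/2008">
        <!-- SurfaceButton listens for the touch events itself,
             so no manual capture code is needed. -->
        <s:SurfaceButton Content="Touch me" Click="button_Click" />
    </Window>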


But that doesn't help you now. Suppose you have this XAML inside of a container with manipulations:

    <TextBox Name="txtCounter"
             Text="0"
             Margin="10"
             HorizontalAlignment="Center" />
    <Button Content="Native Button Won't work"
            Margin="10"
            Height="40"
            Click="button_Click" />

and this code in the code behind:

    private void button_Click(object sender, RoutedEventArgs e)
    {
        IncrementCounter();
    }

    private void IncrementCounter()
    {
        int number = int.Parse(txtCounter.Text) + 1;
        txtCounter.Text = number.ToString();
    }

This button will not work and the event flow will look like figure 2. Instead, update the button XAML to this:

    <Button Content="Will work with TouchDown/Up"
            Margin="10"
            Height="40"
            Click="button_Click"
            TouchDown="button_TouchDown"
            TouchUp="button_TouchUp" />

and add these methods in the code behind:


   1:  private void button_TouchDown(object sender, TouchEventArgs e)
   2:  {
   3:      FrameworkElement button = sender as FrameworkElement;
   4:      if (button == null)
   5:          return;
   6:              
   7:      button.CaptureTouch(e.TouchDevice);
   8:   
   9:      e.Handled = true;
  10:  }
  11:   
  12:  private void button_TouchUp(object sender, TouchEventArgs e)
  13:  {
  14:      FrameworkElement button = sender as FrameworkElement;
  15:      if (button == null)
  16:          return;
  17:   
  18:      TouchPoint tp = e.GetTouchPoint(button);
  19:      Rect bounds = new Rect(new Point(0, 0), button.RenderSize);
  20:      if (bounds.Contains(tp.Position))
  21:      {
  22:          IncrementCounter();
  23:      }
  24:              
  25:      button.ReleaseTouchCapture(e.TouchDevice);
  26:   
  27:      e.Handled = true;
  28:  }

Now your button will work inside of the container with both mouse and touch. A little explanation:
  • Line 7: We capture the touch so that TouchUp and other touch events will be sent to this button, even if they occur somewhere else on the screen.
  • Line 9: We must mark this event as handled; otherwise the TouchDown event will continue to bubble up, and the Border's manipulation processor will steal the capture, depriving the button of a TouchUp event.
  • Lines 18-20: We check to make sure the TouchPoint is still within the bounds of the button. The user could have touched the button, but changed his or her mind and moved outside of the button to release the touch. This is consistent with the mouse behavior of buttons.
  • Line 22: We just call the same function that the button_Click() event handler called to get the same effect.
  • Lines 25-27: We release the capture and handle this event as well. Technically we might get away without these lines in this scenario, but in a more complicated scenario skipping them could cause unintended side effects, so we do it anyway.

You can apply this technique to the other non-touch-aware WPF controls if necessary, although it may get a little tedious. I almost wrote a Blend behavior for this (behaviors are part of the Blend 3 SDK and usable in more than just Blend), but figured that the Surface Toolkit will be out soon enough anyway.* If you want to roll your own in the meantime, see the sketch after the footnote.

(* No I don't have a date for the release of Surface Toolkit. Sorry!)
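For the curious, here's roughly what that behavior could have looked like: a minimal sketch built on Behavior<T> from the Blend 3 SDK's System.Windows.Interactivity assembly. The class name TouchClickBehavior is hypothetical; it just packages the same capture/release technique from above for reuse on any ButtonBase.

    using System.Windows;
    using System.Windows.Controls.Primitives;
    using System.Windows.Input;
    using System.Windows.Interactivity;

    // Hypothetical behavior (not from the sample download): makes a
    // ButtonBase clickable by touch inside a manipulation container.
    public class TouchClickBehavior : Behavior<ButtonBase>
    {
        protected override void OnAttached()
        {
            base.OnAttached();
            AssociatedObject.TouchDown += OnTouchDown;
            AssociatedObject.TouchUp += OnTouchUp;
        }

        protected override void OnDetaching()
        {
            AssociatedObject.TouchDown -= OnTouchDown;
            AssociatedObject.TouchUp -= OnTouchUp;
            base.OnDetaching();
        }

        private void OnTouchDown(object sender, TouchEventArgs e)
        {
            // Same trick as the code-behind: capture the touch and stop
            // it from bubbling up to the manipulation container.
            AssociatedObject.CaptureTouch(e.TouchDevice);
            e.Handled = true;
        }

        private void OnTouchUp(object sender, TouchEventArgs e)
        {
            TouchPoint tp = e.GetTouchPoint(AssociatedObject);
            Rect bounds = new Rect(new Point(0, 0), AssociatedObject.RenderSize);
            if (bounds.Contains(tp.Position))
            {
                // Raise the button's own Click event so existing
                // Click handlers keep working unchanged.
                AssociatedObject.RaiseEvent(new RoutedEventArgs(ButtonBase.ClickEvent));
            }

            AssociatedObject.ReleaseTouchCapture(e.TouchDevice);
            e.Handled = true;
        }
    }

You would attach it to a button with the usual <i:Interaction.Behaviors> syntax in XAML.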

You can download the source code for this project below:

Thanks to Ryan Lee for Gesturecons, where I got the hand icon used in the figures.

16 comments:

  1. Great explanation Josh!

    Just wanted to comment that the Microsoft Surface Toolkit for Windows Touch (Beta) was just released last week.

    Take a look at
    http://blogs.msdn.com/surface/archive/2010/04/21/reminder-surface-toolkit-for-windows-touch-available.aspx

    for more information about the toolkit.

    Also, we have a channel 9 video at
    http://channel9.msdn.com/posts/LarryLarsen/Surface-Toolkit-for-Windows-Touch/

    The toolkit helps with scenarios like the one you describe above and much more.

    Best of luck!

    -Luis Cabrera

  2. Thanks Luis. I'm very excited about the Surface Toolkit.

  3. Great writing, Josh. I can't wait to read the book! (Awesome diagrams too!)

  4. Hi Josh! I'm pretty sure you know about these but I thought I'd share them anyways.

    Touch Behaviors for WPF 3.5 SP1 and Silverlight 3.

    http://touch.codeplex.com/

    Cheers,

    Tanagram

  5. Thanks Tanagram. Those behaviors fill a gap for the WPF 3.5 and Silverlight scenarios, in particular. Fortunately WPF 4 + Surface Toolkit = Win. Too bad Surface Toolkit isn't available for Silverlight!

  6. Special thanks for the detailed explanation of the problem, and the practical solution :) Cheers!

  7. There is another (and probably better) solution to this problem:
    using the Cancel() method on the ManipulationCompletedEventArgs.
    This cancels the manipulation and sends the finger inputs on as mouse events.

    private void ManipulationCompleted(object sender, ManipulationCompletedEventArgs e)
    {
        if (e.TotalManipulation.Translation.X == 0 && e.TotalManipulation.Translation.Y == 0)
        {
            e.Cancel();
        }
    }

    1. This solves my issue... thanks for the reply, flip

    2. Thank you very much! This helped me!

  8. flip,

    That might work, but only if the manipulation has not moved at all (which would be rare if you are touching it) and only if the TouchDevice supports mouse promotion, which some but not all TouchDevices do. It also doesn't account for a mouse-driven control such as a slider, which you are supposed to drag, when it is inside of a manipulatable container. Regardless, the solution now is to use the Surface SDK 2.0 controls.

  9. Josh,
    what can I do if I have a ListBox inside a container that handles manipulation events and I want to select an item? Is there any chance to get the ListBox's SelectionChanged events? Would SurfaceListBox be the best option?
    Anyway, thanks for the well-explained topic.

  10. Roberto: ListBox has a lot of internal controls and interactions that you'd have to rewrite for touch, so yes, using SurfaceListBox would be your best option.

  11. Hi Josh,
    Your solution works for me perfectly, but it causes another problem: the button does not change its style to the "pressed" state.
    There is no visual indication that the button was pressed.
    Do you have any ideas how to solve this problem?

    Thanks

    1. Alexander, it sounds like you need to customize your button's style to either add a trigger for touch or set the visual state programmatically through the VisualStateManager.
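      For example, a minimal sketch of the programmatic route (this assumes your button's template actually defines a "Pressed" visual state; the default WPF theme templates rely on triggers instead, so a custom template may be required):

      private void button_TouchDown(object sender, TouchEventArgs e)
      {
          Button button = (Button)sender;
          button.CaptureTouch(e.TouchDevice);

          // Assumption: the template defines a "Pressed" VisualState.
          // The mouse-driven IsPressed trigger never fires for raw touch,
          // so we push the state manually.
          VisualStateManager.GoToState(button, "Pressed", true);

          e.Handled = true;
      }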

  12. Isn't there a risk that the button will respond to both the click event and the touch event and increment the counter twice?

  13. How do I tell what the value of IsManipulationEnabled is?

