The human-computer experience is a combination of hardware and software working together.
At Autodesk Labs, we have a Perceptive Pixel Multi-touch Wall created by Jeff Han. Senior Systems Software Developer Hans Kellner is one of our multi-touch developers; he works for the Geospatial group.
Using a development copy of Mudbox that he altered to interface with the Perceptive Pixel device, Hans created a brief video that demonstrates some of his work with Mudbox:
- YouTube: http://www.youtube.com/watch?v=wh1Qy6OvI1A
- non-YouTube: http://labs.blogs.com/video/Hans_Kellner_Mudbox.wmv
Mudbox is Autodesk's digital sculpting application for film, games, television, and design projects. What's interesting about this example is how the fingers are used.
- A single finger gesture is used for pan.
- Two fingers are used to pinch for zooming.
- Three fingers are used for orbit.
- Later in the video you can see Hans editing the model using one finger.
The application recognizes what a single finger means based on context: it can tell whether Hans is touching the model (edit) or somewhere that is not part of the model (pan). This is a great example of context-sensitive input in the multi-touch arena. We're all familiar with context-sensitive input for the mouse; for example, Windows "grays out" menu items that are not appropriate based on what has been selected. Here we have an application taking the appropriate action based on what the user selected with his finger.
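The dispatch logic described above can be sketched in a few lines. This is a minimal illustration, not Hans's actual code: the touch count selects the tool, and a single touch is further disambiguated by hit-testing the model. The Model class and its contains() hit test are hypothetical stand-ins for whatever scene query Mudbox would actually perform.

```python
class Model:
    """Hypothetical scene model with a simple rectangular hit test."""
    def __init__(self, x0, y0, x1, y1):
        self.bounds = (x0, y0, x1, y1)

    def contains(self, x, y):
        x0, y0, x1, y1 = self.bounds
        return x0 <= x <= x1 and y0 <= y <= y1

def dispatch(model, touches):
    """Map a list of (x, y) touch points to an action name."""
    if len(touches) == 3:
        return "orbit"   # three fingers orbit the camera
    if len(touches) == 2:
        return "zoom"    # two-finger pinch zooms
    if len(touches) == 1:
        x, y = touches[0]
        # Context-sensitive single touch: on the model it edits,
        # off the model it pans the view.
        return "edit" if model.contains(x, y) else "pan"
    return "idle"        # no touches, nothing to do
```

For example, with a model occupying (0, 0)-(10, 10), a single touch at (5, 5) would dispatch to "edit" while one at (20, 20) would dispatch to "pan".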
Exploring the world of human-computer interaction is alive in the lab.