Pen + Touch Interface

Touch-screen interfaces are the gadget-design trend du jour, but that doesn't mean they do everything elegantly. The finger is simply too blunt for many tasks. Microsoft Research's "Manual Deskterity" attempts to combine the strengths of touch interaction with the precision of a pen.

"Everything, including touch, is best for something and worse for something else," says Ken Hinckley, a research scientist at Microsoft who is involved with the project, which will be presented this week at the ACM Conference on Human Factors in Computing Systems (CHI). The prototype in the video above for Manual Deskterity is a drafting application built for the Microsoft Surface, a tabletop touchscreen. Users can perform typical touch actions, such as zooming in and out and manipulating images, but they can also use a pen to draw or annotate those images.
The interface's most interesting features emerge when the two types of interaction are combined. For example, a user can copy an object by holding it with one hand and dragging the pen across the image, "peeling" off a new copy that can be placed elsewhere on the screen. By combining pen and hand, users get access to features such as an X-Acto knife, a rubber stamp, and brush painting.
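The "hold with one hand, act with the pen" division of labor can be pictured as a small input dispatcher: whatever object a finger is holding becomes the context for the next pen stroke. The TypeScript sketch below is a hypothetical illustration of that routing logic, not the project's actual code; the SceneObject type, the PenTouchDispatcher class, and all method names are assumptions made for the example.

```typescript
// Hypothetical stand-ins for a scene object and the two input events.
interface SceneObject { id: string; }
interface TouchDown { target: SceneObject | null; }
interface PenStroke { path: Array<[number, number]>; }

// Routes pen strokes based on whether the other hand is holding an object.
class PenTouchDispatcher {
  private held: SceneObject | null = null;

  onTouchDown(e: TouchDown): void {
    this.held = e.target; // finger lands on an object: start "holding" it
  }

  onTouchUp(): void {
    this.held = null; // lifting the finger ends the hold
  }

  onPenStroke(stroke: PenStroke): void {
    if (this.held !== null) {
      // Pen + held object: a contextual command, e.g. dragging the pen
      // off the object's edge peels away a copy of it.
      this.applyGestureTo(this.held, stroke);
    } else {
      // Pen alone: ordinary inking on the canvas.
      this.drawInk(stroke);
    }
  }

  private applyGestureTo(obj: SceneObject, stroke: PenStroke): void {
    console.log(`pen gesture of ${stroke.path.length} points applied to ${obj.id}`);
  }

  private drawInk(stroke: PenStroke): void {
    console.log(`ink stroke with ${stroke.path.length} points`);
  }
}

// Usage: holding an object routes the next pen stroke to it.
const dispatcher = new PenTouchDispatcher();
dispatcher.onTouchDown({ target: { id: "photo-1" } });
dispatcher.onPenStroke({ path: [[0, 0], [12, 5]] }); // acts on photo-1
dispatcher.onTouchUp();
dispatcher.onPenStroke({ path: [[20, 20], [30, 25]] }); // plain ink
```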
 
What Was The Inspiration?
Hinckley says the researchers videotaped users working on visual projects with sketchbooks, scissors, glue, and other typical physical art supplies. They noticed that people tended to hold an image with one hand while making notes about it, or doing other work related to it, with the other. The researchers decided to incorporate this into their interface: touching an object onscreen with a free hand indicates that the actions performed with the pen relate to that object.
Hinckley acknowledges that the interface includes a lot of tricks that users need to learn. But he thinks this is true of most interfaces. "This idea that people just walk up with an expectation of how a [natural user interface] should work is a myth," he says.
Hinckley believes that natural user interfaces can ease the learning process by engaging muscle memory, rather than forcing users to memorize sequences of commands or the layout of menus. If the work is successful, Hinckley says, it will show how different sorts of input can be used in combination.
Hinckley also thinks it's a mistake to focus on devices that work with touch input alone. He says, "The question is not, 'How do I design for touch?' or 'How do I design for pen?' We should be asking, 'What is the correct division of labor in the interface for pen and touch interactions such that they complement one another?'"
 
What's Next?
The researchers plan to follow up by adapting their interface to work on mobile devices. 
JAS

Inventor, Technologist, Futurist.

http://www.evilrobot.com