Is pinch-to-zoom just too tough to remember? What about tap to highlight? Well, fret not, dear user. A solution for you, the frazzled, confused technological rube who just misses the good old days of dry-erase boards and magnifying glasses, has arrived.
The concept is called Touch Tools, and it's best described as skeuomorphism applied to interaction design. The group of engineers at Carnegie Mellon behind the idea has taken a collection of things you'd do to paper in the real, physical world: clip it, photograph it, magnify it, measure it, and so on. They've then applied the same hand gestures to an iPad interface, letting you "measure," for example, by stretching your imaginary tape measure across the screen:
Why introduce a new, complicated set of gestures into the mix? Here's how the authors — CMU's Chris Harrison, Robert Xiao, Julia Schwarz, and Scott E. Hudson — describe their paper:
The average person can skillfully manipulate a plethora of tools, from hammers to tweezers. However, despite this remarkable dexterity, gestures on today's touch devices are simplistic, relying primarily on the chording of fingers: one-finger pan, two-finger pinch, four-finger swipe and similar. We propose that touch gesture design be inspired by the manipulation of physical tools from the real world.
Over on Co.Design, Mark Wilson wonders whether it's as clever as it seems, saying, "I'm not so convinced." He has a point: Not only is the iPad already pretty damn intuitive, but adding another layer of real-world metaphors to "simplify" an already simple interface seems almost condescendingly out of touch with how canny the average tablet or phone user already is.
Nonetheless, the team at CMU has touched on an interesting idea, one made doubly interesting by the fact that the constantly rotating wheel of popular opinion currently places skeuomorphism at its absolute nadir. Leather-stitched interfaces and textured app icons may hold the same cultural relevancy as Gangnam Style right now, but is there still value in skeuomorphism as a broader idea in UX design?
I'm willing to bet there is. The authors aren't exactly arguing that we should model all future digital interactions after real-world analogs. They're saying that the library of interactions we currently use is quite thin, and that by looking at the world around us, and the "natural modality" of our 10 fingers, designers might find unexpectedly smart solutions to digital problems. [Engadget via Co.Design]