Modern touchscreens are impressive technology, but they remain limited in one key way: they only register where on the display they've been touched. Researchers at Carnegie Mellon University are hoping to expand their capabilities with a new system that can tell exactly what they've been touched with.
Developed by students Chris Harrison and Julia Schwartz, along with their professor Scott Hudson from the school's Human-Computer Interaction Institute, the TapSense system uses a microphone attached to the screen to discern exactly what has touched the display.
Rather than just registering taps, the system can distinguish between taps made with the tip of a finger, the pad, the fingernail and even the knuckle. So in addition to detecting gestures, software could "listen" for which part of the finger was used and act accordingly: knuckle taps could be reserved for bringing up context menus, while drawing apps could swap between brush types when the user switches from fingertip to fingernail.
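A rough idea of how such a classify-then-dispatch scheme might look in software: the sketch below stands in for the real acoustic classifier with a toy rule on a single spectral feature. All of the names, thresholds and action mappings here are illustrative assumptions, not details from the TapSense work itself.

```python
# Hypothetical TapSense-style dispatch. The real system classifies the
# acoustic signature of each tap; here that classifier is replaced by a
# toy threshold rule on one made-up feature (spectral centroid in Hz).

def classify_tap(spectral_centroid_hz: float) -> str:
    """Map a crude acoustic feature to a tap type (illustrative thresholds)."""
    if spectral_centroid_hz < 800:
        return "knuckle"      # dull, low-frequency impact
    if spectral_centroid_hz < 2000:
        return "pad"
    if spectral_centroid_hz < 4000:
        return "tip"
    return "fingernail"       # sharp, high-frequency click

# Example per-app bindings, as the article suggests: knuckles for context
# menus, fingertip vs. fingernail for different drawing tools.
ACTIONS = {
    "knuckle": "open context menu",
    "pad": "select",
    "tip": "draw with brush",
    "fingernail": "draw with pencil",
}

def handle_tap(spectral_centroid_hz: float) -> str:
    """Dispatch a classified tap to its bound action."""
    return ACTIONS[classify_tap(spectral_centroid_hz)]
```

In a real implementation the threshold rule would be replaced by a trained classifier operating on the microphone signal, but the dispatch table pattern, mapping each recognised tap type to an app-defined action, would look much the same.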
Unfortunately, at the moment TapSense relies on an external microphone, because the mics built into smartphones are optimised for picking up voices rather than the subtle sounds of a finger tap. The system could still be implemented on a smartphone with the addition of an extra mic, though. And beyond fingers and other parts of the body, TapSense could also differentiate between different types of stylus tips, allowing multiple users to work collaboratively on the same touchscreen display.