Typing on a touchscreen is not one of life’s pleasures: the one-size-fits-all nature of most virtual keyboards is a hassle that puts many of us off using them. I’ve lost count of the number of times I’ve seen journalists put down an iPad, for instance, and pick up a laptop or netbook to do some serious notetaking or writing.
IBM, however, says it doesn’t have to be that way. In a recently filed US patent application, three IBM engineers posit the notion of a virtual keyboard in which the position of the keys and the overall layout are entirely set by the user’s finger anatomy [PDF]. That way, they argue, people will be better able to type at speed, with all keys within comfortable reach, and so end up with fewer errors.
After an initial calibration stage, in which the keyboard asks users to undertake a series of exercises to set response time, anatomical algorithms get to work, sensing through the touchscreen the finger skin touch area, finger size and finger position for the logged-in user.
As this information is gathered – IBM does not say over what period this learning takes place – the virtual key buttons are automatically resized, reshaped and repositioned in response.
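The patent application doesn’t spell out the adaptation algorithm, but the behaviour described – keys gradually resized and repositioned as touch data accumulates – could be sketched as a per-key running average. The class, field names and smoothing factor below are illustrative assumptions, not IBM’s actual method:

```python
# Hypothetical sketch of the adaptation loop: each key keeps a running
# average of where (and how broadly) the user actually touches it, and
# its centre and target size drift toward that observed behaviour.
from dataclasses import dataclass


@dataclass
class Key:
    label: str
    x: float       # key centre, screen coordinates
    y: float
    radius: float  # effective touch-target size


ALPHA = 0.1  # smoothing factor: how quickly the layout adapts


def record_touch(key: Key, touch_x: float, touch_y: float,
                 contact_area: float) -> None:
    """Nudge the key toward the observed touch point and scale its
    target to the fingertip contact area (exponential moving average)."""
    key.x += ALPHA * (touch_x - key.x)
    key.y += ALPHA * (touch_y - key.y)
    # A larger fingertip contact area earns a larger key target.
    key.radius += ALPHA * (contact_area ** 0.5 - key.radius)
```

After enough touches, a key the user consistently strikes off-centre would migrate toward their natural finger path, which is the effect the engineers describe.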
The patent shows a keyboard with some keys subtly higher than others, and some fatter than others. This “adapts the keyboard to the user’s unique typing motion paths” governed by their different physical finger anatomies, says IBM, which suggests the idea could be used in both touchscreen and projected “surface computing” displays.
There does seem to be scope for such ideas. In a review of the Apple iPad, review website MacInTouch said: “A touch typist found it frustratingly glitchy versus a real keyboard, producing all sorts of ghost characters when the screen repeatedly misinterpreted his fingers’ intentions.”
Perhaps anatomical profiling is just what’s needed.