
Kinect Has Gotten Super Good At Reading Sign Language

Reading sign language has always been in the cards for Kinect, ever since the capability showed up as a feature in some of the early patents. A while ago, it could only manage a couple of the most sweeping, exaggerated arm-based gestures. But since then, it’s gotten good. Like, really, really good.

A joint project between Microsoft Research Asia and the Institute of Computing Technology at the Chinese Academy of Sciences has allowed the black box to read just about every gesture in American Sign Language and translate it into plain English or another language. And it doesn’t have to be just one sign at a time, either; the software developed by the team lets Kinect parse whole sentences as one gesture flows into the next at natural speed. It’s pretty impressive.

The ability to sign at a TV would definitely open up some options for deaf tech-fans, but it also shows just how far the Kinect has come. And with the fidelity of the new Kinect being as high as it is, chances are we’re in for even more awesome stuff in the future. So long as you don’t mind that the robot eye in your living room has damn good vision. [Inside Microsoft Research via Polygon]

