I tried out a few different applications built for Intel’s Perceptual Computing Beta. The sensor doing the perceiving was a Kinect-like camera, built by Creative for the project, sitting on top of a computer. Using its stereo cameras, the demo was able to perceive me as an object in 3D space and remove me from the scene. The Creative camera also has a built-in mic for voice recognition.
Now, voice recognition and body tracking aren’t new. What’s impressive is the technology Intel has built for processing huge amounts of information locally, compared to the very simplistic eyes of Kinect and the voice interpretation brain in your car. And, of course, the dope processors Intel built to churn through all that data in real time. It’s so good that last night Intel announced the camera could track eyes. Holy crap. The SDK is in beta, but Intel reps told us the company wants to launch for real this year.