How Intel Will Build The Holodeck

Mooly Eden steps out of the world of transistors and microprocessors for a moment. “If you want a simple explanation of what we’re doing, just look to Asimov,” says the head of Intel’s Perceptual Computing push. “Or Star Trek, Star Wars and Avatar. The ideas have been in science fiction for years, and now they’re becoming fact.”

That’s as direct a line into Perceptual Computing as you’ll find, since the plans Intel has shown us so far have been fairly ambiguous. Right now we’re seeing the vanguard arrive, with features like Tobii’s eye tracking, a Kinect-like gesture and motion sensor, and human recognition and overlay.

But the long-term goals are much more ambitious. “Wearable is inevitable,” Eden says, referring to Google Glass-like tech, “and implantable is probably the next step.” Imagine an implantable computer that could monitor, or even help slow, Alzheimer’s symptoms. Or even brain-reading implants, which Eden mentioned, though he was careful to note he meant it only in a general sense.

It’s the kind of project you’d expect to be off on its own, like Lucius Fox’s R&D Department, operating well away from whatever real world business Wayne Enterprises or Intel are conducting. Instead, it’s at the heart of everything Intel is doing.

***

Intel’s Perceptual Computing initiative began in earnest about 18 months ago, with heavy corporate investment, but its roots are in the PC Client division, where Mooly Eden and his team had been picking at it for years before that. It revolves around a totally different way of thinking about how to build a microprocessor. “For years, I would look at a microprocessor and just see floating-point figures, or other specs,” Mooly says. “But I’d go home to my wife, and she’d say, ‘Well, what does that do for me?’ And I didn’t really have an answer.”

That’s when Eden, part of Intel’s PC Client Group for over 30 years, and its general manager since 2009, changed things up. Instead of focusing purely on drilling down on specs and metrics, Intel began tailoring its future goals for its microprocessors around the results from sociological studies about how people actually use their computers. But it also took cues from Mooly’s team, which anticipated things like the need for extreme power efficiency and smarter sleep states.

How is all that going to manifest as a future-perfect space toy for you any time soon? Beyond the performance and efficiency benefits to today’s gadgets as Eden marshals the weaponry he needs, there are already some ideas on the table. If you had an always-on camera, for instance (you didn’t think Google Glass was the only one with its eyes on that prize, did you?), one smart enough to know what your car looks like, you could ask it in plain language, “Where did I leave my car?” It would show you a screenshot or a few seconds of footage, saving you half an hour of wandering around the parking garage like a jackass.
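To make the idea concrete, here’s a minimal, purely illustrative sketch of how that car-finding query could be wired together. None of this is Intel’s actual implementation; detect_my_car() is a hypothetical stand-in for whatever on-device vision model an always-on camera might run, and the location hints are assumed to come from elsewhere.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional


@dataclass
class Sighting:
    timestamp: datetime
    frame: bytes          # the captured image
    location_hint: str    # e.g. "Level 3, near the lift"


last_car_sighting: Optional[Sighting] = None


def detect_my_car(frame: bytes) -> bool:
    # Placeholder: a real system would run an object detector trained on
    # the owner's car. Here it simply reports "not seen".
    return False


def on_new_frame(frame: bytes, location_hint: str) -> None:
    """Called for every frame the always-on camera captures."""
    global last_car_sighting
    if detect_my_car(frame):  # hypothetical vision model
        last_car_sighting = Sighting(datetime.now(), frame, location_hint)


def answer_query(query: str) -> str:
    """Handle the plain-language question by replaying the last sighting."""
    if "where" in query.lower() and "car" in query.lower():
        if last_car_sighting is None:
            return "I haven't seen your car today."
        return (f"Last seen at {last_car_sighting.timestamp:%H:%M}, "
                f"{last_car_sighting.location_hint}.")
    return "Sorry, I can't help with that yet."
```

The interesting part isn’t the plumbing, it’s the division of labour: the camera only has to recognise one very specific thing, and the “assistant” only has to remember the last time it saw it.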

But how we get from here to there is a process. Instead of just hunkering down for a few years and maybe, hopefully, possibly coming out the other side with something real people will be able to use someday (à la Microsoft Research), Intel’s doing the legwork on features that will be key to future uses. And then it’s implementing them now, like the mandatory touchscreens on next-gen PCs, or the advanced-but-still-getting-there motion sensing it showed at CES.

***

In the near future, a lot of those building blocks will involve this year’s Haswell processors, and next year’s Broadwell. On the surface, these seem like just the fourth generation of Intel’s Core series. But that’s a total mischaracterisation, according to Eden. “Haswell was built from the ground up with the intention of improving the experience for people,” he says. It implements power-saving methods like finer control of screen refresh, deeper sleep states, and power management for nearly everything on the motherboard.

Intel’s teams are still ploughing ahead with more traditional microprocessor tech. “We’re basically trying to defy gravity,” Mooly says of the new 22nm process used in the Haswell architecture. The transistors remain crucial. “If I’m trying to build the world’s greatest Lego tower,” he continues, “I need the best Lego bricks that we can possibly make.”

But while Intel talks a big game about these technologies, and its goals are absolutely pointed in the right direction, a second glance at its real-world efforts to this point gives you pause. For all the technical advancements and achievements — dedicated hardware accelerators, impressive new architectures, faster power-state shifts — it still has a problem: it’s got a long way to go in mobile before it’s on equal footing with the competition.

Clover Trail Atom processors haven’t made the splash Intel was hoping for, and the attention focused on Medfield last year seems to have shifted to Lexington, a lower-cost value platform. Basically, Intel is talking a big game about the future of computing while it’s still playing catch-up on the present of mobile.

Further, if Intel’s going to find traction with perceptual computing, it has to nail the interface, or face the same adoption resistance that Windows 8 is staring down now. Basically, people are happy to gawk at a tech demo, but if they’re going to use something day-to-day, it has to work perfectly.

***

The biggest obstacles to natural interfaces are the input and output. How we interact with computers today is inherently unnatural. Touch is a good start, but the real advancements will come from the aforementioned ideas like eye tracking, gestures and speech. Problem is, none of those are quite ready for day-to-day use yet.

Take natural language interpretation. It’s been a white whale for engineers for years, and it doesn’t really seem to be improving at the rate we expected. Siri is a dud, and anyone else making a similar attempt has scaled back expectations considerably. Even then, those services suffer from what Eden considers the fundamental problem with speech: it just flat out gets words wrong. Before you can work on algorithms or context, you’ve got to get that right first. Those building blocks again.

There’s something to that, of course. Back at Siri’s launch, thick accents were among its major, and often amusing, downfalls.

Mooly, who speaks with a Hebrew accent, was able to get it to recognise his “th- ” sounds after repeating a calibration word just five times. Now he can say “untethered” and have it recognised with almost 100 per cent accuracy. That’s how you build a voice recognition system. And piece by piece, that’s how Eden and the rest of Intel are trying to build the future, right here in front of us.
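As a toy illustration of that calibration idea — and only that; this is nothing like how Intel’s or anyone’s actual recogniser works — here’s a sketch in which a handful of repetitions of a calibration word teaches a system a speaker-specific correction. The word "untezzered" is a made-up stand-in for an accented reading of "untethered".

```python
from collections import defaultdict


class AccentCalibrator:
    """Learns per-speaker corrections from repeated calibration words."""

    def __init__(self):
        # Maps what the base recogniser heard -> what the user actually meant.
        self.corrections: dict = {}
        self._samples = defaultdict(int)

    def calibrate(self, heard: str, intended: str, min_repeats: int = 5) -> None:
        """Record one repetition of a calibration word."""
        self._samples[(heard, intended)] += 1
        if self._samples[(heard, intended)] >= min_repeats:
            self.corrections[heard] = intended

    def correct(self, transcript: str) -> str:
        """Apply learned corrections to a raw transcript, word by word."""
        return " ".join(self.corrections.get(w, w) for w in transcript.split())


# After five repetitions, the accented reading maps back to the intended word.
cal = AccentCalibrator()
for _ in range(5):
    cal.calibrate(heard="untezzered", intended="untethered")

print(cal.correct("the device is untezzered"))  # -> "the device is untethered"
```

It’s the same building-block logic Eden describes: get the words right for this particular speaker first, then worry about grammar and context on top.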

