Intel's Chief Wizard Conjures The Cloud, Apple And A Phone That Keeps Secrets

If anybody knows the future of computing, it might be Intel CTO and Labs chief Justin Rattner. So we had to ask him, “What’s next?” Well, for one, Intel Inside your phone.

Gizmodo: So what’s next? When we interviewed [former Intel CEO] Craig Barrett, he said Intel hires more software engineers than hardware engineers at this point.

Justin: Everybody wants to know what’s next. It’s not just one thing, you know. I’ll give you a few things. I think the Google TV announcement was significant for us because it’s really the first example of experience-driven architecture, experience-driven system design. Rather than take a bottom-up approach of speeds and feeds, I think following the Viiv experience we sort of took a step back and said, ‘Let’s define the next-generation television experience and design the silicon in response to that.’ As opposed to, well, there should be a PC in every living room and all we have to do is put a little eye candy around the box and everybody will say, ‘Oh yeah, it never occurred to me to do that.’

There actually is a fellow whose name is Brian David Johnson. He is the experience architect for the future-of-television effort. Brian just finished a book called Screen Future, which is basically Intel’s vision for the future of television. Brian and that whole TV experience team literally built the entire experience and then took it out to the industry, and people like Google and Sony and other folks then said, ‘That’s consistent with our vision.’ I think in a lot of cases our vision went way beyond where their vision was! But we actually did a complete UI. There is a service provider involved; they’re not like Google or Sony. They don’t have engineers creating UIs, so they were very interested in taking the UI we had developed as part of the total experience architecture. They’ll actually bring that to market.

Gizmodo: Not too long ago, the future was a PC on every desktop. But now it’s computers embedded in everything. Where does Intel fit into that? Where’s the future of Intel in that kind of computing?

Justin: One of the big things in the lab is what we call everyday sensing and perception. And we think sensing and perception technology is at the point now where it can actually perform at an acceptable level in the real world. It’s one thing to be able to do something in laboratory conditions, right? But if you want to do everyday object recognition, you have to be able to do it in everyday settings, right? You have to be able to come into a Starbucks and say, ‘OK, they’ve got a couple of drinks here, and they have sunglasses on the table, blah blah blah.’ And being able to do something like that is the foundation for a dramatically different user experience.

One of the things we discovered when we put a gyro into a TV remote was that your hand motions, the minute hand motions of your hand, are like a fingerprint in terms of their ability to resolve an individual. So just those motions in three axes are all you need to basically say, ‘Oh, it’s Matt!’ I know who’s picked up the remote. As soon as you do that, you can personalise the viewing experience, because it’s ‘Oh, Matt’s picked up the remote, here’s your preference, here’s your set of program recommendations. By the way, I also know the calendar says you have a couple of hours’ break, so hey, there’s time to watch a movie or watch a sporting event.’ Or, ‘Gee, you’ve only got 15 minutes, or it’s late at night and you really need to get to bed, so I’m only gonna recommend this kind of programming.’
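
To make the gyro idea concrete, here is a minimal sketch of that kind of user identification: simple per-axis statistics over a window of 3-axis gyro samples, classified by nearest centroid. Everything here is assumed for illustration (the features, the classifier, the names); Intel has not published how its prototype actually works.

```python
# Hypothetical sketch: telling users apart from the micro-motions of a
# hand holding a remote, using 3-axis gyro windows. Assumed features
# and classifier; not Intel's actual method.
import numpy as np

def gyro_features(window):
    """window: (n_samples, 3) array of gyro readings on the x, y, z axes."""
    return np.concatenate([
        window.mean(axis=0),                           # average rotation rate per axis
        window.std(axis=0),                            # tremor magnitude per axis
        np.abs(np.diff(window, axis=0)).mean(axis=0),  # jerkiness per axis
    ])

class RemoteIdentifier:
    def __init__(self):
        self.centroids = {}  # user name -> mean feature vector

    def enroll(self, user, windows):
        feats = np.array([gyro_features(w) for w in windows])
        self.centroids[user] = feats.mean(axis=0)

    def identify(self, window):
        f = gyro_features(window)
        return min(self.centroids,
                   key=lambda u: np.linalg.norm(f - self.centroids[u]))
```

Once `identify()` returns ‘Matt’, everything Rattner describes downstream – preferences, recommendations, calendar-aware suggestions – is ordinary personalisation keyed off that one label.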

Gizmodo: In terms of what people are experiencing, you look at phones, right? Increasingly they’re shrinking down to where they’re just a screen. And then you have software and that’s what people are interacting with more so than hardware.

Justin: Some people think that’s all gonna happen in the cloud, so really all you need is the camera and the imager and enough bandwidth to send the image up to the cloud. Our research says no, you’re gonna need to do a lot of computing here on the device, because if everybody was walking around doing that, there just aren’t enough servers on the planet to handle all of these real-time video feeds. If you look at Google Goggles, it takes a picture, then compresses the hell out of the picture and sends that upstream. It’s fine; it’s clever in the sense that by forcing you to take a picture before you get recognition, it’s helping manage the latency. Sort of like, ‘Well, I just took a picture; that takes a few seconds.’ But really you just want to be able to hold up your camera. That’s my wish: this little camera in my lapel and a little hidden earpiece that says, ‘It’s Matt.’
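
His server-capacity point survives a rough back-of-envelope check. With assumed numbers (a billion camera phones, each streaming a modestly compressed feed for just one per cent of the day), the aggregate ingest alone is enormous, before any recognition compute is even counted:

```python
# Back-of-envelope with assumed figures, not Intel's: aggregate ingest
# if phones streamed live video to the cloud for recognition.
phones = 1e9          # assumed: one billion camera phones
duty_cycle = 0.01     # assumed: each streams 1% of the day on average
stream_mbps = 2.0     # assumed: one compressed video feed, in Mb/s

concurrent_feeds = phones * duty_cycle                # 10 million live feeds
aggregate_gbps = concurrent_feeds * stream_mbps / 1e3
print(f"{concurrent_feeds:,.0f} feeds ≈ {aggregate_gbps:,.0f} Gb/s of ingest")
# -> 10,000,000 feeds ≈ 20,000 Gb/s, and every feed still needs
#    server-side vision processing on top of the bandwidth.
```

Shrinking the problem on-device – either by recognising locally or by sending one compressed still, as Goggles does – is what keeps the numbers sane.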

To come back to the original question: the next big thing across all systems is really context awareness. Dieter Fox, who runs our Seattle lab, made this comment to me when I was interviewing him, oh gosh, over a year ago. He said, ‘You know, my iPhone doesn’t know any more about me than it did the day I bought it.’

The next thing is where all of these devices – and I’m not limiting it to Intel devices, I mean your car, all these things – making use of hard and soft sensors, are going to be much more aware of, well, not just the current context, but the context of your life. Who’s in your social network? Are you going to work, or at work, or coming home, or going to the airport? That awareness of your everyday life is going to reach that long-imagined goal of personal assistance. You get into your car and your calendar says, ‘Hey, it’s a weekday and you’re driving to work,’ and it tells you ‘traffic jam here’ – you don’t have to pull that information. So, I think anticipating your needs and your actions and all that, that’s… Oftentimes people will say, ‘Gee, what’s beyond the iPhone?’ The iPhone 4 is certainly a nice-looking device, but I was really hoping it would take that next jump.

Gizmodo: How does Intel fit into the cloud thing that seems to be happening now, and how big of a factor is it?

Justin: Most of the cloud’s running on Intel. From the research side, what we’re exploring in terms of cloud architecture is creating platforms that are really engineered and optimised for what we call IP data centres or cloud data centres. What we have today is largely server architecture that was driven by classic enterprise computing needs.

On the research side, we built the Single-chip Cloud Computer, which was really intended as an experiment. You know, what if you had more, simpler processors? You could have more of them on a chip. They’d be more energy efficient. How much computing can you get done in a single socket? As soon as you start talking multiple sockets, the costs just go screaming up to efficiently connect the sockets. The cloud data centre of the future is probably based on single-socket systems with high core counts – high-core-count processors in single-socket designs.

The other thing we’re working on is taking our Light Peak technology and bringing it into the data centre. We’ve been working with the extreme computing group at Microsoft to build some experimental designs that use Light Peak infrastructure as a fabric for the cloud data centre – to really try to take the bandwidth way up, take the latency way down, and dramatically simplify the switching architecture. Because Ethernet is fine in a classic enterprise data centre, but when you have 100,000 servers or half a million servers, you’re probably burning more power and you probably have more money in the network than you actually have in the servers themselves. I don’t think we’ve really seen even the first generation of cloud-optimised silicon at a production level. That’s still a ways out.

Gizmodo: Why is Intel not really in mobile phones yet?

Justin: We definitely have phones and tablets in our sights. We expect to compete for all of those sockets in the coming years. And that’s true whether you’re talking about phones or cars or televisions.

I think we have to establish credibility. I don’t know whether it’s going to be a phone or a TV or what. Even beyond delivering the silicon, somebody has to take that silicon and put it in a product that sells in the tens or hundreds of millions before you’re really a credible supplier.

You’re right, we’ve not really focused on this space since we sold off the StrongARM family to Marvell. But there were StrongARM processors in all the BlackBerrys, so we’d been there. When we made the decision to focus entirely on Intel architecture, it took a while to create Atom.

We knew we could do it on the research side. We had done experimental designs; we were quite confident we could do it. There was a lot of concern that what became Atom would cannibalise the laptop business, which didn’t materialise – instead we created a whole new category around netbooks! But that was the big fear, and it cost us probably a good two, maybe even three years before everybody was convinced we could introduce a cheaper, slower, more energy-efficient product and not damage the main revenue.

Gizmodo: The most exciting things, to me, are phones, because the space is moving so quickly. You might hold on to a laptop for four years. Phones are outdated in two years. They seem to be moving twice as fast.

Justin: I have high confidence that… it’s cheap to say it, but the next fundamental advance in phones is actually going to happen on an Intel-based phone. You can kinda see bits and pieces around what I was talking about in terms of sensing and perception and context awareness and augmented reality. We have all those things running in a lab on Moorestown, and we’re gonna have Medfield silicon here pretty quick.

We actually have phones; we actually built the entire phone. You’ll want to dump that [gesturing to my Nexus One] in the East River as soon as you see one of these things. They represent that much of an advance. I think even the announcement of the iPhone 4 was sort of like iPhone 1.2. If the original was 1.0 and the 3G was 1.1, this is 1.2! I think people were at least hoping for a real ‘wow’ kind of advancement, and it wasn’t that.

Gizmodo: I think it’s interesting that Google is pushing Apple to be more cloud-centric, and Apple is pushing Google to be more polished.

Justin: I wish Apple had put more pressure on them by doing something that was just… Some real wow capability in the hardware that would have sort of had Google going, ‘Oh, wow.’

There isn’t enough conversation around security. From a security point of view, there are things you can only do locally; you can’t depend on the cloud for device security. If I still want to have access to my data locally on my phone, even when I’m disconnected, that means any secrets I’m trying to keep on that phone are at risk. And I think people are now, and will increasingly be, uncomfortable keeping their secrets in the cloud. So giving all of our platforms the ability to keep secrets is one of our biggest architectural challenges and one of the toughest research challenges, because these are basically open platforms.

I think, as silicon developers certainly, we have a requirement to provide truly secure capability when it’s needed. I mean, we could obviously lock the thing down and say, no, you just can’t do anything. But nobody’s gonna buy a phone that you can’t buy applications for.

So what’s next, I think, is making it possible to protect those systems from the various forms of malware and then, in addition, giving them the ability to keep their secrets no matter what happens. I mean, literally, short of coming down on the silicon with an electron-beam probe, this thing will not give up a secret, even if the operating system is compromised or anything else. If the hardware is still correctly functioning, you can’t pull a secret out of it. That’s where we gotta go.
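
The property he is describing – hardware that will perform operations with a secret but never disclose it – can be sketched in miniature. The sketch below is purely conceptual (Python cannot actually enforce a hardware boundary, and the class and method names are invented); real parts rely on tamper-resistant silicon, not language features:

```python
# Conceptual sketch of hardware-sealed secrets: the key lives inside
# the "silicon" object, callers can ask it to seal and unseal data,
# but no operation in the interface ever returns the key itself.
# Illustrative only; a hypothetical API, not a real secure-element
# interface.
from cryptography.fernet import Fernet

class SecureElement:
    def __init__(self):
        self.__key = Fernet.generate_key()  # generated inside, never exported

    def seal(self, plaintext: bytes) -> bytes:
        return Fernet(self.__key).encrypt(plaintext)

    def unseal(self, token: bytes) -> bytes:
        return Fernet(self.__key).decrypt(token)

se = SecureElement()
blob = se.seal(b"contacts, credentials, keys")
print(se.unseal(blob))  # software can *use* the secret...
# ...but the interface offers no call that hands the key back, which
# is the guarantee Rattner wants the silicon to uphold even against a
# compromised operating system.
```

On a real phone the boundary would be enforced by dedicated hardware, so that, as Rattner puts it, nothing short of an electron-beam probe gets the key out.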

Thanks to Intel CTO Justin Rattner for talking to us!

