Last night, I donned a Microsoft HoloLens for the second time. It was incredible. I could see objects made of light appear in the real world — and this time, I could freely walk around them without a tether. I even built my own holographic app. It felt so easy.
Now let me tell you why I’m still a bit sceptical.
In January, the first time I tried HoloLens, it was a big, bulky contraption with exposed circuitry everywhere, plus a separate processor unit you literally had to hang from your neck. It was tethered to the ceiling. Just about the prototype-iest prototype ever. Oh, and you had to walk into some very specific, very small rooms in the basement of a Microsoft building to see it in action, which raised a few questions about whether the demos were staged.
Well, all of that has definitely changed. HoloLens is now a slick, futuristic headset that doesn’t require a cord. It actually looks kind of like a consumer device, with a micro USB port and a headphone jack instead of cable soup. It already lasts up to four hours on a charge. At the Intercontinental Hotel in San Francisco, they just handed me one and let me put it on my head by myself. I walked around a giant room. I think it’s safe to say this demo wasn’t staged.
But what you might not understand — at least, not without trying HoloLens yourself — is how little the experience you see resembles Microsoft’s demo videos.
Here, try this for me real quick. Pick up your smartphone. Hold it about a foot in front of your face. Now imagine that the phone is a window into a parallel world. Through that window, you can see holograms that appear to exist in the real world — but all around that phone, you’re only seeing the world. In other words, you have to be looking directly at a digital object to see it, because HoloLens currently has a ridiculously tiny field of view. As soon as you turn your head a little bit, the holograms disappear.
And if those holograms are large enough, you’ll only be able to see a little piece of them, too. See this human body? From where she’s standing, she can probably see only his neck, shoulder, and jaw:
And he can’t actually appreciate this big screen on the wall:
Unlike the Oculus Rift and other virtual reality headsets, it’s just not an immersive experience. It’s a bit of a bummer.
Does that make the technology any less exciting? Perhaps a bit. It’s certainly worrying that the field of view hasn’t improved. (I also found the prototype pretty uncomfortable to wear, even though I really like the design of the folding, stretching band.) But it’s still so amazing to me that this works at all — that a portable device can convincingly place CG objects into the real world. And the current HoloLens does feel good enough for a developer kit, one that gives game and app developers a glimpse of what they’re building.
Love the flexible, sliding design, but I had to cinch the band uncomfortably tight on my prototype.
And to my complete and utter surprise, building an app for HoloLens — the first app I’ve ever built, mind you — was remarkably exciting.
OK, OK, so I didn’t actually write any code. Microsoft just sat us down with the Unity game engine, Visual Studio, and a whole bunch of premade 3D objects and scripts. All I had to do was check some boxes, drag and drop some objects, hit a few keys to compile, and give it a try on the headset itself.
But it wasn’t like Microsoft hid any complexity, either. I could look through every script in Unity to see exactly how they worked, and how few lines of code holographic apps will require. To turn a normal Unity game into a HoloLens game, for instance, all you’ve got to do is add an object called a holographic camera. You can add new voice commands with a single line of code — I decided that “pew pew pew” would cause a crunched-up virtual paper ball to drop onto a virtual pad of graphing paper, and “by the beard of Zeus” would return it to the sky.
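For flavor, here’s roughly what a voice command like that looks like in a Unity script. This is a sketch, not Microsoft’s actual sample code — it assumes Unity’s KeywordRecognizer from the UnityEngine.Windows.Speech namespace (which only works on Windows 10 devices like HoloLens), and the paperBall field and phrase handling are my own invention:

```csharp
using UnityEngine;
using UnityEngine.Windows.Speech; // Windows 10 / HoloLens only

public class VoiceCommands : MonoBehaviour
{
    public Rigidbody paperBall;          // the crunched-up paper ball
    private KeywordRecognizer recognizer;
    private Vector3 startPosition;

    void Start()
    {
        startPosition = paperBall.position;
        // One recognizer listens for both phrases and fires OnPhrase on a match.
        recognizer = new KeywordRecognizer(new[] { "pew pew pew", "by the beard of zeus" });
        recognizer.OnPhraseRecognized += OnPhrase;
        recognizer.Start();
    }

    void OnPhrase(PhraseRecognizedEventArgs args)
    {
        if (args.text == "pew pew pew")
        {
            paperBall.isKinematic = false;       // let gravity take it
        }
        else
        {
            paperBall.isKinematic = true;        // freeze it...
            paperBall.position = startPosition;  // ...back up in the sky
        }
    }
}
```

The point isn’t the specifics; it’s that the whole voice-command layer really is about one meaningful line — the recognizer and its keyword list — and everything else is ordinary Unity.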
But the most impressive part is how the HoloLens’s array of cameras can turn the real world into a video game canvas right before your eyes. With the flip of a switch in Unity, I was able to see the table and couch and objects around me as rough polygonal meshes, and place my virtual graphing pad on a real table. All of a sudden, the real and the virtual started to combine.
“Pew pew pew.” A paper ball falls from the sky. It rolls off a table and onto the floor. A table and a floor which I never had to draw, render, or define. The HoloLens just saw them, told Unity that they existed, and my voice command set the wheels in motion for a CG object to interact with objects that actually exist in my life. I hear the light, crunchy sound the ball makes as it hits the table’s surface. Thanks to 3D audio, I can even hear where it falls if I turn around.
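That’s the part worth dwelling on: the room mesh HoloLens hands to Unity behaves like any other collider, so the ball’s bounce and its crunchy impact sound need nothing special. Here’s a sketch of the ball’s side of that interaction — the PaperBall class and the crunch clip are hypothetical names of mine, but AudioSource.spatialBlend is Unity’s real knob for fully 3D sound:

```csharp
using UnityEngine;

// Attached to the paper ball. The room itself needs no scripting:
// the spatial-mapping mesh arrives as ordinary colliders, so the real
// table and floor behave just like geometry an artist had modeled.
[RequireComponent(typeof(Rigidbody), typeof(AudioSource))]
public class PaperBall : MonoBehaviour
{
    public AudioClip crunch;   // the light, crunchy impact sound

    void Start()
    {
        var source = GetComponent<AudioSource>();
        source.clip = crunch;
        source.spatialBlend = 1f;   // fully 3D: you can hear *where* it lands
    }

    void OnCollisionEnter(Collision collision)
    {
        // Fires whether the ball hits a virtual pad or the actual table.
        GetComponent<AudioSource>().Play();
    }
}
```

Once the room is just colliders, the “magic” collapses into stock game-engine physics and audio — which is exactly why it felt so easy.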
It’s a shame that just as I start turning, the ball disappears from my vision — god that field of view is so, so tiny — but this feels like a great start.