Apple Glasses Rumours Have Me Stressed About AR’s Biggest Unsolved Problems

The Cinemizer was a gadget meant to let you watch movies from an iPod or iPhone in 2009. Not quite AR, but similar ideas have been kicking around for years. (Photo: Ryan Anson/AFP, Getty Images)

The hype behind Apple’s rumoured augmented reality headset and glasses just won’t die. Every few weeks, almost like clockwork, some sort of Apple AR rumour or news makes its way through the tech blogosphere, even as other smart glasses and AR companies are capsizing left and right. This has always irked me, but it wasn’t until my coworkers and I chatted about a recent Apple patent that I realised why.

The AR patent in question was uncovered by AppleInsider. The patent itself isn’t necessarily anything mind-blowing. The filing details research from 2016 on how someone wearing a pair of Apple smart glasses might interact with objects in an augmented or virtual environment — basically, by using infrared sensors to turn any surface into a touch interface. That’s cool, but patents may never turn into actual products, and companies sometimes file them just to keep competitors from doing something similar. Still, I immediately began questioning two things: Wouldn’t this very cool idea still fall victim to the very real problems with augmented reality displays? And how does it account for the fact that everyone’s vision is unique?

The display problem is not new. The way most AR headsets and glasses work involves projecting an image onto a lens (or screen) or into your eyeball. I’ve tried several smart glasses and AR headsets, and all of them run into the same issue: Whenever you’re in an extremely bright environment, the image you’re seeing gets washed out. When I tried a pair of Focals by North — RIP — the projected image was invisible once I stepped outside. I had to stick a sunglasses attachment on, and even then, it wasn’t the clearest. Even if you set aside other issues, like the need for bulky arms to encase sensors and components, none of it matters if you can’t see anything except under ideal circumstances. AR glasses that you can’t reliably use outdoors eliminate many of the use cases — like directions — that make the tech attractive in the first place.

Even bulkier headsets, like Magic Leap and Microsoft HoloLens 2, have display issues — their field of view is not all that great — never mind the lack of applications outside the enterprise space. These are hulking headsets that you wouldn’t reasonably wear all day. No company has yet figured out a way to meld the immersive experience of bulkier headsets with the science-fiction dream of a sleek, everyday form factor. Unless Apple’s got a secret team designing new components and hardware from scratch, the idea that it might be the first to crack this problem seems overly optimistic.

The other issue is that vision is highly subjective. No two pairs of eyes are the same, which makes mass production a conundrum. A friend can’t just hand me their smart glasses to borrow, even if we both have 20/20 vision. The physical shape of your eyeballs and your head are factors, too. So far, companies have dealt with this by calibrating smart glasses to each individual user. I’ve gone through several calibration processes, and it’s not fun. If you don’t do it right, whatever image you see will appear murky, or disappear entirely if you adjust your glasses even slightly. You might have to recalibrate several times, which is just annoying enough that you might give up on the glasses altogether.

One company may have a solution. At CES 2020, Bosch made a splash with its prototype smart glasses, the main difference being that they used tiny laser arrays to beam images straight onto your retina. These also required individual calibration, but the supposed benefit was that they somewhat sidestepped the display problem. Because the lasers projected the image directly onto your retina, it stayed clear and avoided the problem of your eye constantly refocusing, according to IEEE Spectrum.

Cool! But while Bosch’s glasses aren’t likely to ever be more than a prototype, they don’t erase the fact that your eyes are delicate organs. (That’s another reason why smart contacts should be approached with extreme caution.) Shining any type of light directly onto your retina is potentially dangerous. Plenty of medical studies have documented how lasers can cause retinal injury or damage. Granted, most of these injuries aren’t caused by commercial lasers, which are generally not strong enough to cause lasting damage from momentary exposure. But smart glasses aren’t meant for momentary exposure; they’re meant to be worn for extended periods of time. Who exactly is conducting peer-reviewed clinical studies at scale to determine what light levels are safe for your retina? Who is studying the relationship between AR/VR and eye strain?

Maybe Apple has taken all of this into consideration. After all, it surprised the world when it launched the Apple Watch Series 4 with FDA clearance — a process that is not easy and generally requires a lengthy period of time (and studies to back up the medical claims). It’s also not clear whether Apple’s AR project will be a standalone product along the lines of Google Glass or Magic Leap, or simply an extension of the iPhone. But it’s foolhardy to tout any version of Apple Glasses — rumoured or not — without discussing these two issues.