When you picture a pair of smart glasses, chances are you're envisioning something slick. Perhaps a stylish pair of aviators that, with a discreet tap or a voice command, will immediately pull up an array of information about whatever you're looking at. In any case, you probably imagine someone like James Bond or Tony Stark wearing them.
That's because science fiction has played a huge role in how we, the average consumer, think this technology should work. But the truth is smart glasses aren't some futuristic gadget that has yet to come to fruition. They're already here, and they've been around for quite some time.
So if smart glasses are already a thing, why doesn't everyone own a pair? Sure, design and cost factor in, but the answer isn't as straightforward as you might think. To find out what's taking so long, we talked to a bunch of designers, engineers, historians, sci-fi authors, and futurists about smart glasses: where the tech's been, how it was developed, and what the future actually looks like.
The most "recognisable" pair of smart glasses might be the Google Glass Explorer Edition, but you can trace the idea of an information-laden visual apparatus all the way back to World War II pilots.
It was hard for pilots to identify targets based on voice instruction alone, so naturally, militaries turned to technology to solve the problem. The result was the heads-up display (HUD). By the end of the war, some aircraft could reflect radar information onto a piece of glass in front of the pilot's controls.
But the first HUD with computing power came in the late 1960s in the form of a huge, hulking device called the Sword of Damocles. The name is a reference to the old tale of Damocles, an annoying courtier in King Dionysius II's court. To demonstrate the perils of ruling, Dionysius sat Damocles on the throne with a giant sword hanging by a single hair over his head. The tech version was the invention of Ivan Sutherland and his team of computer scientists at the University of Utah, and it is widely regarded as the first virtual reality and augmented reality headset.
Except no one really knew what to do with this thing. Its size and bulk made it more of a theoretical proof of concept than a piece of technology anyone could actually use. That didn't stop people from tinkering, though.
The Private Eye came out in the late 1980s and was popular with the DIY community. Meanwhile, Steve Mann, often cited as the father of wearables, developed the EyeTap in the late 1990s. These devices were closer to what we envision smart glasses looking like, but they still weren't the sleek, discreet designs we've come to expect.
And that's one of the main problems with designing a pair of smart glasses: how these devices look is incredibly important. People are vain, so much so that needing glasses can feel like a sentence to Eternal Dorkdom when you're a kid. Glasses are far more fashionable these days, but smart glasses operate within a smaller margin. Yes, they have to look great, but they can't simply be fashionable; they also have to pack in all that AR functionality too. The challenge is creating a pair comfortable enough to wear all day, one that appeals to people's individual styles, accommodates everyone's vision, and can still be mass-produced. Oh, and it can't cost a fortune.
That's a tall order, and the technology hasn't been there yet. So in the meantime, science fiction has had to fill the gaps. Writers like William Gibson and Robert A. Heinlein included descriptions of augmented reality headsets in novels like Neuromancer and Starship Troopers. George Lucas's Star Wars: A New Hope introduced audiences on a massive scale to HUDs in 1977, thanks to Luke's Death Star trench run.
That continued in the 1980s with the computer vision of Terminator and RoboCop. Hell, even Sailor Mercury had a form of AR headset in the popular '90s anime Sailor Moon.
"They create sort of an ideal vision," Madeline Ashby, a science-fiction novelist and futurist, told Gizmodo. "They create a vision of what might be possible, and that's sort of the science fiction writer's privilege. We can write about what we think is possible without having to do all the leg work of actually designing it. We write the wish list. Someone else has to fulfil it."
Given the limitations, it makes sense that it took until 2012 and Google Glass for the first viable consumer smart glasses to burst onto the scene. Lighter than your average pair of sunglasses, these babies were sleeker than anything that had come before, even if they did cost a whopping $US1,500 ($2,210). Plus, the concept video Google dropped looked amazing. It gave us a first-person view into what a future with smart glasses might actually look like.
Except that's where the expectations science fiction gave us and the reality of what was possible began to split. Functionally, Glass had a handful of useful apps and some promising use cases, but nothing compelling enough that anyone beyond the earliest adopters would shell out for it. There was, so to speak, no "killer app." Plus, while sci-fi novelists, Hollywood, and futurists were laser-focused on the positive possibilities, the public had different concerns once Glass was in the wild.
"The issue Google Glass experienced when it launched was two-fold. The way it looked: you literally had a piece of tech hanging off your face," says Chuck Yust, a designer with Frog Design. "The second part is people reacting to being filmed all the time and having cameras in their face."
Yust told Gizmodo that he'd heard Glass described as a Segway for your face. Yeesh. But that uber-nerdy design is just one reason wearers were called Glassholes. (Though you really do have a problem when even supermodels can't make a thing look cool.) Aside from the aesthetics, Glass triggered some legitimate societal concerns, especially around privacy and security.
Countless think pieces were penned, but perhaps the most famous incident involved a woman who was attacked at a San Francisco bar in 2014 for wearing Glass. The whole scuffle happened because bar patrons were upset at the idea that the woman could be recording them at any moment in a public place.
"Smart glasses, to me, are the easiest way to observe people in what they think is a private environment," Ashby added. "That's why they always show up in spy movies."
"Even though logically we know the smartphone has a camera, we have a very good sense of whether someone is filming us with a smartphone," says Marc Weber, founder of the internet history program at the Computer History Museum. "It's harder with a headset because there aren't as many cues. But the immediate default assumption is that you could be filmed without your knowledge."
So here we are in 2019, four years after the Explorer Edition of Google Glass was scrapped, and the design, technological, and societal challenges facing smart glasses are still the same. So is that it? Are all these challenges just too expensive and complicated to solve? Will we have to rely on science fiction for another 20 or 30 years before another company delivers something viable? Perhaps not.
Backlash to Google Glass was brutal, but far from a final deathblow. All the designers, engineers, and futurists we spoke to agree on one thing: smart glasses are coming, and right now, they're finding a second life in enterprise.
This is a two-part video series exploring the challenges of making smart glasses people would actually use. Part two will air next week.