Let me explain. Right now, authorising and targeting air strikes is a process that’s sometimes bureaucratic, and sometimes dangerous as hell. Bureaucratic as in the Stanley McChrystal phase of the Afghanistan war, when it took a gaggle of lawyers, intelligence analysts, air controllers, and commanders at multiple layers to put steel on target.
The result was fewer civilian casualties – but more US troops, locked in firefights without air support. Dangerous as hell as in the Libya war, where NATO jets are accidentally offing Libyan rebels with such alarming regularity that the opposition forces are now painting their vehicles’ roofs pink, to distinguish them from Gadhafi’s rides.
Darpa believes there might be a single technological fix to both problems: give a single guy on the ground a direct data link to the drone (or manned plane) circling above. That would eliminate the multilayered, bureaucratic approach, in which information is often passed through IM windows and static-ridden radio connections. That same lone “Joint Terminal Attack Controller”, or JTAC, might be low-profile enough to slip into a situation like Libya without causing too much of an international ruckus.
The program to make this all happen is called Persistent Close Air Support, or PCAS. And the goal is to give that controller the ability to “request and control near-instantaneous airborne fire support.”
But the military also gave a million bucks to the relatively tiny Vuzix Corp. of Rochester, New York. Which is a little odd, at first blush, because Vuzix is an eyewear company, specializing in augmented reality specs.
But a little augmented reality may be just what a JTAC needs, in order to call in those airstrikes on his own. Rather than staring down at a bunch of maps and computer screens – and calling up intelligence analysts at headquarters for more info – it’d be better (and faster, and less prone to error) if he could get all of that data right on his augmented reality goggles. Oh, and if there was an integrated head-tracker, so the attached computer could basically see what the JTAC sees.
“It is all about speeding up the CAS [close air support] mission and eliminating friendly fire issues that can occur if the user on the ground may not have the whole picture of what is around them,” Vuzix executive Stephen Glaser tells Danger Room.
“The head tracker knows where the user is looking, so the information the user is seeing changes as he moves or turns his head. Theoretically you could look up in the sky and a little green triangle would appear telling you that you have an F-16 48km out at 21,000 feet. It could also tell you what type of ordnance the plane was carrying, so you could make a quick decision if that plane would be appropriate for the mission.”
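The geometry behind that green triangle is straightforward: convert the aircraft's position relative to the wearer into a bearing and elevation angle, then compare those against where the head tracker says the user is looking. A minimal sketch in Python, assuming a simple local east/north/up coordinate frame and a square field of view (real systems would use geodetic coordinates and the display's actual optics; all names here are illustrative):

```python
import math

def bearing_elevation(observer, target):
    """Bearing (degrees clockwise from north) and elevation (degrees above
    horizontal) from observer to target. Positions are (east_m, north_m, up_m)
    in a local frame -- an assumption for illustration only."""
    de = target[0] - observer[0]
    dn = target[1] - observer[1]
    du = target[2] - observer[2]
    ground_range = math.hypot(de, dn)
    bearing = math.degrees(math.atan2(de, dn)) % 360
    elevation = math.degrees(math.atan2(du, ground_range))
    return bearing, elevation

def in_view(head_bearing, head_elev, tgt_bearing, tgt_elev, half_fov=20.0):
    """True if the target falls inside the wearer's field of view, so the
    overlay symbol should be drawn. half_fov is a made-up display parameter."""
    d_b = (tgt_bearing - head_bearing + 180) % 360 - 180  # wrap to [-180, 180)
    d_e = tgt_elev - head_elev
    return abs(d_b) <= half_fov and abs(d_e) <= half_fov

# An F-16 48 km due north of the JTAC, at 21,000 ft (~6,400 m):
b, e = bearing_elevation((0, 0, 0), (0, 48_000, 6_400))
print(round(b), round(e, 1))   # due north, elevation under 8 degrees
print(in_view(350, 5, b, e))   # head turned roughly north: draw the triangle
```

The bearing wrap-around on `d_b` is the one subtle step: without it, a target at 5° and a head heading of 355° would look 350° apart instead of 10°.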
Some of this can be done today with pilots’ heads-up displays. But those require so much power and light that a JTAC would need to lug around an extra 3.6kg of batteries to make one work. (And it still wouldn’t work in direct sunlight.) That’s where the holograms come in.
Vuzix’s setup uses a more or less traditional microdisplay, then mates it to a flat piece of glass called an optical waveguide. The light from the display travels down the glass, bouncing between its parallel faces. Those beams are directed to a holographic film, which bounces the image to the eye.
If the plan works, the system will be tiny – just 3mm thick. And when the display is off, it’ll be totally see-through. Glaser notes: “This will ultimately allow us to design the display right into a pair of sunglasses, so no one will know you are even wearing a display.” Which could make the goggles good for civilians, as well as troops calling in a robotic, lethal hail.
Photo: US Air Force. Illustration: Vuzix.