Drones allow us to go places other vehicles can’t, places that might be difficult to navigate or otherwise dangerous. Ideally, you’d like to do a few dry runs first, get your bearings and so forth, even collect some data for the real thing. To this end, a team from MIT has developed special VR software that can beam simulated environments to drones, essentially making them “hallucinate” their surroundings.
Called “FlightGoggles” and built on top of the Unity game engine, the tech allows researchers to “fly their drones in the most realistic simulation possible”, according to MIT associate professor Sertac Karaman, who specialises in aeronautics and astronautics and worked on the project:
The drone flies under a motion capture system; exteroceptive measurements (such as camera, LiDAR) are created photorealistically in real-time, using Unity rendering with Nvidia Titan X GPUs, and beamed to the drone with little latency.
In this way, the drone experiences real physics, gets real inertial measurements, but gets photorealistically simulated camera images. This allows researchers and developers to fly their drones in various simulated environments.
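The loop Karaman describes can be sketched in a few lines: read the drone’s real pose from the motion-capture system, render a simulated camera frame for that pose, and hand the frame to the drone’s vision stack. The sketch below is purely illustrative — the function and class names are assumptions, not the actual FlightGoggles API, and the “renderer” is a stub standing in for Unity:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """The drone's real position and heading, as a motion-capture rig might report it."""
    x: float
    y: float
    z: float
    yaw: float

def capture_pose(t: float) -> Pose:
    # Stand-in for the motion-capture system: here the drone just
    # drifts forward at 0.5 m/s at a fixed 1 m altitude.
    return Pose(x=0.5 * t, y=0.0, z=1.0, yaw=0.0)

def render_frame(pose: Pose) -> dict:
    # Stand-in for the Unity renderer: produce a "camera image" for the
    # drone's real pose inside the virtual scene.
    return {"pose": pose, "pixels": f"frame@({pose.x:.1f},{pose.y:.1f},{pose.z:.1f})"}

def hardware_in_the_loop(steps: int, dt: float = 0.1) -> list:
    frames = []
    for i in range(steps):
        pose = capture_pose(i * dt)   # real measurement of the real drone
        frame = render_frame(pose)    # photorealistic simulated image
        frames.append(frame)          # "beamed" to the drone's vision stack
    return frames
```

The key design point the quote makes is the split of responsibilities: physics and inertial sensing stay real, while only the exteroceptive sensors (camera, LiDAR) are synthesised — which is why low rendering latency matters so much.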
That’s right — the drone is fed a fake perspective, which it can then react to as it would in the real world.
As you can see, while the drone sees a room filled with various obstacles, in reality it’s zooming around a completely empty space.
It’s a neat use of an emerging technology, one that makes a lot of sense when you think about it. Drones may be more disposable than humans, but if we can come up with a way to do simulated runs first, why wouldn’t we?