Until now, MAVs found their way from A to B with one of two methods — GPS navigation or a human pilot. The problem is that both of those rely on external systems to guide the robot. For a more autonomous MAV, the quadcopter must see for itself.
GPS signals can have an error of up to 70 metres, depending on the terrain, so they’re generally useless in tightly spaced city blocks. And if you want an “autonomous” MAV (Micro Air Vehicle), putting a human at the helm is just setting yourself up for failure. So how does one get an MAV to fly on its own without outside direction? Researchers from the Autonomous Systems Laboratory at ETH Zurich, as part of the EU’s sFly project, figured the easiest way is to just give it a pair of eyes and a sense of balance.
Well, three eyes, actually. The system developed at the ASL uses a trio of cameras — one collects flight data, while the other two cooperate to give the MAV stereoscopic vision and generate data for 3D modelling of the structures around it. This data is fed into an on-board microcomputer, which assembles the 3D “mental” map and compares the feed from the cameras to a set of required flight values (i.e. making sure the MAV isn’t flying upside-down or about to hit a tree). The MAV can then correct and optimise its flight path. This could make the drones invaluable for emergency search and rescue in collapsed structures, remote pipeline inspection and a host of other applications that are a better fit for robot eyes than human ones. [sFly via Physorg]
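The core idea behind that stereoscopic depth sensing is plain triangulation: the two cameras see the same point at slightly different image positions (the disparity), and the shift tells you how far away the point is. Here’s a minimal sketch of that principle — the function name and the focal length and baseline values are illustrative assumptions, not the sFly project’s actual parameters.

```python
def depth_from_disparity(disparity_px, focal_px=700.0, baseline_m=0.12):
    """Distance (metres) to a point via stereo triangulation.

    depth = focal_length * baseline / disparity
    focal_px and baseline_m are assumed example values, not sFly's.
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a point in front of both cameras")
    return focal_px * baseline_m / disparity_px

# A point whose image shifts 28 pixels between the two cameras would sit
# about 3 metres away with these assumed camera parameters.
print(depth_from_disparity(28))  # 700.0 * 0.12 / 28 = 3.0
```

Note how depth falls off with disparity: distant obstacles produce tiny pixel shifts, which is why stereo rigs with a short baseline (like one that fits on a quadcopter) are most accurate at close range — exactly where collision avoidance matters.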