One-Eyed Bug Vision Helps Drones Land

In an effort to build—and control—ever smaller drones, researchers have been looking at how insects navigate. Insects use a technique called optical flow, based on the apparent speed of objects passing by in their field of vision. In fact, humans use optical flow to give us a sense of how fast we’re going when we’re driving. 

But unlike humans in cars, drones have a third dimension to worry about. They also have to keep track of their height in order to land successfully.  Stereo vision would allow them to estimate distances, but if the baseline between sensors is too small, those measurements are imprecise.
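A single downward-looking camera cannot measure height directly, but optical flow does encode the relative rate of approach: as the camera descends, ground features stream outward, and the flow field's divergence equals the ratio of descent speed to height. The toy sketch below illustrates the idea with made-up point tracks (flat ground, camera pointing straight down; the function and values are illustrative assumptions, not code from the research):

```python
import math

# Divergence from the radial expansion of tracked image points.
# If the camera closes distance at rate v from height h, every
# feature's image coordinates scale by roughly 1/(1 - (v/h)*dt)
# per frame, so the expansion rate recovers v/h -- without ever
# knowing v or h separately.

def divergence_from_tracks(pts_prev, pts_next, dt):
    """Estimate flow-field divergence (1/s) from matched points."""
    rates = []
    for (x0, y0), (x1, y1) in zip(pts_prev, pts_next):
        r0 = math.hypot(x0, y0)
        r1 = math.hypot(x1, y1)
        rates.append((r1 - r0) / (r0 * dt))  # relative expansion rate
    return sum(rates) / len(rates)

# Camera at h = 2.0 m descending at v = 0.5 m/s: points scale by
# h / (h - v*dt) each frame, so the estimate should be close to
# v/h = 0.25 per second.
h, v, dt = 2.0, 0.5, 0.1
scale = h / (h - v * dt)
prev = [(0.1, 0.2), (-0.3, 0.15), (0.25, -0.1)]
nxt = [(x * scale, y * scale) for x, y in prev]
print(divergence_from_tracks(prev, nxt, dt))
```

The reciprocal of this divergence is the familiar "time to contact": how long until touchdown at the current closing speed, again without knowing the height itself.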

Guido de Croon, a researcher at the Delft University of Technology in the Netherlands, set out to use optical flow data captured by a single camera—research inspired by his previous tenure at the European Space Agency, where he developed lightweight optical backup landing systems for spacecraft. When trying to land a drone softly using this information, de Croon noticed a strange phenomenon, which he reported in the most recent issue of Bioinspiration & Biomimetics:

“When [the drone] gets close to the ground, at a certain point, the control system will become unstable; it starts to oscillate. It is not that the ground causes an aerodynamical effect; the drone itself is inducing these oscillations.”

De Croon compares this behavior to that of an inexperienced pilot who overcorrects the flight path when approaching the runway, causing the plane to oscillate up and down. For an airplane, this oscillation can cause a hard landing. But for a drone, it appears to be a welcome bonus.

“I found that this oscillation occurs at a specific distance from the ground. We could see this as a problem, of course, but actually, it is also an opportunity for the robots. If it can detect this oscillation, then it can actually know its height,” says de Croon. A similar effect can be seen in insects: bees, for example, hover in front of a flower before landing on it.

Besides landing a drone safely, these oscillations can also be used to navigate at a specific height by tuning the control gain, the strength of the corrective reaction that causes the oscillations. De Croon found that he could set the gain so that the drone started to shake whenever it came within a certain distance of the ground. By staying close to this edge of oscillation, he could keep the craft at a specific height. The technique also works for lateral movements, he adds, so it can be used to avoid obstacles as well.

For the experiments, de Croon used the commercially available Parrot AR Drone 2. The drone’s camera was used for the flight experiments, while its height-measuring sonar was used for confirming the camera’s readings. De Croon also used off-the-shelf software (the Paparazzi open source autopilot) to control the drone.

Soft-landing algorithm? Check. Monocular obstacle avoidance? Check. And because optical-flow processing needs very little hardware and software, future insect-size drones could be equipped with navigation systems that weigh no more than a gram, says de Croon.

Source: IEEE Spectrum
