
Insect-inspired eye may allow drones to navigate their environment more naturally

A newly-created compound eye-inspired system may complement existing accelerometer navigation in drones to help improve their autonomous capabilities (Photo: Expert & Ruffier/ISM/CNRS/AMU)

Most modern aircraft, cruise missiles, spacecraft – in fact, almost all flying vehicles – use an accelerometer for flight stabilization. Living creatures that fly, on the other hand, rely on their own innate sense of balance determined by environmental observation and inbuilt organ-based systems. Now French researchers have designed a bio-inspired, sight-based system that could be used in conjunction with accelerometers to vastly increase the autonomous capabilities of drones by endowing them with more natural flying abilities.

To this end, researchers at Aix-Marseilles University's Institut des Sciences du Mouvement Etienne-Jules Marey in France have built a bee-inspired flying robot that uses what is known as optic flow visual navigation. To a flying insect's compound eye, the scene directly ahead appears almost stationary, while objects and terrain to either side sweep across the peripheral field of view at a rate that depends on how close they are, which gives the insect a spatial sense of its path as it flies.
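As a rough illustration of the principle (not drawn from the paper itself), the translational optic flow seen by a moving eye grows with speed, shrinks with distance, and is strongest for surfaces viewed to the side or below rather than straight ahead:

```python
import math

def translational_optic_flow(speed, distance, bearing_deg):
    """Angular rate (rad/s) at which a point appears to sweep across the
    visual field of an eye moving at `speed` (m/s).

    `distance` is the range to the point in metres and `bearing_deg` its
    angle from the direction of travel: ~0 degrees straight ahead,
    90 degrees directly to the side or below.
    """
    return speed * math.sin(math.radians(bearing_deg)) / distance

# A wall 1 m to the side streams past at ~2 rad/s at 2 m/s of forward speed,
# while a point almost dead ahead barely moves in the image.
print(translational_optic_flow(2.0, 1.0, 90))   # ~2.0 rad/s
print(translational_optic_flow(2.0, 10.0, 5))   # ~0.017 rad/s
```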

To replicate this compound-eye arrangement, the researchers built on their earlier work to create an electronic optic flow sensor comprising a set of 24 photodiodes arranged as an artificial eye. In this new research, the eye was attached to a tethered, 80 g (2.8 oz), 470 mm (18.5 in) long drone that the researchers have dubbed "BeeRotor."
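The sensor's actual signal processing is detailed in the team's publications; purely to illustrate the general idea behind pairing neighbouring photodetectors as a local motion sensor, the sketch below estimates an optic flow value from the time lag between two adjacent photodiode signals, with the angular pitch, sampling rate and test signals all assumed for the example:

```python
import numpy as np

# Minimal sketch (not the BeeRotor firmware): estimate local optic flow from
# two neighbouring photodiode signals separated by a known viewing angle.
# A passing contrast crosses one photodiode, then the other; the angular
# speed is the inter-photodiode angle divided by that time lag.

INTER_PHOTODIODE_ANGLE = np.radians(4.0)  # assumed angular pitch between lenslets
SAMPLE_RATE = 1000.0                      # assumed sampling rate, Hz

def optic_flow_from_pair(sig_a, sig_b):
    """Return angular speed (rad/s) from the lag that best aligns sig_b with sig_a."""
    sig_a = sig_a - sig_a.mean()
    sig_b = sig_b - sig_b.mean()
    corr = np.correlate(sig_b, sig_a, mode="full")
    lag_samples = corr.argmax() - (len(sig_a) - 1)
    if lag_samples <= 0:
        return 0.0  # contrast never reached the second photodiode
    return INTER_PHOTODIODE_ANGLE / (lag_samples / SAMPLE_RATE)

# Synthetic test: the same contrast feature seen 20 ms later by the second photodiode.
t = np.arange(0.0, 0.5, 1.0 / SAMPLE_RATE)
first = np.exp(-((t - 0.20) ** 2) / 1e-4)
second = np.exp(-((t - 0.22) ** 2) / 1e-4)
print(optic_flow_from_pair(first, second))  # ~3.5 rad/s (4 degrees in 20 ms)
```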

To test this setup, the researchers then set the drone to negotiate a series of varying elevations of terrain and obstacles using only the visual cues provided by the optic flow sensor eye, and without resorting to measuring its velocity or altitude.

To achieve this, three feedback loops built into the on-board electronics used the data collected by the optic flow sensors to position the robot in flight. The first of these loops controlled the drone's altitude, using the visual information to make it follow either the floor of the test rig or the ceiling of a tunnel on the same setup.
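As a loose sketch of how such an optic-flow-based altitude loop can work (the setpoint, gains and units here are invented for illustration, not taken from the paper), holding the downward-looking flow at a constant value makes a craft climb as the ground rises and descend as it falls away:

```python
# Illustrative optic-flow altitude loop; all constants are assumptions.
FLOW_SETPOINT = 2.0   # desired ventral optic flow, rad/s
KP, KI = 0.8, 0.2     # proportional and integral gains
DT = 0.02             # control period, s

def make_altitude_controller():
    integral = 0.0
    def step(measured_flow):
        nonlocal integral
        # Flow above the setpoint means the surface is too close: command a climb.
        error = measured_flow - FLOW_SETPOINT
        integral += error * DT
        return KP * error + KI * integral  # climb-rate command, m/s
    return step

controller = make_altitude_controller()
for flow in (2.0, 2.5, 3.0, 2.2):      # ventral flow readings as terrain rises
    print(round(controller(flow), 3))  # climb command grows, then eases off
```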

The second loop adjusted the robot's forward speed to suit the size of the tunnel it was flying through, while the third re-oriented the optic flow sensor eye relative to the slope passing beneath it, so that the unit kept the optimum field of view regardless of its pitch. It is this eye-reorientation mechanism that allowed the robot to negotiate even steeply sloping terrain without requiring an accelerometer. As seen in the video below, the robot successfully navigated a tunnel with moving walls while avoiding obstacles.
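Again as an illustration rather than the published controller, one common way to obtain this tunnel-size-dependent speed behaviour is to regulate the sum of the flows measured on opposite sides of the craft, so a narrow passage automatically commands a lower speed and a wide one lets it accelerate:

```python
# Illustrative speed loop; the setpoint and gain are assumptions.
SUM_FLOW_SETPOINT = 4.0   # desired ventral + dorsal optic flow, rad/s
K_SPEED = 0.5             # proportional gain

def speed_command(ventral_flow, dorsal_flow, current_speed):
    """Nudge the forward speed so the total flow settles at the setpoint."""
    error = SUM_FLOW_SETPOINT - (ventral_flow + dorsal_flow)
    return max(0.0, current_speed + K_SPEED * error)  # m/s, never negative

# Narrow tunnel (high total flow) -> slow down; wide tunnel (low flow) -> speed up.
print(speed_command(3.0, 3.0, 2.0))   # 1.0 m/s
print(speed_command(1.0, 1.0, 2.0))   # 3.0 m/s
```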

Though some drones already incorporate a form of obstacle avoidance, it is generally based on proximity to an object – as in the infrared system used by the palm-sized Zano drone – rather than on the optic flow approach employed by the BeeRotor.

As such, the BeeRotor researchers, Fabien Expert and Franck Ruffier, believe their robot's demonstrated ability to orient itself in space without a standard accelerometer offers a hypothesis for how insects achieve the same feat: through similar biologically-based feedback mechanisms that do not rely on gravity-sensing organs.

Regardless of whether this hypothesis proves true, the optic flow technology realized in the research may well be used in future drone aircraft to complement their accelerometer-based flight orientation and perhaps provide a more naturalistic method to further improve autonomous navigation.

The results of this research were recently published in the journal Bioinspiration & Biomimetics.

The short video below shows the BeeRotor in action.

Source: Aix-Marseilles University's Institut des Sciences du Mouvement Etienne-Jules Marey (PDF)

BeeRotor Video (Multimedia Extension 2); Credits: Expert & Ruffier, ISM (AMU / CNRS)
