For more than a century, aeronautical visionaries have turned to the natural world for inspiration, and those working with modern-day miniature aircraft that fit in the palm of your hand are no different. By learning how bees safely zip through thick rainforests in spite of their poor vision, scientists say they can endow flying robots with similar capabilities, promising exciting new levels of autonomy for small drones.
Dense forests crowded with branches and leaves provide quite the obstacle course for fast-moving insects. How they manage to identify gaps to move through seems to defy their small brains and low-resolution eyesight. Researchers at Sweden's Lund University have now uncovered the tricks the animals use to navigate these tight spaces without running into trouble.
The scientists studied the green orchid bee in action, discovering that the insect gauges the light intensity piercing through holes in the rainforest foliage to work out whether the openings are large enough to travel through. Using the brightness as a guide, the insects can pinpoint spots that give them the largest clearance from the edges.
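The gap-finding strategy can be sketched in a few lines: treat the bee's view of the canopy as a 2-D map of light intensity and head for the brightest point, which tends to sit farthest from a gap's edges. This is a toy illustration of the reported idea, not the researchers' model; the function name and the grid are invented for the example.

```python
import numpy as np

def find_gap_center(intensity_map):
    """Return the (row, col) of the brightest point in a 2-D
    light-intensity map -- a stand-in for steering toward the
    spot where the most light pierces the foliage."""
    idx = np.argmax(intensity_map)
    return np.unravel_index(idx, intensity_map.shape)

# Toy scene: a dark canopy with one bright opening.
canopy = np.zeros((5, 5))
canopy[2, 3] = 1.0   # simulated gap letting light through
print(find_gap_center(canopy))  # (2, 3)
```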
The researchers say that one of the reasons the insects navigate so effectively is that their brains work differently from those of humans. Where a human brain takes in far more information than it is actually aware of, bees and other insects are more selective, which means that they see patterns rather than finer details.
"Their strategy is super simple," explains leader of the research, Emily Baird. "They measure their speed and their height above the ground by registering how quickly the pattern they see is coming towards them and moving across their eyes. This way, they have surrounding objects come at them at a constant speed when flying. If they are in a complex environment with dense vegetation, they will automatically fly slower than if they are flying in open terrain where objects are not as close."
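The behavior Baird describes amounts to a feedback loop: hold the rate at which patterns sweep across the eye (the optic flow) constant by adjusting forward speed. Below is a minimal proportional-control sketch of that idea under assumed units; the function name, target value, and gain are illustrative, not taken from the study.

```python
def regulate_speed(current_speed, observed_flow, target_flow=1.0, gain=0.5):
    """Nudge forward speed so perceived optic flow stays constant.
    Nearby obstacles raise the flow, so the controller slows down;
    open terrain lowers it, so the controller speeds up."""
    error = target_flow - observed_flow
    return max(0.0, current_speed + gain * error)

# Dense vegetation: surfaces are close, flow is high -> slow down.
print(regulate_speed(2.0, observed_flow=1.8))  # 1.6
# Open terrain: flow is low -> speed up.
print(regulate_speed(2.0, observed_flow=0.4))  # 2.3
```

Because the controller needs only the apparent image speed, not distances or a map, it matches the "computationally simple" quality the researchers highlight.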
The researchers say the insects' tactics offer the basis for a fast, computationally simple and efficient model for navigation that could allow small robots to fly through cluttered environments without human help. But for that to happen, they must first find a way to translate the insect's visual processing into mathematical models and digital systems. Baird is currently working on this problem, aiming to paint a three-dimensional picture of what the bees see using synchrotron radiation.
"The system is so simple, it's highly likely that other animals also use light in this way," says Baird. "The system is ideal for adapting to small, light-weight robots, such as drones. My guess is that this will become a reality within five to 10 years."
The team's research was published in the journal Proceedings of the Royal Society B.
Source: Lund University