Tiny schooling aquatic robots inspired by fish
We've already heard about small flying or wheeled robots that work together on tasks in collaborative "swarms." Harvard University researchers have now gone a step further by developing tiny underwater robots that school together like fish.
Known as Bluebots, the fish-inspired robots each incorporate two wide-angle cameras and three high-visibility blue LEDs. They swim by flapping their tails and moving their fins.
Combining the output of its two cameras for 3D computer vision, each Bluebot is able to ascertain the distance, direction and heading of all the other Bluebots in a water tank, relative to itself. Using custom algorithms to analyze that data, the collective "Blueswarm" is thus able to perform actions such as aggregating, dispersing, and swimming in a circle.
"If we want the robots to aggregate, then each Bluebot will calculate the position of each of its neighbors and move towards the center," says PhD candidate Florian Berlinger, first author of the study. "If we want the robots to disperse, the Bluebots do the opposite. If we want them to swim as a school in a circle, they are programmed to follow lights directly in front of them in a clockwise direction."
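The aggregate and disperse rules Berlinger describes can be sketched in a few lines. This is an illustrative reconstruction, not the Blueswarm code: it assumes each robot holds a list of neighbor positions (as 2D relative coordinates derived from its cameras) and returns a unit step direction.

```python
import math

def aggregate_move(neighbors):
    """Step toward the centroid of visible neighbors.

    `neighbors` is a list of (x, y) positions relative to this robot,
    as each Bluebot estimates them from its two onboard cameras.
    """
    cx = sum(x for x, y in neighbors) / len(neighbors)
    cy = sum(y for x, y in neighbors) / len(neighbors)
    norm = math.hypot(cx, cy) or 1.0
    return (cx / norm, cy / norm)  # unit vector toward the center

def disperse_move(neighbors):
    """Do the opposite: step directly away from the neighbor centroid."""
    dx, dy = aggregate_move(neighbors)
    return (-dx, -dy)
```

Each robot runs the same rule on purely local observations, and the global behavior (a tight cluster or an even spread) emerges without any central controller.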
In a demonstration of a possible search-and-rescue application of the technology, the Bluebots were programmed to disperse throughout their tank until one of them got close enough to a red light to detect its presence. That robot then began flashing its LEDs, triggering the rest of the Blueswarm to aggregate around it.
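The demo's logic amounts to a small per-robot state machine. The following is a hypothetical sketch of one decision step; the state names and inputs are assumptions for illustration, not taken from the published implementation.

```python
def next_state(state, sees_red_light, sees_flashing_neighbor):
    """One decision step for a single robot in the search-and-rescue demo.

    States: "disperse" (searching), "flash" (found the target, signaling),
    "aggregate" (converging on a signaling robot).
    """
    if state == "disperse":
        if sees_red_light:
            return "flash"        # target found: signal the rest of the swarm
        if sees_flashing_neighbor:
            return "aggregate"    # another robot found it: converge on it
    return state                  # flashing and aggregating robots persist
```

Because the signal is just the LEDs the robots already use to see each other, the "found it" message propagates visually with no radio link, which matters underwater where Wi-Fi and GPS are unavailable.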
And while the Bluebots themselves would likely be incapable of rescuing anyone, the core technology could also be applied to much larger, more capable autonomous underwater robots. It could also be utilized in applications such as the environmental monitoring of ecologically sensitive areas, or studying the schooling behaviour of fish.
"Robots are often deployed in areas that are inaccessible or dangerous to humans, areas where human intervention might not even be possible," says Berlinger. "In these situations, it really benefits you to have a highly autonomous robot swarm that is self-sufficient. By using implicit rules and 3D visual perception, we were able to create a system that has a high degree of autonomy and flexibility underwater where things like GPS and Wi-Fi are not accessible."
A paper on the research, which was conducted in the lab of Prof. Radhika Nagpal, was recently published in the journal Science Robotics.