
Underwater robot can make its own snap decisions


Two marine scientists have shown that autonomous underwater vehicles (AUVs) can be programmed to make independent decisions and trigger new missions in real time based on data coming from multiple sensors. They believe this could reveal much about the life of squid and other marine creatures.

The researchers used a deep sea modular AUV model called REMUS 600 for their experiment. They were investigating whether squid, fish, and krill attract whales to a deep ocean trench in the Bahamas known as the Tongue of the Ocean. They thought they might be able to do this more effectively if the AUV could react to biological information – such as a certain size or concentration of squid – in its environment.

Squid move constantly. Even when they're not actively swimming, the currents push them around. The same can be said for the ocean in general. "What you see at any given instance is going to change a moment later," said study co-author Mark Moline, the director of the School of Marine Science and Policy at the University of Delaware.

Rather than follow the usual procedure of collecting data, analyzing and interpreting it, and then feeding new instructions to the robot, the scientists pre-programmed the AUV to react in specific ways to certain kinds of sensory input.


A custom application running on one of the AUV's onboard computer stacks constantly analyzed the incoming sonar data. When it identified the target number and size of squid (for this experiment, 100 squid longer than 20 cm), it sent a "1" signal to the Remote Control (RECON) computer; otherwise it sent a "0". The signal was then propagated through the vehicle network, which contains several other computers, with a delay of up to 30 seconds, keeping the feedback close to real time.
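
The paper's control code isn't reproduced here, but the trigger described above boils down to a simple threshold test on the sonar detections. The following is a minimal Python sketch of that step only; the SonarDetection class, its field names, and the treatment of the squid count as a minimum threshold are illustrative assumptions, not the actual REMUS/RECON software interface.

```python
# Minimal sketch of the detection-to-signal step described above. The class and
# parameter names are illustrative guesses, not the real vehicle software.
from dataclasses import dataclass
from typing import Iterable


@dataclass
class SonarDetection:
    """One acoustically detected target with an estimated length in metres."""
    length_m: float


def recon_signal(detections: Iterable[SonarDetection],
                 min_length_m: float = 0.20,
                 min_count: int = 100) -> int:
    """Return 1 if the squid criteria are met, otherwise 0.

    Mirrors the experiment's trigger: roughly 100 squid-sized targets
    longer than 20 cm in the current batch of sonar returns.
    """
    qualifying = sum(1 for d in detections if d.length_m > min_length_m)
    return 1 if qualifying >= min_count else 0


if __name__ == "__main__":
    # Simulated batch of sonar returns: 120 targets around 25 cm long.
    batch = [SonarDetection(length_m=0.25) for _ in range(120)]
    print(recon_signal(batch))  # prints 1, which would hand control to RECON
```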

When the "1" signal appeared, the RECON computer took over navigation either until a secondary mission was completed or additional sensor input met some prescribed condition that triggered another behavior change. The secondary mission was to perform a more detailed grid-based scan of the immediate area.

This detailed scan found a very concentrated group of squid in one area and a second, less dense group of similarly sized squid to the south of the first. The researchers say this information likely would have been missed or only partially gathered had the AUV kept traveling in a straight line instead of automatically switching gears to investigate the squid aggregation.

The researchers believe this shows the potential of underwater vehicles with greater autonomy to unlock more secrets of the deep ocean. A team at MIT developed a similar technology last year to enable AUVs to pick their own routes and investigation hotspots based on timing constraints, obstacles in the environment, and target areas.

But where the MIT researchers focused on autonomous navigation and cooperation between multiple AUVs, Moline and his colleague Kelly Benoit-Bird from Oregon State University showed how effectively the robots can react to multi-sensor data flows.

Next, the researchers hope to craft multiple decision loops for processing this real-time visual and acoustic information, so that an AUV could, for example, follow an entire school of squid to see where it goes and whether it scatters or congregates more tightly, building a continuous map of how the prey moves through the ocean.

Much of the world beneath the waves remains uncharted, but robots such as these seem destined to carry the torch in mapping it. The deep ocean's greatest explorers may soon prove to be machines following their own whims and curiosities, much like the great human explorers of centuries past.

A paper describing the research was published in the journal Robotics.

Source: University of Delaware
