In a first, scientists believe they have confirmed that humans possess another sense – a “remote touch” we share with other members of the animal kingdom, like some shorebird species that can sense prey beneath sand without seeing or touching it first.
Researchers at Queen Mary University of London and University College London (UCL) set out to investigate whether the same kind of sense that guides the birds – in which tiny shifts in the movement of sand grains alert an individual to food – might be more common among animals than previously thought.
"It’s the first time that remote touch has been studied in humans and it changes our conception of the perceptual world (what is called the ‘receptive field’) in living beings, including humans," said Elisabetta Versace, who leads the Prepared Minds Lab at Queen Mary University.
To test their theory of granular media particle interaction, Versace and colleagues designed an experiment to recreate what happens when shorebirds forage on sand. Essentially, as a hand – or a beak – nears a buried object, pressure changes in the medium, in this case sand, produce subtle mechanical shifts. In this study, participants moved their fingers through sand in search of a concealed cube, and were asked to identify where it was before they actually made contact with it.
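The study doesn't publish a model of these cues, but the underlying principle can be sketched in a few lines: as a probe approaches a buried object through sand, the resistance it feels rises above the free-sand baseline before any contact is made. The snippet below is a hypothetical, heavily simplified illustration of that idea; the exponential force profile, the threshold, and all constants are assumptions for demonstration, not the study's actual mechanics.

```python
import numpy as np

# Hypothetical, simplified model: the resistance force on a probe moving
# through sand rises above the free-sand baseline as it nears a buried
# object. The exponential profile and all constants are illustrative
# assumptions, not values from the study.
BASELINE_FORCE = 1.0   # arbitrary units: drag of the sand alone
DECAY_CM = 3.0         # assumed length scale over which the cue fades
NOISE_STD = 0.05       # assumed sensor noise

def resistance(distance_cm: float, rng: np.random.Generator) -> float:
    """Force felt by the probe at a given distance from the buried cube."""
    cue = np.exp(-distance_cm / DECAY_CM)  # grows as the probe closes in
    return BASELINE_FORCE + cue + rng.normal(0.0, NOISE_STD)

def detect_before_contact(threshold: float = 1.25, start_cm: float = 15.0,
                          step_cm: float = 0.5, seed: int = 0) -> float:
    """Sweep the probe toward the object and report the distance at which
    the force cue first exceeds the detection threshold."""
    rng = np.random.default_rng(seed)
    d = start_cm
    while d > 0:
        if resistance(d, rng) > threshold:
            return d  # "remote touch": flagged before physical contact
        d -= step_cm
    return 0.0  # only detected at contact

if __name__ == "__main__":
    print(f"Detected at {detect_before_contact():.1f} cm before contact")
```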
The researchers then pitted the participants against a robot running a Long Short-Term Memory (LSTM) algorithm, and the human hands recorded nearly twice the success rate at sensing the cube's proximity compared with the artificial sensor.
Humans stopped within the “expected detectable range” 70.7% of the time, halting within 6.9 cm (2.72 in) of the cube with a median proximity of 2.7 cm (1.06 in), compared with a 40% success rate for the programmed robot hand. This, the researchers believe, is enough to confirm that we can sense an object before we touch it, when it sits in a medium like sand that delivers cues through grain displacement and tiny changes in pressure.
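The article names only an LSTM, not the robot's architecture, so the sketch below is a plausible reconstruction rather than the team's actual system: a small PyTorch LSTM that reads a short window of force readings from the probe and outputs whether a buried object is nearby. The layer sizes, number of input channels, and sequence length are all assumptions.

```python
import torch
import torch.nn as nn

# Hypothetical reconstruction of the robot's detector: the source names an
# LSTM but not its architecture, so all sizes here are assumptions.
class RemoteTouchLSTM(nn.Module):
    def __init__(self, n_features: int = 3, hidden: int = 32):
        super().__init__()
        # n_features might be, e.g., force x/y/z from the probe's sensor
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)  # logit: object nearby or not

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, time_steps, n_features) window of sensor readings
        _, (h_n, _) = self.lstm(x)
        return self.head(h_n[-1]).squeeze(-1)

if __name__ == "__main__":
    model = RemoteTouchLSTM()
    window = torch.randn(8, 50, 3)            # 8 dummy windows, 50 steps each
    prob_near = torch.sigmoid(model(window))  # P(buried object within range)
    print(prob_near)
```

In a setup like this, the model would be trained on labeled sweeps through the sand, with windows marked by whether the probe was inside the detectable range of the cube.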
The researchers hope to use these findings to improve robotic touch – harnessing a natural kind of sensitivity for real-world situations such as excavation and search-and-rescue operations.
"The discovery opens possibilities for designing tools and assistive technologies that extend human tactile perception," said Zhengqi Chen, a researcher in the Advanced Robotics Lab at Queen Mary. "These insights could inform the development of advanced robots capable of delicate operations, for example locating archeological artefacts without damage, or exploring sandy or granular terrains such as Martian soil or ocean floors. More broadly, this research paves the way for touch-based systems that make hidden or hazardous exploration safer, smarter, and more effective."
While the study has its limitations – from the controlled lab design to the lack of mechanical analysis of the sand displacement at the moment participants "sensed" the nearby object – it opens the door to further investigation with larger populations and different media.
"What makes this research especially exciting is how the human and robotic studies informed each other," said Lorenzo Jamone, Associate Professor in Robotics & AI at UCL. "The human experiments guided the robot’s learning approach, and the robot’s performance provided new perspectives for interpreting the human data. It’s a great example of how psychology, robotics, and artificial intelligence can come together, showing that multidisciplinary collaboration can spark both fundamental discoveries and technological innovation."
The research was published in the proceedings of the IEEE International Conference on Development and Learning (ICDL).
Source: Queen Mary University of London