So, imagine you’re Tony Stark, operating your armored, high-tech exoskeleton to fly through the skies by using your helmet’s eye-tracking sensors to control your suit. And then, out of nowhere, the Mandarin blasts you with a force beam just before Fin Fang Foom tries punching your iron head through your iron rectum. And now with your eyes rolling inside your head like destabilized gyroscopes, do you keep flying straight, or do you shoot into the concrete like an unguided missile?
Well, it’s true that UC San Diego researchers Xiangjun Chen, Zhiyuan Lou, Xiaoxiang Gao, and Lu Yin probably didn’t have that exact scenario in mind while conducting the research for their Nature Sensors paper “A noise-tolerant human–machine interface based on deep learning-enhanced wearable sensors.” But the scientists definitely did want to develop gesture-based remote controls that could operate reliably despite the real-world jostling that is inevitable when human beings move, or when the users themselves have motor impairments.
With the help of AI “data-cleaning,” co-lead author Chen, a postdoctoral researcher in the Aiiso Yufeng Li Family Department of Chemical and Nano Engineering at the UC San Diego Jacobs School of Engineering, worked to remove “noisy sensor data in real time” so that his team’s device could allow “everyday gestures to reliably control machines even in highly dynamic environments.”
Through their collaboration with the UCSD labs of professors Sheng Xu and Joseph Wang, and support from the US Defense Advanced Research Projects Agency (DARPA), Chen’s team initially sought to improve military divers’ ability to control underwater robots. But eventually they realized that even landlubbers needed the remote-control version of image stabilization, especially for the quickly growing field of wearable technology that has, until now, been starved for bump-tolerant controls.
“This work establishes a new method for noise tolerance in wearable sensors,” said Chen. “It paves the way for next-generation wearable systems that are not only stretchable and wireless, but also capable of learning from complex environments and individual users.”
With its combination of motion and muscle sensors, a Bluetooth relay, and a stretchable battery, the armband-mounted electronic patch uses AI to eliminate the data noise from jostling and tremors, so that a shaky controller no longer means shaky control.
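For readers who want a concrete picture of what that patch hands to the AI, here is a minimal, hypothetical sketch of how windows of motion and muscle data streaming in over Bluetooth might be buffered before the denoising step. The channel counts, sample rate, and window length are illustrative assumptions, not the device’s actual specifications.

```python
# Hypothetical buffering of the patch's fused sensor stream (channel counts,
# sample rate, and layout are assumptions, not the device's actual protocol).
from collections import deque
import numpy as np

SAMPLE_RATE_HZ = 200           # assumed sampling rate
WINDOW_SEC = 1.0               # assumed decision-window length
IMU_AXES, EMG_CHANNELS = 6, 4  # assumed motion + muscle sensor layout

class SensorWindow:
    """Keeps the most recent second of fused motion (IMU) + muscle (EMG) samples."""
    def __init__(self):
        self.buf = deque(maxlen=int(SAMPLE_RATE_HZ * WINDOW_SEC))

    def push(self, imu_sample, emg_sample):
        # One fused frame: [ax, ay, az, gx, gy, gz, emg1..emg4]
        self.buf.append(np.concatenate([imu_sample, emg_sample]))

    def ready(self):
        return len(self.buf) == self.buf.maxlen

    def as_array(self):
        # Shape (channels, time), the layout a denoising model would expect.
        return np.stack(self.buf, axis=1)
```

A real implementation would fill push() from the patch’s Bluetooth packets and hand as_array() to the AI stage described next.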
Using its database of dynamic real-world conditions and human gestures typical of motion on land or at sea, the device analyzes arm signals with its own deep learning platform to eliminate false positives and deliver near-instantaneous control of mechanisms including robotic arms. Test subjects controlled those mechanisms while running or being subjected to shaking, jostling, and high-frequency vibrations, with ocean conditions simulated using the Scripps Ocean-Atmosphere Research Simulator at UC San Diego’s Scripps Institution of Oceanography. In all cases, the system delivered accurate, low-latency performance.
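The team’s exact network isn’t reproduced here, but the flow the article describes, clean up the fused arm signals and then classify the gesture that drives the robot, can be sketched roughly as follows. The layer choices, channel counts, and gesture vocabulary are assumptions made for illustration, not the authors’ published architecture.

```python
# Illustrative sketch (not the published architecture): a denoising stage
# followed by a gesture classifier over fused motion (IMU) + muscle (EMG) windows.
import torch
import torch.nn as nn

N_CHANNELS = 10   # assumed: 6 IMU axes + 4 EMG channels
WINDOW = 200      # assumed: samples per decision window
N_GESTURES = 6    # assumed gesture vocabulary size

class Denoiser(nn.Module):
    """1-D convolutional autoencoder that learns to strip motion artifacts."""
    def __init__(self):
        super().__init__()
        self.encode = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 16, kernel_size=7, padding=3), nn.ReLU())
        self.decode = nn.Conv1d(16, N_CHANNELS, kernel_size=7, padding=3)

    def forward(self, x):  # x: (batch, channels, time)
        return self.decode(self.encode(x))

class GestureClassifier(nn.Module):
    """Maps a cleaned sensor window to one of the gesture classes."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(N_CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1))
        self.head = nn.Linear(32, N_GESTURES)

    def forward(self, x):
        return self.head(self.features(x).squeeze(-1))

denoiser, classifier = Denoiser(), GestureClassifier()
window = torch.randn(1, N_CHANNELS, WINDOW)           # one noisy sensor window
gesture_id = classifier(denoiser(window)).argmax(-1)  # -> command sent to the robot
```

In practice, the denoising stage would be trained on recordings gathered under the kind of shaking, vibration, and simulated-ocean conditions described above, which is what lets the classifier stay accurate while the wearer is moving.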
If Chen and colleagues are correct, their new device is the first gesture-based wearable remote control to eliminate the obstacle of turbulence-created data noise, which means that such systems are now practical beyond pristine, motionless laboratory conditions and can work in the real world of humans who sometimes don’t, or can’t, stop moving.
That means future applications for the UC San Diego device could include assisting factory and emergency workers through hands-free remote control of robots, vehicles, and tools, even at high speeds or in dangerous conditions.
But the unit’s use goes far beyond conditions suited for action movies and disaster scenarios. For instance, patients undergoing rehabilitation or those with motor impairments could train the device’s model using their own natural gestures, without the need for fully restored fine motor control.
“This advancement,” says Chen, “brings us closer to intuitive and robust human-machine interfaces that can be deployed in daily life.”