MIT's EQ-Radio wirelessly monitors heartbeat and breathing to detect emotions
Facial expressions are often our first indication as to what other people are feeling, but it's far from an exact science. Now, researchers from MIT's Computer Science and Artificial Intelligence Laboratory (CSAIL) have developed the EQ-Radio, a device that uses RF signals to see through that poker face and determine someone's emotional state with high accuracy based on their heartbeat and breathing.
The study builds on previous work from lead researcher Dina Katabi, including a system that uses RF signals to track motion through walls and identify silhouettes of multiple people. This time, when those signals are beamed towards a person and bounce back to a receiver, the system measures minute changes in the patterns of their breathing and heartbeat.
With a margin of error of about 0.3 percent, the researchers claim the device can wirelessly measure heartbeats with the precision of a regular ECG monitor, but without the inconvenience of having electrodes placed on the skin. The EQ-Radio works by breaking down the returning signals into individual heartbeats, and by measuring the intervals between them and how those intervals change over time. In this way, the researchers say it can determine the subject's arousal and positive or negative affect.
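The interval analysis described above can be sketched in a few lines. This is an illustrative example only, not EQ-Radio's actual code: given the timestamps of detected heartbeats, it computes the inter-beat intervals and a common variability statistic (RMSSD) that captures how those intervals change over time. All names and the sample data are assumptions.

```python
def inter_beat_intervals(beat_times):
    """Return the intervals (seconds) between consecutive heartbeats."""
    return [t2 - t1 for t1, t2 in zip(beat_times, beat_times[1:])]

def interval_variability(ibis):
    """Root mean square of successive differences (RMSSD), a standard
    heart-rate-variability measure of beat-to-beat change."""
    diffs = [b - a for a, b in zip(ibis, ibis[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# Hypothetical beat timestamps, in seconds.
beats = [0.00, 0.82, 1.66, 2.47, 3.33]
ibis = inter_beat_intervals(beats)
print(ibis)                      # four intervals of roughly 0.8 s each
print(interval_variability(ibis))
```

A classifier would use statistics like these, rather than the raw heart rate, because emotional state shows up in the fine structure of beat-to-beat variation.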
That data is then run through algorithms that match the measurements against four preset emotional states: happy, excited, angry and sad. So, the system is more likely to tag a person it determines has low arousal and negative affect as "sad," while high arousal and positive affect fit the bill for "excited."
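The quadrant logic described above can be sketched as follows. This is a toy illustration of the arousal/affect mapping, not EQ-Radio's trained classifier; the thresholds, score ranges and function name are assumptions.

```python
def label_emotion(arousal, valence):
    """Map normalized arousal and valence (positive/negative affect)
    scores in [-1, 1] onto the article's four emotion labels."""
    if arousal >= 0:
        # High arousal: positive affect reads as excitement, negative as anger.
        return "excited" if valence >= 0 else "angry"
    # Low arousal: positive affect reads as happiness, negative as sadness.
    return "happy" if valence >= 0 else "sad"

print(label_emotion(-0.6, -0.7))  # low arousal, negative affect -> sad
print(label_emotion(0.8, 0.5))    # high arousal, positive affect -> excited
```

In the real system, the arousal and affect scores would themselves be inferred from heartbeat and breathing features rather than supplied directly.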
Image: Dina Katabi (center) demonstrates that although Fadel Adib's face (right) is neutral, the system can read his breathing and heart rate and determine that he is feeling sad
"Our work shows that wireless signals can capture information about human behavior that is not always visible to the naked eye," says Katabi. "We believe that our results could pave the way for future technologies that could help monitor and diagnose conditions like depression and anxiety."
The team trained the system to recognize these emotions by having test subjects watch videos or listen to music that evoked pleasure, sadness, anger or joy, along with an emotionless baseline. The result was a reported 87 percent success rate in identifying what a person was feeling, and even with subjects it had never measured before, the system still managed 70 percent accuracy.
"Just by generally knowing what human heartbeats look like in different emotional states, we can look at a random person's heartbeat and reliably detect their emotions," says Mingmin Zhao, one of the researchers.
To achieve this level of accuracy, the researchers had to find a way to filter out irrelevant data, such as the large chest movements of breathing, which can drown out the far subtler movements of the heartbeat. To do so, the team focused on the acceleration of the chest rather than the overall distance it moved.
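The intuition behind using acceleration can be shown with a minimal sketch. Differentiating a displacement signal twice amplifies each component in proportion to its frequency squared, so the fast, small heartbeat motion ends up dominating the slow, large breathing motion. The sample frequencies, amplitudes and finite-difference approach below are illustrative assumptions, not the paper's method.

```python
import math

def second_difference(signal, dt):
    """Approximate acceleration as the discrete second derivative."""
    return [(signal[i + 1] - 2 * signal[i] + signal[i - 1]) / (dt * dt)
            for i in range(1, len(signal) - 1)]

dt = 0.01  # 100 Hz sampling
t = [i * dt for i in range(500)]
# Slow, large breathing motion (0.25 Hz) plus a small, fast
# heartbeat-like component (10 Hz), in arbitrary units.
displacement = [5.0 * math.sin(2 * math.pi * 0.25 * ti)
                + 0.05 * math.sin(2 * math.pi * 10 * ti) for ti in t]
accel = second_difference(displacement, dt)
```

In displacement, breathing is 100 times larger than the heartbeat component; in acceleration, the 10 Hz component gains a factor of (10 / 0.25)² = 1600 relative to the 0.25 Hz one, so the heartbeat now dominates by roughly 16 to 1.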
The team says EQ-Radio could be used to study consumer behavior, allowing advertising or entertainment creators to analyze an audience's subconscious physical responses, or to power smart home devices that can tell when a user is getting agitated or upset and recommend a break. And given the precision with which the device reads the waveform of a heartbeat, it may also find a use in health monitoring equipment.
"By recovering measurements of the heart valves actually opening and closing at a millisecond time-scale, this system can literally detect if someone's heart skips a beat," says Fadel Adib, another of the researchers. "This opens up the possibility of learning more about conditions like arrhythmia, and potentially exploring other medical applications that we haven't even thought of yet."
The paper is available here and the team will present their work at the International Conference on Mobile Computing and Networking in October. The video below explains the project.