Images of ourselves recorded through the cameras on smartphones and laptops can be a welcome addition to communication with friends, a professional tool, or just a bit of fun. But this powerful combination of hardware and software is also being tapped by scientists for other purposes. A team of researchers at the University of Rochester has developed a computer program that can help health professionals monitor a person's mental health through selfie videos that the user records while engaging in social media activity.
The method is a variation of existing health monitoring programs. The novelty here is that the user’s behavior can be monitored quietly and unobtrusively while they routinely use their computer or smartphone. No extra information about how the user is feeling needs to be provided. No special accessories are required, either. The user just needs to go about their computer routine as usual.
During its experiments, the team, led by computer science professor Jiebo Luo, successfully measured a user's heart rate simply by monitoring small changes in the color of the user's forehead. Using modern computer vision and signal processing techniques, the system could also extract other visual signals from the video data, such as blink rate, pupil radius and rate of head movement.
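The heart-rate idea rests on the fact that skin color varies minutely with each pulse of blood. As an illustration only (the paper's actual pipeline is not described here), a minimal sketch would average the green channel over a forehead region in each frame, then pick the dominant frequency in the plausible cardiac band; the function name and synthetic data below are assumptions:

```python
import numpy as np

def estimate_heart_rate(green_means, fps):
    """Estimate heart rate (BPM) from per-frame mean green-channel
    intensity of a forehead region: find the dominant frequency in
    the plausible cardiac band (0.7-4 Hz, i.e. 42-240 BPM)."""
    signal = green_means - np.mean(green_means)     # remove DC offset
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)          # cardiac frequency band
    peak = freqs[band][np.argmax(power[band])]
    return peak * 60.0                              # Hz -> beats per minute

# Synthetic example: a 72 BPM pulse (1.2 Hz) buried in noise,
# sampled at a typical webcam rate of 30 fps for 10 seconds.
fps, secs = 30, 10
t = np.arange(fps * secs) / fps
trace = (0.5 * np.sin(2 * np.pi * 1.2 * t)
         + np.random.default_rng(0).normal(0, 0.2, t.size))
print(round(estimate_heart_rate(trace, fps)))       # prints 72
```

In a real system the per-frame color trace would come from a face detector locating the forehead, and the signal would be filtered against motion and lighting changes before the frequency analysis.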
In parallel, the system analyzes the user's posts on social media, what they read on their favorite networks, and even mechanical actions such as scrolling speed, typing and mouse clicking. These signals are not weighted equally: what the user tweets, for instance, counts for more than what they read, because it directly expresses the user's own thoughts and feelings.
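The unequal weighting described above can be pictured as a simple weighted average over per-channel sentiment scores. The channel names and weights below are purely illustrative assumptions, not values from the study:

```python
# Hypothetical weighting: content the user authors (posts) counts
# more toward the mood estimate than content they merely read or
# their mechanical interactions. Scores are sentiment in [-1, 1].
WEIGHTS = {"posted": 0.6, "read": 0.2, "interaction": 0.2}  # assumed values

def mood_score(signals):
    """Weighted average of per-channel sentiment scores."""
    total = sum(WEIGHTS[kind] * score for kind, score in signals.items())
    return total / sum(WEIGHTS[kind] for kind in signals)

# Strongly negative posts outweigh mildly positive reading habits.
print(mood_score({"posted": -0.8, "read": 0.1, "interaction": 0.0}))
```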
To calibrate and assess the system, the research team enrolled 27 participants in a test group and sent them messages loaded with negative and positive sentiment, gauging each participant's emotional reaction to that kind of material.
To further test the program's performance and accuracy, the researchers compared the output of the combined monitoring with the users' own reports about their feelings, which serve as the ground truth.
This cross-referencing of data allows the program to determine, from the gathered data alone, whether the user is feeling positive, neutral or negative, which are the only sentiments the program can distinguish at this stage of its development. However, the researchers hope to make it sensitive enough to capture more specific emotions such as sadness and anger.
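One simple way a continuous internal score could map onto the program's three output classes is with symmetric thresholds around zero; the threshold value here is an illustrative assumption, not taken from the paper:

```python
def classify(score, band=0.15):
    """Map a continuous mood score in [-1, 1] to the three classes
    the program outputs. The neutral band width is illustrative."""
    if score > band:
        return "positive"
    if score < -band:
        return "negative"
    return "neutral"

print(classify(0.5))    # prints positive
print(classify(-0.02))  # prints neutral
```

Extending the system to finer emotions such as sadness or anger would mean replacing this three-way mapping with a multi-class model trained on labeled examples of each emotion.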
No app has been created yet, but the team has plans to design one that would make users more aware of their emotional fluctuations and empower them to make their own adjustments.
The team has also taken into account the ethical concerns that monitoring a person's mental health can raise. The user would have to give the app permission to observe them continuously. The program is designed for the user alone and does not share data with anyone else unless the user chooses to. Nor does it collect personal data such as the user's location.
A paper outlining the project was presented at the 29th AAAI Conference on Artificial Intelligence in Austin, Texas, between January 25 and 30.
Source: University of Rochester