
Experimental autism treatment reads emotions using Google Glass

Clinical research coordinator Jessey Schwartz (left) watches as 9-year-old Alex and his mother, Donji Cullenbine (right), use a Google Glass-connected smartphone app
Steve Fisch

Children with autism spectrum disorder (ASD) often find it difficult to gauge other people's emotions based on their facial expressions – this can in turn lead to problems in communicating with those people. Scientists at Stanford University, however, are seeing new hope in an approach that utilizes Google Glass smart glasses.

Ordinarily, therapists use tools like flash cards of faces to teach children with ASD what different emotions look like. Unfortunately, this sort of training has to be delivered by a professional in a clinic, and skills learned from still images of faces on cards don't always transfer to real-life social interactions.

In the new Stanford study, children with autism wore Google Glass headsets that were wirelessly linked to a machine learning-based app on a nearby smartphone. That app analyzed the view from the glasses' forward-facing camera, gauging the facial expressions – and thus the emotions – of the people with whom the children were interacting. It then determined which of eight core facial expressions those people were showing, and told the child via the glasses' speaker and in-lens display. Those expressions represented happiness, sadness, anger, disgust, surprise, fear, neutrality or contempt.
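
To make that camera-to-feedback pipeline concrete, here is a minimal sketch of the loop described above. It is not the Stanford code; the classify_expression function and the random-score "model" below are placeholders standing in for a real trained classifier and the Glass hardware interfaces.

```python
# Minimal sketch of the feedback loop described above. This is NOT the
# Superpower Glass implementation; the model, frame source and output
# handling are hypothetical placeholders.

import random  # stands in for a real expression-recognition model

# The eight core expressions named in the article
EXPRESSIONS = [
    "happiness", "sadness", "anger", "disgust",
    "surprise", "fear", "neutral", "contempt",
]

def classify_expression(frame) -> str:
    """Placeholder: a real system would run a trained classifier
    on the face detected in the camera frame."""
    scores = {label: random.random() for label in EXPRESSIONS}
    return max(scores, key=scores.get)

def feedback_loop(camera_frames):
    """For each frame from the glasses' camera, name the detected
    expression so it can be spoken/shown to the child."""
    for frame in camera_frames:
        # A real app would route this label to the Glass speaker and display
        yield classify_expression(frame)

# Example: simulate three camera frames
for label in feedback_loop(range(3)):
    print("Detected expression:", label)
```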

The Stanford-designed app was trained on a database of hundreds of thousands of facial photos, in which people were exhibiting those eight expressions. German research group Fraunhofer has also developed a similar Google Glass app.
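
For readers curious what training such a classifier involves, the following is a generic sketch of supervised training for an eight-way expression classifier in PyTorch. The dataset layout ("faces/" folders), backbone choice and hyperparameters are illustrative assumptions, not details from the Stanford or Fraunhofer work.

```python
# Generic sketch of training an eight-class facial-expression classifier.
# Not the actual Stanford model or dataset; paths and settings are assumptions.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

NUM_EXPRESSIONS = 8  # happiness, sadness, anger, disgust, surprise, fear, neutral, contempt

# Assume face photos are stored in one folder per expression under "faces/"
data = datasets.ImageFolder(
    "faces/",
    transform=transforms.Compose([
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ]),
)
loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

# Fine-tune a standard image backbone with an eight-way output head
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.fc = nn.Linear(model.fc.in_features, NUM_EXPRESSIONS)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```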

Alex talks with his mother while wearing Google Glass
Steve Fisch

In a test of the Stanford system (known as Superpower Glass), 14 families had their children with ASD – aged 3 to 17 – use it at home for at least three 20-minute sessions per week, over an average course of 10 weeks. It could be used in "free play" mode, in which it simply identified other people's expressions, or in either of two game modes – in one of these, the child would try to guess the expression being shown by a parent, while in the other, the child would try to get the parent to display an emotion by describing the associated expression.

After the 10 weeks were up, 12 of the families reported that their children had begun making significantly more eye contact. Additionally, based on questionnaires completed by the parents before and after the treatment period, the children showed an average decrease of 7.38 points on the SRS-2 autism traits scale, meaning their symptoms had become less severe. In fact, six of the children moved down one step in their autism severity classification.

"Parents said things like 'A switch has been flipped; my child is looking at me.' Or 'Suddenly the teacher is telling me that my child is engaging in the classroom,'" says the study's senior author, Dr. Dennis Wall. "It was really heartwarming and super-encouraging for us to hear."

A paper on the research was published this week in the journal npj Digital Medicine, and a larger, randomized trial of the system is now underway.

Scientists at the University of Toronto have also developed a Google Glass-based autism app, one that coaches kids on what to say next in conversations.

Source: Stanford University
