Scientists pinpoint where in the brain we process facial expressions

The researchers were able to determine the exact region of the brain that works out the expression a person is conveying with their facial muscles

Recognizing facial expressions is something we do naturally, without any thought. However, whenever we smile or frown, or express any number of emotions with our faces, we move a large number of muscles in a complex manner. While we're not conscious of it, when we look at someone making a facial expression, a whole part of our brains is busy decoding the information conveyed by those muscles. Now, researchers at The Ohio State University have worked to pinpoint exactly where in the brain that processing occurs.

To do so, they turned to functional magnetic resonance imaging (fMRI), a technique that detects increased blood flow in the brain, indicating which regions have been activated.

Ten students were placed in an fMRI machine and shown more than 1,000 photographs of people making facial expressions, each of which corresponded to one of seven emotional categories. Throughout the test, all participants showed increased activity in the same region of the brain, known as the posterior superior temporal sulcus (pSTS). This confirms that this region, located behind the ear on the right side of the brain, is responsible for recognizing facial expressions, but the data allowed the team to dig a little deeper.

By cross-referencing the fMRI images with the muscle movements involved in each expression, the team was able to create a map of smaller regions within the pSTS that are activated by the movement of certain muscles. This data was used to build a machine learning algorithm that can identify the facial expression an individual is looking at based purely on the fMRI data.
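
To make that decoding step concrete, here is a minimal sketch of how per-trial pSTS voxel responses might be related to the muscle movements (facial action units) behind each expression using a simple linear model. Everything in it is illustrative: the array names, sizes, and the choice of ordinary least-squares regression are assumptions for the sketch, not the authors' actual pipeline.

```python
# Illustrative only: relate facial muscle movements (action units) to
# pSTS voxel responses with a linear model. All sizes are made up.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n_trials, n_units, n_voxels = 1000, 17, 500

# Which action units each photographed expression engages (binary codes),
# and the measured pSTS response on each trial.
action_units = rng.integers(0, 2, size=(n_trials, n_units)).astype(float)
voxel_responses = rng.standard_normal((n_trials, n_voxels))

# Fitting one linear model per voxel gives each voxel a coefficient per
# muscle movement -- a crude map of muscle-tuned sub-regions in the pSTS.
model = LinearRegression().fit(action_units, voxel_responses)
print(model.coef_.shape)  # (n_voxels, n_units): per-voxel muscle tuning
```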

They tested the system quite extensively, creating maps from the data of nine of the participants and then feeding the fMRI images from the 10th student to the algorithm, which was able to accurately identify the expression around 60 percent of the time. To confirm the results, the experiment was repeated, creating the maps from scratch and holding out a different student to view the expressions.
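
The procedure described above is essentially leave-one-subject-out cross-validation: train the decoder on nine participants, then score it on the held-out tenth. A minimal sketch with synthetic stand-in data (all names, sizes, and the logistic-regression classifier are hypothetical) could look like this:

```python
# Illustrative leave-one-subject-out evaluation on synthetic data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import LeaveOneGroupOut

rng = np.random.default_rng(1)
n_trials, n_voxels, n_subjects = 1000, 500, 10

X = rng.standard_normal((n_trials, n_voxels))          # pSTS responses per trial
y = rng.integers(0, 7, size=n_trials)                  # seven emotion categories
subjects = rng.integers(0, n_subjects, size=n_trials)  # who viewed each trial

scores = []
for train_idx, test_idx in LeaveOneGroupOut().split(X, y, groups=subjects):
    clf = LogisticRegression(max_iter=1000).fit(X[train_idx], y[train_idx])
    scores.append(clf.score(X[test_idx], y[test_idx]))

print(f"Mean held-out accuracy: {np.mean(scores):.2f}")
```

On real data, the roughly 60 percent figure quoted above would come out of exactly this kind of held-out score, well above the one-in-seven chance level of about 14 percent.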

The study essentially confirms that the mechanisms we use to identify facial expressions are very similar from one person to the next.

"Thiswork could have a variety of applications, helping us not onlyunderstand how the brain processes facial expressions, but ultimatelyhow this process may differ in people with autism for example,"said study co-author Julie Golomb.

This new work isn't the only recent breakthrough related to how we view faces. Earlier this month, University of Montreal researchers announced the results of a study confirming that Alzheimer's patients lose the ability to view faces holistically, which is thought to be the means by which we recognize loved ones.

Full details of the Ohio State research are published online in the Journal of Neuroscience.

Source: Ohio State University
