When it comes to human-machine interfaces, it would certainly help if computers could get a sense of what sort of people they were dealing with, so they could tailor their responses accordingly. Well, computers in the future may be able to do so, simply by watching how users move their eyes.

In a new study led by the University of South Australia, 42 test subjects were fitted with wearable eye-tracking systems – these recorded the subjects' eye movements as they performed "everyday tasks" around a university campus. All 42 people also completed questionnaires, which are commonly used to indicate dominant personality traits.

The scientists then cross-referenced the two sets of data, using machine-learning algorithms to determine whether a propensity toward specific eye movements went hand-in-hand with specific personality traits. They discovered that it did: of the so-called "Big Five" personality traits, certain eye movements could be matched to four – neuroticism, extroversion, agreeableness, and conscientiousness – with openness being the only exception.
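To get a feel for what "cross-referencing" two such datasets involves, here is a minimal sketch in Python. It uses a simple Pearson correlation in place of the study's actual machine-learning classifiers, and every feature name and number is invented for illustration – the idea is just to show how per-subject eye-movement summaries can be checked against questionnaire trait scores.

```python
# Simplified stand-in for the study's analysis: correlate per-subject
# eye-movement features with questionnaire trait scores. All names and
# values below are hypothetical; the actual study trained ML models on
# real eye-tracking recordings.

from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: one value per subject.
features = {  # eye-movement summaries from the wearable tracker
    "blink_rate":        [12, 18, 9, 20, 15, 11],
    "fixation_duration": [310, 250, 400, 230, 280, 350],
}
traits = {  # questionnaire scores (e.g. on a 1-5 scale)
    "neuroticism":       [2.1, 3.8, 1.5, 4.0, 3.0, 2.0],
    "conscientiousness": [4.0, 2.5, 4.5, 2.2, 3.1, 4.2],
}

# Cross-reference: which eye-movement feature tracks which trait?
for fname, fvals in features.items():
    for tname, tvals in traits.items():
        r = pearson(fvals, tvals)
        print(f"{fname} vs {tname}: r = {r:+.2f}")
```

A strong positive or negative correlation between a feature and a trait would be the kind of "going hand-in-hand" the researchers were probing for, though their classifiers could also pick up patterns spanning many features at once.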

"People are always looking for improved, personalized services," says team member Dr. Tobias Loetscher. "However, today's robots and computers are not socially aware, so they cannot adapt to non-verbal cues. This research provides opportunities to develop robots and computers so that they can become more natural, and better at interpreting human social signals."

Also taking part in the project were scientists from the University of Stuttgart, Flinders University, and the Max Planck Institute for Informatics. Their findings are described in a paper that was recently published in the journal Frontiers in Human Neuroscience.

A previous study found that eye movements can also indicate whether people are lying.