
AI learns from brain signals to create personalized attractive faces

A selection of the facial images used in the study (Image: University of Helsinki)

Although certain celebrities are widely considered to be nice-looking, beauty still ultimately lies in the eye of the beholder. A new AI-based system is able to ascertain which facial features individual people find most attractive, and then create faces combining those qualities.

Led by Assoc. Prof. Tuukka Ruotsalo, scientists from the universities of Helsinki and Copenhagen started by getting a generative adversarial network (GAN) to produce hundreds of lifelike computer-generated portraits.
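For readers curious how such a pipeline fits together, a minimal Python sketch of that first step is included below. It assumes a PyTorch-style pretrained generator; the checkpoint name, latent size and batch count are illustrative assumptions, not details from the study.

    # Minimal sketch, not the authors' code: sample random latent vectors and
    # decode them with a hypothetical pretrained GAN generator to obtain a
    # batch of synthetic portraits.
    import torch

    latent_dim = 512                                         # assumed latent size
    generator = torch.load("pretrained_face_generator.pt")   # hypothetical checkpoint
    generator.eval()

    with torch.no_grad():
        z = torch.randn(240, latent_dim)   # 240 random latent codes
        faces = generator(z)               # tensor of 240 generated face images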

Those facial images were then shown on a computer screen, one at a time, to a total of 30 test subjects. Each person was instructed to focus more attention on the faces they found most attractive, while the electrical activity of their brain was recorded using electroencephalography (EEG).

Machine learning algorithms subsequently determined which faces elicited the strongest brain response in each person, then established which traits those faces had in common. Based on that data, the neural network proceeded to produce new faces combining those traits.
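The closed loop could look roughly like the sketch below. This illustrates the general idea rather than the published implementation: random placeholder arrays stand in for the recorded EEG features, the calibration labels and the latent codes of the faces that were shown.

    # Hedged sketch: classify per-trial EEG responses as strong vs weak, then
    # average the latent codes of the faces that drew a strong response. The
    # resulting mean vector is what the generator would decode into a new,
    # personalized face. All names, shapes and data are illustrative.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    n_trials, n_eeg_features, latent_dim = 240, 64, 512

    eeg_features = np.random.randn(n_trials, n_eeg_features)  # per-trial EEG features
    labels = np.random.randint(0, 2, size=n_trials)           # 1 = strong response (calibration)
    latents = np.random.randn(n_trials, latent_dim)           # latent code of each shown face

    # Map EEG features to an attractive / not-attractive brain response.
    clf = LogisticRegression(max_iter=1000).fit(eeg_features, labels)
    preferred = clf.predict(eeg_features).astype(bool)

    # Average the latent codes of the preferred faces; decoding this vector with
    # the pretrained generator yields the personalized composite face.
    z_personal = latents[preferred].mean(axis=0)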

In a double-blind experiment, those new faces were then shown to each participant, along with images of many other faces. Eighty-seven percent of the time, the individual selected the new faces as being among the most attractive, a figure that is expected to rise as the technology is developed further.

It is hoped that the team's findings could ultimately be used to help computer systems understand subjective preferences, and perhaps also to identify people's unconscious attitudes.

"The study demonstrates that we are capable of generating images that match personal preference by connecting an artificial neural network to brain responses," says senior researcher Michiel Spapé. "Computer vision has thus far been very successful at categorizing images based on objective patterns. By bringing in brain responses to the mix, we show it is possible to detect and generate images based on psychological properties, like personal taste."

A paper on the research was recently published in the journal IEEE Transactions on Affective Computing.

Source: University of Helsinki

8 comments
DOC HOLLYWOOD
Not seeing any black faces in your thumbnail pic. (not surprised) So in other words...another opportunity for the (mostly) white programmers to code their anti-black racial biases into systems that will eventually determine various levels of "worthiness and desirability". Yep... the future looks a lot like the present.
Reece Agland
What was the racial mix of both participants and photo choices? Many so-called AI advances are biased due to input and selection bias.
guzmanchinky
Look no further than Alla Bruletova or Nata Lee or dozens of other Russian Instagram models to find the most perfect faces ever. Someday we will have robots that look like this and we will never leave the house and humans will die out...
EJ222
This would make for great adtech. Imagine internet ads grabbing your attention with attractive AI generated faces, and learning your tastes based on engagement.
Marco McClean
They look like people in screaming-in-Spanish soap operas. They're all competing psychopathic gaslighting villains; except I like the one with the asymmetrically-set eyes, middle, bottom row. She's the smart one, trying to make sense of the situation and/or just get out. The three-hour hair and makeup on all of them is a big turn-off, though if you imagine they're all chewing a big wad of gum and pretending not to for the photo, notice how much nicer to be around they seem. If it were my study, I'd mention the possibility of gum to the wired-up test subjects and see if that changes everything, because it would. Also the smell of all the products.
Jeff7
Clearly some of you haven’t had much experience with market research. Firstly, the pics are labelled ‘a selection’. Secondly, when you are showing pics like this to people for research purposes, the pics chosen are carefully curated based on the respondents’ age, sex, race and other criteria. That bunch of people illustrates nothing except a collection of computer-generated faces. Do you think those researchers did the whole study and NO ONE noticed there weren’t any people of colour? Looking for bias that didn’t exist.
ArdisLille
Thanks Doc Hollywood. (I think this is all pretty stupid.)
sidmehta
Soon we'll have Hollywood movies using computer-made actors that will look real to us.