
AI "painting" approach packs more emotion than pixelation

Prof. Steve DiPaola with examples of a pixelated facial image (left) and one that's been anonymized using the AI painting technique
Simon Fraser University

If you really want to sense the emotion in what someone is saying, it helps if you can see their facial expressions along with hearing their words. Doing so is impossible, however, when news programs pixelate or black out the faces of anonymous interviewees. Scientists have now developed a workaround that uses artificial intelligence (AI) to "paint" those people's faces instead.

The system was developed by professors Steve DiPaola and Kate Hennessy from Canada's Simon Fraser University, working with assistant professor Taylor Owen from the University of British Columbia.

It starts by distorting the video image of a person's face, altering the facial proportions to make the person less recognizable. A human operator performs this initial distortion, after which the AI adds a second layer of random distortions; that randomization makes it impossible for anyone to reverse-engineer the process and see what the person originally looked like.
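The researchers haven't published their code, but the two-stage idea described above can be sketched in rough form: an operator-chosen change to the facial proportions, followed by random jitter that has no fixed inverse. Everything in the sketch below (the landmark format, the scale factors, the jitter range) is an illustrative assumption, not the actual AIpaint360 implementation.

```python
import random

def distort_proportions(landmarks, manual_scale, jitter=0.08, seed=None):
    """Two-stage anonymization sketch: apply an operator-chosen scaling to
    facial landmark coordinates, then add random jitter so the transform
    has no fixed inverse a viewer could reverse-engineer.

    landmarks: list of (x, y) points normalized to the 0..1 range.
    manual_scale: per-axis scale factors chosen by the operator,
                  e.g. {"x": 1.15, "y": 0.90}.
    jitter: maximum random offset added to each coordinate.
    """
    rng = random.Random(seed)
    distorted = []
    for x, y in landmarks:
        # Stage 1: deterministic, operator-chosen proportion change.
        x, y = x * manual_scale["x"], y * manual_scale["y"]
        # Stage 2: random offsets; a different seed yields a different face,
        # so the original geometry cannot be recovered from the output.
        distorted.append((x + rng.uniform(-jitter, jitter),
                          y + rng.uniform(-jitter, jitter)))
    return distorted

# Example: widen the face slightly, shorten it, then randomize.
face = [(0.30, 0.40), (0.70, 0.40), (0.50, 0.65), (0.50, 0.85)]
print(distort_proportions(face, {"x": 1.15, "y": 0.90}))
```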

Based on techniques used by artists such as Picasso and van Gogh, the system boosts anonymization while simultaneously enhancing the subject's underlying facial expressions
AIpaint360

Next, the AI applies its painting process to the image. Based on techniques used by artists such as Picasso and van Gogh, this boosts the anonymization while simultaneously enhancing the subject's underlying facial expressions. The AI also takes the tone of the subject's voice into account, both in determining how to depict the face and in choosing elements such as colors, which help convey emotion.
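The article only says that voice tone influences how the face is painted and which colors are chosen; as a purely hypothetical illustration of that kind of mapping, tone scores could be turned into a base color and brush-stroke weight roughly as follows. The valence/arousal inputs and the color formula are assumptions for the sketch, not the researchers' model.

```python
def palette_from_tone(valence, arousal):
    """Illustrative mapping from voice-tone scores to a painting palette.

    valence: -1.0 (negative) .. 1.0 (positive)
    arousal:  0.0 (calm)     .. 1.0 (agitated)
    Returns an (R, G, B) base color and a brush-stroke intensity in 0..1.
    """
    # Warmer hues for positive valence, cooler hues for negative.
    red = int(127 + 128 * max(valence, 0))
    blue = int(127 + 128 * max(-valence, 0))
    green = max(0, min(255, int(90 + 60 * valence)))
    # Higher arousal -> heavier, more agitated stroke texture.
    stroke_intensity = 0.3 + 0.7 * arousal
    return (red, green, blue), stroke_intensity

# A calm, upbeat speaker vs. a distressed one.
print(palette_from_tone(valence=0.6, arousal=0.2))   # warm color, light strokes
print(palette_from_tone(valence=-0.7, arousal=0.9))  # cool color, heavy strokes
```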

"When artists paint a portrait, they try to convey the subject's outer and inner resemblance," says DiPaola. "With our AI, which learns from more than 1,000 years of artistic technique, we have taught the system to lower the outer resemblance and keep as high as possible the subject's inner resemblance – in other words, what they are conveying and how they are feeling."

Plans now call for the technology to be tested with a partnering journalistic institution. The system could also have applications in 360-degree virtual reality.

You can see the system in use in the following video.

Sources: Simon Fraser University, AIpaint360

1 comment
f8lee
It's a cute and clever idea, but for one thing, its ability to anonymize is related to race and perhaps gender - for example, we can certainly tell that the blond white person is not Maxine Waters.
But the more obvious flaw IMHO is that once a system is used to distort a face (albeit today with the best of intentions) how would a viewer know for certain that the sadness or fear (or whatever emotion) being portrayed just wasn't added in post-production? To use an extreme example, if a psychopath blandly describes how she beheaded and disemboweled the dozen victims that have been found, but a future version of this "tool" allows the lawyers to make that statement look as though the subject were truly sorry, how would a viewer know?
So on the one hand, it cannot anonymize completely (again, the blond white chick ain't Michelle Obama) and yet at the same time it could easily be used to "create" apparent emotions that were not originally there, making it highly suspect to anyone watching.