
Deepfake movie remakes prompt the creation of false memories

A new study has found that deepfake movie 'remakes' prompted the creation of false memories in viewers

A new study has found that unofficial movie remakes made using deepfake technology prompted about half of viewers to falsely remember the films and, in some cases, to rate them as better than the originals. But the researchers also found that false memories could be created by less technical means.

Deepfakes use AI to create convincing images, audio and video, usually switching out one person’s face or voice for another’s. The technology could be considered the 21st-century version of Photoshop.

The technology can bring dead loved ones back to life, let you take a selfie with surrealist painter Salvador Dalí, or literally put words into someone else’s mouth. Yes, deepfakes can be used for good: consider the recent campaign to end malaria, which used an AI-generated David Beckham to deliver its message in nine languages. But, at the other end of the spectrum, we’ve seen a rise in non-consensual deepfake celebrity porn, and there are concerns that deepfakes could eventually lead to a society where nothing – and no one – can be trusted.

Deepfake technology has advanced and become more accessible, making deepfakes easier to create and increasingly common. But how convincing are they, really? A new study by researchers at University College Cork, Ireland, and Lero, the Science Foundation Ireland Research Centre for Software, set out to answer that question by examining whether deepfakes can distort our memories and beliefs.

The researchers recruited 436 participants with an average age of 25. Just over a third of participants (35%) had completed an undergraduate or postgraduate degree. The participants were asked to complete an online survey where they were shown clips of real and deepfake movie remakes and asked to give their opinion.

The deepfakes included ‘remakes’ of The Shining, The Matrix, Indiana Jones, and Captain Marvel. In the case of The Shining, for example, Brad Pitt and Angelina Jolie were cast as characters originally played by Jack Nicholson and Shelley Duvall. In The Matrix, participants were shown Will Smith as Neo, the role played by Keanu Reeves. The four real movie remakes were Charlie and the Chocolate Factory, Total Recall, Carrie, and Tomb Raider.

In some cases, participants were given text descriptions of the remakes instead of watching the deepfakes. For example, “In 2012, Brad Pitt & Angelina Jolie starred in a remake of The Shining. The real-life couple played Jack & Wendy Torrance in the Stephen King horror film.” Participants were not told that the deepfakes were false until later in the survey.

The researchers found that the participants readily formed false memories of the deepfake remakes, with an average of 49% believing each remake was real. Captain Marvel was most frequently falsely recalled (73%), followed by Indiana Jones (43%), The Matrix (42%) and The Shining (40%). Many also reported that the deepfake was better than the original: 41% in the case of Captain Marvel, 13% for Indiana Jones, 12% for The Matrix and 9% for The Shining.

Interestingly, false memory rates from the text descriptions were similarly high, suggesting that deepfake technology may not be more powerful than other tools at distorting memory.

“While deepfakes are of deep concern for many reasons, such as non-consensual pornography and bullying, the current study suggests they are not uniquely powerful at distorting our memories of the past,” said the researchers. “Though deepfakes caused people to form false memories at quite high rates, we achieved the same effects using simple text. In essence, this study shows we don’t need technical advances to distort memory, we can do it very easily and effectively using non-technical means.”

Nevertheless, the researchers say their findings represent an important step in establishing the baseline risks of memory distortion as a consequence of exposure to deepfakes and could inform future design and regulation of deepfake technology in the film industry.

They say that further research is needed to understand user perspectives on the inclusion of deepfakes in other areas, such as education, marketing and gaming.

The study was published in PLOS One.

Source: PLOS via EurekAlert!
