‘An average of 49% of the participants were fooled.’ Scientists Say They Can Use Deepfakes To Implant False Memories
by Trisha Leigh
The idea of deepfakes, and the darker possibilities of what the technology can do, is enough to make anyone uncomfortable.
The revelation that scientists believe deepfakes can even be used to implant false memories, then, should feel more than a little unsettling.
That is the claim in this paper, which details how clips of made-up movies tricked participants into believing the films were real.
Subjects rated the fake movies and came to believe things like Chris Pratt starring in Raiders of the Lost Ark – and that his version was better than the original.
All were presented as if they were real and participants were asked if they had seen it and to rate how it compared to the original.
— Gillian Murphy (@gillysmurf) July 13, 2023
Gillian Murphy, a misinformation researcher at University College Cork and the study's lead author, offered one caveat in an interview with The Daily Beast.
“However, deepfakes were no more effective than simple text descriptions at distorting memory. We shouldn’t jump to predictions of dystopian futures based on our fears around emerging technologies. Yes there are very real harms posed by deep fakes, but we should always gather evidence for those harms in the first instance, before rushing to solve problems we’ve just assumed might exist.”
In the study, researchers showed deepfaked video clips to 436 participants. Most featured big-name stars headlining movies they had never actually appeared in.
An average of 49% of the participants were fooled, suggesting the fakes were quite convincing. As mentioned above, many participants even claimed the remakes with different actors and actresses were “better” than the originals.
“Our findings are not especially concerning, as they don’t suggest any uniquely powerful threat of deepfakes over and above existing forms of misinformation.”
Researchers like Murphy say we can combat these effects by becoming literate enough about technology to distinguish fake media from real, but that will only get harder as AI evolves.
Hopefully the technology to combat deepfakes evolves as well, because human evolution takes just a bit more time to catch up.