TwistedSifter

New Study Finds That The Best AI-Generated Deep Fakes Are Nearly Impossible For People, Or Even Advanced Detection Tools, To Spot

Deep fake creation concept

Shutterstock

Artificial intelligence is improving rapidly and, depending on which estimates you trust, could end up disrupting just about every job on the planet. AI, however, is not just one thing, but rather many different applications built on a broad area of technology. As with any major technological advancement, it brings many great benefits, but also some potential dangers that need to be mitigated.

When it comes to AI, one of the most serious concerns is its ability to generate high-quality images and videos. Sure, there are plenty of AI images and videos out there that are laughably bad, but the technology has gotten much better in a very short amount of time. In fact, if you have spent any amount of time on YouTube, TikTok, Instagram Reels, Facebook, X, or any other popular social media site, you have definitely come across AI-generated content.

If you don’t think you have, you are not only wrong, but at great risk of being deceived by the many uses of AI. Sure, some AI videos are just for entertainment, and they can be quite good. Other videos (or images), however, are used either to scam someone out of money, or even to discredit someone and cause political problems. One of the most dangerous forms of AI has been called the “deep fake.”

Deep fakes are videos in which someone either generates an entirely fake video of another person, or edits a real video with AI to make that person appear to say or do something they never actually did. If you’ve seen viral videos of politicians like Donald Trump or Barack Obama dancing, fist fighting, or singing a funny song, those are examples of (relatively harmless) deep fakes. What happens, however, when someone makes a deep fake of, say, President Trump saying something he never said in order to discredit him? (Put aside political opinions for now; this is just an example. It could happen to literally anyone.)


Up until recently, deep fakes were good enough to fool many people, but there were some ways to distinguish them from reality. If a deep fake was made about someone important and it was causing serious problems, they could come out and prove that it was indeed AI generated. One of the ways this was done was by having other computer algorithms analyze the video to look for things like a pulse at various points in the body.

While all people have a pulse, AI videos don’t. Or rather, didn’t.
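The pulse-based detection described above works roughly like remote photoplethysmography (rPPG): a real face's skin color shifts very slightly with each heartbeat, and averaging the color of a face region across video frames and looking for a periodic signal can reveal that pulse. The following is a minimal sketch of the idea in Python with numpy; the "video" here is simulated per-frame green-channel brightness with a faint 72 BPM pulse baked in, not a real detector or the study's actual method.

```python
import numpy as np

# Hedged illustration of rPPG-style pulse detection. A real detector would
# track a face region in actual video frames; here we simulate the per-frame
# average green-channel brightness of a face, with a faint 1.2 Hz (72 BPM)
# color oscillation buried in noise.
fps = 30
seconds = 20
t = np.arange(fps * seconds) / fps
pulse_hz = 1.2  # 72 beats per minute
rng = np.random.default_rng(0)
frames_green_mean = (
    120.0                                  # baseline skin brightness
    + 0.5 * np.sin(2 * np.pi * pulse_hz * t)  # tiny pulse-driven color change
    + rng.normal(0, 0.2, t.size)           # sensor/compression noise
)

# Remove the DC offset, then find the dominant frequency in the
# physiologically plausible heart-rate band (~42-240 BPM).
centered = frames_green_mean - frames_green_mean.mean()
spectrum = np.abs(np.fft.rfft(centered))
freqs = np.fft.rfftfreq(centered.size, d=1 / fps)
band = (freqs >= 0.7) & (freqs <= 4.0)
peak_hz = freqs[band][np.argmax(spectrum[band])]
print(f"Estimated pulse: {peak_hz * 60:.0f} BPM")
```

A video with no such periodic color signal shows no clear peak in that band, which is exactly the tell older deep fakes gave away and, per the study, newer ones no longer do.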

According to a new study published in Frontiers in Imaging, AI technology is reaching the point where even 100% fake people in fake videos will have an almost imperceptible pulse beating in their neck, wrists, and other areas. Peter Eisert, a professor at Humboldt University of Berlin, discussed this in a statement, saying:

“Here we show for the first time that recent high-quality deepfake videos can feature a realistic heartbeat and minute changes in the color of the face, which makes them much harder to detect.”

On the one hand, having technology that has become so realistic that even computers can’t tell what is real and fake can be good. The CGI in movies, for example, will have far more potential than ever before. For scammers, political disrupters, and many others, however, this opens the door to a lot of problems.

To make matters worse, this ability isn’t just going to be something that billion-dollar movie studios have access to. Even free AI image and video creation tools will have this ability sooner than most people would believe. That is because AI tools ‘learn’ from real information. When an AI video creation tool ‘watches’ a real video of a real person, it learns all the tiny things that make a video look real and incorporates them into the videos it generates. Eisert explained:

“Our results show that a realistic heartbeat may be added by an attacker on purpose, but can also be ‘inherited’ inadvertently from the driving genuine video. Small variations in skin tone of the real person get transferred to the deepfake together with facial motion, so that the original pulse is replicated in the fake video.”


As of today, there are still ways to detect deep fakes. It isn’t as simple as looking for a hand with 12 fingers like in older fake videos though.

Instead, experts will need access to state-of-the-art deep fake detectors to watch for the slightest inconsistencies in things like blood flow patterns under the skin. And of course, even that will likely become impossible within a very short time, because AI is advancing so quickly.
