Deepfake Detection Software Analyzes a Person’s Blood Flow

Intel has shown off a new program, called FakeCatcher, that detects deepfakes by analyzing a person's blood flow. A BBC journalist visited Intel to find out how it works and whether it is accurate.

Deepfakes are increasingly a problem; inauthentic videos and misinformation have the potential to corrupt societal trust.

“There’s this potential for video to deceive people,” says Matt Groh, assistant professor at MIT. “There is also the example of a deepfake of President Zelensky that appeared to show him admitting defeat to the Russians, about a year and a couple of months ago.”

Even experts can have a hard time telling whether a video is a deepfake or not. But Intel claims to have solved this problem by building software that analyzes blood under the skin.

“If you only try to find the wrong things, they can be fixed and then you can no longer find the wrong things,” explains Ilke Demir, a senior research scientist at Intel.

“So we twist that question and we ask what is real about authentic videos, what is real about us? So FakeCatcher looks at that question in the sense of looking at your heart.”

Demir explains that when a person’s heart pumps blood, it subtly changes the color of that person’s veins. That color change can be measured with a technique called photoplethysmography (PPG).

“We take those PPG signals from many places on your face and convert them into PPG maps and then we develop a deep learning approach on top of that to classify into fake or real videos,” she says.
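The idea Demir describes can be sketched in miniature: sample a raw PPG-style signal from a skin region in each video frame, then look for a coherent heartbeat rhythm in it. The toy example below uses only NumPy and synthetic frames; the region of interest, frame rate, and signal model are illustrative assumptions, and FakeCatcher's actual classifier is a deep network operating on PPG maps, not a simple frequency peak.

```python
import numpy as np

def ppg_signal(frames, roi):
    """Naive PPG estimate: mean green-channel intensity over a skin
    region in each frame. Real pipelines are far more sophisticated;
    this only illustrates the underlying idea."""
    y0, y1, x0, x1 = roi
    return np.array([f[y0:y1, x0:x1, 1].mean() for f in frames])

# Simulate 10 seconds of 30 fps video of a "face": a faint periodic
# brightness change at ~1.2 Hz (~72 bpm) riding on sensor noise.
fps, seconds = 30, 10
t = np.arange(fps * seconds) / fps
pulse = 0.5 * np.sin(2 * np.pi * 1.2 * t)  # heartbeat component
rng = np.random.default_rng(0)
frames = [
    np.clip(120 + p + rng.normal(0, 0.2, (64, 64, 3)), 0, 255)
    for p in pulse
]

# Extract the signal from a central "cheek" region and remove the mean.
signal = ppg_signal(frames, roi=(16, 48, 16, 48))
signal -= signal.mean()

# A real face shows a spectral peak near the heart rate; a deepfake
# with no coherent blood-flow signal would not.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), d=1 / fps)
peak_hz = freqs[spectrum.argmax()]
print(f"dominant frequency: {peak_hz:.2f} Hz (~{peak_hz * 60:.0f} bpm)")
```

Averaging over many pixels is what makes the tiny per-pixel color change detectable: the heartbeat component is correlated across the region while sensor noise averages out.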

In short, FakeCatcher looks for minuscule signs of blood flow in your face, something a deepfake wouldn’t have.

FakeCatcher also looks for other clues, such as a person’s eyes. Demir explains that real humans look at a specific point, whereas deepfakes often have “googly eyes.”

Intel says FakeCatcher is 96 percent accurate and works on many types of deepfake. But when the BBC journalist tested the software, it did not perform as well.

FakeCatcher was shown deepfake videos and real videos of President Biden and Donald Trump. It correctly marked the deepfake videos as inauthentic, but it also flagged the genuine videos as deepfakes.

The model does not yet analyze audio, a capability Intel hopes will improve its accuracy.


Image credits: Header photo licensed via Depositphotos.
