11.06.2025 06:35

Deepfakes Become Even Harder to Detect: Now They Have Heartbeats

In the ever-evolving landscape of artificial intelligence, deepfake technology has taken a significant leap forward, raising new challenges for detection systems.

Recent research from Humboldt University of Berlin has revealed that modern deepfakes can now replicate realistic heartbeat patterns, rendering one of the most reliable detection methods, remote photoplethysmography (rPPG), nearly obsolete. This advancement marks a critical turning point in the ongoing battle between deepfake creators and those working to unmask them.


The Science of Heartbeat Detection

When your heart beats, blood is pushed through the vessels in your face, and the changing blood volume subtly alters how light is absorbed and reflected by the skin, producing slight shifts in its color.

These minute variations, imperceptible to the naked eye, can be detected using rPPG, a technique originally developed for medical purposes, such as measuring vital signs via webcams.

By analyzing these changes, rPPG-based detectors can estimate heart rate with remarkable accuracy, often within two to three beats per minute when compared to electrocardiogram (ECG) readings.
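To make the technique concrete, here is a minimal rPPG sketch in Python (using NumPy and SciPy) that estimates heart rate from a short clip. It assumes the facial region has already been cropped from every frame; face detection, which a real detector would need, is omitted, and the function name and parameters are illustrative rather than taken from the study.

    import numpy as np
    from scipy.signal import butter, filtfilt

    def estimate_heart_rate(face_frames, fps):
        """Rough rPPG heart-rate estimate from pre-cropped face frames.

        face_frames: NumPy array of shape (n_frames, height, width, 3), RGB.
        fps: frame rate of the clip in frames per second.
        Returns the estimated heart rate in beats per minute.
        """
        # Spatially average the green channel per frame; it is the most
        # sensitive to blood-volume changes in the skin.
        signal = face_frames[:, :, :, 1].mean(axis=(1, 2)).astype(np.float64)
        signal -= signal.mean()  # remove the constant skin-tone baseline

        # Band-pass filter to the plausible heart-rate range (about 42-240 bpm).
        low, high = 0.7, 4.0  # Hz
        b, a = butter(3, [low / (fps / 2), high / (fps / 2)], btype="band")
        filtered = filtfilt(b, a, signal)

        # The dominant frequency in that band is the pulse rate.
        spectrum = np.abs(np.fft.rfft(filtered))
        freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
        band = (freqs >= low) & (freqs <= high)
        return float(freqs[band][np.argmax(spectrum[band])] * 60.0)

A 10-second clip at 30 frames per second yields only about 300 samples, which matches the short window of facial video the Humboldt detector is described as analyzing.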

In genuine videos, blood flow distributes across the face with a slight temporal delay, creating a unique physiological signature that has been a cornerstone of deepfake detection.

Until recently, it was widely assumed that deepfakes could not replicate these subtle signals, making rPPG a robust tool for distinguishing real videos from synthetic ones. However, this assumption no longer holds.


A New Era of Deepfakes

A groundbreaking study, published in Frontiers in Imaging on April 30, 2025, by researchers from Humboldt University and the Fraunhofer Heinrich-Hertz-Institute, demonstrates that high-quality deepfakes can now inadvertently "inherit" heartbeat patterns from their source videos.

Led by Professor Peter Eisert, the research team developed a state-of-the-art rPPG-based detector that analyzes just 10 seconds of facial video to extract pulse signals. When tested on genuine videos, the detector performed flawlessly, accurately identifying heart rates.

Surprisingly, when the same detector was applied to deepfakes created using recent face-swapping techniques, it detected realistic pulse signals where none should exist. These signals were not deliberately programmed but were transferred from the original "driving" videos used to generate the deepfakes.

As Eisert explains, “Small variations in skin tone of the real person get transferred to the deepfake together with facial motion, so that the original pulse is replicated in the fake video.” This unintended preservation of physiological cues makes these deepfakes nearly indistinguishable from authentic footage using current detection methods.
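The inherited-heartbeat effect can be illustrated with a simple comparison: if the pulse signal extracted from the deepfake closely tracks the pulse signal extracted from the driving video, the physiological signature has been carried over. The sketch below only assumes that both pulse signals have already been extracted (for example with an rPPG routine like the one above); it is an illustration, not the authors' evaluation code.

    import numpy as np

    def pulse_transfer_score(pulse_driving, pulse_deepfake):
        """Pearson correlation between two pulse signals of equal length.

        A value close to 1.0 suggests the deepfake has inherited the
        heartbeat of the driving video; a value near 0 suggests no shared
        physiological signal.
        """
        a = (pulse_driving - pulse_driving.mean()) / pulse_driving.std()
        b = (pulse_deepfake - pulse_deepfake.mean()) / pulse_deepfake.std()
        return float(np.dot(a, b) / len(a))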


Implications and Challenges

The discovery has profound implications for cybersecurity, media authenticity, and public trust. Deepfakes, which manipulate videos to alter identities or create false narratives, have already been used for malicious purposes, such as spreading misinformation, creating non-consensual content, or framing individuals.

The ability to mimic heartbeats exacerbates these risks, as it undermines a key biometric marker previously considered a reliable indicator of authenticity.

The study highlights that while deepfakes can now replicate global pulse signals across the face, they still fall short in mimicking the spatially and temporally nuanced patterns of blood flow.

In genuine videos, blood flow varies across different facial regions (cheeks, forehead, and chin) in a physiologically consistent manner.

Current deepfake algorithms, however, distribute these signals uniformly, lacking the anatomical precision of real human physiology. This weakness offers a glimmer of hope for detection systems.


The Path Forward

The researchers propose that next-generation deepfake detectors should shift focus from global heart rate detection to analyzing localized blood flow patterns. “Our experiments have shown that current deepfakes may show a realistic heartbeat but do not show physiologically realistic variations in blood flow across space and time within the face,” Eisert notes.

By exploiting this limitation, new detection systems could regain the upper hand in the technological arms race.
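One way to act on this, sketched below as a rough heuristic rather than the researchers' actual method, is to extract a pulse signal from several facial regions separately and measure how uniform they are: real faces show physiologically consistent but region-dependent blood flow, while current deepfakes tend to produce an implausibly uniform signal everywhere. The grid size and interpretation here are assumptions for illustration.

    import numpy as np

    def regional_pulse_uniformity(face_frames, grid=(3, 3)):
        """Mean pairwise correlation between pulse signals from facial regions.

        face_frames: NumPy array of shape (n_frames, height, width, 3), RGB.
        Values very close to 1.0 (near-identical signals in every region)
        lack the spatial variation of real blood flow and would be treated
        as suspicious; in practice each regional signal would also be
        band-pass filtered to the heart-rate range first.
        """
        n_frames, h, w, _ = face_frames.shape
        rows, cols = grid
        signals = []
        for r in range(rows):
            for c in range(cols):
                patch = face_frames[:, r * h // rows:(r + 1) * h // rows,
                                    c * w // cols:(c + 1) * w // cols, 1]
                s = patch.mean(axis=(1, 2)).astype(np.float64)
                signals.append((s - s.mean()) / (s.std() + 1e-8))
        corr = np.corrcoef(np.stack(signals))     # pairwise correlations
        upper = corr[np.triu_indices(len(signals), k=1)]
        return float(upper.mean())

A threshold on this kind of score could then separate real footage, with its structured region-to-region variation, from current deepfakes with their uniformly high agreement, at least until generators learn to fake spatial blood-flow patterns as well.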

This development underscores the need for adaptive, multi-layered detection strategies. Beyond physiological signals, researchers are exploring other methods, such as tracking pixel brightness changes, analyzing eye-blinking patterns, or implementing digital watermarks to verify authenticity.

Companies like Adobe and Google are also investing in complementary technologies, such as content provenance standards and invisible watermarking, to combat the growing sophistication of synthetic media.


Conclusion

The rise of deepfakes with realistic heartbeats signals a new chapter in the challenge of combating synthetic media. As AI continues to blur the line between real and fake, the stakes for developing robust detection tools have never been higher.

The findings from Humboldt University serve as both a warning and a call to action, urging researchers to innovate and stay ahead of malicious actors.

In an era where seeing is no longer believing, the focus must shift to uncovering the subtle, often invisible, markers that separate truth from deception.

Sources: Frontiers in Imaging, BBC Science Focus Magazine, Humboldt University of Berlin

