New research shows a 704% increase in deepfake "face swap" attacks from the first to the second half of 2023.
A report from biometric firm iProov warns that "face-swapping" fraudsters are increasingly using off-the-shelf tools to create manipulated images and videos.
iProov's analysts are tracking more than 100 face swap apps and repositories, a sign of how wide the selection of low-cost, easily accessible generative AI tools has become. These tools can create deepfakes convincing enough to fool not only humans but also some remote identity verification solutions that rely on a "liveness" test.
A "liveness" test will typically ask an individual to look into a webcam, perhaps turning their head from side to side, in order to prove that they are both a real person and to compare their appearance to identity documents.
According to the report, the face swap tools most commonly used by malicious actors are SwapFace, DeepFaceLive, and Swapstream.
Google Trends shows a steady increase in searches for the tools in the last year.
The face-swapping software can create a highly convincing synthetic video, which is fed to a virtual camera that mimics a genuine webcam. This tricks a remote identity verification system into accepting the subject as "live" and trusting their identity.
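One naive countermeasure is to check whether the capture device itself looks like a software-backed virtual camera rather than physical hardware. The sketch below illustrates the idea with a simple name-matching heuristic; the device names and the hint list are illustrative assumptions only, device enumeration is platform-specific (so the sketch just takes a list of reported names), and production verification systems rely on far stronger signals than this.

```python
# Illustrative heuristic: flag capture devices whose reported names match
# well-known virtual-camera software. The hint list is illustrative, not
# exhaustive, and name checks alone are easy for a determined attacker to evade.

VIRTUAL_CAMERA_HINTS = (
    "obs virtual camera",
    "manycam",
    "snap camera",
    "virtual cam",
)

def suspicious_devices(device_names: list[str]) -> list[str]:
    """Return device names that look like virtual (software-backed) cameras."""
    return [
        name for name in device_names
        if any(hint in name.lower() for hint in VIRTUAL_CAMERA_HINTS)
    ]

if __name__ == "__main__":
    reported = ["Integrated Webcam", "OBS Virtual Camera"]
    print(suspicious_devices(reported))  # ['OBS Virtual Camera']
```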
Most face swap tools offer a free tier, allowing users to experiment with the technology at no cost. This has made the technology more attractive to malicious actors.
As identity fraudsters increasingly adopt deepfake technology, an "arms race" will develop: security firms will battle to detect synthetic media, while the bad guys attempt to evade detection.
Editor’s Note: The opinions expressed in this guest author article are solely those of the contributor and do not necessarily reflect those of Tripwire.