Free Republic

Real-Time Deepfakes Can Be Beaten by a Sideways Glance
The Register ^ | Mon 8 Aug 2022 | Brandon Vigliarolo

Posted on 08/08/2022 4:38:06 PM PDT by nickcarraway

For now at least, until data catches up

Real-time deepfake videos, heralded as the bringers of a new age of internet uncertainty, appear to have a fundamental flaw: They can't handle side profiles.

That's the conclusion drawn in a report [PDF] from Metaphysic.ai, which specializes in 3D avatars, deepfake technology and rendering 3D images from 2D photographs. In tests it conducted using popular real-time deepfake app DeepFaceLive, a hard turn to the side made it readily apparent that the person on screen wasn't who they appeared to be.

Multiple models were used in the test - several from deepfake communities and models included in DeepFaceLive - but a 90-degree view of the face caused flickering and distortion as the Facial Alignment Network used to estimate poses struggled to figure out what it was seeing.

A pair of images from Metaphysic's tests showing a deepfaked Jim Carrey, and the result of turning to the side.

"Most 2D-based facial alignments algorithms assign only 50-60 percent of the number of landmarks from a front-on face view to a profile view," said Metaphysic.ai contributor Martin Anderson, who wrote the study's blog post.

Without being able to see enough reference points, the software simply doesn't know how to project its fake face.
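That landmark starvation is easy to see in numbers. Below is a minimal, purely illustrative sketch (not Metaphysic's or DeepFaceLive's code) that scores how far a head is turned from a set of 2D landmarks in the common 68-point layout; the index constants and the 0.75 cutoff are assumptions chosen for illustration.

```python
# Illustrative sketch only: score head turn from 2D landmarks in the common
# 68-point layout by comparing nose-to-jaw distances on each side of the face.
# Near a 90-degree profile, one side collapses and most of its landmarks stop
# carrying useful information -- roughly the starvation described above.
import numpy as np

NOSE_TIP = 30   # nose tip index in the 68-point convention
JAW_LEFT = 0    # leftmost jaw point
JAW_RIGHT = 16  # rightmost jaw point

def profile_ratio(landmarks: np.ndarray) -> float:
    """Return an asymmetry score in [0, 1]: ~0 frontal, ~1 near-profile."""
    nose_x = landmarks[NOSE_TIP, 0]
    left = abs(nose_x - landmarks[JAW_LEFT, 0])
    right = abs(landmarks[JAW_RIGHT, 0] - nose_x)
    wide, narrow = max(left, right), min(left, right)
    return 1.0 if wide == 0 else 1.0 - narrow / wide

def looks_like_profile(landmarks: np.ndarray, threshold: float = 0.75) -> bool:
    """Flag frames where one half of the face dominates the landmark layout."""
    return profile_ratio(landmarks) >= threshold

# Toy check: a symmetric (frontal) layout scores 0.0 and is not flagged.
frontal = np.zeros((68, 2))
frontal[JAW_LEFT] = [-1.0, 0.0]
frontal[JAW_RIGHT] = [1.0, 0.0]
frontal[NOSE_TIP] = [0.0, 0.5]
print(profile_ratio(frontal), looks_like_profile(frontal))  # 0.0 False
```

Feed it landmarks from an actual detector (dlib, MediaPipe, or the FAN model the report refers to) and the score climbs toward 1 as the head turns, which is roughly the point at which the swapped face starts to flicker and break up.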

Derailing deepfakes

In a matter of just a few years, deepfakes have advanced from being able to superimpose faces onto images, to doing the same in pre-recorded video. The latest advances allow real-time face swapping, which has resulted in more deepfakes being used in online fraud and cybercrime.

A study from VMware found that two-thirds of respondents had encountered malicious deepfakes as part of an attack, a 13 percent increase over the previous year. Note that the VMware study didn't specify whether the deepfake attacks respondents encountered were prerecorded or real-time, and its sample size was only 125 people.

The FBI warned in June of scammers using deepfake technology during remote job interviews. Those using the technique have been spotted interviewing for sensitive jobs that would give them access to customer data and businesses' proprietary information, the FBI said.

Deepfake videos have also been used to trick live facial recognition software, according to online fraud-combatting startup Sensity AI. Sensity's tests found that nine out of ten vendors' apps were successfully unlocked using a deepfake-altered video streamed from a mobile phone.

Fears over the technology have become serious enough for the European Union to pass laws levying fines on companies that fail to sufficiently fight deepfakes and other sources of disinformation. China also drafted deepfake laws that threaten legal punishment for misuse of the technology, as well as requiring a grant of permission for any legitimate use of deepfakes, which China calls "deep synthesis."

A workaround for how long?

According to Metaphysic's report, even technology like Nvidia's neural radiance field (NeRF), which can generate a 3D scene from only a few still images, suffers from limitations that make it tricky to develop a good side profile view.

NeRFs "can, in theory, extrapolate any number of facial angles from just a handful of pictures. [However] issues around resolution, facial mobility and temporal stability hinder NeRF from producing the rich data needed to train an autoencoder model that can handle profile images well," Anderson wrote. We've reached out to Nvidia to learn more, but haven't heard back yet.

Readers will note that Metaphysic's demonstrations only included celebrity faces, of which plenty of profile views have been captured on film and in photos. The non-famous among us, on the other hand, are unlikely to have many side profile shots on hand.

"Unless you've been arrested at some point, it's likely that you don't have even one such image, either on social media or in an offline collection," Anderson wrote.

Gaurav Oberoi, a software engineer and founder of AI startup Lexion, found much the same when researching deepfakes in 2018. In a post on his blog, Oberoi detailed how deepfakes of comedian John Oliver superimposed over late-night host Jimmy Fallon worked well, but not in profile.

"In general, training images of your target need to approximate the orientation, facial expressions, and lighting in the videos you want to stick them into," Oberoi said. "So if you’re building a face swap tool for the average person, given that the most photos of them will be front-facing, limit face swaps to mostly forward facing videos."

What that means, in effect, is that scammers using real-time deepfakes are unlikely to have the data necessary to create a side profile view that isn't immediately recognizable as fake (provided they're not using a well-photographed celebrity face).
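Oberoi's rule of thumb translates into a data-curation step: estimate head pose for each candidate training frame and keep only those that are roughly frontal. Here is a hedged sketch of one common way to do that with OpenCV's solvePnP; the six-landmark input format, the generic 3D face model, and the 30-degree cutoff are assumptions for illustration, not anything taken from the article or from DeepFaceLive.

```python
# Rough head-pose filter for face-swap training data (illustrative only).
# Given six 2D landmarks per frame (nose tip, chin, outer eye corners, mouth
# corners), estimate yaw with solvePnP against a generic 3D face model and
# drop frames turned too far from frontal.
import numpy as np
import cv2

# Generic 3D reference points (arbitrary units), as commonly used in head-pose
# examples; ordered: nose, chin, left eye, right eye, left mouth, right mouth.
MODEL_POINTS = np.array([
    (0.0, 0.0, 0.0),
    (0.0, -330.0, -65.0),
    (-225.0, 170.0, -135.0),
    (225.0, 170.0, -135.0),
    (-150.0, -150.0, -125.0),
    (150.0, -150.0, -125.0),
], dtype=np.float64)

def estimate_yaw_degrees(image_points, width, height):
    """Rough yaw estimate; `image_points` is a (6, 2) array matching MODEL_POINTS."""
    focal = width  # crude focal-length guess, adequate for a coarse filter
    camera_matrix = np.array([[focal, 0, width / 2],
                              [0, focal, height / 2],
                              [0, 0, 1]], dtype=np.float64)
    dist_coeffs = np.zeros((4, 1))  # assume negligible lens distortion
    ok, rvec, _ = cv2.solvePnP(MODEL_POINTS,
                               np.asarray(image_points, dtype=np.float64),
                               camera_matrix, dist_coeffs)
    if not ok:
        return None
    rot, _ = cv2.Rodrigues(rvec)
    # Rotation about the vertical axis, i.e. how far the head is turned.
    return float(np.degrees(np.arctan2(-rot[2, 0],
                                       np.sqrt(rot[0, 0] ** 2 + rot[1, 0] ** 2))))

def keep_mostly_frontal(frames, max_abs_yaw=30.0):
    """Keep (frame, image_points, width, height) tuples whose yaw is near frontal."""
    kept = []
    for frame, pts, w, h in frames:
        yaw = estimate_yaw_degrees(pts, w, h)
        if yaw is not None and abs(yaw) <= max_abs_yaw:
            kept.append(frame)
    return kept
```

The landmark detection itself is left out of the sketch; the point is simply that a front-facing training set is cheap to enforce, and it is exactly that bias that leaves the resulting model with nothing useful to render when the head turns to a profile.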

Until we know deepfakers have found a way to get around this shortcoming, it's a good idea to adopt the policy of asking the person on the other end of Zoom to show you a side view of their face - famous or not. ®


TOPICS: Business/Economy; Computers/Internet; TV/Movies
KEYWORDS: deepfake; fauxtography

1 posted on 08/08/2022 4:38:06 PM PDT by nickcarraway

To: nickcarraway

Anyone got a link to that recent video of strong sounding yet unblinking Biden?


2 posted on 08/08/2022 4:43:59 PM PDT by MNDude (Once you remove "they would never" from your vocabulary, it all begins to make sense)

