Free Republic

AI Images Exposed: Researchers Reveal Simple Method To Detect Deepfakes
Scitech Daily ^ | August 01, 2024 | Royal Astronomical Society

Posted on 08/01/2024 12:34:45 PM PDT by Red Badger

In this image, the person on the left is real, while the person on the right is AI-generated. Their eyeballs are depicted underneath their faces. The reflections in the eyeballs are consistent for the real person, but incorrect (from a physics point of view) for the fake person. Credit: Adejumoke Owolabi

====================================================================================

By using astronomical methods to analyze eye reflections, researchers can potentially detect deepfake images, though the technique carries some risk of inaccuracy.

In an era when anyone can create artificial intelligence (AI) images, the ability to detect fake pictures, particularly deepfakes of people, is becoming increasingly important. Now, scientists say the eyes may be the key to distinguishing deepfakes from real images.

Detecting Deepfakes Through Eyeball Analysis

New research presented at the Royal Astronomical Society’s National Astronomy Meeting indicates that deepfakes can be identified by analyzing the reflections in human eyes, similar to how astronomers study pictures of galaxies. The study, led by University of Hull MSc student Adejumoke Owolabi, focuses on the consistency of light reflections in each eyeball. Discrepancies in these reflections often indicate a fake image.

Deepfake Eyes

A series of deepfake eyes showing inconsistent reflections in each eye. Credit: Adejumoke Owolabi

=============================================================================================

Astronomical Techniques in Deepfake Detection

“The reflections in the eyeballs are consistent for the real person, but incorrect (from a physics point of view) for the fake person,” said Kevin Pimbblet, professor of astrophysics and director of the Centre of Excellence for Data Science, Artificial Intelligence and Modelling at the University of Hull.

Researchers analyzed reflections of light on the eyeballs of people in real and AI-generated images. They then employed methods typically used in astronomy to quantify the reflections and checked for consistency between left and right eyeball reflections.
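The study's actual pipeline is not published in this article, but the idea of checking left-right reflection consistency can be sketched in a few lines. In this toy version (all names and thresholds are illustrative, not from the study), the specular highlights in each grayscale eye cutout are thresholded into a binary mask, and the two masks are compared with intersection-over-union: values near 1 mean the reflections agree, low values flag a possible fake.

```python
import numpy as np

def reflection_mask(eye_patch, thresh=0.9):
    """Binary mask of specular highlights: the brightest pixels in a
    grayscale eye cutout (illustrative stand-in for reflection detection)."""
    patch = np.asarray(eye_patch, dtype=float)
    return patch >= thresh * patch.max()

def consistency(left, right):
    """Intersection-over-union of the two highlight masks.
    ~1.0 = reflections agree; low values suggest a fake."""
    a, b = reflection_mask(left), reflection_mask(right)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 1.0

# Two toy 8x8 "eyes" with a highlight at the same position agree perfectly.
left = np.zeros((8, 8)); left[2, 3] = 1.0
right = np.zeros((8, 8)); right[2, 3] = 1.0
print(consistency(left, right))  # -> 1.0
```

Real images would of course need eye detection and alignment first; this only illustrates the comparison step.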

Eyes from Real Images

A series of real eyes showing largely consistent reflections in both eyes. Credit: Adejumoke Owolabi

==========================================================================================

Measuring Inconsistencies and Implications

Fake images often lack consistency in the reflections between each eye, whereas real images generally show the same reflections in both eyes.

“To measure the shapes of galaxies, we analyze whether they’re centrally compact, whether they’re symmetric, and how smooth they are. We analyze the light distribution,” said Pimbblet. “We detect the reflections in an automated way and run their morphological features through the CAS [concentration, asymmetry, smoothness] and Gini indices to compare similarity between left and right eyeballs.

“The findings show that deepfakes have some differences between the pair.”

The Gini coefficient is normally used to measure how the light in a galaxy image is distributed among its pixels. It is computed by ordering the image's pixels in ascending order of flux and comparing the result to what a perfectly even flux distribution would produce. A Gini value of 0 corresponds to a galaxy whose light is spread evenly across all of the image's pixels, while a value of 1 corresponds to a galaxy with all of its light concentrated in a single pixel.
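The standard galaxy-morphology form of this statistic (as used in astronomy; the study's exact implementation is not given here) can be computed directly from the sorted pixel fluxes:

```python
import numpy as np

def gini(pixels):
    """Gini coefficient of a flux distribution (galaxy-morphology form).
    0 -> flux spread evenly over all pixels; 1 -> all flux in one pixel."""
    flux = np.sort(np.abs(np.asarray(pixels, dtype=float).ravel()))
    n = flux.size
    i = np.arange(1, n + 1)  # ranks of the ascending-sorted pixels
    return np.sum((2 * i - n - 1) * flux) / (flux.mean() * n * (n - 1))

uniform = np.ones(100)                 # perfectly even "image"
point = np.zeros(100); point[0] = 1.0  # all light in a single pixel
print(round(gini(uniform), 3))  # -> 0.0
print(round(gini(point), 3))    # -> 1.0
```

Applied to the pixels of an eye-reflection cutout instead of a galaxy, the same number summarizes how concentrated the reflected light is, which is what makes a left-versus-right comparison possible.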

The team also tested CAS parameters, a tool astronomers originally developed to measure the light distribution of galaxies and so determine their morphology, but found it was not a successful predictor of fake eyes.
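For reference, the "A" of CAS (asymmetry) is typically measured by rotating a cutout 180 degrees about its centre and comparing it with the original. A minimal sketch, omitting the background-correction and centering terms a real measurement would include:

```python
import numpy as np

def asymmetry(image):
    """Simplified CAS asymmetry: sum of |original - 180-degree rotation|,
    normalized by the total flux. 0 means perfectly rotationally symmetric."""
    img = np.asarray(image, dtype=float)
    rotated = np.rot90(img, 2)  # rotate 180 degrees
    return np.abs(img - rotated).sum() / np.abs(img).sum()

symmetric = np.array([[1, 2, 1],
                      [2, 5, 2],
                      [1, 2, 1]], dtype=float)
print(asymmetry(symmetric))  # -> 0.0
```

Concentration and smoothness are defined analogously from radial flux profiles and a smoothed copy of the image; per the study, none of these turned out to separate real from fake eyes as well as the Gini index did.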

“It’s important to note that this is not a silver bullet for detecting fake images,” Pimbblet added. “There are false positives and false negatives; it’s not going to get everything. But this method provides us with a basis, a plan of attack, in the arms race to detect deepfakes.”


TOPICS: Arts/Photography; Computers/Internet; Education; Military/Veterans
KEYWORDS: deepfake

1 posted on 08/01/2024 12:34:45 PM PDT by Red Badger
[ Post Reply | Private Reply | View Replies]

To: Red Badger

Interesting…is this how AI determined that all of the Apollo moon landing photos were faked?


2 posted on 08/01/2024 12:40:58 PM PDT by Jan_Sobieski (Sanctification)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger

Of course the next level of AI will be programmed to correct the appearance of light reflecting off eyes.


3 posted on 08/01/2024 12:46:46 PM PDT by Bubba_Leroy ( Dementia Joe is Not My President)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger
Give it a few more iterations.

If it's detectable it can be programmed in.

4 posted on 08/01/2024 12:50:36 PM PDT by Salman (It's not a slippery slope if it was part of the program all along. )
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger

Great! Now use it on the “Biden” encounters as of late.


5 posted on 08/01/2024 12:51:24 PM PDT by politicket
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger

The Coming bio-markers.

Revelation of John.


6 posted on 08/01/2024 12:57:37 PM PDT by Varsity Flight ( "War by 🙏 the prophesies set before you." I Timothy 1:18. Nazarite warriors. 10.5.6.5 These Days)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger
The skin on the woman in the left photo is just too perfect. Her image has such smooth, beautiful skin that it was no doubt digitally retouched. The AI image on the right has rough skin -- I have to hand it to the AI to generate imperfect skin.

I've used "PortraitPro" software on family photos. It's easy to use and does amazing quality work. Here are two sample portraits from the Anthropics web site showing before and after shots. Can you tell which is which?


7 posted on 08/01/2024 1:01:21 PM PDT by ProtectOurFreedom (“When exposing a crime is treated like a crime, you are being ruled by criminals” – Edward Snowden)
[ Post Reply | Private Reply | To 1 | View Replies]

To: ProtectOurFreedom

I can’t see your pictures.

Our security software won’t allow it...............


8 posted on 08/01/2024 1:03:19 PM PDT by Red Badger (Homeless veterans camp in the streets while illegals are put up in 5 Star hotels....................)
[ Post Reply | Private Reply | To 7 | View Replies]

To: Bubba_Leroy

bingo


9 posted on 08/01/2024 1:03:37 PM PDT by BigFreakinToad (Remember the Biden Kitchen Fire of 2004)
[ Post Reply | Private Reply | To 3 | View Replies]

To: Red Badger

ran across a blog where AI-generated pictures were called F’art


10 posted on 08/01/2024 1:04:30 PM PDT by BigFreakinToad (Remember the Biden Kitchen Fire of 2004)
[ Post Reply | Private Reply | To 1 | View Replies]

To: BigFreakinToad

11 posted on 08/01/2024 1:20:41 PM PDT by Red Badger (Homeless veterans camp in the streets while illegals are put up in 5 Star hotels....................)
[ Post Reply | Private Reply | To 10 | View Replies]

To: Red Badger
"Our security software won’t allow it..."

Interesting. I used catbox.moe. Here are the same images hosted at imgbb.com. Can you see these?


12 posted on 08/01/2024 1:29:56 PM PDT by ProtectOurFreedom (“When exposing a crime is treated like a crime, you are being ruled by criminals” – Edward Snowden)
[ Post Reply | Private Reply | To 8 | View Replies]

To: Red Badger

Until the next patch, at least.


13 posted on 08/01/2024 3:50:56 PM PDT by eclecticEel ("The petty man forsakes what lies within his power and longs for what lies with Heaven." - Xunzi)
[ Post Reply | Private Reply | To 1 | View Replies]

To: Red Badger

The “wax” look.


14 posted on 08/01/2024 3:51:22 PM PDT by Zathras
[ Post Reply | Private Reply | To 1 | View Replies]

To: ProtectOurFreedom
Can you see these?

FWIW I can't see any of the four images over a fast food restaurant's wifi.

15 posted on 08/01/2024 5:20:45 PM PDT by TChad
[ Post Reply | Private Reply | To 12 | View Replies]

To: TChad
FWIW I can't see any of the four images over a fast food restaurant's wifi.

Different wifi, same computer, all images are now visible.

16 posted on 08/01/2024 6:44:36 PM PDT by TChad
[ Post Reply | Private Reply | To 15 | View Replies]

To: TChad

Weird. Red Badger reported the same problem with his work security software and images hosted on catbox.moe. I reposted in #12 with images hosted on imgbb.com.


17 posted on 08/01/2024 6:52:46 PM PDT by ProtectOurFreedom (“When exposing a crime is treated like a crime, you are being ruled by criminals” – Edward Snowden)
[ Post Reply | Private Reply | To 16 | View Replies]

