Want to spot a deepfake? The eyes could be a giveaway

A technique from astronomy could reveal reflection differences in AI-generated people’s eyeballs

Reflections in the eyeballs of these images reveal that the one on the right is AI-generated, while the image on the left is a real photo (of the actress Scarlett Johansson).

Adejumoke Owolabi

Clues to deepfakes may be in the eyes.

Researchers at the University of Hull in England reported July 15 that eye reflections offer a potential way to suss out AI-generated images of people. The approach relies on a technique also used by astronomers to study galaxies.

In real images, light reflections in the eyeballs match up, showing, for instance, the same number of windows or ceiling lights. But in fake images, there’s often an inconsistency in the reflections. “The physics is incorrect,” says Kevin Pimbblet, an observational astronomer who worked on the research with then–graduate student Adejumoke Owolabi and presented the findings at the Royal Astronomical Society’s National Astronomy Meeting in Hull. 

To carry out the comparisons, the team first used a computer program to detect the reflections and then used those reflections’ pixel values, which represent the intensity of light at a given pixel, to calculate what’s called the Gini index. Astronomers use the Gini index, originally developed to measure wealth inequality in a society, to understand how light is distributed across an image of a galaxy. If one pixel has all the light, the index is 1; if the light is evenly distributed across pixels, the index is 0. This quantification helps astronomers classify galaxies into categories such as spiral or elliptical.
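
To make the calculation concrete, here is a minimal Python sketch of the Gini index as it is commonly computed over sorted pixel intensities; the function name and the use of NumPy are illustrative choices, not details taken from the study.

```python
import numpy as np

def gini(pixel_values):
    """Gini index of a patch of pixel intensities.

    Returns 0 when light is spread evenly across the pixels
    and approaches 1 when a single pixel holds all the light.
    """
    x = np.sort(np.abs(np.asarray(pixel_values, dtype=float).ravel()))
    n = x.size
    if n < 2 or x.sum() == 0:
        return 0.0
    ranks = np.arange(1, n + 1)  # 1-based ranks of the sorted pixels
    return np.sum((2 * ranks - n - 1) * x) / (x.mean() * n * (n - 1))
```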

In the current work, the difference in the Gini indices between the left and right eyeballs is the clue to the image’s authenticity. For about 70 percent of the fake images the researchers examined, this difference was much larger than it was for real images, where there tended to be little or no difference.
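
A comparison in that spirit might look like the sketch below, which flags an image for a closer look when the two eyes’ Gini indices differ by more than some cutoff. The gini function is the one sketched above, and the threshold value is purely illustrative; the researchers describe the difference as an indicator, not a fixed decision rule.

```python
# Illustrative cutoff only; the study reports that fake images tend to show
# a much larger left-right difference, not a specific threshold value.
SUSPICION_THRESHOLD = 0.1

def flag_for_review(left_eye_pixels, right_eye_pixels,
                    threshold=SUSPICION_THRESHOLD):
    """Return True if the reflections in the two eyes distribute light
    differently enough that a human should take a closer look."""
    return abs(gini(left_eye_pixels) - gini(right_eye_pixels)) > threshold
```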

Each of these pairs of eyes (left) has reflections (highlighted at right) that reveal them as deepfakes.

Adejumoke Owolabi

“We can’t say that a particular value corresponds to fakery, but we can say it’s indicative of there being an issue, and perhaps a human being should have a closer look,” Pimbblet says.

He emphasizes that the technique, which could also work on videos, is no silver bullet for detecting fakery (SN: 8/14/18). A real image can look like a fake, for example, if the person is blinking or if they are so close to the light source that only one eye shows the reflection. But the technique could be a part of a battery of tests — at least until AI learns to get reflections right.

About Ananya

Ananya is a freelance science writer, journalist and translator, with a research background in robotics. She covers all things algorithms, robots, animals, oceans, urban spaces and the people involved in these fields.