Take, for example, a photo of Catherine, Princess of Wales, issued by Kensington Palace in March. News organizations retracted it after experts noted some obvious manipulations. And some questioned whether images captured during the assassination attempt on former president Donald Trump were genuine.
Here are a few things experts suggest the next time you come across an image that leaves you wondering.
Zoom in
It might sound basic, but a study by researcher Sophie Nightingale at Lancaster University in Britain found that, across age groups, the people who took the time to zoom into photos and carefully scrutinize different parts were better at spotting altered images.
Try it the next time you get a weird feeling about a photo. Just be sure not to focus on the wrong things. To help, we’ve created this (slightly exaggerated) sample image to highlight some common signs of image manipulation.
Rather than focusing on things like shadows and lighting, Nightingale suggested looking at “photometric” clues like blurring around the edges of objects that might suggest they’ve been added later; noticeable pixelation in some parts of an image but not others; and differences in coloration.
Consider this parrot: For one, who brings a parrot to a polling location?
And take a closer look at its wings; the blurred edges of its leading feathers contrast with the round cutouts closer to its body. This is clearly an amateurish Photoshop job.
Search for funky geometry
Fine details are among the hardest things to seamlessly edit in an image, so they get flubbed frequently. This is often easy to spot when regular, repeating patterns are disrupted or distorted.
In the image below, note how the shapes of the bricks in the wall behind the divider are warped and squished. Something fishy happened here.
Consider the now-infamous photo of Princess Catherine.
The princess appeared with her arms draped around two of her children. Online sleuths were quick to point out inconsistencies, including floor tiles that appear to overlap and a bit of molding that appears misaligned.
In our polling place example, did you catch that this person had an extra finger? Sure, it’s possible they have a condition like polydactyly, in which people are born with extra fingers or toes. That’s a bit unlikely though, so if you spot things like extra digits, it could be a sign that AI was used to alter the image.
It’s not just bad Photoshopping that screws up fine touches. AI is notoriously iffy when it comes to manipulating detailed images.
So far, that’s been especially true of structures like the human hand — though it’s getting better at them. Still, it’s not uncommon for images generated by, or edited with, AI to show the wrong number of fingers.
Consider the context
One way to determine the authenticity of an image is to take a step back and consider what’s around it. The context an image is placed in can tell you a lot about the intent behind sharing it. Consider the social media post that we created below for our altered image.
Ask yourself: Do you know anything about the person who shared the photo? Is it attached to a post that seems meant to spark an emotional reaction? What does the caption, if any, say?
Some doctored images, or even genuine images placed in a context that differs from reality, are meant to appeal to our “intuitive, gut thinking,” says Peter Adams, senior vice president of research and design at the News Literacy Project, a nonprofit that promotes critical media evaluation. These edits can artificially engender support or elicit sympathy for specific causes.
Nightingale recommends asking yourself a few questions when you spot an image that gets a rise out of you: “Why might somebody have posted this? Is there any ulterior motive that might suggest this could be a fake?”
In many cases, Adams adds, comments or replies attached to the photo can reveal a fake for what it is.
Here’s one real-life example pulled from X. An AI-generated image of Trump flanked by six young Black men first appeared in October 2023 but reappeared in January, attached to a post stating that the former president had stopped his motorcade to meet the men in an impromptu meet-and-greet.
But it didn’t take long for commenters to point out inconsistencies, like the fact that Trump appeared to have only three large fingers on his right hand.
Go to the source
In some cases, genuine images come from out of the blue in a way that leaves us wondering if they really happened. Finding the source of those images can help shed crucial light.
Earlier this year, science educator Bill Nye appeared on the cover of Time Out New York dressed more stylishly than in the baby-blue lab coat many of us remember. Some wondered if the images were AI-generated, but following the trail of credits back to the photographer’s Instagram account revealed that the Science Guy really was wearing edgy, youthful clothes.
For images that claim to have come from a real news event, it’s also worth checking news services like the Associated Press and Reuters and companies like Getty Images — all of which let you peek at the editorial images they’ve captured.
If you can trace an image back to its original from one of those sources, you’re looking at an authentic one.
Try a reverse image search
If an image seems out of character for the person in it, appears pointedly partisan or just generally doesn’t pass a vibe check, reverse image tools — like TinEye or Google Image Search — can help you find the originals. Even if they can’t, these tools may still surface valuable context about the image.
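For readers who do this often, the lookup can be scripted. The short Python sketch below builds reverse-search links for an image that is already hosted at a public URL; the query-string patterns for TinEye and Google Lens are assumptions based on their public search pages and could change at any time.

```python
from urllib.parse import quote


def reverse_search_urls(image_url: str) -> dict:
    """Build reverse-image-search links for a publicly hosted image.

    Note: these URL patterns are assumptions based on each service's
    public search pages, not documented APIs; they may change.
    """
    encoded = quote(image_url, safe="")  # percent-encode the full URL
    return {
        # TinEye accepts an image URL via its ?url= query parameter.
        "tineye": f"https://tineye.com/search?url={encoded}",
        # Google Lens accepts an image URL via its uploadbyurl endpoint.
        "google_lens": f"https://lens.google.com/uploadbyurl?url={encoded}",
    }


if __name__ == "__main__":
    # Hypothetical example image URL, for illustration only.
    for name, url in reverse_search_urls(
        "https://example.com/suspicious-photo.jpg"
    ).items():
        print(name, url)
```

Opening those links in a browser shows where else the image has appeared and when, which is often enough to surface the unedited original.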
Here’s a recent example: Shortly after a 20-year-old gunman attempted to assassinate Trump, an image appeared on the Meta-owned social media service Threads that depicted Secret Service agents smiling while clinging to the former president. That image was used to bolster the baseless theory that the shooting was staged.
The original photo contains not a single visible smile.
Even armed with these tips, it’s unlikely that you’ll be able to tell real images from manipulated ones 100 percent of the time. But that doesn’t mean you shouldn’t keep your sense of skepticism honed. It’s part of the work we all need to do at times to remember that, even in divisive and confusing times, factual truth still exists.
Losing sight of that, Nightingale says, only gives bad actors the opportunity to “dismiss everything.”
“That’s where society is really at risk,” she said.
Editing by Karly Domb Sadof and Yun-Hee Kim.