Is this video real or CGI?

Were tiny baby giraffes having fun in a park? A digital video created by an artist has been falsely shared as if it were real.

Source: Kevin/Pixabay

On Twitter: a viral video of a group of turtles running – yes, running! – in a circle, captioned “Wait, turtles are fast?” Another viral video, also on Twitter, showed a group of tiny baby giraffes galloping around a park, captioned: “I had no idea baby giraffes looked exactly like adult miniature giraffes, but I don’t know what I expected.”

In both cases, tweeted videos of unusual animal behavior went viral. And in both cases, the videos were fake: both had been computer generated by artist Vernon James Manlapaz. But Manlapaz didn’t create the videos to deceive anyone, just as the creators of the film Jurassic Park had no intention of convincing anyone that geneticists could recreate dinosaurs and house them in a theme park. The dinosaurs we saw on screen were, of course, computer-generated images, all made for entertainment.

Manlapaz works in the 3D animation business in Los Angeles and creates digital art on the side. He told a reporter, “My creative process usually starts with looking at my surroundings and asking myself, ‘What if?’ From there, I get inspired by games, movies, people, and life in general.”

His Instagram page, where he shares his video art, includes fantastical tableaux like a school of multicolored tropical fish and a sea turtle swimming through the air in an amusement park, their sidewalk shadows trailing them; a battalion of two-foot-tall gummy bears hopping aimlessly down a grocery store aisle; and a baby rhino deftly riding a scooter alongside a strolling adult rhino and a stream of people riding bikes in a park. A simple glance at Manlapaz’s Instagram makes it clear that his elaborate work is not meant to deceive. Why, then, has his work led to viral misinformation?

Profiting from deception

Manlapaz’s videos, and others like them, were not created to intentionally mislead; however, people other than their creators often present these digital creations as real to attract social media followers and earn money, a tactic called engagement baiting.

The news literacy newsletter The Sift describes this tactic as “accounts seeking to create broad social networks at all costs, even when it means passing off digital counterfeits as genuine.” The Sift points out that these social media users are aware that certain images, including “those featuring unusual or cute animals,” are particularly likely to go viral.

Researchers including Dimitrina Zlatkova and colleagues (in a preprint) have called the phenomenon of misleading visuals fauxtography: “images, especially news photographs, which give a dubious, if not downright false, meaning to the events they seem to depict.” At the Disinformation Desk, we’ve previously written about nature photographs that went viral at the start of the pandemic – swans believed to be returning to a tourist-devoid Venice. But those were real photos that were misinterpreted. (They were real swans; they just weren’t in Venice.) We’ve also written about deepfakes – doctored photos or videos, including revenge porn. But deepfakes are usually created with nefarious goals in mind, which makes them disinformation rather than mere misinformation.

With work like Manlapaz’s, however, we’re talking about a different kind of fakery: photos or videos created solely for artistic or entertainment purposes, like a dinosaur movie, with no intent to mislead – only to enchant or create wonder. As we noted earlier, Manlapaz posts his work on his Instagram, where it’s easy to see that he’s a digital artist who creates whimsical art. Misinformation arises later, when a social media user co-opts Manlapaz’s art without his permission to trick viewers into clicking and engaging with their bogus posts for profit.

Fighting fauxtography

There are both programmatic and individual solutions to the proliferation of fauxtography and its video counterparts. One general solution is expanded fact-checking of visuals. The efforts of groups like Oryx and Bellingcat to verify images related to the war in Ukraine are a particularly notable example. Zlatkova and her colleagues observe that the problem of fake images and videos is too big to fact-check manually beyond a single area (e.g., war photos in Ukraine), and they are working on strategies to automate the process. They developed an open-source dataset using a variety of tools, including reverse image searches, examination of image characteristics such as Google tags, and a review of existing databases, such as the Fauxtography section of the Snopes fact-checking website. These researchers plan to focus next on fact-checking videos.
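To give a flavor of how automated image matching can work, here is a minimal sketch of perceptual “average hashing,” one common building block behind tools that flag re-uploaded copies of known images. This is a generic illustration under our own simplifying assumptions (a pre-downscaled 8×8 grayscale grid), not the actual method used by Zlatkova and her colleagues:

```python
def average_hash(pixels):
    """Compute a simple perceptual hash from an 8x8 grayscale grid.

    pixels: list of 64 brightness values (0-255), e.g. from an image
    already downscaled to 8x8. Returns a 64-bit integer fingerprint.
    """
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        # Each pixel contributes one bit: brighter than average or not.
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming_distance(h1, h2):
    """Number of differing bits; a small distance suggests near-duplicates."""
    return bin(h1 ^ h2).count("1")

# Toy data: "reupload" is the original with slight brightness noise,
# as a re-posted copy of a viral photo might be; "unrelated" is inverted.
original = [(i * 4) % 256 for i in range(64)]
reupload = [min(255, p + 3) for p in original]
unrelated = [255 - p for p in original]

d_same = hamming_distance(average_hash(original), average_hash(reupload))
d_diff = hamming_distance(average_hash(original), average_hash(unrelated))
```

Because the hash only records whether each region is brighter than the image’s average, small brightness shifts leave the fingerprint nearly unchanged (`d_same` stays small), while a genuinely different image lands far away (`d_diff` is large) – which is why this kind of fingerprinting scales to large databases in a way manual fact-checking cannot.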

As an individual, you can also train yourself not to fall for this trick. The Sift suggests actively seeking out “engagement bait” by going to Twitter and searching for “amazing nature” or “baby animals,” and trying to notice when a clickbait photo looks too good to be true. Or, they suggest, check out or even follow Twitter accounts such as @HoaxEye and @PicPedant, which find the engagement bait for you; you can then look for patterns in the misleading photos and articles these sleuths post. Once you start viewing visuals with a critical eye, if a photo raises suspicion, you can use some of the tools described earlier, like a reverse image search, to do your own research. The cardinal rule: always check before sharing.

And, just for entertainment, check out the clever digital art on Manlapaz’s Instagram account, @vernbestintheworld. We particularly like the oversized runaway donut and the gargantuan porcupine, not least because they’re a little too amazing to be turned into viral misinformation.
