Crying “Ukrainian soldiers” created with artificial intelligence

Illustration: Truthmeter.mk

This article was first published by Truthmeter.mk (North Macedonia), within the framework of Western Balkans Anti-Disinformation Project.
Author: Miroslava Simonovska

Tearful young men in military uniforms, portrayed as “Ukrainian conscripts,” have flooded social networks around the world. Their faces look desperate and frightened, filled with panic. The viral videos allegedly show young Ukrainian conscripts traveling in a military vehicle to the battlefield.

“‘I don’t want to die!’ 23-year-old cries while traveling to the front,” reads the description of the videos, which have tens of thousands of views.

This is just another disinformation campaign created with artificial intelligence. Fake profiles on social networks are sharing these viral videos with translations in different languages. The clips are intended to portray a “forced mobilization” of 22- and 23-year-old boys in Ukraine, but a closer and more in-depth analysis shows that the visual irregularities in the video are hallmarks of artificial intelligence. In particular, these include an improperly fitted helmet that does not match the standards of Ukrainian military equipment, the absence of a name tag on the uniform, and the incorrect pronunciation of the place name “Chasiv Yar,” which is typical of AI-generated voices.

The goal is to undermine the morale of Ukrainians and their supporters by showing supposedly young, frightened or tortured “forced” soldiers. When people fall under the influence of these realistic but fake videos, they begin to doubt everything, and the truth becomes harder to recognize. The fact that the videos circulate on international platforms in multiple languages shows that they target several different Western audiences, aiming to reduce support for Ukraine in the countries where the videos are distributed, to manipulate public emotions, and to evoke pity for the soldiers portrayed as forcibly recruited. The aim is to create cognitive dissonance among these audiences, or rather, to sow distrust in the decisions of Western governments to support Ukraine’s right to self-defense.

According to a statement by the Center for Countering Disinformation (an institution operating under Ukraine’s National Security and Defense Council), the person in the video is not Ukrainian but a Russian citizen. The face belongs to Vladimir Yuryevich Ivanov from St. Petersburg, who uses the online pseudonym Kussia88.

The AI video featuring Ivanov was first posted on a now-deleted TikTok channel, @fantomoko, along with a dozen other such AI-generated videos of “Ukrainian soldiers crying in military vehicles.”

As Euronews reported, the age for mobilization in Ukraine is between 25 and 60 years old, not 23 as claimed in the AI-generated clip.

The minimum mobilization age was lowered from 27 to 25 in April of last year to bolster troop numbers amid Russia’s full-scale invasion, which is approaching its fourth year. Those under 25, as the man in the video claims to be, can only enlist voluntarily, Euronews reports.

The clips were created using Sora, an artificial intelligence model developed by OpenAI. However, in the video of the young Ukrainian soldier crying, the Sora watermark is not visible. The video consists of two short clips, one four seconds long and the other 10 seconds long, edited together. The brevity of the clips is itself an indication that the video was generated by artificial intelligence, as most models can only produce continuous footage of up to 10 seconds.

As the war on disinformation remains on the front lines on social networks, it is becoming increasingly difficult to detect these fake, AI-generated videos—and increasingly easy to believe them.