Beware: these manipulated videos of Putin and Zelenskyj are fake

Watch the video: Alleged Ukrainian surrender – these manipulated videos are being used as propaganda in the war.

SOUNDBITE Putin declares the war against Ukraine to be over –

SOUNDBITE Zelenskyj calls on his soldiers to lay down their weapons.

These videos have spread rapidly across the internet in recent days. They are part of the war in Ukraine – part of the information war – because they are manipulated. They were deployed as a propaganda weapon to spread false information in a targeted way. Companies such as Facebook’s parent group Meta and Twitter have already reacted and either deleted the videos or left them online only if they were clearly labeled as fake. But is that enough? What are deepfake videos, and how dangerous are they really?


Deepfakes are computer-generated videos created with special software. Often, all that is needed are a few photos or videos showing a person’s face with different expressions. An algorithm learns from these images to imitate the person’s facial expressions and eye and mouth movements exactly. The program then superimposes this face as a digital mask over another person’s face in a video recording.
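Expressed as code, the principle often described for early face-swap tools is a shared encoder paired with one decoder per identity. The PyTorch sketch below is purely illustrative: the 64×64 crop size, layer sizes and random input tensor are assumptions, and it is not the software behind any of the videos discussed here.

```python
# Minimal sketch (not a production tool) of the classic "shared encoder,
# two decoders" face-swap idea. All shapes and sizes are illustrative.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Compresses an aligned 64x64 face crop into a latent code."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.ReLU(),   # 64 -> 32
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.ReLU(),  # 32 -> 16
            nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.ReLU(), # 16 -> 8
            nn.Flatten(),
            nn.Linear(128 * 8 * 8, latent_dim),
        )
    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    """Reconstructs a face from the latent code -- one decoder per identity."""
    def __init__(self, latent_dim=256):
        super().__init__()
        self.fc = nn.Linear(latent_dim, 128 * 8 * 8)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(128, 64, 4, stride=2, padding=1), nn.ReLU(),  # 8 -> 16
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),   # 16 -> 32
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid(), # 32 -> 64
        )
    def forward(self, z):
        return self.net(self.fc(z).view(-1, 128, 8, 8))

# One shared encoder learns pose and expression; each decoder learns one face.
encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()

# Training (omitted) would reconstruct person A's faces with decoder_a and
# person B's faces with decoder_b. The "swap": encode a frame of person A,
# then decode it with person B's decoder, keeping A's expression and pose.
frame_of_a = torch.rand(1, 3, 64, 64)  # stand-in for a real, aligned face crop
swapped = decoder_b(encoder(frame_of_a))
print(swapped.shape)                    # torch.Size([1, 3, 64, 64])
```

In a full pipeline, the swapped face would then be blended back into each video frame, which is the "digital mask" step described above.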

Unknown hackers managed to place such a deepfake video of Zelenskyj on the website of the Ukrainian news channel Ukraine 24.

Cora Bieß, peace researcher
“It was identified relatively quickly as a deepfake and spread as such. Even so, the number of clicks on the Zelenskyj video was extremely high within a few hours. And even when people are told they have seen a fake, something of the message still sticks.”

And that is the fatal thing about deepfakes. The fake in the Zelenskyj video is quickly recognizable: the shadow on the neck is missing, and Zelenskyj’s facial expression looks robotic. But the information sticks in people’s heads – which is probably why the Ukrainian president reacted promptly on Instagram:

SOUNDBITE Zelenskyj: “If I can recommend that anyone lay down their arms, it is the Russian military. Go home. We are at home. We are defending our country, our children and our families.”

And the Putin video is also quickly recognizable as a fake: it is based on Putin’s address to the nation on February 21. The hand movements and the shrug of the shoulders in the deepfake are completely identical to that speech; only the head and mouth movements again look rather robotic.

While this technology was still relatively immature about three years ago, deepfakes are now becoming increasingly difficult to detect.

Cora Bieß, deepfake researcher
“This is where the so-called cat-and-mouse game comes into play, also known as the AI arms race: if detection mechanisms are developed further and improved, that can automatically help people who are interested in improving deepfakes. Indirectly, even though that is not the intention, it can also help deepfakes become even better and look even more real in the long term.”

Again and again, it is celebrities in particular who fall victim to deepfakes, because many photos and videos of them with different facial expressions circulate freely on the internet. The best-known example is Barack Obama, in a video published on YouTube by BuzzFeed in 2018:

“President Trump is a total and complete dipshit.”

Or, last year, Tom Cruise, who supposedly turned up on TikTok: “What’s up, TikTok?”

But even for private individuals, a single photo that can be manipulated with the help of apps is enough. Such videos are often clearly recognizable as so-called cheap fakes, i.e. crude imitations. Nevertheless, they can quickly become dangerous for those affected.

Cora Bieß
“96 percent of all deepfakes circulating on the internet so far are pornographic. They are mainly used for revenge pornography and have an extremely high potential for harm, especially for women.”

Deepfakes therefore not only pose a threat to political opinion-forming; they also endanger the privacy of women, whose faces often end up on porn sites without their knowledge.

Since effective regulation has so far been lacking, Cora Bieß warns against putting too many photos and videos of yourself online or sharing them in large WhatsApp or Telegram groups with sometimes unknown members. Her particular concern is for the women and children who have fled Ukraine and who organize themselves through such groups.

Cora Bieß
“At the same time, there is an extreme risk here that shared images can be intercepted, which can result in pornographic deepfakes: women become victims of sexualized violence, or children are unknowingly used in child-pornographic deepfakes.”

One way to check suspicious videos is to use websites such as Deepware. After a video is uploaded there, you receive an assessment of whether it could be a deepfake.
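As a rough sketch of what such a checker does conceptually: sample frames from the clip, score each frame with a trained detector, and aggregate the scores into a single probability. The Python example below is only illustrative and does not reproduce how Deepware itself works; score_frame is a placeholder where a real classifier would be called, and the file name is hypothetical.

```python
# Conceptual sketch of frame-based deepfake scoring. `score_frame` is a stub
# standing in for a trained detector; this is not Deepware's actual pipeline.
import cv2  # pip install opencv-python

def score_frame(frame) -> float:
    """Placeholder for a real deepfake classifier (0.0 = real, 1.0 = fake)."""
    return 0.0  # a trained model would be invoked here

def assess_video(path: str, every_nth: int = 30) -> float:
    """Sample roughly one frame per second and average the per-frame scores."""
    cap = cv2.VideoCapture(path)
    scores, index = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if index % every_nth == 0:
            scores.append(score_frame(frame))
        index += 1
    cap.release()
    return sum(scores) / len(scores) if scores else 0.0

if __name__ == "__main__":
    probability = assess_video("suspect_clip.mp4")  # hypothetical file name
    print(f"Estimated deepfake probability: {probability:.2f}")
```

Averaging over many frames is what lets such tools give an overall assessment rather than judging a single, possibly unrepresentative, frame.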
