Ukraine war: propaganda battle on TikTok | tagesschau.de


Exclusive

Status: 08/23/2022 5:29 p.m

With the Russian war against Ukraine, TikTok has also become a platform for propaganda. Rocket attacks and images of corpses are accessible even to minors – and, according to SWR, are not always deleted.

The Russian war of aggression in Ukraine has reached German children’s rooms with its images of rocket attacks and corpses, with propaganda and disinformation. The Chinese video platform TikTok serves both violent and propagandistic content about the war in Ukraine to its young users. This is shown by research from VOLLBILD, the investigative format of SWR.

A self-experiment shows how quickly TikTok users can fall into a tunnel of war content: the SWR research team created three new TikTok accounts, scrolled through the feed, and watched only military videos in full. After an hour, TikTok was playing almost exclusively war content. These videos, served in random order, contained, among other things, scenes of violence, dead bodies, and brutal attacks on the civilian population. The violent and propagandistic content is often not labeled as such.

TikTok is owned by the Beijing-based company ByteDance. The app, launched in 2017, became popular primarily for its dance and music videos. According to the company, the app has more than a billion users, mostly young people aged 13 and over. Even before the outbreak of war in Ukraine, political content was increasing on the platform. This development reached a new peak with the Russian invasion of Ukraine: the app has become the scene of an information war. The magazine The New Yorker writes of the “first TikTok war”.

Data analyst Philip Kreissel works at HateAid, a non-profit organization that advises and supports victims of digital violence. He says: “That is TikTok’s recipe for success. The algorithm only serves content that the user likes and watches for a long time. This is how they end up in a so-called ‘rabbit hole’. There is a risk that the user will only be shown such videos from then on. These can be cat videos just as well as war videos.”

“Viking” Balderson: “It’s also about attracting attention.”

Information warfare on TikTok

TikTok has developed into a battleground for propaganda in the Ukraine war. The information war on TikTok has made the Dane Storm Karl Balderson a kind of figurehead for Ukrainian propaganda. He presents himself as a “Viking” and reaches millions of users with his content on Instagram and TikTok. In an interview with SWR he says: “It’s not just about producing content for propaganda or anything like that. It’s also about generating attention.” He accepts that his propaganda content also reaches children and young people all over the world.

In Russia, TikTok has blocked the posting of new content since March 6 – to protect its users, the company says, after the Kremlin enacted so-called fake-news laws. These threaten citizens with up to 15 years in prison if their statements deviate from the Kremlin’s portrayal of the war in Ukraine. Nevertheless, Russian propaganda continues to circulate on TikTok.

German influencers also post content that supports Kremlin narratives. The influencer Deniz K. runs a channel on TikTok and hosts a podcast on RT DE, a broadcaster funded by the Russian state whose content has been banned from distribution across the EU since March 2 due to disinformation. In his podcast, the influencer speaks, among other things, of a “witch hunt” against pro-Russian voices and claims that Vladimir Putin’s narrative of denazification is correct. Although TikTok has blocked his account several times, the influencer has repeatedly been able to upload his videos under a new account.

TikTok doesn’t always enforce community guidelines

TikTok’s guidelines state: “We are committed to fighting the spread of misinformation on the platform. (…) Our goal is to help ensure that TikTok remains a place where authentic and trustworthy content thrives.” Among other things, videos showing “physical violence, fight scenes and torture in real scenarios” may not be uploaded.

The company tries to take action against inappropriate content on the platform in various ways. Among other things, users can report content they consider inappropriate directly in the app. So-called content moderators then assess whether the reported content needs to be blocked. Under the German Network Enforcement Act, social platforms like TikTok must respond to reported content within 24 hours; if the matter is complicated, the company has one week to process the case.

In a test, the SWR research team reported two accounts and five videos for violent content. After 24 hours, only one of the five videos had been blocked – and neither of the accounts. Only after an inquiry from SWR did TikTok delete all the reported accounts and videos, except for one. A TikTok content moderator who wishes to remain anonymous says in an interview with SWR: “We do the most important jobs, because we are the ones who make sure that this platform is safe. And whether it ends up in children’s rooms or not.” Nevertheless, there are videos on the platform that children should not see. “And honestly? I wouldn’t allow my kid on TikTok.”

When asked, TikTok states: “We continue to respond to the war in Ukraine with increased protection and security resources to detect new threats and remove harmful misinformation.” TikTok also points to its cooperation with independent organizations that fact-check content for the platform.

The film “TikTok War: How Russia & Ukraine Make TikTok a Battleground” is available at www.youtube.de/vollbild and in the ARD media library.

TikTok war in children’s rooms

Thomas Oberfranz, SWR, 23.8.2022 5:52 p.m
