Video platform TikTok: Algorithm recommends dangerous content


Exclusive

As of: 04/27/2022, 3:00 p.m.

Videos about self-harm and suicidal thoughts remain a problem on TikTok, as an analysis by BR Data shows. Users can be drawn into a filter bubble of dangerous content.

By L. Altmeier, C. Cerruti, S. Khamis, M. Lehner and R. Schöffel, BR

More than other social networks, TikTok lets an algorithm decide what users see. The algorithm curates the most important part of the app, the “For You” feed, a kind of personalized television program.

This is not a problem if users are interested in cat or dog videos and TikTok shows them more and more of those. It gets difficult when they dive into problematic content. On TikTok there is a whole subgenre in which young people produce videos about depression, self-harm or suicide.

They talk about mental health issues, show scars from injuries they inflicted on themselves, and express suicidal thoughts. In the comment sections, other users often report suicide attempts and self-harm. There are countless such videos on the platform; some are deleted after a while, others stay online for months. Some of them have been viewed and liked millions of times.

TikTok algorithm learns quickly in the experiment

An experiment by BR Data and PULS Reportage shows that by interacting with such videos, German users can end up in a filter bubble in which TikTok recommends almost nothing but this kind of content in their feed.

For the experiment, the journalists from BR Data set up several test accounts and simulated the behavior of people who are interested in videos about depression, self-harm and suicidal thoughts.

The result: after a short time, the feed consisted almost exclusively of such content. After around 150 videos, on average every third video carried a hashtag related to topics such as sadness, depression, self-harm or suicidal thoughts. In practice, that point was reached after around 45 minutes of use.

Experimental methodology

For the experiment, the BR Data journalists worked with five accounts that imitated the behavior of users with German IP addresses. The data was collected with the help of the Algorithmic Transparency Institute, which has developed a number of tools for examining the TikTok platform in more detail.
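At its core, the analysis described here comes down to counting what share of the recommended videos in each collected feed carry hashtags from a watchlist of relevant topics. The article does not publish the actual analysis code, so the file layout, field names and hashtag list in the following minimal Python sketch are assumptions, shown only to illustrate the kind of calculation involved:

```python
# Illustrative sketch only; not the published BR Data / ATI tooling.
# Assumes each account's feed log was exported as a JSON list of video
# records with a hypothetical "hashtags" field per video.
import json
from pathlib import Path

# Hypothetical watchlist of hashtags counted as sadness/depression/self-harm related.
WATCHLIST = {"sad", "depression", "mentalhealth", "selfharm"}


def share_of_flagged_videos(feed_file: Path) -> float:
    """Return the fraction of videos in one feed log that carry
    at least one hashtag from the watchlist."""
    videos = json.loads(feed_file.read_text(encoding="utf-8"))
    if not videos:
        return 0.0
    flagged = sum(
        1
        for video in videos
        if WATCHLIST & {tag.lower().lstrip("#") for tag in video.get("hashtags", [])}
    )
    return flagged / len(videos)


if __name__ == "__main__":
    # Example: average the share across the test accounts' feed logs
    # (hypothetical directory and file naming).
    files = sorted(Path("feed_logs").glob("account_*.json"))
    shares = [share_of_flagged_videos(f) for f in files]
    if shares:
        print(f"Average share of flagged videos: {sum(shares) / len(shares):.0%}")
```

A figure such as “every third video” then simply corresponds to an average share of roughly 33 percent across the accounts' feeds at that point in the experiment.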

The EU initiative Klicksafe offers further information on dealing with self-injurious behavior and social media on the Internet.

When the accounts continued to engage with this content, 70 to 85 percent of the feed consisted of sad music, young people speaking about mental illness or self-harm, and depressive quotes flashing against colored LED lighting.

Filter bubble effect known since 2021

In July 2021, the “Wall Street Journal” had already pointed out the bubble of dangerous content that users could get into. In a December 2021 blog post, TikTok announced that it was working on adding more variety to the feed, especially for content that could have a negative effect. As the BR research now suggests, TikTok has largely not yet kept this promise.

TikTok references millions of deleted videos

In its community guidelines, TikTok prohibits posting content “in which suicide or self-harm is promoted, played down as normal or glorified”. When asked why there is still a lot of content on the platform that probably violates these guidelines, TikTok refers to its statistics on deleted videos. According to those statistics, over 85 million videos were removed worldwide in the last quarter of 2021. For six million of those videos, TikTok cites “suicide, self-harm, and dangerous acts” as the reason for removal.

The company did not answer further questions about the filter bubble effect, such as whether the comment sections are checked for unlawful statements and whether the recommendation algorithms protect minors in particular from harmful content.

Psychotherapist: “Tempted to imitate”

In the course of the experiment, BR Data and PULS Reportage also found videos that deal constructively with mental illness.

In fact, it can help those affected if others talk about surviving suicidal phases or point to specific offers of help, says psychotherapist Anke Glaßmeyer, who herself raises awareness of the topic on Instagram. However, she also says that when suicidality is at issue, what matters above all is how it is talked about: “The more targeted and detailed reports on suicides are, the more likely they are to prompt imitation.”

Glaßmeyer says of the results of the experiment that anyone who is currently in a mentally unstable state can be triggered by such content on TikTok: a trigger is a stimulus that involuntarily evokes memories and feelings of past trauma and can lead to a relapse.

Algorithm changes required

Leni Breymaier, SPD chairwoman on the Bundestag’s Family Committee, considers the results of the experiment problematic: “Self-harm, depression, up to and including suicide: I think it would be desirable for the algorithms to be changed in such a way that these things do not keep getting worse.”

Under the EU’s Digital Services Act, large platforms will in future be obliged, among other things, to offer their users an alternative recommendation system that is not based on personal profiling. Orestis Papakyriakopoulos, who researches algorithms at Princeton University, goes one step further: he calls on the platforms to refrain from using recommendation systems for political content or content about mental illness.

If you are having suicidal thoughts, please seek help immediately. The anonymous telephone counseling service (Telefonseelsorge) offers contact persons around the clock. Telephone numbers: 0800/111 0 111 and 0800/111 0 222, www.telefonseelsorge.de
