The platform recognizes the impact of certain challenges on adolescent mental health

#BussItChallenge, #SkullBreakerChallenge or #OboulChallenge… Whether fun or deemed dangerous, challenges of all kinds abound on the TikTok video-sharing platform. To analyze and measure their effects on the physical safety and mental health of adolescents, the application commissioned a lengthy report* that it has just published in its newsroom.

The platform surveyed more than 10,000 adolescents, parents and teachers from ten countries, including the United States and the United Kingdom, and consulted numerous professionals, notably child psychiatrists and adolescent-safety specialists.

Risky challenges for 32% of adolescents

According to the experts consulted, most challenges on TikTok – like the #IceBucketChallenge or the #BlindingLightsChallenge – are considered harmless. The application reports that 48% of the teenagers surveyed describe the challenges they have recently been exposed to as “fun” and “light-hearted”. Almost a third, 32%, see “moderate risks” in them but still feel safe.

On the other hand, 14% of the teenagers surveyed believe they have taken part in a “risky and dangerous” challenge, and 3% describe TikTok challenges as “very dangerous”. However, only 0.3% admitted having taken part in challenges they considered “very dangerous”. The researchers also found that teenagers lacked clear information beforehand about how dangerous a challenge might be: only 46% of those surveyed felt they had “good information on the risks and on what goes too far”.

The application also looked at the impact of certain content on its users' mental health. For the 31% of young people who had been exposed to content related to death, self-harm and suicide (such as the #MomoChallenge or the #BlueWhaleChallenge), the impact was “negative”, and 63% of them say it affected their mental health.

New features offered by the platform

To address these problems, the application has announced new features as part of an update to its “Safety Center”. Challenges and posts will be monitored more closely, and TikTok will no longer tolerate such content, which will be systematically removed. “We have created technology that alerts our safety teams to a sudden spike in violating content linked to a hashtag, and we have extended it to also capture potentially dangerous behavior,” the platform explains in its report.

A section entirely dedicated to challenges will also be created. The warnings that appear when a user searches for terms related to suicide or self-harm, for example, will be translated into more languages to improve accessibility, the application adds.

* Survey of over 10,000 teens, parents and teachers from Argentina, Australia, Brazil, Germany, Italy, Indonesia, Mexico, UK, US and Vietnam.