On social networks, young people in distress locked in bubbles

Tell me what’s on your “For You” page, and I’ll tell you who you are. The continuous feed of videos in this TikTok tab is built from your interests and from the data the Chinese social network collects about your personality, and even your current state of mind. Everyone has their own bubble. Some people endlessly scroll through adorable canine acrobatics; others are passionate about restoring old furniture. But for users who are suffering psychologically, the walls of extreme personalization quickly close in, and the “For You” feed turns into a real trap.

“Imposed” bubbles

“Social networks are designed to generate the maximum available brain time for advertisers. They live off advertising and, by pushing the most relevant possible content at you, they keep you there longer,” explains Paul Midy, MP for the 5th constituency of Essonne and general rapporteur of the bill to secure the internet. If you love Formula 1 races, you are far more likely to stay glued to your phone screen when TikTok serves you drift videos rather than ballet footage. Océane Herrero, journalist and author of The TikTok System, speaks of “imposed bubbles”. “On TikTok, the app chooses content for you from the very first use. Users lose control over what they are going to see, and therefore their ability to decide,” she notes.

Silently, the algorithm adapts to its users’ changing moods and emerging passions. “Everything is tracked, down to the level of user well-being,” warns Katia Roux, technology and human rights advocacy officer at Amnesty International. The organization published a report on Tuesday accusing the “For You” feed of pushing young people towards content that is harmful to their mental health. Michael Stora, psychologist and author of (a)social networks: discover the dark side of algorithms, speaks of “soft algorithms”, designed to show you “videos that suit you, supposed to make you feel good”.
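None of this describes TikTok’s actual code, but the loop the experts outline, a profile fed by watch-time signals and used to rank the next videos, can be sketched in a few lines. The sketch below is purely illustrative (the class, topics and weights are invented), not the platform’s real system:

```python
from collections import defaultdict

class ForYouFeed:
    """Toy engagement-maximizing recommender illustrating the 'bubble' loop."""

    def __init__(self, catalog):
        self.catalog = catalog                 # list of (video_id, topic) pairs
        self.interest = defaultdict(float)     # topic -> affinity score

    def record_watch(self, topic, watch_seconds):
        # Every second watched reinforces the topic: engagement is the only
        # signal here, not whether the content is good for the viewer.
        self.interest[topic] += watch_seconds

    def next_videos(self, k=5):
        # Rank the catalog by accumulated affinity: topics the user lingers
        # on (kittens or self-harm alike) crowd everything else out.
        ranked = sorted(self.catalog, key=lambda v: self.interest[v[1]], reverse=True)
        return ranked[:k]

feed = ForYouFeed([("v1", "kittens"), ("v2", "f1_drifts"), ("v3", "sad_edits")])
feed.record_watch("sad_edits", 45)    # one long watch is enough to tilt the feed
print(feed.next_videos(k=2))          # sad_edits content now ranks first
```

The point of the toy is the shape of the feedback loop: the ranking only ever sees engagement, so whatever the user lingers on keeps getting reinforced, regardless of what it is.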

But if users show interest in content related to mental health, within the space of just an hour “many recommended videos idealize, trivialize or even encourage suicide”, charges Amnesty International, which ran the test for its report. “TikTok will push users to stay, even if that means serving them harmful content, to keep them scrolling,” laments Katia Roux, accusing the platform of “making money off of people’s emotions”.

From purring to self-harm

“You only need to show interest in a subject for it to be boosted and invade your TikTok ‘For You’ feed,” explains Katia Roux, who denounces a “spiral” that, “when it is served to people in a difficult situation or a weakened mental state, is devastating”. The spiral obviously does not have the same consequences when it involves kittens as when it involves content promoting self-harm. “Adolescents are experiencing a mental health crisis” and “the data suggests that the rise of social media has played a role” in it, notes Sydney Bryn Austin, professor of social and behavioral sciences at Harvard University, who worked with Amnesty International.

“From 2009 to 2019, depression among adolescents doubled and suicide became one of the leading causes of death among young people aged 10 to 14,” she adds. Teenagers are particularly sensitive to this content. “They identify with it much more, as in a mirror, because they are fragile at that age,” analyzes Michael Stora, who adds that for “a young person in a very fragile state”, watching this type of content “can make them sink in turn”. “Young people have more difficulty taking a step back, and TikTok is a social network they trust,” says Océane Herrero. While the TikTok algorithm obviously “does not intentionally push depressing content at a suicidal person, that is nevertheless how it works”, Paul Midy points out.

Thinner than a sheet of A4 paper

“The rise of visual social media, such as TikTok or Instagram, has worsened the negative impacts on young people, particularly regarding body image, self-esteem and the risk of developing eating disorders [EDs],” laments Dr. Austin, who adds that EDs have one of “the highest mortality rates of all mental illnesses”. Social networks regularly give rise to harmful trends, followed by thousands of young people still searching for themselves. “Social networks do not cause anorexia nervosa, but in the era of pro-ana [pro-anorexia] sites, we realized that some teenage girls followed these sites to the point of ruining their physical health,” recalls Michael Stora, who headed Skyrock’s psychological unit for seven years. On TikTok, if a young girl shows interest in ED-related content, her “For You” page will end up serving her a flood of it.

The “thigh gap challenge”, which encouraged young girls to show off the gap between their thighs, was already promoting thinness. After sweeping across social media, the keyword is now banned on TikTok; type it into the search bar and the app instead displays emergency helpline numbers. “On Instagram, there was also the challenge of holding an A4 sheet in front of your waist to prove that your waist did not stick out beyond it,” recalls Océane Herrero, who adds, however, that this quest for thinness, especially among women, is “the essence” of our society. Social networks act as a mirror, sometimes a distorting one, of our societal obsessions, healthy or not.

Hashtags, from ban to reinvention

But the sounding boards that social networks have become, on which many of us hang for hours every day, bear their share of responsibility. “At the time of the Facebook Files [in 2021], several internal studies showed the platforms’ responsibility in deepening distress, particularly around the self-image of young girls on Instagram,” recalls Océane Herrero. “In 2019, TikTok said it was now bringing ‘exploration’ videos into the ‘For You’ feed,” the journalist notes. Many keywords are now also censored, such as “suicide” or “eating disorders”.

“But content creators are repurposing ‘ED’ as ‘Ed Sheeran’ [the singer] in order to slip through the algorithm’s net,” the author illustrates, adding that blocking hashtags does not stop communities from inventing new ones, quite the contrary. “TikTok is committed to ensuring the safety and well-being of our teen community. We strive to manage with nuance the complexity of supporting the well-being of our community on our platform,” the app responded to Amnesty International. Contacted by 20 Minutes for this article, the social network did not respond.
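The cat-and-mouse game Herrero describes is easy to reproduce. The blocklist below is invented for illustration, but it shows why exact-match filtering is always one rename behind the communities it targets:

```python
# Illustrative only: a naive exact-match hashtag blocklist, and why it fails.
BLOCKLIST = {"#suicide", "#eatingdisorder", "#ed"}

def is_allowed(hashtag: str) -> bool:
    """Exact-match check, the simplest form of keyword censorship."""
    return hashtag.lower() not in BLOCKLIST

print(is_allowed("#ED"))         # False: the banned tag is caught
print(is_allowed("#EdSheeran"))  # True: the coded replacement sails through
```

Each ban pushes the community toward a new code word, so moderation teams end up chasing an ever-moving vocabulary rather than the content itself.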

An anti-bubble law, and the ball in the platforms’ court

Amnesty International is calling on social networks to abandon hyper-personalization and to ban targeted advertising worldwide, at least for young people. In the European Union, targeted advertising is already banned for minors since the entry into force of the new European regulation on digital services (DSA). In France, “we passed the digital law, which requires platforms to give us the possibility of leaving our bubble and therefore to offer us a feed that is not based on our preferences” as collected by the algorithm, explains Paul Midy, who specifies that this feature will take effect next year.
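In code terms, the obligation Midy describes amounts to offering an alternative ranking that simply ignores the profile. A minimal sketch, reusing the toy ForYouFeed above (the random fallback is an assumption; a real platform might use recency or editorial picks instead):

```python
import random

def serve_feed(feed, personalized: bool, k: int = 5):
    """Serve either the profiled ranking or a profile-free alternative.

    personalized=False mimics the opt-out required by the French digital
    law and the DSA: the interest profile is simply not consulted.
    """
    if personalized:
        return feed.next_videos(k)
    return random.sample(feed.catalog, min(k, len(feed.catalog)))
```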

The MP urges platforms to comply with European law and regulations, in particular by quickly removing illegal content, but also to “better calibrate their algorithms”. “If you chat with ChatGPT and ask it for a joke, the artificial intelligence will oblige, but if you ask for a racist joke, it will say no. It’s the same thing for the TikTok algorithm: they can put safeguards in place,” Paul Midy illustrates. He concludes: “We have made enormous progress on these subjects. Although there is still much to do, the ball is in the platforms’ court today.”
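Midy’s ChatGPT comparison boils down to placing a safety check between the ranking step and the screen. A toy version, with the classifier stubbed out and the topic labels invented (a real system would use a trained model):

```python
SENSITIVE_TOPICS = {"self_harm", "pro_ana", "suicide_ideation"}  # invented labels

def classify(video) -> str:
    # Stub: a real platform would run a trained content classifier here.
    video_id, topic = video
    return topic

def guarded_feed(candidates):
    """Drop flagged items before serving, like a chatbot refusing a harmful request."""
    return [v for v in candidates if classify(v) not in SENSITIVE_TOPICS]
```

The safeguard does not change the recommender itself; it just declines to amplify what the classifier flags, which is the kind of calibration the MP is asking for.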
