Social Media Platforms: Sloppy Moderation, Poor Protection


Exclusive

As of: December 6, 2023, 5:00 p.m.

The EU’s Digital Services Act is intended to protect minors in particular on the Internet. A study shows that large platforms moderate carelessly: at most one in three pieces of content about eating disorders, self-harm and suicide is deleted.

The images are disturbing: one user grips her upper arm between thumb and forefinger; another user comments “pretty”, “beautiful”. Photographs of protruding ribs carry the note: “I want to see more visible ribs.” Or a user shows his bare legs, which he has cut, inch by inch, with a razor blade. His comment: “Wow” and “I can’t wait to make progress” – 2,500 likes, 35,000 views.

The photos come from publicly accessible accounts on Instagram, TikTok and X. All were still online at the time of writing and all had been reported as problematic.

A new study by the non-governmental organization “Reset” concludes that the platforms delete such content not at all or only hesitantly, and that this may violate the EU’s new set of rules, the Digital Services Act (DSA). The study was made available to NDR and “Süddeutsche Zeitung” in advance.

“We were shocked at how badly the platforms were failing,” says Rys Farthing, who led the investigation at Reset. In some cases only one percent of the content was deleted. In none of the cases was it more than 30 percent.

Possible violation of the Digital Services Act

The DSA requires special protection for minors. In addition, it imposes specific obligations on large networks, in particular so-called risk mitigation measures. These include deleting, hiding or suppressing content. The measures must be proportionate, but also effective.

Sabine Verheyen (CDU), MEP and chair of the European Parliament’s Committee on Culture and Education, sees a need to catch up: “Especially when it comes to the protection of minors, I don’t see enough happening at the moment. I believe we need to act even more strongly and clearly here.” She says structures for monitoring the platforms’ compliance with the DSA are still lacking.

The EU Commission has a duty here, says Rys Farthing of “Reset”: “The Commission decided on this powerful set of rules and must now enforce it.” The Commission says the large networks are being monitored in accordance with the DSA.

The platforms have already submitted initial reports, and a review process to evaluate the information provided is underway. The Commission, however, will not comment on ongoing proceedings.

Low deletion rates

In the study, “Reset” examined, among other things, how well the platforms moderate problematic content. With the help of a psychologist, more than 200 images and videos that glorify or trivialize eating disorders, self-harm or suicide were identified and reported to the respective platform. Over a period of four weeks, the organization monitored whether the postings were deleted, labelled with a warning – or left up.

On Instagram, the deletion rate after reports of postings about suicide and self-harm – in which, for example, wounds or scars are shown in a glorifying manner – was 30 percent. That puts Instagram ahead of X, formerly Twitter, where only 13 percent, just over one in ten posts, were deleted. On TikTok, the deletion rate was just over one percent.

The platforms responded even less often to photos and videos that, for example, presented drastic weight-loss plans or showed off ribs and spines as an incentive to lose even more weight. Only about one in ten such posts was deleted; on X, even fewer.

Psychotherapist Julia Tanck, who specializes in the treatment of eating disorders and the influence of social networks, points out that eating disorders have the highest mortality rate of any mental illness. She says the platforms have a responsibility to consistently delete or hide such content.

Young people in particular are insecure about their own body image and constantly compare themselves with others, which puts them at particular risk of developing problematic behavior.

Platforms refer to extensive measures

When asked by NDR and SZ, the platforms point to extensive safety measures they have already taken. On Instagram, for example, explicit hashtags – such as #anorexia or #eatingdisorder – are accompanied by referrals to help services.

Meta says it is trying to strike a balance between providing a place where those affected can exchange and find information, and deleting or hiding the content. TikTok responds similarly: the company says it works with experts in the field and points users to help services on the platform. X did not respond.
