Tiktok presenter reveals what she sees in the app: child abuse, extreme violence, mutilated animals

Abuse, violence, mutilated animals: what she saw at Tiktok made the moderator sick. Now she is suing.

The colorful app Tiktok also has a dark side (stock photo)

© ljubaphoto / Getty Images

Tiktok is the shining star in the social media sky. But while users enjoy the funny clips, the moderators get to see much darker sides. One of them was left so traumatized that she is now suing.

From a funny challenge to a fashion clip to relationship tips: with its rapid jumps from one short clip to the next, Tiktok keeps more and more users glued to the screen. But keeping this world harmless demands a great sacrifice from the moderators, as a lawsuit now reveals. It offers a glimpse into the dark side of the fun app.

Candie Frazier has seen more than enough of it. As a moderator, she says she had to view and assess a veritable flood of highly problematic content. According to the lawsuit, she was confronted with “extreme and graphic depictions of violence”. The videos showed, among other things, “the genocide in Myanmar, mass shootings, the rape of children and the mutilation of animals”. Frazier’s allegation: Tiktok failed to adequately protect her from the psychological toll of the work, traumatizing her so badly that she has now fallen ill.

Harsh conditions

Frazier and her colleagues apparently worked under extreme conditions. They viewed the content in twelve-hour shifts and were not allowed their first 15-minute break until four hours in. To cope with the enormous volume of videos uploaded to the app, they watched not one clip at a time but between three and ten simultaneously. A new problematic video loaded at least every 25 seconds, the lawsuit says. The moderators were strictly monitored, and every moment in which no videos were being viewed was “severely punished”.

This does little for the moderators’ mental health. Frazier suffered severe panic attacks and, according to the complaint, “severe psychological trauma, including depression and symptoms associated with anxiety disorders and post-traumatic stress disorder”. She has trouble falling asleep, “and when she does sleep, she is plagued by terrible nightmares,” the lawsuit states. “She often lies awake at night trying to fall asleep while replaying videos in her head that she has seen at work.”

Tiktok declines to comment

Although Frazier is employed not by the Tiktok operator itself but by the service provider Telus International, the allegations weigh heavily on the company. According to the lawsuit, Tiktok and its partners failed to comply with the industry rules for protecting moderation staff. In the US, those rules call not only for more frequent breaks: to cushion the effects of the extreme content, moderators must also be offered psychological support. Many services additionally use filters that automatically blur graphic imagery. Because numerous other moderators are affected besides Frazier, the lawyers are seeking class-action status.


Tiktok itself does not want to comment directly on the case; the company told several US media outlets in a statement that it “does not comment on ongoing legal disputes”. However, it emphasized that it strives for a caring work environment and is expanding its offerings for moderators in order to “support them emotionally and socially”. Frazier’s employer, which is not part of the lawsuit, was clearer: the company is “proud of the valuable work the team is doing in building a positive online environment” and has “robust” programs to safeguard employees’ mental health. Employees can report problems “through various channels”, a spokesman told the Washington Post, but Frazier never did. The plaintiff and her lawyers declined to speak to the newspaper.

The lawsuit is not the first attempt to draw attention to the mental health problems faced by moderators at major social media platforms. Just last year, Facebook, since folded into the parent company Meta, was sued by former moderators and agreed in a settlement to pay each of them at least $1,000. Those with particularly severe psychological harm can claim up to $50,000. That settlement may also have played a role in the decision to file the current lawsuit.

Sources: The Verge, Washington Post
