How TikTok tries to make its platform safer for teens

Violent content, child sexual abuse videos, cyberstalking, addiction … Criticized for their lack of safety, the platforms are stepping up protective measures for their youngest users. TikTok, the world’s most downloaded app in 2020, has made this a priority. The social network, which built its success on young people’s enthusiasm for its short video format, has just announced “important new measures” aimed at strengthening the safety and well-being of adolescents on its platform.

These are additional restrictions intended to better protect teenagers against the dangers of social media. “These new features aim to limit minors’ exposure to inappropriate content and to make them responsible for the way they use our application. We want our young audience to feel confident on our platform. That is why we think it is important to be more proactive in this area,” Julie de Bailliencourt, Head of Product Policy at TikTok, in charge of content safety and moderation, explains to 20 Minutes.

“Direct messaging disabled by default for children under 17”

While TikTok is keen to point out that it already offers “many safety and privacy settings”, the Chinese social network has decided to step things up this back-to-school season. “Direct messaging will be disabled by default for those under the age of 17. The youngest users, aged 13 to 15, will only be able to share their videos with their followers or friends, as the ‘Everyone’ option is now deactivated. It will also no longer be possible for these teenagers to use the ‘Duet’ and ‘Stitch’ functions [these options, which let users take an existing video and film another one alongside it, are often used to denigrate other users’ videos]. In addition, 13-15 year olds will no longer receive notifications after 9 pm, nor 16-17 year olds after 10 pm, to promote more restful nights,” explains Julie de Bailliencourt.

These announcements are insufficient for certain child protection associations, which consider in particular that it is still too easy for a user to lie about their age when registering on the platform, which, remember, is prohibited to those under 13. “We still have work to do, and certainly a lot to improve. We collect feedback from many experts and study groups in order to improve. But we try as much as possible to be proactive, in particular by seeking out abusive content with our artificial intelligence tools [machine learning],” adds the head of content safety, who points out that in the first quarter of 2021, nearly 70 million inappropriate videos were deleted, “90% of them removed before any report was even filed”, and that TikTok has deleted more than 7.2 million accounts suspected of belonging to children under 13.

Beheading videos, sexually explicit content …

These measures come in a context of very strong criticism of TikTok and its content moderation policy with regard to minors. Last June, a video showing a person being beheaded circulated widely among teenagers on the platform. The Dijon school district (Côte-d’Or) had to alert all of its schools and parents’ associations to the potential exposure of middle school and high school students to these images. The consumer protection association UFC-Que Choisir and its European counterparts filed a complaint at the start of the year with the European Commission against TikTok, which they accuse of not sufficiently protecting minors against “potentially dangerous content”, in particular “sexually explicit videos offered to users after a few minutes of viewing”.

“Today’s announcements by TikTok, and the new features being put in place, come after the tragedy in Italy [a 10-year-old girl died of asphyxiation after taking part in a challenge on the social network] and the blocking of the platform in that country. That was a strong signal sent to the Chinese giant, showing that the authorities can take quick decisions without waiting for its reaction,” Raphaël Bartlomé, head of the legal department at UFC-Que Choisir and in charge of the complaint against TikTok, tells 20 Minutes. “The measures announced are a start, but I doubt they will be enough. We are going to look very closely at how they translate into practice on a daily basis, and remain vigilant about their moderation policy. The priority today is to protect the youngest, by all means.”

TikTok is not the only platform criticized for not protecting children enough. To avoid scandals that could tarnish their reputation and deprive them of advertising revenue, Google and Facebook regularly update their policies regarding adolescents. YouTube (Google) announced measures similar to TikTok’s on August 10, setting videos uploaded by 13-17 year olds to “private” mode by default. Instagram (Facebook) took the same kind of step, announcing at the end of July that accounts of users under 16 would be set to private by default; these users will also no longer receive “push” notifications after 9 pm.
