Digital Services
Transparency database: Online networks delete millions of files

On TikTok, almost 508 million posts have been reported and more than 348 million posts have been deleted since the reporting requirement was introduced.

© Hasan Mrad/IMAGESLIVE via ZUMA Press Wire/dpa

A new EU database shows what content Facebook, Instagram and TikTok delete and restrict. But one platform stands out for its strikingly low figures.

Child pornography, hate speech, terrorist propaganda: Amazon, Facebook, YouTube, Instagram, Pinterest, TikTok and X (formerly Twitter) have deleted or restricted more than 960 million such or other questionable pieces of content in the past six months. This emerges from an EU database set up by the EU Commission.

The background is the EU's Digital Services Act (DSA), under which large online platforms and search engines must delete such content more quickly and make the reasons for doing so transparent.

Specifically, the platforms record in the database which content they have deleted or whose visibility they have restricted, and for what reason. The data can be filtered and evaluated by categories such as hate speech, violence or pornography. In total, the database contained more than 16 billion entries from 16 major platforms in April. In addition to the large social networks, providers such as Zalando, Booking.com and various Google services must also report deleted or restricted content.
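
Such an evaluation can in principle be reproduced programmatically from the database's public data. The following is a minimal sketch in Python, assuming a locally downloaded extract in CSV form; the file name "statements.csv" and the column names platform_name and category are illustrative assumptions, not the database's confirmed export schema:

    import pandas as pd

    # Load a locally downloaded extract of the DSA transparency database.
    # The file name and column names here are assumptions for illustration;
    # the actual export format and field names may differ.
    df = pd.read_csv("statements.csv")

    # Count moderation decisions per platform...
    per_platform = df.groupby("platform_name").size().sort_values(ascending=False)

    # ...and per reported category, e.g. hate speech, violence or pornography.
    per_category = df.groupby("category").size().sort_values(ascending=False)

    print(per_platform.head(10))
    print(per_category.head(10))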

Platforms report very differently

This shows that the reporting behavior of the individual platforms varies greatly. With more than 14 billion reports, Google Shopping accounts for by far the largest share: almost 94 percent of all posts reported since the end of September. The search engine reports very comprehensively, says communications researcher Jakob Ohme of the Weizenbaum Institute, who conducts extensive research on disinformation. "The Commission has already said that it wants to correct this overhang."

But there are other differences too. On the social media platform TikTok, almost 508 million posts have been reported and more than 348 million deleted since the reporting requirement was introduced; the deletions amount to almost 70 percent of all content TikTok reported. Instagram reported around 19 million posts in the first six months, of which a little more than 6.6 million, almost 35 percent, were completely deleted. According to the database, just over 832,000 pieces of content were reported on the online platform X during the same period. As of the beginning of April, only 24 posts had been deleted.
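
The quoted shares follow directly from the reported figures. A quick check, using the article's rounded numbers:

    # Deletion rates implied by the figures above (posts in millions,
    # except X, where the approximate absolute counts are used).
    tiktok = 348 / 508        # ~0.685, i.e. almost 70 percent
    instagram = 6.6 / 19      # ~0.347, i.e. almost 35 percent
    x = 24 / 832_000          # a vanishingly small share

    print(f"TikTok:    {tiktok:.1%}")     # 68.5%
    print(f"Instagram: {instagram:.1%}")  # 34.7%
    print(f"X:         {x:.4%}")          # 0.0029%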

Flexibility in reporting for platforms

For Ohme, there are several reasons for the networks' differing reporting behavior: "Firstly, platforms are active in content moderation to different degrees. Secondly, platforms take the reporting obligation more or less seriously." TikTok, by contrast, is trying to "cut a good figure".

Digitization expert Julia Kloiber, who works for the tech think tank Superrr Lab, also sees a need for improvement: “There is currently a lot of leeway in terms of scope, accuracy and the interpretation of the specifications.” This leeway makes comparisons between services more difficult. In addition, the data collection is only based on self-reports from the platforms. “Companies can claim a lot in their transparency reports. It is important that there is also a thorough check of the information.”

EU Commission investigates compliance

According to the EU Commission, such checks exist. It is the platforms' responsibility to report all of their content moderation decisions without delay. If a violation is suspected, the Commission can request access to data in order to check compliance with the transparency rules, the Brussels authority said when asked. Fluctuations in the volume of reported content are due to the platforms' differing strategies and differing content.

Investigations are in fact already underway under the DSA into whether X and TikTok, among others, comply with its rules and do enough to prevent the spread of illegal content. However, no decision has been reached yet, and therefore no penalties have been imposed. If the Commission ultimately concludes that providers are violating the DSA, it can impose fines of up to six percent of annual global turnover.

Important decisions for democracy and society

Overall, Kloiber rates the database as an "important milestone" that "brings light into the black box of online services". "We have seen in recent years what impact content moderation has on our democracy and society. In the run-up to elections, it must be ensured that disinformation and reported content are moderated well and quickly," Kloiber demands. The European Parliament warned last year that attempts to exert influence via online platforms also pose a danger in the upcoming European elections from June 6 to 9, 2024.

Kloiber hopes that "through transparency, competition will begin between the platforms as to which one has the highest standards and the best quality in moderating content." Ohme hopes "that in a year there will be more precise and better enforced data that will allow further analyses and statements."

It is clear that the amount of data will continue to grow. Since February 17, all providers of online platforms, search engines and online services, with the exception of small businesses, have been required to provide data on their content moderation decisions. This data is not yet flowing into the transparency database. According to the EU Commission's plans, however, it will also be made available on the DSA transparency database website once the technical requirements have been met, a Commission spokesman said.

dpa
