New EU law: Which rules now apply to Facebook & Co



Status: 08/25/2023, 10:10 a.m.

Facebook, X, TikTok, Google and many other tech companies have a lot of work ahead of them in the EU. From now on, they must closely monitor what happens on their platforms or face steep fines. An overview.

As of today, a new European Union (EU) law requires social networks to crack down on illegal content on their platforms. Otherwise, Facebook, Twitter’s successor X or Google face hefty fines.

What is it all about?

Last year, the EU passed a law on digital services. It is intended to ensure that platforms and search engines remove illegal content from their sites more quickly than before. At the same time, it will become easier for users to report such content.

Which companies are affected?

Initially, the rules affect very large platforms and search engines with more than 45 million active users per month. Stricter requirements apply to them than to smaller companies, because from the EU's point of view they pose a particularly great risk to society.

In April, the European Union designated 19 services as “very large online platforms” and “very large online search engines” – including Twitter’s successor X, Facebook, Instagram, TikTok and several Google services, as well as Zalando, Wikipedia, Booking.com, the Amazon Marketplace and the Apple App Store.

What exactly is changing?

The companies had four months to implement the EU requirements and adapt to the new rules. In concrete terms, that means: terms and conditions must from now on be worded so that even a child can understand them, an EU official explains. Online marketplaces such as Amazon or Alibaba’s AliExpress must, for example, do their best to remove listings for counterfeit clothing or dangerous toys and warn buyers accordingly.

Platforms and search engines not only have to delete illegal posts faster than before; they will also have to report in detail to the EU Commission on the risks their services pose to Europe’s citizens. Snapchat or YouTube, for example, must examine whether their services promote cyber violence, undermine freedom of expression, or whether their algorithms have a negative impact on mental health. The companies must then take appropriate countermeasures.

What about advertising?

Targeted ads are now banned if they are based on sensitive data such as religion or political beliefs. Personal data of children and young people may no longer be collected for advertising purposes. In addition, platforms will no longer be able to keep their inner workings as secret as before: in the future, they must disclose more information about how they operate.

How will consumers notice the measures?

According to an EU official, many of the changes will not be immediately visible to consumers but will take place in the background. However, their long-term effect should not be underestimated.

What are the corporations saying?

Meta, with its flagship services Facebook and Instagram, has assembled a team of 1,000 employees to work on the Digital Services Act (DSA) alone. Google has promised more transparency – including in its policies and with additional information on how individual target groups are addressed in advertisements. There will also be new tools giving researchers access to data.

TikTok announced a few weeks ago that it would introduce an alternative, less personalized algorithm for users in the EU and would provide more transparency with regard to advertisements on the platform.

But not all tech giants are willing to simply accept the rules. Amazon and Zalando have already filed lawsuits. They consider themselves wrongly classified as “very large online platforms” and argue that, as retailers, the rules should not apply to them. Other lawsuits could follow.

What happens next?

If the companies do not comply with the requirements, they face fines of up to six percent of their global annual turnover. The responsible EU Commissioner, Thierry Breton, emphasized: “Compliance with the DSA is not a penalty – it is a way for platforms to strengthen their trustworthiness.” From February 2024, the rules will also apply to smaller digital companies.
