Stricter rules for Internet companies: What the new EU digital law says


As of: 04/23/2022, 4:02 p.m.

With a new law, the EU wants to oblige Internet companies to take faster and better action against hate speech, disinformation and counterfeit products. What that means in concrete terms and for whom it applies – an overview.

After a final marathon round of negotiations, negotiators from the EU states and the European Parliament have agreed on a law on digital services (Digital Services Act, DSA for short). It is intended to ensure stricter supervision of online platforms and more consumer protection. EU Commission President Ursula von der Leyen called the agreement historic. What exactly the law says and what happens next – an overview of the most important questions and regulations.

Who should the law apply to?

In principle, the new rules apply to digital services that act as intermediaries and give consumers access to goods and content, for example. However, the EU wants to regulate the very large online companies in particular more closely. Platforms and search engines with more than 45 million users are considered very large. Around 20 companies potentially fall into this category, including Google with its subsidiary YouTube, Meta with Facebook and Instagram, Microsoft with its social network LinkedIn, Amazon, Apple and Twitter. These large services have to follow more rules than small ones; there are exceptions for companies with fewer than 45 million monthly active users.

What does the law specifically say?

The goal is binding rules for the internet based on the fundamental principle: what is illegal offline should also be illegal online. This applies, for example, to hate speech and terror propaganda, but also to counterfeit products sold on online marketplaces. For the platforms, this means taking more responsibility for what happens on their services.

Rules against hate and incitement online

The basic rule is that companies must promptly remove illegal content such as hate speech, calls for violence or terror propaganda once it is reported to them. The guideline is 24 hours. Users should be able to report such content easily. They should also have the opportunity to contest the platforms' deletion decisions and to demand compensation. Online platforms should also block users who frequently distribute illegal content such as hate speech or fraudulent ads. This applies to a wide range of platform providers, not just the biggest ones like Instagram, Facebook and YouTube.

At the same time, the law is intended to ensure that freedom of expression remains protected as a matter of principle. A distinction is therefore to be made between illegal content and content that is harmful but covered by freedom of expression.

Ads, algorithms and fakes

In future, large platforms must give their users more influence over which advertisements are displayed to them. Sensitive data such as religious beliefs, sexual orientation or political views may only be used to a limited extent for targeted advertising. In principle, minors should no longer be shown personalized advertising.

Social networks must make their recommendation algorithms more transparent and give users choices. They should publish the most important parameters in their general terms and conditions and explain how users can change them. The mostly secret recommendation algorithms have long drawn criticism: former Facebook employee Frances Haugen, for example, accused Facebook of deliberately using algorithms that promote polarizing content for profit.

So-called “dark patterns”, i.e. manipulative design practices, are to be banned. Some companies use them to push consumers toward a purchase decision. Misleading user interfaces – for example in cookie consent dialogs – are also largely prohibited. Marketplaces will also be obliged to vet their sellers so that fewer counterfeit products end up online.

Crisis Mechanism on War, Pandemic and Terror

Also new is a crisis mechanism, which the EU Commission proposed late in the negotiations because of the Russian attack on Ukraine. It is intended to limit the effects of online manipulation in cases such as war, pandemics or terrorism. The EU Commission can trigger the mechanism on the recommendation of the panel of national DSA coordinators and then decide on measures the very large services must take. Online platforms could, for example, be required to provide information to supervisory authorities and experts.

What happens in case of violations?

In the event of violations, companies face fines of up to six percent of their worldwide annual turnover. According to the Reuters news agency, based on sales of just under 118 billion dollars last year, that could amount to up to seven billion dollars. In the event of repeated violations, companies could even be banned from doing business in the EU. In addition, periodic penalty payments of up to five percent of daily turnover can be imposed to end an ongoing violation of the DSA.

How will the EU enforce its rules?

The very large digital groups must grant the EU Commission access to their data so that it can monitor compliance with the rules. For smaller internet companies, a competent authority with investigative and sanctioning powers in the EU country where the company has its headquarters will monitor compliance. For Germany, it has not yet been decided which authority will take on this role; the Federal Network Agency and the state media authorities are possibilities.

How do the rules affect the German NetzDG?

Years ago, Germany pushed ahead with the Network Enforcement Act (NetzDG) to combat crime and hate speech on the internet. The DSA is now set to supersede it – even though the EU law lags behind the German law when it comes to deletion deadlines. Overall, however, the DSA has a much broader scope. The responsible Federal Ministry of Transport has meanwhile announced that a German digital services act is to be drawn up and that existing national laws will have to be extensively revised.

What happens now?

The European Parliament and the EU states still have to formally confirm the agreement. After it enters into force, a transitional period of 15 months is planned. According to the EU Commission, the rules are to apply to the very large platforms and search engines as early as four months after they are designated as such.

Alexander Göbel, HR Brussels, 23.4.2022 2:46 p.m
