“Even behind a nickname, even behind an avatar, everyone can be convicted,” warns Minister Jean-Noël Barrot

This week, the bill to “secure and regulate the digital space” came up for its first reading in the Senate and was adopted on Wednesday. It is intended to adapt French law to the European Union's new Digital Services Act (DSA), which will take effect on August 25 for around twenty major platforms. Several rules are set to change for digital players. Cyberbullying, disinformation, pornographic content: 20 Minutes discussed these issues with Jean-Noël Barrot, Minister Delegate for the Digital Transition and Telecommunications.

From August 25, the Digital Services Act (DSA), one of Europe's major digital initiatives, will require seventeen major platforms to follow new, stricter rules regulating online behavior. The objective, above all, is to make illegal online what is already illegal offline. Why is this text so significant, and how will it be enforced?

This regulation on digital services is a revolution that France has carried to the European level. It brings the major platforms into the era of responsibility by imposing a number of new obligations and prohibitions with which they will have to comply. Among these obligations is the duty to moderate illegal content, messages inciting hatred and counterfeit products on marketplaces. The platforms will also face prohibitions, such as a ban on advertising targeted at minors.

What will happen if the platforms do not respect these obligations?

There will be penalties of up to 6% of global turnover. In the event of repeated breaches, if the platform still does not comply after being warned a first time, it can be banned from the European Union.

At the same time, the bill you are sponsoring is being examined in the Senate this week. Its objective is to secure and regulate the digital space. What does it consist of?

This bill will first allow us to take the measures needed to adapt French law so that the DSA can apply, and thus to designate the competent authorities. The idea is to define clearly in our law what we consider to be a platform. The bill also includes a number of provisions on pornography, cyberbullying and online scams in particular.

You mention competent authorities. What are they?

For the regulation on digital services, the European Commission will be responsible for conducting investigations and imposing sanctions. To do so, it will rely on national regulators. In France, Arcom will act as the digital services coordinator, working closely with other authorities such as the National Commission for Computing and Liberties (CNIL) and the Directorate-General for Competition, Consumer Affairs and Fraud Control (DGCCRF).

A few weeks ago, Twitter decided to withdraw from the European code of practice on disinformation, created in 2018 and signed by some forty major digital companies. Yet the fight against disinformation is an important part of the DSA. Will there be discussions with Twitter to remind the platform of its obligations?

It is true that the European code against disinformation, which Twitter has decided to leave, listed a number of measures. While this code of conduct does not have the force of law, it can be seen as an appendix to the regulation on digital services, a manual for combating disinformation. What the new regulation says is that the major platforms must analyze the risks they pose to public debate and the quality of information. Very large platforms are therefore expected to take active measures against disinformation. If Twitter does not cooperate, it will face fines. But it is true that its decision to leave the code is not a reassuring signal.

The text also tackles cyberbullying, which remains all too present on the platforms. What are you going to put in place on this front?

It is imperative to fight against the scourge of cyberbullying. Within the government, we have an end-to-end approach that runs from awareness to sanction. To raise awareness, I wanted all sixth-grade students to receive a digital passport from the start of the next school year, which will teach them about the risks online and what to do when they are victims or witnesses of cyberbullying, in particular how to report content.

And how, today, can a cyberbully be judged as fairly as possible while limiting the risk of reoffending?

As for the sanction, I propose in the text that those found guilty of cyberbullying can be banned from social networks for six months, a period that can be extended to one year for repeat offenders. Why? Because cyberbullying is sometimes the work of individuals who behave like pack leaders, singling out victims for the vindictiveness of their communities and unleashing raids of hatred and violence on them. With this banishment measure, we will take away these pack leaders' sounding board by depriving them of their notoriety. Like stadium bans for hooligans, it will help prevent reoffending.

What about the lifting of anonymity, an always complex question for platforms?

I say it clearly: there is no anonymity online. Anonymity should not be confused with pseudonymity. Hiding behind a nickname or an avatar does not allow anyone to evade justice. Even behind a nickname, even behind an avatar, everyone can be sentenced to heavy fines and prison terms when they commit offences online, that is, when they spread hatred and violence online.

To fight against pornographic content, of which young children are also victims, the new rules provide for the creation of a “digital certificate” proving that the user is of age. Have such solutions already emerged?

Today in France, more than two million children are exposed to pornographic content every month. The idea is to be able to verify users' ages. Platforms can use either a zero-euro payment by bank card or an age estimate based on a photo. These solutions exist, so porn sites in particular have no excuse not to use them. Age verification tools must be both highly reliable and highly protective of privacy and personal data.

But concretely, how will it work?

Some solutions are being tested technically and take the form of an application that requests anonymous proof of majority from a digital identity provider. The adult site will not know the user's identity, and the provider of the anonymous proof of majority will not know where it was used. This principle of double anonymity is what we want to achieve, so that age verification is effective while still respecting privacy.

The DSA also aims to make content moderation more transparent. What will this transparency look like in the long term?

Platforms will be required to have their algorithms audited by independent third parties. In the context we have just experienced, and given the systemic risk that the virality of certain platforms can pose to public safety, these audits will shed light on how certain algorithms work and the significant harm they can cause. In addition, platforms will have to share their data with researchers. Sharing this data with the academic community will make it possible to see more clearly the presumed but insufficiently documented effects of social networks on the health of users, and of children in particular.
