DSA: What are the new rules?

At the end of April 2022, European institutions finalized their agreement on a new regulation that will replace the 2000 E-Commerce Directive.

Once it enters into force, probably in early 2023, the Digital Services Act will apply to online marketplaces, social networks, app stores, price comparison sites and search engines, including those not established in the European Union but with a significant number of European users.

What does it bring?

The principle of “light” liability

The DSA reaffirms that digital intermediaries have no general obligation to monitor content. They are not liable if they did not originate the illegal content and if they remove it promptly once they become aware of it. There is nothing new here, but the principle remains essential to guaranteeing freedom of expression and enterprise on the Internet, and the DSA confirms it: this exemption from liability applies even when intermediaries carry out voluntary searches to detect illegal content in advance.

Two key questions are not answered in the new text:

  • The identification of “illegal content”. There is still no uniform definition of this notion across Europe.
  • The deadlines for removing illegal content. It is the national laws and case law of each Member State that will have to settle this fundamental issue.

The real progress of the DSA

Where the DSA really innovates is in imposing commitments on intermediaries. Real preventive action is expected of them. To that end, the DSA distinguishes four increasingly broad circles of obligations, tailored to their activity and size: some apply to all intermediaries, some only to hosts, some only to platforms, and the last set only to very large platforms (more than 45 million monthly active users in the EU, i.e. 10% of its current population).

All intermediaries must publish an annual report on the notifications of illegal content they receive and on their moderation activities.

Hosts will additionally have to facilitate the notification of illegal content and publish their removal decisions in a public database.

Platforms will also have to suspend the accounts of users who frequently post illegal content or who frequently submit unfounded notifications.

They will have to publish an anti-abuse policy and prioritize notifications from “trusted flaggers”. Platforms will also have to collect more information from the business users trading on them about their identity and their products. Advertising transparency must be strengthened: in particular, Internet users must be able to identify the advertiser behind any given advertisement.

Very large platforms will have to conduct an annual analysis of the systemic risks arising from their use (“fake news”, fake profiles, etc.) and mitigate those risks by adapting their moderation policies.

They will also undergo an independent audit once a year, with the auditor given effective access to the relevant data. Their recommender systems, which often favor divisive content, will have to be explained, and users must be able to modify their parameters. In the event of non-compliance with these obligations, very large platforms may be sanctioned by the European authorities with a fine of up to 6% of their turnover.

This escalation and supervision of the new obligations imposed on intermediaries is certainly where the DSA is at its most innovative.


Article by

Anne Cousin, contributor. Anne Cousin has been a partner at Herald since 2010. She practices new technologies, Internet and media law. She assists her clients, in both counsel and litigation, in cases…

This text is published under the responsibility of its author. Its content in no way commits the editorial team of Les Echos Solutions.
