The EU states and the European Parliament have agreed on a new digital law. In addition to regulating Internet companies, the protection of users is also a priority. An overview.
After a 16-hour negotiating marathon, negotiators from the EU states and the European Parliament have agreed on a law on digital services (Digital Services Act, DSA). It aims to tackle social problems on the Internet, such as hatred and incitement, defamation and misinformation, which have increased dramatically in recent years.
The law also covers the sale of counterfeit goods and the use of sensitive data, such as religious beliefs and political views, for targeted advertising. Our article answers the most important questions about the legislation and provides an initial overview of the changes:
What is the Digital Services Act?
The Digital Services Act is the first part of a digital package that the EU Commission proposed at the end of 2020. The goal was uniform and binding rules for the Internet. EU Commission Vice President Margrethe Vestager compared the situation to the first traffic light, which brought order to the streets.
The law follows a fundamental principle: “What is illegal offline will also be illegal online in the EU,” wrote EU Commission President Ursula von der Leyen on Twitter after the deal.
This applies, for example, to hate speech and terror propaganda, but also to counterfeit products sold on online marketplaces. The platforms are to take more responsibility for the content distributed on them.
The second part of the digital package was the Digital Markets Act (DMA), which was agreed at the end of March. The DMA is intended to restrict the market power of leading technology groups such as Google and Facebook with stricter rules and counteract even greater monopolization.
What new rules will be introduced and who will be affected?
The new rules are intended to apply to digital services that act as intermediaries and give consumers access to goods and content, for example. These include online marketplaces such as Amazon, social media platforms such as Facebook, content-sharing platforms such as YouTube, and search engines such as Google.
In principle, the law stipulates that large services must follow more rules than small ones. With this distinction, the EU wants to ensure that the largest US Internet companies in particular are covered and that significantly smaller companies are not burdened to the same extent. There will therefore be exceptions for smaller services with fewer than 45 million monthly active users.
As a general rule, companies must remove illegal content such as hate speech as soon as they are informed of it; a guideline is 24 hours. However, critics fear that this will lead to too much being deleted rather than too little, and see it as a form of censorship.
A distinction is to be made between illegal content and content that is harmful but covered by freedom of expression. Examples include lies about the effectiveness of vaccines that endanger human health, or false claims about eating disorders that drive young women into anorexia.
Marketplaces are obliged to vet sellers so that fewer counterfeit products end up on the Internet. Manipulative “dark patterns” that push consumers toward a purchase decision are also prohibited. Sensitive data such as religious beliefs, sexual preferences or political views may no longer be used for targeted advertising on the Internet.
In principle, minors should no longer receive personalized advertising. Social networks must make their recommendation algorithms more transparent and give users choices. Violations can be punished with fines of up to six percent of global annual turnover.
Also new is a crisis mechanism that the EU Commission subsequently proposed in response to Russia’s war against Ukraine. It is intended to limit the effects of online manipulation in cases such as war, pandemics or terrorist attacks. The EU Commission can trigger the mechanism on the recommendation of the panel of national DSA coordinators and then decide on proportionate and effective measures to be taken by the very large services.
What special regulations are there for particularly large providers?
Platforms and search engines with more than 45 million users are considered particularly large. With regard to harmful content, they will in future have to submit a risk assessment once a year and propose countermeasures. These reports are reviewed by the EU Commission and independent external parties.
In addition, researchers are to be given access to the data that determine, for example, what users see next in their newsfeed. “This will affect attention-based rankings that fill companies’ pockets with disinformation, hatred and incitement,” said Green MEP Alexandra Geese after the agreement. For the first time, this creates independent public oversight of the platforms.
What are the reactions to the agreement?
The reactions are mostly positive. Geese sees the DSA as the “beginning of a digital spring”. “It will be the new basic law for the Internet,” said the Green politician. Martin Schirdewan of Die Linke said the DSA would clean up the Internet. “This is a good start towards more digital democracy, even if there is still a long way to go.”
Not all reactions are so euphoric, however. Patrick Breyer of the Pirate Party expressed his disappointment: “The new set of rules as a whole does not deserve the designation ‘Digital Basic Law’, because the disappointing deal often fails to protect our fundamental rights on the Internet.” According to him, there remains “no alternative to the attention-grabbing corporate algorithms that expose us to hate, violence and false information in the interest of profit”.
How do the rules affect the German NetzDG?
To the displeasure of the EU Commission, Germany pushed ahead years ago with the Network Enforcement Act (NetzDG) to combat criminal offenses and hate speech on the Internet. The NetzDG is likely to become obsolete as a result of the DSA, even if the EU law is less strict than the German law when it comes to deletion deadlines. Overall, however, the DSA has a much broader scope.
What happens next: the European Parliament and the EU states still have to formally confirm Saturday’s deal. According to the EU Commission, the rules will apply 15 months after entry into force or from January 1, 2024, whichever is later. For the very large platforms and search engines, the rules are to apply four months after they have been designated.