Digital Services Act debate risks flashpoint over EU 'power grab'
11 January 2021 08:00 by Matthew Newman
Proposals for EU powers to fine “very large” platforms up to 6 percent of their annual revenue for violating rules on hate speech and the sale of illegal goods will spark a debate on whether the plans represent a “power grab” by the European Commission.
The Digital Services Act, unveiled last month, will impose tough new obligations on Internet giants such as Google’s YouTube, Facebook and Twitter to ensure that their platforms are free of illegal content such as hate speech, terror-related propaganda and images, as well as counterfeit goods and pirated movies and TV shows.
While the rules apply to all platforms, there are specific obligations for “very large” online platforms, because of their “systemic nature,” the risks they pose to users' fundamental rights — such as freedom of expression and access to information — and their role in a democratic society. These specific rules target platforms that reach more than 10 percent of the EU’s 450 million citizens.
While smaller platforms will be regulated by bespoke national bodies, the commission has set out new investigative and fining powers for itself over the biggest platforms, ensuring a consistent regulatory approach at the top end of the market.
These new powers have raised concerns that the commission, a political body in which decisions are made by a “college” of 27 commissioners named by EU governments, lacks the independence and impartiality to regulate complex technical, societal and legal issues.
Aurelien Portuese, director of Antitrust and Innovation Policy at the Information Technology & Innovation Foundation, says the DSA and the Digital Markets Act — a second initiative to clamp down on the power of tech “gatekeepers” — show a worrying “power grab” by the commission. He’s concerned about a diminished role for national courts that he argues should be at the heart of an evidence-based approach to regulating Big Tech.
Before becoming law, both acts must be approved by the European Parliament and EU governments. Discussions on the DSA and DMA started among member states only last week, and a parliament committee will exchange views on Monday, but it’s already clear that the commission’s proposed enforcement powers will be a central part of the debate on the two measures, EU diplomats and parliament officials told MLex.
Under the commission’s draft text, EU governments will have the primary role in enforcing the DSA: Each will be required to establish an authority called a Digital Services Coordinator, which “shall act with complete independence.”
“They shall remain free from any external influence, whether direct or indirect, and shall neither seek nor take instructions from any other public authority or any private party,” according to the text. These requirements uphold companies’ fundamental right to lodge court appeals.
These independent national authorities will be supported by a new European Board for Digital Services, to be made up of the Digital Services Coordinators and that will coordinate with the commission and specialized national agencies that deal with consumer protection and data privacy.
Under the DSA, very large platforms could face steep fines if they aren’t transparent about how they remove illegal content, or they fail to put in place measures to stop the spread of fake news, uploads of illegal hate speech or terrorist images and the sale of counterfeit goods. They could also face penalties for restricting the freedom of speech.
While the commission could argue that the DSA establishes a decentralized model in which Digital Services Coordinators have enforcement powers and coordinate with the commission, this role doesn’t necessarily apply when it comes to very large platforms.
For the biggest platforms, the DSA has carved out a special role for the commission, whereby the EU executive will have direct supervisory powers and the power to impose fines of up to 6 percent of a company’s global sales.
Moreover, its powers extend from conducting dawn raids and investigations to obliging companies to hand over data, including algorithms, and imposing binding commitments.
Privacy advocates and academics have welcomed the DSA’s ambitious goals to rein in the power of Big Tech, but they are questioning commission enforcement powers that they see as outsized. Their main criticism is that the commission is not an independent authority.
Tech companies have long complained about the potential conflicts of interest inherent to the Brussels-based executive. The commission is not only a supranational body that sets political goals and proposes EU-wide legislation, but it also acts as a competition enforcer, having levied billions of euros in fines on the likes of Microsoft, Intel and Google over the past two decades.
Granting additional powers to the commission to regulate large platforms’ management of illegal content, transparency and freedom of expression issues is a step too far for some critics.
“These powers should normally only be wielded by an independent regulator,” Ben Wagner, an assistant professor at Delft University of Technology, told an online event* last month. “There is a need for these powers to be controlled by an independent regulator.”
Wagner also criticized the weak role of the European Board for Digital Services, which will only have an advisory role. When the commission wants to crack down on large platforms, its only duty is to inform the EBDS of its plan.
The commission’s enforcement system is thus more closely in line with its powers as a competition authority, under which it can fine companies that abuse their market power or fix prices up to 10 percent of their global revenue.
The new EBDS doesn’t resemble other pan-European bodies, such as the European Data Protection Board, an advisory organization established under the General Data Protection Regulation and formed with existing national data protection authorities.
In contrast, the DSA gives EU governments the flexibility to decide on designating an existing agency or authority as a Digital Services Coordinator or whether a new authority should be created.
This loosely defined system will likely further weaken the board as government departments fight over which entity will be granted new enforcement powers. This squabbling will likely lead to delays in enforcing the rules and increase reliance on the commission, particularly for large platforms.
Two levels of enforcement?
Centralized enforcement for large platforms may be a direct reaction to problems that arose with the “one-stop-shop” system created by the GDPR, whereby a company faces enforcement of privacy rules in the country where it has its “main establishment” in the EU.
Most US-based Big Tech companies — including Google, Twitter, Facebook and Apple — have chosen Ireland. This has prompted criticism that the Irish Data Protection Commission has been overloaded with complaints since the GDPR took effect in May 2018 and takes too long to reach decisions.
There’s also a question of how aggressive Digital Services Coordinators will be in regulating platforms. With competition-style powers over Big Tech under the DSA, the commission would be more efficient and have more resources than some national regulators.
“There are some problems that need to be solved at the EU level, because any solution at the national level might first be insufficient, and, second, we will have a clash between different member states that don't agree on what is the right way to proceed,” Irene Roche Laguna, a commission official, said at the same online event last month.
Privacy advocates, academics and European Parliament members are wondering whether a separate, independent EU agency would be more appropriate for tackling sensitive issues such as hate speech and illegal content, where independent regulators are the norm at national level.
It’s early days in the review of the DSA, but enforcement powers are certain to attract lawmakers’ and policymakers’ attention. Whether the end result will be a hybrid system between the commission and national regulators, an EU agency, or a fully decentralized system, enforcement powers will be keenly watched.
* Assessing the Digital Services Act: evolution or revolution of platform governance? AWO Agency, Dec. 17, 2020.