EU's Big Tech content regulation concentrates enforcement power in Brussels
15 December 2020 00:00 by Matthew Newman
Proposed EU rules to force Big Tech companies to remove illegal content will establish a complex enforcement structure that may end up concentrating power in the hands of the European Commission, rather than delegating it to national authorities.
The Digital Services Act, unveiled today by commissioners Margrethe Vestager and Thierry Breton, marks a bold step to crack down on platforms that have been accused of being “too big to care.”
It comes as authorities in the US embark on lawsuits and investigations that could lead to the breakup of Facebook and force Google to change its advertising model.
The DSA will impose tough new obligations on Internet giants such as Google’s YouTube, Facebook and Twitter to ensure that their platforms are free of illegal content such as hate speech, terrorist messages and images, counterfeit goods, and pirated movies and TV shows.
But what's defined as illegal content differs between EU countries, creating a tricky question around enforcement. The proposal envisages a system whereby platforms can be regulated where their users are, not where they are based — perhaps with an eye to avoiding a repeat of the GDPR’s chaotic “one-stop shop,” which puts a huge burden on host-country regulators.
In the event of disputes, however, enforcement powers fall to the commission. The likely complexity of major cross-border cases suggests that the EU executive will have a major role to play. Enforcement against illegal content may therefore end up looking more like antitrust than like the GDPR.
If national authorities don’t do what’s needed, “then eventually, the commission can take over and actually do it,” Vestager said today.
— Country of origin —
The new enforcement process marks a departure from the “country of origin” rules in the EU’s 2000 e-commerce directive. Under those rules, companies that qualify as an “information society service” because they provide services at a distance must follow the rules of the country where they are based.
This means that the countries where the services are provided, the so-called countries of destination, must refrain from applying their own regulations.
These rules have caused friction with local authorities, particularly in the modern sharing economy. In a landmark decision last year, the EU Court of Justice said Airbnb was an “information society service” and therefore not subject to real-estate rules in France. Uber, however, was found in 2017 to be subject to local rules for taxi services in Spain.
This has left open a nagging question: When there’s a problem in another country, such as a violation of national hate-speech laws on a social-media platform, how can a local authority get the host country to do anything about it?
— Digital Services Coordinator —
The draft DSA spells out new duties for EU governments. They must name a “Digital Services Coordinator” to serve as a single point of contact for the commission and to sit on a new European Board for Digital Services, an advisory group for the EU executive.
The Digital Services Coordinator investigates and enforces the DSA in the country where a company is based. Importantly, a national coordinator can request that its counterpart in the company’s “home” country take action against it. If the home-country authority fails to remedy the violation, or if there is disagreement over what action should be taken, the matter can be referred to the commission.
This system also provides for oversight by the European Board for Digital Services. If the board finds that there’s a violation affecting at least three EU countries, it can recommend that the relevant Digital Services Coordinator take action. The commission can also request that the national authority act.
— Commission powers —
In reality, the commission may have a central role in major enforcement cases, particularly when dealing with “very large platforms,” defined as those with at least 45 million users in the EU, about 10 percent of the bloc’s population.
The regulation gives the commission, with the digital, industry and competition departments working together, more power to step in where national authorities may be out of their depth, understaffed or too under-resourced to mount a big fight against a platform the size of Google or Facebook.
“We may have situations where there may be discussions between the two authorities — the host state authority may think that the platform isn't complying with the rules — and that is where the coordination structure kicks in,” said an EU official. “We also have a system whereby the supervision and enforcement of those rules, in case of problems, will also be moving towards the commission in the context of the cooperation structure.”
While smaller companies may well face action from national authorities, the biggest content-moderation concerns center on the largest platforms, so the commission is likely to be in the driver’s seat. That should speed up investigations in EU-wide cases where the biggest platforms pose “systemic” risks.