Platforms in line for more joined-up UK regulatory approach on harmful content
20 March 2020
Facebook, Twitter and other online platforms will find it hard to escape growing UK regulatory scrutiny of how well they curb harmful and illegal content, as the country's various watchdogs begin to present a united front to plug the gaps in regulating Big Tech.
Platforms are facing political pressure to more vigorously police Internet harms such as inciting violence and terrorism, encouraging suicide, cyberbullying and child sexual abuse, in the wake of several high-profile tragedies involving children and other vulnerable people.
Voluntary action is seen as ineffective, and the government is preparing legislation to require companies that allow the sharing of user-generated content to remove illegal material quickly and to minimize the risk of it appearing in the first place.
Telecom and media regulator Ofcom this week said that it would need to formalize its relationship with national enforcers in overlapping areas, such as the Information Commissioner's Office, which polices how companies handle personal data.
That follows a government announcement last month that Ofcom is in line to take on the additional task of monitoring social platforms to prevent the spread of harmful content.
While companies' obligations and Ofcom's regulatory powers still need to be fleshed out, the message to Facebook and others is clear: there is no place to hide.
Regulators united
They will particularly need to heed moves by different industry regulators to strengthen their cooperation on tackling online harms. Enforcers with an interest in the digital space include Ofcom, the ICO, the Competition and Markets Authority, the Advertising Standards Authority and the Electoral Commission.
Ofcom and the ICO have in recent years worked to build a strong relationship on areas of interest, such as online harms and the ICO's guidelines on children's privacy. Nowadays, their chiefs meet regularly. Ofcom said this week it had already held talks with the ICO on cementing their relationship further in light of its planned new role as the online-harms regulator.
Specifically, Ofcom is interested in working with fellow regulators on “horizon scanning” — scenario planning for the very long term. Tony Close, director of content standards, licensing and enforcement at Ofcom, told a parliamentary hearing on Monday that the focus would be "on making sure that we are able to identify gaps before they emerge, or gaps caused potentially by the matrix of regulatory relationships".
Elizabeth Denham, the UK Information Commissioner, echoed this view at the same parliamentary hearing.
"The regulators cannot work in silos," she said. "We know that content and conduct regulation — the subject of the online-harms white paper — needs new regulation. But that new regulation needs to work really closely with us, because the personalization of the data delivers the content."
Denham supports the creation of a board that sits above the regulators and oversees the digital economy. That would mean that "all of us can land at the doorstep of the technology companies with slightly different accountability mechanisms or audits looking at them," she said.
Further, Denham wants the Centre for Data Ethics and Innovation, the government advisory body on artificial intelligence ethics, to take on a prominent role in horizon scanning and to connect more closely with the regulators that oversee digital companies.
The understanding is that by joining up their work, the regulators would be able to share expert resources, as long as there are clear red lines to prevent them from competing with one another.
It remains to be seen how quickly and effectively regulators will overcome the challenges of closing the gap in regulating Big Tech. Further formalizing relationships, and setting up an oversight board, will require extensive discussions with the government.
Strengthening resources is another challenge in itself. Ofcom said this week that it would need to recruit staff such as data analysts and build expertise in artificial intelligence and algorithms. The aim is to deepen its understanding of platforms' operational incentives and business models, so that its interventions are effective.
Government position
Under the government's plan, companies would need to comply with a new "statutory duty of care," which they would need to meet by taking "reasonable and proportionate action to tackle online harms on their services." It would impose greater obligations on companies to ensure child-abuse or terrorist material isn't disseminated.
The regulator will issue codes of practice outlining the steps that businesses should take, for example, minimizing the spread of harmful fake news with dedicated fact-checkers.
The government has promised to reveal more details on Ofcom's enforcement powers this spring, before it puts forward a fully fledged bill later in the year.
Last April, when the government first published its policy paper on online harms, it said that enforcement powers would include hefty fines, blocking access to websites, and potentially imposing liability on senior management members.
The latest government view is that the regime would affect fewer than 5 percent of UK businesses: those that enable the sharing of user-generated content, for example through comments, forums or video sharing.
Big Tech
Big Tech companies will be keen to start building a relationship with Ofcom, which has not previously regulated their services.
They have already called for clarity on what's required of them in the upcoming rules for tackling online harms. A Twitter executive this week underlined the importance of establishing such clarity before thinking about the kind of enforcement action Ofcom might need to use.
Both Facebook and Twitter have said that regulating fake news raises complex issues. Companies are also stressing that regulators should not impose overlapping requirements.
Facebook this week warned against including a “shopping list” of harms in the bill, arguing that flexibility is needed to address emerging harmful content such as misinformation related to public health. Companies will also be eager for government direction to Ofcom on balancing freedom of expression, safety and privacy.
Regulators have declared their interest in plugging the regulatory gaps when it comes to online harms. Their appetite to collectively regulate the digital space will only increase, and online platforms should be aware that the pressure is truly on.