Clampdown on web content

Sunday Star-Times | Sunday, 7 April 2019


United Kingdom

Social media bosses could be liable for harmful content, a leaked British plan reveals.

There has been growing concern about the role of the internet in the distribution of material relating to terrorism, child abuse, self-harm and suicide, and ministers have been under pressure to act.

Under plans expected to be published in the UK tomorrow, the government will legislate for a new statutory duty of care, to be policed by an independent regulator, and likely to be funded through a levy on media companies.

The regulator — likely to be the UK's communications regulator Ofcom initially but in the longer term a new body — will have the power to impose substantial fines against companies that breach their duty of care and to hold individual executives personally liable.

The debate has been sharpened in recent months by the case of the British teenager Molly Russell and issues raised by the Christchurch shootings.  Molly's parents said she killed herself partly because of self-harm images on social media.

The scope of the recommendations is broad.  As well as social media platforms such as Facebook and search engines such as Google, they take in online messaging services and file-hosting sites.

Other proposals in the online harm white paper include:

  • Government powers to direct the regulator on specific issues such as terrorist activity or child sexual exploitation.
  • Annual "transparency reports" from social media companies, disclosing the prevalence of harmful content on their platforms and what they are doing to combat it.
  • Cooperation with the police and other enforcement agencies on illegal harms, such as the incitement of violence and the sale of illegal weapons.

An Ofcom survey last year found that 45 per cent of adult internet users had experienced some form of online harm and 21 per cent had taken action to report harmful content.

In a joint foreword, the home secretary, Sajid Javid, and the secretary of state for digital, culture, media and sport, Jeremy Wright, say it is time to move beyond self-regulation and set clear standards, backed up by enforcement powers.

Companies will be asked to comply with a code of practice, setting out what steps they are taking to ensure that they meet the duty of care — including by designing products and platforms to make them safer, and pointing users who have suffered harm towards support.

The code of practice is also likely to include the steps companies will be expected to take to combat disinformation, including by using fact-checking services, particularly during electoral periods, and improving the transparency of political advertising.

Regulated firms will be expected to comply with the code of practice — or explain what other steps they are taking to meet the duty of care.  However, many questions are left to the regulator to determine.

British Prime Minister Theresa May has repeatedly raised the issue of online harm, and her government has gradually shifted its position from favouring voluntary self-regulation to tougher enforcement.

The white paper has repeatedly been delayed.  Whitehall sources said the government had been holding it back for several reasons, including difficulties finding "appropriate legal advice" due to Brexit.

Labour's Tom Watson, the shadow culture secretary, warned that "some major concerns remained", including the fact that the plans could take years to implement.

"They also do nothing to tackle the overriding data monopolies causing this market failure and nothing to protect our democracy from dark digital advertising campaigners and fake news," he added.

The Christchurch shootings have left their mark on the UK debate.  Thousands watched the March 15 [2019] attack as it occurred, and millions more saw the video as it was re-uploaded across the internet.  That highlighted the specific difficulties of regulating live content on the web, where the standard "notice and takedown" practice enshrined in European Union law is tricky to apply.


By the time Facebook was officially notified about the live stream, a suspect had been arrested, and the video had been downloaded to be reshared elsewhere.

The white paper addresses those problems, calling on the regulator to set out specific steps companies must take to keep harmful livestreamed material off the internet, but it does not suggest an answer beyond embracing technology.

Regulation of the internet has gradually become popular even amongst the companies that would be regulated.  In late March Facebook boss Mark Zuckerberg issued a public call for international regulation of the web on four fronts: political advertising, data portability, privacy, and harmful content.

The Guardian