
U.K. Authorities Propose Making Social Media Executives Personally Responsible for Harmful Content


In a new policy paper released Monday, U.K. authorities say they want to hold social media executives personally liable for harmful content circulated on their platforms, AFP reports.

The newly published Online Harms White Paper sets out guidelines to tackle the spread of violent content, suicide encouragement, disinformation and cyberbullying, as well as requirements for companies to take action against terrorist content, child sexual exploitation and abusive content. The proposals also call for an independent regulator to be set up to enforce the rules.

“Online companies must start taking responsibility for their platforms, and help restore public trust in this technology,” Prime Minister Theresa May said in a statement.

The measures would apply to social media, file hosting sites, chat forums, messaging services and search engines.

Companies that do not comply stand to face tough penalties under the proposed policies. “We are consulting on powers to issue substantial fines, block access to sites and potentially to impose liability on individual members of senior management,” the government said in its statement.


While experts believe this is a step in the right direction, the implementation of the policies may not be straightforward.

“A standardized code of practice would be valuable in promoting cross-platform collaboration, but the proposals raise a number of practical issues that still need to be worked out,” Stuart Macdonald, cyber-terrorism expert and professor of law at Swansea University, tells TIME. “For example, will the regulatory approach differ for smaller tech companies, whose lack of capacity is exploited by terrorist groups?”

Social media regulation has gained renewed focus in recent weeks after 17 minutes of the deadly New Zealand mosque shooting were live-streamed on Facebook before the company removed the footage. Other platforms, including YouTube, Twitter, Reddit and Instagram, scrambled to block and delete copies of the video as versions uploaded by users went viral.

Other countries have also taken steps to regulate online content after the New Zealand attack. Last week, lawmakers in Australia passed new legislation to hold social media companies accountable for what is shared on their platforms.


Write to Amy Gunia at amy.gunia@time.com