Facebook is tightening restrictions on its live-streaming feature in the wake of the horrific mass shooting in Christchurch, New Zealand, as world leaders prepare to meet for a summit aimed at curbing online terror.
The social media giant said it is introducing a “one-strike” policy that temporarily blocks users who break certain Facebook rules from using Facebook Live. The company did not specify which offenses the policy will cover or how long rule-breaking users will be suspended.
“Live can be abused and we want to take steps to limit that abuse,” Facebook said in a statement. The company said it plans to extend the restrictions to other areas, including the creation of ads, over the coming weeks, but did not lay out specific plans.
Facebook has come under fire for its role in spreading online terror since 17 minutes of the New Zealand attack, which left 51 people dead, were broadcast on Facebook Live. In the 24 hours after the shooting, the company scrambled to remove 1.5 million videos containing footage of the bloodshed.
New Zealand’s Prime Minister Jacinda Ardern and French President Emmanuel Macron will host a summit starting Wednesday, where they plan to ask tech giants to sign a pledge, called the “Christchurch Call,” committing them to combat extremism on the internet.
Ardern said the new rules are a step in the right direction.
“Facebook’s decision to put limits on live streaming is a good first step to restrict the application being used as a tool for terrorists, and shows the Christchurch Call is being acted on,” Ardern said in an email from her spokesman, according to Reuters.
Facebook also announced a $7.5 million investment in a partnership with three universities – the University of Maryland, Cornell University and the University of California, Berkeley – to research new methods for detecting edited versions of content.
Several countries have taken steps to regulate online content since the attack. Last month, Australia passed legislation setting out fines and other penalties for social media companies that host hate content, and the U.K. has proposed making social media executives personally responsible for harmful content shared on their platforms.
Write to Amy Gunia at amy.gunia@time.com