
Facebook says that it removed 1.5 million videos of the New Zealand mass shooting

1.2 million were ‘blocked at upload’

Illustration by Alex Castro / The Verge

In the first 24 hours after the deadly mass shooting in New Zealand, Facebook says it removed 1.5 million videos of the attack, 1.2 million of which were blocked “at upload.”

The company made the announcement in a tweet, following up on an earlier statement that it had been alerted by authorities and had removed the alleged shooter’s Facebook and Instagram accounts. Facebook spokeswoman Mia Garlick says that the company is also “removing all edited versions of the video that do not show graphic content.”

We’ve reached out to Facebook for additional comment, and will update this post if we hear back.

The terror attack appears to have been designed to go viral: the alleged shooter released a manifesto that referenced numerous individuals, such as YouTuber Felix Kjellberg and Candace Owens, as well as white supremacist conspiracy theories. He also posted a 17-minute video to Facebook, Instagram, Twitter, and YouTube, and the footage spread rapidly even as all of those companies worked to prevent it.

The attack has prompted social media sites to respond: Facebook, Twitter, and YouTube have been working to remove copies of the video, Reddit banned a subreddit called r/watchpeopledie, and Valve began removing tributes to the alleged shooter that had been posted to user profiles.

But Facebook’s removal of more than a million copies (and edited versions) of the video speaks to the enormous challenge it faces in moderating the site. In its drive for rapid growth, its efforts to scale up its ability to monitor and remove offensive, illegal, or disturbing content have lagged, allowing perpetrators to use the platform to spread their message quickly. There have been other high-profile cases in which murders or terror attacks were streamed on the platform. And as Facebook has worked to address the problem, it has relied on third-party contractors, some of whom have been radicalized or traumatized by the very content they are tasked with taking down.

Following the attack, numerous world leaders have called out Facebook for its role in disseminating this type of content. According to Reuters, New Zealand Prime Minister Jacinda Ardern said that she wants to speak with the company about live streaming, while British Labour leader Jeremy Corbyn said that such platforms must act and raised the question of regulation.

Updated March 17th, 2019 11:17AM ET: Updated to clarify that Facebook has removed 1.5 million videos total, with 1.2 million blocked at upload.