Facebook said on Saturday night that within the first 24 hours of the horrific March 15 attack on Muslims in Christchurch, New Zealand, which killed at least 50 people and wounded dozens of others, it blocked 1.5 million attempts to share a video of the attack that the shooter live-streamed on Facebook. Over 1.2 million of those attempts were blocked at upload.
Editor's Note: The U.S. version of this story names the suspect involved in this act of terrorism. Gizmodo Australia has chosen not to publish it - Tegan.
The social media giant wrote on Twitter that it had taken the additional step of blocking all versions of the video edited to remove the graphic content at the behest of local authorities and “out of respect for the people affected by this tragedy.”
"Out of respect for the people affected by this tragedy and the concerns of local authorities, we're also removing all edited versions of the video that do not show graphic content." — Mia Garlick, Facebook New Zealand
— Facebook Newsroom (@fbnewsroom) March 17, 2019
At least 17 minutes of the attack were live-streamed on Facebook, and the mass murder was accompanied by the online release of a manifesto apparently intended to frame the incident within the context of fringe internet communities like far-right boards on the site 8chan. The suspect referenced internet personalities like YouTube gaming personality Felix “PewDiePie” Kjellberg and conservative pundit Candace Owens in the manifesto, reportedly carried weapons displaying white supremacist symbols, and cited neo-Nazi terrorist Anders Breivik as the inspiration for the attack.
As the Wall Street Journal wrote on Sunday, despite attempts to suppress its spread by New Zealand authorities and many major platforms, the video quickly spread on those same platforms, including Facebook, YouTube, and Twitter. The Washington Post noted that it remained easy to locate versions of the video on those sites as of Friday, and archived versions were just as widespread.
TechCrunch also noted that the 1.2 million blocked-at-upload figure means approximately 300,000 copies made it onto the platform before being removed, and that the numbers shed no light on how long those videos remained on the site or how many views they racked up.
These numbers are vanity metrics unless they’re shown alongside engagement and video views from the live stream. This is bordering on misinformation which gives us no way of understanding the true scale of distribution and amplification. https://t.co/6dgRc7KiVj
— Mark Rickerby (@maetl) March 17, 2019
Additionally, the Journal wrote it is trivial to locate copies being shared on fringe sites, particularly those associated with the far-right internet like image board 8chan:
Those other places include 8chan, an online message board that describes itself as “the darkest reaches of the internet.” Users can post images or discussions anonymously, and anti-Muslim and anti-Semitic material is widespread on the site. On a number of threads created Saturday, unnamed people shared the video and linked to archived versions.
Some users edited and embellished the video, attempting to mimic video games designed from the point of view of gun-wielding characters. One version was edited to include sound effects during the times when the suspect fired his gun and graphics of bullets and other tools when he changed weapons. Another video added fast-paced background music.
Other places the video can be found include 8chan’s more mainstream cousin, 4chan, as well as far-right social network Gab. (Gab itself was banned from its hosting and payment processors last year after one of its users, white supremacist Robert Bowers, killed 11 people at Pittsburgh’s Tree of Life Synagogue after issuing thinly-veiled threats via the site.)
Reddit banned a community called r/watchpeopledie for violating policies that prohibit “content that incites or glorifies violence” in the wake of the shootings, with one source telling the Verge that users on the subreddit were actively encouraging the sharing of both the video and the manifesto. (However, the subreddit had been largely inactive since 2012 and had been “quarantined” behind a landing page for some time, the Verge wrote.) Video game distribution platform Steam also took down over 100 profiles with usernames or profile photos glorifying the Christchurch shooter.
While Facebook has removed a staggering number of the videos, authorities are still angry with the platform over its role in disseminating violent imagery—and increasingly sceptical of it and other major tech companies’ go-to excuse that their scale makes it difficult, if not impossible, for them to react quickly to such events.
“The rapid and wide-scale dissemination of this hateful content—live-streamed on Facebook, uploaded on YouTube and amplified on Reddit—shows how easily the largest platforms can still be misused,” Senator Mark Warner wrote in an email to Gizmodo. “It is ever clearer that YouTube, in particular, has yet to grapple with the role it has played in facilitating radicalization and recruitment.”
Tech companies “have a content-moderation problem that is fundamentally beyond the scale that they know how to deal with,” Stanford University and Data & Society researcher Becca Lewis told the Washington Post. “The financial incentives are in play to keep content first and monetisation first.”
New Zealand Prime Minister Jacinda Ardern has requested a meeting with Facebook over live-streaming, according to Reuters, while British Labour Party leader Jeremy Corbyn stated that “Those that control and own social media platforms should deal with it straight away and stop these things being broadcast.” Last year, European Union legislators mulled imposing fines on platforms that fail to remove extremist content within one hour, foreshadowing the kind of proposal that is likely to follow in the wake of the attack.