Pro-Nazi and anti-Asian content on TikTok is reportedly exploiting the platform’s everyday features like duets and catchy pop songs to bypass moderators.
In a study by the London-based Institute for Strategic Dialogue, researcher Ciaran O’Connor and his team examined 1,030 TikTok videos over the last three months. They found that almost half of these videos featured pro-Nazi language, with others re-enacting offensive stereotypes about, or spreading misinformation targeting, nearly every marginalised community.
These videos ranged from transphobic content posted by incels to clips of white men eating bats, a reference to the racist stereotype that spread after the COVID-19 virus was first reported at a wet market in Wuhan. Other videos included re-enactments of the Christchurch shooting and the murder of George Floyd — the latter featuring men in blackface.
Alarmingly, the study found that a sizeable number of these videos were either made or amplified by Australians. Some also included links to chat groups hosting larger extremist conversations.
In “Hatescape: An In-Depth Analysis of Extremism and Hate Speech on TikTok,” we take a look at how TikTok is used to promote hatred and glorify extremism and terrorism. (1/6) https://t.co/O1am5Ui2VC
— Institute for Strategic Dialogue (@ISDglobal) August 24, 2021
While some of these videos have since been removed, up to 80 per cent of them were still live when the study concluded. So just how were some of these videos able to stay up for so long? According to the study, their creators found ways to hide the content within TikTok’s everyday features.
This includes dueting their own clips with unrelated videos, using large-scale hashtags, misspelling keywords in offensive language, adding an extra number to their account name after being banned, and, bizarrely, soundtracking their hate speech with pop songs available on the app. According to O’Connor, these groups most commonly used MGMT’s ‘Little Dark Age’, Kate Bush’s ‘Running Up That Hill’, and Gotye’s ‘Somebody That I Used To Know’.
Poor Kate Bush; she doesn’t deserve this.
“There’s an enforcement gap in TikTok’s approach to hate and extremism… this content is removed, but inconsistently,” O’Connor said.
“TikTok is notoriously quite a difficult platform to study, for one thing, it’s kind of new and there isn’t much of a methodology,” Ariel Bogle, an analyst at the International Cyber Policy Centre (ICPC), added via the ABC.
“It’s also driven by an algorithm, which has remained very opaque.”
A TikTok spokesperson told the ABC that while the platform “categorically prohibits violent extremism and hateful behaviour,” it “greatly value[s]” research that can help “strengthen how we enforce our policies to keep our platform safe and welcoming.”