TikTok, a company that’s no stranger to content moderation concerns, set off some major red flags this week when it erroneously blocked popular creators from including phrases like “Black Lives Matter” in their bios. At the time, TikTok claimed the phrases were “inappropriate content” — an issue the company promises it has since fixed.
Comedian Ziggi Tyler first noticed the issue when he tried to update his bio page in TikTok’s Creator Marketplace — a beta feature that the platform debuted last year to connect popular creators with branded sponsors. Tyler posted several clips on his account showing that every time he tried to tweak his bio to include phrases like “Black Lives Matter,” “Pro-Black,” or “Black success,” TikTok would throw up a message that his bio contained “inappropriate content” and couldn’t be updated.
Later in the same video, Tyler replaced the word “Black” with “white” in each of those phrases — showing how TikTok didn’t flag his bio when it included terms like “supporting white supremacy,” or “pro-white.”
“White people can get on here and call me the n-word and make videos about violent extremism but I can’t do anything,” Tyler said. “We can’t do anything. We’re tired.”
Tyler told Forbes in a recent interview that as a Black creator, he was hoping to highlight his racial background in the Creator Marketplace specifically, in the hopes that advertisers looking to focus on racial justice in their campaigns — or even diversify their ads in general — would use his talents. That’s a struggle when you can’t even include the phrase “Black Lives Matter” in your bio, he said.
TikTok didn’t deny the issue, telling Gizmodo it could be chalked up to some kinks the company’s ironing out in the still-in-development Marketplace. While TikTok promised that it’s since corrected the AI issues that caused these phrases to be flagged, it’s also put a temporary hold that keeps any creators from changing their bios at all.
“Our TikTok Creator Marketplace protections, which flag phrases typically associated with hate speech, were erroneously set to flag phrases without respect to word order,” a TikTok spokesperson told Gizmodo. “We recognise and apologise for how frustrating this was to experience, and our team has fixed this significant error. To be clear, Black Lives Matter does not violate our policies.”
Per TikTok, its algorithms flagged Tyler’s bio because he had also included the word “audience” alongside phrases like “Black People Matter.” Specifically, TikTok explained that its AI tools were taught to flag bios that included the word “die,” which is… in the middle of the word “audience.” Jutting up against the aforementioned phrase, the algorithm interpreted the words in Tyler’s bio as “die” and “Black people.”
TikTok also pointed out that its algorithm would have drawn attention to any combination of the words “die” and “black”: “ingredients for blackberry pies,” for example, or “the worst plague of the medieval era was caused by tiny black fleas.” Both of these, were they included in some person’s Creator Marketplace bio, would have been flagged for inappropriate content by TikTok’s bungled AI.
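For the curious, the failure mode TikTok describes is easy to reproduce. The sketch below is purely illustrative — the function names and term list are hypothetical, not TikTok’s actual system — but it shows how a filter that matches banned terms as raw substrings, ignoring word boundaries, flags “die” hiding inside “audience,” while a boundary-aware check does not:

```python
import re

# Hypothetical banned-term list for illustration only.
FLAGGED_SUBSTRINGS = ["die"]

def naive_flag(bio: str) -> bool:
    """Flag a bio if any banned term appears anywhere, even inside a word."""
    text = bio.lower()
    return any(term in text for term in FLAGGED_SUBSTRINGS)

def word_boundary_flag(bio: str) -> bool:
    """Only flag banned terms that appear as standalone words."""
    text = bio.lower()
    return any(re.search(rf"\b{re.escape(term)}\b", text)
               for term in FLAGGED_SUBSTRINGS)

bio = "Black people matter. Grow your audience with me."
naive_flag(bio)          # True  -- "die" is a substring of "au-die-nce"
word_boundary_flag(bio)  # False -- no standalone "die" in the bio
```

The substring version also reproduces TikTok’s “blackberry pies” example, since it pays no attention to where one word ends and the next begins.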
It’s unlikely that creators of colour are going to be quick to forgive the platform, which has a contentious relationship with the Black creators that use it. Some of these figures went on an “indefinite strike” from choreographing dances to popular songs after white users were accused of appropriating moves from the Black community — and sometimes gaining viral fame as a result — without offering credit. During the Black Lives Matter protests that ramped up last year, multiple Black creators alleged that TikTok was suppressing content about George Floyd’s death, which the company later blamed on a “technical glitch.” Hmmm. Sure sounds familiar.