Social media giant TikTok has been accused of discrimination after it was reported to be excluding videos from users with disabilities from the main newsfeed, according to an investigative report from German digital rights blog Netzpolitik.
According to Netzpolitik's source, as well as documents sighted by the outlet, TikTok's moderation guidelines deemed users with facial disfigurement, autism and Down syndrome as more susceptible to bullying and marked them as "special users". As a result, the platform removed their content from the main newsfeed, preventing them from achieving a higher reach through the platform's algorithm. TikTok described these "special users" as being "susceptible to harassment or cyberbullying based on their physical or mental condition."
While people with disabilities were the primary focus of the moderation, the "special users" tag also extended to people with "facial problems", including a slight squint or a facial birthmark, as well as some users who identified as LGBTQIA+ or "fat".
Human moderators sifting through the content were required to mark any content from users with disabilities as "risk 4", meaning only viewers within the uploader's home country would be able to see the videos. Once a video uploaded by a "special user" had surpassed 6,000 views, it was tagged with "Auto R", meaning it would not appear in the 'For You' page, the feed where those hoping to become TikTok famous aspire to land. The report explains the human moderators had around 30 seconds to view a TikTok and decide whether someone should be deemed a "special user" and have their content removed from the platform's most popular feed.
While Netzpolitik's report indicates the platform still had the moderation rules in place until at least September 2019, TikTok has since confirmed it has removed them in a statement provided to Gizmodo Australia.
"Early on, in response to an increase in bullying on the app, we implemented a blunt and temporary policy. This was never designed to be a long-term solution, but rather a way to help manage a troubling trend," a TikTok spokesperson told Gizmodo Australia.
"While the intention was good, the approach was wrong and we have since changed the earlier policy in favour of more nuanced anti-bullying policies and in-app protections. We continue to grow our teams and capacity and refine and improve our policies, in our ongoing commitment to providing a safe and positive environment for our users."
TikTok would not disclose how many Australians were affected.
It's not the first controversy TikTok has faced in recent months. In November 2019, U.S. officials launched a national security probe into whether the Chinese-owned app was sending U.S. users' data back to the Chinese government. TikTok has strongly denied these accusations, and its parent company, ByteDance, has moved to segregate TikTok from the rest of its operations in order to prove it.
Since then, the platform has also been accused of censorship after a teen's account was suspended when she posted a video about China's treatment of Uyghur Muslims disguised as a make-up tutorial. The account was later reinstated, with TikTok attributing the suspension to "human error".