A new report accuses the social media app TikTok of discriminating against content made by disabled, queer, and fat users.
The German technology website netzpolitik.org examined internal documents it obtained that describe the app’s moderation policies and the controversial measures TikTok’s owner, ByteDance, took to combat bullying on the platform.
TikTok is a phone app released by the Beijing-based company ByteDance in 2017. The platform allows users to share short video clips, and this past February the app reached one billion downloads globally.
Documents examined by netzpolitik.org have a section entitled “Imagery depicting a subject highly vulnerable to cyberbullying” that lays out how to handle content from people “susceptible to harassment or cyberbullying based on their physical or mental condition.”
The policy did not result in videos being deleted, but it ensured that their reach was limited on the platform.
The memo says that cyberbullying has negative psychological consequences and, to prevent it, instructed moderators to limit the reach of such videos to their country of origin.
This applied even to videos that merely appeared to include a disabled person. Moderators were told to assess a user’s “physical or mental condition” and were given examples that included facial disfigurement, autism, Down Syndrome, and “disabled people or people with some facial problems such as birthmark, slight squint, and etc. if it showing with positive energy and value [sic].”
An anonymous source at TikTok who talked to netzpolitik.org said that moderators found the rule confusing and hard to implement.
Videos from vulnerable users would appear on a moderator’s screen once they reached 6,000 to 10,000 views. Those videos would be labeled “not recommend,” which meant they wouldn’t show up in the algorithmically generated feed of recommended videos known as the “For You” page.
The For You page is where aspiring artists and performers want to land in order to gain publicity on the platform, and many commenters leave the tags #foryou or #fyp on videos in hopes of getting them onto the page.
Moderators also maintained a list of accounts entitled “special users,” whose videos were automatically tagged “not recommend.” Many of the accounts had rainbow flags or LGBTQ terms in their description.
The source netzpolitik.org spoke to said that Western moderators complained about the rules, but that their concerns were ignored by the platform’s corporate directors in Beijing.
A spokesperson for ByteDance confirmed that the rules existed but said they were only a starting point and were never meant to be permanent.
“This approach was never intended to be a long-term solution and although we had a good intention, we realized that it was not the right approach,” the spokesperson said.
The spokesperson said the rules in the memo have since been replaced by more nuanced ones, but did not describe them.
Disability activists have called the policy exclusionary.
“The regulation listed here transforms this behavior into new digital platforms in which the visibility of disabled people is deliberately reduced out of misunderstood and unnecessary care,” said Constantin Grosch of the organization AbilityWatch.
He said that hiding videos from people because of their characteristics and rendering them invisible is itself a form of cyberbullying.
Annika, a 21-year-old kindergarten teacher who appeared on the “special users” list, called the policy “inhuman.”
A self-described fat woman, Annika said that after one of her dance videos blew up, she started getting messages like “Kill yourself, nobody in the world wants you.”
“But I’m the kind of person who doesn’t care,” she said. Limiting the reach of her videos to protect her is not only unnecessary, she said, but may do more harm than good.
“People follow me now who see me as a role model. That made me even stronger.”