For years, queer people on YouTube have accused the video-sharing giant of using LGBTQ keywords like “gay” and “transgender” to demonetize, hide, and generally censor content, even if it was appropriate for all ages.
And some YouTubers have put together a video showing proof.
Three YouTubers who go by Andrew, Sealow, and Een looked into what it would take to get a video demonetized. Andrew, whose channel is “YouTube Analyzed,” tested 15,296 common words from the dictionary. Sealow tested 14,000 words separately.
To test them, they uploaded short videos with no real content, just titles.
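The testing procedure described above can be sketched as a simple loop. Note that `get_monetization_status` below is a hypothetical stand-in for whatever manual or automated steps the testers actually used to read YouTube’s verdict; it is not a real YouTube API call, and the flagged words inside it are illustrative only.

```python
def get_monetization_status(title):
    """Placeholder for YouTube's bot verdict on a freshly uploaded video.

    In the real experiment this was YouTube's own monetization decision;
    here we simulate one so the sketch runs end to end.
    """
    flagged_words = {"gay", "transgender"}  # illustrative, not YouTube's list
    return "demonetized" if title.lower() in flagged_words else "monetized"


def run_word_test(words):
    """Upload one near-empty video per word and record the bot's verdict."""
    results = {}
    for word in words:
        # Each test video had no real content -- only the word as its title.
        results[word] = get_monetization_status(word)
    return results


verdicts = run_word_test(["happy", "gay", "transgender", "table"])
```

Running thousands of single-word uploads this way is what let the testers isolate individual words as the variable, rather than video content.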
YouTube uses a bot to decide whether content is appropriate for advertisers and to demonetize videos it deems unsuitable. YouTubers have noted in the past that adding or removing words like “transgender” from a title changes how the site’s bots treat a video.
What the three found was that there is a list of words that increases the likelihood a video will be demonetized on YouTube, mostly sexual or political terms (YouTube avoids associating advertisers with controversial political content). Other words lead to automatic demonetization.
Andrew said he believes (YouTube is secretive about its algorithm, which was the point of the testing) that the bot assigns each word in a title a value, and that the total determines whether the video is demonetized automatically.
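Andrew’s hypothesis amounts to a weighted-sum classifier over title words. The sketch below illustrates that idea only; the weights and threshold are invented for illustration, since YouTube has never published its actual model.

```python
# Hypothetical per-word weights and cutoff -- NOT YouTube's real values.
WORD_WEIGHTS = {"gay": 0.8, "war": 0.5, "happy": 0.0}
THRESHOLD = 0.7


def title_score(title):
    """Sum the (hypothetical) per-word weights; unknown words score zero."""
    return sum(WORD_WEIGHTS.get(word.lower(), 0.0) for word in title.split())


def is_demonetized(title):
    """Under the hypothesis, demonetization triggers once the total crosses the cutoff."""
    return title_score(title) >= THRESHOLD
```

Under a model like this, swapping a single heavily weighted word such as “gay” for a neutral one such as “happy” can flip the verdict on an otherwise identical title, which matches what the testers reported.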
But many of those words were LGBTQ terms that had nothing to do with explicit content or partisan politics. They said that if they changed the word “gay” in a video’s title to “happy,” the video was consistently declared advertiser-friendly.
“Straight” and “heterosexual,” though, are perfectly fine, according to the YouTubers.
“The exact same videos are monetized without the LGBTQ terminology,” said Sealow in the video. “This is not a matter of LGBTQ personalities being demonetized for something that everyone else would also be demonetized for, such as sex or tragedy. This is LGBTQ terminology like ‘gay’ and ‘lesbian’ being the sole reason a video is demonetized despite the context.”
Earlier this year, several YouTubers sued the platform alleging that it discriminates against LGBTQ people.
DO NOT LET YOUTUBE GET AWAY WITH THIS.
I uploaded my video TWICE to see if the word "transgender" would trigger the algorithm… and every step of the way was fine UNTIL I added the word Transgender. RIGHT away, the video was demonetized.
Literally. RIGHT. AWAY. pic.twitter.com/mvCucFPyZP
— Chase Ross 🐝 (@ChaseRoss) May 30, 2018
YouTube has consistently denied that its algorithm discriminates against LGBTQ content, and says that even if it does, an appeals process is in place to remedy any mistakes.
The decisions made by the human moderators who review those appeals are then used to train the algorithm. The problem is that if those moderators are biased against LGBTQ people, their biases will feed back into the algorithm.
Since many of them may live in countries where homosexuality is illegal (YouTube allegedly outsources the hiring of these moderators), their biases may creep into how the bots operate.