YouTube Censorship in the Post-Truth Era

This article was last updated on April 16, 2022


Recent developments from YouTube are particularly pertinent given that we live in a post-truth era.  On January 25, 2019, the following missive was posted on YouTube’s official blog:

[Screenshot of YouTube’s official blog post, January 25, 2019]

Note this paragraph:

“We’ll continue that work this year, including taking a closer look at how we can reduce the spread of content that comes close to—but doesn’t quite cross the line of—violating our Community Guidelines. To that end, we’ll begin reducing recommendations of borderline content and content that could misinform users in harmful ways—such as videos promoting a phony miracle cure for a serious illness, claiming the earth is flat, or making blatantly false claims about historic events like 9/11.” (my bold)

While the author of the posting makes this version of censorship seem rather benign by invoking the “earth is flat” and other conspiracy themes, it is quite clear that Google is taking steps to ensure that the narrative on its YouTube product follows its own internal guidelines, guidelines to which the rest of us are not privy.  According to YouTube, the videos that fall off of its “recommended” list will be “selected” using a combination of human evaluators and machine learning.  With YouTube’s recommendation algorithm generating more than 70 percent of the platform’s video views (roughly 700 million hours of human watch time every day), its recommendations are extremely influential on its users.  According to Guillaume Chaslot, the ex-Google engineer who helped build the artificial intelligence that curates the recommended videos and one of the people behind AlgoTransparency, YouTube’s recommendation system is designed to maximize the amount of time that people spend on the site because longer viewing sessions lead to more ad clicks, and advertising is Google’s almost exclusive source of billions of dollars in annual income.  Here is Guillaume Chaslot’s Twitter thread on the subject:

[Screenshot of Guillaume Chaslot’s Twitter thread]

Here is a quote from an article that he wrote for Medium in March 2017:

“YouTube search and YouTube recommendation algorithm yield surprisingly different results in these examples, despite both algorithms using the same data. This shows that small differences in the algorithms can yield large differences in the results. Search is probably optimized more towards relevance, whereas recommendations might take watch time more into account.

Surprisingly, one can notice that “likes” and “dislikes” on a video have little impact on recommendations. For instance, many videos claiming Michelle Obama was “born a man” have more dislikes than likes, but are still highly recommended by YouTube. YouTube seems to put more weight in maximizing watch time than likes.

Hence, if “the earth is flat” keeps users online longer than “the earth is round”, this theory will be favored by the recommendation algorithm.

Once a conspiracy video is favored by the A.I., it gives an incentive to content creators to upload additional videos corroborating the conspiracy. In turn, those additional videos increase the retention statistics of the conspiracy. Next, the conspiracy gets recommended further.  Eventually, the large amount of videos favoring a conspiracy makes it appear more credible. 

The point here is not to pass judgement on YouTube. They’re not doing this on purpose, it’s an unintended consequence of the algorithm. But every single day, people watch more than one billion hours of YouTube content.  And because YouTube has a large impact on what people watch, it could also have a lot of power in curbing the spread of alternative news, and the first step to finding a solution is to measure it.” (my bolds)
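
To make the watch-time point concrete, here is a purely illustrative sketch in Python.  It is not YouTube’s actual code: the scoring function, the weights and the two sample videos are hypothetical, chosen only to show how a ranking that weights watch time far more heavily than likes and dislikes can favour content that viewers dislike but keep watching:

# Illustrative only: a toy ranking in the spirit of Chaslot's description.
# The weights and the sample videos below are hypothetical, not YouTube data.
def recommendation_score(video, w_watch=1.0, w_likes=0.001):
    """Watch time dominates the score; likes and dislikes barely register."""
    net_likes = video["likes"] - video["dislikes"]
    return w_watch * video["avg_watch_minutes"] + w_likes * net_likes

videos = [
    {"title": "Measured, factual explainer", "avg_watch_minutes": 4,
     "likes": 900, "dislikes": 50},
    {"title": "Sensational conspiracy video", "avg_watch_minutes": 12,
     "likes": 300, "dislikes": 700},
]

# The heavily disliked but longer-watched video comes out on top.
for video in sorted(videos, key=recommendation_score, reverse=True):
    print(f'{video["title"]}: score = {recommendation_score(video):.2f}')

The particular numbers are meaningless; the point is that once the watch-time weight dwarfs the like/dislike weight, a video’s unpopularity in the “thumbs” sense stops mattering to the recommendation, which is exactly the dynamic behind the Michelle Obama example that Chaslot cites.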

Note the use of the term “alternative news”, a rather interesting turn of phrase in this “fake news” era.

Note that this new process will currently only affect a “very small” number of videos in the United States but that Google plans to roll the change out to more countries over time.  Using Google’s own “less than one percent of the content on YouTube” claim and given that, according to Omnicore, there are over 5 billion videos shared on YouTube, one percent still works out to roughly 50 million videos that will remain available to users but will not appear on the recommended list, meaning that they will be far less likely to be viewed, truthful or not.
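
For what it is worth, that estimate is simple back-of-envelope arithmetic; the video count is Omnicore’s figure and the one percent share is Google’s own “less than one percent” taken as an upper bound:

total_videos = 5_000_000_000    # Omnicore: over 5 billion videos shared on YouTube
affected_share = 0.01           # Google: "less than one percent", taken as an upper bound
print(f"{int(total_videos * affected_share):,} videos")    # prints: 50,000,000 videos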

Truth really is in the eye of the beholder and even more so in this post-truth era.  In this case, truth is defined by Google.  Given that over 1.9 billion logged-in users visit YouTube every month and that, every day, people watch more than a billion hours of video, YouTube’s narrative on what counts as “borderline content” and “alternative news” is highly influential.



You can publish this article on your website as long as you provide a link back to this page.
