Recent developments from YouTube are particularly pertinent given that we live in a post-truth era. On January 25, 2019, the following missive was posted on YouTube’s official blog:
Note this paragraph:
While the author of the post makes this version of censorship seem rather benign by invoking the “earth is flat” and other conspiracy themes, it is quite clear that Google is taking steps to ensure that the narrative on its YouTube product follows its own internal guidelines, to which the rest of us are not privy. According to YouTube, the videos that will fall off of its “recommended” list will be selected by a combination of human reviewers and artificial intelligence (machine learning). With YouTube’s recommendation algorithm generating more than 70 percent of YouTube’s video views (700 million human hours) every day, its recommendations are extremely influential on its users. According to Guillaume Chaslot, the ex-Google engineer who helped build the artificial intelligence used to curate the recommended videos and one of the people behind AlgoTransparency, YouTube’s artificial intelligence is designed to increase the amount of time that people spend on the YouTube site, because longer viewing sessions lead to more ad clicks, Google’s almost exclusive source of billions of dollars in annual revenue. Here is Guillaume Chaslot’s Twitter thread regarding the subject:
Here is a quote from an article that he wrote for Medium in March 2017:
“Surprisingly, one can notice that “likes” and “dislikes” on a video have little impact on recommendations. For instance, many videos claiming Michelle Obama was “born a man” have more dislikes than likes, but are still highly recommended by YouTube. YouTube seems to put more weight in maximizing watch time than likes.
Hence, if “the earth is flat” keeps users online longer than “the earth is round”, this theory will be favored by the recommendation algorithm.
Once a conspiracy video is favored by the A.I., it gives an incentive to content creators to upload additional videos corroborating the conspiracy. In turn, those additional videos increase the retention statistics of the conspiracy. Next, the conspiracy gets recommended further. Eventually, the large amount of videos favoring a conspiracy makes it appear more credible.
The point here is not to pass judgement on YouTube. They’re not doing this on purpose, it’s an unintended consequence of the algorithm. But every single day, people watch more than one billion hours of YouTube content. And because YouTube has a large impact on what people watch, it could also have a lot of power in curbing the spread of alternative news, and the first step to finding a solution is to measure it.” (my bolds)
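The watch-time bias Chaslot describes can be sketched in a few lines of Python. This is a hypothetical illustration, not YouTube’s actual system: the function, field names, and numbers below are invented for demonstration, and the real recommender is a proprietary machine-learning model.

```python
# Toy sketch of a watch-time-maximizing recommender, per Chaslot's
# description. All names and figures here are hypothetical.

def rank_by_watch_time(videos):
    """Order candidate videos by average watch time, ignoring likes/dislikes."""
    return sorted(videos, key=lambda v: v["avg_watch_minutes"], reverse=True)

videos = [
    {"title": "earth is round", "avg_watch_minutes": 4.0, "likes": 900, "dislikes": 50},
    {"title": "earth is flat",  "avg_watch_minutes": 9.5, "likes": 80,  "dislikes": 700},
]

ranked = rank_by_watch_time(videos)
# The heavily disliked conspiracy video ranks first because it keeps
# viewers watching longer -- likes and dislikes never enter the sort key.
print([v["title"] for v in ranked])
```

Under this scoring rule, each recommendation of the conspiracy video adds to its watch-time statistics, which in turn raises its rank further, producing the feedback loop Chaslot describes.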
Note the use of the phrase “alternative news”, a rather interesting turn of phrase in this “fake news” era.
Truth really is in the eye of the beholder, even more so in this post-truth era. In this case, truth is defined by Google. Given that over 1.9 billion logged-in users visit YouTube every month and that, every day, people watch more than a billion hours of video, YouTube’s narrative on what is “borderline content” and “alternative news” is highly influential.