
YouTube makes “hundreds of changes” to reduce extreme content recommendations


YouTube is altering its video recommendation algorithm to prevent promotion of conspiracies and false information, according to a post on its official blog. “We’ve made hundreds of changes to improve the quality of recommendations for users on YouTube,” it states, noting that these updates will help in “reducing recommendations of borderline content and content that could misinform users in harmful ways.”

This is a positive development for news publishers: if the changes work as described, genuine news stories will be less likely to be overshadowed by sensationalist videos that spread misinformation.

“A rabbit hole of extremism”

Over the years, YouTube has faced criticism for allowing conspiracy theories and other misinformation to spread across its platform. After the mass shooting in Parkland, Florida, last year, the top trending video on YouTube wasn’t a news clip about the tragedy but one suggesting that survivor David Hogg was a “crisis actor”.

The video drew 200,000 views before YouTube removed it. Nor is this an isolated case: whenever something significantly newsworthy happens, conspiracy theorists and fake-news mongers rush to upload their takes to YouTube, crowding out genuine content.

The problem, as multiple observers have pointed out, lies in YouTube’s recommendation algorithm, which favors sensationalist content to keep users engaged.

The Wall Street Journal investigated YouTube’s recommendations with Guillaume Chaslot, a former Google engineer who had worked on the recommender algorithm during his time at YouTube. The investigation found that YouTube would recommend far-right or far-left videos even to users who watched mainstream news sources, and that this tilt towards extreme content appeared across a wide variety of topics. Someone searching for information on the flu vaccine, for example, would be directed to anti-vaccination conspiracy videos.

“The algorithm doesn’t seek out extreme videos but looks for clips that data show are already drawing high traffic and keeping people on the site. Those videos often tend to be sensationalist and on the extreme fringe,” engineers from YouTube told the Journal.
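To make that dynamic concrete, here is a toy sketch of an engagement-driven ranker. It is emphatically not YouTube’s system; the video fields, weights, and example titles are all hypothetical, invented only to show how a score built purely on traffic and watch time can surface a sensationalist clip over sober reporting.

```python
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    click_through_rate: float  # fraction of impressions clicked (hypothetical field)
    avg_watch_minutes: float   # average time viewers stay on the clip (hypothetical field)

def engagement_score(v: Video) -> float:
    # Rank purely on "draws high traffic and keeps people on the site",
    # with no term for accuracy or authoritativeness.
    return v.click_through_rate * v.avg_watch_minutes

videos = [
    Video("Flu vaccine: what health agencies recommend", 0.04, 3.2),
    Video("SHOCKING vaccine secret THEY don't want you to see", 0.09, 7.5),
]

# The sensationalist clip ranks first, even though nothing in the
# score measures whether its claims are true.
for v in sorted(videos, key=engagement_score, reverse=True):
    print(f"{engagement_score(v):.3f}  {v.title}")
```

Under this kind of objective, demoting misinformation requires adding a signal the score otherwise lacks, which is what the changes described below attempt.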

Prof. Zeynep Tufekci, whose areas of expertise include the social impacts of technology, privacy and surveillance, research methods, and complex systems, also found herself repeatedly presented with recommendations far more extreme than the content she was actually looking at. While researching Donald Trump for an article, for example, she was steered towards videos on far-right conspiracy theories.

She wrote in an op-ed for The New York Times last year that this happened because YouTube’s algorithms were designed to keep users engaged by feeding a “natural human desire: to look ‘behind the curtain,’ to dig deeper into something that engages us. As we click and click, we are carried along by the exciting sensation of uncovering more secrets and deeper truths. YouTube leads viewers down a rabbit hole of extremism, while Google racks up the ad sales.”

Google’s mea culpa, and changes to empower news media

However, Google has been active over the past year, driving initiatives in support of the news media. Early last year, it announced the Google News Initiative, pledging $300 million to help the industry.

A YouTube-specific funding announcement of $25 million followed a few months later, to “support the future of news in online video” through expertise, innovation funding, and support. The announcement also made a point of saying that YouTube intended to make “authoritative (news) sources” readily accessible.

This latest development is yet another step towards making YouTube friendlier to publishers. The post clarifies that the changes affect only which videos are recommended, not whether a video remains available; availability continues to be determined by compliance with the Community Guidelines.

This lets YouTube strike a balance between maintaining a platform for free speech and living up to its responsibility to users. That balance matters to publishers too: none of them would want a genuine video removed outright just because YouTube’s algorithms concluded it carried borderline content.

To generate recommendations, YouTube will use a combination of machine learning and human evaluators, whose judgments help train the systems. The changes will be rolled out gradually, beginning with recommendations for a small set of videos in the United States, and will expand to other countries as the systems become more accurate.
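As a rough illustration of that human-in-the-loop pattern, the sketch below trains a tiny classifier on evaluator-supplied labels and uses its output to demote, rather than remove, likely borderline videos in ranking. Everything here is an assumption for illustration: the example titles, labels, model choice, and demotion formula are hypothetical, not details YouTube has published.

```python
# Minimal human-in-the-loop sketch: evaluator labels train a classifier
# whose score down-ranks borderline videos without removing them.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Labels supplied by (hypothetical) human evaluators:
# 1 = borderline content, 0 = fine.
titles = [
    "Local election results, explained",
    "Miracle cure doctors don't want you to know about",
    "How the flu vaccine is made",
    "The earth is flat and here is the proof",
]
labels = [0, 1, 0, 1]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(titles, labels)

def demoted_score(engagement: float, title: str) -> float:
    """Down-weight a video's engagement score by its predicted
    probability of being borderline; the video itself stays available."""
    p_borderline = model.predict_proba([title])[0][1]
    return engagement * (1.0 - p_borderline)

print(demoted_score(0.9, "Doctors don't want you to know this cure"))
```

The design point worth noting is in the final function: the classifier adjusts ranking rather than gating availability, mirroring the post’s distinction between recommendations and Community Guidelines enforcement.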

“It’s just another step in an ongoing process, but it reflects our commitment and sense of responsibility to improve the recommendations experience on YouTube.”

The YouTube Team

A positive trend for publishers

A Knight Foundation and Gallup poll from September 2018 found that a majority of respondents had lost trust in the media in recent years, citing inaccuracy, bias, “fake news,” and “alternative facts.” But things have already started looking up: the 19th annual Edelman Trust Barometer (2019) reports that trust in traditional media is increasing.

Google’s initiatives to support the news media, along with YouTube updates that downgrade fake news and conspiracy theories, should reinforce this positive trend for publishers in the year ahead.