
YouTube is tweaking its algorithms to stop recommending conspiracy theories

Washington Post

YouTube said Friday it was retooling its recommendation algorithm, which suggests new videos to users, to prevent the promotion of conspiracy theories and false information, reflecting a growing willingness by the company to quell misinformation on the world’s largest video platform after several public missteps.

In a blog post, YouTube — a division of Alphabet Inc.’s Google — said it was taking a “closer look” at how it could reduce the spread of content that “comes close to — but doesn’t quite cross the line” of violating its rules. YouTube has been criticized for directing users to conspiracies and untruths when they begin watching legitimate news.

The change to the company’s recommendation algorithms is the result of a six-month technical effort. It will be small at first — YouTube said it would apply to less than 1% of the site’s content — and will affect only English-language videos, meaning that much unwanted content will still slip through the cracks.


The company emphasized that none of the videos would be deleted from YouTube. People who search for them or subscribe to channels that post them will still be able to find them.

“We think this change strikes a balance between maintaining a platform for free speech and living up to our responsibility to users,” the blog post said.



YouTube, which has historically given wide latitude to free speech concerns, does not prohibit conspiracy theories or other forms of false information. The company does ban hate speech but defines it somewhat narrowly as speech that promotes violence or hatred of vulnerable groups.

Advocates say those policies don’t go far enough to prevent people from being exposed to misleading information, and that YouTube’s own software often pushes people to the political fringes by feeding them extremist content that they do not seek out.

YouTube’s recommendation feature suggests videos to users based on the videos they previously watched. The algorithm takes into account “watch time” — or the amount of time people spend watching a video — and the number of views as factors in the decision to suggest a piece of content. If a video is viewed many times to the end, the company’s software may recognize it as a high-quality video and automatically start promoting it to others. Since 2016, the company also has incorporated satisfaction, likes, dislikes and other metrics into its recommendation systems.
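For readers curious how such signals might be combined, the following is a minimal, purely illustrative Python sketch of a ranking score built from watch time, views, likes, dislikes and a satisfaction measure. Every weight and field name here is an assumption made for illustration; it is not YouTube’s actual code.

```python
from dataclasses import dataclass
import math

@dataclass
class VideoStats:
    watch_time_hours: float   # total hours users spent watching the video
    views: int                # total view count
    likes: int
    dislikes: int
    satisfaction: float       # hypothetical 0-1 user-survey score

def recommendation_score(v: VideoStats) -> float:
    """Toy ranking score: log-scaled engagement blended with feedback
    and satisfaction signals. Weights are illustrative only."""
    engagement = 0.6 * math.log1p(v.watch_time_hours) + 0.2 * math.log1p(v.views)
    feedback = 0.15 * (v.likes - v.dislikes) / max(1, v.likes + v.dislikes)
    return engagement + feedback + 0.05 * v.satisfaction

# Example: a video with long watch time and positive feedback scores highly.
popular = VideoStats(watch_time_hours=5_000, views=200_000,
                     likes=9_000, dislikes=300, satisfaction=0.8)
print(round(recommendation_score(popular), 3))
```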


But from a mainstream video, the algorithm often takes a sharp turn to suggest extremist ideas. The Washington Post reported in December that YouTube continued to recommend hateful and conspiratorial videos that fueled racist and anti-Semitic sentiment.

More recently, YouTube has developed software to stop conspiracy theories from going viral during breaking news events. In the aftermath of the Parkland school shooting in Florida last February, a conspiracy theory claiming that a teenage survivor of the shooting was a “crisis actor” was the top trending item on YouTube. In the days after the October 2017 massacre in Las Vegas, videos claiming that shooting was a hoax garnered millions of views.

YouTube’s separate search feature has also been called out for promoting conspiracies and false content. This month, for instance, a search for RBG, the initials of Supreme Court Justice Ruth Bader Ginsburg, returned a high number of far-right videos peddling conspiracies — and little authentic content related to the news that she was absent from the court while recovering from surgery.

Six months ago, YouTube began to recruit human evaluators to review content based on a set of guidelines. The company then took the evaluators’ feedback and used it to develop algorithms that generate recommendations.
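The article does not detail how evaluator feedback feeds into the recommendation algorithms. The sketch below shows one hypothetical approach: training a simple classifier on rater labels and using its output to down-rank borderline videos. The features, labels and model choice are all assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical evaluator labels: 1 = "borderline" content, 0 = acceptable.
# Real features might come from a video's metadata and transcript;
# these are random placeholders purely for illustration.
rng = np.random.default_rng(0)
features = rng.random((500, 8))          # stand-in feature vectors
labels = rng.integers(0, 2, size=500)    # stand-in evaluator judgments

borderline_model = LogisticRegression().fit(features, labels)

def adjusted_score(base_score: float, video_features: np.ndarray) -> float:
    """Down-rank a video in proportion to the model's estimated
    probability that evaluators would flag it as borderline."""
    p_borderline = borderline_model.predict_proba(video_features.reshape(1, -1))[0, 1]
    return base_score * (1.0 - p_borderline)
```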

Dwoskin writes for the Washington Post.
