YouTube said Friday that it is retooling the recommendation algorithm that suggests new videos to users, in an effort to keep it from promoting conspiracies and false information. The move reflects a growing willingness to quell misinformation on the world’s largest video platform after several public missteps.
In a blog post it planned to publish Friday, the company said it was taking a “closer look” at how it can reduce the spread of content that “comes close to – but doesn’t quite cross the line” of violating its rules. YouTube has been criticized for steering users toward conspiracies and false content after they begin watching legitimate news videos.
The change to the company’s so-called recommendation algorithms is the result of a six-month technical effort. It will be small at first; YouTube said it would apply to less than 1 percent of the content on the site and would affect only English-language videos, meaning that much unwanted content will still slip through the cracks.
The company stressed that none of the videos would be deleted from YouTube; they would remain available to people who search for them or subscribe to conspiratorial channels.