SAN FRANCISCO – Whether it's a video claiming that the Earth is flat or that the moon landing was staged, conspiracy theories are not hard to find on YouTube. But in a significant policy change, YouTube announced Friday that it planned to stop recommending them.
After years of criticism that YouTube steers viewers toward videos that spread misinformation, the company said it is changing its recommendation system. In a blog post, YouTube said it would no longer recommend "borderline content" videos, or those that "misinform users," even when the footage does not violate its community guidelines.
YouTube said that the videos affected by the policy change represent less than 1% of all videos on the platform. But given the billions of videos in YouTube's library, that is still a large number.
YouTube and other powerful technology platforms have come under increasing criticism for failing to control the content published by users.
YouTube's recommendation engine has been criticized for pushing users toward disturbing content, even when they showed little interest in such videos. It has also been accused of deepening the country's political divide by nudging already partisan viewers toward more extreme viewpoints.
The new policy is also the latest example of YouTube taking a more aggressive approach to content that many find objectionable, even when it does not violate the service's community guidelines.
In late 2017, YouTube began placing "controversial religious or supremacist" content in a "limited state," so that the videos cannot be monetized with advertising and features such as comments and likes are disabled. Some of these videos appear behind a brief message warning that they may be inappropriate or offensive.
YouTube provided only three examples of the types of videos it would stop recommending: those promoting a miracle cure for a serious illness, those claiming that the Earth is flat, and those making patently false claims about historical events such as the 9/11 attacks.
The company declined to provide more details about which other videos would be classified as borderline.
YouTube is not removing the targeted videos, and it will still recommend them to users who subscribe to a channel that produces such content. Nor will YouTube exclude so-called borderline videos from search results.
"We believe this change strikes a balance between maintaining a platform for freedom of expression and living up to our responsibility to users," YouTube wrote in its blog post.
YouTube said it constantly adjusts its recommendation system, noting that it made hundreds of changes last year. In its early years, the company said, it suggested videos likely to generate more clicks or views, but video creators began gaming the system with clickbait titles.
More recently, YouTube has said it wants to recommend videos that viewers will consider "time well spent." It also said it is working to broaden its recommendations so they are not too similar to the video just watched.
Much like the powerful, opaque algorithms that govern search results at YouTube's parent company, Google, the video service does not disclose the factors its systems weigh in deciding which videos to recommend.
YouTube has not revealed much about how it will determine which videos to exclude from its recommendations. Decisions about specific videos will be made not by YouTube employees but by machine-learning algorithms.
Human evaluators from "all over the United States," the company said, will watch various YouTube videos and provide feedback on their quality. Those judgments will help inform what the algorithm recommends.
Google has taken a similar approach to determining the quality of its search results.
YouTube said it would begin by gradually applying the change to a small set of videos in the United States, and it plans to introduce the change globally as its systems become more accurate.