YouTube’s algorithm serves problematic videos, study shows

The latest study from the software nonprofit Mozilla Foundation showed that YouTube’s algorithm can recommend videos featuring sexualized content and false claims rather than content matching viewers’ personal interests. More than 37,000 volunteers participated in the study by using a browser extension to track their YouTube usage.

The records collected by the browser extension revealed that videos flagged as problematic still surface on the platform, either through YouTube’s recommendations or on their own. The study labeled problematic videos “YouTube Regrets,” which included videos “championing pseudo-science, promoting 9/11 conspiracies, showcasing mistreated animals, [and] encouraging white supremacy.” According to the study, these YouTube Regrets end up among recommended videos because of their higher chances of going viral.

It has been reported that YouTube removed around 200 videos flagged through the study. A spokesperson told the Wall Street Journal that the company has reduced recommendations of content it defines as harmful to below 1% of videos viewed, that YouTube has launched 30 changes over the past year to address the issue, and that its automated systems now detect and remove 94% of videos violating YouTube’s policies before they reach 10 views.

YouTube has never publicly explained exactly how its recommendation algorithm works, claiming it is proprietary. Still, its efforts to remove harmful videos from the platform are commendable.