YouTube’s dislike button does next to nothing to change what the algorithm shows you, a new Mozilla study has found. We keep seeing content we don’t want no matter how much we mash that thumbs down, and the same goes for the “Not Interested” and “Don’t recommend this channel” options.
The report, titled Does This Button Work? Investigating YouTube’s Ineffective User Controls, comes after a months-long study of YouTube behavior by the Mozilla Foundation. The foundation enlisted the help of 20,000 volunteer web users through RegretsReporter, a browser extension for Mozilla’s Firefox.
“We learned that people don’t feel YouTube’s user controls are effective tools for managing the content they see,” said Rebecca Ricks, senior researcher at Mozilla. “Our research validates these experiences — the data shows that people don’t actually have much control over the YouTube algorithm.” The algorithm keeps recommending videos people clearly don’t want to see, even after they’ve told the platform their preferences.
Mozilla used a variety of tools in its research. It began with a survey of 2,758 RegretsReporter users, 68% of whom felt the YouTube algorithm completely ignored their interests. Mozilla then backed up the survey results with a larger experiment: the extension overlaid a “Stop Recommending” button onto recommended videos, and clicking it sent one of YouTube’s own feedback signals, such as “Dislike” or “Not interested,” to the algorithm. The extension then tracked which videos were rejected and which were recommended again. A discouraging 67% of rejected videos resurfaced in participants’ recommended feeds.
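To make that measurement concrete, here is a minimal, purely illustrative Python sketch (not Mozilla’s actual RegretsReporter code) of how one might compute the share of rejected videos that later resurface from a log of extension events. The Event class, its field names, and the exact-video-ID matching rule are all assumptions for illustration; the real study also had to judge whether new recommendations were merely similar to rejected ones, which is a harder problem.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Event:
    user_id: str
    video_id: str
    kind: str         # "rejected" (user clicked Stop Recommending) or "recommended"
    timestamp: float  # seconds since epoch

def resurface_rate(events: list[Event]) -> float:
    """Fraction of rejected videos that were recommended again to the same user later."""
    # Record when each (user, video) pair was first rejected.
    first_rejection: dict[tuple[str, str], float] = {}
    for e in events:
        if e.kind == "rejected":
            first_rejection.setdefault((e.user_id, e.video_id), e.timestamp)

    # Count rejected videos that show up in recommendations after the rejection.
    resurfaced: set[tuple[str, str]] = set()
    for e in events:
        key = (e.user_id, e.video_id)
        if e.kind == "recommended" and key in first_rejection and e.timestamp > first_rejection[key]:
            resurfaced.add(key)

    return len(resurfaced) / len(first_rejection) if first_rejection else 0.0

# Example: one rejected video comes back, another never does.
events = [
    Event("u1", "v1", "rejected", 100.0),
    Event("u1", "v1", "recommended", 200.0),  # resurfaced despite rejection
    Event("u1", "v2", "rejected", 150.0),     # never recommended again
]
print(f"{resurface_rate(events):.0%}")  # -> 50%
```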
The algorithm was also incapable of determining the types of videos to stop recommending. For example, people who disliked war videos or told YouTube to stop recommending war videos continued to be fed grisly footage from the war in Ukraine.
The most effective tool was the “Don’t recommend channel” option, which worked 43% of the time, while the least effective was the “Not interested” option, which worked only 11% of the time. Mozilla says YouTube can fix this, and recommends that the platform make its user controls more proactive and let us shape our own experience.
Until then, we can expect to keep getting those horrible recommendations, no matter how many times we smash that dislike button.