According to a study, YouTube’s “dislike” and “not interested” buttons don’t really affect your recommendations

YouTube’s recommendation algorithm is a mystery that baffles users and content creators alike. Against that backdrop, Mozilla conducted a study and found that users’ recommendations changed little when they used the “dislike” and “not interested” buttons to stop YouTube from suggesting similar videos.

According to the study, participants were still shown videos very similar to the ones they had rejected, despite using YouTube’s feedback tools and adjusting their settings. Selecting “not interested” or “dislike” proved largely ineffective at heading off bad recommendations, preventing only 11% and 12% of them, respectively. The most effective options were “don’t recommend channel” and “remove from history,” which reduced unwanted recommendations by 43% and 29%, respectively. Study participants were generally dissatisfied with YouTube’s curation controls.

Mozilla’s study analyzed more than 567 million videos recommended to 22,722 users of its RegretsReporter browser extension. The organization also surveyed 2,757 RegretsReporter users to collect additional data.

Almost 83% of surveyed users reported using YouTube’s built-in feedback tools, adjusting their preferences, or skipping over unwanted content in order to “teach” the algorithm to recommend more relevant videos. Even so, 39.3% of those who tried to improve their recommendations said their efforts were unsuccessful.

“Not a thing has changed,” one survey respondent said. “If I reported something as misleading or spam, it would often be there the next day. It’s as if the more I tell them their recommendations are bad, the bigger the mountain of bullshit becomes. Even if you block certain sources, they eventually return.”

About one-quarter (23%) of respondents who actively tried to change their recommendations described a negative or neutral outcome, citing unwanted videos that reappeared in their feeds or the time and effort it took to improve the recommendations at all.

“They evolved, but unfortunately for the worse,” said another study participant. “I feel like I’m being punished for taking the initiative to change the algorithm’s behavior. In a way, interacting less gives it less information to use for making recommendations.”

Mozilla found that even YouTube’s most effective tools for blocking unwanted recommendations had only a limited effect. Its report concluded that the company “is not really that interested in hearing what its users really want, preferring to rely on opaque methods that drive engagement regardless of the best interests of its users.”

To make YouTube’s recommendation engine easier to understand, the organization suggested that the video-sharing site make its user controls intuitive and give researchers access to detailed data. We have reached out to YouTube for comment on the study and will update this piece accordingly.

A Mozilla study conducted last year found that 71% of the videos users “regretted” watching, including misinformation and spam, had been suggested by the service’s algorithm. A few months after that study was published, YouTube posted a blog defending its recommendation system and its approach to filtering out “low-quality” content.

After years of relying on algorithms to recommend content, social media platforms like TikTok, Twitter and Instagram are now giving users more control over their feeds.

Legislators all over the world are also examining the impact that social networks’ opaque recommendation engines have on their constituents. To address this issue, the European Union passed the Digital Services Act in April, and the United States is currently considering the bipartisan Filter Bubble Transparency Act.
