YouTube has grown to become the home for videos on the internet.
Whether it be tutorials on how to go about doing something, content that keeps us entertained, or if we'd like to dig deeper into a specific topic, there's probably a video available for it on YouTube. The possibilities are endless.
If you’re on YouTube regularly, you’re probably aware of how bizarre its recommended videos can be.
It often delivers good content that overlaps with our interests and helps us discover new creators, but occasionally, the recommendations veer into genuinely disturbing territory.
Like Facebook, YouTube makes most of its money from advertisements: more ads mean more revenue.
So, at the end of the day, they want you to spend as much time as possible on the platform. And to keep people engaged, YouTube relies on a powerful AI-driven recommendation system. According to Neal Mohan, the company’s Chief Product Officer, over 70% of the time people spend on YouTube is spent watching recommended videos.
While what's under the hood of the YouTube recommendation algorithm is largely unknown to the public, there’s information available regarding how it operates on a high level.
If you'd like to learn more about how the algorithm works (from a technical standpoint), I'd suggest you check out this article.
Now, let's take a closer look at the problems with YouTube's recommendation system.
Engaging content ≠ what users want
Guillaume Chaslot, a former software engineer at Google, used to work on improving YouTube's recommendation algorithm. While working on this project, he witnessed the dark side of the algorithm first-hand.
He now heads a non-profit called AlgoTransparency, with a mission to expose the impact of the most influential algorithms.
He points out that the motivation behind YouTube's recommendation algorithm is deeply flawed: it isn't really aligned with what the user wants.
According to Chaslot, “The AI isn't built to help you get what you want – it's built to get you addicted to YouTube. Recommendations were designed to waste your time.”
YouTube achieves this by optimizing its algorithm for expected watch time. The algorithm tracks and measures an individual's viewing habits, pools similar users together, and then finds and recommends other videos they're likely to engage with.
But since the AI is told to optimize for watch time, its recommendations don't necessarily reflect what the user wants.
Optimizing for watch time may be great for a company trying to generate more ad revenue, and it's why the AI favours longer videos: the longer the video, the more ads YouTube can push to the user in between.
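To make that incentive concrete, here's a minimal, hypothetical sketch (with made-up titles and numbers; nothing here is YouTube's actual code or data) of how ranking candidates purely by predicted watch time can differ from ranking them by how satisfied the user is likely to be:

```python
# Hypothetical candidate videos: (title, predicted watch minutes, satisfaction 0-1).
# All values are invented for illustration only.
videos = [
    ("10-min cooking tutorial",          8.0, 0.90),
    ("Calm explainer you searched for",  6.0, 0.95),
    ("2-hour conspiracy deep-dive",     45.0, 0.30),
    ("Outrage compilation",             30.0, 0.40),
]

def rank_by_watch_time(candidates):
    """Rank purely by predicted watch time, ignoring satisfaction."""
    return sorted(candidates, key=lambda v: v[1], reverse=True)

def rank_by_satisfaction(candidates):
    """Alternative objective: rank by predicted user satisfaction."""
    return sorted(candidates, key=lambda v: v[2], reverse=True)

print(rank_by_watch_time(videos)[0][0])    # the conspiracy deep-dive wins
print(rank_by_satisfaction(videos)[0][0])  # the explainer the user wanted wins
```

Under the watch-time objective, the sensational two-hour video tops the list even though the user would be far happier with the short explainer; that gap between the two objectives is the misalignment Chaslot describes.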
Because of this preference for watch time, on one side we have cat videos, baby videos, and the like climbing up the rankings. But at the same time, we also see plenty of shocking and controversial videos (conspiracy theories, hate speech, fake news, and so on) spreading on the platform.
Controversial content grabs a user's attention and gets longer watch times, hence gaining priority in the algorithm.
With people relying more and more on YouTube as a primary source of information, recommendations can push them toward extremes, whether they want it or not, simply because it's in YouTube's interest to keep serving whatever content it deems engaging so that we keep watching for as long as possible.
Controversial content on YouTube
YouTube's recommendation system has been repeatedly accused of being designed to send people down rabbit holes of disinformation and extremism.
The platform has come under scrutiny for promoting terrorist content, foreign state-sponsored propaganda, extreme hatred and a number of conspiracy theories.
Mozilla started a campaign called #YouTubeRegrets, gathering stories from the public about videos that skewed their recommendations and led them down bizarre or dangerous paths.
After watching a video about Vikings, one user was recommended content about white supremacy. Another user started out watching a boxing match; YouTube then went on to recommend videos of street fights, then accidents and urban violence.
You can read the most alarming stories Mozilla compiled about people who went down rabbit holes they never meant to enter. It's disturbing and eye-opening in equal measure.
Mozilla has since sought help from YouTube users to research the algorithm further.
They developed a browser add-on called RegretsReporter that allows users to report controversial recommendations.
Mozilla pointed out that the harm enabled by algorithm-driven content is not easily seen or understood, because it all happens in private. Insights from the RegretsReporter extension help Mozilla hold YouTube accountable for the AI powering its recommendation system.
YouTube continues to leave independent researchers in the dark by not providing meaningful data that would let them study the issue and help improve the recommendation system. Moreover, users have limited options to control the recommendations they receive on the platform.
On top of that, as a user it's quite difficult to simply scroll past clickbaity, borderline content on YouTube. And by engaging with such content, we send the algorithm a positive signal, and the content gets boosted further.
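As a toy illustration of that feedback loop (hypothetical scoring, not YouTube's actual system), each engagement can be thought of as multiplying a video's recommendation score, so content we can't resist clicking keeps resurfacing:

```python
def update_score(score, engaged, boost=1.2, decay=0.9):
    """Boost a video's score when the user engages with it; decay it otherwise.
    The boost/decay factors are invented purely for illustration."""
    return score * (boost if engaged else decay)

score = 1.0
for _ in range(5):  # the user clicks the clickbait five times in a row
    score = update_score(score, engaged=True)

print(round(score, 2))  # ~2.49: the content now ranks roughly 2.5x higher
```

Each click compounds the previous ones, which is why even a handful of curiosity clicks can noticeably skew what you're shown next.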
The way in which YouTube is capable of narrowing a user’s content exposure and ultimately shifting their worldview is frightening and something that we should all be aware of while navigating the platform.
One thing you can do to improve your privacy on YouTube is to pause your watch history. Your recommendations will then be based less on your watch history and more on the channels you're subscribed to.
There’s no reason to let a company profit from selling our attention by exploiting the basic human desire to dig deeper into whatever engages us.
And it comes down to each one of us to be mindful and navigate YouTube responsibly.
* * *
Thanks for making it to the end. If you found this useful, please consider sharing it with your friends. I spent a good few hours researching the topic and writing this article.
Thanks to Gopika for reviewing the draft of this article.