The Piggyback Trick

How YouTube Algorithms Promote
Conspirology and Propaganda.

Published: November 28, 2023.

By: Roman Kulchynsky, Halyna Pastukh, Yuliia Dukach, Serhii Mikhalkov,
Vlad Herasymenko, Nadia Kelm, Yevheniia Drozdova.

YouTube has long enjoyed a reputation as a convenient and reliable service. Two out of three Ukrainians come to it for entertainment and educational content, but they also use it to watch the news and follow the twists and turns of Ukraine's political life.

The recommendation algorithm is the “heart” of YouTube. In a nutshell, the better the recommendations the platform offers, the more time its users spend watching videos and the more money they generate by watching adverts.

According to YouTube's Chief Product Officer, for 70 percent of the time you spend watching, you are riding a chain of AI-driven recommendations, which demonstrates the sheer pervasiveness of the recommendation algorithm. This is why the algorithm is constantly being improved and refined to make the content offered by the platform as personalized as possible.

Despite this apparent effectiveness, the platform still has plenty of room to improve when it comes to social responsibility. So far, it seems that through its recommendations YouTube may promote content that is threatening and dangerous both for the individual viewer and for Ukraine's national security.

Many factors define which video the platform will recommend. The most important of them are discussed throughout this article: the user's viewing history, the language of the content, the video's category and metadata, and the viewing habits of similar users.

However, the exact weight of each of these factors remains unknown. The same is true of the algorithm used by the platform to identify similar videos. The information regarding the algorithms is a tightly guarded commercial secret.

Therefore, we decided to look into the platform's algorithms based on the recommendations served, from mid- to late summer of 2023, to 205 Ukrainians who agreed to install our proprietary application and share information about all their views on the platform.

Methodology

Our aim was to identify potential problems in the platform's recommendation system. In particular, we wanted to know if all the recommended content is safe and whether YouTube promotes propaganda and misinformation. We were also interested to see to what extent the platform tends to put the user into a filter bubble.
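
To make the data behind this study more tangible, here is a minimal sketch in Python of the kind of record our application could log for every view. The field names and structure are our own illustration, not the actual schema of the application.

```python
from dataclasses import dataclass, field
from datetime import datetime
from typing import List

@dataclass
class RecommendedVideo:
    """One video YouTube suggested next to the video being watched."""
    video_id: str
    title: str
    channel: str
    category: str   # e.g. "News & Politics", "Education"
    language: str   # e.g. "uk", "ru"

@dataclass
class ViewEvent:
    """One view logged by the study application (illustrative schema only)."""
    user_alias: str                   # anonymized participant id, e.g. "vatnik"
    watched_at: datetime
    video_id: str
    title: str
    channel: str
    category: str
    language: str
    arrived_via_recommendation: bool  # did the user click a recommendation?
    recommendations: List[RecommendedVideo] = field(default_factory=list)
```

Each participant's history is then simply a list of such events, and the observations below are drawn from data of this kind.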

Let's take a look at what is wrong with YouTube algorithms using real-life examples and real user experience.

Is it true that all recommendations sooner or later lead to Russian propaganda?

Vatnik

Our first user preferred markedly pro-Russian content. The Ukrainian slang term for such a person is “vatnik”, which is what we nicknamed him for the purposes of this study.

His viewing experience demonstrates how Russian disinformation finds its way into Ukrainian YouTube.

Examples of recommendations for "Vatnik"

Most of the videos watched by this user were in Russian. During the study period, he watched 153 videos in Russian and only 15 in Ukrainian.

YouTube is sensitive to the user's preferred content language. The language of the videos from the recommendation list directly depends on your preferred language: Ukrainian or Russian.

The dependence is almost linear. The more you watch in Russian, the more Russian-language recommendations you get. The opposite also holds true.

Although our sample did not include anyone who avoided Russian-language content completely, it was still sufficient to conclude that watching fewer videos in Russian is one of the most effective ways to get rid of Russian-language recommendations in your personalized list if you find them annoying (alternatively, you can install a special application to block them). However, how quickly the platform adapts to a change of language remains unclear.
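
To illustrate what “almost linear” means here, below is a toy Python sketch that fits a straight line between the share of Russian-language videos in a user's history and the share of Russian-language videos among their recommendations. The numbers and the helper function are invented for illustration; they are not our measurements or our analysis code.

```python
# Toy illustration of the near-linear relationship we observed:
# the higher the share of Russian-language videos in a user's history,
# the higher the share of Russian-language videos among their recommendations.

def russian_share(items, lang_key="language"):
    """How each share below would be computed from a view or recommendation log."""
    if not items:
        return 0.0
    return sum(1 for it in items if it[lang_key] == "ru") / len(items)

# Hypothetical per-user pairs: (share of ru views, share of ru recommendations).
observations = [(0.9, 0.85), (0.5, 0.55), (0.2, 0.3), (0.05, 0.1)]

# Least-squares fit of y ≈ slope * x + intercept, with no external libraries.
n = len(observations)
mean_x = sum(x for x, _ in observations) / n
mean_y = sum(y for _, y in observations) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in observations) / \
        sum((x - mean_x) ** 2 for x, _ in observations)
intercept = mean_y - slope * mean_x
print(f"recommended_ru_share ≈ {slope:.2f} * watched_ru_share + {intercept:.2f}")
```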

The language of recommendations depends on viewing history

The analysis of our user's views shows how easily potentially destructive and manipulative topics pervade a Ukrainian YouTube feed. Although based on real issues, they are often hyperbolized and wrapped in emotion, which makes for perfect clickbait.

For example, he watched a 3-minute video called “A woman fights off TRSSC officials to protect her home and children!!!” four times in July. Even after watching the video, it is hard to say what exactly happened. However, the title tricks users into thinking they will see a woman actually fighting off TRSSC officials.

The video belongs to the “Vigilant Dnipryanyn” channel. Many of the videos published on the channel contain openly manipulative content that follows the lines of Russian propaganda: for example, videos about allegedly illegal mobilization, or videos showing people challenging or even bullying TRSSC officials and police officers.

We found a number of similar channels, which label their videos with anything from “Education” to “Entertainment” (more on the effects of this below). Such channels strongly and emotionally criticize the mobilization in Ukraine.

The analysis of the user's experience also revealed that YouTube promotes the very people and organizations that the platform has attempted to ban, or whose media impact it has tried to curb.

For example, on 11 August 2023, YouTube deleted two channels belonging to Scott Ritter, an American pseudo-expert with pro-Russian views. This, however, does not prevent the platform from continuing to promote his interviews and highlights reels in its recommendation lists.

Some YouTube channels are dedicated entirely to promoting the opinions and claims of Ritter and several other pro-Russian foreign propagandists, and the platform does nothing to ban them.

The same may be true about other banned personalities. Despite being blocked on YouTube, Russia’s top propagandist Vladimir Solovyov is still present there in the form of highlights reels or materials whose descriptions say nothing about featuring the banned presenter.

YouTube does not always adhere to its own policies by allowing uploads which violate the platform's rules.

This is the gray zone: although such videos are eventually removed by the platform, users still have plenty of time to follow personalized recommendations and watch them.

The platform's algorithm takes no account of Russia's war against Ukraine when offering personalized recommendations to Ukrainians. Amidst the Ukrainian struggle against Russian propaganda, YouTube reserves the right to provide personalized content which is neither “pro-Ukrainian” nor “pro-Russian”. YouTube turns a blind eye to the fact that Russian propaganda is part of Russia's arsenal of weapons used against both Ukraine and the Western values which, among other things, make YouTube itself possible.

YouTube has gone a long way to banish Russian propaganda, and that is a fact. However, the recommendation system still promotes videos which spread it. Therefore, its algorithms have to be trained to recognize pro-Russian content and exclude it from recommendations for Ukraine and other countries.

As far as Ukraine is concerned, YouTube ought to consider the national bans on spreading content coming from individuals who represent a threat to Ukraine’s national security (click here for the official list).

Since YouTube fails to distinguish pro-Russian videos, the users who enjoy watching videos from the pro-Russian presenter Diana Panchenko will be offered “Strana.ua” and “Yasno. Ponyatno” channels, which have been banned in Ukraine, in their “personalized” feed. The users who watch Russian propagandists featured on Oleksandr Valdemar's channel will be recommended other experts from that channel who profess similar anti-Ukrainian principles.

How does YouTube “understand” what the video is about?

YouTube relies on information about the author and everything related to the video, such as its title, description, keywords and the footage itself.

This information is used to classify the video into certain categories such as “News and Politics”, “Games” or “Music”. Naturally, if a user enjoys watching videos from the “Sports” category, they are unlikely to be recommended videos from the “Science and Technology” category.

At the same time, the algorithms that determine a video's category sometimes fail and are far from perfect.
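
How such misclassification can happen is easier to see with a deliberately naive sketch. YouTube's real classifier is far more sophisticated and not public; the keyword lists, the fallback category and the example title below are purely our own illustration.

```python
# A naive metadata-based categorizer, written only to show why such
# classification can misfire.

CATEGORY_KEYWORDS = {
    "News & Politics": {"news", "politics", "president", "war"},
    "Education": {"lesson", "course", "lecture", "tutorial"},
    "Music": {"song", "track", "album", "cover"},
}

def guess_category(title: str, description: str, tags: list[str]) -> str:
    text = f"{title} {description} {' '.join(tags)}".lower()
    scores = {cat: sum(kw in text for kw in kws)
              for cat, kws in CATEGORY_KEYWORDS.items()}
    best = max(scores, key=scores.get)
    # If nothing matches, the video lands in a default bucket -- one way
    # a propaganda clip can end up tagged as something harmless.
    return best if scores[best] > 0 else "People & Blogs"

# Ambiguous metadata: "lecture" points to Education, "war" to News & Politics.
print(guess_category("Relaxing lecture about the war", "", ["history"]))
```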

The Regular

Another user who took part in our study watched YouTube a lot and actively followed its recommendations. For the purposes of our study, we nicknamed him “The Regular”. Every fifth video that he watched was recommended by the platform.

He ranked at the top of the group of users who were not particularly interested in choosing and preferred to follow the recommendations instead.

Examples of recommendations for the "Regular"

Most of the users who took part in our study followed no more than 15% of the recommendations. In other words, at most 15% of the videos recommended by the platform were actually watched. There were also those who never once clicked on a video suggested by the platform during the study period.
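
For clarity, this is roughly how the share of followed recommendations can be computed from view logs. The function and the sample data below are illustrative, not our actual dataset or analysis code.

```python
def recommendation_follow_rate(shown_recommendations, watched_video_ids):
    """Share of recommended videos that the user actually watched later."""
    if not shown_recommendations:
        return 0.0
    watched = sum(1 for vid in shown_recommendations if vid in watched_video_ids)
    return watched / len(shown_recommendations)

shown = ["a", "b", "c", "d", "e", "f", "g", "h", "i", "j"]   # ids recommended
watched = {"a", "d"}                                          # ids later viewed
print(f"{recommendation_follow_rate(shown, watched):.0%}")    # prints 20%
```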

Our user appears to be quite uninhibited. He knows how to find intimate videos which slip past the platform's policing algorithms by pretending to be educational. Interestingly, the platform responded to the sexually explicit title of the watched video with recommendations which were no less explicit.

While watching videos from the YouTube channel of a medical institution, he received recommendations to watch similar content from bloggers, authors of courses on sexual development, massage therapists and fitness trainers with sex references.

One of the recommendations included a self-proclaimed “doctor” who not only treats physical and psychological problems, but also helps to get rid of the “curse of celibacy”. Videos from such bloggers who position themselves as doctors, sexologists and authors of spiritual development courses are also often categorized as “Education” by the platform.

One should always remember that YouTube has never been a source of reliable information on its own. But there is something much worse: the platform can also promote potentially harmful anti-scientific and pseudo-medical content through its recommendation system.

Our participants’ recommendations included content on treating cancer with baking soda, miracle cures for cholesterol, all sorts of diets and fasting plans of unproven efficacy, dubious methods of losing excess weight and direct advertising of medical drugs of uncertain origin. And all those videos were also labeled as “Education” by the platform.

Examples of titles included in the recommendations

YouTube VP of Engineering Cristos Goodrow says the platform is trying to counter such videos. According to Goodrow, such videos represent “borderline cases”: “although some content may border on restricted, it might not be in complete violation of our Community Guidelines. Those include conspiracy theory videos (“the moon landing was faked in a studio”) or other content that spreads misinformation (“orange juice can cure cancer”)”.

He also added that the platform would focus on providing responsible recommendations and take significant steps to ensure that the system does not recommend such content to a wide audience. Our study, which revealed numerous recommendations of “borderline content”, shows that YouTube fails to meet that goal.

We also noted that YouTube sometimes incorrectly categorizes videos whose nature is completely unambiguous.

For example, a video featuring Russian propagandist Yaakov Kedmi was marked as “Film & Animation”, whereas conspiracy theories could be labeled “Science & Technology”.

Why is that a problem?

The video category influences its promotion. That is, if Yaakov Kedmi's video is marked as “Film & Animation”, his propagandist video might pop up in the recommendations of somebody who enjoys watching videos about movies. Once the person watches it, the platform will suggest Kedmi again and will later add other, similar propagandist content.

Which of the recommendations are actually viewed

What type of video is recommended after watching

This is due to changes in the recommendation algorithm. The platform used to promote popular channels and videos; however, beginning in 2017, YouTube started to recommend content based on similarity. The labeling of channels and videos is one of several factors which influence promotion.
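
Since the exact similarity measure is a trade secret, the sketch below only illustrates the general idea of item-to-item similarity: videos that share a category, a language and some tags with what you have just watched get ranked higher. Everything in it, from the Jaccard measure to the sample videos, is an assumption for illustration.

```python
# Toy item-to-item similarity: rank candidate videos by how much of their
# metadata overlaps with the video just watched.

def jaccard(a: set, b: set) -> float:
    return len(a & b) / len(a | b) if (a | b) else 0.0

def video_features(video: dict) -> set:
    return {video["category"], video["language"], *video["tags"]}

candidates = [
    {"id": "v1", "category": "News & Politics", "language": "ru",
     "tags": {"mobilization", "expert"}},
    {"id": "v2", "category": "Music", "language": "uk", "tags": {"cover"}},
]
watched = {"id": "v0", "category": "News & Politics", "language": "ru",
           "tags": {"expert", "geopolitics"}}

ranked = sorted(candidates,
                key=lambda v: jaccard(video_features(v), video_features(watched)),
                reverse=True)
print([v["id"] for v in ranked])   # "v1" is most similar, so it comes first
```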

And it isn’t music. It is your viewing history.

One day, “The Regular” used the platform for a single purpose: to listen to a track on a lesser-known channel of a Kyiv-based blogger. After watching the video, he got 48 recommendations.

YouTube categorized the video as “People & Blogs”. However, only one suggestion from among the platform's recommendations was from the “Music” category. All the others contained political and war-related content.

Here are some examples:

And it isn’t music. While the above-mentioned videos do not fall under the notion of Russian disinformation, they could certainly be classified as “Junk Information” if YouTube had such a category.

This is a compelling example of how the viewing history influences one's recommendations.

Even when a person watches something new, YouTube suggests videos from the channels that the user has watched before.

With the viewing history being the defining factor behind the recommendation system, the platform is programmed to push “proven recommendations”, which sometimes creates a bubble or cycling effect.

This is how the algorithm works: if you watch something new that, judging by your viewing history, is not characteristic of you, the recommendation system will need some time to analyze and adjust to your new preferences.

The previous selection demonstrates that our user often watched various experts and channels with a pro-Ukrainian stance. This factor defined the choice of recommended videos.

This is one of the factors considered by the recommendation algorithm. If a user watches a video featuring a certain person, the platform will recommend significantly more videos with that person thereby reducing the chances of new people entering the recommendations list.

On the one hand, the selection is driven by the user's interests: obviously, if somebody enjoys watching a certain person, it is safe to assume further interest in that person. On the other hand, such an approach contributes to building a filter bubble around the user.
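
A toy re-ranking sketch shows how such a “familiar faces” boost narrows the list over time: candidates featuring a person or channel already present in the viewing history outrank newcomers. The boost values, names and channels below are invented; the real weighting is unknown.

```python
# Hypothetical "familiar faces" re-ranking: people and channels already seen
# in the viewing history get a score boost, so newcomers rarely break in.

HISTORY_PEOPLE = {"Scott Ritter"}      # people seen in past views (example)
HISTORY_CHANNELS = {"Story Chest"}     # channels seen in past views (example)

def familiarity_score(candidate: dict, base_score: float = 1.0) -> float:
    score = base_score
    if candidate["person"] in HISTORY_PEOPLE:
        score *= 2.0    # invented boost for a familiar person
    if candidate["channel"] in HISTORY_CHANNELS:
        score *= 1.5    # invented boost for a familiar channel
    return score

candidates = [
    {"id": "new_face", "person": "Someone New", "channel": "New Channel"},
    {"id": "familiar", "person": "Scott Ritter", "channel": "Story Chest"},
]
for c in sorted(candidates, key=familiarity_score, reverse=True):
    print(c["id"], familiarity_score(c))
# The familiar candidate outranks the newcomer every time.
```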

On another occasion, the user watched educational content on learning English. Out of the 17 recommendations from YouTube, only 4 were relevant. Others included political content based on the viewing history.

On the morning of 27 July, the user decided to watch the historical TV series entitled “THE UNBELIEVABLE STORY OF WORLD’S MOST FAMOUS COURTESAN! Mata Hari”.

This time, YouTube also suggested 17 recommendations. However, just like before, only 4 of them had “Mata Hari” in the description. Others contained political content which was strikingly similar to videos watched earlier.

One of the recommended videos was entitled “Wolf Messing's Prophecies for 2023”, from the “Story Chest” channel. The video claimed that Messing had predicted the Covid-19 pandemic during his lifetime. He was also said to have predicted Russia's war against Ukraine. According to the authors, the mind reader allegedly predicted the collapse of the United States and Russia's triumph. Such statements are suspiciously similar to Russian disinformation, which often relies on all sorts of prophets, magicians and astrologers to lend credence to its claims.

Similar channels and videos make frequent appearances in YouTube's recommendation lists. They are an example of what can be called the platform's harmful business model.

Showing advertisements used to be allowed only for popular channels verified by YouTube. That changed in 2012, when YouTube allowed all content creators to place advertisements offered by the platform in exchange for a share of the advertising profit.

This forces content creators to compete for the attention of the audiences and lure in new users. Authors resort to clickbait headlines and produce a large number of videos of questionable quality.

The platform is bursting with self-proclaimed experts, Tarot card readers, fake doctors and scientists who stick flashy titles on videos which contain nutty or anti-scientific content. They would do whatever it takes to make the user click and watch the video.

More views equals more money earned from advertising which, in turn, means more income for YouTube.

Not Your Personal Choice

It is now evident that what you have watched on the platform defines your recommendations list. Still, this does not necessarily mean that the recommendations and views are all about one's personal choice.

It appears that, when making recommendations, the promotion algorithm is guided not only by your own interests but also by other people's. If you watch pro-Russian content, YouTube will also consider the viewing history of other users who consume such content. As a result, you will be recommended what they liked, not what you liked.

Therefore, if you have a liking for pro-Russian narratives, you will be fed what other pro-Russian users are watching.
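
The general principle of this collaborative part of the algorithm can be sketched in a few lines: find users with a similar history and recommend what they watched but you have not. The data, the video labels and the overlap measure below are illustrative only, not the platform's actual method.

```python
# Toy user-based collaborative filtering: weight other users by how much
# their history overlaps with yours, then recommend their unseen videos.

from collections import Counter

histories = {
    "you":    {"pro_ru_talk_1", "pro_ru_talk_2"},
    "user_a": {"pro_ru_talk_1", "pro_ru_talk_2", "conspiracy_jfk"},
    "user_b": {"pro_ru_talk_2", "world_government"},
    "user_c": {"cat_videos", "cooking"},
}

def recommend_for(user: str, k: int = 2) -> list[str]:
    seen = histories[user]
    votes = Counter()
    for other, hist in histories.items():
        if other == user:
            continue
        weight = len(seen & hist)        # how similar the other user is
        for video in hist - seen:        # videos you have not watched yet
            votes[video] += weight
    return [video for video, _ in votes.most_common(k)]

print(recommend_for("you"))   # ['conspiracy_jfk', 'world_government']
```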

If you, like our Vatnik, watch the video entitled “Why did Shoigu go to North Korea? Konstantin Sivkov” from the YouTube channel “OUR TOMORROW”, the platform will respond to your interest in Russian affairs by suggesting other conspiracy theories. For example, you will be offered deliberations about the world government and the ruling race, ideas such as “the Ukrainian army is under R1b command”, or answers to the eternal question “Who really killed JFK?”.

Examples of conspiracy video titles recommended by YouTube

A similar study was carried out by Engadget: it found that the recommendations provided by the YouTube video platform may lead children to videos of school shootings and other gun-related content.

“The study found that, in addition to promoting content about shooting and guns to all gamer accounts, YouTube did so to a much greater extent for those who followed YouTube recommendations. Such videos contained scenes of mass shootings, graphic demonstrations of the damage a firearm can do to the human body, as well as instructions on how to convert a handgun to full automatic.”

YouTube has paved the way towards promoting conspiracy theories by combining a business model that rewards provocative videos with exposure and advertising money, and an algorithm that guides users down a personalized path designed to keep them glued to their screens.