New evidence that YouTube doesn’t radicalize

The role that YouTube and its behind-the-scenes recommendation algorithm plays in encouraging online radicalization has been suggested by both journalists and academics alike. This study directly quantifies these claims by examining the role that YouTube’s algorithm plays in suggesting radicalized content. After categorizing nearly 800 political channels, we were able to differentiate between political schemas in order to analyze the algorithm traffic flows out and between each group. After conducting a detailed analysis of recommendations received by each channel type, we refute the popular radicalization claims. To the contrary, these data suggest that YouTube’s recommendation algorithm actively discourages viewers from visiting radicalizing or extremist content. Instead, the algorithm is shown to favor mainstream media and cable news content over independent YouTube channels with slant towards left-leaning or politically neutral channels. Our study thus suggests that YouTube’s recommendation algorithm fails to promote inflammatory or radicalized content, as previously claimed by several outlets.

That is from a new paper by Mark Ledwich and Anna Zaitsev.  That hardly settles the matter, but you may recall that the last serious papers on this topic also indicated that YouTube does not radicalize.  So if you still believe that YouTube radicalizes, you will need to come up with additional facts to support your point of view.

Here is a Mark Ledwich tweet storm on the paper.

Comments

Are "mainstream media and cable news content" well known for teaching people facts?

The other link I saw to this story had additional facts. The YouTube algorithm changed significantly in 2018 and again in 2019.

It now recommends more mainstream news.

So I'd say this framing is entirely wrong. YouTube did have recommendation problems, they recognized it, and they've tried to fix it.

If you are looking forward, you have essentially a new product to consider.

Oh, Mark Ledwich mentions "the 2019 algorithm." That's what he's talking about, the new vs the old.

There’s been zero evidence that YouTube’s algorithm pushes people towards extremism in any sense of the word.

This is a meme on the anonymous-left that wants governmental regulation on cat videos.

Use of the singular, "algorithm" only signals illiteracy.

What's the logical difference between a black box and a box full of black boxes?

Your pretentious critique of superficial detail over substance is pure sophistry.

Teaching moment:

If we live in the age of algorithms, we also live in the age of a/b testing. That means we will never have a solid grounding in what "the algorithm" is doing, because it's more than one thing, and changing, constantly pursuing some metric(s).

You certainly can't tell what impact YouTube recommenders will have in 2020, based on what they did in 2016.
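To make the "more than one thing" point concrete, here is a minimal Python sketch, assuming a standard hash-based bucketing scheme; the function and experiment names are hypothetical, not YouTube's actual system. The point is only that concurrent experiments mean different users can be served different ranking logic at the same time, so there is no single "the algorithm" to audit.

```python
import hashlib

def assign_variant(user_id: str, experiment: str, variants=("control", "treatment")) -> str:
    # Hash the (experiment, user) pair so assignment is stable within an experiment
    # but independent across experiments.
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

# Hypothetical concurrent experiments: the same user can land in different arms
# of different tests running at the same time.
for experiment in ("ranker_update", "sidebar_diversity_test"):
    print(experiment, assign_variant("user_123", experiment))
```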

Good point. More generally, it’s difficult or impossible for researchers to see what a particular user would see in terms of recommendations, so at first glance these studies should be taken with a big grain of salt.

Mainstream stuff is the real radicalizing problem, brainwashing people to reject historical norms in favor of extremist views on race, sex, sexuality, religion, and all social issues.

I agree; it's evident to anyone with vaguely non-mainstream opinions and political interests that the YouTube algorithms in 2016 often led you towards (or at least suggested) increasingly polarising and controversial content, especially the "Ben Shapiro DESTROYS leftist student" variety. But this has changed over the past few years, and now I even get 'counter-radicalisation' videos recommended to me after watching fairly benign content.

If Ben Shapiro is considered radical then this whole topic is ridiculous. He's a more liberal version of my grandparents.

It is radical content that radicalizes, not the channel delivering it. YouTube has completely fallen out of favor with Daesh - and of course, Daesh's content is banned anyway.

What does happen is that YouTube's various algorithms play a certain role in generating a self-reinforcing environment. Here is a simple experiment which allows facts to be generated on an empirical basis. Note your recommendations before watching a couple of videos - for example, serious science involving which races require stricter discipline for their own good - then see what YouTube keeps recommending, and compare the difference before and after.

Obviously, if one already believes that certain races require stricter discipline, then the recommendations will simply reinforce such a view. Something distinct from whatever one may consider radicalization.
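For what it's worth, the before/after comparison described above can be written down as a tiny sketch, assuming you record the recommended channels yourself; the helper and channel names below are placeholders, not any real tooling.

```python
def recommendation_shift(before, after):
    """Compare two manually recorded snapshots of sidebar/home-page recommendations."""
    b, a = set(before), set(after)
    return {
        "unchanged": sorted(b & a),
        "dropped": sorted(b - a),
        "new": sorted(a - b),
        "share_new": round(len(a - b) / len(a), 2) if a else 0.0,
    }

# Placeholder data: snapshots taken before and after a viewing session.
print(recommendation_shift(
    before=["ChannelA", "ChannelB", "ChannelC"],
    after=["ChannelA", "ChannelX", "ChannelY"],
))
```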

No kidding. Where is your study?

Chapeau. +1.

It is the fact that radical content is readily available, 24/7. It doesn't really matter if YouTube policies encourage or discourage radicalization or normalization. Radical views are more readily found by people who would have been isolated from them prior to 1994. It's not just the ultra-radicals (be they Daesh or Birchers) who are benefiting - elements of mainstream politics have also taken advantage of the phenomenon to create a radicalized mainstream.

"towards left-leaning or politically neutral channels"

So, it's not politically neutral but (probably deliberately) left-leaning?

Color me surprised!

All depends on what radical means.

I have not read the paper, but could it be that they are looking at the average recommendation rather than the most extreme? Let’s say the recommendation set is 99% non-radical and 1% radical; people will still be exposed to radical options even though the average recommendation is highly mainstream.
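Taking the hypothetical 1% figure above at face value (it is the commenter's illustration, not a number from the paper), a quick back-of-the-envelope calculation shows why a mostly mainstream average can still mean routine exposure:

```python
# Hypothetical: 1% of recommendations are "radical" and a user sees n of them.
p = 0.01
for n in (10, 100, 500):
    expected = n * p                   # expected number of radical recommendations
    at_least_one = 1 - (1 - p) ** n    # chance of seeing at least one
    print(f"n={n:4d}  expected={expected:5.1f}  P(at least one)={at_least_one:.2f}")
```

At 100 recommendations the chance of at least one exposure is already about 63%, even though the average recommendation remains overwhelmingly mainstream.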

Right, it seems to me that the paper is missing the point by emphasizing averages and totals instead of the extremes. E.g. that mass killer in New Zealand was able to livestream his video on social media. The mainstream social media companies tried to quickly eliminate the videos but they still got through, temporarily.

Using the broad numbers in that paper, those videos probably don't even appear as a blip, being so small in number and so brief in their stay on the main social media sites. And the 99% that ChrisA mentions probably never saw them (I didn't even attempt to view them). But the point is not the outreach to the 99%, it's the ability of the 1% to reach each other.

Radicalization is the process of bringing extreme supporters of a cause to take radical actions. It's about outliers. No statistical studies will ever show that.

YouTube doesn't a radical make, but it facilitates the type of narrow-mindedness that Fox News has exploited to enormous influence and profits. I wouldn't describe Fox News as radical; rather, it's stupid. Similarly, binge watching Ben Shapiro won't make one a radical but it will make one stupid - or less smart.

Mea culpa. No, I don't binge watch right-wing YouTube stars, but I do binge watch what interests me. And that's the problem with videos as compared to written text: we watch videos for affirmation, not exploration. I've commented before that I read transcripts of Cowen's interviews (when available) rather than watch them. Try reading a transcript of an interview and then watching it. The experience will enlighten. If one were to read a transcript of "the shows" on Fox, one would appreciate just how insipid they are - "the shows" are personality driven.

Binge reading rayward comments will make the reader stupid.

Remember, remember, the 29th of November.

I will celebrate the 29th of December with my family tonight.

I think good traditions like the 29th of December help to prevent and counteract radicalization.

Good point, Mr. Berg. You gave me a lot to think about. Thanks for sharing.

I would note, and the authors cite this as a major limitation, that this is a very short-term study that used blank slate YouTube accounts. The more video views one accumulates, the more information the algorithm has to tailor the content. While the platform may not start out offering more extremist video prompts, it most likely does reinforce certain categories of videos once a user has demonstrated a somewhat persistent interest in a topic. Is that good? If I like watching videos about some obscure, harmless hobby, sure. If I'm a teenager going through a rough time and watched a few white supremacist videos out of curiosity, not so much.

That liberast hater is back.

The one clueless enough to imagine that people here might react to someone posting about white supremacy.

Are you sure those whitey-tightys are properly fitting a far-right white like yourself today?

The events that took place between 1933 and 1945 in Germany would seem to prove that "outlier" is a matter of definition. It is evident that those who participated in the Holocaust were not outliers as events unfolded, though arguably those who believed it was proper to exterminate an entire group of their former fellow citizens, and everyone belonging to that group in the lands Germany conquered, remained a minority.

The best source of internet research and critique is the Berkman Klein Center for Internet & Society at Harvard: https://cyber.harvard.edu They do some very interesting research, and I look forward to reading the article posted above. They did the early research on disinformation campaigns.

Also, YouTube is not the internet.

For some people, the Internet is pretty much exclusively Google and Facebook and Amazon.

For the lazy:

Computer scientist claims it’s unquantifiable and unmodelable because content creators are an adversarial system.

Basically a computer scientist says it’s not possible to measure, therefore it must exist. Descartes for the liberal academic in 2019.

All of this in tweets of 200 characters each. And of course within 2 tweets he goes full ad hominem fallacy.

Isn’t this the same argument that guns don’t kill people?

The question of whether YouTube facilitates "radicalization" seems a bit irrelevant as I suspect the chief complaint of the people who worry about such things is that the radicalizing videos exist at all.

You don't care about Daesh and other Islamist ideologies?

I do, because they are a direct barbarian challenge to free societies. Or maybe you have no problems with beheadings, or burning a prisoner alive as an object lesson to infidels.

And the number of people killed as a result of those Islamist videos runs into the hundreds in the West, though one can say America has been more or less spared from the higher death toll that Western Europe experiences.

Who exactly is being "radicalized" by ISIS videos on YouTube? Is it your neighbor Bob down the street? I can count on one hand the number of actual Americans or Europeans (as opposed to paper Americans or Europeans, or as it goes lately, people who are just squatting in America or Europe) who have renounced their allegiance to the Western devils and left to join the caliphate. Think John Walker Lindh, and he was around long before YouTube.

It seems instead the people who are being "radicalized" are the ones who arrived, or whose family arrived, from those areas where ISIS has a physical, rather than virtual, presence. Setting aside the fact that YouTube may take a backseat to influence from family, peers, religious leaders, etc. in these cases, it makes the entire solution seem a bit bass-ackwards. We allow terrorists to enter the country, then blame YouTube for making them terrorists?

A common-sense solution to this problem would be simply to stop accepting immigrants from areas with a high level of anti-Western sentiment. Of course, when Orange Man did this he was decried as racist by the left. It seems they are less concerned with saving American and European lives than using "radicalization" as a pretext to furthering Internet censorship, all in the name of maintaining a "free society", in the Orwellian sense, of course.

Dudes. If you are going to voice forceful opinions, do at least the minimum work on what you are talking about.

"At the beginning of 2019, YouTube announced that its algorithm will no longer recommend “borderline content” that could harm or seriously misinform viewers."

A timeline of algo changes here.

Yes, your thought police already won.

Let me guess: now it will be used to say that any decline in revenue to defense contractors, and any decline in US forces buying from them, will be labeled hate speech.

The obvious becomes obvious. Nice job Boomer, I have Raytheon stock.

Huh? A private company chose its actions.

Don't tell me you are making the switcheroo that private companies should only decide your way ..

Curious to see if MSNBC and CNN radicalize Boomers and Cat Ladies.

When might Ledwich and Zaitsev examine the gaming industry for evidence of "radicalization"?

While video games may lack explicit political content, the prospect of electronic weapons training (in lieu of [or in addition to?] conventional target practice or live-fire drills) surely was conceived and marketed with some tacit purpose in mind (in addition, of course, to the inherent entertainment value).

--so tell us about the target audiences (pun unavoidable) of the video gaming industry (as it morphs into the VR gaming industry).

Arvind Narayanan has an interesting tweet stream criticizing this paper. Some bits from the stream:

The key is that the user’s beliefs, preferences, and behavior shift over time, and the algorithm both learns and encourages this, nudging the user gradually. But this study didn’t analyze real users. So the crucial question becomes: what model of user behavior did they use?

The answer: they didn’t! They reached their sweeping conclusions by analyzing YouTube *without logging in*, based on sidebar recommendations for a sample of channels (not even the user’s home page because, again, there’s no user). Whatever they measured, it’s not radicalization.

Incidentally, I spent about a year studying YouTube radicalization with several students. We dismissed simplistic research designs (like the one in the paper) by about week 2, and realized that the phenomenon results from users/the algorithm/video creators adapting to each other.

Let’s not forget: the peddlers of extreme content adversarially navigate YouTube’s algorithm, optimizing the clickbaitiness of their video thumbnails and titles, while reputable sources attempt to maintain some semblance of impartiality. (None of this is modeled in the paper.)

After tussling with these complexities, my students and I ended up with nothing publishable because we realized that there’s no good way for external researchers to quantitatively study radicalization. I think YouTube can study it internally, but only in a very limited way.

If you’re wondering how such a widely discussed problem has attracted so little scientific study before this paper, that’s exactly why. Many have tried, but chose to say nothing rather than publish meaningless results, leaving the field open for authors with lower standards.
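To make the feedback-loop argument concrete, here is a deliberately toy sketch of a user and a recommender adapting to each other. Every coefficient, and the assumed "one notch further" bias, is made up for illustration only; it is not a model of YouTube's actual system, and Narayanan's point is precisely that this dynamic cannot be observed from logged-out snapshots.

```python
# Toy sketch: user taste and recommender output nudge each other over time.
# Scale: 0.0 = mainstream, 1.0 = extreme. All coefficients are invented.
user_pref = 0.2
last_watched = 0.2
for step in range(50):
    recommended = min(1.0, last_watched + 0.05)          # assumed slight "one notch further" bias
    last_watched = 0.7 * user_pref + 0.3 * recommended   # user mixes own taste with what is offered
    user_pref = 0.95 * user_pref + 0.05 * last_watched   # taste drifts slowly toward the viewing diet
print(f"user preference after 50 steps: {user_pref:.2f}")
```

The drift per step is tiny, which is exactly why a short-term, logged-out study design would not detect it even if it were happening.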

This does a great job of showing the partisan bias underlying the initial study. Poor scientific methodology was chosen to try and push a narrative, and unfortunately, it worked on Tyler.

Of course it worked on Tyler. It's precisely what he'd want to hear, given his particular and fairly obvious biases towards tech. You see it all the time with him. When he likes the conclusion of the paper, it gets uncritical bumf written about it. When he doesn't like the conclusion he gives the paper a very critical overview. It’s fairly predictable.

Seems intuitively obvious. Radical content is non-monetizable, so YouTube steers its users instead towards videos that advertisers are willing to run ads alongside.

YouTube Premium subscribers don't see ads, so incentives differ there. But that's a negligible percentage of users.

It's not clear if the study was looking for directly presenting radical leaning or disinfranchised people to the radical content or looking for other patterns. I would suspect the process of radicalization is hardly a straightforward process and likely one that is related to viewing the existing status quo as overbearing. In other words, the behavior described for YouTube and it's algos may well be enough to push those on the edge more towards radicalization -- and so to seek it out to spite the system that wants to give them some other content.

Was that type of effect considered?

As a heavy YouTube consumer I have noticed a big change in the way the site recommends videos. The more benign the content, the wider the variety of channels that will be recommended. I used to be able to follow the rabbit hole down both right- and left-wing videos. But now, I get fewer and fewer new political channel recommendations. The algorithm (yes, I used the singular, and you can go pound sand bear/shrug) really likes to recommend popular trending videos. Watch one music or gaming video and the entire “trending” tab for those subjects will show up in your feed. I spend more time clicking “do not recommend this channel” now than watching what is recommended. If I want to watch political commentary, I have to search for it. But with subjects like auto repair/restoration, I have found a lot of interesting content. New channels all the time.

My YouTube use is completely non-political -- just music and old TV clips. But my musical taste has moved toward far heavier, darker and doomier metal, almost entirely from exposure to the YT suggestion menu (and some viewer suggestions in the comment strings). My taste in classical has also evolved, but in the other direction, to more quiet and restrained solo piano.

So is this some kind of analog indicating that YT could help "radicalize" your thoughts and views?

Given that YouTube can change the recommendation system on us, the most we can claim is that, in the window of time that was tested, the YouTube algorithm favored mainstream sources. But was this the case a few years ago, when the issue came to the forefront? The paper can't say anything about that.

I used to have to clean my YouTube history because there were certain topics that would quickly make the algorithm move towards crazy BS: for instance, a video or two on America fielding low-IQ troops in Vietnam would quickly lead to white supremacy, and to how the ACW was about states' rights.

Nowadays I only see that on topics where the somewhat suspicious views are quite popular, like videos on Star Wars.
