YouTube’s algorithm could be fueling extreme ideas and polarization

As is already the case with Twitter and its tendency to create sociological bubbles and informational echo chambers, YouTube's algorithms, according to a new study, also appear to be fueling the most radical ideas, the harshest positions, and even conspiracy theories.

More than 330,000 videos from nearly 350 YouTube channels were manually analyzed and classified according to a system designed by the Anti-Defamation League.

From less extreme to more extreme

Processing more than 72 million comments, the study showed that all three types of channels (Alt-lite, Intellectual Dark Web (IDW), and Alt-right) increasingly share the same user base, and that users are steadily migrating from milder content to more extreme content.
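The study's actual pipeline is not reproduced here, but as a rough illustration of what "sharing the same user base" means in practice, here is a minimal Python sketch that measures pairwise audience overlap between channel categories using Jaccard similarity. The category names, user IDs, and data layout are all hypothetical.

```python
from itertools import combinations

# Hypothetical comment log: each channel category mapped to the set of
# user IDs that commented on videos in that category.
comments_by_category = {
    "alt-lite":  {"u1", "u2", "u3", "u4"},
    "idw":       {"u2", "u3", "u5"},
    "alt-right": {"u3", "u4", "u6"},
}

def jaccard(a: set, b: set) -> float:
    """Jaccard similarity: shared users divided by total distinct users."""
    return len(a & b) / len(a | b)

# Pairwise audience overlap between the three channel categories.
for cat_a, cat_b in combinations(comments_by_category, 2):
    overlap = jaccard(comments_by_category[cat_a], comments_by_category[cat_b])
    print(f"{cat_a} <-> {cat_b}: {overlap:.2f}")
```

On real data, a rising Jaccard score between two categories over successive time windows would indicate that their audiences are converging, which is the kind of trend the study reports.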

The study authors hypothesized that the alt-lite and the Intellectual Dark Web often serve as gateways to more extreme ideologies. They tested this hypothesis by tracking the authors of 72 million comments on roughly two million videos between May and July of last year.

The results showed that more than 26% of people who commented on alt-lite videos later moved on to alt-right videos and commented there as well.
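To make that 26% figure concrete, here is a minimal sketch of how such a migration rate could be computed from timestamped comments: a user counts as "migrated" if their first comment in the source category precedes a later comment in the destination category. This is an assumption-laden toy version, not the study's actual methodology, and all field names and records are invented.

```python
from dataclasses import dataclass

@dataclass
class Comment:
    user_id: str
    category: str   # e.g. "alt-lite", "idw", "alt-right"
    timestamp: int  # Unix time of the comment

def migration_rate(comments: list[Comment], src: str, dst: str) -> float:
    """Fraction of `src` commenters whose first `src` comment is
    followed by a later comment on a `dst` video."""
    first_src: dict[str, int] = {}
    last_dst: dict[str, int] = {}
    for c in comments:
        if c.category == src:
            first_src[c.user_id] = min(first_src.get(c.user_id, c.timestamp), c.timestamp)
        elif c.category == dst:
            last_dst[c.user_id] = max(last_dst.get(c.user_id, c.timestamp), c.timestamp)
    migrated = sum(1 for u, t in first_src.items() if last_dst.get(u, -1) > t)
    return migrated / len(first_src) if first_src else 0.0

# Toy example: share of alt-lite commenters who later commented on alt-right videos.
log = [
    Comment("u1", "alt-lite", 100), Comment("u1", "alt-right", 200),
    Comment("u2", "alt-lite", 150),
    Comment("u3", "alt-right", 50), Comment("u3", "alt-lite", 120),
]
print(migration_rate(log, "alt-lite", "alt-right"))  # 0.33... (1 of 3 users)
```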

The alt-right tends to sympathize with anti-Semitic, Islamophobic, anti-feminist, anti-communist, anti-capitalist, homophobic, racist, ethnonationalist, traditionalist, and neo-reactionary ideas. This type of ideology has been driven by the development of social networks, the Republican Party's strong opposition during the presidency of Barack Obama, and the impact of the Great Recession since 2008.

We still don't know much about radicalization on YouTube: for one thing, we're not quite sure what exactly moves people from alternative material to far-right material. That's partly because YouTube restricts access to recommendation data.

The tension between individual freedom and collectivism has remained unresolved since it emerged at the dawn of the 18th century. There is no definitive answer, and both positions probably need to coexist so that neither wins outright. The same happens with ideologies, and also with ideas that now seem radical to us (many of today's moderate ideas were, to a greater or lesser degree, radical in the past).

The question posed by the study is whether YouTube might be catalyzing a transformation that outpaces reflection, a kind of drift from moderate positions to more radicalized ones, driven not so much by the ideas themselves as by peer reinforcement on the internet. After all, extreme political ideas evolve out of the need to connect with others, which would also explain part of the current COVID-19 denial movement: