For several months we investigated how users are exposed to extremist content on YouTube and what role extremist vloggers and the YouTube recommendation algorithm play in the real-life radicalization of viewers. Facebook and Twitter have repeatedly been implicated in the distribution of (hate) propaganda and fake news, but remarkably little is known about the role of YouTube, even though it is one of the most popular media platforms in the world. Its importance has sadly been underscored by the viral videos of ‘involuntary celibate’ Elliot Rodger, who killed six people in 2014, and by the recent Christchurch attacker in New Zealand, whose ideology bears a strong resemblance to the extremist content we encountered on YouTube.
In a collection of written articles, data visualizations, videos and a podcast for our broad online and print news audiences, we shed light on the most important YouTube channels of the ‘reactionary right’, both in the Netherlands and internationally. We uncover their mutual connections around themes like antisemitism, anti-feminism and white supremacy, and we show how these connections can easily lead viewers down a rabbit hole of increasingly extremist content, even if the YouTube recommendation algorithm itself is not biased. To sketch the full picture, we not only talked to radicalization experts and analyzed some 600,000 videos and 120 million comments from 1,500 YouTube channels, but also tracked down a handful of the (usually anonymous) commenters whose comments expressed increasingly extremist views. In this way we hope to shed light on radicalization processes fuelled by YouTube and other social media — processes that the interviewees themselves described as ‘personal development’ and a deepening of their understanding.
What makes this project innovative?
What was the impact of your project? How did you measure it?
Source and methodology