YouTube’s video recommendations are not random. And often, they’re dangerous.
The streaming service’s Autoplay feature amplifies bigoted, anti-Muslim, and xenophobic videos that have real-life consequences for our communities. And it brings a ton of revenue to YouTube.1
The Action Center on Race and the Economy (ACRE) released a report on October 31st shedding light on how YouTube’s Autoplay feature contributed to the radicalization of the white supremacist who murdered 51 Muslims in Christchurch, New Zealand, in March 2019.2
Here’s how YouTube’s Autoplay feature works:
The longer you stay on YouTube, the more ads you watch. That’s how they make money.
Seems logical, but here’s the kicker: YouTube keeps you glued to your screen by incrementally leading you to more and more extreme content.3 Too often, the result is recommendations that peddle far-right extremist content, handing extremists a free platform to build community and radicalize viewers.
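To make that dynamic concrete, here is a minimal Python sketch of an engagement-maximizing autoplay loop. It is a toy model, not YouTube’s actual system: the catalog, the “extremity” scores, and the watch-time formula are all invented for illustration.

```python
# Toy model of an engagement-maximizing autoplay loop.
# This is NOT YouTube's code: the catalog, the "extremity" scores, and
# the watch-time formula below are invented purely for illustration.

# Each hypothetical video has a baseline appeal and an "extremity" score.
CATALOG = [
    {"title": "Mainstream debate clip",    "appeal": 0.6, "extremity": 0.2},
    {"title": "Reaction commentary",       "appeal": 0.5, "extremity": 0.5},
    {"title": "Conspiracy channel upload", "appeal": 0.4, "extremity": 0.8},
    {"title": "Open extremist content",    "appeal": 0.3, "extremity": 1.0},
]

def predicted_watch_time(video, last_extremity):
    """Assumed engagement model: viewers are likeliest to keep watching
    content one notch more extreme than what they just saw, so the score
    peaks at a small step up in extremity."""
    step = video["extremity"] - last_extremity
    proximity = 1.0 - abs(step - 0.2)  # maximal when the step is exactly +0.2
    return video["appeal"] * max(proximity, 0.0)

def autoplay_session(start, hops=3):
    """Greedy loop: at every hop, queue whatever the engagement model
    scores highest. The objective is watch time, not accuracy or safety."""
    current = start
    history = [current["title"]]
    for _ in range(hops):
        candidates = [v for v in CATALOG if v["title"] != current["title"]]
        current = max(candidates,
                      key=lambda v: predicted_watch_time(v, current["extremity"]))
        history.append(current["title"])
    return history

print(autoplay_session(CATALOG[0]))
# ['Mainstream debate clip', 'Reaction commentary',
#  'Conspiracy channel upload', 'Open extremist content']
```

Under these assumptions, every autoplay hop lands on more extreme content, not because anyone chose extremism as a goal, but because the watch-time objective rewards it.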
YouTube has proved that they care about public opinion. Due to grassroots pressure, they’ve already taken moderate strides to reduce the amount of “harmful content.”4 But that’s just not enough, and we need your help.
ACRE’s report recounts the experience of one man who was radicalized by white supremacist content on YouTube at a young age but later renounced his far-right thinking:
“I think YouTube certainly played a role in my shift to the right because through the recommendations I got, it led me to discover other content that was very much right of center, and this only got progressively worse over time, leading me to discover more sinister content.”5
After he watched a video of Bill Maher and Ben Affleck debating Islam, YouTube’s recommendations sent him to the channel of conspiracy theorist and Islamophobe Paul Joseph Watson, and from there to “more sinister content.”
We have already seen the deadly and devastating effects of this radicalization pipeline on Muslim, Jewish, and migrant communities around the world.6 Now, more than ever, platforms like YouTube have a responsibility to take action and prevent future atrocities like Christchurch.
Sign the petition now and tell YouTube: Stop radicalizing viewers against Muslims.
Public backlash and media pressure have already forced YouTube to acknowledge and reduce the dangerous impact of its model. They have begun limiting recommendations of conspiracy theory videos and “harmful misinformation.”7 As a result, the views those videos receive through recommendations dropped by over 50% in six months.8 They have the tools to stop promoting Islamophobia, and with your help, we will make them do it.
The movement to hold tech companies accountable is growing bigger than ever. Google, YouTube’s parent company, and Alphabet, Google’s own parent, are already under scrutiny for their unethical decisions.9 You and people like you are reclaiming power that should never have been in the hands of big tech. One campaign at a time, we can win this together.
In partnership with the Action Center on Race and the Economy (ACRE).
By submitting your email address, you agree to receive communications from MPower Change and ACRE. You will be able to unsubscribe from communications you receive from us by following the directions included in each such communication.
Sources:
1. "How YouTube Drives People to the Internet’s Darkest Corners," Wall Street Journal, February 7, 2018
2."Fanning the Flames," Action Center on Race and Economy, 30 Oct, 2019
3."Fanning the Flames," Action Center on Race and Economy, 30 Oct, 2019
4. "Our ongoing work to tackle hate," YouTube Official Blog, June 5, 2019
5."Fanning the Flames," Action Center on Race and Economy, 30 Oct, 2019
6. "After New Zealand Massacre, YouTube’s Algorithm Still Promotes Islamophobic Videos," Huffington Post, March 16, 2019
7. "Continuing our work to improve recommendations on YouTube," YouTube Official Blog, January 25, 2019
8. "Our Ongoing Work to Tackle Hate", YouTube Official Blog, June 5, 2019
9. “Google Employees Protest at Alphabet’s Shareholder Meeting,” Vox, 10 June 2019