YouTube Removes White Supremacist Content

LULU GARCIA-NAVARRO, HOST:

YouTube is removing thousands of videos with white supremacist and other extremist content. It's the latest big tech company to tighten restrictions in response to public pressure to clean up the hate speech that permeates social media. NPR national security correspondent Hannah Allam is here with us to talk about the purge. Hey.

HANNAH ALLAM, BYLINE: Hi.

GARCIA-NAVARRO: So it has been a couple of days since YouTube began taking down videos. What have you seen? What kind of effect has it had?

ALLAM: Well, there was no waiting around. This policy kicked in immediately. YouTube videos with extremist content started vanishing - videos that promoted white supremacy, neo-Nazi videos. Some civil rights groups and people who've been targeted for harassment online say it's a step in the right direction, although they also have concerns that it doesn't go far enough or it's impossible to enforce. And on the flip side, there are people who say it goes too far.

GARCIA-NAVARRO: Well, tell me. Give me an example of that. Who's saying it's going too far?

ALLAM: One big example is Steven Crowder. He's a right-wing commentator. His YouTube channel has more than 3 million subscribers. He has a history of offensive language, including repeatedly insulting a Vox journalist over his race and sexual orientation. After reviews, YouTube ultimately decided not to take down his videos. They said it wasn't a violation. Got some pushback, but then they decided to demonetize them, meaning he can't profit from them. Of course, he wasn't happy with this, and neither were his fans. And his fans include Senator Ted Cruz, the Texas Republican who, you know, went on Twitter demanding that YouTube, quote, "stop playing God and silencing those voices you disagree with."

GARCIA-NAVARRO: In YouTube's effort to get rid of hate speech and bigotry, it also apparently swept up some unintended victims?

ALLAM: That's right. That was a concern going into this, and it has played out. We've already seen several examples of historical and educational material being removed, things like an educational video used to teach about Hitler in Nazi Germany. Even the Southern Poverty Law Center, which is one of the nation's best-known trackers of extremism, had one of its videos removed because it included an interview with a British Holocaust denier.

GARCIA-NAVARRO: So I guess that brings us to the question of who or what is flagging these videos?

ALLAM: Yes. And the question of enforcement and, you know, weeding out these videos is difficult. YouTube uses a variety of methods. They have automated systems, human monitors. YouTube users themselves can report and flag violations. But like other platforms, YouTube's in a bind. On one hand, it wants to be an open forum for a broad spectrum of ideas. On the other, it doesn't want to be accused of helping to spread extremism and hate that we've seen lead to violence in some cases.

GARCIA-NAVARRO: And, of course, it wants to make money.

ALLAM: Definitely, it wants to make money. Advertisers love YouTube because it's unmatched in its reach. And so, yes, there's definitely the financial angle, as well.

GARCIA-NAVARRO: Some free speech advocates say this restricting of content smacks of censorship.

ALLAM: That's right. It's typically framed as a kind of a free speech issue. But there's also an argument that it's a public safety issue, and that we're in a new era with new and evolving technology that's made it incredibly simple and instantaneous to spread hate and extremism. And when those ideas turn to violence, as we've seen happen in places like Christchurch, New Zealand, it becomes as much about public safety as it is about free speech. And so for now, this debate is in the private sector. But the concern is if these companies fail to do something about it, does the government then come in and start regulating?

And, you know, that raises a bunch of thorny First Amendment questions. And so that's the tension we're seeing - who gets to decide the new rules for a new era.

GARCIA-NAVARRO: NPR's national security correspondent Hannah Allam.

Thank you so much.

ALLAM: Thank you.

Transcript provided by NPR, Copyright NPR.

Hannah Allam is a Washington-based national security correspondent for NPR, focusing on homegrown extremism. Before joining NPR, she was a national correspondent at BuzzFeed News, covering U.S. Muslims and other issues of race, religion and culture. Allam previously reported for McClatchy, spending a decade overseas as bureau chief in Baghdad during the Iraq war and in Cairo during the Arab Spring rebellions. She moved to Washington in 2012 to cover foreign policy, then in 2015 began a yearlong series documenting rising hostility toward Islam in America. Her coverage of Islam in the United States won three national religion reporting awards in 2018 and 2019. Allam was part of McClatchy teams that won an Overseas Press Club award for exposing death squads in Iraq and a Polk Award for reporting on the Syrian conflict. She was a 2009 Nieman fellow at Harvard and currently serves on the board of the International Women's Media Foundation.