Experts Say Russian Propaganda Helped Spread Fake News During Election
ARI SHAPIRO, HOST:
Fake news spreading on social media is a big part of this year's election storyline. Now two independent research groups have found that some of those stories had support from a Russian propaganda campaign. Craig Timberg wrote about this in The Washington Post and joins us now to talk about it. Hi, there.
CRAIG TIMBERG: Hi.
SHAPIRO: Describe what this propaganda campaign consisted of.
TIMBERG: So there are legions of botnets and paid human trolls that collect information and tweet it to one another and amplify it online. And that makes these stories that in many cases are false or misleading look much bigger than they are, and they're more likely to end up trending on Google News or end up in your Facebook feed.
SHAPIRO: Botnets meaning networks of automated fake people or what?
TIMBERG: Well, they're not fake people; they're essentially computer programs that you can program to sort of tweet back and forth to each other. What the researchers found was that there were these thousands and thousands of social media accounts that just basically amplified what one another were saying, and they did it essentially in a massive online chorus.
SHAPIRO: Why would Russia want to do this?
TIMBERG: Well, look, I think we've seen a lot of evidence that the Russians had a stake in the election that just passed. They've been mad at Hillary Clinton since the protests in Russia in 2011. They clearly seem to have a fondness for Donald Trump. And probably the most important reason is because we're a strategic competitor with Russia, and so undermining our democracy and our claims to having a clean democracy clearly were important goals to the Russians.
SHAPIRO: Was that animosity towards Hillary Clinton and fondness for Donald Trump reflected in the kinds of stories that were being promoted through these channels?
TIMBERG: Yeah, the Hillary Clinton storylines about her being sick or dying or a criminal about to be arrested, all that sort of stuff got amplified. Stories about Donald Trump being, you know, a candidate who could bring peace and settle tensions with Russia also got amplified.
There's also lots of stuff that was just, you know, tension raising. There were reports of supposed international incidents. You know, a fake coup in Turkey, the prospect that the U.S. was going to attack another country, that sort of stuff got echoed as well, in part just to raise the temperature of tensions during the election.
SHAPIRO: Does the U.S. government have tools to detect or prevent this from happening? It seems like minimal use to find out after the election that this was all going on.
TIMBERG: I think it's safe to assume that the U.S. government does absolutely have tools to monitor this. It's not clear that they or anybody else have tools to stop it, though. And one of the things we deal with in the story is that Facebook and Google have claimed that they can really crack down on fake news. They're kind of going to get off the sidelines on this issue.
But it turns out it's hard because the news ecosystem is enormously complex and it's easily manipulated, and there are lots of reasons to manipulate it. If you have a certain political point of view, you can get a hearing on the internet that you couldn't get 20 or 30 years ago.
SHAPIRO: During the campaign, American intelligence agencies said Russia was responsible for hacking the DNC emails, and now there's this. It sounds like there was a multipronged intelligence and misinformation effort by Russia to impact the U.S. presidential election. Can we say whether it worked?
TIMBERG: I don't think there's any way to know if it worked in the sense that it determined the outcome of the election, though the fact that we're even talking about it is kind of remarkable. Look, there's no way to run a parallel election and take these factors out and see what happens.
But let's remember, this was a very close vote where just, you know, a few tens of thousands of votes in a few states ended up making the difference. So I don't know, if you believe that the kind of information that crashes through all of our social media accounts affects how we think and potentially how we vote, I think you would conclude that this kind of stuff does matter.
SHAPIRO: Craig Timberg covers technology for The Washington Post. Thanks for joining us.
TIMBERG: It was my pleasure. Transcript provided by NPR, Copyright NPR.