Facebook Scrutinized For Its Data Sharing With Cambridge Analytica
NOEL KING, HOST:
Facebook is coming under a lot of pressure after the personal data of some 50 million Facebook users was shared with a British firm called Cambridge Analytica. The company then used that data to try and influence the 2016 presidential election. At this point, the U.K.'s information commissioner is seeking a warrant to look at Cambridge Analytica's databases and servers. Rebecca MacKinnon is with us now. She's the director of Ranking Digital Rights. It's a nonprofit that ranks companies based on how well they protect users' data. Rebecca, thanks for coming in.
REBECCA MACKINNON: Thanks so much for having me.
KING: So the easy question here is, does Facebook have a problem protecting our personal information?
MACKINNON: Well, they certainly have a problem informing us about what they're collecting, who they're sharing it with and what's happening to that information. And in our ranking of Internet companies overall, I mean, basically the whole industry is not doing well on this. But we looked at a couple of questions around, does the company clearly disclose to users what options they have to control what data of theirs is retained and used and shared? And Facebook provided less disclosure. I mean, you know, you're clicking on these privacy policies, and Facebook says, well, you know, we had people's consent. But even if you read the policy - and I have a whole team that's reading these policies - Facebook has provided less disclosure and fewer options for controlling what gets shared than any other Internet platform that we examined...
KING: So they're at the bottom of the rankings.
MACKINNON: ...You know, including Chinese and Russian companies. So they disclose a minimal amount of information about what you can do to control your information, particularly when it comes to sharing for targeted advertising. And in terms of...
KING: Let me ask you about that. The data that Cambridge Analytica collected was used to target audiences with digital ads. Isn't it pretty common for Facebook to allow third parties to collect user data?
MACKINNON: Right. Well, this is part of the business model. But the issue is, what's getting disclosed to users? Do we have any idea about what's happening? And not just users - do regulators understand what's going on? Do journalists actually know what Facebook is sharing with whom under what conditions? This is not shared clearly.
KING: I mean, I could be one of those 50 million people. I'm on Facebook. But I don't know that. Should I? And can I?
MACKINNON: You should. You cannot right now, and that's a problem. And that points to another problem: for the most part, we're relying on companies to voluntarily disclose this information. I mean, privacy law in the United States is very weak. And I think that this latest incident with Cambridge Analytica just shows that we need a comprehensive privacy law and a privacy agency in this country, not just for consumer protection. It's now become really clear that data protection is nothing less than a national security imperative.
KING: Why do you think there hasn't been more outcry among consumers? People don't seem that ticked off, that furious or that freaked out about this.
MACKINNON: Well, in part, you just don't know what's happening.
KING: If you're one of 50 million.
MACKINNON: Right. And, you know, we've got a real problem there, as well. You know, there's news today that Facebook's chief security officer's team is being disbanded amid the investigation. And so there's a real question about the extent to which Facebook is really determined to get to the bottom of what's happened and whether it's more determined to protect its reputation.
KING: Rebecca MacKinnon, the director of Ranking Digital Rights. Thank you, Rebecca.
MACKINNON: Thank you.
Transcript provided by NPR, Copyright NPR.