Facebook Previously Failed To Keep Privacy Promises, Ex-FTC Adviser Says
NOEL KING, HOST:
Facebook says it is sorry. This past weekend, Facebook took out full-page ads in major newspapers apologizing for a, quote, "breach of trust." The company placed the ads after news broke that a political data mining firm, Cambridge Analytica, reportedly used the data of 50 million Facebook users. One person who does not buy the apology is Tim Wu. He's a former senior adviser at the Federal Trade Commission. And he was there in 2011, the last time that Facebook got in trouble for failing to keep its promises about privacy. Tim Wu is skeptical that the company will change its ways anytime soon.
TIM WU: The problem the FTC was confronting then was similar to the problem now - that there were a number of abusive apps that you installed, and then they did a lot more with your data than you thought they would. And one of the big problems was that Facebook gave you the impression that you could control your own privacy by, you know, setting the settings in certain ways, but those settings didn't do anything. They were, like, fake buttons.
KING: Why even put them up if they were fake, just to give people a sense of security?
WU: Yes, or - you know, they prevented some things, but the apps were very easily capable of getting around the settings. So Facebook promised to fix it all up. They entered into a big consent decree. They agreed to audits every two years for the next 20 years and a massive privacy program. It was all going to be different. But the problem didn't go away. In fact, those little settings remained fake. The survey app - the one created by Cambridge Analytica - was able to take the data of people's friends and family and take it all for itself, even though Facebook was supposedly not allowing that. So that is the problem. They've basically broken the promise they made to the country in 2011 and the promises they kept making to everybody through these privacy settings. And I think that's the core of the problem.
KING: Why does Facebook continue breaking its promises? Aren't they afraid that they'll get hammered by the FTC if they don't shape up?
WU: I think they clearly did not make it a priority. Let's put it that way. And I think that period, 2012 to 2015, was one where they were obsessed with revenue generation. And the fact is that privacy - it's like kryptonite to their business model. You know, they have to be able to promise their advertisers that they have the goods on everyone and they have the power to manipulate people. And so if they are also extremely tight on privacy, that tends to throw a wrench into the machine.
KING: Many of us use Facebook, and we want to keep using Facebook. And Facebook is a business. So what exactly is the fix here?
WU: I think the problem lies here. It's actually a very fundamental one, which is Facebook is always in the position of serving two masters. If its actual purpose was just trying to connect friends and family, and it didn't have a secondary motive of trying to also prove to another set of people that it could gather as much data as possible and make it possible to manipulate or influence or persuade people, then it wouldn't be a problem. For example, if they were a nonprofit, it wouldn't be a problem. But they...
KING: But they're not, right? (Laughter).
WU: They're not. But, well, that's the problem. I think there's a sort of intrinsic problem with having for-profit entities with this business model in this position of so much public trust because they're always at the edge because their profitability depends on it.
KING: Let me ask you - Facebook runs afoul of the FTC. Can the FTC seek damages for that violation? Can they fine Facebook?
WU: Yes, they can. Every single violation is punishable by a $40,000 fine. So it could be billions of dollars in damages if the FTC decides to police this very aggressively.
KING: Do you think Mark Zuckerberg is sincere in his apologies? He appeared on television. He seemed really concerned by what had happened. Are you buying it?
WU: He's the kind of guy who tends to decide how he should feel and then feel that way. I think at the heart of this company is something rotten, some overwhelming ambition to be the biggest, most powerful company out there. And because they're running fast and hard, trying to break things and handling stuff that's almost like radioactive material - people's data - I think more mistakes are going to happen. I don't think we're at the end of it unless they fundamentally change their business model.
KING: And fundamentally, that would make them a less successful business. So they are not likely to, are they?
WU: Well, if Mark Zuckerberg is telling the truth when he says, what I care about is connecting people to their families and friends, that's a very lofty ambition. If that's what he really wants to do, he can do it. But it doesn't mean his will be the most profitable company in the world. You know, utilities - which is what Facebook is, a social utility - have never been understood as profit centers before.
WU: And there's a reason. The social sphere is a little bit different. And maybe we need to accept that it's not a source of major profit to be in people's personal lives.
KING: We've talked a lot about what you think Facebook's responsibilities are and some moves that they could make. What about just straight-up government regulation? What would regulating Facebook look like even if Facebook didn't 100 percent like it or want it?
WU: So there are two major ways this could happen, and they both revolve around the idea that personal data is very sensitive. You know, it's almost like radioactive waste or something. You can't just dump it all over the place, right? And that's basically what Facebook has been doing. And now we're finding out, hey, it's kind of dangerous when we let people, completely unregulated, handle this stuff. So they need to be a fiduciary, a trustee, when it comes to the personal data of their customers. They need to be under very strong legal duties to handle that stuff carefully and properly and not, for example, allow it to be given out to random researchers without even anonymizing it, or really making sure it doesn't get stolen. You know, they didn't take any serious measures to prevent it from being stolen. And that's where regulation would go.
KING: What do you think a realistic future looks like for Facebook?
WU: I think a time of reckoning is coming. I think they need new management, frankly. Yes, I do. I don't think this management is doing what they say that they're promising, which is putting people's privacy first. They've been saying that for years. They're not actually doing it. Either that, or they can stay the same - but then they need to be under a lot more government supervision if we're going to trust them.
KING: Tim Wu is a professor at Columbia Law School. He was a senior adviser at the Federal Trade Commission. Tim, thank you so much.
WU: Sure, it's been a pleasure.
(SOUNDBITE OF THE SESHEN SONG, "FLAMES AND FIGURES") Transcript provided by NPR, Copyright NPR.