By Lambert Strether of Corrente
Facebook CEO and squillionaire Mark Zuckerberg recently announced (in a Facebook post, naturally) that Facebook will become a “privacy-focused communications platform.” He writes:
As I think about the future of the internet, I believe a privacy-focused communications platform will become even more important than today’s open platforms. Privacy gives people the freedom to be themselves and connect more naturally, which is why we build social networks.
Since recent events — the New Zealand massacre live-streamed on Facebook, an executive exodus, a criminal probe into data sharing, and a ginormous day-long service outage, all in one week — have put Facebook back in the news, now seems like a good time to evaluate Zuckerberg’s new direction. In this simple post, I’ll ask two questions: First, can Zuckerberg be trusted? Second, can Facebook become “privacy-focused,” as any normal human would understand the term? Spoiler alert: No, and no.
Should Zuckerberg be Trusted?
Zuckerberg should not be trusted, for three reasons.
First, Zuckerberg either lies or bullshits consistently and flagrantly. Hacker Noon reviewed Zuckerberg’s Congressional testimony:
I watched Mark Zuckerberg’s testimony to Congress this week (first to the Senate, then to the House), I was shocked by how many patently false or misleading statements he made…. feel it is important to already get some of the obvious lies out there. I feel that we software engineers and machine learning experts who actually understand Facebook’s technology have a duty to spread the word that Mark is either lying or he doesn’t actually know what Facebook does
The article reviews four of Zuckerberg’s lies, in detail, and concludes:
For those who are US citizens, I ask you to consider for a moment the gravity of the fact that the CEO of one of the world’s most powerful companies is outright lying to the Congress.
Zuckerberg is seemingly oblivious to how people really feel about him. He is, without question, one of the least trusted tech executives alive today. When Satya Nadella says he is going to do something with Microsoft, people don’t question his ulterior motives. When Tim Cook (or Tim Apple) says he’s going to do something at Apple, people don’t squint their eyes trying to see where the deceit may [lie] in his statement. Yet, with Zuckerberg, it’s the complete opposite. Zuckerberg could tell the world he’s giving away all of his stock, selling his homes, and walking barefoot around the globe in penance for his sins, and we would all wonder if he’s secretly figured out how to generate advertising money from people who walk around the globe barefoot. Facebook can pivot as many times as it wants, but the problem isn’t Facebook, it’s Zuckerberg.
That’s hardly fair. The problem is Facebook too!
Second, Facebook the firm lies strategically. From the British House of Commons, “Disinformation and ‘fake news’: Final Report” (PDF):
The management structure of Facebook is opaque to those outside the business and this seemed to be designed to conceal knowledge of and responsibility for specific decisions. Facebook used the strategy of sending witnesses who they said were the most appropriate representatives, yet had not been properly briefed on crucial issues, and could not or chose not to answer many of our questions. They then promised to follow up with letters, which—unsurprisingly—failed to address all of our questions. We are left in no doubt that this strategy was deliberate.
There’s no reason to think that the strategy of Facebook’s Founder is any different.
Third, previous Zuckerberg announcements have come to nothing. Consumer Reports:
The company hasn’t always delivered on past promises. In the spring of 2018, for example, Zuckerberg announced that a “Clear History” setting would soon allow consumers to delete data Facebook had collected off the site and from third parties. Nearly a year later, the tool hasn’t appeared. It’s now promised for this spring, and it’s still unclear exactly how it will work. Consumer Reports e-mailed Facebook for more details about the rollout of “Clear History” but the company has not yet responded.
(See above; no “follow up” is a strategy.)
Can Facebook Become “Privacy-Focused”?
Returning to Zuckerberg’s announcement:
I believe the future of communication will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure and their messages and content won't stick around forever. This is the future I hope we will help bring about.
This privacy-focused platform will be built around several principles:
Private interactions. People should have simple, intimate places where they have clear control over who can communicate with them and confidence that no one else can access what they share.
Encryption. People's private communications should be secure. End-to-end encryption prevents anyone — including us — from seeing what people share on our services.
Reducing Permanence. People should be comfortable being themselves, and should not have to worry about what they share coming back to hurt them later. So we won't keep messages or stories around for longer than necessary to deliver the service or longer than people want them.
Safety. People should expect that we will do everything we can to keep them safe on our services within the limits of what's possible in an encrypted service.
Interoperability. People should be able to use any of our apps to reach their friends, and they should be able to communicate across networks easily and securely.
Secure data storage. People should expect that we won't store sensitive data in countries with weak records on human rights like privacy and freedom of expression in order to protect data from being improperly accessed.
Over the next few years, we plan to rebuild more of our services around these ideas.
Zuckerberg’s essay is a pragmatic business decision rather than some newfound manifesto for solving Facebook’s privacy problems. The writing is already on Zuckerberg’s wall: the underlying market realities for social media are changing. Consider two kinds of platforms owned by Facebook, the company – the town square version, e.g. Facebook, the original social network for broadcasting widely, and the living room version, e.g. a messaging subsidiary, such as WhatsApp, which narrowcasts to a select audience. The town square is slowly but surely emptying. As a social network, Facebook has 15 million fewer users today than in 2017. During October – December of 2018, 23% of Facebook users in the U.S. showed signs of activity, e.g. updated their status or posted a comment, as compared to 32% at the same time in 2017. In 2016, Facebook accounted for more than half of time spent on social networks, but that figure is anticipated to fall to 44.6% in 2019, and, for the first time, Facebook usage among the 11-24 demographic – highly coveted by advertisers – was expected to decline from 2018 on.
At the same time, the living room is increasingly where the action is taking place. In the third quarter of 2018, 63% of U.S. internet users shared articles and photos via messaging apps such as WhatsApp and Messenger, as compared to 55% sharing them on a town square social network. By the end of 2018, 60% of survey respondents globally (except for China) used WhatsApp, up from less than 50% two years earlier. In China, the dominant local messaging app, WeChat had 1 billion users in 2018, up from 500 million in 2014.
All Zuckerberg is doing is preparing the ground to shift resources over to where the users are going.
In other words, Zuckerberg’s announcement is really a “product road map.” This, again, makes sense, but not as a “pivot to privacy.” And so to the question we’ve posed: “Can Facebook Become ‘Privacy-Focused’?” And if we look at Facebook from the 30,000-foot level, from its founding until today, the answer is clearly no, because — like so many Silicon Valley companies — Facebook is based on a form of arbitrage. We can see Zuckerberg himself expressing this idea, in crude terms, way back in 2004 (and ironically enough, in chat):
Zuck: Yeah so if you ever need info about anyone at Harvard
Zuck: Just ask.
Zuck: I have over 4,000 emails, pictures, addresses, SNS
[Redacted Friend’s Name]: What? How’d you manage that one?
Zuck: People just submitted it.
Zuck: I don’t know why.
Zuck: They “trust me”
Zuck: Dumb fucks
Konstantin Kakaes explains in more sophisticated terms, in Technology Review: “Zuckerberg’s new privacy essay shows why Facebook needs to be broken up“:
Facebook has minted money because it has figured out how to commoditize privacy on a scale never before seen. A diminishment of privacy is its core product. Zuckerberg has made his money by performing a sort of arbitrage between how much privacy Facebook’s 2 billion users think they are giving up [“dumb fucks”] and how much he has been able to sell to advertisers [“Just ask”]. He says nothing of substance in his long essay about how he intends to keep his firm profitable in this supposed new era. That’s one reason to treat his Damascene moment with healthy skepticism.
So — as we might call it — privacy arbitrage is Facebook’s business model and has been since its inception (as regulatory arbitrage is Uber’s). To avoid tedious discussions about what privacy is and is not, let me give an example where privacy clearly does not exist. From Boing Boing, “You cannot opt out of Facebook’s surveillance network”:
Even if you don’t use it, Facebook is embedded across the web and in apps through ads, share buttons, tracking pixels and so forth, watching everything and everyone. Katherine Brindley set out to find how forthright the company was in its claims not to track users who engage privacy controls. Not very.
“I enabled a bunch of privacy settings and still felt like my Facebook/Insta ads were a little too relevant. So I faked a pregnancy by downloading the What to Expect app to see how long it would take for FB to hit me with a maternity ad. The answer? 11 hours.”
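To make the mechanism concrete, here is a minimal, hypothetical sketch (not Facebook’s actual SDK or ad API; all names are invented) of how an analytics SDK embedded in a third-party app can feed an ad platform, entirely outside any in-app privacy settings:

```python
# Hypothetical sketch of in-app event tracking feeding ad targeting.
# Not Facebook's real SDK or API; names and logic are invented for illustration.

# The ad platform's profile store, keyed by a device advertising ID.
profiles: dict[str, set[str]] = {}

def sdk_log_event(device_id: str, event: str) -> None:
    """What an embedded analytics SDK does: phone home with the device ID
    and an app event, regardless of the user's in-app privacy choices."""
    profiles.setdefault(device_id, set()).add(event)

def pick_ad(device_id: str) -> str:
    """The ad server picks a creative based on accumulated events."""
    interests = profiles.get(device_id, set())
    if "app_install:what_to_expect" in interests:
        return "maternity_ad"
    return "generic_ad"

# A user installs a pregnancy-tracking app; the bundled SDK reports it.
sdk_log_event("device-1234", "app_install:what_to_expect")
print(pick_ad("device-1234"))  # maternity_ad
```

The point of the sketch: the user never tells the ad platform anything directly; the app does, which is why “11 hours” is unsurprising.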
The systems that tracked Katherine Brindley will not go away with Facebook’s pivot. Nor will the collection of metadata generally. From the New York Times, “Zuckerberg’s So-Called Shift Toward Privacy“:
Here are four pressing questions about privacy that Mr. Zuckerberg conspicuously did not address: Will Facebook stop collecting data about people’s browsing behavior, which it does extensively? Will it stop purchasing information from data brokers who collect or “scrape” vast amounts of data about billions of people, often including information related to our health and finances? Will it stop creating “shadow profiles” — collections of data about people who aren’t even on Facebook? And most important: Will it change its fundamental business model, which is based on charging advertisers to take advantage of this widespread surveillance to “micro-target” consumers?
The Financial Times, in “Facebook’s pivot must be viewed with scepticism,” concludes that it’s unlikely that Facebook will do any of these things:
Moreover, end-to-end encryption is a red herring as far as privacy and advertisers are concerned. Facebook does not care so much about the content of private messages as their metadata, such as the location and recipients of messages. This is valuable to advertisers, allowing them to discover groups of users with similar interests. Mr Zuckerberg’s post mentions metadata only once, asserting that “we use this data to run our spam and safety systems”, and suggesting time limits on storage. Minimising data stored is a good principle, but it is largely irrelevant for advertisers once a network of connections has been identified.
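The FT’s point can be sketched in a few lines of toy code. This is an illustration only: the XOR “cipher” is a stand-in for real end-to-end encryption, and all names are invented. What matters is that the envelope metadata (sender, recipient, timestamp) stays readable to the platform even when the body does not:

```python
# Toy sketch: E2E encryption hides message *content*, but envelope
# metadata remains visible to the platform -- and yields the social graph.
from collections import Counter

KEY = 0x5A  # toy key held only by the endpoints, not the platform

def e2e_encrypt(plaintext: str) -> bytes:
    """Stand-in for a real cipher; the platform cannot read the result."""
    return bytes(b ^ KEY for b in plaintext.encode())

# What the platform actually stores and can read:
envelopes = [
    {"from": "alice", "to": "bob",   "when": "2019-03-14T09:00",
     "body": e2e_encrypt("private message")},
    {"from": "alice", "to": "bob",   "when": "2019-03-14T09:05",
     "body": e2e_encrypt("another one")},
    {"from": "carol", "to": "alice", "when": "2019-03-14T10:00",
     "body": e2e_encrypt("hi")},
]

# The content is opaque without the key...
assert b"private" not in envelopes[0]["body"]

# ...but the network of connections falls out of metadata alone:
contacts = Counter((e["from"], e["to"]) for e in envelopes)
print(contacts.most_common(1))  # [(('alice', 'bob'), 2)]
```

As the FT notes, once that network of connections has been identified, deleting the encrypted bodies changes little for advertisers.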
And Wired, in “Zuckerberg’s Privacy Manifesto Is Actually About Messaging,” concludes:
The fact that your individual messages might be encrypted in transit does not, in any way, prevent Facebook The Entity from knowing who your friends are, where you go, what links you click, what apps you use, what you buy, what you pay for and where, what businesses you communicate with, what games you play, and whatever information you might have given to Facebook or Instagram in the past.
(“Whatever information” including Facebook’s ginormous social graph, which will doubtless be thrown against the new data from “the living room.”) Clearly, everything that happened to Katherine Brindley with Facebook as it exists today will continue to happen. It’s absurd for Facebook to call this a “pivot to privacy.”
So think back to Facebook’s fundamental business model:
[A] sort of arbitrage between how much privacy Facebook’s 2 billion users think they are giving up [“dumb fucks”] and how much he has been able to sell to advertisers [“Just ask”]
What Zuckerberg is trying to do with his disingenuous announcement is keep that trade going, by making people think — again — that they’re not giving up as much privacy as they actually are, and that’s all he’s trying to do.
Should Zuckerberg be trusted? Of course not. Can Facebook Become “Privacy-Focused”? [hollow laughter]
 “Which is why we build social networks.” It is? Really?
 For your amusement and delectation, here’s a section of the Facebook Terms of Service: “We cannot predict when issues might arise with our Products. Accordingly, our liability shall be limited to the fullest extent permitted by applicable law, and under no circumstance will we be liable to you for any lost profits, revenues, information, or data, or consequential, special, indirect, exemplary, punitive, or incidental damages arising out of, or related to, Facebook Products, even if we have been advised of the possibility of such damages.” So, whatever Facebook might have been doing — it’s not even clear that the outage was an accident, given that it occurred across the entire Facebook “family of apps and services,” which Facebook could be furiously integrating to make an antitrust-driven breakup more difficult — good luck collecting for damages, pal!
 There’s also no particular reason to think that the messaging encryption won’t be backdoored, both for Facebook’s purposes, and as a service to the intelligence community.