Facebook published a white-paper1 about privacy and data protection.
This 29-page white-paper is split into two halves. I will start by commenting on the first half: “the inherent tensions with communicating about privacy.” Then, I’ll move on to the second half: “charting a path forward.”
Facebook writes that despite “no easy or obvious solutions to the challenge of transparency, there are exciting new paths forward.”
When I first read this statement I thought “exciting new paths” heralded more transparency about Facebook’s business model of data extraction2. (Back in 2012, MIT Technology Review had already reported that “Facebook has collected the most extensive data set ever assembled on human social behavior.”3)
No, not yet — so, what are those “exciting new paths” about?
Facebook’s “exciting new paths” are “to consider privacy notifications as dynamic design challenges, and not just as formalistic compliance obligations.”
Basically that means Facebook wants to implement “privacy notifications” by “adapting” “human-centric design.”
That is it.
Talking about the implementation of those privacy notifications, Facebook acknowledges the complexity of comprehensiveness, design standardization, and the risk of click fatigue — yet I see further challenges. Carry on reading.
So — Facebook wants to inform people of data collection by “adapting” “human-centric design” methods. There is a caveat: those are the methods that made people addicted to digital products4.
Doesn’t taking care of people’s privacy and data also mean taking care of people? Helping people curb their addiction to digital products?
In 2013, Tristan Harris, a former Google employee who is now president of the Center for Humane Technology, sent a presentation to 10 Google employees to raise the alarm about product addiction5 6. Even Nir Eyal, the author of the best-selling book Hooked7, who helped startups make their products addictive8 and also warned against it9, is now working on helping people “get their attention back”10.
Facebook’s lack of transparency about data collection is certainly an issue. Yet will human-centric design methods, the very methods that got people drug-like addicted to products like Facebook, help?11 Isn’t this a quick fix to a larger, more complex issue? Shouldn’t Facebook treat the root cause instead?
Let’s move on to the next bit of that white-paper.
In adding more notifications, and demanding more clicks, Facebook recognizes a risk of click fatigue. In that context, can we expect people to make mindful decisions when these notifications pop up? Won’t people just click through mindlessly and blindly accept those intrusions?
That is what academic researchers say: studies repeatedly find a gap between people’s stated privacy preferences and their actual behavior; people claim to care about privacy, then click through anyway.
Research shows that the relationship between Facebook’s products and its users is complex. Sean Parker, early Facebook investor and Napster founder, said:
“Facebook was built to be addictive. […] It’s a social-validation feedback loop … exactly the kind of thing that a hacker like myself would come up with, because you’re exploiting a vulnerability in human psychology.”15
Will Facebook’s “privacy notifications” be enough to “empower” people to make “meaningful decisions”? Will people overcome the urge for gratification and reward?
I doubt it. Millions of people don’t even realize they are on the Internet when they use Facebook16. And even for people who are digitally savvy, how will Facebook explain the privacy implications of data collection when privacy itself is “a vague term in the first place”? Even academic researchers are struggling to explain what privacy is about in the digital space.17
Next in the white-paper, Facebook makes its case for joining the conversation on policy making.
Facebook invites “organizations, regulators, policymakers, and experts” to “collaborate on a people-centered approach to the development, testing, and evolution of new transparency mechanisms that meet the diverse needs of a global community.”
Let’s look at Facebook’s “global community” first.
Some think tanks and an early Facebook investor agree: Facebook is no community. Sandy Parakilas, a former Facebook employee, explains18:
“At an even more basic level, Facebook treats its users as a commodity to be hooked into the system, surveilled and then monetized.”
In startup ecosystems, we often hear: “If you don’t pay, you are the product.” That is inaccurate. Facebook users are not even the product; they are a commodity feeding the products of the Facebook company (i.e., Facebook.com, Instagram, and WhatsApp). Those users feed Facebook’s business model of extracting “behavioural data for computation and analysis”19, so that the clients of the Facebook company can target people with ads to change their minds, as consumers or as voters.
Next in the white-paper, Facebook calls on regulators “to explore policy co-creation strategies” to “better understand the products and technologies involved, identify concerns, and establish clear, upfront goals for privacy notices.”
In a world where technology can shape society20, do regulators want Facebook to “co-create” policies? In the aftermath of all the scandals Facebook was involved in, does society want Facebook to shape society?
I’ll give no answer. If you want to know more about Facebook’s lobbying efforts, please read Issie Lapowsky’s interview with Facebook’s deputy chief privacy officer, Rob Sherman.
Last but not least, Facebook professes goodwill toward “small and medium-sized businesses” because SMEs have “limited resources to invest in improving their privacy notifications.”
Rob Sherman, Facebook’s deputy CPO, mentions that SMEs can get “privacy expertise […] from Facebook”21. Do SMEs really want to get “privacy expertise” from a firm that was required to answer for its practices before Congress?
Also, could we not imagine an internet where SMEs collect little data about people? Where there is no need for “privacy notifications”?
Note: SMEs can take a look at this article outlining privacy best practices (covering third-party services, CDN providers, social widgets, marketing emails, data storage, PII, IP anonymization, syslog, and two-factor authentication).
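To make one of those practices concrete, here is a minimal sketch of IP anonymization, a technique from the list above. It assumes the common convention of truncating IPv4 addresses to their /24 network and IPv6 addresses to their /48 prefix before logs are stored; the function name and prefix lengths are illustrative choices, not prescriptions from the article.

```python
import ipaddress

def anonymize_ip(ip: str) -> str:
    """Truncate an IP address so it no longer identifies a single host.

    Convention assumed here: zero the last octet of IPv4 addresses
    (keep the /24) and keep only the /48 prefix of IPv6 addresses.
    """
    addr = ipaddress.ip_address(ip)
    prefix = 24 if addr.version == 4 else 48
    # strict=False lets us pass a host address and get its enclosing network
    network = ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
    return str(network.network_address)

print(anonymize_ip("203.0.113.42"))           # 203.0.113.0
print(anonymize_ip("2001:db8:abcd:1234::1"))  # 2001:db8:abcd::
```

An SME could apply this at the logging layer, so raw addresses never reach disk; the truncated address is still useful for coarse geolocation and abuse statistics while being far harder to tie to one person.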
While it feels good to read Facebook saying that “people have to be meaningfully informed, in a way that empowers them,” I feel the suggestions Facebook puts forward fall short of taking full care of people’s online privacy, a concept that is complex, still undefined, and widely misunderstood.
Erin Egan, Vice President and Chief Privacy Officer, Public Policy, Facebook, Communicating Towards People-Centered and Accountable Design (2020)↩
Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019), 86-87↩
Forrester Research, Human-Centered Design And Digital Addiction In 2019↩
Oliver Günther, Sarah Spiekermann, Privacy in E-commerce: Stated Preferences vs. Actual Behavior (Communications of the ACM, January 2005)↩
Alessandro Acquisti, Jens Grossklags, Privacy and rationality in individual decision making (IEEE Security & Privacy, Vol. 3, No. 1, January/February 2005), 26-33↩
Diana I. Tamir, Jason P. Mitchell, Disclosing information about the self is intrinsically rewarding (PNAS May 22, 2012)↩
Leo Mirani, Millions of Facebook users have no idea they’re using the internet (Quartz, 2015)↩
Kay Burkhardt, The privacy paradox is a privacy dilemma - interview with Prof. Dr. Spyros Kokolakis (Mozilla, Firefox Citizen, 2018)↩
Shoshana Zuboff, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power (2019)↩
Issie Lapowsky, Facebook’s plan for privacy laws? ‘Co-creating’ them with Congress (Protocol)↩