This week Facebook users learned anew the meaning of privacy disenfranchisement. The revelation came just in time for the holidays, when the New York Times published a blockbuster article and a follow-up about the extent to which Facebook shared – and allegedly still, in part, shares – user data. After the Cambridge Analytica scandal, Facebook claimed it had changed its sharing practices. For those who watched Congressional hearings in which Facebook executives testified about privacy reforms, the NYT revelations gave cause for cognitive dissonance.
The NYT article describes how Facebook provided some partner companies with enhanced permissions to access user information. The reporters claimed, for example, that certain companies could obtain information about Facebook users’ friends, and that some of those companies still appear to have enhanced access to user data.
In a blog post, Facebook explained the sharing and staunchly defended its practices, saying that “…none of these partnerships or features gave companies access to information without people’s permission, nor did they violate our 2012 settlement with the FTC.”
Facebook’s response to the revelations may have been legally wise, but it failed to recapture the trust of some users – including this author – who feel disempowered by the platform. Many individuals made conscious choices to tighten access to their personal information on Facebook, especially post-Cambridge Analytica. Ticking boxes, opting out, and locking down profiles were supposed to be effective privacy tools. But fresh and pressing questions remain about how well those choices were respected.
The reality of how information works is that it is difficult to wield absolute control over what happens to information once it has been shared broadly. Information travels. It can be customised, combined with other information, and fed into algorithms that drive decisions about people. It can be used to map associations and patterns. Until we know more about the fate of all of the data shared by Facebook, we cannot assess all of the consequences.
For Facebook users considering an appropriate response, the first question is often whether to leave the platform entirely. It’s a complicated matter for many, especially for those who run a business and believe they must be on the platform for competitive reasons. Pressure from school friends and relatives is also a reason that some will want to stay on Facebook. For those who have acquired significant followings on Facebook, there’s the additional issue of not wanting to let fans and followers down.
Whether leaving or staying, Facebook users in the UK have meaningful rights under the GDPR: they can ask Facebook for a copy of their data, object to the processing of their data for direct marketing, ask for deletion of data, and file complaints with the Information Commissioner’s Office, among other actions. The GDPR is a robust tool with significant penalties and should be considered a front-line defence.
For those who decide to remain on Facebook, consider creating a Facebook-only email address that is used absolutely nowhere else. Give Facebook as little personal information as possible, including omitting a phone number if practicable. Carefully consider all other choices before using Facebook to sign in to other websites, and the same goes for using Facebook Messenger via other websites. Ask friends to communicate by non-Facebook means when possible. Prune Facebook friends to those known and trusted, and work to learn about and try out tools that enhance privacy.
For those who never imported their contacts to Facebook, consider keeping it that way. If contacts have already been imported, the import cannot easily be reversed, but a request for deletion might help in the long term. No matter what, take the lesson: don’t allow other websites to routinely import your contacts. Learn to question that request and get more information. Learn to say no, and say it often when it comes to sharing your information and that of your friends.
But above all, to achieve truly substantial privacy improvements, users must have a meaningful voice at Facebook, and Facebook needs to give them a seat at the table early in its decision-making about privacy. Ticking a plethora of boxes is the ultimate powerlessness in the face of the meaningful privacy issues that need to be tackled. A seat at the table, not more tick boxes, is what is most needed.
Pam Dixon is the Executive Director of the World Privacy Forum, a non-profit public interest research group