Building Trust 11 April 2018

The Larger Facebook/Cambridge Analytica Question: Is this really what we signed up for?

By Sally Wentworth, President and Chief Executive Officer (CEO)

Mark Zuckerberg’s testimony before the US Congress today, the flood of news about the privacy breach at Facebook, and revelations that the company mishandled the data of millions of people have me asking:

Is this really what we signed up for?

It is clear that we are not in control of our online information, nor do we really have any idea how it is bought, sold, or used.

For some of us, signing up for a social network like Facebook was about staying in touch with our kids and friends. For others, it was an easy way to reach new customers, or gather a community behind a social project. Yes, many of us figured out that our information was being used to serve up ‘relevant’ ads; in fact, that seems pretty standard in today’s online world. But that’s only a small part of a much bigger picture.

In the past few weeks we have found out – yet again – that information about ourselves, our friends, and our contacts was used far beyond what we intended. We have been profiled, pigeon-holed, politically manipulated, and played like pawns in someone else’s chess game. I’d challenge you to find anyone who says “yes – that’s what I was signing up for, and I knew it, and I am entirely comfortable with where we’ve ended up”.

No matter how or when it started to widen, this gap between what we reasonably expect and what is actually being done with our personal information reflects an unforgivable breach of ethics.

No one signed up for this.

The sense of outrage so many of us feel at such an egregious breach of trust – whereby Cambridge Analytica was able to make use of people’s personal data, collected from Facebook without their knowledge or explicit permission – hasn’t let up either.

Of course, we enjoy the benefits of free access to online platforms that let us connect with friends and meet new ones, share our stories and recommendations, and find new and interesting products, services, and pastimes.

Yes, these services are “free”, but only in exchange for data about us: the information we hand over to gain access. But our choices are based on a partial and misleading picture of what we’re really signing up for, and a false impression of the resulting risk.

We aren’t sharing a few personal messages with only our close friends and contacts. We are pouring our data into a vast and volatile market that has an economic momentum we can’t control, and a political influence we are only just starting to understand.

This bears no relation to most people’s understanding of what a social network platform is about. The agreements we thought we were making look more and more like Faustian bargains.

This current episode has already triggered a process of investigation and response by governments, regulators, and those entrusted with safeguarding the rights of citizens. Yet, big as it is, Facebook is just one element of the online ecosystem.

But amongst all of this – the testimony, apologies, criticisms, and questions – there has been little discussion of where we should end up.

So for anyone who collects, uses or shares information about us, here’s what we want:

  1. Fairness: Be fair with us. Respect our data, our attention, and our “social graph”. This means putting our interests above yours. This should be unequivocal. Your business model IS us. Seek our consent honestly, and when you use or share our information, don’t exceed what we consented to. If you do, we expect our lawmakers to hold you accountable in a meaningful way.
  2. Transparency: Make your privacy terms easier to understand so our consent actually means something. Be up-front and honest about your business model, your partners, your privacy policies and practices. Open your enterprise up to privacy audit, and then tell us what you are doing to address the findings.
  3. Choice: Give us genuine choices, starting with “opted out by default”. Let us opt in if we see fit. Let us opt out when we change our mind. Respect our right to stop using your products and services. Delete our data when we leave – and sooner if you no longer need it.
  4. Simplicity: You design your services for minimum friction and maximum convenience; apply your design efforts to privacy, too. Don’t expect us to manage our data, piece by piece, or fiddle with complex settings: let us express our preferences and intentions, and respect our choice.
  5. Respect: Show your respect for us and our interests, especially our privacy and autonomy. This means that public policies should prioritize our privacy, not corporate interests. Don’t treat us as mere raw material, or as the product you sell to your customers. From now on we will no longer shrug off privacy concerns and say ‘Well, I have nothing to hide.’ Now we know better.

Getting there is going to take action from all of us. It’s time to stand up and ask for a better deal.

The game has changed. We need to demand new rules.

Note: The blog post has been amended for accuracy.


Disclaimer: Viewpoints expressed in this post are those of the author and may or may not reflect official Internet Society positions.
