Why have we given up our privacy to Facebook and other sites so willingly?
Cambridge Analytica's ransacking of millions of Facebook users' data has triggered a backlash against the social network, and highlighted how much personal information we share without thinking of the consequences
Facebook is on the ropes. A week of revelations about Cambridge Analytica's use of data gleaned from the social network has left the world demanding answers. The company can't seem to decide: is it outraged that it was taken advantage of by an unscrupulous actor, or relieved that this is just normal use of tools that it made widely available for almost five years? Should Mark Zuckerberg come out front and centre leading the response, or should he hide in a cupboard until it all blows over?
Faced with its first true crisis, the company is paralysed with fear. And that paralysis is, remarkably quickly, leading people to reassess their relationship with the site as a whole. The teens got there first, really. Facebook usage among younger people has been declining for years, in the face of competition from upstart rivals such as Snapchat, internal disruption from Facebook-owned Instagram, and a general sense that Facebook is full of old people and parents. But the backlash isn't a generational thing any more. We're all losing control of our data, both online and off, and we're starting to kick back.
Not only is the burgeoning #deletefacebook movement picking up steam (although it will take a few weeks before hard numbers are available about how many have followed through on their words), but people are also beginning to look up, as if from a daydream, to ask: how exactly did we end up in this situation? Why did we give up our privacy so willingly? And how can we get it back?
The 50m profiles harvested from Facebook by a Cambridge Analytica partner under the guise of research are a huge data store, but they pale in comparison with the amount of information the company holds on its own users. At the same time that Facebook turned off the spigot that had been used to pump industrial quantities of data off its platform, the company opened up a second set of floodgates: the Facebook Audience Network, which allows third parties to track, profile and advertise to Facebook users wherever they find them on the internet.
Facebook isn't really a social network. It's barely even an advertising company. It's a data analytics firm, which manages to use its position as the middleman for a vast proportion of all human communication to find out everything there is to know about its users.
Just as Cambridge Analytica claimed enormous powers of perception with a scant selection of personal information, Facebook also boasts to advertisers about how much it knows about its users, and how effective it can be at influencing their minds: it cites a games company that "made video adverts to match different gamer styles" for a "63% increase in purchase intent"; a clothes retailer that achieved "a dramatic increase in sales" with "richly personalised ads"; and a mobile network that scored "a major boost in awareness and purchase intent" by focusing on users with families. (Facebook used to have a similar page on which it showed off to politicians about how effective it was at swinging elections, but it quietly removed that in February.)
If you think you're a passive user of Facebook, minimising the data you provide to the site or refraining from oversharing details of your life, you have probably underestimated the scope of its reach. Facebook doesn't just learn from the pictures you post, and the comments you leave: the site learns from which posts you read and which you don't; it learns from when you stop scrolling down your feed and how long it takes you to restart; it learns from your browsing on other websites that have nothing to do with Facebook itself; and it even learns from the messages you type out then delete before sending (the company published an academic paper on this "self-censorship" back in 2013).
This data life isn't limited to Facebook. Google, famously, is in the same basic business, although the company is a bit more transparent about it (for a shock, try going to the "My Activity" and "Location History" pages to be vividly reminded that Google is tracking everything). And Amazon is building a modern surveillance panopticon, replete with an always-on microphone for your kitchen and a jaunty camera for your bedroom, purely to sell you more stuff.
Avoiding the big players doesn't help much. Large data brokers such as Experian and Equifax exist to collate information about everyone, whether or not they're online. The security services continue to build their own surveillance databases, with powers strengthened in the UK through the recent Investigatory Powers Act. Even going to church now comes with the potential for a dose of surveillance: the Church of England has authorised the roll-out of 14,000 contactless card readers, to let parishioners give without carrying cash. Is it time to say goodbye to the anonymity of the collection plate, and hope you're one of the more generous donors?
Richard Stallman has been warning of this state of affairs since before Zuckerberg even touched his first computer. The veteran computer scientist, creator of the GNU operating system and leader of the Free Software Movement, warns that "the only database that is not dangerous is the one that is never collected".
"There is a limit on the level of surveillance that democracy can co-exist with, and we're far above that," he tells me on the phone from the Massachusetts Institute of Technology. "We suffer more surveillance than the inhabitants of the Soviet Union, and we need to push it way down.
"Any database of personal data will be misused, if a misuse can be imagined by humans. It can be misused by the organisation that collects the data. In many cases, the purpose of collecting it is to misuse it, as in the case of Facebook, but also in the case of Amazon, Google to some extent, and thousands of smaller companies as well.
"It can also be misused by rogue employees of the company and it can also be stolen by some third party and misused. There'd be no danger of data breaches if a database doesn't exist. And, finally, it can be taken by the state and misused."
Stallman has little sympathy for those who choose to use such services. "They're foolish," he says, when I ask him why he thinks data harvesting is tacitly accepted by so many people. "They're accustomed to a certain kind of convenience ... they choose to ignore that it might be dangerous."
I'm less certain that there's a choice being made at all, though. Yes, people may regularly be accepting terms and conditions that require them to give up their data, but that doesn't mean they read them. I should know: I have. A few years ago, I decided to read, in full, the small print for every single product or service I used. I read almost 150,000 words of legalese (three-quarters of Moby Dick) in less than a week, from the 21,000 words required to turn off the alarm on my iPhone on a Monday morning to the 4,000 words required to browse BuzzFeed in my lunch break.
The experience was gruesome. Legal documents are not written to be read by humans, and certainly not to be read back-to-back in a harrowing marathon of End-User Licence Agreements. But I did learn one thing, which is that the modern notion of consent upon which the entire data edifice is built has the shakiest of foundations.
Lukasz Olejnik, an independent security and privacy researcher, agrees: "Years ago, people and organisations used to shift the blame on the users, even in public. This blaming is unfortunate, because expecting users to be subject-matter experts and versed in the obscure technical aspects is misguided.
"Blaming users is an oversimplification, as most do not understand the true implications when data are shared. They cannot. You can't expect people to fully appreciate the amount of information extracted from aggregated datasets. That said, you can't expect users to know what is really happening with their data if it's not clearly communicated in an informed consent prompt, which should in some cases also include the consequences of hitting 'I agree'."
He adds that at many organisations, privacy was not being taken seriously, "except when there was a need to include the phrase 'We take the privacy of our users very seriously' following a data breach".
It doesn't have to be like this. Doctors are required to demonstrate not just consent, but informed consent, from patients: the latter have to understand what they are agreeing to, or the agreement is moot. After years of mis-selling scandals, the same principle is slowly making its way to the financial industry. Logging in to check an ISA, you may be confronted with a 12-point questionnaire designed to check you understand the risks and are happy for the investment to continue.
Yet online, the biggest companies in the world base their businesses around users hitting "I agree" on a dialogue box on a website once, a decade ago, and then never being told what their agreement entails, nor being offered any way to retract their consent and take back control of the information they gave up.
Change is coming. In the EU, the General Data Protection Regulation (GDPR) overhauls a continent's worth of rules around a clear principle: the only person who can ever own an individual's data is that individual. Olejnik describes the law as a "good starter", but notes that even it will still need to be "reviewed and updated on a regular basis".
Stallman wants to go one step further. "I recommend a law prohibiting any system that collects data," he says, "no matter who runs it, whether it's a company, some non-profit organisation, or a public agency, whatever, that they are not allowed to collect data unless they can justify it as absolutely necessary for the function to be done."
It would be a huge step, and one that is unlikely to come without a radical change in how the public views mass data collection. But he has hope, and rejects the label of a Cassandra, doomed with accurate predictions that will always be ignored.
"I don't know the future, because the future depends on you, so I'm going to try my damn best," he says. "I'm a pessimist by nature. But just because things look dim is no reason to give up. And that's what I've been saying for many, many years."