Facebook's dark secrets: What else is the social network up to behind your back?

If you thought Facebook’s emotion manipulation tests were bad, look at what else they are doing

Jamie Merrill
Saturday 05 July 2014 17:27 EDT
Working with scientists at Cornell University, researchers from Facebook tweaked the emotional content of users’ News Feeds (Getty Images)


If Facebook were a country, something its founder Mark Zuckerberg has blogged about, its 1.3bn active users would make it the second largest in the world. More interesting, though, is that unlike any dictatorial regime past or present, it does not have to spy on its users – they give up their data willingly.

This week that ability to record everything from shopping habits to births, marriages and deaths came into focus again with the news that the social media giant had intentionally manipulated the News Feeds of nearly 700,000 users to see how positive and negative emotions spread across the site.

Working with scientists at Cornell University, researchers from Facebook tweaked the emotional content of users’ News Feeds during a single week in 2012. The motive, claim critics, was to see if happier users share more data, based on the rationale that more shares will result in increased advertising.

The study sparked outrage, and the Proceedings of the National Academy of Sciences (PNAS), which originally published the work, has been forced to issue a formal expression of concern over it. Facebook has said that “none of the data used was associated with a specific person’s Facebook account” and said it took “responsibility” for the upset caused. However, this isn’t the first time Facebook has been caught out.

“This is just the latest instalment of the insidious process of how people are overtly manipulated and marketed to online by the likes of Facebook,” Professor Alan Woodward, a leading computer privacy and security expert at the University of Surrey, told The Independent. “Facebook’s business model is one where, because the end user isn’t a paying customer, they and their personal data become the product.”

Most people are already aware that Facebook makes money by selling targeted ads to companies that want to reach a specific demographic. What many might not realise, though, is the scale of the information Facebook gains access to, its history of conducting experiments to encourage us to share more, and its flexible relationship with the terms and conditions governing data use.

Marketeers in disguise

The so-called “mood altering” study was the product of Facebook’s Data Science team, but this isn’t the first time this two-dozen-strong outfit has been used like this, says Professor Woodward, who describes its members as “marketeers in disguise”.

In fact, Facebook has been using these data scientists for years and, according to some reports, is actively recruiting more social researchers. Previous studies by the team have, for example, looked at patterns of “self-censorship” on the site – the instances in which people think about posting something and then stop. For Facebook and for advertisers these moments can be crucial to understanding why and how people present themselves online.

Locked out

Facebook’s media representatives in the UK are keen to point out that this week’s scandal isn’t a “privacy issue” and that no individual data was compromised. But the Data Science team is no stranger to controversy: the Wall Street Journal has reported that the unit previously conducted a study in which thousands of users were threatened with losing their accounts if they didn’t confirm their identity.

The affected users are said to have received a message warning them that the site believed they were using fake names, and that they could lose their accounts. In reality they were all genuine users and this was simply a test to help improve Facebook’s security measures.

Down on the farm

Facebook’s worst advertising scandal came in October 2010 when it emerged that tens of millions of Facebook users had their “Facebook ID” compromised while playing popular gaming apps, including farm simulator FarmVille, on the site.

It quickly became clear that these “building blocks of Facebook” had allowed the names, addresses, occupations and photos of affected users (including those with the most stringent privacy settings) to be exposed to dozens of advertising agencies.

Keeping you on track

Facebook’s policy statement reads: “When we deliver ads, we do not share your information.” This is certainly the case for personally identifying information, but the site, which is set to make more than $10bn in advertising revenue this year, does collect vast amounts of data of interest to advertisers – we’ve all had an experience where Facebook seems to “know” we are off on holiday or looking for a sofa, and offers up a corresponding advert.

There are two main reasons for this, explains Professor Woodward. First, Facebook is being sued in America for allegedly reading users’ messages and searching for keywords to build up a picture of them. It strenuously denies this.

He said: “This alleged monitoring of Facebook messages is probably one of the biggest manipulations out there. People are just not aware that their communications might be looked at in this way.”

Perhaps even more complex, though, is Facebook’s approach to tracking users. The company recently announced that it would be changing its policy on tracking and would start collecting information about users’ activity around the web – adding this outside browsing activity to the anonymous profiles it uses to sell targeted ads.

Facebook has said that this will help users get more relevant and useful advertising, but Prof Woodward says the fear is that this “build-up of data” will allow people to identify Facebook users from “snippets of data scattered around the internet, in a way that was never intended.”

Watch your T&Cs

Facebook users do agree to terms of service that give the company wide leeway in how it can treat them. Nonetheless, it has been a running complaint since 2005 that the site changes its terms and conditions regularly, often making them more complex and putting information such as your name, picture and gender in the public eye by default.

“The big question is why Facebook feels the need to keep changing its privacy settings,” asked Professor Woodward. “It doesn’t seem to do this for increased functionality or extra services, so it’s difficult to conclude there is not something else behind it.”

Part of the problem, says Chris Scott, a specialist privacy and reputation lawyer at Schillings, is that most users do not have any “meaningful understanding” of what terms and conditions mean. He added that most users have “little or no understanding” of how their personal information can be used.

For its part, Facebook says it clearly communicates changes and points to a recent Ofcom report which found that only 3 per cent of Facebook users didn’t know how to make their account more private.

The Feds sue Facebook

After years of evidence gathering, the US Federal Trade Commission sued Facebook in November 2011. It claimed that Facebook “deceived consumers by telling them they could keep their information on Facebook private, and then repeatedly allowing it to be shared and made public.”

Facebook quickly settled and, along with Twitter and Google, is now required to undergo a privacy audit every two years for the next 20 years. But as one senior media planner at a digital advertising agency told The Independent: “The old adage that if you don’t pay then you’re the product remains the case. This is as true with Facebook as it is with any other free online service.”
