Facebook can make you happy... if Mark Zuckerberg wants you to be

The news that Facebook can manipulate users' emotions is no big surprise once you understand the power of the site

Felicity Morse
Tuesday 01 July 2014 10:11 EDT
Facebook CEO Mark Zuckerberg prepares to speak at a news conference at Facebook headquarters July 6, 2011 in Palo Alto, California. (Justin Sullivan/Getty)


To control a nation, you need to control its news. George Orwell knew it, Joseph Stalin knew it and now Facebook knows it.

The social network secretly carried out a psychological experiment on hundreds of thousands of its users in 2012, attempting to manipulate their moods. It found that it could make users happier or sadder by altering the amount of positive or negative content that appeared in their News Feeds.

Outrage over the results played out, ironically, on social media when they were released. One of the researchers tried to justify the experiment, saying the team wanted to investigate “the common worry that seeing friends post positive content leads to people feeling negative or left out”. And there was self-interest at stake too: Facebook was concerned that friends' negativity might lead people to leave the site.

There can be nothing ethical about an experiment that tries to make people unhappy without their consent, and without any real way of monitoring their mental health. Yet putting the ethics of the research to one side, consider how utterly terrifying it would be to live in a world where social media controls how you feel.

Well, you already do.

Facebook filters its News Feed ruthlessly. It says this is to stop you feeling overwhelmed by the thousands of updates you would otherwise receive, and it claims to prioritise posts according to what social media managers call 'engagement'. Essentially, you are more likely to see something your friends have commented on, liked or shared. The popular stuff rises to the top of the News Feed and the less engaging content is silently buried. If your posts regularly get no likes or comments, Facebook will stop showing them.
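To see how such a filter works in principle, here is a minimal sketch of engagement-weighted ranking in Python. The weights, field names and scoring formula are hypothetical illustrations; Facebook's actual News Feed algorithm is proprietary and vastly more complicated.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    likes: int = 0
    comments: int = 0
    shares: int = 0

def engagement_score(post: Post) -> float:
    # Hypothetical weights: assume a share signals more engagement
    # than a comment, and a comment more than a like.
    return 1.0 * post.likes + 2.0 * post.comments + 3.0 * post.shares

def rank_feed(posts: list[Post]) -> list[Post]:
    # Popular posts rise to the top; posts with no engagement sink
    # to the bottom and, in practice, are never seen.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("Alice", "Holiday photos!", likes=40, comments=6, shares=2),
    Post("Bob", "My lunch today.", likes=0),
    Post("Carol", "Provocative hot take.", likes=15, comments=30, shares=8),
])
for post in feed:
    print(post.author, engagement_score(post))
```

Under these made-up weights, the provocative post lands at the top of the feed.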

Lots of Facebook users don’t know this. The tiresome pedants who comment 'not news' under articles don’t realise they are unwittingly increasing the reach of those articles, allowing more and more people to see the very item that annoyed them. This is just one of the problems of a News Feed filtered in this way: engagement does not mean content is worthy. It merely means it provoked a response. Just look at Katie Hopkins.

Once you know this, it’s not surprising that Facebook can alter its users' moods. The News Feed is a stream of social stimuli, filtered to favour the provocative. Like emotional junkies, we ping from outrage at perceived slights to tears over a video of a dog standing sentry at its owner’s grave. But this is not the real problem. Showing only the most engaging posts can be defended: after all, it’s crowdsourced, it’s democratic.

The real problem with the secret experiment is that it undermines the assertion that the most popular posts are given priority, rather than the posts Facebook wants us to see. I run a Facebook page, and there are constant algorithmic anomalies that can’t be explained. Sometimes it feels as if someone is sitting in Facebook HQ arbitrarily turning a knob to decide how many people see my posts. I also harbour suspicions that the news Facebook lists in its trending module is heavily editorialised: it’s what Facebook wants to trend.

Facebook is the second most visited site in the world after Google. That’s a lot of power to have, and a lot of minds to own. It is time we acknowledged how influential these sites are and held them to account. Before we are all typing in Newspeak.
