Facebook study finds people only click on links that they agree with, site is an 'echo chamber'

Though people tend to have a diverse range of friends, Facebook’s algorithm means that they mostly see posts from those who share their views, research has found

Andrew Griffin
Friday 08 May 2015 09:50 EDT
Facebook lights up the London Eye with the nation's general election conversations. The colours represent discussions of the parties on the social network (Getty Images)

Facebook users click almost exclusively on links that they already agree with, meaning that their news feeds can become an echo chamber, according to new research.

Most users of the site have friends with a broad range of political views. But they tend to click only on posts from friends they agree with politically, according to new research published in Science, and so the News Feed ends up showing them more of the same kind of content.

Researchers call that self-sustaining effect the “filter bubble”. As technology companies learn more about their users, they show them the kind of results that they have clicked on in the past, which has the effect of showing people things that they already agree with and want to see.

The new study was conducted by Facebook’s in-house scientists and aimed to establish whether the site’s algorithm was creating a filter bubble, and whether that led to political polarisation. Facebook’s algorithm decides what users see in their News Feed based on a range of criteria, including how often a link has been clicked by other users and whether it is the kind of content the user has engaged with in the past.
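
Facebook has not published the ranking system itself, so any concrete rendering is speculative. As a purely illustrative sketch of engagement-based scoring of the kind described here, in which the weights, field names and figures are all invented for the example, it might look something like this:

```python
# Illustrative only: a toy engagement-based ranking, not Facebook's actual algorithm.
# All weights, field names and numbers are invented for the example.

def score_post(post, user_history):
    """Score a candidate post by overall popularity and this user's past engagement."""
    popularity = post["clicks_by_others"]            # how often the link was clicked site-wide
    affinity = user_history.get(post["source"], 0)   # how often this user engaged with the source
    return 0.3 * popularity + 0.7 * affinity

def rank_feed(candidate_posts, user_history):
    """Order candidate posts so the most 'engaging' ones appear first."""
    return sorted(candidate_posts, key=lambda p: score_post(p, user_history), reverse=True)

if __name__ == "__main__":
    history = {"outlet_a": 12, "outlet_b": 1}        # one user's past clicks per source
    posts = [
        {"source": "outlet_a", "clicks_by_others": 5},
        {"source": "outlet_b", "clicks_by_others": 40},
    ]
    for post in rank_feed(posts, history):
        print(post["source"], round(score_post(post, history), 1))
```

The point of the toy example is only that posts resembling what a user has clicked before score higher, which is the feedback loop the researchers set out to measure.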

To establish whether people tended to engage with those they disagreed with, Facebook researchers mapped out the site’s users according to the political affiliation they had declared on their profiles. Over 10 million Facebook users were unknowingly mapped on a five-point scale according to whether they leaned conservative or liberal.

The site then analysed news content to decide whether the organisation posting it tended to be conservative or liberal. Researchers calculated this by looking at the affiliations of the people who liked it, ranking the New York Times as slightly liberal and Fox News as slightly conservative, for instance.
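
The article does not reproduce the paper’s exact scoring method, but the basic idea of averaging the ideological positions of the people who engage with a source can be sketched roughly as follows; the scale, outlet names and figures below are made up for illustration:

```python
# Rough sketch: score a source's political alignment as the average self-reported
# ideology of the users who liked its stories. The -2..+2 scale and all numbers
# are illustrative, not the study's data.

from statistics import mean

# -2 = very conservative, -1 = conservative, 0 = neutral, +1 = liberal, +2 = very liberal
user_ideology = {"alice": 2, "bob": 1, "carol": 0, "dan": -1, "erin": -2, "frank": -1}

# Which users liked stories from each (hypothetical) outlet.
likes = {
    "broadsheet_example": ["alice", "bob", "carol", "dan"],
    "cable_news_example": ["carol", "dan", "erin", "frank"],
}

def source_alignment(source):
    """Average ideology score of the users who liked this source's stories."""
    return mean(user_ideology[u] for u in likes[source])

for source in likes:
    print(source, round(source_alignment(source), 2))
    # broadsheet_example -> 0.5 (slightly liberal); cable_news_example -> -1.0 (conservative)
```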

Researchers then analysed when stories were “cross-cutting”: stories from conservative sources that were seen by liberal Facebook users, or vice versa. By combining the two sets of data, they could work out how often people saw stories that they would not be expected to agree with.
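
On that definition, a story counts as cross-cutting when the reader and the source sit on opposite sides of the scale. A hedged sketch of how such an exposure rate might be tallied, continuing the illustrative scale above:

```python
# Illustrative tally of "cross-cutting" exposure: stories whose source alignment
# has the opposite sign to the reader's own. Scale and data are invented.

def is_cross_cutting(reader_score, source_score):
    """True when reader and source sit on opposite sides of zero."""
    return reader_score * source_score < 0

def cross_cutting_rate(reader_score, seen_source_scores):
    """Fraction of stories the reader saw that came from the 'other side'."""
    crossing = sum(is_cross_cutting(reader_score, s) for s in seen_source_scores)
    return crossing / len(seen_source_scores)

if __name__ == "__main__":
    liberal_reader = 1.5                        # leans liberal on the -2..+2 scale
    feed = [0.8, 1.2, -0.6, 0.3, -1.1, 0.9]     # alignment scores of stories shown
    print(round(cross_cutting_rate(liberal_reader, feed), 2))  # 2 of 6 stories -> 0.33
```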

The Facebook News Feed does tend to work as an echo chamber, the researchers found. Users were about 1 per cent less likely to see stories that they didn’t agree with, they said: less of a bias than some critics of the News Feed had suggested, but still enough to be significant.
