Instagram denies algorithm boosts photos of semi-nude men and women

Research suggested that the algorithm promoted bikini photos and bare-chested men

Adam Smith
Wednesday 17 June 2020 06:24 EDT
A woman takes a picture as the sun goes down on the beach of Yalta, August 18, 2003, in Crimea, Ukraine (Oleg Nikishin/Getty Images)


Instagram’s algorithm is designed to prioritise photos of semi-nude men and women, according to a report from Algorithm Watch in partnership with the European Data Journalism Network.

But the company has suggested that the research is misleading, and has promised to release more information about how it decides what appears in users’ feeds.

The researchers asked 26 volunteers to install a browser add-on and follow 37 professionals from 12 countries who use Instagram to advertise.

These accounts covered a range of topics, but focused on the food, travel, fitness, fashion, and beauty sectors.

The add-on automatically opened Instagram at random times and recorded which uploads appeared at the top of the feed, in an attempt to better understand the company’s content recommendation system.

“If Instagram were not meddling with the algorithm, the diversity of posts in the newsfeed of users should match the diversity of the posts by the content creators they follow,” the report states.

However, the researchers found that of 1,737 posts containing 2,400 photos, 21 per cent contained pictures showing “women in bikinis or underwear, or bare chested men.”

Yet those photos made up 30 per cent of all posts shown in volunteers’ feeds from the same accounts. “Posts containing pictures of bare chested men were 28% more likely to be shown. By contrast, posts showing pictures of food or landscape were about 60% less likely to be shown in the newsfeed.”
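The comparison underlying these figures can be sketched in a few lines: a category’s share of posts shown in the feed is divided by its share of all posts published by the followed accounts, with a ratio above 1 indicating over-representation. The category names and counts below are illustrative, not the study’s actual data.

```python
from collections import Counter

def representation_ratio(posted, shown, category):
    """Compare a category's share of shown posts with its share of
    all posts published by the followed accounts.

    A ratio above 1 means the category is over-represented in the feed;
    below 1, under-represented.
    """
    posted_share = posted[category] / sum(posted.values())
    shown_share = shown[category] / sum(shown.values())
    return shown_share / posted_share

# Hypothetical tallies: posts published by the followed accounts,
# versus posts that appeared at the top of the volunteers' feeds.
posted = Counter({"revealing": 21, "food": 30, "landscape": 25, "other": 24})
shown = Counter({"revealing": 30, "food": 18, "landscape": 15, "other": 37})

ratio = representation_ratio(posted, shown, "revealing")
# With these sample numbers, revealing content's feed share (30%)
# exceeds its posted share (21%), giving a ratio of roughly 1.43.
```

As the report notes, if the algorithm were neutral, these ratios should be close to 1 for every category.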

Since Instagram’s guidelines state that nudity is not allowed on the service, the notion that the algorithm would favour revealing content is questionable.

Instagram, which is owned by Facebook, has since denied the research’s accuracy.

“There's been a recent study that suggests we boost content specifically because it contains semi-nudity. This is not true. We surface posts based on interests, timeliness of posts, and other factors to help people discover content most relevant to them,” the company said.

“The study looked at an extremely small sample size, which likely surfaced the type of content they were researching: the more you engage with certain types of posts, the more likely we are to show you similar posts.”

“This research is flawed in a number of ways and shows a misunderstanding of how Instagram works. We will be releasing more information about what posts we do and don't recommend in the coming weeks,” it concluded.

Instagram said recently that it was changing its algorithm in order to prevent the “shadowbanning” of black users and avoid systemic bias in its system.

Shadowbanning, Instagram says, is “filtering people without transparency, and limiting their reach as a result”.

Insight into social media companies’ algorithms is something many people have been demanding, for a variety of reasons.

In 2018, the US Congress questioned Google CEO Sundar Pichai about why searching for the word “idiot” on Google Images showed a whole host of pictures of Donald Trump.

Similarly, Twitter’s algorithm linked Mr Trump to the word “racist” in its search functionality.

Facebook was also recently criticised following a report that suggested the social media giant shelved research that would have made its platform less polarising, a decision reportedly described internally as “antigrowth”.

Such algorithms can, theoretically, be gamed to promote certain content. The political subreddit r/The_Donald, a space on Reddit for supporters of the president, repeatedly attempted to push its partisan posts onto the site’s homepage and its r/all listing of popular posts.

These endeavours resulted in Reddit changing its algorithm to prevent all political content from reaching the front page.
