Instagram working on new feature that automatically detects when users send unsolicited nudes

‘There is an epidemic of misogynist abuse taking place in women’s DMs’

Vishwam Sankaran
Thursday 22 September 2022 01:09 EDT

Instagram is working on a new filter to protect users from unsolicited sexual photos in their chats.

Instagram parent company Meta has confirmed to The Verge that the new feature is in the early stages of development and will be optional for users.

An app researcher observed on Twitter that the technology covers photos that may contain nudity in a user’s direct messages (DM) inbox, and that Instagram cannot view the photos themselves.

“Technology on your device covers photos that may contain nudity in chats. Instagram can’t access the photos,” an early image of the tool shared by the researcher noted.

Photos would remain covered unless a user chooses to view them, according to the image of the feature shared by the researcher.

The nudity protection technology would reportedly not allow Meta to view the actual images, nor would they be available to third parties.

If users opt in, the app would automatically blur any photo in their DMs that it detects as containing nudity.
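Based on Meta’s description, the flow can be illustrated with a minimal Python sketch: an on-device classifier scores each incoming photo, and the app blurs it locally before display, so the unblurred original never needs to leave the device. The nudity_score function, threshold and blur radius below are placeholder assumptions for illustration only; Instagram’s actual model and parameters are not public.

# A minimal sketch of the on-device flow described above, not Meta's
# actual implementation.
from PIL import Image, ImageFilter

BLUR_RADIUS = 24          # assumed value, for illustration
NUDITY_THRESHOLD = 0.8    # assumed confidence cut-off

def nudity_score(image: Image.Image) -> float:
    # Placeholder for an on-device nudity classifier. A real system
    # would run a local ML model here; returning 0.0 keeps the sketch
    # runnable without one.
    return 0.0

def prepare_for_inbox(image: Image.Image, opted_in: bool) -> Image.Image:
    # Blur the photo locally if the user opted in and nudity is
    # detected. Only the (possibly blurred) result is shown, matching
    # the claim that the platform never accesses the original photo.
    if opted_in and nudity_score(image) >= NUDITY_THRESHOLD:
        return image.filter(ImageFilter.GaussianBlur(BLUR_RADIUS))
    return image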

The new feature follows widespread criticism that Instagram has ignored misogynistic harassment faced by its female users.

In April, the Center for Countering Digital Hate published a report finding that the app overlooked an “epidemic of misogynist abuse” sent to its female users over the platform’s DMs.

It noted that Instagram failed to act on 90 per cent of image-based abusive DMs sent to female public figures.

“There is an epidemic of misogynist abuse taking place in women’s DMs. Meta and Instagram must put the rights of women before profit,” the campaign group’s chief executive Imran Ahmed said.

Instagram responded to the report, saying that accounts sending messages that break the platform’s rules receive a strike and are blocked from sending DMs for a period of time, with stronger penalties if the behaviour continues.

But Mr Ahmed said the current processes were not enough and that more needed to be done to protect women.
