
Facebook parent Meta to remove sensitive ad categories


Via AP news wire
Tuesday 09 November 2021 17:45 EST
Facebook Metaverse (Copyright 2021 The Associated Press. All rights reserved.)


Facebook’s parent company Meta says it will remove sensitive ad targeting options related to health, race or ethnicity, political affiliation, religion or sexual orientation beginning on Jan. 19.

Currently, advertisers can target people who have expressed interest in issues, public figures or organizations connected to these topics. That information comes from tracking user activity on Facebook, Instagram and other platforms the company owns.

For instance, someone who’s shown interest in “same-sex marriage” may be shown an ad from a nonprofit supporting same-sex marriage. But the categories could also be misused, and Meta, formerly Facebook, has been under intense scrutiny from regulators and the public to clean its platform of abuse and misinformation.

Meta Platforms Inc. said in a blog post Tuesday that the decision was “not easy and we know this change may negatively impact some businesses and organizations.” Shares of the company closed at $335.37 Tuesday, down almost 1%.

"Some of our advertising partners have expressed concerns about these targeting options going away because of their ability to help generate positive societal change, while others understand the decision to remove them," wrote Graham Mudd, vice president of marketing and ads. “Like many of our decisions, this was not a simple choice and required a balance of competing interests where there was advocacy in both directions."

The Menlo Park, California-based company, which last year made $86 billion in revenue thanks in large part to its granular ad targeting options, has had a slew of problems with how it serves ads to its billions of users.

In 2019, Facebook said it would overhaul its ad-targeting systems to prevent discrimination in housing, credit and employment ads as part of a legal settlement. The social network said at the time it would no longer allow housing, employment or credit ads that target people by age, gender or ZIP code. It also limited other targeting options so these ads don’t exclude people on the basis of race, ethnicity and other legally protected categories in the U.S., including national origin and sexual orientation.

It also allowed outside groups that were part of the lawsuit, including the American Civil Liberties Union, to test its ad systems to ensure they don’t enable discrimination. The company also agreed to meet with the groups every six months for the following three years, and is building a tool to let anyone search housing-related ads in the U.S. targeted to different areas across the country.

After an uproar over its lack of transparency on political ads Facebook ran ahead of the 2016 election, a sharp contrast to how ads are regulated on traditional media, the company created an ad archive that includes details such as who paid for an ad and when it ran. But it does not share information about who gets served the ad.

Outside researchers tried to remedy this. But in August, Facebook shut down the personal accounts of a pair of New York University researchers and shuttered their investigation into misinformation spread through political ads on the social network.

Facebook said at the time that the researchers violated its terms of service and were involved in unauthorized data collection from its massive network. The academics, however, said the company is attempting to exert control over research that paints it in a negative light.

The NYU researchers with the Ad Observatory Project had for several years been looking into Facebook’s Ad Library, where searches can be done on advertisements running across Facebook’s products.

The access was used to “uncover systemic flaws in the Facebook Ad Library, to identify misinformation in political ads, including many sowing distrust in our election system, and to study Facebook’s apparent amplification of partisan misinformation,” said Laura Edelson, the lead researcher behind NYU Cybersecurity for Democracy, in response to the shutdown.
