FCC to consider rules for AI-generated political ads on TV and radio, but can't touch streaming
The nation’s top telecommunications regulator is introducing a proposal to require political advertisers to disclose when they use content generated by artificial intelligence in broadcast TV and radio ads
The head of the Federal Communications Commission introduced on Wednesday a proposal to require political advertisers to disclose when they use content generated by artificial intelligence in broadcast television and radio ads.
If adopted by the five-person commission, the proposal would add a layer of transparency that many lawmakers and AI experts have been calling for as rapidly advancing generative AI tools churn out lifelike images, videos and audio clips that threaten to mislead voters in the upcoming U.S. election.
Yet the nation's top telecommunications regulator would only have authority over TV, radio and some cable providers. The new rules, if adopted, would not cover the tremendous growth in advertising on digital and streaming platforms.
"As artificial intelligence tools become more accessible, the commission wants to make sure consumers are fully informed when the technology is used,” FCC chair Jessica Rosenworcel said in a statement Wednesday. “Today, I’ve shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue.”
The proposal marks the second time this year that the commission has begun taking significant steps to combat the growing use of artificial intelligence tools in political communications. The FCC earlier confirmed that AI voice-cloning tools in robocalls are banned under existing law. That decision followed an incident in New Hampshire’s primary election when automated calls used voice-cloning software to imitate President Joe Biden in order to dissuade voters from going to the polls.
If adopted, the proposal announced Wednesday would ask broadcasters to verify with political advertisers whether their content was generated using AI tools, such as text-to-image creators or voice-cloning software. The FCC has authority over political advertising on broadcast channels under the 2002 Bipartisan Campaign Reform Act.
Left for commissioners to discuss are several details of the proposal, including whether broadcasters would have to disclose AI-generated content in an on-air message or only in the TV or radio station's political files, which are public. They also will be tasked with agreeing on a definition of AI-generated content, a challenge that has become fraught as retouching tools and other AI advancements become increasingly embedded in all kinds of creative software.
Rosenworcel hopes to have the regulations in place before the election.
Jonathan Uriarte, a spokesperson and policy adviser for Rosenworcel, said she is looking to define AI-generated content as that generated using computational technology or machine-based systems, “including, in particular, AI-generated voices that sound like human voices, and AI-generated actors that appear to be human actors.” But he said her draft definition will likely change through the regulatory process.
The proposal comes as political campaigns already have experimented heavily with generative AI, from building chatbots for their websites to creating videos and images using the technology.
Last year, for example, the RNC released an entirely AI-generated ad meant to show a dystopian future under another Biden administration. It employed fake but realistic photos showing boarded-up storefronts, armored military patrols in the streets and waves of immigrants creating panic.
Political campaigns and bad actors also have weaponized highly realistic images, videos and audio content to scam, mislead and disenfranchise voters. In India's elections, recent AI-generated videos misrepresenting Bollywood stars as criticizing the prime minister exemplify a trend AI experts say is cropping up in democratic elections around the world.
As generative AI has become cheaper, more accessible and easier to use, bipartisan groups of lawmakers have called for legislation to regulate the technology in politics. With just a little over five months until the November elections, they still have not passed any bills.
A bipartisan bill introduced by Sen. Amy Klobuchar, a Democrat from Minnesota, and Sen. Lisa Murkowski, a Republican from Alaska, would require political ads to have a disclaimer if they are made or significantly altered using AI. It would require the Federal Election Commission to respond to violations.
Uriarte said Rosenworcel realizes the FCC's capacity to act on AI-related threats is limited but wants to do what she can ahead of the 2024 election.
“This proposal offers the maximum transparency standards that the commission can enforce under its jurisdiction,” Uriarte said. “It is our hope that government agencies and lawmakers can build on this important first step in establishing a transparency standard on the use of AI in political advertising.”
___
The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP’s democracy initiative here. The AP is solely responsible for all content.