AI-generated voices in robocalls can deceive voters. The FCC just made them illegal
The Federal Communications Commission is outlawing robocalls that contain voices generated by artificial intelligence
The Federal Communications Commission on Thursday outlawed robocalls that contain voices generated by artificial intelligence, a decision that sends a clear message that exploiting the technology to scam people and mislead voters won't be tolerated.
The unanimous ruling targets robocalls made with AI voice-cloning tools under the Telephone Consumer Protection Act, a 1991 law restricting junk calls that use artificial and prerecorded voice messages.
The announcement comes as New Hampshire authorities are advancing their investigation into AI-generated robocalls that mimicked President Joe Biden's voice to discourage people from voting in the state's first-in-the-nation primary last month.
Effective immediately, the regulation empowers the FCC to fine companies that use AI voices in their calls or block the service providers that carry them. It also opens the door for call recipients to file lawsuits and gives state attorneys general a new mechanism to crack down on violators, according to the FCC.
"Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities, and misinform voters," the agency's chairwoman, Jessica Rosenworcel, said in a news release. "We're putting the fraudsters behind these robocalls on notice."
Under the consumer protection law, telemarketers generally cannot use automated dialers or artificial or prerecorded voice messages to call cellphones, and they cannot make such calls to landlines without prior written consent from the call recipient.
The new ruling classifies AI-generated voices in robocalls as "artificial" and thus subject to the same standards, the FCC said.
Those who break the law can face steep fines, maxing out at more than $23,000 per call, the FCC said. The agency has previously used the consumer law to clamp down on robocallers interfering in elections, including imposing a $5 million fine on two conservative hoaxers for falsely warning people in predominantly Black areas that voting by mail could heighten their risk of arrest, debt collection and forced vaccination.
The law also gives call recipients the right to take legal action and potentially recover up to $1,500 in damages for each unwanted call.
Rosenworcel said the commission started looking at making robocalls with AI-generated voices illegal because it saw a rise in these types of calls. It sought public comment on the issue last November, and in January a bipartisan group of 26 state attorneys general wrote to the FCC urging it to move forward with a ruling.
Audio recordings that use AI to convincingly imitate people seem "like something from the far-off future, but this threat is already here," Rosenworcel told The Associated Press. "All of us could be on the receiving end of these faked calls, so that's why we felt the time to act was now."
Sophisticated generative AI tools, from voice-cloning software to image generators, already are in use in elections in the U.S. and around the world.
Last year, as the U.S. presidential race got underway, several campaign advertisements used AI-generated audio or imagery, and some candidates experimented with using AI chatbots to communicate with voters.
Bipartisan efforts in Congress have sought to regulate AI in political campaigns, but no federal legislation has passed, with the general election nine months away.
The AI-generated robocalls that sought to influence New Hampshire's Jan. 23 primary election used a voice similar to Biden's, employed his often-used phrase, "What a bunch of malarkey," and falsely suggested that voting in the primary would preclude voters from casting a ballot in November.
New Hampshire Attorney General John Formella said Tuesday that investigators had identified the Texas-based Life Corp. and its owner, Walter Monk, as the source of the calls, which went to thousands of state residents, mostly registered Democrats. He said the calls were transmitted by another Texas-based company, Lingo Telecom.
New Hampshire issued cease-and-desist orders and subpoenas to both companies, while the Federal Communications Commission issued a cease-and-desist letter to the telecommunications company, Formella said. A task force of attorneys general in all 50 states and Washington, D.C., sent a letter to Life Corp. warning it to stop originating illegal calls immediately.
According to the FCC, both Lingo Telecom and Life Corp. have been investigated for illegal robocalls in the past. In 2003, the FCC issued a citation to Life Corp. for delivering illegal prerecorded and unsolicited advertisements to residential lines.
More recently, the task force of attorneys general has accused Lingo of being the gateway provider for 61 suspected illegal calls from overseas. The Federal Trade Commission issued a cease-and-desist order against Lingo's prior corporate name, Matrix Telecom, in 2022. The next year, the task force demanded that it take steps to protect its network.
__
The Associated Press receives support from several private foundations to enhance its explanatory coverage of elections and democracy. See more about AP's democracy initiative here. The AP is solely responsible for all content.