Fake humans are turning up to job interviews – and you might not even know, FBI warns

Not clear why scammers are creating detailed, false applicants, agency warns

Andrew Griffin
Wednesday 29 June 2022 05:32 EDT

Fake humans are sitting job interviews – and could trick the people interviewing them, the FBI has warned.

Scammers are using deepfakes and other technology to create false applicants capable of sitting job interviews, the agency warned. The fake personas are built by stealing real people's personal information and using it to present convincing applicants who attend interviews under those identities, it said.

If successful, criminals could then use the position to access valuable data held by those companies, it suggested. But it is not entirely clear why cyber criminals are carrying out the attack.

The problem is on the rise, with a growing number of complaints from companies that have been targeted by the unusual attack, the FBI said in a public service announcement.

The complaints have grown alongside the increase in remote and work-from-home positions since the pandemic.

In the attack, a figure appears on screen and talks and moves much like a real person. But that figure is controlled by a scammer, who can use fake or AI-generated voices and video to create a convincing job applicant.

“Complaints report the use of voice spoofing, or potentially voice deepfakes, during online interviews of the potential applicants,” the agency said in its advisory.

“In these interviews, the actions and lip movement of the person seen interviewed on-camera do not completely coordinate with the audio of the person speaking. At times, actions such as coughing, sneezing, or other auditory actions are not aligned with what is presented visually.”

The agency did not give any specific advice for combating the attack or spotting the fake applicants, but asked that anyone affected get in touch.

It comes amid increasing fears over deepfakes, which have also been used to generate fake non-consensual sexual imagery and to make politicians appear to have made statements they never actually said.
