French hackers have found a way to silently hijack your phone through Siri
As long as you've got your headphones plugged in, antenna-wielding hackers can hijack your phone
Computer researchers working for the French government claim they have been able to crack into mobile voice-controlled assistants like Siri, allowing them to remotely and silently give commands to smartphones.
The team, who work for the French government's Network and Information Security Agency (ANSSI), used a "remote command injection" to access Siri and Google Now, the voice-controlled assistant apps available on iOS and Android smartphones.
As long as the phone has a pair of headphones with a built-in microphone plugged into the jack, the hackers can use the cord as an antenna to give mock voice commands.
By broadcasting a silent electromagnetic signal which the headphone wire picks up, they can fool the phone into thinking the signals coming from the headphones are simply regular voice commands.
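The core trick is modulating an audio-frequency "voice" signal onto a radio-frequency carrier so the headphone cord, acting as an antenna, induces it into the phone's microphone input. The toy Python sketch below illustrates the general idea with simple amplitude modulation; it is only an illustration, not the researchers' actual transmission chain, which relied on an amplifier, antenna and radio hardware and different modulation details.

```python
import math

def am_modulate(audio, carrier_hz, sample_rate):
    """Amplitude-modulate an audio waveform (values in [-1, 1]) onto a
    carrier tone. Purely illustrative: the real attack induces the
    modulated signal into the headphone cord electromagnetically, and
    the phone's audio front-end recovers it as if it were a spoken
    command from the headset microphone."""
    return [
        (1.0 + sample) * math.sin(2 * math.pi * carrier_hz * n / sample_rate)
        for n, sample in enumerate(audio)
    ]

# Toy "voice" signal: a 1 kHz tone, 480 samples at 48 kHz (10 ms).
sample_rate = 48_000
audio = [math.sin(2 * math.pi * 1_000 * n / sample_rate) for n in range(480)]

# Ride the audio on a (toy) 10 kHz carrier.
modulated = am_modulate(audio, carrier_hz=10_000, sample_rate=sample_rate)
print(len(modulated))  # 480 - one modulated sample per input sample
```

In the real attack the carrier sits at radio frequencies the cord resonates at, far above anything audible, which is why the injected commands are silent to the phone's owner.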
Everything that can be done through voice commands can be done silently and remotely with this kind of technology.
By accessing Siri in this way, the hackers could command the phones to call certain people, send texts, make searches, or send the phone to a website containing malicious software, all without the owner realising.
The hackers were able to hijack the phones from up to five metres away, with the use of an amplifier, laptop, antenna and radio.
As reported by Wired, two researchers on the team, Jose Lopes Esteves and Chaouki Kasmi, said the possibility that hackers could send these "parasitic" signals could have "critical security impacts" on the industry.
There are limitations to this method. The hack only works if the target has headphones with a built-in microphone plugged in. All standard-issue iPhone headphones have one, but many types of headphones don't.
Many Android phones also don't make the Google Now voice assistant accessible from the homescreen, as Siri is on the iPhone.
And most obviously, people with their headphones in their ears would be able to hear the distinctive sound of Siri or Google Now activating.
However, the team's technology, together with our appetite for futuristic and easy-to-use features on our smartphones, makes the method feasible as a hacking technique.
iOS9, Apple's latest mobile operating system, has a hands-free feature that allows owners to activate Siri simply by saying "Hey Siri". Google Now has a similar feature, and users only have to speak to activate it. These features mean the hackers can easily activate the voice control before giving their instructions.
However, the hijack can work on earlier versions of iOS too. The team figured out a way to mimic the signal sent by the headphones when the home button is pressed, activating Siri even on older versions that don't have the hands-free feature.
Vincent Strubel, director of the ANSSI team, told Wired that hackers could potentially go to a crowded train station or café, break into multiple devices, and get them to call a premium-rate phone number to generate money.
There's no evidence that this hack has actually been used by criminals, and the team have tipped off Apple about the vulnerability.
The potential danger could be neutralised by better shielding on headphone cords, or by requiring voice recognition or a spoken password before these voice-control features can be used.