Your smartphone is wide-open to hackers standing 16 feet away from you
French researchers claim to have remotely accessed iOS and Android digital assistants and silently delivered commands by using headphones with inbuilt microphones as antennas.
The worst part of it all is the fact that the target device’s owner would never be the wiser.
The hackers use a laptop running a software-defined radio, an amplifier and an antenna to broadcast radio signals that are picked up by the headphone cord, which acts as an antenna. To reach distances of 16 feet, larger hardware would be needed, something that could be housed in a vehicle. For the attack to work, Siri or Google Now must be enabled in the system settings, and the mobile device must have a pair of earphones or headphones with an inbuilt microphone plugged in. In the meantime, security-conscious smartphone users should disable voice command functions on their lock screens. Like Apple's "Hey Siri", Google allows users to begin a voice search with the generic phrase "OK Google".
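To give a rough idea of what a software-defined radio transmits, the sketch below amplitude-modulates a stand-in "voice" waveform onto a radio carrier. This is only a loose illustration of the general principle, not the researchers' actual technique; the carrier frequency, sample rate and modulation depth are all made-up values, and a real attack would need an SDR, amplifier and antenna to radiate the resulting signal toward the headphone cord.

```python
import numpy as np

def am_modulate(voice, carrier_hz=103.5e6, sample_rate=250e6, depth=0.5):
    """Amplitude-modulate a voice waveform (values in [-1, 1]) onto a carrier.

    All parameters are illustrative, not taken from the ANSSI research.
    """
    t = np.arange(len(voice)) / sample_rate
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    # Classic AM: the carrier amplitude follows the voice envelope.
    return (1.0 + depth * voice) * carrier

# Stand-in for a recorded voice command: 1 ms of a 1 kHz test tone.
sample_rate = 250e6
duration = 0.001
t = np.arange(int(sample_rate * duration)) / sample_rate
voice = np.sin(2 * np.pi * 1000 * t)

signal = am_modulate(voice, sample_rate=sample_rate)
print(signal.shape)  # one modulated output sample per input sample
```

In the attack, the headphone cord picks up the radiated signal as an induced voltage on the microphone line, which the phone's audio front end treats as ordinary speech input.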
Most smartphones can respond to your voice commands, but they might also respond to someone else’s. A hacker could use the exploit to send texts, make calls, access social media accounts and direct the phone’s browser to malware sites.
In this way, hackers can wake up Siri and simulate voice commands.
Still, Vincent Strubel, the director of the researchers' group at ANSSI, is quoted by Wired as warning that "the sky is the limit here".
The only way to protect your iPhone from getting hacked using this method is to disable access to Siri from the lock screen. Simply by voicing commands, you can make Siri listen and obey, whether you want to know how many calories are in your soda can or how many planes are flying above your head this very instant.
In their Paris talk, the researchers demonstrated a few scenarios, such as turning the phone into an eavesdropping device by commanding it to make a call to an attacker's monitoring phone. The attack is feasible because Siri is enabled by default on the iPhone's lock screen and has no voice identity feature.
Voice-activated assistants, such as Siri and Google Now, can do pretty much everything the device permits: send emails, upload pictures, turn on the camera and so on. The hackers are able to accomplish this using radio waves, and the implications may be serious. In related mobile-security news, Zimperium uncovered the massive Stagefright vulnerability in Android phones earlier this year.
Still, this is an intricate hack, and it will certainly not pose great problems to the vast majority of users.