Ultrasonic 'DolphinAttack' frequencies can now hack your voice assistant
Almost every other day, researchers find new hacks to get into your smartphones, gadgets, and smart devices. Now, a group from China's Zhejiang University has found a new way to break into your smartphone, open malicious websites, and even gain access to any connected smart devices, using ultrasound frequencies.
The hack takes advantage of what is known as the DolphinAttack, in which frequencies above 20 kHz are sent to a nearby smartphone to target Siri, Google Assistant, Alexa, and other voice-based virtual assistants.
The teams said the assistants responded to commands broadcast at high frequencies that dolphins can hear but humans cannot, hence the attack's name. They were able to make smartphones dial phone numbers and visit rogue websites.
Microphones on a smartphone can pick up frequencies above 20,000 Hz, which humans cannot hear. Voice assistants are able to detect these frequencies and take commands accordingly, without the owner's knowledge. Taking advantage of this, the researchers translated human voice commands into ultrasound frequencies and then played them back using very cheap components: a regular smartphone, an amplifier, an ultrasonic transducer, and a battery.
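The core idea of shifting a voice command above the audible band can be sketched as simple amplitude modulation onto an ultrasonic carrier. This is an illustration only: the 25 kHz carrier, 96 kHz sample rate, and modulation depth below are assumptions, not the researchers' exact signal chain.

```python
import numpy as np

FS = 96_000          # sample rate high enough to represent ultrasound
CARRIER_HZ = 25_000  # assumed ultrasonic carrier, above human hearing

def modulate(voice: np.ndarray, fs: int = FS, fc: float = CARRIER_HZ) -> np.ndarray:
    """Amplitude-modulate a baseband voice signal onto an ultrasonic carrier."""
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * fc * t)
    # Standard AM: (1 + m * voice) * carrier, scaled to stay within [-1, 1].
    return (1.0 + 0.5 * voice) * carrier / 1.5

# Toy "voice": a 400 Hz tone standing in for a recorded command.
t = np.arange(FS) / FS
voice = np.sin(2 * np.pi * 400 * t)
ultrasonic = modulate(voice)

# After modulation, the signal's energy sits near the carrier,
# outside the audible range.
spectrum = np.abs(np.fft.rfft(ultrasonic))
peak_hz = np.argmax(spectrum) * FS / len(ultrasonic)
print(round(peak_hz))  # → 25000
```

When this signal is played through an ultrasonic transducer, nothing is audible to a bystander, yet the command content is carried along with the carrier and its sidebands.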
Many smartphones feature a voice-controlled assistant that can be set up to constantly listen for a "wake word". Google's assistant starts taking orders when a person says "OK Google", while Apple's responds to "Hey Siri" and Amazon's to "Alexa".
Researchers in China set
up a loudspeaker to broadcast voice commands that had been shifted into
ultrasonic frequencies.
This method is said to work on assistants including Siri, Alexa, Google Assistant, Samsung S Voice, and Bixby, and on devices such as the iPad, MacBook, Amazon Echo, Audi Q3, and more. What makes this attack scary is that the speech recognition on every device mentioned in the report picks up these frequencies easily, and the attack works even if the usual security measures are in place. Furthermore, the attacker does not need any physical access to the device to carry out the hack. To demonstrate its effectiveness, the researchers showed how they changed the navigation destination on an Audi Q3 and successfully commanded smartphones to carry out tasks like 'open dolphinattack.com' and 'open the back door'.
However, for this hack to work, certain preconditions must be met. Firstly, the target smartphone has to be within five to six feet of the transmitter, and no further. Also, in the case of Siri and Google Assistant, the assistant has to be activated. Furthermore, the user is immediately alerted, as these assistants play a tone or reply aloud to the commands. In short, the device has to be very near, the user has to leave the phone unlocked with the voice assistant activated, and the user has to be away from the smartphone or distracted. Taken together, this is a very unlikely scenario, but one that a hacker who seriously intends harm could work around.
In any case, the researchers say the only simple fix is for device makers to programme their assistants to ignore frequencies above 20 kHz, or to filter out any frequencies that humans cannot hear.
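The proposed mitigation amounts to putting a low-pass filter in front of the speech recognizer. A minimal FFT-based sketch, assuming a 96 kHz microphone sample rate and a 20 kHz cutoff (this is an illustration of the idea, not what any vendor actually ships):

```python
import numpy as np

FS = 96_000         # assumed microphone sample rate
CUTOFF_HZ = 20_000  # discard everything above the limit of human hearing

def suppress_ultrasound(audio: np.ndarray, fs: int = FS,
                        cutoff: float = CUTOFF_HZ) -> np.ndarray:
    """Zero out all spectral content above `cutoff` before recognition."""
    spectrum = np.fft.rfft(audio)
    freqs = np.fft.rfftfreq(len(audio), d=1.0 / fs)
    spectrum[freqs > cutoff] = 0.0
    return np.fft.irfft(spectrum, n=len(audio))

# A 25 kHz "attack" tone mixed with a 400 Hz audible tone:
t = np.arange(FS) / FS
audio = np.sin(2 * np.pi * 400 * t) + np.sin(2 * np.pi * 25_000 * t)
clean = suppress_ultrasound(audio)

# The ultrasonic component is removed; the audible one survives.
spectrum = np.abs(np.fft.rfft(clean))
print(round(np.argmax(spectrum) * FS / len(clean)))  # → 400
```

A production filter would use a proper analog or IIR design rather than FFT masking, but the effect is the same: commands hidden above 20 kHz never reach the recognizer.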
Until then, keep your voice assistant deactivated if you're really paranoid.
A Google spokesman
said: "We take user privacy and security very seriously at Google, and
we're reviewing the claims made."
Amazon said in a
statement: "We take privacy and security very seriously at Amazon and are
reviewing the paper issued by the researchers."