
A Design Flaw is Making it Extremely Easy to Hack Siri and Alexa


Chinese researchers have discovered a terrifying vulnerability in voice assistants from Apple, Google, Amazon, Microsoft, Samsung and Huawei. It affects every iPhone and MacBook running Siri, any Galaxy phone, any PC running Windows 10, and even Amazon's Alexa assistant. Using a technique called the "DolphinAttack," a team from Zhejiang University translated typical vocal commands into ultrasonic frequencies that are too high for the human ear to hear, but perfectly decipherable by the microphones and software powering our always-on voice assistants. This relatively simple translation process lets an attacker take control of gadgets with just a few words uttered in frequencies none of us can hear.
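To get a feel for the trick, here is a minimal sketch of how a command can be pushed above human hearing by amplitude-modulating it onto an ultrasonic carrier. The 25 kHz carrier, 96 kHz sample rate, and the synthetic tone standing in for a recorded voice command are all illustrative assumptions, not the researchers' exact setup:

```python
import numpy as np

FS = 96_000          # sample rate high enough to represent ultrasonic content
CARRIER_HZ = 25_000  # illustrative carrier above the ~20 kHz limit of hearing
DURATION = 1.0       # seconds

t = np.arange(int(FS * DURATION)) / FS

# Stand-in for a recorded voice command: a 300 Hz tone, shifted into [0, 1].
command = 0.5 * (1 + np.sin(2 * np.pi * 300 * t))

# Amplitude-modulate the command onto the ultrasonic carrier. The result's
# energy sits at the carrier and its sidebands, all well above 20 kHz.
carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
ultrasonic = command * carrier

# Sanity check via FFT: essentially no energy remains in the audible band.
spectrum = np.abs(np.fft.rfft(ultrasonic))
freqs = np.fft.rfftfreq(len(ultrasonic), 1 / FS)
audible_share = spectrum[freqs < 20_000].sum() / spectrum.sum()
print(f"audible share of energy: {audible_share:.4f}")
```

A human standing next to the speaker hears nothing, but a microphone with enough bandwidth still picks the signal up, which is the crux of the attack.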

The researchers didn't just activate basic commands like "Hey Siri" or "Okay Google," though. They could also tell an iPhone to call a specific phone number, or tell an iPad to FaceTime that number. They could force a MacBook or a Nexus 7 to open a malicious website. They could even order an Amazon Echo to "open the back door." Even an Audi Q3 could have its navigation system redirected to a new location.

The researchers used a smartphone with about $3 of additional hardware, a tiny speaker and an amp, to hack each voice assistant. In theory, their methods can be duplicated by anyone with a bit of technical know-how and just a few bucks in their pocket. And, well… the methods are now public. In some cases, the attacks could only be made from a few inches away, although gadgets like the Apple Watch are vulnerable from several feet. That makes it hard to imagine an Amazon Echo being hacked with DolphinAttack: for someone to issue the command "open the back door," they would already need to be inside your home, in which case they could just open the back door themselves. All jokes aside, this does worry me a little.


But a bigger concern is the ability to hack your iPhone. A hacker would only need to walk by you in a crowd with their phone out, playing a command in frequencies you wouldn't hear, while your own phone dangles in your hand. Maybe you wouldn't notice that Safari or Chrome had loaded a site, that the site had run code to install malware, and that all of the contents and communications on your phone were suddenly open for the hacker to explore.

The exploit is enabled by a combination of hardware and software problems. The microphones that power voice assistants like Siri, Alexa and Google Home can pick up inaudible frequencies, specifically frequencies above the 20 kHz limit of human ears. That means it's up to the software to decide what is human speech and what is machine speech, which sounds kind of difficult, doesn't it? In theory, Apple or Google could use a digital audio filter to command their assistants to never obey orders spoken above 20 kHz. But according to what the Zhejiang researchers found, every major voice assistant company's product was vulnerable to commands issued above 20 kHz.
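The filtering idea mentioned above could look something like this in software: before passing audio to the recognizer, measure how much of its energy sits above 20 kHz and reject suspicious clips. This is only a sketch of the concept, not any vendor's actual defense, and the energy threshold is an illustrative assumption:

```python
import numpy as np

FS = 96_000         # assume the mic samples fast enough to see ultrasound
CUTOFF_HZ = 20_000  # rough upper limit of human hearing

def looks_ultrasonic(samples, fs=FS, threshold=0.5):
    """Heuristic guard: flag audio whose energy is mostly above 20 kHz."""
    spectrum = np.abs(np.fft.rfft(samples)) ** 2
    freqs = np.fft.rfftfreq(len(samples), 1 / fs)
    high = spectrum[freqs >= CUTOFF_HZ].sum()
    total = spectrum.sum() or 1.0  # avoid dividing by zero on silence
    return high / total > threshold

t = np.arange(FS) / FS
speech_like = np.sin(2 * np.pi * 300 * t)     # audible 300 Hz tone
attack_like = np.sin(2 * np.pi * 25_000 * t)  # ultrasonic 25 kHz tone

print(looks_ultrasonic(speech_like))  # False: energy is in the audible band
print(looks_ultrasonic(attack_like))  # True: energy sits above 20 kHz
```

Of course, the researchers' findings suggest no shipping assistant actually enforced a check like this, which is exactly the problem.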

Does this feel scary to anyone else? I mean, it's possible that this might never happen. But like I always say, what else can it lead to? What other kinds of vulnerabilities can this expose? I don't want to get too Elon Musk here, but it seems that some of these tech companies aren't concerned with how a product might be misused. That's not to say we should be worrying about how AI will take over the world, but it doesn't hurt to look at these products with a bit more risk management in mind.

It's also possible that we are so excited about the tech itself that we get caught up in it and miss some steps. And by "we," I mean everyone, including consumers. Perhaps more testing is needed? Or maybe tech giants need to hire more nefarious-minded people to throw out some crazy ideas? Whatever the answer, we are missing a few steps, and that could have a big impact on consumers. Which is why it makes sense to go back to the drawing board a couple more times, just to be safe.