Exploiting Voice Assistants

May 11, 2018

As our everyday technologies keep improving and advancing, there are bound to be some repercussions, right? Graduate researchers in the United States have demonstrated ways of disguising voice commands that target voice assistants such as Alexa, Siri, and Google Assistant, and the assistants will typically carry out these hidden commands. Voice assistants (VAs) aren't new, but the control they have over our devices keeps growing.

According to students at UC Berkeley, one of the most effective ways to deliver commands to a VA is to hide them in music. The audio is crafted so specifically for the VA that no human listener will pick up on it. Masked by what sounds like white noise from a device's speaker, these commands can force a device to connect to websites or even switch to airplane mode.

We still want to see if we could make it even more stealthy.
UC Berkeley graduate

Their study centers on deep learning, specifically Automatic Speech Recognition (ASR). Deep learning models have known vulnerabilities, but little was known about whether such attacks hold up against practical speech recognition systems. By embedding a voice command into a song, producing what the researchers call a CommanderSong, the command evades even the keenest human detection while still reaching the VA. The researchers successfully crafted songs to act as "the carrier" for the wav-to-API attack, in which the doctored audio file is fed directly to the recognition API. Their over-the-air variant, the wav-air-API attack, plays the CommanderSongs aloud, records them, and decodes the recording, achieving a 96% success rate. In this way, a song carrying a command can spread through radio, TV, or any media player installed on portable devices like smartphones, potentially impacting millions of users from a distance.
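The paper's own optimization pipeline isn't reproduced here, but the general idea behind a gradient-based audio attack on a neural recognizer can be sketched in a few lines. Below is a minimal, illustrative PyTorch loop in which a toy, untrained CTC recognizer stands in for a real ASR system; the target phrase, learning rate, and perturbation bound are arbitrary choices for demonstration, not values from the study.

```python
# Illustrative sketch only (not the CommanderSong pipeline): nudge a "carrier"
# waveform with a small learnable perturbation until a stand-in CTC recognizer
# transcribes a hidden target command, while keeping the perturbation quiet.
import torch
import torch.nn as nn

VOCAB = " abcdefghijklmnopqrstuvwxyz"            # index 0 is reserved for the CTC blank
char_to_idx = {c: i + 1 for i, c in enumerate(VOCAB)}

class ToyASR(nn.Module):
    """Toy recognizer: 1-D conv front end + linear classifier over characters."""
    def __init__(self):
        super().__init__()
        self.conv = nn.Conv1d(1, 64, kernel_size=400, stride=160)   # rough frame-level features
        self.head = nn.Linear(64, len(VOCAB) + 1)                   # +1 for the blank symbol
    def forward(self, wav):                                         # wav: (batch, samples)
        feats = torch.relu(self.conv(wav.unsqueeze(1)))             # (B, 64, T)
        return self.head(feats.transpose(1, 2)).log_softmax(-1)     # (B, T, classes)

model = ToyASR().eval()
for p in model.parameters():                     # the model is fixed; only the audio changes
    p.requires_grad_(False)

carrier = torch.randn(1, 16000)                  # 1 s of "music" at 16 kHz (placeholder)
target = torch.tensor([[char_to_idx[c] for c in "open the door"]])

delta = torch.zeros_like(carrier, requires_grad=True)   # learnable perturbation
opt = torch.optim.Adam([delta], lr=1e-3)
ctc = nn.CTCLoss(blank=0)

for step in range(200):
    log_probs = model(carrier + delta)           # (B, T, classes)
    T = log_probs.size(1)
    loss = ctc(log_probs.transpose(0, 1),        # CTCLoss expects (T, B, classes)
               target,
               torch.tensor([T]),
               torch.tensor([target.size(1)]))
    loss = loss + 0.1 * delta.abs().mean()       # penalize loud perturbations
    opt.zero_grad()
    loss.backward()
    opt.step()
    with torch.no_grad():
        delta.clamp_(-0.02, 0.02)                # hard bound on audibility (arbitrary)

adversarial_audio = (carrier + delta).detach()   # audio that would be played back
```

In practice an attacker has to target a real recognizer, and for an over-the-air attack must also account for the speaker, the room, and the microphone, which is considerably harder than this sketch suggests.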

Something like this should raise some healthy paranoia around voice assistants, or at least light a fire under security teams to strengthen their detection of abnormal commands. Developers were somewhat prepared, having implemented optional features that lock access to personal information to a specific user based on their voice patterns. Beyond that, sensitive actions should always require the device to be unlocked and authorized before they proceed.

To ease that paranoia, the researchers note there is no evidence attackers have used this method in the real world yet, though it may only be a matter of time before they do, assuming attackers haven't already copied the team's work and applied it in a more malicious way. In conclusion, the undetectable voice commands demonstrated by the researchers show how readily attackers could exploit this technique: they could have digital assistants unlock smart-home doors, transfer money through banking apps, and purchase items from online retailers, all without the user knowing what was happening. Worst of all, it could be done on a large scale.

Subscribe with us at Apex United Corporation to stay up to date on the latest business and technology news.
