Voice assistant devices could be more vulnerable than we think

A new study claims that Google Assistant, along with rivals like Apple's Siri and Amazon's Alexa, could be vulnerable to sound commands that humans can't even hear. While the undetected voice commands demonstrated by the researchers are harmless, it is easy to see how attackers could exploit the technique. An attacker could essentially override the message the voice assistant is supposed to receive and substitute sounds that will be interpreted differently, giving the assistant a command that is virtually unrecognizable to the human ear.

These commands can be hidden in white noise played over loudspeakers or YouTube videos, as students from the University of California, Berkeley, and Georgetown University demonstrated two years ago.

"We wanted to see if we could make it even more stealthy", said UC Berkeley fifth-year computer security Ph.D. student Nicholas Carlini, one of the authors of the research that has been published online. Major smart speaker manufacturers like Amazon, Google and Apple say that they have safeguards in place to prevent their assistants from being hijacked. Apple points out that HomePod can't do things like open doors, while iPhones have to be unlocked to execute certain Siri commands.

While the commands may go unheard by humans, they can still be picked up, recovered and then interpreted by speech-recognition systems. In one earlier example, a Burger King TV ad used the phrase "O.K. Google" to get Android devices to read aloud the Whopper's ingredients from its Wikipedia page.

There is no U.S. law against broadcasting subliminal messages to humans, let alone machines. Like using a Cap'n Crunch whistle from a cereal box to trick payphones into giving free calls, this latest attack is simply the evolution of using a system against itself, and, as happened with telephones, it will likely make digital assistants more secure in the long run. For its part, the Federal Communications Commission (FCC) has discouraged the practice, calling it "counter to the public interest".

Researchers in China call the technique the "Dolphin Attack." The ultrasonic transmitter normally has to be close to the target device, but a more powerful transmitter can increase the effective range.
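At a high level, the Dolphin Attack works by amplitude-modulating a voice command onto an ultrasonic carrier; nonlinearity in the target microphone's hardware then demodulates it back into the audible band. The sketch below illustrates only the modulation step — the sample rate, the 25 kHz carrier and the toy 400 Hz "voice" tone are illustrative choices, not values taken from the research.

```python
import numpy as np

def ultrasonic_am(voice, fs=96_000, carrier_hz=25_000, depth=1.0):
    """Amplitude-modulate a baseband 'voice' signal onto an ultrasonic carrier.

    Shifting a command above ~20 kHz this way leaves it silent to humans;
    a microphone's nonlinearity can demodulate it back into the audible
    band, where the assistant's speech recognizer picks it up.
    """
    voice = np.asarray(voice, dtype=float)
    peak = np.max(np.abs(voice))
    if peak > 0:
        voice = voice / peak             # normalize to [-1, 1]
    t = np.arange(len(voice)) / fs
    carrier = np.cos(2 * np.pi * carrier_hz * t)
    return (1.0 + depth * voice) * carrier

# Toy "voice": a 400 Hz tone standing in for a spoken command.
fs = 96_000
t = np.arange(fs) / fs
voice = np.sin(2 * np.pi * 400 * t)
modulated = ultrasonic_am(voice, fs=fs)

# After modulation, the spectral peak sits at the 25 kHz carrier --
# inaudible, since human hearing tops out around 20 kHz.
spectrum = np.abs(np.fft.rfft(modulated))
peak_hz = np.argmax(spectrum) * fs / len(modulated)
```

A spectrum check confirms that the signal's energy now sits around the 25 kHz carrier, well above the ceiling of human hearing, even though the original tone was squarely audible.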

They were able to hide the command "O.K. Google, browse to evil.com" in a recording of the spoken phrase "Without the data set, the article is useless." In tests at the University of Illinois at Urbana-Champaign, researchers showed that while such commands couldn't yet penetrate walls, they could still control smart devices through open windows in buildings.
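Carlini's audio attack can be thought of as an optimization problem: nudge the waveform just enough that the recognizer's preferred transcription flips to the attacker's command, while a human still hears the original phrase. The toy sketch below captures only that idea — the random linear "recognizer", step size and perturbation budget are invented stand-ins, not the published method, which optimizes against a full differentiable speech-recognition model.

```python
import numpy as np

# Toy stand-in for a speech recognizer: a fixed random linear map from
# audio samples to per-command scores. Entirely illustrative.
rng = np.random.default_rng(0)
N_SAMPLES, N_COMMANDS = 256, 10
W = rng.standard_normal((N_COMMANDS, N_SAMPLES)) * 0.1

def scores(audio):
    """Score each candidate command for the given audio clip."""
    return W @ audio

benign = rng.standard_normal(N_SAMPLES)  # stands in for the innocuous phrase
target = 3                               # index of the attacker's command

# Gradient ascent on the target command's score, with the perturbation
# clipped so the doctored clip stays close to the original audio.
delta = np.zeros(N_SAMPLES)
for _ in range(200):
    grad = W[target]                     # d(target score) / d(audio)
    delta = np.clip(delta + 0.01 * grad, -0.5, 0.5)

adversarial = benign + delta
# The target command's score rises while the change to the waveform
# stays within the +/-0.5 per-sample budget.
```

The key design tension, in the real attack as in this sketch, is the budget on the perturbation: too small and the transcription doesn't change, too large and a listener notices the clip has been doctored.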

They also embedded other commands into music clips.

You can hear the audio files on Carlini's website.
