Researchers say they have found a way to hijack voice assistants from the major tech firms using cheap lasers.
- They discovered that shining laser pointers, even cheap ones, at the microphones in smart speakers and some smartphone models could cause the devices to interpret the light as sound.
- The research team demonstrated how it was able to “speak” with smart speakers and smartphones running Google’s Assistant, Amazon’s Alexa, and Apple’s Siri using the lasers, even getting them to perform tasks like opening a garage door.
- Smart speakers, which don’t require extra authentication, were particularly vulnerable to this kind of attack. Researchers tested popular models from all the major tech firms.
- Google and Amazon told Business Insider they were reviewing the research for its security implications. Apple declined to comment. Facebook, which uses Amazon’s Alexa in its Portal speaker, did not immediately respond.
It turns out laser pointers are good for more than just confusing cats.
A team of researchers from Tokyo’s University of Electro-Communications and the University of Michigan say they have discovered that you can “hijack” voice-enabled devices by shining a laser at them.
The team found that microphones in some of the most popular smart speakers and smartphones on the market interpreted the bright light of the laser as sound.
“Thus, by modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they are receiving genuine audio,” they wrote.
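In plainer terms, the attackers encode the audio they want the assistant to "hear" as variations in the laser's brightness, and the microphone's diaphragm responds to those variations as if they were sound pressure. A minimal sketch of that amplitude-modulation idea follows; the function name, bias, and depth parameters are illustrative, not taken from the paper:

```python
import math

def am_modulate(audio, bias=0.5, depth=0.4):
    """Amplitude-modulate a normalized audio signal (-1..1) onto a
    laser intensity command (0..1). The bias keeps the beam on at all
    times; the audio rides on top as brightness variation, which the
    microphone demodulates back into an electrical signal as if it
    were genuine sound. Parameters are illustrative assumptions."""
    return [bias + depth * sample for sample in audio]

# Illustrative input: one cycle of a 1 kHz tone sampled at 16 kHz.
tone = [math.sin(2 * math.pi * 1000 * n / 16000) for n in range(16)]
intensity = am_modulate(tone)
```

With the default bias and depth, the intensity command always stays within the laser driver's 0-to-1 range, so the beam never switches fully off, which is what lets the microphone track the audio waveform continuously.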
The team tested popular smart-speaker models from all the major tech firms as well as some smartphones that variously run Google’s Assistant, Amazon’s Alexa, and Apple’s Siri.
Their list of devices included Google Home, various Amazon Echo models, the Apple HomePod, and Facebook’s Portal speaker, which runs Alexa. They also tested an iPhone XR, a Samsung Galaxy S9, and a Google Pixel 2.
The team found all were vulnerable to the attack, to varying degrees. They were able to hijack the tablets, phones, and speakers from some distance, and even through windows. They hijacked a Google Home speaker from 110 meters away, for example.
Some of the devices were less vulnerable than others, as noted by Wired and in the team’s paper. Some Android smartphones, the iPhone, and the iPad require additional authentication or a “wake word” from the user before carrying out certain actions. Would-be hijackers would need to re-create the sound of a person saying a wake phrase like “Hey Siri” or “OK Google” before they could carry out an attack.
But smart speakers don’t have this extra layer of authentication.
The researchers used reasonably affordable laser pointers, ranging from $13.99 to $17.99, to carry out the attacks. To give the speakers specific instructions, however, the laser pointer had to be paired with a $27.99 sound amplifier and a $339 laser driver, a device that controls the intensity of the beam.
In one demonstration video, the team used a cheap laser pointer to hack a Google Home device and open a garage door.
In their paper the researchers warned the laser attack could also be used to unlock smartphone-connected front doors, to shop online, or to find and unlock cars such as Teslas connected to a victim’s Google account.
“We are closely reviewing this research paper,” a Google spokeswoman told Business Insider. “Protecting our users is paramount, and we’re always looking at ways to improve the security of our devices.”
Amazon is also taking a closer look at the security of its devices following the paper’s publication. “Customer trust is our top priority and we take customer security and the security of our products seriously,” an Amazon spokeswoman said. “We are reviewing this research and continue to engage with the authors to understand more about their work.”
Apple declined to comment when contacted by Business Insider, and Facebook was not immediately available for comment.
The researchers noted that they hadn’t found any evidence to suggest this hack has been used in the real world. You can read the researchers’ full paper here.