Amazon Echo, Google Home Smart Speakers Can Be Hacked With Laser 'Light Commands', Researchers Claim

The Light Commands vulnerability is present in pretty much all smart devices with a microphone, including smartphones, tablets, and smart speakers.


Photo Credit: Amazon

Amazon Echo or a Google Home can be fooled by a laser pointer

Highlights
  • Amazon, Google are reviewing researchers’ findings
  • Hackers can use smart speakers to break into smart home devices
  • It will be hard for the hackers to actually exploit it

Researchers have found an interesting and scary new way of hacking into smart speakers. It involves using lasers to send commands to smart speakers like Amazon Echo or Google Home without any spoken words. Apparently, the microphones in smart speakers also react to light: by modulating the intensity of a laser beam with an electrical audio signal, attackers can trick the microphones into believing they are hearing actual spoken commands.
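The modulation trick can be sketched in a few lines of code. The snippet below amplitude-modulates a constant laser drive level with an audio waveform, which is the basic principle behind the attack; the function name, bias, and depth values are illustrative assumptions, not figures from the research.

```python
import math

def audio_to_laser_intensity(samples, bias=0.5, depth=0.4):
    """Map audio samples in [-1.0, 1.0] onto a laser drive level.

    A MEMS microphone's diaphragm responds to a light beam whose
    intensity varies like a sound wave. This sketch rides the audio
    signal on top of a constant bias; `bias` and `depth` are made-up
    illustrative parameters.
    """
    # Keep the drive level positive: a laser cannot emit "negative"
    # light, so the audio must sit on top of a DC bias.
    return [bias + depth * s for s in samples]

# Illustrative "command" waveform: a 1 kHz tone sampled at 16 kHz.
tone = [math.sin(2 * math.pi * 1000 * n / 16000) for n in range(160)]
drive = audio_to_laser_intensity(tone)

# The modulated intensity stays within the diode's valid 0..1 range.
print(min(drive) >= 0.0 and max(drive) <= 1.0)
```

In a real attack the resulting drive signal would feed a laser driver circuit, so the beam's brightness fluctuates in step with the recorded voice command.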

According to a report by Wired, cyber-security researcher Takeshi Sugawara and a group of researchers from the University of Michigan say that they can vary the intensity of a laser beam and point it at a smart speaker's microphone to send any command, which the speaker interprets as a normal voice command. The vulnerability is not limited to smart speakers; it can be used to hack into any computer that accepts voice commands through a microphone. The researchers, who dubbed the vulnerability Light Commands, also tested the same method on smartphones and tablets and achieved some success.

“Light Commands is a vulnerability of MEMS microphones that allows attackers to remotely inject inaudible and invisible commands into voice assistants, such as Google Assistant, Amazon Alexa, Facebook Portal, and Apple Siri using light,” the researchers wrote on a website dedicated to the Light Commands vulnerability.

The vulnerability can also be used to reach other systems connected through a smart speaker: attackers could control smart home devices, open connected garage doors, make online purchases, remotely start some vehicles, and more.

While the vulnerability certainly sounds serious, exploiting it would take considerable effort from an attacker. It requires specialised equipment, including a laser pointer, a laser driver, a sound amplifier, and a telephoto lens. Moreover, the attack would only work on unattended smart speakers, as an owner could notice the light beam reflecting off the device.

Google and Amazon told Wired that they are reviewing the research team's findings but did not share any plans on whether or how they plan to protect against such attacks.

The best mitigation available against the Light Commands vulnerability right now is to avoid placing your smart speaker, smartphone, or tablet within the line of sight of a potential attacker. You can also activate secondary authentication for online purchases or other sensitive commands on your smart speaker, if available.


For the latest tech news and reviews, follow Gadgets 360 on Twitter, Facebook, and subscribe to our YouTube channel.

Gaurav Shukla — Paranoid about online surveillance, Gaurav believes an artificial general intelligence is one day going to take over the world, or maybe not. He is a big ‘Person of Interest’ fan.

© Copyright Red Pixels Ventures Limited 2019. All rights reserved.