Siri, Alexa and Google Smart Speakers Hacked by a Laser

Researchers have published their findings after successfully hacking popular 'smart' devices from Apple, Google and Amazon with a beam of laser light.

The attack, dubbed "light commands", exploits the design of a smart device's microphone, which works by converting sound into electrical signals. The researchers found that the MEMS microphones used in these devices also react to light: by shining a laser at the microphone, an attacker can make it produce signals that the device interprets as voice commands.

According to reports, the researchers were able to issue commands to the devices from 110 meters away, which could have widespread ramifications if cybercriminals were to adopt this technique.

"By modulating an electrical signal in the intensity of a light beam, attackers can trick microphones into producing electrical signals as if they were receiving genuine audio," says the research paper, published by the University of Michigan and the University of Electro-Communications in Tokyo.

According to ThreatPost, “MEMS microphones feature a small, built-in plate called the diaphragm, which when hit with sounds or light sends electrical signals that are translated into commands. Instead of voice commands, researchers found that they could ‘encode’ sound using the intensity of a laser light beam, which causes the diaphragm to move and results in electrical signals representing the attacker’s commands.”
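The encoding idea described above can be sketched in a few lines of Python. This is a minimal illustration of amplitude modulation, not the researchers' actual tooling: the function name, the DC bias and the modulation depth are all illustrative assumptions.

```python
import math

def amplitude_modulate(audio_samples, bias=0.5, depth=0.4):
    """Map audio samples in [-1, 1] onto laser intensity values in [0, 1].

    Illustrative sketch: the beam intensity follows the audio waveform
    around a constant bias, so the MEMS diaphragm's response to the
    light reproduces the original audio signal (bias/depth values are
    assumptions, not measured parameters).
    """
    out = []
    for s in audio_samples:
        intensity = bias + depth * s               # intensity tracks the waveform
        out.append(min(1.0, max(0.0, intensity)))  # clamp to the driver's range
    return out

# A 1 kHz test tone sampled at 16 kHz, standing in for a spoken command.
sample_rate = 16000
tone = [math.sin(2 * math.pi * 1000 * n / sample_rate) for n in range(160)]
intensities = amplitude_modulate(tone)
```

In a real attack the resulting intensity values would drive a laser's current driver, so the beam "carries" the command waveform to the microphone's diaphragm.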

In a video accompanying the research, the team demonstrates sending commands to a Google Home unit.

“In a real-life attack,” Lindsey O’Donnell explains, “an attacker could stand outside a house and potentially shine a laser light onto a voice assistant that is visible through a window. From there, an attacker could command the voice assistant to unlock a door, make online purchases, remotely start vehicles or other malicious actions.”

More dangerous still, the researchers say that a possible attack could be conducted "easily and cheaply", adding that a $14 laser pointer and a basic sound amplifier could be enough to initiate one. They conducted tests on a range of devices, from Google Nest and Amazon Echo units to the iPhone XR, Samsung Galaxy S9 and Google Pixel 2, and found that any system using MEMS microphones was vulnerable to being compromised.

The team stated that while there have been no recorded incidents of a malicious attack on these smart devices, the possibility is real, and they have begun collaborating with the manufacturers of these products to eliminate the vulnerability.

The researchers state that “an additional layer of authentication can be effective at somewhat mitigating the attack. Alternatively, in case the attacker cannot eavesdrop on the device’s response, having the device ask the user a simple randomized question before command execution can be an effective way at preventing the attacker from obtaining successful command execution.”
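The randomized-question mitigation the researchers describe can be sketched as a simple challenge-response check. This is a hypothetical illustration, not any vendor's actual API: `SENSITIVE_COMMANDS`, `make_challenge` and `handle_command` are invented names for the purpose of this example.

```python
import random

# Hypothetical sketch of the randomized-challenge mitigation: before
# executing a sensitive command, the assistant speaks a random question.
# An attacker injecting commands by laser cannot hear the device's audio
# output, so they cannot supply the expected answer.

SENSITIVE_COMMANDS = {"unlock the door", "start the car", "place an order"}

def make_challenge():
    """Return a (spoken question, expected answer) pair."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} plus {b}?", str(a + b)

def handle_command(command, reply_fn):
    """reply_fn(question) simulates listening for the user's spoken answer."""
    if command not in SENSITIVE_COMMANDS:
        return "executed"                 # low-risk commands run directly
    question, expected = make_challenge()
    answer = reply_fn(question)           # the laser attacker never hears this
    return "executed" if answer == expected else "rejected"
```

For example, `handle_command("unlock the door", lambda q: "banana")` returns `"rejected"`, while a user who actually hears the question can answer it and proceed.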

ThreatPost contacted Apple, Google, Amazon and Facebook for comments on the news, and an Amazon spokesperson replied that “customer trust is our top priority and we take customer security and the security of our products seriously. We are reviewing this research and continue to engage with the authors to understand more about their work.”

A Google spokesperson added that “we are closely reviewing this research paper. Protecting our users is paramount, and we’re always looking at ways to improve the security of our devices.”

© 2020 by Best Practice
