According to a recent study, smart speakers powered by Google Assistant and Alexa could trick you into handing over private information such as passwords and PINs. Another study reveals that these smart speakers can be hacked with the help of "laser light commands".
According to an investigation reported by Wired, anyone can hack a smart speaker and send commands to its microphone simply by modulating the intensity of a laser beam at a specific frequency. The main reason to worry is that the speaker interprets those signals as normal voice commands, which makes smart speakers dangerously vulnerable to attackers. This attack has been given a name: "Light Commands".
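To picture how the modulation works, here is a minimal, purely illustrative sketch in Python. It assumes (as the research describes) that a MEMS microphone responds to rapid changes in light intensity much as it does to sound pressure, so an audio waveform can be encoded as amplitude modulation of the laser beam. The function name, sample rate, and modulation depth are hypothetical choices for this example, not part of the actual attack tooling.

```python
import numpy as np

SAMPLE_RATE = 44_100  # audio samples per second (hypothetical choice)

def audio_to_laser_intensity(audio, depth=0.8):
    """Map an audio waveform with values in [-1, 1] onto a laser
    intensity signal in [0, 1] via simple amplitude modulation.

    `depth` is the modulation depth: 1.0 swings the intensity over the
    full range, while smaller values keep a DC bias so the beam never
    switches fully off.
    """
    audio = np.clip(audio, -1.0, 1.0)
    # Bias the beam at half intensity and swing it around that midpoint.
    return 0.5 + 0.5 * depth * audio

# Stand-in for a spoken command: a 440 Hz tone lasting 10 ms.
t = np.arange(0, 0.01, 1 / SAMPLE_RATE)
tone = np.sin(2 * np.pi * 440 * t)
intensity = audio_to_laser_intensity(tone)
```

The key point the sketch captures is that the "voice" never travels through the air at all: the waveform rides on the light itself, which is why the injected command is both inaudible and invisible to anyone nearby.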
After examining the attack carefully, the researchers explained light commands in more detail. They said a vulnerability in micro-electromechanical systems (MEMS) microphones is the root cause that lets hackers remotely inject inaudible and invisible commands into voice assistants such as Alexa, Google Assistant, and Apple's Siri.
The foremost reason to worry is that an attacker can steal your private data through light commands while sitting in another house or building; all the attack requires is a clear line of sight for the laser beam to reach the microphone.
What I liked best about the researchers is that they not only found the loophole but also showed how to close it. They found that if manufacturers install a small light shield in front of the microphone, an attacker can no longer inject commands and steal the user's data.
In response to the report, Google and Amazon spokespeople said they are still reviewing the research, while Apple and Facebook did not comment or respond at all. For now, we suggest you keep an eye on your smart speakers, since attackers can steal your personal data, and wait for the companies to respond to the research paper.