Siri and Google Assistant hacked in new ultrasonic attack
Dubbed SurfingAttack by the US-Chinese university team behind it, this is no parlor trick: the technique remotely controls voice assistants using inaudible ultrasonic waves.
Voice assistants - the demo targeted Siri, Google Assistant, and Bixby - are designed to respond when they hear the owner's voice speak a trigger phrase such as 'OK Google'.
As explained in a video showcasing the method, a remote laptop uses a text-to-speech module to generate simulated voice commands, which are then transmitted over Wi-Fi or Bluetooth to the piezoelectric disc that delivers them as inaudible ultrasonic vibrations through the surface the phone is resting on.
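As background on how a spoken command is made inaudible, the short Python sketch below amplitude-modulates a recorded voice command onto an ultrasonic carrier, the general signal principle that DolphinAttack and SurfingAttack rely on: the nonlinearity of the target device's microphone demodulates the signal back into the audible range. The 25 kHz carrier, sample rate, and file names are illustrative assumptions, not values taken from the researchers' setup.

```python
# Hedged sketch: amplitude-modulating a voice command onto an ultrasonic carrier.
# Carrier frequency, sample rate, and file names are illustrative assumptions.
import numpy as np
from scipy.io import wavfile

CARRIER_HZ = 25_000      # assumed ultrasonic carrier, above human hearing
SAMPLE_RATE = 96_000     # output rate high enough to represent the carrier

def modulate_command(voice: np.ndarray, fs: int) -> np.ndarray:
    """Upsample a baseband voice command and AM-modulate it onto the carrier."""
    # Resample the voice to the output rate (simple linear interpolation).
    t_in = np.arange(len(voice)) / fs
    t_out = np.arange(0, t_in[-1], 1 / SAMPLE_RATE)
    baseband = np.interp(t_out, t_in, voice.astype(np.float64))
    baseband /= np.max(np.abs(baseband)) or 1.0

    carrier = np.cos(2 * np.pi * CARRIER_HZ * t_out)
    # Standard AM: the microphone's nonlinearity later recovers the baseband.
    return 0.5 * (1.0 + baseband) * carrier

if __name__ == "__main__":
    fs, voice = wavfile.read("ok_google_command.wav")   # hypothetical input file
    if voice.ndim > 1:
        voice = voice.mean(axis=1)                       # down-mix to mono
    wavfile.write("ultrasonic_command.wav", SAMPLE_RATE,
                  (modulate_command(voice, fs) * 32767).astype(np.int16))
```

In the actual attack the modulated waveform is not played through a loudspeaker but fed to the piezoelectric disc, so the table surface itself carries the signal to the phone's microphones.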
In theory, voice assistants should respond only to the owner's voice, but voices can now be cloned using machine learning software such as Lyrebird, as was done in this test.
SurfingAttack was inspired by the 2017 DolphinAttack proof-of-concept, which showed how voice assistants could be hijacked by ultrasonic commands.