Security News > 2020 > March > Siri and Google Assistant hacked in new ultrasonic attack

Dubbed SurfingAttack by a US-Chinese university team, this is no parlor trick: it remotely controls voice assistants using inaudible ultrasonic waves.
Voice assistants - the demo targeted Siri, Google Assistant, and Bixby - are designed to respond when they detect the owner's voice speaking a trigger phrase such as 'OK, Google'.
As explained in a video showcasing the method, a remote laptop generates simulated voice commands using a text-to-speech module, which are then transmitted over Wi-Fi or Bluetooth to a piezoelectric disc hidden on the underside of the table, where they propagate to the phone's microphone as ultrasonic vibrations.
In theory, voice assistants should only respond to the owner's voice, but these can now be cloned using machine learning software such as Lyrebird, as was the case in this test.
SurfingAttack was inspired by the 2017 DolphinAttack proof-of-concept, which showed how voice assistants could be hijacked by ultrasonic commands.
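The core trick in both attacks is amplitude-modulating an audible voice command onto an ultrasonic carrier: the nonlinearity of the phone's microphone demodulates the signal back into the audible band, so the assistant "hears" the command while humans do not. Below is a minimal, hedged sketch of that modulation step in Python; the sample rate, carrier frequency, and modulation depth are illustrative assumptions, not the researchers' actual parameters.

```python
import numpy as np

FS = 96_000          # assumed sample rate, high enough to represent the carrier
CARRIER_HZ = 25_000  # assumed inaudible carrier frequency (real attacks vary)

def modulate_ultrasonic(voice: np.ndarray, depth: float = 0.8) -> np.ndarray:
    """Amplitude-modulate an audible voice signal onto an ultrasonic carrier.

    A microphone's nonlinear response demodulates the envelope back into
    the audible band, recovering the original command.
    """
    t = np.arange(len(voice)) / FS
    carrier = np.sin(2 * np.pi * CARRIER_HZ * t)
    # Standard AM: the carrier's envelope follows the voice waveform
    return (1.0 + depth * voice) * carrier

# Example: a 440 Hz tone standing in for a synthesized voice command
t = np.arange(FS) / FS
voice = 0.5 * np.sin(2 * np.pi * 440 * t)
signal = modulate_ultrasonic(voice)
```

All of the signal's energy sits at and around 25 kHz (the carrier plus sidebands at ±440 Hz), which is why the transmission is inaudible to people nearby.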