LastPass: Hackers targeted employee in failed deepfake CEO call
LastPass revealed this week that threat actors targeted one of its employees in a voice phishing attack, using deepfake audio to impersonate Karim Toubba, the company's Chief Executive Officer.
While 25% of people have either been targeted by an AI voice impersonation scam or know someone who has, according to a recent global study, the LastPass employee didn't fall for it because the attacker reached out over WhatsApp, a channel rarely used for business communications.
"In our case, an employee received a series of calls, texts, and at least one voicemail featuring an audio deepfake from a threat actor impersonating our CEO via WhatsApp," LastPass intelligence analyst Mike Kosak said.
Kosak added that the attack failed and had no impact on LastPass.
The audio used in this attack was likely generated with deepfake models trained on publicly available recordings of LastPass' CEO, such as this one available on YouTube.
Europol warned in April 2022 that deepfakes may soon become a tool that cybercriminal groups routinely use in CEO fraud, evidence tampering, and non-consensual pornography creation.