Security News > 2023 > May > Your voice could be your biggest vulnerability
AI technology is fueling a rise in online voice scams, with just three seconds of audio required to clone a person's voice, according to McAfee.
With 53% of adults sharing their voice data online at least once a week and 49% doing so up to 10 times a week, cloning how somebody sounds is now a powerful tool in the arsenal of a cybercriminal.
"Advanced artificial intelligence tools are changing the game for cybercriminals. Now, with very little effort, they can clone a person's voice and deceive a close contact into sending money," said Steve Grobman, Chief Technology Officer at McAfee.
"It's important to remain vigilant and to take proactive steps to keep yourself and your loved ones safe. Should you receive a call from your spouse or a family member in distress asking for money, verify the caller: use a previously agreed codeword, or ask a question only they would know. Identity and privacy protection services will also help limit the digital footprint of personal information that a criminal can use to develop a compelling narrative when creating a voice clone," concluded Grobman.
The voice of a person who speaks with an unusual pace, rhythm or style requires more effort to clone accurately, making them a less likely target as a result.
AI voice cloning protection
Set a verbal 'codeword' with kids, family members or trusted close friends that only they could know.