https://www.sans.org/newsletters/ouch/phantom-voices-defend-against-voice-cloning-attacks

A disturbing new trend has emerged in the world of scams: the use of artificial intelligence (AI) to clone voices and deceive unsuspecting victims. Margaret, a retired teacher, fell victim to such a scam when a cybercriminal used AI to mimic her grandson's voice and trick her into sending money.

Voice cloning technology allows attackers to create highly realistic replicas of a person's voice, often from just a short sample of audio, making the deception difficult for victims to detect. These scams typically involve urgent requests for money that exploit victims' emotions and trust.

To protect yourself from voice cloning scams, be aware of the signs of deception and take the following precautions:

  • Limit information sharing: Be mindful of the personal information you share online, as it can be used to train AI voice cloning models.
  • Be wary of urgent requests: If someone calls you with an urgent request for money, especially one involving a family member or friend in distress, slow down and verify the story before acting.
  • Use a secret passphrase: Create a unique passphrase that only you and your loved ones know to verify the caller’s identity.
  • Hang up and call back: If you receive a suspicious call, hang up and call the person back on a phone number you already know and trust to confirm the request is genuine.

By staying informed and taking these steps, you can help protect yourself from falling victim to voice cloning scams.