Critical Risk · Deepfakes

Voice Cloning Scams

Criminals use AI to clone voices from short audio samples, then call victims pretending to be loved ones in distress.

Last updated: January 5, 2025

The "Grandparent Scam" 2.0

AI voice cloning has supercharged traditional phone scams. Scammers need just 3-10 seconds of audio (from social media videos, voicemail, etc.) to create a convincing clone of someone's voice.

🔍 How This Scam Works

  1. Scammers find audio of your family member online (social media, YouTube, etc.)
  2. They use AI to clone the voice
  3. They call you pretending to be that person in an emergency (arrested, in an accident, kidnapped)
  4. They request an immediate money transfer for bail, hospital bills, or ransom
  5. They create urgency so you don't pause to verify

🚩 Red Flags to Watch For

  • Urgent requests for money by phone or text
  • Claims of being in trouble (arrest, accident, hospitalization)
  • Asks you not to tell other family members
  • Requests unusual payment methods (gift cards, crypto, wire transfer)
  • Background noise or poor call quality
  • Refuses a video call
  • A story that doesn't quite add up

🛡️ How to Protect Yourself

  1. Establish a family code word that only your family knows
  2. Hang up and call the person directly at their known number
  3. Verify with other family members before sending money
  4. Never send money via gift cards, crypto, or wire transfer for "emergencies"
  5. Limit what you share on social media (especially videos with your voice)
  6. Be skeptical of urgent requests, even if the voice sounds right

📞 If You've Been Targeted

  1. Contact your bank immediately to stop/reverse payments
  2. Report to local police and FBI IC3
  3. Warn family members
  4. Document all communications
  5. Consider identity theft monitoring

🌍 Report & Get Help

Report fraud and get support through these official resources in your country:

🇺🇸 United States

  • FBI IC3

    Report cyber fraud

    📞 1-800-CALL-FBI


