Voice Cloning Scams
Criminals use AI to clone voices from short audio samples, then call victims pretending to be loved ones in distress.
The "Grandparent Scam" 2.0
AI voice cloning has supercharged traditional phone scams. Scammers need just 3-10 seconds of audio (from social media videos, voicemail, etc.) to create a convincing clone of someone's voice.
🔍 How This Scam Works
- Scammers find audio of your family member online (social media, YouTube, etc.)
- Use AI to clone the voice
- Call you pretending to be in an emergency (arrested, in an accident, kidnapped)
- Request immediate money transfer for bail, hospital bills, ransom
- Create urgency to prevent verification
🚩 Red Flags to Watch For
- Urgent requests for money via phone or text
- Claims of being in trouble (arrest, accident, hospital)
- Asks you not to tell other family members
- Requests for unusual payment methods (gift cards, crypto, wire transfer)
- Background noise or call quality issues
- Refuses to video call
- Story doesn't quite add up
🛡️ How to Protect Yourself
1. Establish a family code word that only your family knows
2. Hang up and call the person directly on their known number
3. Verify with other family members before sending money
4. Never send money via gift cards, crypto, or wire transfer for "emergencies"
5. Limit what you share on social media, especially videos that include your voice
6. Be skeptical of urgent requests, even if the voice sounds right
📞 If You've Been Targeted
- Contact your bank immediately to stop/reverse payments
- Report to local police and the FBI's Internet Crime Complaint Center (IC3)
- Warn family members
- Document all communications
- Consider identity theft monitoring
🌍 Report & Get Help
Report fraud and get support through these official resources in your country:
🇺🇸 United States
- FBI IC3: report cyber fraud. 📞 1-800-CALL-FBI
🇬🇧 United Kingdom
- Action Fraud: report scams. 📞 0300 123 2040
Related Scam Alerts
Deepfake Video Scams
Scammers use AI-generated fake videos of celebrities, executives, or family members to manipulate victims into sending money or revealing sensitive information.
Deepfake Extortion and Blackmail
Criminals create fake compromising videos or images of victims using AI, then demand payment to prevent distribution.
AI-Powered Identity Theft
Criminals use AI to create fake IDs, forge documents, and impersonate victims with synthetic voices and images to steal identities at scale.