Deepfake Extortion and Blackmail
Criminals create fake compromising videos or images of victims using AI, then demand payment to prevent distribution.
What is this scam?
Scammers use AI to create fake pornographic, compromising, or embarrassing videos featuring your face. They threaten to send these deepfakes to your family, employer, or social media unless you pay. The videos are entirely fabricated but look convincing enough to cause panic.
This scam is particularly cruel because:
- The content never existed—it's 100% AI-generated
- Victims feel shame even though they did nothing wrong
- Fear of exposure makes victims pay without verifying
- Once you pay, scammers often demand more money
🔍 How This Scam Works
- Photo harvesting: Scammers collect your photos from social media
- Deepfake creation: AI swaps your face onto explicit or compromising content
- Contact: Email or message with "proof" video/image
- Threat: Claim they'll send to contacts, post online, or ruin your reputation
- Demand: Request payment (usually Bitcoin) within 24-48 hours
- Escalation: If you pay, they often demand more money
🚩 Red Flags to Watch For
- Email claiming they have compromising video/images of you
- Threats to send content to family, employer, or social media
- Demand for payment in cryptocurrency
- Tight deadline creating urgency (24-48 hours)
- Generic language suggesting a mass email blast
- Claims they hacked your webcam (usually false)
- No specific details about the "incident"
🛡️ How to Protect Yourself
1. Never pay extortionists: paying confirms you're a target and invites further demands
2. Report to police immediately: extortion is illegal
3. Limit personal photos on public social media
4. Use webcam covers when not actively using the camera
5. Tell trusted friends/family about the scam attempt
6. Document all threats, but don't engage with the scammers
7. Set social media to private and limit friend lists
8. If a deepfake appears online, report it to the platform and request removal
📞 If You've Been Targeted
If you're being extorted:
- Do NOT pay: payment doesn't stop the threats and marks you as a willing target
- Report to police: extortion is a serious crime
- Report to the FBI's Internet Crime Complaint Center (IC3) if you're in the US (ic3.gov)
- Screenshot all communications: they are evidence for law enforcement
- Block the scammer: don't engage further
- Warn family and friends: give them a heads-up in case they're contacted
- Report to the platform if content appears online
- Consult a lawyer if content is distributed
- Seek support: victim support services can help with the emotional impact
Remember: You did nothing wrong. The content is fake. Real authorities understand deepfakes exist.
🌍 Report & Get Help
Report fraud and get support through these official resources in your country:
🇺🇸 United States
- FBI Internet Crime Complaint Center
Report cyber extortion
- Cyber Civil Rights Initiative
Support for victims of intimate image abuse
📞 1-844-878-2274
- 988 Suicide & Crisis Lifeline
24/7 emotional support
📞 988
🇬🇧 United Kingdom
- Action Fraud
Report extortion
📞 0300 123 2040
- Revenge Porn Helpline
Support for intimate image abuse
📞 0345 6000 459
🇨🇦 Canada
- Canadian Centre for Child Protection
Help with image removal
- RCMP Cyber Crime
Report cyber extortion
🇦🇺 Australia
- eSafety Commissioner
Report and remove intimate images
📞 1800 880 176
Related Scam Alerts
Deepfake Video Scams
Scammers use AI-generated fake videos of celebrities, executives, or family members to manipulate victims into sending money or revealing sensitive information.
Voice Cloning Scams
Criminals use AI to clone voices from short audio samples, then call victims pretending to be loved ones in distress.
AI-Powered Identity Theft
Criminals use AI to create fake IDs, forge documents, and impersonate victims with synthetic voices and images to steal identities at scale.