Deepfake Video Scams
Scammers use AI-generated fake videos of celebrities, executives, or family members to manipulate victims into sending money or revealing sensitive information.
What is this scam?
Deepfake video scams involve criminals using AI to create realistic fake videos that impersonate trusted individuals such as CEOs, family members, celebrities, or government officials. These fabricated videos are then used to request urgent wire transfers, solicit investments in fraudulent opportunities, extract sensitive corporate information, or manipulate victims emotionally. The technology behind these scams has improved at an alarming rate, and the resulting videos are increasingly difficult to distinguish from genuine footage.
In a typical scenario, a scammer might create a deepfake video of a company executive on a video call instructing an employee to wire money to a new account. Or they might fabricate a video of a grandchild in distress asking for emergency funds. The combination of a familiar face, a convincing voice, and an urgent story makes these scams devastatingly effective.
How AI makes this scam more dangerous
Traditional impersonation scams relied on phone calls, emails, or text messages where the victim could not actually see the person making the request. Deepfake technology removes that limitation entirely. Scammers can now produce video that shows a person's face moving naturally, lips syncing to speech, and expressions matching the emotional tone of the message. Some deepfake tools even work in real time, allowing a scammer to appear as someone else during a live video call.
The barrier to creating these videos has dropped significantly. What once required a Hollywood visual effects team can now be done with consumer-grade software and a handful of reference photos or videos scraped from social media. AI models can learn the visual appearance of a person's face from public photos and generate a convincing fake in hours. Voice cloning technology is often combined with the video deepfake to create a complete audio-visual impersonation.
The scale is also concerning. Scammers can create deepfake videos of multiple different people and deploy them across various targets simultaneously, making this a high-volume, high-reward operation.
Who gets targeted and why
Businesses are prime targets, especially finance departments and employees who handle wire transfers. Scammers study corporate hierarchies through LinkedIn and company websites to identify who reports to whom, then impersonate executives to request payments from subordinates who are unlikely to question a directive from their boss. Elderly people are frequently targeted with deepfakes of family members in supposed emergencies, exploiting the emotional bond and the urgency of the situation. Investors and wealthy individuals may be targeted with deepfake endorsements from celebrities or financial experts promoting fraudulent investment schemes.
Warning signs specific to this scam
During video calls, watch for subtle visual artifacts: unnatural blinking patterns, slight blurring around the edges of the face, skin texture that looks too smooth or waxy, lighting on the face that does not match the background, and lip movements that are slightly out of sync with the audio. The person may avoid turning their head to extreme angles, or the video quality may degrade when they move quickly.
Beyond technical tells, be suspicious of any video call in which someone makes an urgent financial request, refuses to verify their identity through a separate channel, pressures you to act immediately without following normal procedures, or asks you to keep the request confidential. If anything feels even slightly off about a video interaction, trust that instinct and verify through an independent channel before taking any action.
🔍 How This Scam Works
- Target selection: Scammers identify high-value targets (executives, elderly family members)
- Data collection: Gather photos and videos from social media
- Deepfake creation: Use AI to generate fake video of target person
- Delivery: Send via video call, email, or messaging apps
- Urgent request: Create time pressure ("Wire money immediately")
- Exploitation: Victim complies before verifying authenticity
🚩 Red Flags to Watch For
- Video call with unusual audio quality or sync issues
- Requests for urgent money transfers via video
- Family member or executive asking for sensitive info via video
- Uncharacteristic behavior or requests
- Refusal to verify identity through alternate means
- Pressure to act immediately without verification
- Video quality that seems slightly "off" (uncanny valley effect)
🛡️ How to Protect Yourself
1. Establish a family code word for emergency verifications
2. Never act on urgent financial requests without independent verification
3. Call the person back on a known phone number (don't use a number from the suspicious message)
4. Use multi-factor authentication for corporate approvals
5. Train employees to recognize deepfake red flags
6. Implement dual-approval processes for wire transfers
7. Question any unusual request, even if the video looks real
8. Look for artifacts: unnatural blinking, lip-sync issues, odd lighting
📞 If You've Been Targeted
If you've been scammed:
- Stop any ongoing transactions immediately
- Contact your bank to freeze accounts and reverse transfers if possible
- Report to law enforcement: file a complaint with the FBI Internet Crime Complaint Center (IC3)
- Document everything: save videos, messages, and transaction records
- Notify your employer if it was a corporate scam
- Warn your network: others may be targeted with the same deepfake
- Consider credit monitoring if personal information was shared
Don't blame yourself: these scams are sophisticated and fool even tech-savvy individuals.
🌍 Report & Get Help
Report fraud and get support through these official resources in your country:
🇺🇸 United States
- FBI Internet Crime Complaint Center (IC3)
Report cyber crimes including deepfake scams
📞 1-800-CALL-FBI
- Federal Trade Commission (FTC)
Consumer protection and scam reporting
📞 1-877-FTC-HELP
🇬🇧 United Kingdom
- Action Fraud
UK's national fraud and cybercrime reporting centre
📞 0300 123 2040
🇨🇦 Canada
- Canadian Anti-Fraud Centre
Report fraud and get support
📞 1-888-495-8501
🇦🇺 Australia
- Scamwatch
Report scams to the Australian Competition and Consumer Commission (ACCC)
Related Scam Alerts
Voice Cloning Scams
Criminals use AI to clone voices from short audio samples, then call victims pretending to be loved ones in distress.
Deepfake Extortion and Blackmail
Criminals create fake compromising videos or images of victims using AI, then demand payment to prevent distribution.
AI-Powered Identity Theft
Criminals use AI to create fake IDs, forge documents, and impersonate victims with synthetic voices and images to steal identities at scale.