TL;DR

Deepfakes use AI to generate realistic fake video and audio of real people. Scammers use them to impersonate executives, family members, and celebrities for fraud. Protect yourself by verifying requests through a separate channel, watching for visual glitches, and treating urgent requests for money or information with suspicion.

Why it matters

Deepfake technology has become accessible to criminals. In 2024, an employee at a Hong Kong firm wired $25 million after a video call in which deepfakes impersonated the company's CFO and other colleagues. These scams are becoming more common and more convincing. Anyone can be targeted.

How deepfake scams work

The technology

AI analyzes videos and photos of a person to learn:

  • Their facial features and movements
  • Their voice patterns and speech
  • Their mannerisms and expressions

Then it generates new video that looks and sounds like that person saying whatever the scammer wants.

Common attack types

Executive impersonation

  • Fake CEO/CFO on video calls
  • Urgent wire transfer requests
  • "Keep this confidential" pressure
  • Targets: Finance teams, assistants

Family emergency scams

  • Fake video of relative in distress
  • Requests for bail money or emergency funds
  • Creates panic to bypass rational thinking
  • Targets: Elderly family members

Romance and trust scams

  • Fake video calls with romantic interests
  • Building relationships for financial fraud
  • Used when the victim requests video proof
  • Targets: Online dating users

Celebrity endorsement fraud

  • Fake videos of celebrities promoting scams
  • Cryptocurrency schemes
  • Investment frauds
  • Targets: Social media users

Red flags to watch for

Visual clues

Current deepfakes often have telltale signs:

  • Unnatural blinking — Too slow, too fast, or irregular
  • Skin texture — Too smooth or plastic-looking
  • Edge artifacts — Blurring around face edges
  • Lighting inconsistencies — Face lit differently than background
  • Hair issues — Strands don't move naturally
  • Accessory glitches — Glasses or earrings behaving strangely
  • Background warping — Slight distortions when head moves

Audio clues

  • Robotic quality — Slightly unnatural speech patterns
  • Sync issues — Lips don't quite match audio
  • Breathing — Unnatural or missing breath sounds
  • Background noise — Inconsistent ambient sounds

Behavioral clues

  • Urgency — "This must happen now"
  • Secrecy — "Don't tell anyone about this"
  • Unusual requests — Asking for things they normally wouldn't
  • Avoiding verification — Resisting callbacks or confirmation

How to protect yourself

The callback rule

Never act on a video request without verification through a separate channel.

If your "CEO" calls asking for a wire transfer:

  1. Say you'll call them right back
  2. Use a number you already have (not one they give you)
  3. Confirm the request directly
  4. If they resist verification, it's likely a scam
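For teams that script their contact or chat-ops tooling, the heart of the callback rule can even be made mechanical: the number you call back always comes from a directory you already trust, never from the suspicious call itself. A minimal sketch, where the directory and phone numbers are purely hypothetical:

```python
# Illustrative sketch of the callback rule: the callback number is always
# looked up from a trusted directory kept on file, and any number offered
# during the suspicious call itself is deliberately ignored.

TRUSTED_DIRECTORY = {"cfo": "+1-555-0100"}  # hypothetical, maintained in advance

def get_callback_number(role, number_offered_on_call=None):
    """Return the number on file for this role, ignoring caller-supplied numbers."""
    on_file = TRUSTED_DIRECTORY.get(role.lower())
    if on_file is None:
        raise LookupError(f"No verified number on file for {role!r}")
    return on_file  # number_offered_on_call is intentionally unused

print(get_callback_number("CFO", number_offered_on_call="+1-555-9999"))
# prints +1-555-0100, the number on file, not the one the caller offered
```

The design point mirrors the human rule above: the caller-supplied number is accepted as a parameter only so it can be visibly discarded.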

Family code words

Establish a family verification system:

  • Create a secret code word only family knows
  • Use it to verify emergency calls
  • Change it if it might be compromised
  • Never share it via text or email

Organizational safeguards

For businesses:

  • Require multi-person approval for large transfers
  • Establish verification protocols for video requests
  • Train employees on deepfake awareness
  • Create clear escalation procedures
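For businesses that automate payment workflows, the multi-person approval rule above can be enforced in code rather than by convention. A minimal sketch, where the threshold and approver names are purely illustrative:

```python
# Illustrative sketch: transfers above a threshold require two distinct
# approvers, so no single deepfaked "executive" can authorize one alone.
# The threshold value is hypothetical.

APPROVAL_THRESHOLD = 10_000  # amounts above this need two approvers

def transfer_allowed(amount, approvers):
    """Return True only if the transfer satisfies the approval policy."""
    distinct = set(approvers)  # duplicate sign-offs don't count twice
    if amount > APPROVAL_THRESHOLD:
        return len(distinct) >= 2
    return len(distinct) >= 1

print(transfer_allowed(25_000, ["alice"]))          # prints False (blocked)
print(transfer_allowed(25_000, ["alice", "bob"]))   # prints True (allowed)
```

Deduplicating approvers matters: a scammer who pressures one employee into approving twice still cannot satisfy the two-person rule.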

Technical measures

  • Enable two-factor authentication everywhere
  • Be cautious about video calls from unknown numbers
  • Verify meeting links come from legitimate sources
  • Consider deepfake detection tools for high-risk situations

What to do if targeted

Don't panic

  • Take a breath
  • Don't act immediately
  • The urgency is manufactured

Verify

  • Contact the person through known channels
  • Call their verified phone number
  • Check with others who might know

Document

  • Save the video if possible
  • Note the contact method used
  • Record any phone numbers or emails

Report

  • FTC (US): reportfraud.ftc.gov
  • IC3 (FBI): ic3.gov
  • Action Fraud (UK): actionfraud.police.uk
  • Your local police
  • Your organization's security team

What's next

Stay informed about AI safety: