
Deepfake Scams: When the Voice on the Phone Isn't Your Boss

Imagine getting a call from your CEO. You recognise the voice. The tone, the cadence, even the slight verbal habits — it sounds exactly like them. They tell you there's an urgent payment needed today, a confidential deal, and they need you to handle it personally without running it through the normal channels.

Would you do it?

If you said yes — or even "maybe" — you need to know about deepfake scams.

What is a deepfake scam?

Deepfake technology uses AI to clone a person's voice (or video likeness) from just a short sample of real audio or footage. With publicly available tools and as little as 10-20 seconds of audio from a YouTube video, a podcast, or a social media reel, a scammer can create a convincing replica of virtually anyone's voice.

These cloned voices are then used to call employees, suppliers, or family members and request urgent money transfers, password resets, or sensitive information.

Real cases, real money

In 2019, a UK energy company CEO was tricked into wiring €220,000 (about $380,000 AUD) after receiving a phone call he believed was from his parent company's CEO — it was a voice deepfake.

Cases have escalated dramatically since then. The 2024 Hong Kong deepfake video call case — where a single employee authorised a $25 million transfer after a fake video conference — is the most extreme example to date.

How to protect your business:

  1. Create a verbal code word for urgent financial requests. Something simple that only genuine executives in your team know. If someone calls claiming to be the CEO, ask for the code word. A deepfake caller won't know it.
  2. Mandate a second-channel confirmation. Any instruction to transfer money or change credentials — regardless of who it appears to be from — must be confirmed via a separate communication channel (e.g. an email to a known address, or a call back to a confirmed number).
  3. Set a payment threshold that requires in-person approval. For amounts above a certain level, physically confirm with the requester. A deepfake can't walk into the office.
  4. Limit public audio and video of your leadership team. The less material available, the harder it is to create a convincing deepfake. Not always practical, but worth considering.
  5. Train your team to feel comfortable pushing back. Employees need to know that it's not only okay to verify — it's required. Even if the caller sounds annoyed.

The human element

Deepfake scams work because we trust our senses. We've spent our whole lives learning that "it sounds like Dave" means it IS Dave. AI now exploits that instinct.

The fix is procedural, not sensory: build processes that require verification beyond just "it sounded like them."

Practise spotting AI fakes at Phishbate — free →

Think you can spot a phish?

Put your knowledge to the test with the Phishbate interactive quiz. It only takes a few minutes.

Take the Quiz →