Why this matters: According to the FBI's Internet Crime Complaint Center (IC3), BEC attacks cause billions of dollars in losses annually. AI makes it easier than ever for attackers to clone voices and faces. Following protocol is our only defense against devastating financial and reputational impacts.
The Deepfake Dilemma
Imagine you are at your desk. You receive an urgent, slightly panicked voice message from your CEO. They are stranded at a foreign airport and need you to authorize an immediate $500,000 wire transfer to a new vendor. The voice sounds exactly like them, down to the breathing patterns.
What would you do?
Choose your immediate action:
The Modern Threat Landscape
Digital impersonation and synthetic media are sophisticated threats designed to induce financial transfers, credential disclosures, or reputational harm. Review the tabs below to understand the primary attack vectors.
AI Voice Cloning (Audio Deepfakes)
Attackers use generative AI models to replicate an executive's voice, training them on publicly available audio (e.g., earnings calls, interviews). The result is a cloned voice: the attacker types arbitrary text into a text-to-speech engine in real time and mimics commands over the phone.
Executive Video Deepfakes
Deepfake video is often used in fake virtual meetings. Attackers use deep learning models to swap faces and lip-sync audio in real time on platforms like Zoom or Teams. They frequently cite a "poor connection" to excuse low resolution or minor glitches.
Business Email Compromise (BEC)
BEC attacks rely on compromised email accounts or deceptive domains (typosquatting) to send fraudulent instructions. When BEC is paired with synthetic media (e.g., an email followed by a fake voice mail), it dramatically increases the perceived authenticity of the attack.
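One common typosquatting tell is a sender domain that is almost, but not exactly, a trusted domain. The sketch below illustrates that idea with a simple edit-distance check; the domain names and threshold are illustrative assumptions, not part of any official detection tool.

```python
# Minimal sketch: flag sender domains that closely resemble a trusted domain
# (typosquatting). Domains and the distance threshold are illustrative.

def edit_distance(a: str, b: str) -> int:
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def is_lookalike(sender_domain: str, trusted_domain: str,
                 max_distance: int = 2) -> bool:
    """Near-but-not-exact matches are the typosquatting signature."""
    if sender_domain == trusted_domain:
        return False  # exact match: the legitimate domain itself
    return edit_distance(sender_domain, trusted_domain) <= max_distance

print(is_lookalike("examp1e-corp.com", "example-corp.com"))  # True: '1' swapped for 'l'
print(is_lookalike("example-corp.com", "example-corp.com"))  # False: exact match
```

A check like this can flag suspicious senders for review, but it is a heuristic only; it does not replace the Callback Protocol described later in this lesson.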
Knowledge Check
Which attack vector involves using public earnings calls to train a generative AI model, allowing an attacker to mimic a leader's commands over the phone?
Anatomy of a Synthetic Attack
How do attackers successfully trick seasoned professionals? They follow a structured lifecycle to build trust before exploiting it.
1. Reconnaissance
Threat actors scrape social media, corporate websites, and press releases to gather high-quality audio and video samples of the target executive.
2. Synthesis
The collected data is fed into machine learning models to generate highly realistic synthetic voice or video assets.
3. Execution (The Lure)
The attacker initiates contact. They might send an urgent email (BEC) and follow it up with a cloned voicemail to establish "proof of life."
4. Exploitation
The target, convinced by the multi-modal deception, bypasses standard protocols and executes the requested financial wire or credential handover.
Identity Verification & Callback Protocol
The core defense against impersonation is the Callback Protocol. No financial transfer or confidential disclosure shall be executed solely on the basis of email, video, or voice instruction.
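The decision logic behind the Callback Protocol can be sketched in a few lines: the callback number always comes from an internal, pre-validated directory, never from the request itself. The directory contents, data fields, and function names below are illustrative assumptions, not an official implementation.

```python
# Minimal sketch of Callback Protocol decision logic. Directory contents
# and field names are illustrative assumptions.
from dataclasses import dataclass

# Pre-validated contacts sourced from internal records --
# NEVER from the request being verified.
VERIFIED_DIRECTORY = {"acme-vendor": "+1-555-0100"}

@dataclass
class HighRiskRequest:
    counterparty: str
    supplied_number: str  # number the requester provided (untrusted)

def approved_callback_number(request: HighRiskRequest) -> str:
    """Return the directory number to call, or raise if none is on file."""
    number = VERIFIED_DIRECTORY.get(request.counterparty)
    if number is None:
        raise ValueError("No pre-validated contact on file: escalate, do not proceed")
    if number != request.supplied_number:
        # A mismatch is a red flag; call only the directory number.
        print("Warning: supplied number differs from directory record")
    return number
```

Note the key property: even when the attacker supplies their own callback number, the function ignores it and returns the independently sourced contact, which is exactly what the protocol requires of a human operator.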
Click below to expand the mandatory steps for high-risk requests:
Knowledge Check
You receive an urgent email from a known vendor asking to update their banking details. They provide a new phone number in the email to call if you have questions. What MUST you do before processing the change?
Incident Reporting & Communications
If you suspect an impersonation or social engineering attempt, speed is critical. A delayed response allows attackers to pivot to other employees.
Report Immediately
Notify the Information Security Team right away so network logs can be preserved and containment actions can begin.
Maintain External Silence
Employees are strictly prohibited from responding to the threat actor or making public comments without coordination from Legal and Corporate Communications.
No Unauthorized Recording
Never attempt to independently record or screenshot internal video conferences without documented authorization from compliance teams.
Knowledge Check
If you suspect a deepfake video message from an executive was sent to your team, what is the policy regarding external communication?
Key Takeaways
Threats are Multi-Modal
Attackers combine BEC emails with AI voice cloning and video deepfakes to manipulate trust.
Always Use the Callback Protocol
Use multi-channel verification through pre-validated contacts for all high-risk requests. Never trust voice or email alone.
Report, Don't React
Send anomalies straight to Information Security. Do not engage the threat actor or comment publicly.
Strict Policy Enforcement
Bypassing these verification procedures is a material violation and carries severe consequences.
Final Assessment
You have completed the tutorial portion of the lesson.
You will now take a 4-question assessment to test your ability to apply the SOP.
You must score 80% or higher to pass and receive your certificate.
Assessment Question 1
What is the mandatory protocol before executing any high-risk financial transfer requested via an unexpected video call?
Assessment Question 2
Which of the following describes Business Email Compromise (BEC) when enhanced by synthetic media?
Assessment Question 3
What is the primary reason why relying solely on visual or auditory recognition is no longer sufficient for authorization?
Assessment Question 4
If a deepfake incident is suspected during an active operation, what is the most critical immediate action?