3 Physical Tests to Spot a Deepfake Candidate

Technical recruiter performing forensic "Profile Turn" and "Hand Pass" tests on a candidate during a video interview to detect deepfake overlays.

In 2026, a high-fidelity video call is no longer proof of life. As deepfake technology becomes more accessible, candidates are increasingly using digital overlays to simulate technical know-how and identity.

The March 2026 INTERPOL report underscores a stark reality: AI-enhanced fraud is now 4.5 times more profitable than traditional cybercrime methods.

If your hiring process relies on a standard chat-and-code format, you are vulnerable. At STACK IT, we’ve moved beyond conversation into forensic verification. We believe human insight beats automation, but that insight must be paired with physical tests that AI face-mapping simply cannot pass.

Maneuver 1: the profile turn

The most common digital mask used by imposters is a front-facing AI overlay. These tools are highly effective at mapping features—eyes, nose, and mouth—as long as the candidate remains relatively still and centered.

How to execute: during your interview, ask the candidate to turn their head 90 degrees to the left or right to show their profile.

What to look for:

  • Watch the jawline and the ears. Most current generative AI models struggle to maintain the digital mask at a sharp angle. You will often see a momentary shimmer or a complete loss of the overlay where the mask detaches from the real face.
  • If the candidate’s skin texture suddenly changes or becomes smooth during the turn, you are likely looking at a render.
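The same "shimmer" a recruiter watches for can also be scored numerically by teams experimenting with automated screening: tracked facial landmarks on a real face drift smoothly through a head turn, while an overlay that loses its anchor points jumps. The sketch below is a toy illustration only, using made-up synthetic coordinates; in practice the landmark positions would come from a face-tracking library, and real detectors are far more sophisticated.

```python
# Toy illustration (not a production detector): score frame-to-frame
# landmark "jitter". A real face moves smoothly through a profile turn;
# a digital mask that loses its anchor points tends to snap or shimmer.
# The landmark coordinates below are synthetic, for demonstration only.

def jitter_score(landmarks_per_frame):
    """Mean frame-to-frame displacement of tracked landmark points."""
    total, count = 0.0, 0
    for prev, curr in zip(landmarks_per_frame, landmarks_per_frame[1:]):
        for (x0, y0), (x1, y1) in zip(prev, curr):
            total += ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
            count += 1
    return total / count if count else 0.0

# Smooth turn: two landmarks drift ~1 px per frame.
smooth = [[(100 + f, 100), (140 + f, 100)] for f in range(10)]

# "Shimmer": one landmark snaps 25 px off-anchor mid-turn, then back.
jumpy = [[(100 + f + (25 if f == 5 else 0), 100), (140 + f, 100)]
         for f in range(10)]

print(jitter_score(smooth) < jitter_score(jumpy))  # True
```

A threshold on a score like this is the crude automated analog of the recruiter’s eye: the smooth sequence scores 1 px per frame, while the mid-turn snap pushes the jumpy sequence several times higher.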

Maneuver 2: the hand pass

Deepfakes rely on unobstructed anchor points on the face. When those points are blocked, the AI rendering engine must guess in real time what is behind the obstruction.

How to execute: ask the candidate to slowly pass their hand across their face, from one ear to the other, while continuing to speak.

What to look for:

  • Look for halos or blurring around the fingers. If the candidate is a deepfake, their features (eyes or mouth) may appear on top of their hand, or the hand itself will appear to melt into the face.
  • This maneuver disrupts the AI’s ability to map the mouth to the voice. If the audio remains perfectly clear while the hand blocks the mouth, or if the digital lips stop moving entirely, you have identified a mask.

Maneuver 3: contextual logic breaking

While not strictly physical, this maneuver tests the candidate’s cognitive latency and breaks the AI-assisted script.

How to execute: interrupt a technical answer with a completely unrelated, high-context request. For example:

“Before you finish that, can you look at that object behind you and tell me what color it is?” or “Can you tilt your camera down to show your keyboard for a moment?”

What to look for:

  • An AI-assisted candidate (a “prompt gamer”) is often focused on a second screen or teleprompter. A sudden shift in physical focus or environment-based questioning forces a logic break.
  • Watch for a 3–5 second freeze as the candidate (or their proxy) attempts to recalibrate their AI tool to the new physical context.

Why AI forensics is the new standard

Relying on a boutique, intentional process means we don’t take shortcuts. When firms skip these forensic steps, they fall into the trap of resume roulette and absorb the high cost of a bad hire, a financial and operational hit that most scaling teams cannot afford.

Beyond the financial loss, there are Ontario’s AI-disclosure rules: under Bill 149, employers are now legally required to disclose the use of AI in their hiring process. By using these physical maneuvers, STACK IT ensures that our clients remain compliant while using human-led judgment to filter out the noise.

If your organization fails to implement reasonable verification, especially now that the AI-disclosure mandates of Bill 149 have highlighted these risks, you are legally exposed to negligent-hiring claims that are increasingly difficult to defend.

Our recruiters don’t just ask questions that test for technical depth; we use these forensic maneuvers to ensure the person answering them is authentic.

Conclusion: trust, but verify

The interview imposter crisis isn’t going away. As AI tools improve, the burden of verification shifts to the recruiter.

At STACK IT, we’ve built our entire success-based recruiting model around this. We don’t flood your inbox with 30 candidates. We deliver a small batch of verified, human professionals who have passed our forensic STACK.

Is your process actually verifying humans? Don’t hire a deepfake. Download the Forensic AI Hiring Playbook to access our full list of detection maneuvers and the Bill 190 Ontario Hiring Compliance Checklist.

Need immediate help? Call (905) 238-9204
