The AI Applicant: Deepfakes In UK University Interviews

Aug 29, 2025

Image Credit: Pexels

Some UK universities have started using automated online interviews and questionnaires—especially for international students—to make admissions faster and more efficient. These tools help schools assess spoken English and communication skills before issuing the Confirmation of Acceptance for Studies certificate required for UK student visas. However, a new and worrying twist has emerged: a handful of applicants are using AI-generated “deepfake” voices and faces to manipulate these interviews, raising concerns about deception in admissions.

Enroly, a software platform that a number of universities use to automate their application processes, has been at the forefront of this issue. Deepfake cases are still rare: about 30 incidents out of 20,000 January interviews, roughly 0.15%. Even so, experts call the technique "the future of fraud", and Phoebe O'Donnell, Enroly's Head of Services, described it as "the stuff of nightmares for interview assessors". More traditional forms of cheating, such as an off-screen helper or lip-syncing, make up a slightly larger share of deceptive attempts, at about 1.3%.

To counteract these threats, Enroly and universities employ several fraud-detection tools, including facial recognition, passport matching, and real-time detection techniques. If an automated interview seems suspicious, the candidate may be asked to complete a live interview instead. This vigilance is vital not only to keep admissions fair but also to maintain compliance with UK Visas and Immigration (UKVI) requirements. As deepfake technology evolves, universities must stay ahead of the curve to safeguard the integrity of the admissions process.
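As a rough illustration of how one such check can work, the sketch below compares a face embedding derived from the applicant's passport photo with one extracted from a frame of the recorded interview, and flags the application for a live interview when the two diverge too much. This is a generic, assumed approach rather than Enroly's actual system: the embeddings are placeholders, and the similarity threshold is an arbitrary value chosen for the example.

```python
# Illustrative sketch only, not Enroly's actual pipeline: a generic passport-to-
# interview face-match check. Embeddings would come from a real face-recognition
# model; here they are simple placeholder vectors.
import numpy as np

SIMILARITY_THRESHOLD = 0.6  # assumed cut-off; a real system would tune this against labelled data

def needs_live_interview(passport_embedding: np.ndarray, interview_embedding: np.ndarray) -> bool:
    """Flag the applicant for a live interview if the two face embeddings differ too much."""
    # Cosine similarity between the passport photo and an interview frame.
    a = passport_embedding / np.linalg.norm(passport_embedding)
    b = interview_embedding / np.linalg.norm(interview_embedding)
    return float(np.dot(a, b)) < SIMILARITY_THRESHOLD

# Example with placeholder embeddings (a real system would extract these from images).
passport_vec = np.array([0.12, 0.87, 0.45, 0.33])
interview_vec = np.array([0.10, 0.90, 0.40, 0.35])
print(needs_live_interview(passport_vec, interview_vec))  # False: the faces match closely
```

In practice such a check would be one signal among several, combined with liveness and deepfake-artefact detection, before a human assessor decides whether to require a live interview.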

With generative AI tools becoming more accessible, what is a small percentage of cases today could quickly escalate into a widespread threat. In fact, deepfake fraud is growing at a pace few expected. That is why our projects focus on staying ahead of this curve: building advanced real-time deepfake detection and improving your deepfake defense readiness. Reach out to us at Karna if you want to know more about protecting yourself against deepfake fraud.

Copyright ©2025 Project Karnā Inc. Various trademarks held by their respective owners.
