North Korean operatives are increasingly using artificial intelligence (AI) to deceive Western companies into hiring them for remote IT jobs, according to Microsoft. The technology company said the long-running scheme by groups linked to North Korea is becoming more sophisticated with the help of AI tools that create fake identities, alter stolen documents and disguise accents during online interviews.
According to a report in The Guardian, Microsoft said North Korean groups have been using AI applications to produce realistic CV photos, generate culturally appropriate Western names and modify identity documents. Cybersecurity researchers refer to the groups behind these operations as Jasper Sleet and Coral Sleet.
The company said voice-changing software has been used during online interviews so that applicants appear to speak with Western accents. AI tools such as Face Swap are also being used to insert the faces of North Korean workers into stolen identity documents and generate professional-looking headshots for job applications.
Wages allegedly sent back to the state
The scam typically involves individuals applying for remote software or IT jobs in Western countries using fake identities and the help of local intermediaries.
Once hired, the workers reportedly send their salaries back to the North Korean government. In some cases, they have also threatened to release sensitive company information if their employment is terminated.
Microsoft said it disrupted about 3,000 Microsoft Outlook and Hotmail accounts last year that were linked to fake North Korean IT workers.
AI used throughout the scam process
The company said AI tools are being used at every stage of the operation - from generating name lists and email formats to scanning job platforms such as Upwork for suitable vacancies.
Applicants then tailor their applications based on the skills mentioned in job listings, making them appear more credible to employers.
Once employed, AI is reportedly used to draft emails, translate documents and generate code to avoid suspicion and maintain their positions.
Microsoft has urged companies to strengthen hiring checks, including conducting video or in-person interviews. Experts say potential deepfake images or videos can sometimes be identified through telltale signs, such as unusual pixelation around facial features or irregular lighting on AI-generated faces.