AI Drives Rise in CEO Impersonator Scams | Cyber crooks are using deepfake voice and videos of top executives to bilk companies out of millions of dollars; ‘No longer a futuristic concept’

https://www.wsj.com/articles/ai-drives-rise-in-ceo-impersonator-scams-2bd675c4


  1. “In the U.S., there were more than 105,000 deepfake-related attacks last year—or roughly an attack every five minutes—a massive jump from 2023, said Brian Long, chief executive and co-founder of OpenAI-backed cybersecurity firm Adaptive Security.

    “A year ago, maybe one in 10 security executives I spoke to had seen one. Now it’s closer to five in 10,” said Long, whose firm specializes in guarding against AI-powered social engineering attacks.

    Companies reporting CEO impersonation attacks include carmaker Ferrari, cloud-security company Wiz, and advertising firm WPP. In one well-documented case cited by multiple cybersecurity experts, an employee of U.K.-based engineering firm Arup transferred $25 million to fraudsters last year, after a video meeting with AI-generated impersonations of several company executives.

    In a typical deepfake scam, an office worker with privileged access to a company’s inner operations, such as a finance manager, executive assistant or senior software engineer, receives a call from a fake CEO or other senior official with an urgent matter. Often it involves a sudden breakthrough in a sought-after acquisition or merger, or a similar high-stakes deal.

    The initial call is then followed by a one-on-one virtual meeting with a convincing video of the official, giving the worker specific instructions for wiring emergency funds into a special-purpose account, transmitting business data or security credentials to an encrypted email address, or simply clicking on an emailed link with malicious code.

    Like many virtual assistants, the fake official’s automated image is designed to respond to questions or comments in near real time, mimicking a natural, conversational manner—including a recognizable tone of voice, speech pattern or even telltale accent.”

  2. If your company allows $25 million deals to be executed on the basis of a video conference, there’s something fundamentally flawed.

  3. If your company can get fooled just because someone sounds and looks like a confident CEO, then it deserves to fail.

  4. I say, good. About time people started preying on CEOs and corporations instead of people’s grandmothers and small businesses.