Scammer Successfully Deepfaked CEO’s Voice To Fool Underling Into Transferring $359,000

The CEO of an energy firm based in the UK thought he was following his boss's urgent orders in March when he transferred funds to a third party. But the request actually came from the AI-assisted voice of a fraudster.

The Wall Street Journal reports that the mark believed he was speaking to the CEO of his business's German-based parent company. The German-accented caller told him to send €220,000 ($359,560) to a Hungarian supplier within the hour. The firm's insurance company, Euler Hermes Group SA, shared information about the crime with WSJ but would not reveal the names of the targeted businesses.

Euler Hermes fraud expert Rüdiger Kirsch told WSJ that the victim recognised his superior's voice because it had a hint of a German accent and the same “melody.” This was reportedly the first time Euler Hermes had dealt with clients affected by crimes that used AI mimicry.

Kirsch told WSJ the fraudster called the victim company three times. Once the transfer occurred, the attacker called a second time to falsely claim that the money had been reimbursed. Then the hacker reportedly called a third time to ask for another payment.

Even though the caller used the same fake voice, the third call came from an Austrian phone number and the “reimbursement” had not arrived, so the victim grew sceptical of the caller's authenticity and didn't comply.

AI-generated voice technology has become disturbingly realistic in recent months, and Kirsch told the Journal that he believes commercially available software was used to facilitate the fraudulent executive impersonation.

In May, the AI company Dessa released a simulation of podcaster Joe Rogan's voice that was a near-perfect replica of his gravelly timbre. It was so similar to the real thing that a longtime listener would have difficulty distinguishing between Joe Rogan and “Joe Fauxgan.”

As for the unidentified energy company's stolen money, it was reportedly sent to a Hungarian bank account, moved to an account in Mexico, and subsequently distributed to various other locations. No suspects have been identified.

Dessa demonstrated the technology in a blog post that discusses its societal implications and suggests a few ways deepfake voices could be used criminally, including election manipulation, impersonating family members, and gaining security clearance.

The blog states that “in the next few years (or even sooner), we’ll see the technology advance to the point where only a few seconds of audio are needed to create a life-like replica of anyone’s voice on the planet.”

The energy firm CEO was tricked by a fake AI-assisted voice two months before Dessa’s warning was posted.

