Scammers Use AI to Fake CEO’s Voice & Fool Staff into Transferring €220,000

Scammers just got a whole lot more sophisticated, and deepfake-enabled fraud is set to become a pillar of criminal activity targeting both private individuals and corporations.


A group of scammers targeted a UK-based energy firm and fooled staff into transferring $243,000 (AUD $360,000) to a third party using artificial intelligence and deepfake technology, according to a report from the Wall Street Journal.


In March of this year, the CEO of the unnamed energy firm received a call, supposedly from the CEO of his company's parent firm, which was based in Germany. The caller had a German accent and requested the transfer of €220,000 to a Hungarian supplier, to which the UK-based CEO agreed.


In reality, the UK-based CEO being asked to transfer the money was speaking to sophisticated deepfake technology: an artificial intelligence algorithm disguising the fraudster's voice and imitating that of the German CEO.


It was convincing enough for him to authorise the payment to a Hungarian bank account.


According to reports, Rüdiger Kirsch, a fraud expert at insurance company Euler Hermes Group, said that "the victim recognised his superior's voice because it had a hint of a German accent and the same melody." This was reportedly the first time Euler Hermes had dealt with clients affected by crimes that used AI mimicry.


Kirsch said that "the fraudster called the victim company three times. Once the transfer occurred, the attacker called a second time to falsely claim that the money had been reimbursed. The hacker then reportedly called a third time to ask for another payment. Even though the same fake voice was used, the last call was made from an Austrian phone number, and since the 'reimbursement' had not gone through, the victim grew more skeptical of the caller's authenticity and didn't comply."


While the firm didn't comply with the fraudster's subsequent requests, the initial payment had already been made; it was funnelled through the Hungarian bank account to an account based in Mexico and then scattered across a number of other accounts.


Artificial intelligence company Dessa has previously released deepfake versions of Joe Rogan, as well as other extremely convincing imitations of popular figures made to say whatever the company desires. In the accompanying blog post, the company says that deepfake technology is set to get exponentially better and more convincing, and will almost certainly be used in fraudulent circumstances to dupe people in cases like this one: "In the next few years (or even sooner), we'll see the technology advance to the point where only a few seconds of audio are needed to create a life-like replica of anyone's voice on the planet."


© 2020 by Best Practice
