Deepfake fraud: employee transfers 24 million euros into the void

Once again, an alarming story is making the rounds, this time from Hong Kong: criminals allegedly defrauded an international company of almost 24 million euros. To do so, they are said to have used artificial intelligence to imitate voices and replace faces in videos. According to the reports, the victim was persuaded in a video conference in which everyone else present is said to have been a computer-generated fake.

One could read this news as evidence that nothing can be believed anymore. If entire meetings can now be faked, the information apocalypse is surely imminent, and people can no longer trust their own eyes and ears.

Fortunately, the situation is not that dramatic, for three reasons. Firstly, the Hong Kong police themselves do not seem to be sure what actually happened. What is certain is that the fraudsters convinced an employee in the finance department to make a total of 15 transfers to five different accounts. Which technologies were used, however, remains unclear.

The victim is said to have initially received a message from the supposed head of finance saying that a secret transaction was necessary. At first, the employee became suspicious. Only a video call with several trusted colleagues and the head of finance in London dispelled those doubts. Unfortunately, none of the other participants were real.

Deepfake or doppelganger?

But that does not mean that all the faces were replaced in real time by so-called deepfakes. The police also consider it possible that the criminals played back previously recorded videos of those supposedly present and simply faked the voices. The fraudsters also allegedly did not interact directly with the victim and ended the meeting abruptly. This puts the narrative of the supposedly perfect fraud into perspective: there were probably plenty of warning signs.

The second reason not to overestimate the incident lies in the past. For years there have been reports of alleged deepfakes said to have caused millions in damage or political confusion. None of these cases has been proven with certainty. A voice impersonator could just as well be behind a supposedly computer-generated voice, and supposed real-time deepfakes of politicians later turned out to be doppelgangers or technically far simpler counterfeits.

For the third reason, you have to look back even further, for example into the SZ archive. A decade ago, a fraud scheme was described there that insurers call the "Fake President." There were no synthesized voices or perfectly faked videos back then. Nevertheless, fraudsters repeatedly managed to convince finance employees to transfer money. All it took was hacked or fake email addresses, meticulous preparation, and an air of authority.

Manipulation does not require technology

A pattern emerges that can also be observed on social media: you don’t need complex, technically perfect counterfeits to cause confusion. It is often enough to cleverly manipulate people by gaining their trust, putting them under pressure or saying things that fit into their world view.

The supposed boss calls from abroad and needs a large sum of money for a supposedly secret project, or someone shares a video from Aleppo in Syria and claims it shows ruins in Gaza City. The grandchild trick, in which criminals pose as relatives of elderly people, also usually works without any technology.

Whether it is a cheap fake, that is, a low-tech counterfeit, or a deepfake, one piece of advice from the Hong Kong investigators applies in every situation: anyone who is asked to transfer money urgently should be suspicious and, if in doubt, check with a colleague or call the police.