Beware of these false reports that defraud you

A video shows Guillaume Canet extolling the merits of a mobile application that would supposedly let you earn money, during what looks like a TF1 report. Shocked? Good, because it is, of course, a deepfake. The entirely fake video was made using artificial intelligence.

The actor himself shared the hoax on Instagram with the caption: “Can’t you give me a break with your nonsense… Here’s a good example of the downsides of artificial intelligence that can be found on social networks!”


False reports that affect many celebrities

Unfortunately, Canet is not the first victim of this devious manipulation. Personalities such as Kylian Mbappé, Élise Lucet and Stromae have previously been targeted by similar fake reports made to look like they came from BFMTV or France 3.

But what lies behind these misleading videos? To understand how the scheme works, we turned to two experts in artificial intelligence.

For Victor Baait, it is crucial to understand “that most of these casino applications require a deposit of money to play. The trap closes when victims deposit money but can never withdraw it, even if they win.”

These scams, which have been multiplying since the end of 2023, target four main areas: discounted products, medicines, the stock market and gambling.

To raise awareness of these fraudulent practices, Victor Baait catalogues these videos on his website. He notes: “These videos circulate widely on social networks, in particular via the Meta ad library.” In total, he has catalogued nearly 400 videos.

Some guidelines for detecting a deepfake

But how can we protect ourselves from these traps? Clara Bismuth, author of the book Synthetic Media – when deepfakes take over art, recommends: “It is essential to exercise discernment by analyzing the context and source of the video, as well as its visual and audio quality.”

Indeed, as in Guillaume Canet’s video, certain visual clues can give the game away: “You have to pay attention to lip movements that are poorly synchronized and to areas of blur. The voice may also sound robotic at times, and body movements can look distinctly unnatural.”

For Clara Bismuth, however, it is important to recognize that deepfake technology is not inherently malicious. It offers fascinating possibilities, especially in education and entertainment. “The most recent example I saw is the latest Indiana Jones: there is almost half an hour of film where we see a young Indiana Jones, thanks to deepfake technology applied to the actor’s face.”
