Scammers use AI to defraud exchanges, says Binance security chief


Amid the collective hysteria that Artificial Intelligence (AI) has unleashed worldwide, few are talking about its dangers, even though it poses a growing risk to users of bitcoin (BTC) and other cryptocurrencies traded on exchanges.

Fraudsters are increasingly using AI tools to create deepfakes aimed at circumventing exchanges’ KYC (Know Your Customer) verification. So warns Jimmy Su, chief security officer of Binance, the cryptocurrency exchange with the highest trading volume globally.

“The scammer will search the internet for a normal photo of the victim. Based on that, they can produce videos using deepfake tools to evade verification,” Su said, according to a Cointelegraph post.

Detailing the scammers’ modus operandi, Su adds that AI tools have become so sophisticated that they can even respond to audio instructions designed to verify that the applicant is human. In fact, software already exists that can do this in real time.

“For example, it requires the user to blink the left eye, or to look left, right, up or down. Deepfakes are now so advanced that they can actually execute these commands,” the security chief added.

Su added that although AI is becoming more advanced, it is “not yet at the level where it can fool a human operator”, since its output is still distinguishable from reality.

David Schwed, the chief operating officer of blockchain security firm Halborn, suggests that a useful way to quickly spot a deepfake is to watch the subject blink. If the blinking seems unnatural, the video is very likely manipulated with fraudulent intent.

This is because deepfake videos are generated from image files found on the internet, in which the subject will usually have their eyes open, Schwed explains. In a deepfake, therefore, the subject’s blinking must be simulated.
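Schwed’s blink heuristic can even be automated. The sketch below is a minimal, hypothetical illustration (not Halborn’s method): it assumes per-frame eye landmarks have already been extracted by some face-landmark detector, computes the widely used eye aspect ratio (EAR), and flags videos whose blink rate falls outside a typical human range. All function names, thresholds, and rate bounds are illustrative assumptions.

```python
import numpy as np

def eye_aspect_ratio(eye):
    """eye: (6, 2) array of landmarks p1..p6 around one eye.
    EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it drops sharply when the eye closes."""
    p1, p2, p3, p4, p5, p6 = eye
    vertical = np.linalg.norm(p2 - p6) + np.linalg.norm(p3 - p5)
    horizontal = np.linalg.norm(p1 - p4)
    return vertical / (2.0 * horizontal)

def blinks_look_natural(ear_series, fps=30, threshold=0.21,
                        min_per_min=8, max_per_min=40):
    """Count dips of the EAR time series below `threshold` (blink events)
    and check the blinks-per-minute rate against a typical human range.
    Returns (looks_natural, rate_per_minute)."""
    ear_series = np.asarray(ear_series, dtype=float)
    closed = ear_series < threshold
    # A blink event is a rising edge of the "eye closed" signal.
    blinks = int(np.sum(closed[1:] & ~closed[:-1])) + int(closed[0])
    minutes = len(ear_series) / fps / 60.0
    rate = blinks / minutes
    return min_per_min <= rate <= max_per_min, rate

# Synthetic 60-second clip at 30 fps: eyes open (EAR 0.30) with 15 brief blinks.
ear = np.full(1800, 0.30)
for i in range(15):
    ear[i * 120:i * 120 + 4] = 0.10   # 4 closed frames per blink
print(blinks_look_natural(ear))       # a plausible human blink rate
print(blinks_look_natural(np.full(1800, 0.30)))  # no blinking at all
```

The thresholds here are rough heuristics, and modern deepfake generators do simulate blinking, so a check like this is at best one weak signal among many, which matches Su’s point that current fakes can already execute liveness commands.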

Another measure users can take against deepfakes is to improve their privacy and anonymity, although these safeguards, curiously, conflict with the use of centralized exchanges that implement KYC procedures.

Beware of deepfakes that phish for cryptocurrencies

The security firm Kaspersky is also warning about the use of deepfakes to carry out various scams, such as cryptocurrency fraud, or to evade biometric security.

In some cases, scammers use images of celebrities or splice together old footage to stage live broadcasts on social networks. They display a pre-designed page asking victims to transfer a certain amount in cryptocurrency, promising to double any payment sent. Victims of these scams lose the entire amount transferred.

Kaspersky recommends staying alert to deepfake-related threats to avoid becoming a victim. To that end, it suggests paying attention to sudden movements in videos.

Such abrupt changes include shifts in lighting from one frame to the next, changes in a person’s skin tone, and strange flickering or a total absence of blinking. The firm also recommends checking that lip movements sync with the speech, looking for digital artifacts in the image, and being wary of videos deliberately encoded at low quality and with poor lighting.

Finally, it is advisable to maintain a skeptical attitude toward voice messages and videos. This will not guarantee that people are never deceived, but it can help them avoid many of the most common pitfalls. First and foremost, it’s best to verify rather than trust, Kaspersky says.

Image by: vadimjoker

