AI Scheme Sees $5.6 Million Stolen From FTX Claim Buyers

  • A sophisticated fraud scheme has been uncovered that duped companies into purchasing fake FTX liquidation claims
  • The scam, involving AI-generated deepfake videos, resulted in over $5.6 million being stolen
  • The fraudsters used advanced technology to impersonate legitimate claim holders, deceiving buyers into acquiring non-existent assets

Data analytics firm Inca Digital has identified a complex scam in which deepfake technology was employed to dupe companies into buying fraudulent FTX liquidation claims, leading to losses exceeding $5.6 million. The perpetrators convincingly posed as legitimate claimants, leveraging AI-generated videos to mask their true identities and sell fictitious assets. The case underscores the increasing sophistication of AI-enabled fraud in the crypto industry, a trend about which warnings have already been issued.

AI and Stolen Data Fuel the Fraud

Inca Digital revealed the mechanics of the scam in a report showing that the fraudsters, operating under the names Lim Chee Chong and Teh Jin Loon, deceived at least two companies into purchasing non-existent FTX claims. The stolen funds were then rapidly moved through various crypto exchanges, making them difficult to trace.

Investigators believe the perpetrators used AI face-swapping technology to alter their appearance during due diligence calls with potential buyers, making this one of the first scams to use deepfake technology to such an extent. The fraudulent actors also submitted fake Singaporean identification documents, which displayed inconsistencies compared to legitimate government-issued IDs.

The scam’s success was largely due to unauthorized access to FTX creditor data, says Inca. The perpetrators likely sourced information from publicly available bankruptcy filings or from Kroll, the firm responsible for managing FTX’s claims, which suffered a security breach in August 2023 following a SIM swap attack. That breach exposed non-sensitive customer data, including names, phone numbers, and email addresses, details that may have been used by the fraudsters to impersonate legitimate claim holders.

Laundering Through Crypto Exchanges

Following the fraudulent sales, the stolen funds were funneled through major cryptocurrency platforms, including Binance, CoinEx, and Gate.io. Blockchain analysis linked the scam to fresh wallets that had only received minor initial transactions before suddenly receiving large sums in USDC, with Inca pointing out that the movement of funds was “carefully structured to avoid detection, making it clear this was a well-planned operation.”
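
The pattern described here, a handful of tiny seeding transactions followed by a sudden large USDC inflow into an otherwise unused wallet, is exactly the kind of signal chain-analysis tools are built to flag. As a purely illustrative sketch (the report does not disclose Inca’s actual methodology, and the function name and dollar thresholds below are invented for the example), such a heuristic might look like this in Python:

# Hypothetical sketch only; not Inca Digital's actual methodology.
from dataclasses import dataclass

@dataclass
class Transfer:
    token: str      # e.g. "USDC" or "ETH"
    amount: float   # USD-equivalent value of the transfer
    incoming: bool  # True if the wallet received the funds

def looks_like_fresh_mule(history: list[Transfer],
                          small_cap: float = 100.0,       # invented threshold
                          large_floor: float = 100_000.0  # invented threshold
                          ) -> bool:
    """Flag a wallet whose only prior activity is a few tiny deposits
    followed by one sudden, large stablecoin inflow."""
    deposits = [t for t in history if t.incoming]
    if len(deposits) < 2:
        return False
    *early, last = deposits
    return (all(t.amount <= small_cap for t in early)
            and last.token == "USDC"
            and last.amount >= large_floor)

# A wallet seeded with dust, then hit with a large USDC transfer:
wallet = [
    Transfer("ETH", 20.0, True),          # small gas top-up
    Transfer("USDC", 50.0, True),         # tiny test transfer
    Transfer("USDC", 1_200_000.0, True),  # sudden large inflow
]
print(looks_like_fresh_mule(wallet))      # True

In practice, analysts layer many such signals on top of exchange deposit-address clustering, which is how the funds were tied to Binance, CoinEx, and Gate.io.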

Interestingly, investigators also found a connection between the fraud’s deposit address at CoinEx and an MEV (Maximal Extractable Value) bot associated with Symbolic Capital Partners. While no direct link has been confirmed, the findings raise concerns about potential overlaps between high-frequency trading operations and illicit financial activity.

Worrying Development in Deepfake Use

Inca Digital’s CEO, Adam Zarazinski, told CoinDesk that the use of AI in this way represents a worrying development:

The use of deepfake technology in this fraud represents a significant escalation in the methods employed by cybercriminals. It highlights the urgent need for enhanced verification processes in financial transactions.

This incident serves as a stark reminder of the vulnerabilities in the rapidly evolving cryptocurrency landscape. The integration of advanced technologies like AI into fraudulent schemes poses new challenges for security and verification protocols. Industry experts advocate for the adoption of more robust authentication measures and continuous monitoring to detect and prevent such deceptive practices.

As AI technology becomes increasingly accessible, the potential for its misuse in financial fraud grows. Stakeholders within the cryptocurrency sector are urged to remain vigilant and implement comprehensive safeguards to protect against these sophisticated threats.
