Impersonation Scams Explode 1,400% as AI Weaponizes Fraud: Chainalysis
Impersonation scams surged 1,400% in 2025 as AI tools made fraud 4.5x more profitable, driving total crypto theft to $17 billion.
Cryptocurrency fraud extracted a staggering $17 billion from investors in 2025, driven by a 1,400% surge in impersonation scams. A new Chainalysis report reveals that artificial intelligence has transformed low-level phishing into high-efficiency, industrial-scale theft.
The AI Multiplier
The barrier to entry for sophisticated fraud has collapsed. AI-enabled scams are now 4.5 times more profitable than traditional methods, with the average victim loss jumping 253% to $2,764. Automated tooling lets attackers run personalized social-engineering campaigns at a scale that would be impossible to manage manually.
This is not random noise. It is a structural shift in criminal unit economics.
AI-driven scam operations generated 4.5 times more revenue than traditional schemes… netting $3.2 million from a single attack on average.
Industrialized Phishing: The ‘Lighthouse’ Kit
The surge is powered by "Crime-as-a-Service" vendors. Google recently filed a lawsuit against the operators of Lighthouse, a Chinese-language phishing kit used by the "Smishing Triad" group. This infrastructure enabled scammers to blast 330,000 SMS messages daily, stealing over $1 billion from a million victims across 121 countries.
The kit provided templates for the massive "E-ZPass" toll collection scam, which targeted millions of U.S. residents with convincing payment demands. These campaigns are no longer sloppy attempts; they are pixel-perfect replicas of legitimate brands, deployed at nation-state scale.
Institutional Targets: The Coinbase Case
The impact is visible in high-value targets. In December, the Brooklyn District Attorney charged 23-year-old Ronald Spektor with stealing nearly $16 million by impersonating Coinbase customer support. Spektor allegedly used customer data obtained through a bribery scheme to contact users, claim their accounts were compromised, and trick them into transferring funds to "secure" wallets.
This marks a pivot from technical exploits to psychological manipulation. As DeFi protocols harden their code, attackers are aggressively targeting the weakest link: the user.