by Tim Leogrande, BSIT, MSCP, Ed.S.
🗓 APR 30 2026
According to the 2025 Annual Report from the FBI Internet Crime Complaint Center (IC3), Americans lost approximately $21 billion to cybercrime in 2025, marking the first time in the organization's 25-year history that annual complaints exceeded one million. The report highlights two emerging threats behind this surge: artificial intelligence and cryptocurrency.
Together, these technologies are reshaping cyber fraud in complementary ways. AI makes scams more convincing, scalable, and difficult to detect, while cryptocurrency makes stolen funds faster to move, harder to trace, and significantly more difficult to recover.
What makes AI-enabled fraud dangerous is not simply the use of new technology, but the level of trust it allows criminals to establish. Traditional scams often depended on obvious red flags, such as poor grammar, awkward phrasing, or suspicious communication patterns, that tipped victims off to the deception. AI has largely erased those warning signs, allowing scammers to create highly polished messages, realistic voice impersonations, and convincing digital identities that appear legitimate from the first contact.
For the first time in its 25-year history, the IC3 report includes a section dedicated to AI-related complaints, which totaled 22,364 in 2025 and accounted for nearly $893 million in losses.
<aside> 💡
Threat actors are using AI to produce deepfake videos of celebrities, clone voices of family members and company executives, and generate bogus identification documents to defraud their victims.
</aside>
Social media profiles that were once fairly easy to identify as fake are now sophisticated enough to let scammers carry on misleading correspondence with their targets for weeks at a time. AI is also making it easier for threat actors to carry out business email compromise (BEC) by posing as executives or vendors.

Source: IC3
One of the biggest concerns is the impact of AI on the economics of fraud. Scams that previously required experienced operators fluent in the target's language and familiar with local customs can now be carried out automatically, at minimal cost, and at scale.
AI has also lowered the barrier to entry for cybercriminals who may not have advanced technical skills. Tools that were once limited to sophisticated threat actors — such as malware development, phishing kit creation, and social engineering scripts — can now be generated with simple prompts using publicly available AI platforms. This allows less experienced scammers to launch highly convincing attacks without possessing deep expertise in cybersecurity, coding, social engineering, or language localization.
This shift creates serious implications for organizations of every size, particularly small- and mid-sized businesses that may lack mature cybersecurity programs. A single AI-generated phishing email that convincingly imitates a respected executive or trusted vendor can bypass normal suspicion and trigger unauthorized wire transfers, payroll diversion, or credential theft.
Because these attacks are increasingly personalized and context-aware, traditional spam filters and employee awareness programs alone are often insufficient. Companies must now combine technical controls with stronger verification procedures, such as secondary approval processes for financial transactions and out-of-band confirmation for sensitive requests.
If AI strengthens the deception side of cyber fraud, cryptocurrency strengthens the monetization side. At nearly $11.3 billion in losses across 181,565 complaints in 2025, cryptocurrency-related fraud accounted for the largest losses of any scam category.
The majority of these losses stem from investment fraud, particularly schemes built on romance baiting: threat actors contact victims through dating apps, social media, or text messages, build trust over weeks or months, and then lure them onto fraudulent trading platforms promising enormous returns. In broader reporting and law enforcement language, these schemes are often categorized as cryptocurrency investment fraud.