Ethical Hacking News
AI-Powered Cybercrime: A New Era of Scams and Deception
Mentions of AI on dark web forums have risen by 371% since 2019, with replies rising almost twelvefold. So-called dark LLMs, self-hosted large language models built for crime rather than conversation, are increasingly popular, with prices starting as low as $30 a month. The trade in deepfakes and impersonation tools has surged since 2024, with complete synthetic identity kits selling for around $5. Deepfake fraud has caused significant losses, including $347 million in verified losses in a single quarter. Scam call centers are using synthetic voices for first contact, with language models coaching the human operators as they go. Criminal use of AI poses serious challenges for static defenses and makes it harder to identify attackers.
The world of cybercrime has undergone a significant transformation thanks to the advent of artificial intelligence (AI). What was once the realm of experimental tools and techniques is now readily available, off-the-shelf infrastructure that can be rented for a price. The latest data from Group-IB, a well-known cybersecurity firm, paints a grim picture of an AI-powered cybercrime landscape that has become increasingly sophisticated and widespread.
According to Group-IB's numbers, mentions of AI on dark web forums have skyrocketed by 371 percent since 2019, with replies rising even faster – almost twelvefold. This trend is not limited to a specific subset of users; instead, it reflects the broader shift towards AI-powered tools being used by cybercriminals to scale their scams and increase their reach.
The most notable aspect of this trend is the emergence of so-called dark LLMs: self-hosted large language models built for scams and malware rather than polite conversation. These models are specialized for criminal work, with subscriptions starting as low as $30 a month, making them an attractive option for cybercriminals looking to automate their attacks without breaking the bank.
Another trend that has gained significant attention is the booming trade in deepfakes and impersonation tools. Group-IB reports that complete synthetic identity kits, including AI-generated faces and voices, can now be bought for about $5. These tools have seen a sharp increase in sales since 2024, indicating a growing market for AI-powered identity theft and impersonation.
The implications of these trends are far-reaching and potentially devastating. Group-IB notes that deepfake fraud caused $347 million in verified losses in a single quarter, with cases including cloned executives and fake video calls. In one notable instance, the firm helped a bank spot more than 8,000 deepfake-driven fraud attempts across eight months.
Furthermore, Group-IB reports that scam call centers are using synthetic voices for first contact, with language models coaching the humans as they go. Malware developers are also starting to test AI-assisted tools for reconnaissance and persistence, hinting at more autonomous attacks in the future.
In a statement, Anton Ushakov, head of Group-IB's Cybercrime Investigations Unit, said that AI is "giving criminals unprecedented reach" and is having a profound impact on modern cybercrime. The technology, he noted, allows scams to be scaled with ease and hyper-personalized at a level never seen before, and there are concerns that autonomous AI could carry out attacks that once required human expertise.
From a defensive perspective, the use of AI by cybercriminals poses significant challenges for static defenses. When voices, text, and video can all be generated on demand with off-the-shelf software, it becomes much harder to work out who's really behind an attack. This leaves security teams struggling to keep up with the rapidly evolving threat landscape.
In conclusion, the rise of AI-powered cybercrime is a pressing concern that demands immediate attention from security experts and policymakers alike. As AI continues to shape the world of cybercrime, it's essential that we recognize its implications and develop effective countermeasures to combat this growing threat.
Related Information:
https://www.ethicalhackingnews.com/articles/The-Rise-of-AI-Powered-Cybercrime-A-New-Era-of-Scams-and-Deception-ehn.shtml
Published: Tue Jan 20 06:39:26 2026 by llama3.2 3B Q4_K_M