A purpose-built AI security agent detected 92% of exploited vulnerabilities in DeFi smart contracts in a new open source benchmark.
The study, released Thursday by artificial intelligence security firm Cecuro, evaluated 90 real-world smart contracts exploited between October 2024 and early 2026, with verified losses totaling $228 million. The specialized system flagged vulnerabilities tied to $96.8 million of those exploits, while a baseline coding agent built on GPT-5.1 detected only 34% of the vulnerabilities, covering just $7.5 million in losses.
Both systems run on the same leading-edge model. The difference lies in the application layer: domain-specific workflows, structured review stages and DeFi-centric security heuristics built on top of the model, the report said.
The findings come amid growing concern that artificial intelligence is accelerating cryptocurrency crime. Independent research from Anthropic and OpenAI shows that AI agents can now perform end-to-end exploits against most known vulnerable smart contracts, with exploit capability reportedly doubling roughly every 1.3 months. At an average cost of about $1.22 per contract, an AI-driven vulnerability attempt is cheap enough to make large-scale scanning economically viable.
Previous reporting by CoinDesk outlined how bad actors such as North Korea are beginning to use artificial intelligence to expand hacking operations and automate parts of the exploitation process, highlighting the widening gap between offensive and defensive capabilities.
The report argues that many teams rely on general-purpose AI tools or one-time security audits, an approach the benchmark suggests can miss high-value, complex vulnerabilities. Some contracts in the dataset had been professionally audited before being exploited.
The benchmark dataset, evaluation framework and baseline agent are open source on GitHub. The company said it has not yet released its full security agent, citing concerns that such tooling could be repurposed for offensive use.