The Cloud Security Alliance warns that AI-generated code often references packages that do not exist, enabling attackers to register those names and distribute malware — turning LLM coding assistants into a supply chain risk.
Roughly 20% of the package references in AI-generated code point to packages that don't exist.
Attackers are registering those hallucinated package names and filling them with malware.
It's called "slopsquatting" — and it turns every AI coding assistant into a potential supply chain attack vector.