
Slopsquatting from LLM-hallucinated packages

April 5, 2026 · CloudSecurityAlliance

CloudSecurityAlliance warns that AI-generated code often references packages that do not exist, enabling attackers to register those names and distribute malware, turning LLM coding assistants into a supply chain risk.

20% of AI-generated code references packages that don't exist.
Attackers are registering those hallucinated package names and filling them with malware.
It's called "slopsquatting" — and it turns every AI coding assistant into a potential supply chain attack vector.
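One common mitigation for the risk described above is to screen AI-suggested dependencies against a list of names your team has already vetted before anything is installed. The sketch below illustrates the idea; the allowlist contents, the misspelled package name, and the helper function are hypothetical examples, not part of the CloudSecurityAlliance report. Name normalization follows the PyPI convention of lowercasing and treating underscores as hyphens.

```python
# Sketch: screen AI-suggested dependencies against a vetted allowlist
# before installing them, so a hallucinated (and possibly squatted)
# name never reaches `pip install`. Allowlist and inputs are
# illustrative only.

APPROVED = {"requests", "numpy", "flask"}  # packages your team has vetted

def screen_dependencies(suggested):
    """Split an AI-suggested dependency list into approved and suspect names."""
    approved, suspect = [], []
    for name in suggested:
        # Normalize the way PyPI does: case-insensitive, "_" == "-"
        normalized = name.strip().lower().replace("_", "-")
        (approved if normalized in APPROVED else suspect).append(normalized)
    return approved, suspect

ok, flagged = screen_dependencies(["requests", "flask_jwt_extendr"])
print(ok)       # ['requests']
print(flagged)  # ['flask-jwt-extendr'] -- hold for manual review
```

A flagged name is not necessarily malicious; it may simply be a typo or a package not yet reviewed. The point is that nothing outside the vetted set gets installed automatically, which is exactly the gap slopsquatting exploits.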
Tags: llm, supply chain, npm package, malicious code

