AI hallucinates package names and developers install them

Mar 30, 2024 · 1 min read

On Thursday, security researchers announced that ChatGPT frequently hallucinates package names. They noticed that one package in particular, huggingface-cli, was consistently recommended by ChatGPT despite not existing, so they published it themselves. The fake package averaged about 10k downloads per month for at least three months.

This is a variant of a well-known phenomenon where developers copy-paste code from the internet into their terminal. Often, there are good reasons for this, like struggling with syntax or a task involving lots of boilerplate.

But it’s potentially dangerous when the developer doesn’t understand what they’re copy-pasting, or when install hooks can be leveraged to attack the system. Given the increasing prevalence of supply chain attacks, developers should at the very least Google a package name before installing it.

Pressed for comment, ChatGPT said: [image: An attempt at a ChatGPT copy-paste macropad]
