Discussion (109 Comments)
Which one?
Maven Central has existed for decades, and the number of incidents of people stealing namespaces is minimal.
One can't simply publish a package under the groupId "com.ycombinator" without having some way to verify ownership of the domain ycombinator.com. Then, once a package is published, it is 100% immutable, even if it has malicious code in it. In that case, the library is flagged everywhere as vulnerable.
It baffles me that NPM for so long couldn't replicate the same guardrails as Maven Central.
But in the Maven/Gradle ecosystem, most projects pin exact dependency versions. Support for version ranges and dynamic versions exist, but they are generally avoided because they hurt reproducible builds. That means a malicious new release does not automatically flow into most consumers’ builds just because it was published.
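In Gradle terms the difference looks roughly like this (jackson-databind is chosen purely as an illustration):

```kotlin
dependencies {
    // Pinned exact version: a newly published (possibly malicious)
    // release never flows into the build automatically.
    implementation("com.fasterxml.jackson.core:jackson-databind:2.17.1")

    // Dynamic version: resolves to whatever is newest at build time.
    // Supported, but generally avoided, for exactly this reason.
    // implementation("com.fasterxml.jackson.core:jackson-databind:2.+")
}
```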
I'd go as far as to say that NPM should:
1. Enforce scope (namespace) requirement, and require external verification (reverse DNS for example).
2. Disable version range support out of the box. Users must explicitly --enable this setting from the command line every time.
3. Remove support for install scripts completely. If someone wants to publish a ready-to-run software, there are plenty of other mechanisms.
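Points 2 and 3 can already be approximated locally with npm's own configuration; a minimal sketch (the temp directory is just for illustration — in practice this would be a per-project .npmrc):

```shell
# Approximate points 2 and 3 above with a per-project .npmrc.
# (Point 1, namespace verification, has no client-side equivalent.)
dir=$(mktemp -d) && cd "$dir"

# Point 2: always record exact versions instead of ^/~ ranges.
echo "save-exact=true" >> .npmrc
# Point 3: never run dependencies' pre/postinstall scripts.
echo "ignore-scripts=true" >> .npmrc

cat .npmrc
```

With ignore-scripts set, packages that genuinely need a build step must be compiled explicitly, which is exactly the point: the user intervenes.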
Attackers go where the victims are. Frontend is a monoculture with the vast majority using NPM; backend, less so. This isn't an excuse for NPM, but another strike against it.
You could also argue that the attacks make a deeper point about frontend vs backend devs, but I won't go there.
And that number tends to shrink even further as the ecosystem matures.
With many other languages, you have a lot of functionality out of the box. Certainly, there have been bugs and security issues, but they're a drop in the bucket compared to what you see in the JS ecosystem. With other languages, you have a much smaller external dependency graph, and the core functionality comes from a trusted party.
The issue isn’t that the functionality doesn’t exist, it’s always backwards compatibility with versions where it did not yet exist.
I'm not saying that npm is doing everything right, but I suspect that beyond the obvious low-hanging fruit that we hear about pretty consistently with npm there's probably a long tail of less obvious stuff that can be exploited that will not be specific to npm. The fundamental problems with supply-chain vulnerabilities aren't going to go away if npm magically became pip or go modules overnight.
Both the browser and Node.js standard libraries are fairly extensive. I don't think there's much you can do in other languages that you can't do with Node.js. And as a lot of newer languages have demonstrated (like Zig and Hare), you don't need an extensive one.
NPM's Achilles' heel is the pre/postinstall step, which can run arbitrary commands and shell scripts without the user having any way to intervene.
Dependencies must be run in isolated chroot sandboxes or, better, inside containers. That would be the only way to mitigate this problem, as the operating system's filesystem must be separated from the development workflow's filesystem.
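A rough sketch of that isolation using a throwaway container (Docker and the node:20 image are assumptions here; a registry proxy or network allowlist would tighten it further):

```shell
# Run the install step inside a container so pre/postinstall scripts
# only see the mounted project directory -- not the host filesystem,
# ~/.ssh keys, ~/.npmrc tokens, or the host's environment variables.
sandbox_install() {
  docker run --rm \
    -v "$PWD":/app -w /app \
    --env-file /dev/null \
    node:20 npm ci --ignore-scripts "$@"
}

# Usage, from the project root:
#   sandbox_install
```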
On top of that, most host-based firewalls are per-binary instead of per-command-line. Warnings and allowlist rules therefore end up granting network access to e.g. "python" or "node" as a whole, instead of to a specific invocation like "node myworm.js". So firewalls in general are pretty useless against this type of malware.
The mismatch is that you're thinking in policies, not assessments, here. Nothing in my normal Go workflow will ask me if I want to run "curl download whatever from the internet" when I run go build.
Though I agree with the difference in workflow, there is not a single mechanism in Go catching this. go.mod files can simply be patched by the worm, and/or hidden behind a /v123 folder or whatever to play shenanigans with API differences.
Examples that come to mind: webview/webview, webkit, cilium/ebpf and most other CGo projects that I have seen.
Something fascinating about the design and architecture of programming languages and their surrounding ecosystems is the enormous leverage that they provide to the "core team":
For every 1 core language developer[1]...
... there may be 1,000 popular package developers...
... for which there may be 1,000,000 developers writing software...
... for over 1,000,000,000 users.
This means that for every corner that is cut at the top of that pyramid, the harms are massively magnified at the lower tiers. A security vulnerability in a "top one thousand" package like log4j can cause billions of dollars in economic damage, man-centuries of remediation effort, etc.
However, bizarrely, the funding at the top two levels is essentially a pittance! Most such projects are charities, begging for spare change with hat in hand on a street corner. Some of the most used libraries are often volunteer efforts, despite powering global e-commerce! cough-OpenSSL-cough.
The result is that the people most empowered to fix the issues are the least funded to do so.
This is why NPM, Crates.io, etc... flatly refuse to do even the most basic security checks like adding namespaces and verifying the identity of major publishers like Google, Microsoft, and the like.
That's a non-zero amount of effort, and no matter how trivial to implement technically or how cheap to police, it would likely blow their tiny budget of unreliable donations.
The exceptions to this rule are package managers with robust financial backing, such as NuGet, which gets reliable funding from Microsoft and supports their internal (for-profit!) workflows almost as much as it does external "free" users.
"Free and open" is wonderful and all, but you get what you pay for.
[1] Most of us can name them off the top of our heads: Guido van Rossum, Larry Wall, Kernighan & Ritchie, etc.
In addition, crates.io has not flatly refused to support namespaces, there's an entire accepted RFC for it: https://github.com/rust-lang/rfcs/pull/3243
At the same time, note that namespacing does nothing to prevent any sort of problem here. Namespacing is great for package organization and making provenance more deliberately obvious, but beyond that it's not a security measure.
Why cooldowns? Most npm (or PyPI) compromises were taken down within hours. Cooldowns simply mean: ignore any package whose release date is younger than N days (1 day can work, 3 days is OK, 7 days is a bit of an overkill but works too).
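The check itself is trivial date arithmetic; a sketch (in practice the publish timestamp would come from something like `npm view <pkg> time --json`, and the second `date` invocation is just a BSD fallback for the GNU `-d` flag):

```shell
# Succeed if the given ISO-8601 publish time is at least $2 days old,
# i.e. the version has survived the cooldown window.
is_past_cooldown() {
  ts=$1; days=$2
  publish=$(date -u -d "$ts" +%s 2>/dev/null \
    || date -u -j -f "%Y-%m-%dT%H:%M:%SZ" "$ts" +%s)
  now=$(date -u +%s)
  [ $(( now - publish )) -ge $(( days * 86400 )) ]
}

is_past_cooldown "2020-01-01T00:00:00Z" 7 && echo "past cooldown"
```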
How to set them up?
- use latest pnpm, they added 1 day cooldown by default https://pnpm.io/supply-chain-security
- or, if you want a one-click fix, use https://depsguard.com (a CLI that adds cooldowns plus other recommended settings to npm, pnpm, yarn, bun, uv, and dependabot; I'm the maintainer)
- or use https://cooldowns.dev which is more focused on, well, cooldowns, along with a script to help set it up locally
All are open source / free.
If you know how to edit your ~/.npmrc etc, you don't really need any of them, but if you have a loved one who just needs a one click fix, these can likely save them from the next attack.
Caveat: if you need to patch a new critical CVE, you need to bypass the cooldown, but each of them has a way to do so. In the past few weeks, while I don't have hard numbers, it seems more risk has come from software supply chain attacks (malicious versions pushed) than from new zero-day CVEs (even in the age of Mythos-driven vulnerability discovery).
If every other week I noticed the FDA recalling a popular brand that would have taken over my brain and transmitted my bank password and SSN to a stranger, I might prefer drinking week-old milk.
Edit: not dismissing your analogy, it’s pretty much it.
> Disclaimer: I maintain depsguard
Edit: added it back, inline.
Teams should be able to say "at least N developers have to agree to a release before it happens." This should be a policy they can control and lock down with a non developer account.
I think that npm can have its own cooldown and automated security scanning. Socket.dev and StepSecurity both close a gap here by spending tokens to scan new popular packages. Whether they do it for marketing or out of the goodness of their hearts is irrelevant. They don't charge for this service, and it's something I'd expect Microsoft (who owns GitHub, who owns npm) to do.
Postinstall scripts run at install time, with the installer's privileges.
It's childish to believe that because you can't fix everything you shouldn't fix anything. Defense in depth.
You don't need to test a compromised package to have it execute code. Importing it anywhere in your tests is enough, even transitively.
It's for sure less likely to run but I doubt it's significantly different in practice.
This is definitely going to affect any packages that need to link to native code and/or compile shims, but these are very few.
But so is the regular code in those packages! It won't run at install time, but something in there will run -- otherwise it wouldn't have been included in the dependencies.
Thinking that eliminating post-install scripts will have more than a momentary impact on exploitation rates is a sign of not thinking the issue through. Unfortunately the issue is much more nuanced than TFA implies -- it's not at all a case of "Let's just stop putting the wings-fall-off button next to the light switch", it's that the thing we want to prevent (other people's bad code running on our box) cannot be distinguished from the thing we want (other people's good code running on our box) without a whole lot of painstaking manual effort, and avoiding painstaking manual effort is the only reason we even consider running other people's code in the first place.
RubyGems: https://www.sonatype.com/blog/anatomy-of-the-rubygems-rest-c...
PyPI: literally the latest attack included publishing malicious packages on PyPI.
XZ Utils, part of nearly every Linux distribution, nearly merged in code to backdoor SSH: https://www.akamai.com/blog/security-research/critical-linux...
It is just easy pickings to blame npm specifically. Yes, they share some part of the blame, but no package manager is immune from attack, and certainly not ones where the attackers exploited being able to extract secrets from a developer's environment variables or files. Seems more like developers should be managing their secrets better?
I also find the meme that this title snowclones to be in bad taste.
This mitigates things to a great extent.
I do not know who thought that having your dependencies depend on the internet, with a zillion users doing stuff to each package, was a good idea for enterprise environments...
It is crazy how much things can get endangered this way.
In fact, pip is much more dangerous than npm because it lacks a lockfile. uv fixes that, but adoption is proceeding at a snail’s pace.
In the JS world there is plenty of competition for package managers: pnpm, yarn, and bun are all viable alternatives to npm the package manager.
Public registries for languages tend to coalesce around one service. Nobody wants to publish their library to 4 different registries.
https://pip.pypa.io/en/stable/cli/pip_lock/
But who cares about pip, uv is here.
The other one a few days ago was also good: https://nesbitt.io/2026/02/03/incident-report-cve-2024-yikes...