Discussion (121 Comments)
https://github.com/TanStack/router/issues/7383#issuecomment-...
the GitHub bot law: the GitHub bot situation is way worse than you imagine even if you are aware of the GitHub bot law.
yes, a cheap parody on Hofstadter's law, but that's how bad it is
And what? Just let the actor keep using them to spread to other people?
Always rotate your tokens immediately if they're compromised.
If it hurts, well, that sucks. …but seriously, not revoking the tokens just makes this worse for everyone.
A fair comment would have been: “it looks like the payload installs a dead-man's switch…”
Asking the maintainers not to revoke their compromised credentials deserves every down vote it receives.
The next five years are going to be truly WILD in the software world.
Air-gapped systems are gonna be huge.
It has been pulled from the npm registry now.
Going to Trusted Publishing / pipeline publishing removes the second factor that typically gates npm publish when working locally.
The story here, while it is evolving, seems to be that the attacker compromised the CI/CD pipeline, and because there is no second factor on the npm publish, they were able to steal the OIDC token and complete a publish.
Interesting, but unrelated I suppose, is that the publish job failed. So the payload that was in the malicious commit must have had a script that was able to publish itself w/ the OIDC token from the workflow.
What I want is CI publishing to still have a second factor outside of Github, while still relying on the long lived token-less Trusted Publisher model. AKA, what I want is staged publishing, so someone must go and use 2fa to promote an artifact to published on the npm side.
Otherwise, if a publish can happen only within the Github trust model, anyone who pwns either a repo admin token or gets malicious code into your pipeline can trivially complete a publish. With a true second factor outside the Github context, they can still do a lot of damage to your repo or plant malicious code, but at least they would not be able to publish without getting your second factor for the registry.
That is why I want 2fa before publish at the registry, because with my gh cli token as a repo admin, an attacker can disable all the Github branch protection, rewrite my workflows, disable the required reviewers on environments (which is one method people use for 2fa for releases: have workflows run in a GH environment which requires approval and prevents self review), enable self review, etc etc.
It's what I call a "fox in the hen house" problem, where your security gates live within the same trust model as the thing you expect to get stolen (in this case, a repo admin token exfiled from my local machine).
> We impose tag protection rules that prevent release tags from being created until a release deployment succeeds, with the release deployment itself being gated on a manual approval by at least one other team member. We also prevent the updating or deletion of tags, making them effectively immutable once created. On top of that we layer a branch restriction: release deployments may only be created against main, preventing an attacker from using an unrelated first-party branch to attempt to bypass our controls.
> https://astral.sh/blog/open-source-security-at-astral
From what I understand, you need a website login, and not a stolen API token, to approve a deployment.
But I agree in principle - The registry should be able to enforce web-2fa. But the defaults can be safer as well.
edit: two hard things in computer science: naming things, cache invalidation, off-by-one errors, security. something something
We (TanStack) just released our postmortem about this.
Is there any evidence that downstream packages which may have pulled in/included tanstack packages should be considered safe?
Crazy that an "orphan" commit pushed to a FORK(!) could trigger this (in npm clients). IMO GitHub deserves much of the blame here. A malicious fork's commits are reachable via GitHub's shared object storage at a URI indistinguishable from the legit repo. That is absolutely bonkers.
When I read that, I thought "they must be using 'fork' wrong, and actually mean a branch on the official repo, as that can't be right!?" Good lord.
https://gajus.com/blog/3-pnpm-settings-to-protect-yourself-f...
Just a handful of settings to save a whole lot of trouble.
Completely unforced fragmentation of the dependency manager space imo
On a related note, it seems to be impossible to find the documentation of min-release-age by googling it. Very annoying.
npm config set min-release-age 7
The '7' is days. This is the only format that worked for me, just a single integer number of days.
Confirmed by trying to install the latest version of React, 19.2.6 (published 5 days ago as of the time of this comment). It failed with an error confirming that it could not find such a version published before a week ago.
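For reference, `npm config set` just writes a key/value pair to your ~/.npmrc, so (assuming the key name from the command above is correct) the equivalent file entry is:

```ini
min-release-age=7
```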
If I see a package version dependency that looks like this: ^1.0.0 or even this: "*", then stop reading, pin it to a secure version immediately.
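A hypothetical before/after (`some-lib` is a made-up name): `"some-lib": "^1.4.2"` accepts any future 1.x release at or above 1.4.2, while an exact pin only ever resolves to the version you reviewed:

```json
{
  "dependencies": {
    "some-lib": "1.4.2"
  }
}
```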
I think `npx` might pull down new versions, too? I wish npm worked more like Elixir where updating the lock file was an explicit command, and everything else used the lock file directly.
And then use distro packages.
(I'm not accepting distro fragmentation as counterargument. With containerization the distro is something you can choose. Choose one, help there, and use it everywhere.)
it used to be that projects that pinned deps were called out as being less secure due to not being able to receive updates without a publish.
different times, different threat model I suppose
This is still the right advice for libraries. For security it doesn't matter a whole lot anymore, as package managers can force a transitive dependency's version, but it allows for much better transitive dependency deduplication.
For non-libraries it doesn’t matter as the exact versions get pinned in the package-lock.
- Python inline dependencies in PEP-0723, which you can pin with a==1.0, but can't be hash-pinned afaik.
- The bin package manager lets you pin binaries, but they aren't hash-pinned either.
- The pants build tool suggests vendoring a get-pants.sh script[0] but it downloads the latest. Even if you pass it a version, it doesn't do any checks on the version number and just installs it to ~/.local/bin
[0]: https://github.com/pantsbuild/setup/blob/gh-pages/get-pants....
Given the recent LPE (local privilege escalation) vulns, Docker 100% won't cut it.
And containers were never meant primarily as a security boundary anyways
https://news.ycombinator.com/item?id=48086082
https://news.ycombinator.com/item?id=48086082#48087028
https://news.ycombinator.com/item?id=48101453
2. NPM still not only publishes them, but also keeps distributing them for anything beyond 5 minutes.
Microsoft/GitHub/NPM can only keep repeating "security is our top priority" so many times. But NPM still doesn't detect these simple attacks, and we keep having this every week.
Jesus, that's vindictive.
This doesn't really feel sustainable, you're rolling the dice every time the dependencies are updated.
Again? How have lifecycle scripts not instantly been defaulted off? Yes, breaking things is bad, but come on, this keeps happening, the fix is easy, and if a *JavaScript* build relies on a dependency-of-a-dependency's build-time script, then it's worth paying in braincells or tokens to figure it out and fix the build process, or, lately, to uncover an exploit chain. This isn't even a compiled language.
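The opt-out the commenter is asking for already exists as an npm setting; turning lifecycle scripts off globally is one line in ~/.npmrc (you then invoke the rare build step you actually need explicitly):

```ini
ignore-scripts=true
```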
I am, however, concerned that this will pwn my workplace. We don't use Tanstack but this seems self-propagating and I doubt all of our dependencies are doing enough to prevent it.
Every package manager that does not analyze and run tests on the packages being uploaded (like Linux distros do) is vulnerable.
(I'm not being stupid, even ten years ago there were arguments on HN about whether you should audit your dependencies)
I landed on the 'yes, you should know what code you are getting involved with' side.
pnpm, deno, or bun, none of which will run the malicious "prepare" hook in the first place unless specifically allowed.
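For pnpm specifically (v10 blocks dependency build scripts by default), the allowlist lives in the `pnpm` field of package.json; a sketch, with `esbuild` as a stand-in for a package you've decided genuinely needs its build script:

```json
{
  "pnpm": {
    "onlyBuiltDependencies": ["esbuild"]
  }
}
```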
Go is closer to always locking dependencies unless you explicitly upgrade them with a `go get`, so it's much, much better in my view.
Yes, you can lock deps in NPM/Cargo/etc. but that's not the default. It is the default in Go.
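Concretely, Go records both a version and a content hash for every module in go.sum, so even a re-published tag with different contents fails verification. A representative excerpt (module path and hashes are made up for illustration):

```
github.com/example/pkg v1.2.3 h1:LPH1i4mEWXT1xVH8g6t4yHn7MzGW5dkvGNQAxzAyKuc=
github.com/example/pkg v1.2.3/go.mod h1:Qw4n5mzL0tW8cJ0eFZxS6kQ2vKk6sT1fY3hV9dQyhXk=
```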
In Go projects my policy for upgrading dependencies includes running full AI audit of all code changed across all dependencies, comes out to ~$200 in tokens every time but it gives those warm 'not likely to get pwned' vibes. And it comes with a nice report of likely breaking changes etc.
BTW a curated mirror of <whatever ecosystem> packages, where every package is guaranteed to have been analyzed and tested, could be an easy sell now. Also relatively easy to create with the help of AI. $200 every time is less pleasant than, say, $100/mo for the entire org.
Docker does something vaguely similar for Docker images, for free though.
How is it not the default in npm?
Idk about Python, I refuse to use that language for other reasons.
https://news.ycombinator.com/item?id=47582632
One of the worst ecosystems that has been brought into the software industry, and the attacks almost always arrive via NPM. Not even Cargo (Rust) or go mod (Go) get as many attacks, because at least the latter encourages you to use the standard library.
JavaScript and TypeScript, by contrast, have almost no standard library and push you to import hundreds of libraries, increasing the risk of a supply chain attack.
At this point, JS and TS are considered harmful.
Other ecosystems' package managers are really no different in a lot of ways.
NPM's biggest fault is just that it allows post/pre install scripts by default, without user intervention.
There are plenty of very popular packages with zero dependencies like Hono or Zod. If you decide to blindly install something with hundreds of deps it's on you.
That said, I do agree the JS standard library should provide a lot more than it does now.
The difference is that the usual C libraries don't split the project into small molecules for no good reason. You have to be as big as GTK to justify splitting a library, in my opinion.
NPM is the windows of package managers right now.
Please correct me if I'm wrong, but signed packages are still impractical in NPM, which is why supply chain attacks still work by editing existing versions or pushing new point releases without a signature.
Or, if you put all of the credentials in GitHub Actions, which is even more trivially exploitable through the Actions marketplace (because it is just git behind a thin proxy), you have an even wider attack vector.