I tried my best to use this tool and many other all-in-one tools (asdf, rtx, etc.), but individual tools like tfenv, pyenv, and nvm feel way more ergonomic than the all-in-one tools, to the point that I don't mind having so many tools
Which has been turned into some kind of system aimed at generating/distributing F/OSS revenue based on usage via crypto. Pkgx, the package manager that drives it, used to be called 'tea'.
Here's a previous discussion: https://news.ycombinator.com/item?id=30778924
And a more recent one: https://news.ycombinator.com/item?id=33681216
This was at the "let's download the package repository to get started" step. So, meh...
1. Install Nix from https://nixos.org/download.html
2. Find the package from https://search.nixos.org/packages
3. Start a shell with `nix-shell -p <name>`, have the package available in it
Sorry, no.
> So, switch to using both?
This is an option, yes. Plus, you get the added benefit of not needing to run `brew doctor`, `brew cleanup`, etc. yourself, so you're not stuck with weird packages you installed once, never needed again, and forgot to clean up.
It's strange that people are so against declarative systems, or even file-based OS configuration. When I got my new MacBook I was up and running within a few minutes. I can't imagine maintaining a list of brews I need to re-install just to set up everything + my configs + everything else. Nix Darwin just made this so ridiculously easy.
Plus I can share almost all of my configuration with my Linux setups so I have a near-consistent environment whether I'm on Mac or Linux.
Remembering the package names for Brew, Apt, Snap, or whatever other package managers exist seems like a lot of overhead, and I just value declarative, reproducible systems. Managing my packages on the fly?
Sorry, no.
It surely is complex, but I think you're intrinsically devaluing the importance of dependency management. Your important 'stuff to do' is built on top of a lot of software; getting that supply chain right (and reproducible) is at least as important.
For example, if you have a JS project with a package.json, Nix offers node2nix as a way to transform that package.json into Nix-like expressions. But in an alternate universe npm and lockfiles would "work" well enough to where we wouldn't need to rely on nix for package pinning.
There's all this work put into reproducibility that assumes the answer is adding wrappers around the existing tooling. It's fine as a last resort, but if those efforts went into each language's ecosystem instead, maybe we would end up in a scenario where each packaging tool didn't have to come up with its own magic way of doing things.
Nix and Bazel are complex because they try too hard to work well despite the tooling, rather than getting the tooling to a place where all these layers of hacks wouldn't be needed. And so, downstream of that, "simple" tools become too complex from all the incidental complexity introduced by this way of doing things.
Nix the language doesn't have such a thing (would be a bit of a category error), and nixpkgs the ecosystem is so far away from that kind of thing...
Language ecosystems around sharing source code cannot exist as they do today if every dependency pinned its dependencies to specific versions. Source distribution like that has different requirements than binaries.
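A toy resolver sketch (package names, versions, and the resolution strategy here are all made up for illustration) shows why exact pins from two libraries can be unsatisfiable, while compatible ranges are not:

```python
# Toy dependency resolver: each constraint is a predicate on a version string.
# Everything here is hypothetical; real resolvers parse versions properly
# (these strings happen to compare correctly lexicographically).
available = {"x": ["1.0", "1.1", "2.0"]}

def resolve(constraints):
    """Pick one version per dependency satisfying every constraint, or None."""
    chosen = {}
    for dep, predicates in constraints.items():
        candidates = [v for v in available[dep] if all(p(v) for p in predicates)]
        if not candidates:
            return None  # conflict: no single version satisfies everyone
        chosen[dep] = candidates[-1]  # take the newest acceptable version
    return chosen

# Two libraries that each pin "x" to an exact version: no shared version exists.
pinned = {"x": [lambda v: v == "1.0", lambda v: v == "2.0"]}
# The same two libraries declaring compatible ranges: resolvable.
ranged = {"x": [lambda v: v >= "1.0", lambda v: v < "2.0"]}
```

With exact pins `resolve(pinned)` has no solution, while `resolve(ranged)` settles on a single version both consumers accept — which is why source-sharing ecosystems lean on ranges rather than pins inside libraries.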
This feels like a solution for a problem that doesn’t exist.
This is solved for many languages: have one directory per project (node_modules, target) and optionally a user cache so you don't have to redownload stuff. Or have one user cache, but still separate it by version. I think people know how painful C and C++ versioning is and don't want to do that again. Even with C/C++, CMake, Automake, and Bazel are there to wrangle package versions.
Though, I do agree things like Firefox or Thunderbird don't need to be hidden in a mysterious cache location.
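The "one user cache, but still separated by version" idea above can be sketched with a version-keyed directory layout (paths and the tool name are hypothetical):

```python
import os
import tempfile

# Stand-in for something like ~/.cache/mytool ("mytool" is a made-up name);
# a temp dir keeps this sketch self-contained.
cache_root = tempfile.mkdtemp()

def cache_dir(name: str, version: str) -> str:
    """Return (creating if needed) the cache slot for one exact version."""
    path = os.path.join(cache_root, name, version)
    os.makedirs(path, exist_ok=True)
    return path

# Two versions of the same package coexist side by side, no conflicts,
# which is the property C/C++-style global installs historically lacked.
old = cache_dir("leftpad", "1.0.0")
new = cache_dir("leftpad", "2.0.0")
```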
That’s the part I don’t get; this reads like it’s replacing brew. Libraries are entirely different from applications. I can easily see wanting a library temporarily to satisfy a version constraint, but something I’m going to use directly? Why?
If it’s for the interpreter (Python, for example), and someone has pinned it to a specific version rather than a floating requirement, that’s a problem with the author IMO. I shouldn’t need to install 3.7.3 when semver (if taken seriously, which Python does) states that >= 3.7 suffices.
I have no comment on Node because the entire ecosystem is a hellscape.
Ah, I think that's why then. Venvs are IMO the best of every example I've given. When you make one, Python is embedded in the venv, so your system Python doesn't really matter. As long as it creates a valid venv and embeds a valid version of Python, you're fine after sourcing the venv.
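A minimal sketch of that point using only the standard library: the venv gets its own interpreter, so code run through it reports the venv as its prefix, not the system install.

```python
import os
import subprocess
import sys
import tempfile
import venv

# Create a throwaway venv; the interpreter gets copied/symlinked into it.
vdir = tempfile.mkdtemp()
venv.create(vdir, with_pip=False)

# The venv's own python lives under bin/ (Scripts\ on Windows).
vpython = os.path.join(vdir, "bin", "python")

# Ask the embedded interpreter where it thinks it lives:
# the venv directory, not the system prefix.
result = subprocess.run(
    [vpython, "-c", "import sys; print(sys.prefix)"],
    capture_output=True, text=True,
)
venv_prefix = result.stdout.strip()
```

Sourcing `bin/activate` just puts that `bin/` directory first on `PATH`; the interpreter-embedding itself happens at creation time, which is why the system Python stops mattering.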
I believe every other one of those versions libraries, but not the go/cargo/rustc/node programs themselves. There are solutions for each language independently, just as there are for Python, but we don't have it as easy as you do with venvs, which take care of both at the same time!
But I do agree. If you're using something all the time, just brew install it, which is also what you've done with python I'm assuming. This just brings the power of venvs to every language.
Also, I'm not surprised that the author of brew chose to override such a monumentally important command as `env`... Smh
To each their own. If your workflow works for you, have at it.
> written by homebrew author (brew is notoriously slow)
> written in typescript
Hmmm... I have my doubts about that claim, especially when there's no evidence to support it.
$ bun
command not found: bun
^^ type `pkgx` to run that
$ pkgx
running `bun`…
Bun: a fast JavaScript runtime, package manager, bundler and test runner.
# …I'm also genuinely surprised they abandoned the sha256 from brew (e.g. "welp, it is what it is" https://github.com/pkgxdev/pantry/blob/main/projects/httpie.... ). Ah, it's an implied .sha256 path from their magic distribution something something: https://dist.pkgx.dev/?prefix=httpie.io/
I actually find it useful. Sometimes I want to try something temporarily for an intermediate result, and having it installed permanently doesn’t yield any benefit.
Good example: I generate AsyncAPI docs, and doing an npx is fine. The alternative approach of pulling a Docker image and then executing something with all the volume sharing, ports, and whatnot is a bit cumbersome, because I’m too lazy to press too many keys.
If Max added “offline caching” to pkgx, you could still continue to use utilities even after the shell session ended or if you lost Internet connectivity!
/s