Now if only someone could do the same for shell scripts. Packaging, dependency management, and reproducibility in shell land are still stuck in the Stone Ages. Right now it’s still curl | bash and hope for the best, or a README with 12 manual steps and three missing dependencies.
Sure, there’s Nix... if you’ve already transcended time, space, and the Nix manual. Docker? Great, if downloading a Linux distro to run sed sounds reasonable.
There’s got to be a middle ground: simple, declarative, and built for humans.
But: it’s the same skill set for every one of those things. This is why it’s an investment worth making IMO. If you’re only going to ever use it for one single thing, it’s not worth it. But once you’ve learned it you’ll be able to leverage it everywhere.
Python scripts with or without dependencies, uv or no uv (through the excellent uv2nix which I can’t plug enough, no affiliation), bash scripts with any dependencies you want, etc. suddenly it’s your choice and you can actually choose the right tool for the job.
Not trying to derail the thread but it feels germane in this context. All these little packaging problems go away with Nix, and are replaced by one single giant problem XD
ChatGPT writes pretty good nix now. You can simply paste any errors in and it will fix them.
#! /usr/bin/env nix-shell
#! nix-shell -i bash -p imagemagick cowsay
# scale image by 50%
convert "$1" -scale 50% "$1.s50.jpg" &&
cowsay "done $1.s50.jpg"
Sure, all of NixOS and packaging for Nix is a challenge, but just using it for a shell script is not too bad[0]. If you do care, then welcome to learning about niv, flakes, etc.
[0]: admittedly 3 years ago or so.
The most shocking part about Nix is nix-shell (I know I can use it on other distros, but hear me out): it's literally so cool for installing things for one-off projects.
Want to record a desktop? It's one of those tasks I do quite infrequently, and I didn't like how on Arch I either had to keep obs as a permanent dependency of my system or uninstall it after each use. Ephemerality was a concept I was looking for before Nix, since I always like to try out new software and keep my home system kind of minimalist-ish. Cool: nix-shell -p obs-studio, run obs, and you've got this.
Honestly, I like a lot of things about Nix. I still haven't gone too far into the flakes side of things and sadly just use it imperatively, but I found out that Nix builds are sandboxed, so I had the unique idea of using it as a sandbox to run code from Reddit, and I think I am going to do something cool with it. (Building something like codapi. codapi's creator is kinda cool; if you are reading this mate, I'd love talking to ya.)
And I personally almost feel as if some software could truly be made plug-n-play. Imagine Hetzner having NixOS machines (currently I have heard its support is finicky): I feel we could get something really, really close to DigitalOcean droplets, plug-n-play, without the isolation that Docker provides. Docker has its own use cases, but I almost feel managing Docker stuff is harder than Nix stuff. Feel free to correct me, as I am just saying what I'm feeling using Nix.
I also wish there were something like a functional Lua (does functional Lua exist??) -> Nix transpiler, because I'd like to write Lua instead of Nix to manage my system. But I guess Nix is fine too!
IMO it should stay that way, because any script that needs those things is way past the point where shell is a reasonable choice. Shell scripts should be small, 20 lines or so. The language just plain sucks too much to make it worth using for anything bigger.
> any script that needs those things
It's not really a matter of needing those things, necessarily. Once you have them, you're welcome to write scripts in a cleaner, more convenient way. For instance, all of my shell scripts used by colleagues at work just use GNU coreutils regardless of what platform they're on. Instead of worrying about differences in how sed behaves with certain flags, on different platforms, I simply write everything for GNU sed and it Just Works™. Do those scripts need such a thing? Not necessarily. Is it nicer to write free of constraints like that? Yes!
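For instance, under that GNU-only assumption, in-place edits just work (a toy sketch; the file and its contents are made up):

```shell
# GNU sed's -i takes no argument; BSD/macOS sed would need `-i ''`.
# Targeting GNU sed everywhere means never juggling that difference.
f=$(mktemp)
printf 'color=red\n' > "$f"
sed -i 's/red/blue/' "$f"    # GNU form; fails on stock macOS sed
cat "$f"                     # prints: color=blue
```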
Same thing for just choosing commands with nicer interfaces, or more unified syntax... Use p7zip for handling all your archives so there's only one interface to think about. Make heavy use of `jq` (a great language) for dealing with structured data. Don't worry about reading input from a file and then writing back to it in the same pipeline; just throw in `sponge` from moreutils.
> The language just plain sucks too much
There really isn't anything better for invoking external programs. Everything else is way clunkier. Maybe that's okay, but when I've rewritten large-ish shell scripts in other languages, I often found myself annoyed with the new language. What used to be a 20-line shell script can easily end up being 400 lines in a "real" language.
I kind of agree with you, of course. POSIX-ish shells have too much syntax and at the same time not enough power. But what I really want is a better shell language, not to use some interpreted non-shell language in their place.
#!/bin/bash
make $1
Has multiple possible problems with it. There is no package manager that is going to make a shell script I write for macOS work on Linux if that script uses commands that only exist on macOS.
If you're allowed to install any deps go with uv, it'll do the rest.
I'm also kinda in love with https://babashka.org/ check it out if you like Clojure.
We use it at $work to manage dev envs and its much easier than Docker and Nix.
It also installs things in parallel, which is a huge bonus over plain Dockerfiles
Knowing the Python packaging ecosystem, uv could very well be replaced by something else. It feels different this time, but we won't know for a while yet.
I guess the issue is just that nobody has documented how to do that one specific thing so you can only learn this technique by trying to learn Nix as a whole.
So perhaps the thing you're envisaging could just be a wrapper for this Nix logic.
There's a reason why distroless images exist. :)
Hmm, last time I checked, uv installs into ~/.local/share/uv/python/cpython-3.xx and cannot be installed globally, e.g. inside a minimal Docker image without any other Python.
So basically it still runs in a venv.
https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
And there was a March discussion of a different blog post:
https://news.ycombinator.com/item?id=43500124
I hope this stays on the front page for a while to help publicize it.
At the end of the article it shows an MCP to fetch YouTube subs. I've made a similar one using simonw's llm, as a fragment, if you find it useful.
llm -f youtube:<id>
llm -f yt:<lang>:<id>
https://github.com/redraw/llm-fragments-youtube

I'm also looking forward to the maturity of Rust-based type checkers, but solely because one can almost always benefit from an order of magnitude improvement in speed of type checking, not because Python type-checking is a "shit show".
I do grant you that for outsiders, the fact that the type checker from the Python organization itself is actually a second rate type checker (except for when one uses Django, etc, and then it becomes first-rate) is confusing.
Strangely, I've found myself building personal cross platform apps in game engines because of that.
And uv is fast — I mean REALLY fast. Fast to the point of suspecting something went wrong and silently errored, when in fact it did just what I wanted but 10x faster than pip.
It (and especially its docs) is a little rough around the edges, but it's bold enough and good enough that I'm willing to use it nonetheless.
But that ship has sailed for me and I'm a uv convert.
Thought I was the only one thinking this. Got to open an issue, I think it would be nice to have some more examples showcasing different use cases.
uvx --with mkdocs-material --with mkdocs-material-extensions --with mkdocs-nav-weight mkdocs serve -a localhost:1337
Funnily enough it also feels like it is starting faster.

Hopefully more languages follow suit on this pattern, as it can be extremely useful in many cases, such as passing gists around, writing small programs that might otherwise be written as shell scripts, etc.
I still do because:
- Go gives me a single binary
- Dependencies are statically linked
- I don’t need any third-party libs in most scenarios
- Many of my scripts make network calls, and Go has a better stdlib for HTTP/RPC/Socket work
- Better tooling (built-in formatter, no need for pytest, go vet is handy)
- Easy concurrency. Most of my scripts don’t need it, but when they do, it’s easier since I don’t have to fiddle with colored functions, external libs, or, worse, threads.
That said, uv is a great improvement over the previous status quo. But I don’t write Python scripts for reasons that go beyond just tooling. And since it’s not a standard tool, I worry that more things like this will come along and try to “improve” everything. Already scarred and tired in that area thanks to the JS ecosystem. So I tend to prefer stable, reliable, and boring tools over everything else. Right now, Go does that well enough for my scripting needs.
I still use both Go and Python. But Python gives me access to a lot more libraries that do useful stuff. For example the YouTube transcript example I wrote about in the article was only possible in Python because afaik Go doesn't have a decent library for transcript extraction.
$ uv run --python=3.13 --with-requirements <(uv export --script script.py) -- python
>>> from script import X
I'd love if there were something more ergonomic, like:
$ uv run --with-script script.py python
Edit: this is better:
$ "$(uv python find --script script.py)"
>>> from script import X
That fires up the correct python and venv for the script. You probably have to run the script once to create it.

You can make `--interactive` or whatever you want a CLI flag from the script. I often make small Typer CLIs with something like that (in another dev script like this, I have `--sql` for entering a DuckDB SQL repl).
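A sketch of that flag pattern, using stdlib argparse instead of Typer so it has no dependencies (names and the computed value are made up for illustration):

```python
# "--interactive as a CLI flag": run normally by default, or drop into a
# REPL with the script's computed objects in scope.
import argparse
import code

def build_parser() -> argparse.ArgumentParser:
    p = argparse.ArgumentParser(prog="devtool")
    p.add_argument("--interactive", action="store_true",
                   help="drop into a REPL with the script's objects in scope")
    return p

def run(argv):
    args = build_parser().parse_args(argv)
    result = {"answer": 42}  # whatever the script actually computes
    if args.interactive:
        code.interact(local=result)  # like the `--sql` idea, but a Python REPL
        return None
    return result["answer"]

print(run([]))  # → 42
```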
If I’ve ever had to run a “script” in any type of deployed ENV, it’s always been done in that ENV’s Python shell.
So I still don’t see what the fuss is about?
I work on a massive Python code base, and the only benefit I’ve seen from moving to uv is that it has sped up dep installation, which has had a positive impact on local and CI setup times.
"missing x..."
pip install x
run script
"missing y..."
pip install y
> y not found
google y to find package name
pip install ypackage
> conflict with other package
realize I forgot a venv and have contaminated my system python
check pip help output to remember how to uninstall a package
clean up system python
create venv at cwd
start over
...
</end of time>
People complain because the experience is less confusing in many other languages. Think Go, Rust, or even JS. All the tooling chaos and virtual environment jujitsu are real deterrents for newcomers. And it’s not just beginners complaining about Python tooling. Industry veterans like Armin Ronacher do that all the time.
uv is a great step in the right direction, but the issue is that as long as the basic tooling isn’t built into the language binary, like Go’s tools or Rust’s Cargo, more tools will pop up and fragment the space even further.
Tools like uv solve the "it works on my machine" problem. And it's also incredibly fast.
People have suggested using other languages that might be faster, but the business always chooses what’s best for everyone to work with.
What if you don't have an environment set up? I'm admittedly not a python expert by any means but that's always been a pain point for me. uvx makes that so much easier.
Nowadays I normally use `python venv/bin/<some-executable>`, or `conda run -n <some-env> <some-executable>`, or package it in a Singularity container. And even though I hear a lot of good things about uv, given that my job uses public money for research, we try to use open source and standards as much as possible. My understanding is that uv is still backed by a company, and at least when I checked some time ago (in PEP discussions & GH issues) they were not implementing the PEPs that I needed -- even if they did, we would probably still stay with simple pip/setuptools to avoid having to spend research budget updating our build if the company ever changed its business model (e.g. what anaconda did some months/a year ago).
Digressing: the Singularity container is useful for research & HPC too, as it creates a single archive, which is faster to load on distributed filesystems like the two I work on (GPFS & LustreFS) instead of loading many small files over network.
python3 -m venv venv
Activate the virtual environment:
source venv/bin/activate
Deactivate the virtual environment:
deactivate
If you don't see what the fuss is about, I'm happy for you. Sounds like you're living in a fairly isolated environment. But I can assure you that uv is worth a lot of fussing about, it's making a lot of our lives a lot easier.
I think their docs could use a little bit of work; in particular, there should be a defined path to switch from a requirements.txt based workflow to uv. Also, I felt it's a little confusing how to define a Python version for a specific project (it's defined in both .python-version and pyproject.toml).
How to migrate from requirements.txt: https://pydevtools.com/handbook/how-to/migrate-requirements.... How to change the Python version of a uv project: https://pydevtools.com/handbook/how-to/how-to-change-the-pyt...
Let me know if there are other topics I can hit that would be helpful!
Actually, I’ve thought of something! Migrating from poetry! It’s something I’ve been meaning to look at automating for a while now (I really don’t like poetry).
The `requires-python` field in `pyproject.toml` defines a range of compatible versions, while `.python-version` defines the specific version you want to use for development. If you create a new project with uv init, they'll look similar (>=3.13 and 3.13 today), but over time `requires-python` usually lags behind `.python-version` and defines the minimum supported Python version for the project. `requires-python` also winds up in your package metadata and can affect your callers' dependency resolution, for example if your published v1 supports Python 3.[old] but your v2 does not.
pyproject.toml is about allowing other developers, and end users, to use your code. When you share your code by packaging it for PyPI, a build backend (uv is not one, but they seem to be working on providing one - see https://github.com/astral-sh/uv/issues/3957 ) creates a distributable package, and pyproject.toml specifies what environment the user needs to have set up (dependencies and python version). It has nothing to do with uv in itself, and is an interoperable Python ecosystem standard. A range of versions is specified here, because other people should be able to use your code on multiple Python versions.
The .python-version file is used to tell uv specifically (i.e., nobody else) exactly (i.e., the exact version) what to do when setting up your development environment.
(It's perfectly possible, of course, to just use an already-set-up environment.)
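A minimal sketch of how the two files divide the work (project name and versions are made up):

```toml
# pyproject.toml -- the *range* your package promises its users
[project]
name = "example"
version = "0.1.0"
requires-python = ">=3.10"
```

Meanwhile `.python-version` would contain just the single line `3.13`: the exact interpreter uv uses when setting up your development environment.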
Try `uvx migrate-to-uv` (see https://pypi.org/project/migrate-to-uv/)
(If you want to write TOML, or do other advanced things such as preserving comments and exact structure from the original file, you'll want tomlkit instead. Note that it's much less performant.)
Some stuff like npm or dotnet do need an npm update / dotnet restore when I switch platforms. At first attempt uv seems like it just doesn't really like this and takes a fair bit of work to clean it up when switching platforms, while using venvs was fine.
For me there used to be a clear delineation between scripting languages and compiled languages. Python has always seemed to want to be both and I'm not too sure it can really. I can live with being mildly wrong about a concept.
When Python first came out, our processors were 80486 at best and RAM was measured in MB at roughly £30/MB in the UK.
"For the longest time, ..." - all distros have had scripts that find the relevant Python or Java or whatevs so that's simply daft. They all have shebang incantations too.
So we now have uv written in Rust for Python. Obviously you should install it via a shell script directly from curl!
I love all of the components involved here but please for the love of a nod to security at least suggest that the script is downloaded first, looked over and then run.
I recently came across a Github hosted repo with scripts that changed Debian repos to point somewhere else and install ... software. I'm sure that's all fine too.
curl | bash is cute and easy and very, very insecure.
No? You can install it via pip.
https://github.com/InboraStudio/Proxmox-VGPU
Note the quite professional looking README.md and think about the audience for this thing - kittens hitting the search bong and trying to get something very complicated working.
Read the scripts: they are pretty short and could put your hypervisor in the hands of someone else who may not be too friendly.
Now pip has the same problem except you don't normally go in with a web browser first.
I raised an issue to at least provide a hint to casual browsers and also raised it with the github AI bottie complaint thang which doesn't care about you, me or anything else for that matter.
Not an expert but I think there's performance gains to calling the binary directly rather than through python.
1) Subscribe to the GitHub repo for tag/release updates.
2) When I get a notification of a new version, I run a shell function (meup-uv and meup-ruff) which grabs the latest tag via a GET request and runs an install. I don't remember the semantics off the top of my head, but it's something like:
cargo install --jobs $(( $(nproc) / 2 )) --tag ${TAG} --git ${REPO} [uv|ruff]
Of course this implies I'm willing to wait the ~5-10 minutes for these apps to compile, along with the storage costs of the registry and source caches. Build times for ruff aren't terrible, but uv is a straight up "kick off and take a coffee break" experience on my system (it gets 6-8 threads out of my 12 total depending on my mood).

Eh. There's a lot of space in the middle to "well actually" about, but Python really doesn't behave like a "compiled" language. The more important question is: what do you ship to people, and how easily can they use it? Lots of people in this thread are bigging up Go's answer of "you ship a thing which can run immediately with no dependencies". For users that solves so many problems.
Quite a few python usecases would benefit from being able to "compile" applications in the same sense. There are py-to-exe solutions but they're not popular or widely used.
#!/usr/bin/env bash
eval "$(conda shell.bash hook)"
conda activate myenv
python myscript
Admittedly this isn't self contained like the PEP 723 solution.

$ dnf search --cacheonly uv
Matched fields: name (exact)
uv.x86_64: An extremely fast Python package installer and resolver, written in Rust

As I've gotten older I've grown weary of third-party tools, and almost always try to stick with the first-party built-in methods for a given task.
Does uv provide enough benefit to make me reconsider?
I don't know why there is such a flurry of posts since it's a tool that is more than a year old, but it's the one and only CLI tool that I recommend when Python is needed for local builds or on a CI.
Hatch was a good contender at the time but they didn't move fast enough, and the uv/ruff team ate everybody's lunch. uv is really good and IMHO it's here to stay.
Anyway try it for yourself but it's not a high-level tool that is hiding everything, it's fast and powerful and yet you stay in control. It feels like a first-party tool that could be included in the Python installer.
The answer is an unequivocal yes in this case. uv is on a fast track to become the de facto standard and relegate pip to the ‘reference implementation’ tier.
Try it for <20mins and if you don't like it, leave it behind. These 20mins include installation, setup, everything.
If you use anything outside the standard library the only reliable way to run a script is installing it in a virtual environment. Doing that manually is a hassle and pyenv can be stupidly slow and wastes disk space.
With uv it's fast and easy to set up throw away venvs or run utility scripts with their dependencies easily. With the PEP-723 scheme in the linked article running a utility script is even easier since its dependencies are self-declared and a virtual environment is automatically managed. It makes using Python for system scripting/utilities practical and helps deploy larger projects.
Really? `apt install pipx; pipx install sphinx` (for example) worked flawlessly for me. Pipx is really just an opinionated wrapper that invokes a vendored copy of Pip and the standard library `venv`.
The rest of your post seems to acknowledge that virtual environments generally work just fine. (Uv works by creating them.)
> Sometimes you're limited by Python version and it can be non-trivial to have multiple versions installed at once.
I built them from source and make virtual environments off of them, and pass the `--python` argument to Pipx.
> If you use anything outside the standard library the only reliable way to run a script is installing it in a virtual environment. Doing that manually is a hassle and pyenv can be stupidly slow and wastes disk space.
If you're letting it install separate copies of Python, sure. (The main use case for pyenv is getting one separate copy of each Python version you need, if you don't want to build from source, and then managing virtual environments based off of that.) If you're letting it bootstrap Pip into the virtual environment, sure. But you don't need to do either of those things. Pip can install cross-environment since 22.3 (Pipx relies on this).
Uv does save disk space, especially if you have multiple virtual environments that use the same packages, by hard-linking them.
> With uv it's fast and easy to set up throw away venvs or run utility scripts with their dependencies easily. With the PEP-723 scheme in the linked article running a utility script is even easier since its dependencies are self-declared and a virtual environment is automatically managed.
Pipx implements PEP 723, which was written to be an ecosystem-wide standard.
How much you can benefit depends on your use case. uv is a developer tool that also manages installations of Python itself (and maintains separate environments for which you can choose a Python version). If you're just trying to install someone else's application from PyPI - say https://pypi.org/project/pycowsay/ as an example - you'll likely have just as smooth of an experience via pipx (although installation will be even slower than with pip, since it's using pip behind the scenes and adding its own steps). On the other hand, to my understanding, to use uv as a developer you'll still need to choose and install a build backend such as Flit or Hatchling, or else rely on the default Setuptools.
One major reason developers are switching to uv is lockfile support. It's worth noting here that an interoperable standard for lockfiles was recently approved (https://peps.python.org/pep-0751/), uv will be moving towards it, and other tools like pip are moving towards supporting it (the current pip can write such lockfiles, and installing from them is on the roadmap: https://github.com/pypa/pip/issues/13334).
If you, like me, prefer to follow the UNIX philosophy, a complete developer toolchain in 2025 looks like:
* Python itself (if you want standalone binaries like the ones uv uses, you can get them directly; you can also build from source like I do; if you want to manage Python installations then https://github.com/pyenv/pyenv is solid, or you can use the multi-language https://asdf-vm.com/guide/introduction.html with https://github.com/asdf-community/asdf-python I guess)
* Ability to create virtual environments (the standard library takes care of this; some niche uses are helped out by https://virtualenv.pypa.io/)
* Package installer (Pip can handle this) and manager (if you really want something to "manage" packages by installing into an environment and simultaneously updating your pyproject.toml, or things like that; but just fixing the existing environment is completely viable, and installers already resolve dependencies for whatever it is they're currently installing)
* Build frontend (the standard is https://build.pypa.io/en/stable/; for programmatic use, you can work with https://pyproject-hooks.readthedocs.io/en/latest/ directly)
* Build backend (many options here - by design! but installers will assume Setuptools by default, since the standard requires them to, for backwards compatibility reasons)
* Support for uploading packages to PyPI (the standard is https://twine.readthedocs.io/en/stable/)
* Optional: typecheckers, linters, an IDE etc.
A user on the other hand only needs
* Some version of Python (the one provided with a typical Linux distribution will generally work just fine; Windows users should usually just install the current version, with the official installer, unless they know something they want to install isn't compatible)
* Ability to create virtual environments and also install packages into them (https://pipx.pypa.io/stable/ takes care of both of these, as long as the package is an "application" with a defined entry point; I'm making https://github.com/zahlman/paper which will lift that restriction, for people who want to `import` code but not necessarily publish their own project)
* Ability to actually run the installed code (pipx handles this by symlinking from a standard application path to a wrapper script inside the virtual environment; the wrappers specify the absolute path to the virtual environment's Python, which is generally all that's needed to "use" that virtual environment for the program. It also provides a wrapper to run Pip within a specific environment that it created. PAPER will offer something a bit more sophisticated here, for both aspects.)
(also: possible there's always been a way and I'm an idiot)
#!/usr/bin/env -S mise x xh jq fzf gum -- bash
todo=$(xh 'https://jsonplaceholder.typicode.com/todos' | jq '.[].title' | fzf)
gum style --border double --padding 1 "$todo"
It makes throwing together a bash script with dependencies very enjoyable.

uv is the magic that deals with all of the rough edges/environment stuff I usually hate in Python. All I need to do is `uv run myFile.py` and uv solves everything else.
Firstly, I have been an HN viewer for a long time, and this is the one thing about PEP 723 Python scripts that always gets to the top of the Hacker News leaderboard, with each person discovering it themselves.
I don't mean to discredit the author. His work was simple and clear to understand. I am just sharing my thesis that if someone wants karma on Hacker News, for whatever reason, this might be the best topic. (Please don't pitchfork me, since I mean no offense to the author.)
Also, can anybody please explain how to create that PEP metadata with uv from just a Python script and nothing else, like some command which can take a Python script and add the PEP header to it? I am pretty sure that uv has a flag for this, but I feel the author might've missed it. When coding one-off scripts in Python using AI (Gemini), I always had to paste in uv's documentation. So if anybody knows an easier way to create the PEP header using the CLI, please tell me! Thanks in advance!!
$ touch foo.py
$ uv add --script foo.py requests
Updated `foo.py`
$ cat foo.py
# /// script
# requires-python = ">=3.13"
# dependencies = [
# "requests",
# ]
# ///

But my tool was really finicky, and I guess it was built by AI, so um yea, I guess you all can try it; it's on pypi. I think it has a lot of niche cases where it doesn't work. Maybe someone can modify it to make it better, as I built it like 3-4 months ago if I remember correctly, and I have completely forgotten how things worked in uv.
The downside is that there are a bunch of seemingly weird lines you have to paste at the beginning of the script :D
If anyone is curious, it's on pypi (pysolate).
Not quite the same but interesting!
Each environment itself only takes a few dozen kilobytes to make some folders and symlinks (at least on Linux). People think of Python virtual environments as bloated (and slow to create) because Pip gets bootstrapped into them by default. But there is no requirement to do so.
The packages take up however much space they take up; the cost there is unavoidable. Uv hard-links packages into the separate environments from its cache, so you only pay a disk-space cost for shared packages once (plus a few more kilobytes for more folders).
(Note: none of this depends on being written in Rust, but Pip doesn't implement this caching strategy. Pip can, however, install cross-environment since 22.3, so you don't actually need the bootstrap. Pipx depends on this, managing its own vendored copy of Pip to install into multiple environments. But it's still using a copy of Pip that interacts with a Pip-styled cache, so it still can't do the hard-link trick.)
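The "no Pip bootstrap" point is easy to see from the stdlib itself; a small sketch (paths are throwaway temp directories):

```python
# A virtual environment created without bootstrapping Pip is just a few
# folders, symlinks, and a pyvenv.cfg -- created almost instantly.
import os
import tempfile
import venv

root = tempfile.mkdtemp()
env_dir = os.path.join(root, "env")
venv.create(env_dir, with_pip=False)  # skip the Pip bootstrap: fast and tiny

# The environment is real: it has a config file and interpreter entry,
# just none of the site-packages bloat from Pip/setuptools.
print(os.path.exists(os.path.join(env_dir, "pyvenv.cfg")))  # → True
```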
Note that PEP 723 is also supported by pipx run.
Some Python devs told me it's an awesome language, but they envy the Node.js ecosystem for its package management.
Seems like uv finally removed that roadblock.
I'm building a yt-dlp / uvx based WebUI
- https://github.com/ocodo/uvxytdlp
Still work in progress but shaping up nicely.
Also that would be reproducible, in contrast to what is shown in the blog post. To make that reproducible, one would have to keep the lock file somewhere, or state the checksums directly in the Python script file, which seems rather un-fun.
Can you not use `uvx` with your script because it only works on packages that are installed already or on PyPi?
I don't think running with uv vs uvx imposes any extra limitations on how you specify dependencies. You should either way be able to reference dependencies not just from PyPi but also by git repo or local file path in a [tool.uv.sources] table, the same as you would in a pyproject.toml file.
You can use uvx to run scripts with a combination of the --with flag to specify the dependencies and invoking python directly. For example:
uvx --with youtube-transcript-api python transcript.py
But you won't get the benefit of PEP 723 metadata.
I like it though. It's very convenient.
There are options to both lock the dependencies and limit by date:
https://docs.astral.sh/uv/guides/scripts/#locking-dependenci...
https://docs.astral.sh/uv/guides/scripts/#improving-reproduc...
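Per the second link, the uv docs show pinning script resolution to a cutoff date via inline metadata; a sketch (date and dependency are illustrative):

```python
# /// script
# dependencies = [
#     "requests",
# ]
# [tool.uv]
# exclude-newer = "2025-01-01T00:00:00Z"
# ///
```

uv then resolves as if nothing published after that timestamp exists; per the first link, a lockfile can also be written alongside the script itself.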
People act like this happens all the time but in practice I haven't seen evidence that it's a serious problem. The Python ecosystem is not the JavaScript ecosystem.
An easy way to prove that this is the norm is to take some existing code you have now, update its dependencies to their latest versions, and watch everything break. You don't see a problem because those dependencies use pinned/very restricted versions, which hides the frequency of the problem from you. You'll also see that, in their issue trackers, they've closed all sorts of version-related bugs.
Bruh, one-off scripts is the whole point of Python. The cheat code is to add "break-system-packages = true" to ~/.config/pip/pip.conf. Just blow up ~/.local/lib/pythonX.Y/site-packages/ if you run into a package conflict (exceedingly rare) and reinstall. All these venv, uv, metadata peps, and whatnot are pointless complications you just don't need.
That's bait! / Ads are getting smarter!
I would also have accepted "unless you're geh", "unless you're a traitor to the republic", "unless you're not leet enough" etc.
And so, if you are the kind of person who has not heard of it, you probably don't read blogs about Python, and therefore you probably aren't reading _this_ blog. No harm, no foul.
Is there a version of uv written in Python? It's weird (to me) to have an entire ecosystem for a language and a highly recommended tool to make your system work is written in another language.
Interestingly, the speed is the main differentiator from existing package and project management tools. Even if you are using it as a drop-in replacement for pip, it is just so much faster.
There are many competing tools in the space, depending on how you define the project requirements.
Contrary to the implication of other replies, the lion's share of uv's speed advantage over Pip does not come from being written in Rust, based on the evidence available to me. It comes from:
* not bootstrapping Pip into each new environment (many people don't know that you don't actually have to; see https://zahlman.github.io/posts/2025/01/07/python-packaging-... for some hints - my upcoming post will be more direct about it, but unfortunately I've been putting it off...)
* being designed up front to install cross-environment (if you want to do this with Pip, you'll eventually and with much frustration get a subtly broken installation using the old techniques; since 22.3 you can just use the `--python` flag, but this limits you to environments where the current Pip can run, and re-launches a new Pip process taking perhaps an additional 200ms - but this is still much better than bootstrapping another copy of Pip!)
* using heuristics when solving for dependencies (Pip's backtracking resolver is exhaustive, and proceeds quite stubbornly in order)
* having a smarter caching strategy (it stores uncompressed wheels in its cache and does most of the "installation" by hard-linking these into the new environment; Pip goes through a proxy that uses some opaque cache files to simulate re-doing the download, then unpacks the wheel again)
* not speculatively pre-loading a bunch of its own code that's unlikely to execute (Pip has large complex dependencies, like https://pypi.org/project/rich/, which it vendors without tree-shaking and ultimately imports almost all of, despite using only a tiny portion)
* having faster default behaviours; e.g. uv defaults to not pre-compiling installed packages to .pyc files (since Python will do this on the first import anyway) while Pip defaults to doing so
* not (necessarily) being weighed down by support for legacy behaviours (packaging worked radically differently when Pip first became publicly available)
* just generally being better architected
None of these changes require a change in programming language. (For example, if you use Python to make a hard link, you just use the standard library, which will then use code written in C to make a system call that was most likely also written in C.) Which is why I'm making https://github.com/zahlman/paper .
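The hard-link point above is worth seeing concretely; a minimal sketch using throwaway temp directories (the file names are made up):

```python
# "Install" a cached file into an environment without copying any data,
# the way uv links unpacked wheels from its cache into each venv.
import os
import pathlib
import tempfile

root = pathlib.Path(tempfile.mkdtemp())
cache = root / "cache"          # stand-in for uv's unpacked-wheel cache
site = root / "site-packages"   # stand-in for a venv's site-packages
cache.mkdir()
site.mkdir()

(cache / "mymod.py").write_text("ANSWER = 42\n")
os.link(cache / "mymod.py", site / "mymod.py")  # hard link, not a copy

# Both paths point at the same inode, so N environments sharing a
# package pay its disk cost only once.
same = (cache / "mymod.py").stat().st_ino == (site / "mymod.py").stat().st_ino
print(same)  # → True
```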
Besides, the most important tool for making python work, the python executable itself, is written in C. People occasionally forget it's not a self-hosting language.
A tool written in Python is never going to be as fast as one written in Rust. There are plenty of Python alternatives and you're free to use them.