- pip doesn't handle your Python executable, just your Python dependencies. So if you want/need to swap between Python versions (3.11 to 3.12 for example), it doesn't give you anything. Generally people use an additional tool such as pyenv to manage this. Tools like uv and Poetry do this as well as handling dependencies
- pip doesn't resolve dependencies of dependencies. pip will only respect version pinning for dependencies you explicitly specify. So, for example, say I am using pandas and I pin it to version X. If a dependency of pandas (say, numpy) isn't pinned as well, the underlying version of numpy can still change when I reinstall dependencies. I've had many issues where my environment stopped working despite none of my specified dependencies changing, because underlying dependencies introduced breaking changes. To get around this with pip you would need an additional tool like pip-tools, which lets you pin all dependencies, explicit and transitive, to a lock file for true reproducibility. uv and Poetry do this out of the box (see the sketch after this list).
- Tool usage. Say there is a python package you want to use across many environments without installing in the environments themselves (such as a linting tool like ruff). With pip, you need to install another tool like pipx to install something that can be used across environments. uv can do this out of the box.
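To make the lock-file and tool-install points above concrete, here is a rough sketch of the two workflows; the package names are just examples:

    # locking transitive dependencies with pip-tools
    echo "pandas" > requirements.in
    pip-compile requirements.in   # writes requirements.txt with pandas *and* numpy pinned
    pip-sync requirements.txt     # make the venv match the lock file exactly

    # the uv equivalent maintains the lock file for you
    uv add pandas                 # updates pyproject.toml and uv.lock (all transitive pins)
    uv sync                       # reproduce the environment from uv.lock

    # tools shared across environments: pipx vs uv
    pipx install ruff
    uv tool install ruff          # persistent, isolated install linked onto your PATH
    uvx ruff check .              # or run it ad hoc without installing anything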
Plus there is a whole host of jobs that tools like uv and poetry aim to assist with that pip doesn't, namely project creation and management. You can use uv to create a new Python project scaffolding for applications or python modules in a way that conforms with PEP standards with a single command. It also supports workspaces of multiple projects that have separate functionality but require dependencies to be in sync.
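For instance (a sketch; the exact files generated depend on the uv version):

    uv init --app my-service    # scaffold an application project
    uv init --lib my-library    # or a library with a src/ layout and a build backend
    # both generate a PEP 621 pyproject.toml, a README, and a .python-version file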
You can accomplish a lot/all of this using pip with additional tooling, but it's a lot more work. And not all use cases will require these.
You can pip install something and have it fail, then go back to zero and restart and have it work, but at some point that will fail too. conda has a correct resolving algorithm, but the packages are out of date and add about as many quality problems as they fix.

I worked at a place where the engineering manager was absolutely exasperated with the problems we were having with building and deploying AI/ML software in Python. I had figured out pretty much all the problems after about nine months and had developed a 'wheelhouse' procedure for building our system reliably, but it was too late.
Not long after I sketched out a system that was a lot like uv, but it was written in Python and thus had problems with maintaining its own stable Python environment (e.g. poetry seems to trash itself every six months or so.)
Writing uv in Rust was genius because it eliminates that problem: the system has a stable surface to stand on instead of pipping itself into oblivion. Never mind that it is also much faster than my system would have been. (My system had the extra feature that it used HTTP range requests to extract the metadata from wheel files before PyPI started letting you download the metadata directly.)
I didn't go forward with developing it because I argued with a lot of people who, like you, thought it was "the perfect being the enemy of the good" when it was really "the incorrect being the enemy of the correct." I'd worked on plenty of projects where I was right about the technology and wrong about the politics and I am so happy that uv has saved the Python community from itself.
and I think a killer feature is the ability to inline dependencies in your Python source code, then use: uv tool run <scriptname>
Your script code would look like:
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = [
#     "...",
#     "..."
# ]
# ///
Then uv will make a new venv, install the dependencies, and execute the script faster than you think. The first run is a bit slower due to downloads etc., but the second and subsequent runs are just a bunch of internal symlink shuffling.
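Concretely, assuming the script above is saved as task.py (a hypothetical name):

    chmod +x task.py
    ./task.py           # uv resolves the inline deps into a cached venv, then runs the script
    uv run task.py      # or invoke it without relying on the shebang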
It is really interesting. You should at least take a look at a YT or something. I think you will be impressed.
Good luck!
It also automatically recreates venvs if you delete them, and it automatically updates packages when you run something with uv run file.py (useful when somebody may have updated the requirements in git). It also lets you install self-contained Python tools (installed in a virtualenv and linked to ~/.local/bin, which is added to your path), replacing pipx. And it installs self-contained Python builds, letting you more easily pick a Python version and specify it in a .python-version file for your project (replacing pyenv, and usually much nicer because pyenv compiles them locally).
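For instance, a rough sketch of those pieces:

    uv python install 3.12    # fetch a prebuilt interpreter instead of compiling one
    uv python pin 3.12        # record it in a .python-version file for the project
    uv run file.py            # recreates .venv and re-syncs packages if anything changed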
uv also makes it easier to explore, say starting an ipython shell with a couple of extra libraries: uv run --with ipython --with colorful --with https ipython
It caches downloads. Of course the HTTP download itself isn't faster (though they're exploring things to speed that part up), and since it's written in Rust, the local work (like deleting and recreating a venv with cached packages) tends to be blazing fast.
Also I don’t recognise errors and I don’t know which python versions generally work well with what.
I’ve had it happen so often with pip that I’d have something setup just fine. Let’s say some stable diffusion ui. Then some other month I want to experiment with something like airbyte. Can’t get it working at all. Then some days later I think, let’s generate an image. Only to find out that with pip installing all sorts of stuff for airbyte, I’ve messed up my stable diffusion install somehow.
Uv clicked right away for me and I don’t have any of these issues.
Was I using pip and asdf incorrectly before? Probably. Was it worth learning how to do it properly in the previous way? Nope. So uv is really great for me.
I saw a Discourse reply that cited some sort of possible security issue, but that was basically it, and that means the only way to get that functionality is to not use pip. It's really not a lot of major stuff, just a lot of little paper cuts that make it a lot easier to just use something else once your project gets to a certain size.
However, the speed alone is reason enough to switch. Try it once and you will be sold.
Now though, yes, unequivocally you are missing out.
It does virtualenv, it does pyenv, it does pip, so all that's managed in one place.
It's much faster than pip.
It's like 80% of my workflow now.
I've used pip, pyenv, and poetry; all are broken in one way or another, and each has blind spots it doesn't cover.
If your needs are simple (not mixing Python versions, simple dependencies, not packaging, etc) you can do it with pip, or even with tarballs and make install.
Using uv means your project will have well defined dependencies.
The only way to justify VC money is a plot to take over the ecosystem and then profit off of a dominant position. (e.g. the Uber model)
I've heard a little bit about UV's technical achievements, which are impressive, but technical progress isn't the only metric.
> I haven't felt like it's a minor improvement on what I'm using
means that this:
> I'd love if we standardized on it as a community as the de facto default
…probably shouldn’t happen. The default and de facto standard should be something that doesn’t get put on a pedestal but stays out of the way.
It would be like replacing the python repl with the current version of ipython. I’d say the same thing, that it isn’t a minor improvement. While I almost always use ipython now, I’m glad it’s a separate thing.
The problem is that in the python ecosystem there really isn't a default de facto standard yet at all. It's supposed to be pip, but enough people dislike pip that it's hard as a newcomer to know if it's actually the standard or not.
The nice thing about putting something like this on a pedestal is that maybe it could actually become a standard, even if the standard should be simple and get out of the way. Better to have a standard that's a bit over the top than no standard at all.
…Not with IPython. But with an implementation written in Python instead of C, originating from the PyPy project, that supports fancier features like multi-line editing and syntax highlighting. See PEP 762.
I was apprehensive when I heard about it, but then I had the chance to use it and it was a very nice experience.
This to me is unachievable. Perfection is impossible. But on the way there, if the community and developers coalesced around a single tool, then maybe we could start heading down the road toward it.
What people seem to miss about Pip is that it is, by design, not a package manager. It's a package installer, only. Of course it doesn't handle the environment setup for you; it's not intended for that. And of course it doesn't keep track of what you've installed, or make lock files, or update your `pyproject.toml`, or...
What it does do is offer a hideously complex set of options for installing everything under the sun, from everywhere under the sun. (And that complexity has led to long-standing, seemingly unfixable issues, and there are a lot of design decisions made that I think are questionable at best.)
Ideas like "welllll I use poetry but pyenv works or you could use conda too" are incoherent. They're for different purposes and different users, with varying bits of overlap. The reason people are unsatisfied is because any given tool might be missing one of the specific things they want, unless it's really all-in-one like Uv seems like it intends to be eventually.
But once you have a truly all-in-one tool, you notice how little of it you're using, and how big it is, and how useless it feels to have to put the tool name at the start of every command, and all the specific little things you don't like about its implementation of whatever individual parts. Not to mention the feeling of "vendor lock-in". Never mind that I didn't pay money for it; I still don't want to feel stuck with, say, your build back-end just because I'm using your lock-file updater.
In short, I don't want a "package manager".
I want a solid foundation (better than Pip) that handles installing (not managing) applications and packages, into either a specified virtual environment or a new one created (not managed, except to make it easy to determine the location, so other tools can manage it) for the purpose. In other words, something that fully covers the needs of users (making it possible to run the code), while providing only the bare minimum on top of that for developers - so that other developer tools can cooperate with that. And then I want specialized tools for all the individual things developers need to do.
The specialized tools I want for my own work all exist, and the tools others want mostly exist too. Twine uploads stuff to PyPI; `build` is a fine build front-end; Setuptools would do everything I need on the back-end (despite my annoyances with it). I don't need a lockfile-driven workflow and don't readily think in those terms. I use pytest from the command line for testing and I don't want a "package manager" nor "workflow tool" to wrap that for me. If I needed any wrapping there I could do a few lines of shell script myself. If anything, the problem with these tools is doing too much, rather than too little.
The base I want doesn't exist yet, so I've started making it myself. Pipx is a big step in the right direction, but it has some arbitrary limitations (I discuss these and some workarounds in my recent blog post https://zahlman.github.io/posts/2025/01/07/python-packaging-... ) and it's built on Pip so it inherits those faults and is that much bigger. Uv is even bigger still for the compiled binary, and I would only be using the installation parts.
Node.js's replacement for virtualenv is literally just a folder named "node_modules". Meanwhile Python has an entire tool with strange idiosyncrasies that you have to pay attention to, otherwise pip does the wrong thing by default.
It is as if Python is pretending to be a special snowflake where installing libraries into a folder is this super hyper mega overcomplicated thing that necessitates a whole dedicated tool just to manage, when in reality, in other programming languages, nobody is really thinking about the fact that the libraries end up in their build folders. It just works.
So again you're pretending that this is such a big deal that it needs a whole other tool, when the problem in question is so trivial that another tool is adding mental overhead with regard to the microscopic problem at hand.
# /// script
# requires-python = ">=3.12"
# dependencies = [
# "pandas",
# ]
# ///
h/t https://simonwillison.net/2024/Dec/19/one-shot-python-tools/

A simple example I came across is having to rename some files:
1. you just open the shell in the location you want
2. and run this command:
uv run https://raw.githubusercontent.com/SimonB97/MOS/main/AITaskRu... "check the contents of the .md files in the working dir and structure them in folders"
There's a link to Magentic-1 docs and further info in the repo: https://github.com/SimonB97/MOS/tree/main/AITaskRunner (plus two other simple scripts).
I would want distributed projects to do things properly, but as a way to shorthand a lot of futzing about? It's excellent
python's wheels are falling off at an ever faster and faster rate
But there are a shocking number of install instructions that offer $(npm i -g) and if one is using Homebrew or nvm or a similar "user writable" node distribution, it won't prompt for sudo password and will cheerfully mangle the "origin" node_modules
So, it's the same story as with python: yes, but only if the user is disciplined
Now ruby drives me fucking bananas because it doesn't seem to have either concept: virtualenvs nor ./ruby_modules
That said, I agree that `npm i -g` is a poor system package manager, and you should typically be using Homebrew or whatever package manager makes the most sense on your system. However, `npx` is a good alternative if you just want to run a command quickly to try it out or something like that.
This whole installing the same dependencies a million times across different projects in Python and Node land is completely insane to me. Ruby has had the only sane package manager for years. Cargo too, but only because they copied Ruby.
Node has littered my computer with useless files. Python’s venv eat up a lot of space unnecessarily too.
As long as you're in the ruby-install/chruby ecosystem and managed to avoid the RVM mess then the tooling is so simple that it doesn't really get any attention. I've worked exclusively with virtualenvs in ruby for years.
(i) wrongly configured character encodings (suppose you incorporated somebody else's library that does a "print" and the input data contains some invalid characters that wind up getting printed; that "print" could crash a model trainer script that runs for three days if error handling is set wrong and you couldn't change it when the script was running, at most you could make the script start another python with different command line arguments)
(ii) site-packages; all your data scientist has to do is
pip install --user
the wrong package and they've trashed all of their virtualenvs, all of their condas, etc. Over time the defaults have changed so Pythons aren't looking into the site-packages directories, but I wasted a long time figuring out why a team of data scientists couldn't get anything to work reliably.

(iii) "python" built into Linux by default. People expected Python to "just work" but it doesn't "just work" when people start installing stuff with pip, because you might be working on one thing that needs one package and another thing that needs another package, and you could trash everything you're doing with Python in the process of trying to fix it.
Unfortunately python has attracted a lot of sloppy programmers who think virtualenv is too much work and that it's totally normal for everything to be broken all the time. The average data scientist doesn't get excited when it crumbles and breaks, but you can't just call up some flakes to fix it. [1]
I wish they would stick to semantic versioning tho.
I have used two projects that got stuck in incompatible changes in the 3.x Python.
That is a fatal problem for Python. If a change in a minor version makes things stop working, it is very hard to recommend the system. A lot of work has gone down the drain, by this Python user, trying to work around that
Then in their main code they imported said error.py, but unfortunately the numpy library also has an error.py. So the user was getting very funky behavior.
Run npm install
Delete node_modules and wait 30 minutes because it takes forever to delete 500MB worth of 2 million files.
Do an npm install again (or yarn install or that third one that popped up recently?)
Uninstall/Upgrade npm (or is it Node? No wait, npx I think. Oh well, used to be node + npm, now it's something different.)
Then do steps 1 to 3 again, just in case.
Hmm, maybe it's the lockfile? Delete it, one of the juniors pushed their version to the repo without compiling maybe. (Someone forgot to add it to the gitignore file?)
Okay, do steps 1 to 3 again, that might have fixed it.
If you've gotten here, you are royally screwed and should try the next javascript expert, he might have seen your error before.
So no, I'm a bit snarky here, but the JS ecosystem is a clustermess of chaos and should rather fix its own stuff first. I have none of the above issues with Python, a proper IDE, and out-of-the-box pip.
Most package managers are developed.
Pnpm is engineered.
It’s one of the few projects I donate to on GitHub
Similar problems with runtime version management (need to use nvm for sanity, using built-in OS package managers seems to consistently result in tears).
More package managers and interactions (corepack, npm, pnpm, yarn, bun).
Bad package interop (ESM vs CJS vs UMD).
More runtimes (Node, Deno, Bun, Edge).
Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.
Will also note that in my years of js experience I've hardly ever run into module incompatibilities. It's definitely gnarly when it happens, but wouldn't consider this to be the same category of problem as the confusion of setting up python.
Hopefully uv can convince me that python's environment/dependency management can be easier than JavaScript's. Currently they both feel bad in their own way, and I likely prefer js out of familiarity.
In practice I find this a nuisance but a small one. I wish there had been a convention that lets the correct version of Node run without me manually having to switch between them.
> More package managers and interactions (corepack, npm, pnpm, yarn, bun).
But they all work on the same package.json and node_modules/ principle, afaik. In funky situations, incompatibilities might emerge, but they are interchangeable for the average user. (Well, I don't know about corepack.)
> Bad package interop (ESM vs CJS vs UMD).
That is a whole separate disaster, which doesn't really impact consuming packages. But it does make packaging them pretty nasty.
> More runtimes (Node, Deno, Bun, Edge).
I don't know what Edge is. Deno is different enough to not really be in the same game. I find it hard to see the existence of Bun as problematic: it has been a bit of a godsend for me, it has an amazing ability to "just work" and punch through Typescript configuration issues that choke TypeScript. And it's fast.
> Then compound this all with the fact that JS doesn't have a comprehensive stdlib so your average project has literally 1000s of dependencies.
I guess I don't have a lot of reference points for this one. The 1000s of dependencies is certainly true though.
You basically need to just remember to never call python directly. Instead, use uv run and uv pip install. That ensures you're always using the uv-installed Python and/or a venv.

Python-based tools where you may want a global install (say ruff) can be installed using uv tool.
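A minimal sketch of that habit, using requests as a stand-in package:

    uv venv                               # create .venv with a uv-managed Python
    uv pip install requests               # installs into .venv, never the system site-packages
    uv run python -c "import requests"    # always resolves to the venv's interpreter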
uv itself is the only Python tool I install globally now, and it's a self-contained binary that doesn't rely on Python. ruff is also self-contained, but I install tools like ruff (and Python itself) into each project's virtual environment using uv. This has nice benefits. For example, automated tests that include linting with ruff do not suddenly fail because the system-wide ruff was updated to a version that changes rules (or different versions are on different machines). Version pinning gets applied to tooling just as it does to packages. I can then upgrade tools when I know it's a good time to deal with potentially breaking changes. And one project doesn't hold back the rest. Once things are working, they work on all machines that use the same project repo.
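A rough sketch of that per-project tooling setup:

    uv add --dev ruff                  # pin ruff in the project like any other dependency
    uv run ruff check .                # every machine and CI run the project's pinned version
    uv lock --upgrade-package ruff     # upgrade deliberately, when it's a good time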
If I want to use Python based tools outside of projects, I now do little shell scripts. For example, my /usr/local/bin/wormhole looks like this:
#!/bin/sh
uvx \
--quiet \
--prerelease disallow \
--python-preference only-managed \
--from magic-wormhole \
wormhole "$@"I don't understand why people would rather do this part specfically, rather than activate a venv.
Also we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue.
> we don't have a left pad scale dependency ecosystem that makes version conflicts such a pressing issue
TensorFlow.
I have to imagine that the python maintainers listen for what the community needs and hear a thousand voices asking for a hundred different packaging strategies, and a million voices asking for the same language features. I can forgive them for prioritizing things the way they have.
Get me the deps this project needs, get them fast, get them correctly, all with minimum hoops.
Cargo and deno toolchains are pretty good too.
Opam, gleam, mvn/gradle, stack, npm/yarn, nix even, pip/poetry/whatever-python-malarkey, go, composer, …what other stuff have I used in the past 12 months… C/C++ doesn't really have a first-class standard other than global sys deps (so I'll refer back to nix or OS package managers).
Getting the stuff you need where you need it is always doable. Some toolchains are just above and beyond, batteries included, ready for productivity.
I agree that the old days of "there are 15 dep managers, good luck" were high chaos. And those who do cutesy shit like using "replace" in their go.mod[1] are sus, but as far as dx goes, $(go get) (which caches by default in $XDG_CACHE_DIR and uses $GOPROXY) I think is great.
1: https://github.com/opentofu/opentofu/blob/v1.9.0/go.mod#L271
If you want Numpy (one of the most popular Python packages) on a system that doesn't have a pre-built wheel, you'll need to do that. Which is why there are, by my count, 54 different pre-built wheels for Numpy 2.2.1.
And that's just the actual installation process. Package management isn't solved because people don't even agree on what that entails.
The only way you avoid "worry about modifying the global environment" is to have non-global environments. But the Python world is full of people who refuse to understand that concept. People would rather type `pip install suspicious-package --break-system-packages` than learn what a venv is. (And they'll sometimes do it with `sudo`, too, because of a cargo-cult belief that this somehow magically fixes things - spoilers: it's typically because the root user has different environment variables.)
Which is why this thread happened on the Python forums https://discuss.python.org/t/the-most-popular-advice-on-the-... , and part of why the corresponding Stack Overflow question https://stackoverflow.com/questions/75608323 has 1.4 million views. Even though it's about an error message that was carefully crafted by the Debian team to tell you what to do instead.
This makes a big difference. There is also the social problem of the Python community having opinions too loud to settle on a good, robust default solution.
But same has now happened for Node with npm, yarn and pnpm.
It has a super limited compiled extensions ecosystem, plugin ecosystem and is not used as a system language in mac and linux.
And of course node is much more recent and the community less diverse.
tldr: node is playing in easy mode.
I never liked pyenv because I really don't see the point/benefit building every new version of Python you want to use. There's a reason I don't run Gentoo or Arch anymore. I'm very happy that uv grabs pre-compiled binaries and just uses those.
So far I have used it to replace poetry (which is great btw) in one of my projects. It was pretty straightforward, but the project was also fairly trivial/typical.
I can't fully replace pipx with it because 'uv tool' currently assumes every Python package only has one executable. Lots of things I work with have multiple, such as Ansible and Jupyterlab. There's a bug open about it and the workarounds are not terrible, but it'd be nice if they are able to fix that soon.
Realistically, the options on Linux are the uv way, the pyenv way (download and compile on demand, making sure users have compile-time dependencies installed as part of installing your tool), and letting users download and compile it themself (which is actually very easy for Python, at least on my distro). Compiling Python is not especially fast (around a full minute on my 4-core, 10-year-old machine), although I've experienced much worse in my lifetime. Maybe you can get alternate python versions directly from your distro or a PPA, but not in a way that a cross-distro tool can feasibly automate.
On Windows the only realistic option is the official installer.
1. `uvx --from git+https://github.com/httpie/cli httpie`

2. https://simonwillison.net/2024/Aug/21/usrbinenv-uv-run/ (uv in a shebang)
I’m sure it was already possible with shebangs and venv before, but uv really brings the whole experience together for me so I can write python scripts as freely as bash ones.
pyenv virtualenv python3.12 .venv
.venv/bin/python -m pip install pandas
.venv/bin/python
Not quite one command, but a bit more streamlined, I guess.

E.g. if the thing you run invokes python itself, it will use the system python, not the venv one in the first case.
uv run --python 3.12 --with label-studio label-studio
Made my life so much easier
uvx --from 'huggingface_hub[cli]' huggingface-cli
This is of course bad practice. What I would like instead is what PHP's composer does: installing stuff automatically changes pyproject.toml (or whatever the standard will be with uv), automatically freezes the versions, and then it is on git diff to tell me what I did last night; I'll remove a couple of lines from that file, run composer install, and it will remove packages not explicitly added to my config from the environment. Does this finally get easy to achieve with uv?
If you change your pyproject.toml file manually, uv sync [1] will update your environment accordingly.
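A sketch of the loop being asked about, with requests standing in for whatever got installed last night:

    uv add requests                        # edits pyproject.toml, repins uv.lock, installs into .venv
    git diff -- pyproject.toml uv.lock     # see what you actually added
    uv remove requests                     # drops it from pyproject.toml and the lock file
    uv sync                                # prunes the environment back to exactly what's declared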
[0]: https://docs.astral.sh/uv/guides/projects/#managing-dependen...

[1]: https://docs.astral.sh/uv/reference/cli/#uv-sync
Whatever, I think I'll try it for myself later today. It's long overdue.
pyproject.toml represents an inter-project standard and Charlie Marsh has committed to sticking with it, along with cooperating with future Python packaging PEPs. But while you can list transitive dependencies, specify exact versions etc. in pyproject.toml, it's not specifically designed as a lockfile - i.e., pyproject.toml is meant for abstract dependencies, where an installer figures out transitively what's needed to support them and decides on exact versions to install.
The current work for specifying a lockfile standard is https://peps.python.org/pep-0751/ . As someone else pointed out, uv currently already uses a proprietary lockfile, but there has been community interest in trying to standardize this - it just has been hard to find agreement on exactly what it needs to contain. (In the past there have been proposals to expand the `pyproject.toml` spec to include other information that lockfiles often contain for other languages, such as hashes and supply-chain information. Some people are extremely against this, however.)
As far as I know, uv isn't going to do things like analyzing your codebase to determine that you no longer need a certain dependency that's currently in your environment and remove it (from the environment, lock file or `pyproject.toml`). You'll still be on the hook for figuring out abstractly what your project needs, and this is important if you want to share your code with others.
Sure, that's not what I meant (unless we call pyproject.toml a part of your codebase, which it kinda is, but that's probably not what you meant).
In fact, as far as I can tell from your answer, Python does move in the direction I'd like it to move, but it's unclear by how far it will miss and if how uv handles it is ergonomical.
As I've said, I think PHP's composer does a very good job here, and to clarify, this is how it works. There are 2 files: composer.json (≈pyproject.toml) and composer.lock (≈ PEP751) (also json). The former is kinda editable by hand, the latter you ideally never really touch. However, for the most part composer is smart enough that it edits both files for you (with some exceptions, of course), so every time I run `composer require your/awesomelib` it
1) checks the constraints in these files
2) finds latest appropriate version of your/awesomelib (5.0.14) and all its dependencies
3) writes "your/awesomelib": "^5.0" to composer.json
4) writes "your/awesomelib": "5.0.14" and all its dependencies to composer.lock (with hashsums, commit ids and such)
It is a good practice to keep both inside of version control, so when I say "git diff tells me what I did last night" it means that I'll also see what I installed. If (as usual) most of it is some useless trash, I'll manually remove "your/awesomelib" from composer.json, run `composer install` and it will remove it and all its (now unneeded) dependencies. As the result, I never need to worry about bookkeeping, since composer does it for me, I just run `composer require <stuff>` and it does the rest (except for cases when <stuff> is a proprietary repo on company's gitlab and such, then I'll need slightly more manual work).
That is, what I hope to see in Python one day (10 years later than every other lang did it) is declarative package management, except I don't want to have to modify pyproject.toml manually; I want my package manager to do it for me, because it saves me 30 seconds of my life every time I install something. Which accumulates to a lot.
Every time I have to reorganize or upgrade my AI repos, it's yet another 50GB of writes to my poor SSD. Half of it is torch, the other half is auto-downloaded models that I cannot stop, because then they become "downloaded" and you never know how to resume them or even find where they are, since Python logging culture is just barbaric.
I have used virtualenvwrapper before and it was very convenient to have all virtual environments stored in one place, like ~/.cache/virtualenvs.
The .venv in the project directory is annoying because when you copy folder somewhere you start copying gigabytes of junk. Some tools like rsync can't handle CACHEDIR.TAG (but you can use --exclude .venv)
NeutralCrane has a really helpful comment below[0], would love to have a more thorough post on everything!
Note, I use PyPI for most of my day-to-day work, so I say this with love!
The author says that a normal route would be:
- Take the proper route:
- Create a virtual environment
- pip install pandas
- Activate the virtual environment
- Run python
Basically, out of the box, when you create a virtual environment it is immediately activated. And you would obviously need to have it activated before doing a pip install...

In addition, in my opinion, this is the thing that would suck about uv: having different functions tied to a single tool execution.

It is a breeze to be able to activate a venv and be done with it: being able to run your program multiple times in one go, even with crashes, being able to install more dependencies, test it in the REPL, ...
I disagree, though, that it is activated immediately; at least for me, with venv I always have to activate it explicitly.
It can be used effectively, but does not make it easy to do so.
uv makes itself harder to misuse
I do hope the situation changes one day. Python packaging is such a mess, but Poetry is good enough and actually works, so I'll stick with it for now.
Usually, you can directly provide a pre-built wheel, and then an installer like Pip or uv can just unpack it into the environment. If it needs to be built on the user's machine, then you offer an sdist, which specifies its build backend. The installer acts as a build frontend: it downloads and sets up the specified backend, asks it to make a wheel from the sdist, then installs the wheel.
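A rough sketch of that flow from the command line, assuming the `build` package is installed and a hypothetical project named mypkg:

    python -m build                                  # build frontend: writes dist/mypkg-1.0.tar.gz (sdist) and dist/mypkg-1.0-py3-none-any.whl
    pip install dist/mypkg-1.0-py3-none-any.whl      # installing a wheel is just unpacking it into the environment
    pip install dist/mypkg-1.0.tar.gz                # installing an sdist makes the installer fetch the declared backend and build a wheel first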
Poetry's build backend (`poetry.masonry`) doesn't build your external dependencies unless a) you obtain an sdist and b) the sdist says to use that backend. And in these cases, it doesn't matter what tools you're using. Your installer (which could be Pip, which is not a package manager in any meaningful sense) can work with `poetry.masonry` just fine.
If you can give a much more specific, simple, reproducible example of a problem you encountered with external dependencies and uv, I'll be happy to try to help.
> Package managers are for keeping track of which pieces of code you need in your project's environment. Build systems are for... building the code, so that it can actually be used in an environment.
This probably means something to the developers of the package managers and build systems, but to me, as a Python developer who wants to be able to publish a pure Python CLI program to PyPI, it seems like a distinction without a difference.
Personally I consider this one of uv's greatest strengths. The inflexibility and brittleness of Poetry's build system is what made me give up on poetry entirely. Had poetry made it easy to plug in a different build system I might never have tried uv.
Can uv work with that?
Edit: might be possible now? https://simonwillison.net/2024/Dec/19/one-shot-python-tools/
https://packaging.python.org/en/latest/specifications/inline...
Especially useful if the script has dependencies on packages in private repos.
Has been super useful to write single-file tools/scripts which LLMs can understand and run easily.
Use Nix for Python version as well as other bin deps, and virtualenv + pip-tools for correct package dependency resolution.
Waiting 4s for pip-tools instead of 1ms for uv doesn't change much if you only run it once a month.
I wish there was a way to either shebang something like this or build a wheel that has the full venv inside.
uv has support for it: https://docs.astral.sh/uv/guides/scripts/#running-a-script-w... (which only helps if your team is all in on uv, but maybe they are)
https://discuss.python.org/t/pep-722-723-decision/36763 contains the reasoning for accepting 723 and rejecting 722.
> A user facing CLI that is capable of executing scripts. If we take Hatch as an example, the interface would be simply hatch run /path/to/script.py [args] and Hatch will manage the environment for that script. Such tools could be used as shebang lines on non-Windows systems e.g. #!/usr/bin/env hatch run
https://micro.webology.dev/2024/08/21/uv-updates-and.html shows an example with uv:
> With this new feature, I can now instruct users to run uv run main.py without explaining what a venv or virtualenv is, plus a long list of requirements that need to be passed to pip install.
That ends:
> PEP 723 also opens the door to turning a one-file Python script into a runnable Docker image that doesn’t even need Python on the machine or opens the door for Beeware and Briefcase to build standalone apps.
Pyenv + poetry already gives you ability to "pull in local dependencies". Yes, you have to create a virtual environment and it's not "ad-hoc".
But if you're going to pull in a bunch of libraries, WHY would you want to invoke python and all your work dependencies on a one liner? Isn't it much better and easier to just spell-out the dependencies in a pyproject.toml? How "ad-hoc" are we talking here?
Previously I could allocate a whole week to setup initial scaffold for the project. Also more tools - more failure points, so I can flex on stupid juniors how smart I am. Now I can’t even go to pee with how fast and easy this freaking uv is. WTF.
I do like the idea of getting rid of pyenv though. And since poetry has failed to become as widespread as I hoped, maybe uv has a better shot?
So I can just do uv {package} for a quick and dirty global install. I'm so used to pip install being global by default just making this shorthand makes things a bit easier.
I would highly recommend only using --system in a Docker container or similar.
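For example, in a container build step (a sketch; requirements.txt stands in for whatever your project uses):

    # inside the image there's no venv to protect, so install straight into the image's Python
    uv pip install --system -r requirements.txt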
uv run -q --with pandas==2.1.4 python -c "import pandas; print(pandas.__version__)"
2.1.4
Examples include quant libraries, in-house APIs/tools, etc.
Conda distributes general binary packages, not just Python packages; uv handles only Python packages.
When we talk about local dependencies in the Python packaging ecosystem, it's usually adding some package on your file system to your environment. The existing title made me think this would be about the `[tool.uv.sources]` feature.
Really, it's about how we create environments on-demand and make it trivial to add packages to your environment or try other Python versions without mutating state.
not with this attitude of getting scared of things by watching someone doing something, for sure
aka how to say that you've never really tried learning Nix without saying it directly.
I use Windows, and not WSL. Nix does literally nothing for me.