It is entirely possible to use poetry to determine the precise set of packages to install, write them out as a requirements.txt, and then shotgun-install those packages in parallel. I used a stupidly simple fish-shell for loop that ran each requirements line as a pip install with an "&" to background the job, followed by a "wait" after the loop (iirc). You could use xargs or GNU parallel too.
This works, at least for me. Maybe it breaks in some circumstances, but I haven't hit them.
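The loop described above was in fish, but the same idea is easy to sketch in plain bash. Everything here is illustrative: the two package pins are placeholders, and `echo` is prefixed to `pip install` so the script prints the commands instead of running them (drop the `echo` to install for real).

```shell
# Write an illustrative requirements.txt (placeholder pins, not real advice).
printf '%s\n' 'requests==2.31.0' 'flask==3.0.0' > requirements.txt

# Background one install per requirements line, then wait for all of them.
# `echo` makes this a dry run; remove it to actually invoke pip.
while read -r req; do
    echo pip install "$req" &
done < requirements.txt
wait
```

The xargs variant of the same idea would be something like `xargs -n1 -P8 pip install < requirements.txt`, which caps concurrency at 8 jobs instead of backgrounding every line at once.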
Not as an excuse for bad behavior, but to consider the infrastructure and expectations:
The packages might be cached locally.
There might be many servers – a CDN and/or mirrors.
Each server might have connection limits.
(The machine downloading the packages miiiiiight be able to serve as a mirror for others.)
If these hold, then it's altruistically self-interested for everyone that the downloader gets all the packages as quickly as possible, so they can get on with their work.
I don't know if they all hold. I'd hope that local caching, CDNs, mirrors, and reasonable connection limits were a self-evident, minimal requirement for package distribution in an ecosystem as arguably nation-sized as Python's.
And… just… everywhere, really.
We've used various combinations of pipx+lockfiles or poetry, which has been OK-ish so far. But we recently discovered uv and are wondering what experience the rest of the industry has had with it.
At the same time, poetry still uses its own custom configuration format and is pretty slow.