pip install <my_entire_universe_of_requirements>
pip freeze > constraints.txt
And now in any new environment: pip install <any_subset_of_requirements> -c constraints.txt
Now you can install prod requirements, dev requirements, or whatever other combination of requirements you have, and you are guaranteed to get the exact same subset of packages, no matter what your transitive dependencies are doing. You can use pip-compile from pip-tools if you want the file to include exact hashes.
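To make the mechanics concrete, here is a sketch of the whole cycle. The file names (`requirements.txt`, `requirements-dev.txt`, `constraints.txt`) are assumptions; the point is that `-c` pins versions without installing anything by itself:

```shell
# Sketch of the constraints workflow; file names are illustrative.
# 1. In a throwaway venv, install every requirement set together so pip
#    resolves them as one consistent whole:
pip install -r requirements.txt -r requirements-dev.txt
# 2. Snapshot the fully resolved environment as constraints:
pip freeze > constraints.txt
# 3. In any new environment, install only the subset you need; the
#    constraints file pins whatever gets pulled in, transitives included:
pip install -r requirements.txt -c constraints.txt                          # prod
pip install -r requirements.txt -r requirements-dev.txt -c constraints.txt  # dev
```

Because a constraint only applies when a package is actually requested, prod and dev installs can differ in *which* packages they contain while still agreeing on every shared version.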
Also pip freeze does not emit a constraints file, it emits (mostly) a requirements file. This distinction is rarely important, but when it is, it can cause a lot of problems with this workflow. For example, a constraints file cannot include any information about which extras are installed, which pip freeze does by default. It also can't contain local or file dependencies, so if you have multiple projects that you're developing together it simply won't work. You also can't have installed the current project in editable mode if you want the simple "pip freeze" workflow to work correctly (although in practice that's not so difficult to work around).
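Two of those problem lines (editable installs and local/file dependencies) can be filtered out before saving the snapshot. A minimal sketch, with a hypothetical helper name `freeze2constraints`:

```shell
# Hypothetical post-processing for `pip freeze` output: drop the lines a
# constraints file cannot contain (editable installs and local file/URL
# dependencies), keeping only plain version pins.
freeze2constraints() {
  grep -v -e '^-e ' -e ' @ file://'
}

# Usage would normally be: pip freeze | freeze2constraints > constraints.txt
# Simulated freeze output for illustration:
printf '%s\n' \
  '-e git+https://example.com/proj.git#egg=proj' \
  'mypkg @ file:///home/me/src/mypkg' \
  'requests==2.31.0' | freeze2constraints
# -> requests==2.31.0
```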
Pip-tools does work a bit better, although the last time I used it, it resolved the dependency chains for production and for development in isolation, which meant it would install different versions of some packages in production than in development (which was one of the big problems I was trying to solve).
From my experience trying basically every single option in the packaging ecosystem, there aren't really any solutions here. Even Poetry, which is pretty much best-in-class for actually managing dependencies, struggles with workspace-like installations and more complicated build scripts. Which is why I think pretty much every project seems to have its own, subtly unique build/dependency system.
Compare and contrast this with, say, NPM or Cargo, which in 95% of cases just do exactly what you need them to do, correctly, safely, and without having to think about it at all.
They're constraints, not dependencies; they don't need to be installed, and you can just update your requirements as you need and regenerate them.
> Also pip freeze does not emit a constraints file, it emits (mostly) a requirements file. This distinction is rarely important, but when it is, it can cause a lot of problems with this workflow. For example, a constraints file cannot include any information about which extras are installed, which pip freeze does by default
pip freeze does not use extras notation; you just get the extras' packages listed as individual dependencies. Yes, there is an important distinction between constraints and requirements, but pip freeze uses an intersecting subset of the notation.
> You also can't have installed the current project in editable mode if you want the simple "pip freeze" workflow to work correctly
That's why the workflow I gave to generate the constraints didn't use the -e flag: you generate the constraints separately and can then install however you want, editable or not.
> From my experience trying basically every single option in the packaging ecosystem, there aren't really any solutions here. Even Poetry, which is pretty much best-in-class for actually managing dependencies, struggles with workspace-like installations and more complicated build scripts. Which is why I think pretty much every project seems to have its own, subtly unique build/dependency system.
People have subtly different use cases that make a big impact on what option is best for them. But I've never been able to fit Poetry into any of my use cases completely, whereas a small shell script to generate constraints automatically out of my requirements has worked exceedingly well for pretty much every use case I've encountered.
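A sketch of what such a regeneration script might look like, assuming `requirements.txt`/`requirements-dev.txt` file names and a `constraints.txt` output (all assumptions; the source only says "a small shell script"):

```shell
#!/bin/sh
# Sketch of a constraints-regeneration script: resolve every
# requirements file together in a throwaway venv, then snapshot the
# fully resolved set as constraints. File names are assumptions.
set -eu
tmp=$(mktemp -d)
trap 'rm -rf "$tmp"' EXIT

python3 -m venv "$tmp/venv"
"$tmp/venv/bin/pip" install -q -r requirements.txt -r requirements-dev.txt
"$tmp/venv/bin/pip" freeze > constraints.txt
```

Using a fresh venv each time ensures the constraints reflect a clean resolution of the requirements, not whatever happens to be lying around in your working environment.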
It's still not great, though, since that only pins version numbers, not hashes.
You probably don't want to manually maintain requirements.txt. Instead, list your project's immediate dependencies in the setup.cfg/setup.py file, install that in a venv, and then 'pip freeze' to get a requirements.txt file. To recreate this on a new system, create a venv there, and then 'pip install -c requirements.txt YOUR_PACKAGE'.
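A minimal sketch of that layout, with hypothetical package and dependency names. Only the direct dependencies live in setup.cfg; pip resolves everything else at install time:

```shell
# Hypothetical minimal setup.cfg: declare only direct dependencies here
# and let pip resolve the transitive set when you `pip install .`
cat > setup.cfg <<'EOF'
[metadata]
name = yourpackage

[options]
install_requires =
    requests>=2.28
EOF
# Then, in a fresh venv:
#   pip install .                      # resolves transitive deps
#   pip freeze > requirements.txt      # snapshot the full resolution
# And on a new system:
#   pip install -c requirements.txt yourpackage
```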
The whole thing is pretty finicky.
pip freeze > requirements.txt
if you want. The only catch is you should be using a similar architecture and Python version in both development and production. This would also pin a few non-project dependencies such as `setuptools`, but that shouldn't be a problem.
Edit: TIL that pip constraints is a thing. See the comment posted by oblvious-earth for an even better approach.