My worst experiences have universally been Python projects. I don't think I've had a single time where I cloned a Python project and had it just work.
Beyond the code itself, I've had lots of mixed experiences with CI/CD. I unfortunately don't think I've been in a single shop where deployments or CI were a good experience. They often feel fragile, undocumented, and hard for newcomers.
I will never recommend Python outside of a small team. It is organizational molasses. My current company has multiple teams striving to keep our Python tech stack serving our growing technical and organizational scale.
I have fixed this at two companies, in no small part by migrating to Go. I am on my third.
Ever try conda though? I’ve had moderate success with pipenv, but tbh I don’t love it as it hides too many things when installing a package fails.
I'm curious whether you can spot a pattern in the platform (win/osx/linux) or the type of project, or is it all over the place?
My own experience with Python boils down to creating a virtualenv, installing the deps, setting up configuration (or just copying it from somewhere) and creating a database, and I'm off to the races. The only exception in recent memory was when a project had two dozen microservices, half of the codebase was on private package repository, and we used Poetry. The combo required somewhat more involved setup. That said, IIRC all the projects had fully pinned package versions (package==x.y.z).
In contrast, every time I touch something in JS land I get the same experience you described for Python. On one project we literally copied node_modules across machines (including servers), because a full reinstall took an unbounded amount of time. Anecdotally, the amount of churn in JS is much higher, and the maintenance load increases proportionally.
Usually it's something like:
- have a project in JS with some dependency X that's no longer on the bleeding edge, but works nicely
- want to depend on a new package Y for some new feature
- the new package Y depends on a library Z that's higher than what the other dependency (X) can work with
- try to update the original dependency (X)
- wailing, gnashing of teeth, and considering the switch to agriculture instead
In my experience, if you're not closely tracking the bleeding edge, upgrading packages and updating your code accordingly, your JS developer experience will be abysmal.
Agree on the CD part, especially the fragility and the extra manual work when the deploy is some manually driven (semi-)automated process.
So there is virtualenv, built in, but... if there is a venv directory, Python doesn't just use it.
Like, you have app.py, and you run python app.py, but that doesn't run it with the venv's python. This leads to all sorts of problems with scripts that assume they're running under the venv. Which means you probably want a wrapper script that sources the venv so you don't forget, but if you place it in the same directory you may forget you need to call the script instead. So you probably want an extra directory hiding all the python code, so that the only thing you see is the shell script that sets up the environment properly and runs it. Or just use an IDE.
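One way to make such a script fail fast, instead of silently running under the wrong interpreter, is to compare sys.prefix with sys.base_prefix (a minimal sketch; the running_in_venv helper name is my own, not stdlib):

```python
import sys

def running_in_venv() -> bool:
    # Inside a virtualenv, sys.prefix points at the venv directory,
    # while sys.base_prefix still points at the base installation.
    # Outside any venv, the two are equal.
    return sys.prefix != sys.base_prefix

if not running_in_venv():
    print("warning: not running under a virtualenv", file=sys.stderr)
```

Dropping a check like this at the top of app.py at least turns the silent failure mode into a loud one.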
Just "pip install." But pip isn't installed and ensurepip doesn't work? What do I even do then?
I recall downloading a project that required a library that wasn't available for the newest version of python, so when you tried to install the requirements pip wouldn't find it. I discovered this, naturally, because I updated my operating system so the python version changed which means the project that used to work stopped working! What is the solution for installing multiple python versions side by side? Hint: it's not an official project by the Python organization but something you can find on github.
- pyenv for installing multiple versions of python on my machine
- direnv for managing environments (env variables, python version, and virtual environment)
- pip for installing dependencies (pinning versions and only referencing primary packages in requirements.txt - none of their dependencies)
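Following that convention, a requirements.txt stays short and readable because it only names what the project uses directly (the packages and versions below are purely illustrative):

```text
# requirements.txt -- pin only direct dependencies, fully versioned
django==4.2.16
requests==2.32.3
celery==5.3.6
```

Transitive dependencies get resolved at install time, which keeps the file honest about what the code actually imports.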
This makes everything extremely easy to work with. When I cd into a project directory direnv loads everything necessary for that environment.
Each project directory has a .env and a .envrc file. The .envrc looks something like this:
layout python ~/.pyenv/versions/3.11.0/bin/python3
dotenv .env
Absolutely no headaches working on dozens of local python projects.

Do you mind sharing why you think this happens? Although I never worked professionally with Python, this sentiment matches my experience as a user, so I don't have a lot of context for why this is the case.
Some siblings in this thread provided explanations that mostly boil down to 'bad tooling' in one form or another. But this doesn't feel right.
In my opinion, if it were just bad tooling, this problem would have been solved by now.
1. Extremely difficult to set up the code base, because of dependency spaghetti
2. Lots of breaking changes across different libraries, making maintenance not so easy.
The easiest projects to maintain were written in Go, Java, and Ruby.
A new dev could get up and running quickly with "install vagrant; vagrant up", but that was hiding a lot of complexity behind a very leaky abstraction.
I got a new Chromebook from work, and had VSCode+Docker running an existing Postgres+Django+etc dev environment in literally 15 minutes. I was shocked. Devcontainers are magic, and poor Python DX is a skill issue.
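For what it's worth, the devcontainer config behind that kind of setup can be tiny (the file layout, service names, and command below are illustrative, not the parent commenter's actual setup):

```json
// .devcontainer/devcontainer.json (JSONC, so comments are allowed)
{
  "name": "django-dev",
  "dockerComposeFile": "docker-compose.yml",
  "service": "app",
  "workspaceFolder": "/workspace",
  "postCreateCommand": "pip install -r requirements.txt"
}
```

The compose file holds the Postgres and other service containers; VSCode reads this and attaches the editor to the app container.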
Oh yes, the language whose ecosystem only hears about backwards compatibility in their own death marches? Not their problem. It's the developers, it's _their_ problem.
Not the standard library, which _removes_ packages, breaking code I recently cloned. See "imp".
And not the next python version, which throws a syntax error on bare excepts, breaking old code for absolutely zero benefit beyond pretending to be a linter.
Something this old shouldn't have this property. Nothing "modern" even comes close. Look at the top languages, Python, JavaScript, and Java, and you don't even have to think too hard about how abysmal these languages are in this regard.
It's not an accident -- reading through the emacs-devel mailing list, it's easy to see how much effort the maintainers pour into backward compatibility. It's one of Emacs' unspoken guiding principles[1].
At the same time, it's not that surprising either. Emacs does not have other objectives that more modern languages/ecosystems do: no revenue or growth targets, corporations or VCs breathing down its neck, or a mandate to be "modern". Its most vocal and experienced users, who are also its volunteer maintainers, decide what its priorities should be. Since they've been using it for decades, backward compatibility is high on the list.
[1]: Its "spoken" guiding principles being to further the goals of the GNU project.
Fast setup and revision are important but an incomplete list of maintenance tasks; are metrics/logs predictably named and accessed? Can you perform manual experimentation without a hard-to-configure client (i.e., hit the server with a browser or run a CLI)?
Also, "cycle time" or "revision time" are so important, but I haven't found a good way to do that with AI model development :( any tips here?
Easy to maintain is not only about keeping something alive with minimal effort over longer periods of time. It also plays a pivotal role for scalability in any direction. Adding more engineers/teams, adding more unforeseeable features, iterating quickly in general, surviving more traffic/load, removing technical bottlenecks, ... everything is so much easier when the project is easy to work with and maintainable.
In ruby, for example, I can pretty trivially clone any open source gem and run the specs in < 5 minutes. Patching something and opening a PR in under an hour is definitely standard.
On the other hand, getting a development environment running for a company's proprietary web app is often a hassle. Mostly though this isn't because of the language or dependencies, it's because of:
- Getting all the dependent services up and running (postgres version X, redis Y, whatever else) with appropriate seed data.
- Getting access to development secrets
My company (infield.ai) upgrades legacy apps, so we deal with setting up a lot of these. We run them in individual siloed remote developer environments using devcontainers. It works OK once we've configured the service containers.

My initial reaction was that it was a list of fairly complex things, but they are not necessarily complex to implement, even if people commonly over-complicate a couple of them or make them a pain for other developers to set up, which seems to be part of the point.