I've seen too many cases where 10-20 lines of code avoided the need to pull in an external library with multiple dependencies of its own.
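For what it's worth, here's the kind of thing I mean: a sketch of a retry-with-backoff helper (my own illustration, with made-up names, not lifted from any particular library) that often removes the need for a dedicated retry dependency:

    # Hypothetical ~15-line retry helper standing in for a retry library.
    import random
    import time
    from typing import Callable, TypeVar

    T = TypeVar("T")

    def retry(fn: Callable[[], T], attempts: int = 5, base_delay: float = 0.5) -> T:
        """Call fn, retrying on any exception with jittered exponential backoff."""
        for attempt in range(attempts):
            try:
                return fn()
            except Exception:
                if attempt == attempts - 1:
                    raise  # out of attempts: surface the real error
                # Exponential backoff with jitter to avoid thundering herds.
                time.sleep(base_delay * (2 ** attempt) * (0.5 + random.random()))
        raise AssertionError("unreachable")

It won't cover every edge case a mature library does, but it's small enough to read in one sitting and owns zero transitive dependencies.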
Ironically, I also find that for anything not extremely mainstream, external libraries tend to appear more complete and functional than they actually are. I often end up having to fork and/or rewrite them for my use case.
Sure, some say everything should be fixed with PRs. But my technical goals and timeline constraints don't necessarily align with the maintainer's. So fork/rewrite it is!!
I do agree with the value of self-sufficiency; that is the start of durability. Most people find this revolting, though. The goal, for most people, isn't stuff that works properly. The goal is inclusion and comfort: a social baseline as opposed to a utility baseline.
That said, sometimes we need to take on the risk and effort of making a second system. I have often thought about the relearning, doomed-to-repeat-history problem, and I wonder if software - especially some open source software - might be uniquely positioned to build a second system because of its bug trackers.
The bug trackers of projects like Firefox effectively capture a large percentage of the project's history and design decisions. It seems to me that the bug tracker of a project's predecessor could lay the proper groundwork for its successor.
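As a rough sketch of what mining that history could look like (Python with the requests package; the endpoint and parameters follow Bugzilla's public REST API, and the field choices are just my guesses at what matters):

    # Pull resolved bugs from a predecessor's Bugzilla instance to study
    # the design decisions recorded there.
    import requests

    BUGZILLA = "https://bugzilla.mozilla.org/rest/bug"  # Firefox's tracker, as an example

    def fetch_design_history(product: str, limit: int = 50) -> list[dict]:
        """Fetch resolved bugs, keeping only fields useful as design notes."""
        params = {
            "product": product,
            "status": "RESOLVED",
            "include_fields": "id,summary,resolution,creation_time",
            "limit": limit,
        }
        resp = requests.get(BUGZILLA, params=params, timeout=30)
        resp.raise_for_status()
        return resp.json()["bugs"]

    for bug in fetch_design_history("Firefox", limit=10):
        print(f'{bug["id"]}: [{bug["resolution"]}] {bug["summary"]}')

A successor project could start from a corpus like that instead of rediscovering each decision the hard way.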
I'm willing to accept a little bloat and pass on reinventing wheels myself if I can grab something reliable off the shelf. I don't think that makes me less self-reliant.
We should begin collecting and centralizing the insights learned during software development, somewhere outside the source code of specific projects.
At some point we might be able to be confident that the current versions of all our dependencies have been carefully reviewed by enough reliable people, but right now we're not even moving in that direction. So minimizing dependencies is the proper thing to do.
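One small, concrete step in that direction: actually measure the review surface you're signing up for. A sketch using Python's standard importlib.metadata (the helper name and the regex-based name extraction are mine):

    # Count the transitive dependencies of an installed package: every entry
    # is more third-party code somebody would have to review.
    import re
    from importlib.metadata import PackageNotFoundError, requires

    def transitive_deps(package: str, seen: set[str] | None = None) -> set[str]:
        seen = set() if seen is None else seen
        for req in requires(package) or []:
            if "extra ==" in req:
                continue  # skip optional extras
            name = re.split(r"[ ;<>=!\[(]", req, maxsplit=1)[0]
            if name and name.lower() not in seen:
                seen.add(name.lower())
                try:
                    transitive_deps(name, seen)
                except PackageNotFoundError:
                    pass  # dependency not installed; can't recurse further
        return seen

    print(len(transitive_deps("requests")))

If that number surprises you, that's the point.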
Utterly bizarre rant.