People are vaguely good and competent, so they leave systems in a locally optimal state.
In general, only changes that are "one step" away get considered, and when you're at a local optimum every one-step change makes things worse.
A multi-step solution will require a stop in a worse state on the way to a better one.
Monotonic-only improvement is the path to getting trapped. Take chances, make mistakes, and get messy.
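The trap can be sketched with a toy hill climb (the landscape and function names here are invented for illustration, not from the article): a greedy search that only considers one-step moves parks on the first peak it reaches, even when a taller one is nearby.

```python
def fitness(x: float) -> float:
    # Toy landscape: a small peak near x=1, a taller one near x=4.
    return max(0.0, 2 - (x - 1) ** 2) + max(0.0, 5 - (x - 4) ** 2)

def greedy_climb(x: float, step: float = 0.1) -> float:
    # Consider only "one step" moves; stop when neither direction helps.
    while True:
        best = max((x - step, x, x + step), key=fitness)
        if best == x:        # no single step improves things
            return x         # stuck on the small peak, for good
        x = best

print(round(greedy_climb(0.0), 1))  # settles near 1.0; never reaches 4.0
```

Starting left of the small peak, every path to the tall peak begins with a step that makes fitness worse, so monotonic-only search never takes it.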
Better for developers? Better for users?
Better for speed? Better for maintenance? Better license? Better software stack? Better telemetry? Better revenues through subscriptions?
At most, they might care about non-functional requirements (e.g. security and performance).
Evolution disagrees.
All major trophic-level breakthroughs are powered by evolving a reserve of efficiency in which multi-step searching can occur.
Multicellular life, collaboration between species, mutualism, social behavior, communication, society, civilization, language, and cognition are all breakthroughs that opened new feature spaces of exploration, and each required the systems involved to FIRST make non-locally-optimal transitions.
Trust is expensive and can only be bought in the presence of a surplus of utility over requirements.
> Evolution disagrees.
It’s not an either/or. Vast, modularized, localized improvement makes it possible to prune and select what does and doesn’t work.
Ah yes, monotonic-only improvement by way of making every small, messy mistake possible and still probably going extinct is definitely the way to go.
There's a certain dopamine hit you get for voluntarily trudging into the trough of despair, pushing the Sisyphean boulder of better design uphill in the optimistic belief that you can do better, and then actually arriving at something slightly better.
Way more fun than conceptualizing programming as duct-taping libraries, frameworks, and best practices together.
Only if new joiners didn’t feel like they have to “show up with something” that makes existing stuff obsolete.
Well, not blaming people or companies, just thinking out loud.
Only if companies treated workers with dignity and not like they’re disposable cogs.
Only if companies understood the value of standards that would prevent a new joiner from wreaking havoc.
https://en.wikipedia.org/wiki/Activation_energy
The ELI5 version is that atoms are all trying to find a comfy place to be. Typically, they make some friends and hang out together, which makes them very comfy, and we call the group of friend-atoms a molecule. Sometimes there are groups of friendly atoms that would be even comfier if they swapped a few friends around, but losing friends and making new friends can be scary and seem like it won't be comfy, so it takes a bit of a push to convince the atoms to do it. That push is precisely activation energy, and the rearrangement won't happen without it (modulo quantum tunneling but this is the ELI5 version.)
In the software world, everyone is trying to make "good" software. Just like atoms in molecules, our ideas and systems form bonds with other ideas and systems where those bonds seem beneficial. But sometimes we realize there are better arrangements that weren't obvious at the outset, so we have to break apart the groupings that formed originally. That act of breakage and reforming takes energy, and is messy, and is exactly what this author is writing about.
On one hand you have guys like the OpenBSD team, who work on Mostly Boring Things and make serious inroads at improving the software quality of the Mostly Boring Components that power the hidden bits of the Internet and go relatively unnoticed.
On the other hand, you have "improvements" from Apple and everyone else that involve an ever-changing shell game of moving around UI widgets perpetuated by UI designers on hallucinogens.
Are browsers like Chrome, which are elaborate ad-dispensing machines, really improvements over the browsers of yore? IE 4 may have sucked by modern standards, but it also didn't forward every URL I visited to Google.
I've been around since the beginnings of the WWW and it's reached the point where I am struggling to understand how to navigate these software "improvements". For the first time I have felt like my elderly parents using technology. I haven't gotten stupider; the software has become more difficult to use. It has now become some sort of abstract art rather than a tool for technologists.
Further down, he talks about changing the structure of the software in order to support planned features, etc.
So putting it all together, “better” == more featureful at lower cost with reduced marginal pain (to the developers) of further expansion.
I’d say “better” should mean enabling users to achieve their goals with minimal friction for the user (i.e., program p is designed to allow users to do a task (or set of tasks) t faster/better/more efficiently/whatever). But of course I would say that; I’m a user of software, not a developer of it.
Consider the notion of Mac-assed apps. They make life as a Mac user much nicer because they integrate so well with the environment and other native apps. But lo! Unto man was revealed his Lord and Savior, Electron. Much nicer for developers than having to port programs across several different native environments. So native goes the way of the dinosaur (with some exceptions, of course). That’s a massively canned just-so story, of course, so don’t take it too seriously as actual analysis.
But the moral of the story is that, as a user, I find it endlessly fascinating to watch developers talk about development and how much their focus tends toward making their own lives as developers easier, even at the cost of sacrificing users’ experiences and expectations as guiding principles.
Love him or hate him, it’s one of the things I appreciate Linus Torvalds for emphasizing occasionally: computers are tools people use to get things done (for whatever purposes, including recreation).
(That said: There is an irreducibly human element of play involved here for developers too. And even non-developers can be fascinated by computers in/for themselves, not just as sheer tools you’d ideally not even notice (in the Heideggerian sense of tools ready at hand versus present at hand). I’m one of those outsiders. No shame in it.)
That's it, I think. Then you recurse up into architecture.
Bad architecture is hard to follow (spaghetti code). Good architecture is easy to change.
Yes, this means you can have code that's neither bad (it's easy to read) nor good (but still hard to change). In the past I've called this "lasagna code" - the layers are super clear, and it's easy to follow how each connects, but they're far too integrated to allow for any changes.
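As a toy illustration (all names and data invented), lasagna code might look like this: three tidy layers that are easy to read top to bottom, but the concrete dict shapes thread through every layer, so changing the storage format means editing all three at once.

```python
# Storage layer: the concrete shape (dict of name -> list of amounts)
# is about to leak into everything below.
def load_orders() -> dict[str, list[float]]:
    return {"alice": [9.99, 4.50], "bob": [20.00]}

# Business layer: written against the storage layer's exact shape.
def order_totals(orders: dict[str, list[float]]) -> dict[str, float]:
    return {name: sum(amounts) for name, amounts in orders.items()}

# Presentation layer: written against the business layer's exact shape.
def render(totals: dict[str, float]) -> str:
    return "\n".join(f"{name}: {total:.2f}"
                     for name, total in sorted(totals.items()))

print(render(order_totals(load_orders())))
```

Each layer is trivially easy to follow, yet swapping the storage for, say, a database row type forces simultaneous edits in all three layers: clear lasagna, still rigid.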
It's harder to phrase on the level of "software", but maybe something like:
Bad software is hard to use. Good software does its job and then gets out of the way.
Ditto.
> I haven't gotten stupider; the software has become more difficult to use.
I can't speak for you, but I'm becoming less interested in the new shiny in a lot of things beyond UI widgets. There's a reason why we olds have a reputation for falling behind, and it's not because engineers and inventors explicitly make things that only young people can learn.
Are you sure? Replace "young" with "inexperienced" and that's exactly what I see in most new software: the focus is on the broadest userbase possible, which is entry-level products and UIs. Nobody's focusing on making expert tooling, everything is geared towards the lowest common denominator -- because supposedly that's where the money is.
Sometimes when reviewing people’s redesigns, I can’t see the beautiful thing that they’re envisioning, only the trough. And over the years I’ve noticed that a lot of redesigns never make it out of the trough. I like the idea of doing small things quickly, I think that’s good, but that’s also technical debt if the redesign never results in a benefit.
Prototyping and figuring out where the most friction is, then chipping away at it with each new feature that touches that area.
One of the cleverest things I figured out on my own, rather than stealing from others, was to draw the current architecture, the ideal one, and a compromise based on the limits of our resources and the consequences of earlier decisions. This is what we would implement if we had a magic wand; this is what we can implement right now.
It’s easier to figure out how to write the next steps without climbing into a local optimum if you know where the top of the mountain is. Nothing sucks like trying to fix old problems and painting yourself into new corners. If the original plan is flawed it’s better to fix it by moving closer to the ideal design than running in the opposite direction.
What usually happens is people present an ideal design, get dickered down by curmudgeons or reality, and start chopping up their proposal to fit the possible. Then the original plan exists only in their heads and nobody else can help along the way, or later on.
Distinguishing between the idea and the implementation is vital.
If the idea is good then a few rounds of review is all that's needed to shore it up. If the idea is bad, then there's more work to be done. Letting people know that you like the idea is key. There's also room for being okay with the implementation if it differs from how you'd do it.
I think the article could be a lot shorter and easier to understand if it simply said that the current design is in a local maximum, and you have to work your way incrementally out of the local maximum to reach a different local maximum. I think programmers would get that metaphor a lot more easily than the "buying widgets for a new factory" metaphor.
I do like how the article puts the spotlight on designing the process of change: picking the route, picking the size of the steps, and picking the right spot to announce as the goal. That gives me a lot of food for thought about the changes my team is contemplating right now.
Perhaps to phrase it even more simply:
To reach higher mountains, we need to climb down from our current peak and walk through valleys until we find a higher mountain to climb.
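One way to sketch that "climb down first" idea in code (a toy landscape with invented names, not anything from the article): let the search commit to a multi-step plan whose endpoint is better, even when every intermediate point along the way is worse.

```python
def fitness(x: float) -> float:
    # Toy landscape: a small peak near x=1, a taller one near x=4.
    return max(0.0, 2 - (x - 1) ** 2) + max(0.0, 5 - (x - 4) ** 2)

def lookahead_climb(x: float, step: float = 0.1, horizon: int = 12) -> float:
    # Evaluate endpoints up to `horizon` steps away in either direction.
    # A move is accepted if the ENDPOINT is better, even when every
    # intermediate point is worse: the search walks down into the
    # valley on purpose, which one-step greedy search never does.
    while True:
        candidates = [x + d * k * step
                      for k in range(horizon + 1) for d in (-1, 1)]
        best = max(candidates, key=fitness)
        if fitness(best) <= fitness(x):
            return x
        x = best

print(round(lookahead_climb(1.0), 1))  # crosses the valley, ends near 4.0
```

Started on the small peak at x=1, it descends through the valley because the twelve-step horizon can "see" the taller mountain on the other side.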
Why would the current design be at a local maximum in the first place?
The curve picture feels like a false idol: as soon as he starts doing TA on it, the carriage is well in front of the horse.
Microservices is just a buzzword for an overly prescriptive (and thankfully waning in popularity) type of distributed system. When you are developing a distributed system, the infrastructure is a primary consideration, potentially even more important than anything in the app layer.
I'd usually agree, especially as things get big.
But Kent is also pretty famous for throwing out code if things aren't shaping up. He does this in micro increments however, usually with just-written code.
I've just spent years wrestling with someone else's poorly written, ill-intentioned code, bringing it into line. I've taken the above approach of slowly reworking it. Sometimes I wonder if I just kept the tests and jettisoned large bits of it if I'd be better off?
Very contextual of course, but sometimes you have to explore a little bit to know the right places to make tradeoffs.
Then you read his latest book "Tidy First" and it tells you that when you move multiplying width and height out into an area function, you have made a beneficial design change in your system and in the relationship between caller and box, a "tiny step". And suddenly all the doubts wash away.
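For what it's worth, the kind of "tiny step" being described might look like this (`Box` and `shipping_cost` are invented stand-ins, not the book's actual example): the multiplication moves behind a name, so callers depend on the concept "area" rather than on the representation.

```python
class Box:
    def __init__(self, width: float, height: float):
        self.width = width
        self.height = height

    # The "tiny step": the width-times-height multiplication that used
    # to live at every call site now lives behind a name.
    def area(self) -> float:
        return self.width * self.height

# Before: cost = box.width * box.height * rate   (caller knows internals)
# After: the caller only knows that a box has an area.
def shipping_cost(box: Box, rate: float) -> float:
    return box.area() * rate

print(shipping_cost(Box(2.0, 3.0), 1.5))  # 9.0
```

Behavior is unchanged, but the caller/box relationship is looser: Box can later switch representation (say, store area directly) without touching any caller.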
Not sure what it is with this industry, but the writing is just useless.
Netscape pre-5 or 6 was a mess. It was a downloadable desktop application that kept getting pushed to deliver new features on a struggling UI framework. Additionally, I would imagine that the group delivering this was rather small relative to the size of the task. They didn't have CI/CD, git, etc. to give them feedback. This reeks of an overmanaged project that was intentionally underfunded.
Ultimately, it was an unmaintainable mess that required a rewrite to even continue. To me it sounds like it was tech debt piled deeper and higher.
What came of this? Complete browser rebuilds (Mozilla, Mosaic, Chrome, etc.), and finally this caught fire through the Chrome project and JavaScript acceleration at Google.
As for the Netscape anecdote, I wouldn't put too much weight on that part.
We do not know the extent of it, we do not know if it achieved its goals, and we absolutely cannot say whether or not the alleged rewrite contributed to or affected the evolution of the product into Firefox and eventually Chrome, etc.
Something I have noticed in this industry is that big companies think they can outsource their staffing issues and "save on labor". But in the end they pay more in management of outsourced assets, inevitable maintenance of poorly designed and implemented software, delays in delivery, and of course the churn and burn of hiring and firing contractors. Then they end up redoing everything with local talent, with an eighth of the team in half the time.
It only took 3-4 years to realize this, but this is what the "trough of despair" really looks like.
This also is why I do not believe LLMs pose as big a threat to software development as we're told. Maintenance will always require humans that can simultaneously comprehend the system as it is today and the system as it should be in the future.
Salary has long since been disconnected from skill, ever since cheap money flooded the industry and easier abstractions made it seem like "everyone can code". Perhaps "can fog a mirror" shouldn't be the only criterion for hiring programmers.
The art is to design things in such a way so that a minimum amount of time is spent in the trough.
You know when you get to the point where your data structures just make working on the code a breeze, when your library functions provide all the right low-level pieces to whip up new features quickly and easily, with names and functionality that actually fit the domain... Basically, when all the pieces 'gel' :-D
That for me is programming nirvana :-D
(Yes, there's a typo in the URL. It bugs me, too.)
prior discussion: https://news.ycombinator.com/item?id=30128627
* Replaced dusty old bugs with shiny new bugs.

It probably helps that I have 30+ years of experience and always pick architectures I have used before on successful projects.
Secondly: I think this may be reflective of someone who hasn't sat down and taken stock of the environment they're in. Creating a poor architecture or approach on the first go is usually a sign of dysfunction or inexperience.
Inexperience: It's more that the individual hasn't sat down, realized that the initial approaches are inappropriate, and designed first before pushing forward. Experience should mean fleshing out a lot of these details before coding anything and getting the protocols and conflicts resolved months before they happen. (This is where I see a Staff+ engineer being responsible and assisting in the development of the project.)
Dysfunctional environment: Our culture in software engineering has forgone formal design before creating a solution. Typically, most development is dictated by "creating a microservice" first and then trying to design as you go along. The code-first approach is so aggressive that many even forgo testing. Why does this exist? Partly the incentives from business/management to deliver fast, and distrust in the survivability of the product.
---
That being said: am I promoting a "perfect design" first (as I've been accused of doing)? No, iteration will happen. Requirements will change, but if you're applying strong tooling and good coding practices, rearranging your architecture shouldn't be as big of an issue as it currently is.
Not every new feature needs to go in an existing repository. Sometimes it makes perfect sense to implement the new functionality in a separate executable and artifact that doesn't carry along all the technical debt of the old project.
https://gavinhoward.com/2022/10/technical-debt-costs-more-th...
Not everyone can do this, however.
In a way, the quality of the design (and implementation) is fine, but the quantity of features is insufficient and limited by social issues.
It's basically the opposite problem of systemd.
iOS 6 = good
iOS 7 = bad
iOS 8 = better
And now: iOS 16 = good
iOS 17 = bad
iOS 18 = even worse (and ugly)
iOS 19 = hopefully good