There is a cost to the mother - major surgery - for the benefit of the child. Modern society considers this a worthwhile trade, but not much more than a century ago it would often have meant the mother's death.
I have no real way to tie this back to programming, other than perhaps to note that we're likely still in the dark ages of programming: the programming equivalent of the cesarean section is likely still years away.
Look at HTTP, WebSockets, NodeJS, JS! Docker!
Churning out "apps" and delivering deliverables is the focus, not engineering. Ticking off boxes for enterprise is more important than product feel and user satisfaction. And so on.
And once you're hooked on Docker and Node.js (with TypeScript and VS Code), why would you go back to hacking Perl scripts that invoke long-forgotten binaries written in the darkest C dialects, darker than void* itself - even if that damn Docker image is 600 MB, the runtime memory cost is not much less, and it's all managed by Kubernetes, which needs gigabytes of RAM just to run an apiserver, a scheduler, and a kubelet (the per-node agent)?
One facet of the tie-in is the age-old question of what to optimize for: Programmer time? Execution speed? Code compactness? Code maintainability? Cf. The Story of Mel [0].
Besides elective C-sections, there is an increasing tendency to convert any labor that shows the slightest risk into a C-section.
But I think the part that rubbed me most the wrong way was the realization that it was the Apgar score (a rather crude system for summarizing neonatal health, IMO) that, in part, drove the rise in the C-section's popularity.
That's gross. It's like using `free` to measure RAM usage on your server, then deciding that every app using more than n needs tighter JPEG compression, degrading image quality. In other words: you measured the wrong thing, implemented the wrong fix, reduced the quality of the outcome, and then made all of the above standard practice.
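The `free` analogy holds up technically: a naive reading of `free` on Linux counts reclaimable page cache and buffers toward "used", so the number overstates real memory pressure. A minimal sketch (assuming a Linux box with `/proc/meminfo`) of reading the kernel's own `MemAvailable` estimate instead:

```shell
# Naive readings of `free` treat reclaimable page cache as "used" memory.
# MemAvailable in /proc/meminfo is the kernel's estimate of memory actually
# available to new workloads, with reclaimable cache accounted for.
awk '/^(MemTotal|MemAvailable):/ {print $1, int($2/1024), "MB"}' /proc/meminfo
```

Sizing or remediation decisions based on the naive "used" column are exactly the measured-the-wrong-thing failure described above.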