Thank you, but I read their comment carefully, and I'd like to let this person (Matthias247) speak for themselves. (I've asked mods to unflag my comment.) I hope they will respond.
To reply to the take on their comment that you've just written: I'm not talking about the decisions junior engineers make. I'm talking about the decisions made by senior architects, the people who whiteboard and diagram solutions as complicated as necessary (which is the correct approach). They are making the wrong decisions, based on the wrong trade-offs. They are not doing their job well.
The specific issues you have paraphrased could be solved in a different way (to quote what you just said: "something like a 'thread' leak" -- that has specific possible solutions). The point is that that way is not the way that has been chosen, because of bad decisions.
It's not that there are leaks or bugs (again, I'm not talking about the work of junior engineers). It's that the chosen algorithm, even correctly implemented, embodies the wrong choices.
Let me give you an analogy: quicksort is a very good sorting algorithm, commonly used, with excellent theoretical properties.
In the naive implementation (pivot on the first element), the worst case happens when the array is already sorted or nearly sorted. [1] As a practical matter, sorting is often applied to data that might already be sorted or nearly so.
So it's not that the other cases don't need to be taken into consideration; after all, even bubble sort runs optimally on an already-sorted list.
It's that it's wrong to code quicksort by making choices that ignore the most common case. Anyone using the naive quicksort implementation I mentioned on data that is frequently already sorted, or nearly so, is not doing their job well.
In the case of the network logic, the Wikipedia article I linked shows that it is not even theoretically correct under all network conditions. So it's worse than a naive quicksort: it's broken for the most common case, and not theoretically correct (because that's not possible) for every case.
They simply need to wake up and change their trade-offs and priorities. In the quicksort analogy: randomizing the pivot choice (or shuffling the input first) adds steps, yet it fixes the most common condition, sorting an already-sorted or nearly-sorted array. Apply that analogy and, yes, by God, code (and more importantly, architect) for the common case!
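To make the analogy concrete, here is a minimal Python sketch (purely illustrative, not anyone's production code) contrasting the naive first-element pivot with a randomized pivot. The naive version degenerates to quadratic work and O(n) recursion depth on already-sorted input; the randomized version makes that worst case vanishingly unlikely:

```python
import random

def naive_quicksort(a):
    # Naive pivot choice: always the first element.
    # On an already-sorted array every partition is maximally
    # unbalanced, so work is O(n^2) and recursion depth is O(n).
    if len(a) <= 1:
        return a
    pivot, rest = a[0], a[1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return naive_quicksort(left) + [pivot] + naive_quicksort(right)

def randomized_quicksort(a):
    # Random pivot choice: a few extra steps per call, but the
    # sorted-input worst case becomes astronomically unlikely;
    # expected cost is O(n log n) for *any* input order.
    if len(a) <= 1:
        return a
    i = random.randrange(len(a))
    pivot = a[i]
    rest = a[:i] + a[i + 1:]
    left = [x for x in rest if x < pivot]
    right = [x for x in rest if x >= pivot]
    return randomized_quicksort(left) + [pivot] + randomized_quicksort(right)

# Already-sorted input: the common case the naive version handles worst.
# (Calling naive_quicksort on this would recurse ~1000 frames deep,
# right at Python's default recursion limit.)
data = list(range(1000))
print(randomized_quicksort(data) == data)
```

The point of the sketch is the trade-off in the comment above: the randomized version is strictly more work per call, and that extra work is exactly what buys good behavior on the most common input.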
[1] http://www.geeksforgeeks.org/when-does-the-worst-case-of-qui...