https://medium.com/@penguinpress/an-excerpt-from-how-not-to-...
There are also some true idiots in the Republican party that work similarly.
It was comforting
The rebuttals to his design are basically just "every room needs a window", without any real justification. Do you think he didn't think of that? That he just forgot to put windows in?
My dorm at college had windows, and they were entirely worthless. There were two of them, 2' x 6', frosted glass, and they could open about 3". I don't think they added much to the room.
Nah, looking at the floor plans, he was just optimizing for cost at the expense of user experience.
To rebut your anecdata with my own: My dorm rooms at college had large 5' x 4' windows that opened. The unlimited fresh air and view over the campus helped me keep my sanity while studying for a 4-year engineering degree. A windowless room would have driven me stark raving mad in the first year.
I think Munger should build a dorm room for himself that doesn't have windows and live in it for a few years before deciding that other people don't need them.
Whether he thought of it or not, bedrooms without windows are unsafe because in a fire you're trapped. Safety regulations are written in blood.
TBD, apparently, is an alternative source of ~$1B:
> Estimated Budget: The projected construction budget for Phases 1 and 2 of the Project is $600M – $750M.
Or am I just a hopelessly anxious person lol
This is the general basis for why I tend to pick tools & concepts that are at least a half-decade old. The space of unknown unknowns in something that has been around that long should be vanishingly small, especially if we are applying the tool or concept in a typical way.
So not just list everything that could go wrong, but maybe: what's a terrible day for your service/system that's most likely to happen? Cascading failures? Outage that makes accessing/recovering your system impossible? Backups unusable?
This can help get past some individual biases.
I think the simple idea of “risk analysis” is much more intuitive and better captures the idea that is being conveyed.
Where I think such a practice can be useful is in forcing you to confront unpleasant possibilities you would otherwise try to ignore, and thus at least briefly plan for them.
That is, failure isn't always caused by me doing something. Sometimes it's caused by something external. My wonderful new box might not ship if we can't get the chips to make it, so I should probably look at lining up a second source. (Yeah, a couple of years ago we saw that even that may not work. You can't prevent everything bad that can happen. You can prevent some of it, though, and that makes enough difference to be worth trying.)
Not only did I buy a Zune, I was an early enthusiastic customer of their streaming subscription service and thought that it would topple the iTunes sales model. We all know how well that went.
I see the value here being: when you can’t understand the causal relationships for success, use the causal relationships for failure that you can understand and then avoid specific failure modes.
“Friend gives generally bad advice” is not that kind of a clear causal relationship.
https://www.google.com/search?q=steve+jobs+ipod+water+air+bu...
Another is when Feynman dunked a piece of O-ring in a glass of ice water, showing that it lost its resilience in the cold.
In mathematical optimization theory, duality or the duality principle is the principle that optimization problems may be viewed from either of two perspectives, the primal problem or the dual problem. If the primal is a minimization problem then the dual is a maximization problem (and vice versa). Any feasible solution to the primal (minimization) problem is at least as large as any feasible solution to the dual (maximization) problem.
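To make that concrete, here is the standard textbook linear-programming primal/dual pair and the weak-duality chain behind the "at least as large" claim (this is the generic formulation, not anything specific to the thread above):

```latex
% Standard LP primal/dual pair (textbook form):
\begin{align*}
  \text{Primal:}\quad & \min_{x}\; c^{\top} x
      && \text{s.t. } A x \ge b,\; x \ge 0 \\
  \text{Dual:}\quad  & \max_{y}\; b^{\top} y
      && \text{s.t. } A^{\top} y \le c,\; y \ge 0
\end{align*}
% Weak duality: for any feasible x and y,
%   b^{\top} y \le (A x)^{\top} y = x^{\top} (A^{\top} y) \le c^{\top} x,
% so every feasible dual value lower-bounds every feasible primal value.
```

The two inequalities follow directly from feasibility: $Ax \ge b$ with $y \ge 0$ gives the first, and $A^{\top} y \le c$ with $x \ge 0$ gives the second.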
Here's where I'd share examples and it wouldn't be funny and instead start a flamewar.