His advocacy could be considered quite extreme. He apparently suggested (I don't know whether jokingly or not) that programmers not test their code, but instead write it out on paper with various kinds of mathematical correctness proofs, and only then type it into a computer. It's admittedly thought-provoking, but I have an uncomfortable feeling that some naive people might have followed his advice very literally, and off a cliff.
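For concreteness, the discipline he pushed looks roughly like this when transcribed into C: the loop invariant and the termination argument are written down with the code, and together they are the "paper proof". This is a generic textbook example of the style, not something from Dijkstra himself:

    #include <assert.h>

    /* Sum of a[0..n-1], annotated in the invariant/bound style.
     * Invariant P: s == a[0] + ... + a[i-1], with 0 <= i <= n.
     * Bound t: n - i, which decreases each iteration, so the loop terminates. */
    static long sum(const int *a, int n)
    {
        long s = 0;
        int i = 0;
        /* P holds initially: the empty sum is 0. */
        while (i < n) {
            s += a[i];   /* re-establishes P with i replaced by i + 1 */
            i += 1;
        }
        /* P together with i == n gives s == a[0] + ... + a[n-1]. */
        return s;
    }

    int main(void)
    {
        int a[] = {1, 2, 3, 4};
        assert(sum(a, 4) == 10);
        return 0;
    }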
Generally, I think people should read writing from decades ago in its proper context.
I am reminded of his "Go To Statement Considered Harmful". It was radical advice in its day, and not many languages designed immediately afterward actually dropped the construct. It wasn't always obvious at the time how to transform some control structures efficiently: the designers of UNIX were fully convinced by structured programming, for example, yet C still kept goto. Even now goto appears in drivers and elsewhere in the Linux kernel, mostly for error-handling cleanup (sketched below), and people still debate whether Rust should have it. Despite all this, everyone accepts the basic idea these days.
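For anyone who hasn't seen why goto survives in C, the kernel's error-unwinding idiom is the canonical example: resources acquired in sequence are released in reverse order through a chain of labels. Here is a minimal stand-alone sketch; the function name, path, and buffer size are made up for illustration, and this is the general pattern rather than actual kernel code:

    #include <stdio.h>
    #include <stdlib.h>

    /* Kernel-style cleanup: each failure jumps to the label that unwinds
     * exactly what has been acquired so far, in reverse order. */
    static int setup(const char *path, size_t bufsize)
    {
        char *buf;
        FILE *f;
        int err;

        buf = malloc(bufsize);
        if (!buf) {
            err = -1;
            goto out;        /* nothing acquired yet */
        }

        f = fopen(path, "r");
        if (!f) {
            err = -1;
            goto out_free;   /* undo only the malloc */
        }

        /* ... real work with buf and f would go here ... */
        err = 0;

        fclose(f);
    out_free:
        free(buf);
    out:
        return err;
    }

    int main(void)
    {
        return setup("/etc/hostname", 4096) ? EXIT_FAILURE : EXIT_SUCCESS;
    }

One exit path per resource beats the alternatives: either deeply nested if/else or duplicated cleanup code at every early return.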
Nowadays machines are so fast and stores are so huge that in a very true sense the computations we can evoke defy our imagination. Machine capacities now give us room galore for making a mess of it.
If only Dijkstra were around today... those thoughts were extremely prescient. I've seen far more overly complex and over-abstracted code than "too simple" code.
Though seriously, it's mind-blowing how convoluted modern desktop applications have sometimes become. It would be nice to have more KISS applications for daily use instead of fancy monstrosities.
Electronic engineering can contribute no more than the machinery, and the general-purpose computer is no more than a handy device for implementing any thinkable mechanism without changing a single wire. That being so, the key question is what mechanisms we can think of without getting lost in the complexities of our own making. Not getting lost in the complexities of our own making, and preferably reaching that goal by learning how to avoid the introduction of those complexities in the first place: that is the key challenge computing science has to meet. --- The important distinction between "Computing Science" and "Computer Science": the former is what we need to focus on.
Does this overestimation of the usefulness of the gadget hurt computing science? I fear it does. At the one end of the spectrum it discourages the computing scientist from conducting all sorts of notational experiments because "his word-processor won't allow them", at the other end of the spectrum the art-and-science of program design has been overshadowed by the problems of mechanizing program verification. The design of new formalisms, more effective because better geared to our manipulative needs, is neglected because the clumsiness of the current ones is the major motivation for the mechanization of their use. --- The need to focus on better formal notations that design correctness into programs in the first place.
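His answer to the clumsiness complaint was calculational notation: guarded commands plus the weakest-precondition transformer wp, with which one derives a program from its specification instead of verifying it afterwards. A standard one-line example of the notation, my illustration rather than anything from the quoted passage:

    % wp of an assignment is textual substitution: wp(x := E, R) = R[x := E].
    \[
      wp(x := x + y,\; x > 0) \;=\; (x > 0)[x := x + y] \;=\; x + y > 0
    \]
    % Read backwards: to end with x > 0 after executing x := x + y, it is
    % necessary and sufficient to start in a state where x + y > 0, so the
    % proof obligation is calculated while the statement is being written.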
Btw, does anyone know of a good, cheap source of fresh parchment? (:P)
Well, that passage was painfully prescient, even though the Challenger disaster was not the result of computer failure but of rubber O-rings that lost their resilience in the extreme cold. Still, the poor shuttle software and the primitive hardware on which it ran were consistent with the Rogers Commission's criticism of the shortcomings of NASA's organizational culture, with its acceptance of poor quality control, poor planning, and inadequate equipment design.
Aaronson seems happier or at least more composed now, though. A few months have gone by since that post.