I looked for a reference just now and couldn't find one. This mentions it:
> Dijkstra was famous for his general rejection of personal computers. Instead of typing papers out using a word processor, he printed everything in longhand.
https://www.mentalfloss.com/article/49520/retrobituaries-eds...
> That's like saying being an expert cook but not wanting to taste food.
Oh ho! Don't let him hear you say that, eh? You'd get a scolding. He's the one who said, "Computer science is no more about computers than astronomy is about telescopes" and "Calling it computer science is like calling surgery knife science."
The analogy would be more like "an expert chef who refused to eat frozen dinners" maybe? :)
In principle, the suitability function would be evaluated over the entire lattice; in practice, that function, whether explicitly or implicitly, includes a strong weight for "distance from existing solutions". In either case, this split in focus between the interior and the boundaries of the solution space means that programmers are often highly concerned with specific details that do not even appear (because they have been abstracted away) in the objects with which informaticians work.
As an example: theory people love to use 1-ary trees (induction steps cost nothing in proofs, but case analyses are expensive), and they will use 2-ary trees (sometimes even without pressure to sympathize with the machine); but systems people and programmers use k-ary trees, where k, if it has been determined by measurement rather than by compatibility, depends upon "the" bandwidth-delay product between the storage-hierarchy levels the tree is optimized for ... or at least upon what that bandwidth-delay product was at the time of writing.
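To make the k-ary point concrete, here is a minimal sketch (in Python, with illustrative numbers of my own, not measurements) of the back-of-the-envelope calculation a systems person might do: pick the fanout k so that one node of a B-tree-style search tree exactly fills one block of the slower storage level.

```python
# Sketch: choosing the fanout k of a k-ary search tree (a la B-trees)
# so that one node fills one block of the slower storage level.
# The block, key, and pointer sizes below are assumptions for illustration.

BLOCK_SIZE = 4096      # bytes per page/disk read (assumed)
KEY_SIZE = 8           # bytes per key (assumed)
POINTER_SIZE = 8       # bytes per child pointer (assumed)

def fanout(block_size: int, key_size: int, pointer_size: int) -> int:
    """A node with k children holds k pointers and k-1 keys, so we want
    the largest k with: k*pointer_size + (k-1)*key_size <= block_size."""
    return (block_size + key_size) // (pointer_size + key_size)

k = fanout(BLOCK_SIZE, KEY_SIZE, POINTER_SIZE)
print(k)  # the "k" a systems person might pick for this storage level
```

The point being: nothing in the theorist's tree has a slot for BLOCK_SIZE at all; it has been abstracted away before the object of study even exists.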
- - - -
FWIW, when I was poking around last night looking for references for the "forced to get a Mac" story I found this.
Tony Hoare:
> The first time I visited Edsger in Eindhoven was in the early Seventies. My purpose was to find out more about the THE operating system, [Dijkstra, May 1968.] which Edsger had designed. In the computing center at which the system was running I asked whether there was really no possibility of deadlock. "Let's see" was the answer. They then input a program with an infinite recursion. After a while, a request appeared at the operator's console for more storage to be allocated to the program, and this was granted. At the same time they put a circular paper tape loop into one of the tape readers, and this was immediately read into a buffer file by the spooling demon. After a while the reader stopped; but the operator typed a message forcing the spooler to continue reading. At the same time even more storage was allocated to the recursive program. After an interval in which the operator repeatedly forced further foolish storage allocations, the system finally ground to a complete halt, and a brief message explained that storage was exhausted and requested the operator to restart operations.
> So the answer was YES; the system did have a possibility of deadlock. But what interested me was that the restart message and the program that printed it were permanently resident in expensive core storage, so that it would be available even when the paging store and input/output utilities were inoperative. And secondly, that this was the very first time it had happened. I concluded that the THE operating system had been designed by a practical engineer of high genius. Having conducted the most fundamental and far-reaching research into deadlock and its avoidance, he nevertheless allocated scarce resources to ensure that if anything went wrong, it would be recognized and rectified. And finally, of course, nothing actually ever did go wrong, except as a demonstration to an inquisitive visitor.
There is a reason it is called COMPUTER science while astronomy is not "telescope science". The analogy is faulty as well:
>> "Computer science is no more about computers than astronomy is about telescopes":
You can't compare astronomy to CS--astronomy needs computers, telescopes, and spaceships--but in CS, the computer is the central figure. If not, can you think of ANY other tool that represents CS?
>> "Calling it computer science is like calling surgery knife science." is another faulty analogy. The knife does not have the same critical importance as both a means and an end in surgery that the computer has in CS; in other words, a knife is merely a means to an end in surgery, but in CS the computer is both the means and the end: in the latter case, the computer is a sort of representative of all the knowledge and practice in CS at a given time. The sophistication of the computer and what it can do represent, to a large extent, what we have achieved in CS--but you can't say the same about a knife vs. surgery. Capisce?
The domain of astronomy is the starry sky and the Universe it reveals. The domain of surgery is anatomy, physiology, metabolism. In Informatics (not everyone calls it "Computer science", eh?) the domain is formal systems.
In each case the instruments (telescope, scalpel, digital computer) are not the main focus of investigation; they are tools, not the domain of study.
> the computer is the central figure
This is precisely the misunderstanding that Dijkstra tilted against.
> can you think of ANY other tool that represents CS?
Yes. The human brain.
I'll leave you with another joke, one of my favorites, although I don't know who said it: "Computer science could be called the post-Turing decline in the study of formal systems."
Are you sure? If I were to pick an arbitrary computer scientist (even stipulating it won't be EWD himself, this would still be "demonic choice" from your point of view), are you prepared to argue that whoever I pick does not, or did not, do CS for the "mere" theory?
Exercise N: Was Euclid's GCD doing computer science?
Exercise S: Is watching TikTok doing computer science?
Hint: Gurfr dhrfgvbaf ner zrnag gb vyyhfgengr gur fhssvpvrapl naq/be arprffvgl bs pbzchgref gb qbvat pbzchgre fpvrapr.
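For Exercise N, the algorithm in question, for reference: Euclid's GCD (Elements, Book VII), stated some two millennia before any computer existed, here as a minimal Python sketch.

```python
# Euclid's algorithm for the greatest common divisor,
# devised long before any mechanical or electronic computer.
def gcd(a: int, b: int) -> int:
    """Repeatedly replace (a, b) with (b, a mod b) until b is 0."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # prints 21
```

Whether writing that down in 300 BC, or running it today, counts as "doing computer science" is exactly what the exercise asks.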
And historically, most of what we call computer science was developed before the advent of the computer: Turing, Church, Boole, Quine, Haskell Curry, Wittgenstein, Russell & Whitehead (Principia Mathematica), and so on. I could list names all day, none of whom used a mechanical computer.