That isn't to say that he did not have valid reasons, as a language expert and a mathematician, for his dislikes, but those reasons are not ones the bulk of practitioners would likely agree are valid.
And it's not like he was a lone crank railing against the 99% who were moving forward with a well-founded idea of how computing could scale indefinitely by composing incorrect programs. Just like today, 99% of programming, including programming for government programs and vital infrastructure, was done by people who were only hoping to make their own projects succeed well enough for the next six months to three years. I'm sure there were brilliant people who made intelligent arguments against the need for correctness, but their arguments didn't carry the day. Complacency and short-term thinking did.
In that context, Dijkstra's pessimism and his use of harsh, attention-getting language make a lot of sense. How many people at the time really understood that in 2020 every part of our civilization would depend on code compiled by compilers with bugs, linked with libraries with bugs, in virtual machines with bugs, on operating systems with bugs, on CPUs with bugs in their microcode, and yet it would still all mostly work?
That is clearly wrong if one is willing to take a moment to stop gazing at the wonder of pure mathematics and look at the outside world. There is no notion of "correctness" for the pyramids of Egypt, the dykes of the Netherlands, Milan Cathedral, or the world economy, and yet those huge-scale systems all function.
> Dijkstra's pessimism and his use of harsh, attention-getting language make a lot of sense.
It makes sense in terms of Dijkstra's personality, but it makes no logical sense. If you believe mathematical correctness is such a valuable principle, then you should be able to leverage that same principle in your own arguments. The fact that Dijkstra couldn't dispassionately prove that correctness was vital, and had to resort to emotionally loaded weasel words instead, undermines his own claim.
Of course there are critical software engineering projects as well; the Boeing 737 MAX comes to mind.
The BASIC he is describing had no call stack, nothing that would resemble a function in today's languages. FORTRAN at the time handed masses of state around in global variables, and the latest version had only just added subroutines and functions. This was after ALGOL 60 had established the model that we all take for granted today.
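To make concrete what that model buys you, here is a deliberately anachronistic sketch in Python (any ALGOL descendant would do): a routine with its own parameter, its own local state, and recursion, all of which presuppose a call stack.

    def factorial(n: int) -> int:
        # Each call gets a fresh stack frame holding its own copy of n.
        # Early BASIC's GOSUB saved only a return address, and every
        # variable was global, so direct recursion like this was not
        # expressible at all.
        return 1 if n <= 1 else n * factorial(n - 1)

    print(factorial(5))  # 120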
I think the bulk of practitioners today would absolutely agree with him.
On programming languages, I don't think there's any question he was right. Sometimes we agree with him without even realizing it. For example, the original argument of "Go To Statement Considered Harmful" was that GOTO statements decreased "linearity" by having the control flow jump to arbitrary places. The recent trend in mainstream PLs toward functional-style control structures (JS array methods, Java streams, etc.) is predicated on the exact same rationale.
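A rough Python sketch of that rationale (the `readings` data is made up for illustration): the index-and-jump style forces the reader to simulate the control flow, while the functional style reads top to bottom as one linear transformation.

    readings = [3.2, -1.0, 4.8, 0.0, 2.5]  # made-up sample data

    # Control-flow style: the reader has to track i, total, and the
    # early jump back to the loop head.
    total = 0.0
    i = 0
    while i < len(readings):
        if readings[i] < 0:
            i += 1
            continue
        total += readings[i] * 2
        i += 1

    # "Linear" style, the same rationale behind JS array methods and
    # Java streams: one pass, read as a single transformation.
    total2 = sum(r * 2 for r in readings if r >= 0)

    assert total == total2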
I'm also in broad agreement with his stance that natural language is just flat-out wrong in the context of computing.
But the man made an exorbitant number of claims, many of which are just provably wrong. The bits about BASIC and COBOL are particularly egregious examples of his at times cavalier attitude. After all, arrogance is measured in nano-Dijkstras.
> It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration.
is at best a hyperbolic joke. I dearly hope he wasn't being serious.
"The second major development on the software scene that I would like to mention is the birth of FORTRAN. At that time this was a project of great temerity and the people responsible for it deserve our great admiration. It would be absolutely unfair to blame them for shortcomings that only became apparent after a decade or so of extensive usage: groups with a successful look-ahead of ten years are quite rare! In retrospect we must rate FORTRAN as a successful coding technique, but with very few effective aids to conception, aids which are now so urgently needed that time has come to consider it out of date. The sooner we can forget that FORTRAN has ever existed, the better, for as a vehicle of thought it is no longer adequate: it wastes our brainpower, is too risky and therefore too expensive to use. FORTRAN’s tragic fate has been its wide acceptance, mentally chaining thousands and thousands of programmers to our past mistakes. I pray daily that more of my fellow-programmers may find the means of freeing themselves from the curse of compatibility."
To this day, if I saw 6502 assembly on a resume I'd be intrigued, and if I saw BASIC I'd view it as a yellow flag.
> Python —"the infantile disorder"—, by now nearly 30 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use.
> It is practically impossible to teach good programming to students that have had a prior exposure to JS: as potential programmers they are mentally mutilated beyond hope of regeneration.
> The use of Java cripples the mind; its teaching should, therefore, be regarded as a criminal offence.
The quotes about languages were always controversial, weren't they? But it's clear now in retrospect what Dijkstra was complaining about. He found FORTRAN to trick people into thinking that programming was merely about specifying arithmetic operations in a certain order, considered BASIC to require mental models which rendered folks memetically blind to actual machine behaviors, and thought COBOL tried to be legible to management but ended up being confusing to everybody.
> Many companies that have made themselves dependent on AWS-equipment (and in doing so have sold their soul to the devil) will collapse under the sheer weight of the unmastered complexity of their data processing systems.
Yep.
> In the good old days physicists repeated each other's experiments, just to be sure. Today they stick to Python, so that they can share each other's programs, bugs included.
Reproducibility is a real problem, and sharing code is just the first step. It's an embarrassment to physics and mathematics that we don't have a single holistic repository of algorithms, but have to rebuild everything from scratch every time. (Perlis would tell Dijkstra that this is an inevitable facet of computing, and Dijkstra would reply that Perlis is too accepting of humanity's tendency to avoid effort.)
> You would rather that I had not disturbed you by sending you this.
Heh, yeah, let's see what the comment section is like.
I think the second issue is that it is fairly safe to say the path from Python 2 to 3 was not well thought out and has been a disaster. A lot of people were burned by it, and it left a bad taste in a lot of people's mouths.
That being said, Python enjoys a huge userbase, so I would not worry about the hate; it's just not the language of choice for some people, and that is fine.
First, the excellent readability leads directly to hard-to-read code structures. This might seem paradoxical, but Dijkstra insisted that the same thing happened in FORTRAN, and I'm willing to defer to his instinct that there's something about the "shut up and calculate" approach physicists take that causes a predilection for both FORTRAN and Python.
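A contrived Python illustration of that paradox: every token below is friendly, yet the one-liner quietly does five things at once (flatten, filter, transform, deduplicate, sort), which is exactly the kind of code that writes easily and re-reads badly.

    data = [[1, -2, 3], [-4, 5], [6]]

    # One "readable" line, five operations deep.
    small = sorted({abs(x) for row in data for x in row if x % 2})

    # The unrolled version is wordier, but each step is inspectable.
    seen = set()
    for row in data:
        for x in row:
            if x % 2:  # keep odd values only
                seen.add(abs(x))
    assert small == sorted(seen)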
Second, Python 2 to Python 3 was horrible, and created political problems which hadn't existed before. Now, at the end of the transition, we can see how badly it was managed; Python 2 could have been retrofitted with nearly every breaking change, and it would have been lower-friction. Instead, there are now millions of lines of rotting Python 2 code which will never be updated again. Curiously, this happened in the FORTRAN world too; I wasn't around for it, but FORTRAN 77 was so popular compared to later revisions and standards that it fractured the FORTRAN community.
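For anyone who missed the transition, a few of the breaking changes in question; the snippet is Python 3, with the old Python 2 behavior noted in the comments.

    # print became a function; the statement form `print "hi"` is a
    # SyntaxError in Python 3.
    print("hi")

    # Integer division now yields a float; in Python 2, 1/2 was 0.
    assert 1 / 2 == 0.5
    assert 1 // 2 == 0  # the old truncating behavior needs //

    # Text and bytes were split into distinct types; Python 2's str
    # was a byte string.
    assert "héllo".encode("utf-8") != "héllo"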
That said, he did have relatively nice things to say about Haskell [2] and preferred Haskell to Java:
> Finally, in the specific comparison of Haskell versus Java, Haskell, though not perfect, is of a quality that is several orders of magnitude higher than Java, which is a mess (and needed an extensive advertizing campaign and aggressive salesmanship for its commercial acceptance).
I imagine that he would have liked something structured, equational, declarative, and modular; he would have wanted to treat programs as mathematical objects, as he says in [2]. Beyond that, though, we'll never know. He left some predictions like [3] but they are vague.
[0] https://en.wikipedia.org/wiki/ALGOL_60
[1] https://www.cs.utexas.edu/users/EWD/MCReps/MR35.PDF
[2] https://www.cs.utexas.edu/users/EWD/transcriptions/OtherDocs...
[3] https://www.cs.utexas.edu/users/EWD/transcriptions/EWD12xx/E...
The first languages and their compilers were strongly driven by hardware constraints, with a kilobyte of memory costing an arm and a leg.
Imagine storing all the function names in memory while compiling a program: it doesn't fit in 1 kB. Imagine holding the whole source code in memory for processing: it doesn't work when less than 1 MB of memory is available.
It seems ridiculous today, but these are the real reasons why things were made global back then, and why C and Pascal split code between a header (or interface) and a source file.
And, for all his complaining, I don't know of any language that he authored. He's sure good at telling everyone that they're doing it wrong, though...
I mean, if FORTRAN is a mess, then isn't ALGOL too? Is anyone here old enough to remember?
ALGOL 68, on the other hand... "The more I see of it, the more unhappy I become." [0]
[0] https://www.cs.utexas.edu/users/EWD/transcriptions/EWD02xx/E...
This is not the same as saying that EWD thought that ALGOL was good...
How they hurt in 2012: https://news.ycombinator.com/item?id=4926615
and in 2011: https://news.ycombinator.com/item?id=2279260
Smaller threads: