But these days folks are mostly used to C-style syntax. And I'm not even arguing that it is a better language than C or others. But the whole industry has overall gone into believing that anything newly 'invented' is good and anything that's been around a while is passé. Ironically, this is happening even as the core technologies we use are based on decades-old tech like Unix, relational databases, TCP/IP, etc. And many others, like Lisp and Smalltalk, fell by the wayside at least partly due to performance issues that Moore's law made irrelevant long ago.
Oh humans... :)
Btw, Logo is another one that's underappreciated. Seymour Papert was brilliant in making programming more visual and intuitive for kids. And I didn't know until recently that it's actually a Lisp-based language with a lot of power. Who knew?
In some parallel universe, I'd love to see folks like those, along with many others from that era, as the ones we heap recognition on instead of our worship of current tech billionaires. Those guys generally understood the hardware, software, and core theory. Given the mess that is computing and the internet, it's a shame that we'll be losing them over the next few decades.
There are numerous languages today, including Haskell and OCaml, that are far more removed from the Algol lineage than these two. Heck, the differences between Rust and C are probably more pronounced than between C and Pascal.
C, on the other hand, has needlessly complicated syntax; a function definition is hard to detect, and a pointer to a function is hard to interpret, because it's literally convoluted: https://c-faq.com/decl/spiral.anderson.html
Sadly, this is a general stylistic difference: where Pascal tries to go for clarity, C makes do with cleverness, which is more error-prone.
Overall we're still stuck in a bit of a near-monoculture of JS(TS) and Python, but it's a far cry from back in the day when there was very little openness beyond the sole blessed corporate stack (typically Java or C#/CLR). I think we can only handle so many mainstream languages, but I do love all the experimentation and openness going on.
The problem with Pascal syntax is that it prevents the adoption of certain constructs, or makes them just not nice. A few examples:
- lambda expression: `begin` ... `end`, say goodbye to nice one liners;
- compound assignment: FPC has `+=` and `-=`, but obviously not `mod=`, `and=`, etc;
On top of that there are other things like
- shorthand boolean evaluation (e.g. `if someInt` as `if someInt <> 0`) is not possible, because `and` is a two-headed creature (both logical and bitwise);
- locals are still not default initialized
I used to like Pascal (actually Delphi, then ObjFPC) a lot, but nowadays I think the only good parts are certain semantics, e.g. no fallthrough in the `case`...`of` construct, and manual memory management BUT ref-counted arrays.
I would lean more toward a C-like syntax with certain semantics coming from the Pascal world.
So, much of what you are complaining about is largely pointless syntactic-sugar issues, like people complaining about the difficulty of typing "begin" vs "{" when any modern editor can autocomplete it; and never mind that the difficult parts of programming are rarely limited by how fast one can type 5 characters vs 1. I might even go so far as to say that slowing down a bit probably increases code quality.
(PS: I've programmed professionally in pretty much every mainstream language and quite a number that aren't mainstream. IMHO Object Pascal strikes a far better balance of performant code, ease of development and maintenance, and developer safety than most of the languages in modern use, maybe all of them. It's frankly a shame that more places don't take it seriously and would rather invent yet another poor, half-baked language whose flaws take another few thousand man-years of effort from compiler writers and users to overcome as they are discovered.)
If there's one thing I would eliminate from programming, despite its benefits, it's the one-liner lambda expression. It has turned clean, readable Python code into muddy statements I have to pause and mentally compile to understand.
I am not a fan.
I read it as "C with two pulses"... Which is how I feel about C++ - unnecessarily complicated.
Mostly, but I'm told the new Austral[1] language has syntax very similar to Pascal's.
I think this is partially accepted to keep wages down. New languages put fresh developers on a level playing field with more senior developers: both have, say, 2 years of experience in the new language. Fresh developers are cheaper and therefore push down wages.
Also, at better places the language itself is a tool - someone will be a senior in any other language as well.
Go has plenty of weaknesses versus Pascal, but two commonalities of the languages are lightning fast compile times and a pretty good experience for modelling data structures. Pascal is undoubtedly lower level and does not guarantee memory safety, whereas Go does but its GC is often less efficient and more memory-heavy than manual allocation.
Blow for blow, though, I'd say the largest weak point for Pascal is its somewhat archaic syntax, and for Go, honestly, the concurrency model. (Channels are nice, until they are not. I feel it's easier, though not necessarily easy, to write correct programs using mutexes than Go channels in many cases. This is weird, because nothing has changed about the old shared-memory-with-locks model since it was the source of so many problems; yet programmers, computers, and toolchains have changed a lot. Rust with locks is a great example.)
But the biggest problem for Pascal is the lack of a strong killer app. Back in the day, libraries like the VCL made Delphi amazingly productive for desktop apps. But VCL/LCL doesn't hold up as well these days, when desktop apps are less important and the important features of GUIs have shifted a lot. That leaves Delphi and Object Pascal as a sort of also-ran: it's not that Go is especially good; in fact I'd argue its claim to fame and namesake (the concurrency model) just wound up being kind of... bad. But now that it's here and popular, there's little reason for e.g. Go developers to switch to Object Pascal, a less supported language with less of a job market, less library support, etc.
And that really is a shame, because it isn't really a reflection of Object Pascal being unfit for modern software development.
The rule for mutexes is, never take more than one. As long as you only ever take one, life is pretty good.
When all you had was mutexes as your primitive, though, that became a problem. One is not enough. You can't build a big program on mutexes, and taking only one at a time.
But as you add other concurrency primitives to take the load off the lowly mutex, it returns to viability. I use a lot of mutexes in my Go code, and I can, precisely because when I need to select from three channels in some complicated multi-way, multi-goroutine choice, I have channels for that case. The other concurrency mechanisms take the hard cases, leaving the easy cases, where mutexes are fine once again.
The story of software engineering in the 1990s was a story of overreactions and misdiagnoses. This was one of them. Mutexes weren't the problem; misuse of them was. Using them as the only primitive was. You really, really don't want to take more than one at a time. That goes so poorly that I believe it nearly explains the entire fear of multithreading picked up from that era. (The remainder comes from trying to multithread in a memory-unsafe language, which is also a pretty big mistake.) Multithreading isn't trivial, but it isn't that hard... but there are some mistakes that will fundamentally destroy your sanity, and trying to build a program around multiple mutexes being held is one of them.
(To forestall corrections: the technical rule is to always take mutexes in the same order. I consider experience to have proved that this doesn't scale, plus, honestly, common sense shows it isn't practical. So I collapse it to a rule: never have more than one held at a time. As soon as you see you need more than one, use a different mechanism. Do whatever it takes to your program to achieve that; whatever shortcut you have in mind that you think will be good enough, you're wrong. Refactor correctly.)
Is there any shortcoming you can't apply that to? Don't malloc unless you free. If you cast in your program, make sure to cast to the correct type.
> The story of software engineering in the 1990s was a story of overreactions and misdiagnoses. This was one of them.
The problem of multiple mutexes was diagnosed well before the 90s. "Dining philosophers" was formulated in 1965.
https://www.adit.io/posts/2013-05-11-The-Dining-Philosophers-Problem-With-Ron-Swanson.html
https://www.adit.io/posts/2013-05-15-Locks,-Actors,-And-STM-In-Pictures.html

Rust doesn't solve the problem of multiple mutexes being tricky, but it does at least solve most of the other problems with sharing memory. To gain a little more assurance with Go, I do sometimes use the checklocks analyzer from gVisor, which gets you some of the way.
Ideally the actual rule is, never take a mutex while holding another mutex. You can take multiple mutexes simultaneously if the API supports it. (The problem is the API usually doesn't support it unless you implement the mutexes yourself using lower-level primitives, but that requires not actually using mutexes as your primitive.)
( > the technical rule is always take mutexes in the same order.
As you note, this doesn't actually work in practice, since you've given yourself the opportunity to get the order wrong every single time you do it.)
These days I develop servers on the JVM. We almost never think about mutexes or related things, libraries take care of that. I use Scala, and our entire data model is immutable, eliminating most race conditions. I think I had to declare something as volatile once or twice.
This is a recurring theme, not one isolated to the 90s.
I am 60, have used many languages, and used to love C and C++. I now consider C and C++ archaic, and Java borderline. I thought Java was cool 10-20 years ago, but I've moved on to Scala.
Forgotten to the point that people thought Visual Basic was a good idea.
Visual Basic 5 was kinda awesome.
It may very well be that if I had looked at both again after, say, 5 years of programming I might have different views, but even now I still look back fondly on VB; it got shit done, especially if you needed a quick UI without a lot of business logic.
I'm an ex-Delphi developer myself, and actually Go and Pascal are more closely related than you might think at first glance: Go code looks mostly like a C-family language, but the declaration syntax ("a int" instead of "int a") and the "package" concept, which helps achieve fast compilation times, are borrowed from Pascal. And both have a ":=" operator, although in Go it declares and assigns variable(s) with type inference, while in Pascal it's the general assignment operator.
The article basically compares their CSV/JSON serialising library to Go's standard CSV/JSON libraries. Looking at the Go code, it's pretty clear why it has memory issues: it reads all the lines into a single object (well, `[][]string`) immediately, rather than reading line by line (which would take advantage of the stream).
I am not sure how this is remarkable and impressive for Pascal. They talk about how you don't need to use the `try..finally..Free` routine all the time, but that's only if the object in question is an interface. Interfaces are handled by a reference counter in Object Pascal, so you need to know whether you're operating on objects vs interfaces, because they behave very differently. Pascal is full of these quirks.
In 1980 I was a freshman at UCSC, and the professors did not like C. So most classes used UCSD Pascal. While it apparently pioneered some cool ideas, it was not at all ready for industry use. The free function was just a suggestion, it didn't deallocate anything. Arrays were fixed size, and an array of size 80 was a different type than size 255 (and 255 was the maximum).
I remember the compiler class where we built a compiler-compiler using Pascal. It was pretty cool that the professor came up with a design that worked, but also quite dumb as we had to pass around a bunch of 255 char arrays. And also insane that we couldn't use the industrial strength tools like C and yacc available on the VAX / UNIX computers...
But what about Modula-2? Well, one professor would torture the class by making them use various not-C languages. One year it was PLZ (a PL/I-based language created by Zilog). When I took the class, it was Modula-2, using a compiler developed at CMU, I think. It also implemented free() as a suggestion that did nothing, and had other warts. I was not impressed...
I realize that it is unfair complaining about shitty academic implementations, but that's what I lived through.
The interesting thing to me is that San Diego hosts the supercomputer centre, and so there was a sense that the engineers there really lived, and live, in Fortran.
(I was in the UK system at the same time as you, and my uni had Wirth on sabbatical for a year, during the Ada/Modula specification days. We all learned Pascal on a DEC-10, unless you chose the other door and went LISP. I regret not going in the LISP door now, but hindsight is like that)
Now I am a Scala programmer, often doing pure-FP with Cats...
I also remember having to use some other (standard?) Pascal at university and it was much more limited and had annoying strict limitations like you describe. It seemed far less useful for anything practical. If that was my only experience with Pascal I would probably not have very fond memories.
What a terrible habit we have of speaking about our tools like they’re in competition with each other!
I don’t think I’ll ever meet a carpenter who talks about their hammer or even their manual crank drill being “still in the race”.
Tools have contexts where they might be used. Sometimes one tool will supersede another for all the day’s tasks, but tomorrow’s tasks will be different in unknown ways and whatever distinguishes one tool from another may be just the right thing there.
In programming languages, that might look like somebody setting aside a paradigm for a while because projects and architectures went a certain way, but then reviving that paradigm again when they go some other way.
Pascal has some cool stuff to it. We should be curious about that stuff and keep it in mind as new contexts emerge; but it’s never been in a race and we really don’t do ourselves much good in talking about it that way.
Standing up an established language in one of these runtimes is an upper division college level project. If you strongly felt that Algol was the clearest or most inspiring way to express your project, it’s not nearly so out of reach as it was a few decades ago.
That’s exactly why we’ve had this Cambrian explosion of new and revived languages lately.
Recently, I came back to a pet project: genetic algorithms. I wrote a library for it with polymorphism, generics, and some other (actually not so complicated) stuff in FPC/Lazarus, and I had to notice that my productivity suffered quite significantly compared to other languages like Python and F#. The thing is, at first glance everything is fine, but going into the details, many small issues turn out to be big blockers.
For example, FPC introduced the constref modifier for parameters. But if you declare a parameter as const instead, the compiler still gives the green light; at runtime, though, the result differs in an inexplicable manner. Then there is a very subtle difference between a local procedure/function and a global one used as a comparer: the program compiled just fine, without any hint or warning, but the result was inexplicably wrong and caused a great deal of debugging effort. That was the case with a generic object list and sorting.

Then there is obviously the problem with documentation and the excessive use of overloading and type aliases in many libraries. For example, TFixedPoint and TPoint in the Graphics32 library are totally different, but unfortunately assignment-compatible. Without good documentation, one can mistakenly pass the parameters of one function to the other, and the compiler cannot detect it, ultimately defeating the purpose of a strong static typing system. Not to mention the (not so small) quality issues with the tooling, like internal debugger crashes or (sometimes) missing declaration information inside the editor.
All in all, I feel the Delphi/FP language is getting old and freight with many technical debts. Trying to introduce new concepts while keeping backward compatibility can make a programming language/system so bloated and hulking that quality can hardly be maintained. It still serves its purpose, but it needs, IMO, an urgent revamp.
"Freight" is a noun, not a verb. I can't guess the word you meant.
"Freighted"? (Weighed down.) "Fraught"? (Troubled by.)
The sad thing is that Pascal continued to evolve, but TP codified and fossilised it and that seems to be becoming a problem now.
Pascal evolved into Modula, which fairly soon became Modula-2 which is still around and enjoyed a moment in the sun.
(Modula-3 was someone else.)
Modula-2 evolved into Oberon, which is also still around.
Oberon evolved into Oberon-2, then was rebooted as Oberon-07, which also led on to Active Oberon and Zonnon.
Oberon+ is an attempt to re-unify them.
There’s a lot of assumptions and bias in here.
Great time indeed!
And what about... umm... Modula-2/3 or Oberon? They didn't gain as much industry traction as Pascal did, eh?
Anders Hejlsberg was the author of the best Pascal implementation, greatly enhancing Pascal until he was seduced by Microsoft and helped start the evil that is .NET and C#.
Delphi was great until Borland decided to abandon most of their user base and pursue the corporate market.
Lazarus/Free Pascal is really good, except for the abysmal documentation. The very lumpy approach to creating help actively prevents incremental improvements; there's no way to just fix one page of the help.
Agree; that was the first thing I changed in https://oberon-lang.github.io/; besides the few academic oddities, original Oberon is a much better language than Pascal or Modula.
> He became obsessed with purity, instead of usability
There was definitely an academic bubble; for example, the claim that Oberon is a systems language that can be fully specified in 16 pages is demonstrably false, especially since there is a lot of code in the Oberon system that can only be implemented at all by means of (partially undocumented) backdoors in the language; unfortunately, these backdoors bypass the compiler's type checking.
> Lazarus/Free pascal is really good, except for the abysmal documentation
Unfortunately, the language is a huge, partly unspecified patchwork, where apparently all sorts of current fashionable constructs have been built in, partly even redundantly. The resulting complexity is hardly manageable with the present development approach.
Some historical correction needed.
Many of the improvements in Turbo Pascal 4.0 onwards came from UCSD Pascal and Apple's Object Pascal, initially, and then followed up on interoperability with C++ for Borland products.
He wasn't seduced by Microsoft so much as by former Borland employees working at Microsoft who wanted to refer him (those referral bonuses), which he continuously refused until he himself got pissed off with Borland's management.
"Anders Hejlsberg: A craftsman of computer language" interview
https://behindthetech.libsynpro.com/001-anders-hejlsberg-a-c...
Official list of supported platforms from freepascal.org: "Intel x86 (16 and 32 bit), AMD64/x86-64, PowerPC, PowerPC64, SPARC, SPARC64, ARM, AArch64, MIPS, Motorola 68k, AVR, and the JVM. Supported operating systems include Windows (16/32/64 bit, CE, and native NT), Linux, Mac OS X/iOS/iPhoneSimulator/Darwin, FreeBSD and other BSD flavors, DOS (16 bit, or 32 bit DPMI), OS/2, AIX, Android, Haiku, Nintendo GBA/DS/Wii, AmigaOS, MorphOS, AROS, Atari TOS, and various embedded platforms. Additionally, support for RISC-V (32/64), Xtensa, and Z80 architectures, and for the LLVM compiler infrastructure is available in the development version. Additionally, the Free Pascal team maintains a transpiler for pascal to Javascript called pas2js."
Would you mind expanding on why you feel this way? What Pascal do you use? Do you use Delphi or FPC/Lazarus? What don’t you like about the language (or your particular vendor implementation)?
C/AL is a product of Microsoft and is only used in their ERP software Navision. It's a horrible experience to work with, as you don't have a lot of modern language features, supposedly for the sake of readability.
At some point in the article, the author writes that Rust code isn't readable, for example. I'd argue code shouldn't have to be readable by non-programmers, and especially not if the language sacrifices features like creating objects or dynamic arrays...
But as I wrote, I don't have actual experience with Pascal so maybe it's actually better.
How about Ada or Oberon, both much better than Pascal or Modula-2?
Wirth made a mistake by fragmenting his language development over similar, but incompatible languages under different names.
When we look at C, the story is different. ANSI C was carefully designed to be backward compatible with K&R C. C99 didn't break too much in C90: test cases to demonstrate incompatibility have to be contrived. The name of the language didn't change.
Simply not changing the name is a powerful social tactic. People overlook differences when the name has not changed. (Look at how Lisp outsiders think that Lisp is all the same.)
Modern Fortran is very different from Fortran 66 or 77. Because the name is the same, the "Frankenfortran" is accepted in the same circles (e.g. scientific computing). Had the name changed, that would be unlikely.
I can't escape the suspicion that Wirth should have continued to use the Pascal name for that entire succession of languages.
A bit of Oberon survives in Go, which is probably the only reason I somehow like the language, despite their design decisions, which I must admit are still less draconian than Oberon-07.
Argh, I am too trigger-happy.
But iirc, HN is Lisp?