But these days folks are mostly used to C-style syntax. And I'm not even arguing that it is a better language than C or others. But the whole industry has by and large come to believe that anything newly 'invented' is good and anything that's been around a while is passé. Ironically, at the same time, the core technologies we use are based on decades-old tech like Unix, relational databases, TCP/IP, etc. And many others, like Lisp and Smalltalk, fell by the wayside at least partly due to performance issues that Moore's law made irrelevant long ago.
Oh humans... :)
Btw, Logo is another one that's underappreciated. Seymour Papert was brilliant in making programming more visual and intuitive for kids. And I didn't know until recently that it's actually a Lisp-based language with a lot of power. Who knew?
In some parallel universe, I'd love to see folks like those, along with many others from that era, as the ones we heap recognition on instead of our worship of current tech billionaires. Those guys generally understood the hardware, software, and core theory. Given the mess that is computing and the internet, it's a shame that we'll be losing them over the next few decades.
There are numerous languages today, including Haskell and OCaml, that are far more removed from the Algol lineage than these two. Heck, the differences between Rust and C are probably more pronounced than those between C and Pascal.
C, on the other hand, has needlessly complicated syntax; a function definition is hard to detect, and a pointer to a function is hard to interpret, because it's literally convoluted: https://c-faq.com/decl/spiral.anderson.html
Sadly, this is a general stylistic difference: where Pascal tries to go for clarity, C makes do with cleverness, which is more error-prone.
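For contrast, here's a toy sketch of what the same shape of declaration looks like in a language that reads left to right. The C declaration from the FAQ link above is `void (*signal(int, void (*fp)(int)))(int);`. This is not the real os/signal API, just an invented transliteration:

```go
package main

import "fmt"

// handler names "a function taking an int", which in the C version has
// to be spelled (*fp)(int) inline, twice.
type handler func(int)

// signal reads left to right: a function taking an int and a handler,
// returning a handler. Toy body: it just hands the handler back,
// standing in for "return the previous handler".
func signal(sig int, fp handler) handler {
	return fp
}

func main() {
	h := signal(2, func(code int) { fmt.Println("got", code) })
	h(2)
}
```

The information content is identical; only the notation differs, which is rather the point of the clarity-vs-cleverness distinction below.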
C is almost LR(1), if we allow prior declarations to decide how some tokens are classified, like whether an identifier is a variable or type name.
Declarations like
void (*signal(int, void (*fp)(int)))(int);
are LR(1). LR(1) sentences are harder to read than LL(1) ones because you have to keep track of a long prefix of the input, looking for right reductions (if you follow certain LR algorithms). LR parsing algorithms use a stack which essentially provides unlimited lookahead, in comparison to LL(1). Both LL(1) and LR(1) have one symbol of lookahead, but qualitatively it's entirely different, because the lookahead in LR happens after an indefinitely long prefix of the sentence which has not been fully analyzed and has been shunted onto a stack, to be processed later. Many symbols can be pushed onto the stack before a decision is made to recognize a rule and reduce by it. Those pushed symbols represent a prefix of the input that is not yet reduced, while the reduction happens to the right of them. So it is backwards in a sense; following what is going on in the grammar is a bit like understanding a stack language like Forth or PostScript.
An LL(1) grammar allows sentences to be parsed in a left-to-right scan without pushing anything onto a stack to reduce later. Everything is decidable by looking at the next symbol. Under LL(1), by looking at one symbol, you know what you are parsing; each subsequent symbol narrows it down to something more specific. Importantly, the syntax of symbols that have been processed already (material to the left) is settled; their syntax is not left undecided while we recognize some fragment on the right.
Under LR(1) it's possible for a long sequence of symbols to belong to entirely unrelated phrase structures, only to be decided when something finally appears on the right. An LALR(1) parser generator outputs a machine in which the states end up shared by unrelated rules. The state transitions then effectively track multiple parallel contexts.
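The "one symbol settles everything" property can be made concrete with a recursive-descent sketch. The grammar and names below are invented for illustration: `stmt := "if" NUMBER | "print" NUMBER`.

```go
package main

import "fmt"

// parseStmt is an LL(1)-style parse step: the single lookahead token
// toks[0] decides which rule we are in before anything else is
// consumed. Nothing is shunted onto a stack for later reduction, the
// way an LR parser would defer the decision.
func parseStmt(toks []string) (string, error) {
	if len(toks) < 2 {
		return "", fmt.Errorf("unexpected end of input")
	}
	switch toks[0] { // one symbol of lookahead settles the phrase structure
	case "if":
		return "if-stmt(" + toks[1] + ")", nil
	case "print":
		return "print-stmt(" + toks[1] + ")", nil
	default:
		return "", fmt.Errorf("no rule starts with %q", toks[0])
	}
}

func main() {
	s, _ := parseStmt([]string{"if", "42"})
	fmt.Println(s) // if-stmt(42)
}
```

In an LR grammar, by contrast, the tokens after `if` could belong to several unrelated productions until a later token disambiguates, which is exactly what makes declarations like the `signal` one above hard to read.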
if X > 0 then
Y := 0;
end if;
Curly braces are shorter, but a close curly brace will match any open curly brace. Such is the nature of trade-offs.

Overall we're still stuck in a bit of a near-monoculture of JS(/TS) and Python, but it's a far cry from back in the day, when there was very little openness beyond the sole blessed corporate stack (typically Java or C#/CLR). I think we can only handle so many mainstream languages, but I do love all the experimentation and openness going on.
The problem with Pascal syntax is that it prevents the adoption of certain constructs, which end up just not nice. A few examples:
- lambda expressions: `begin` ... `end`, say goodbye to nice one-liners;
- compound assignment: FPC has `+=` and `-=` but obviously not `mod=`, `and=`, etc;
On top of that there are other things like
- shortened boolean evaluation: e.g. `if someInt` as shorthand for `if someInt != 0` is not possible, because `and` is a two-headed creature (both boolean and bitwise);
- locals are still not default initialized
I used to like Pascal (actually Delphi, then ObjFPC) a lot, but nowadays I think the only good parts are certain semantics, e.g. no fallthrough in the `case`...`of` construct, and manual memory management BUT ref-counted arrays.
I would lean more toward a C-like syntax with certain semantics coming from the Pascal world.
So, much of what you are complaining about is largely pointless syntactic-sugar issues, like people complaining about the difficulty of typing "begin" vs "{" when any modern editor can autocomplete; and never mind that the difficult parts of programming are rarely limited by how fast one can type 5 characters vs 1. I might even go so far as to say that slowing down a bit probably increases code quality.
(PS: I've programmed professionally in pretty much every mainstream language and quite a number that aren't mainstream. IMHO Object Pascal strikes a far better balance of performant code, ease of development and maintenance, and developer safety than most of the languages in modern use, maybe all of them. It's frankly a shame that more places don't take it more seriously and would rather invent yet another poor half-baked language that takes another few thousand man-years of effort for the compiler writers and the users to overcome its problems as they are discovered.)
If there's one thing I would eliminate from programming, despite their benefits, it's one-liner lambda expressions. They have turned clean, readable Python code into muddy statements I need to pause and compile in my head to understand.
I am not a fan.
Mostly, but I'm told the new Austral[1] language has syntax very similar to Pascal's.
I think this is partially accepted to keep wages down. New languages put fresh developers on a level playing field with more senior developers: both have, say, 2 years of experience in said new language. Fresh developers are cheaper and therefore push down wages.
Also, at better places the language itself is a tool - someone will be a senior in any other language as well.
Eh, not really, at least not in the last 10+ years. I'm sure some obscure hotness does something neat, but it's mostly inconsequential for the vast, vast majority of shops.
>someone will be a senior in any other language as well.
While I agree with you, that often isn't the opinion of the people hiring. If someone is looking for 2 years of Java, in most places 10 years of C# isn't what they are willing to hire.
Go has plenty of weaknesses versus Pascal, but two commonalities of the languages are lightning fast compile times and a pretty good experience for modelling data structures. Pascal is undoubtedly lower level and does not guarantee memory safety, whereas Go does but its GC is often less efficient and more memory-heavy than manual allocation.
Blow for blow, though, I'd say the largest weak point for Pascal is a somewhat archaic syntax and for Go, honestly, the concurrency model. (Channels are nice, until they are not. I feel as though it's easier, though not necessarily easy, to write correct programs using mutexes than Go channels in many cases. This is weird, because nothing has changed about the old shared memory with locks model since it was the source of so many problems. Yet, programmers, computers and toolchains have changed a lot. Rust with locks is a great example.)
But the biggest problem for Pascal is the lack of a strong killer app. Back in the day, libraries like VCL made Delphi amazingly productive for desktop apps. But VCL/LCL doesn't really hold up as well these days, where desktop apps are less important and the important features of GUIs have shifted a lot. That leaves Delphi and Object Pascal as a sort of also-ran: It's not that Go is especially good, in fact I'd argue its claim to fame and namesake (the concurrency model) just wound up being kind of ... bad. But, now that it's here and popular, there's little reason for e.g. Go developers to switch to Object Pascal, a less supported language with less of a job market, less library support, etc.
And that really is a shame, because it isn't really a reflection of Object Pascal being unfit for modern software development.
The rule for mutexes is, never take more than one. As long as you only ever take one, life is pretty good.
When all you had was the mutex as your primitive, though, that became a problem. One is not enough. You can't build a big program on mutexes alone while taking only one at a time.
But as you add other concurrency primitives to take the load off of the lowly mutex, the mutex returns to viability. I use a lot of mutexes in my Go code, and I can, precisely because when I have a case where I need to select from three channels in some complicated multi-way, multi-goroutine choice, I have the channels for that case. The other concurrency mechanisms take the hard cases, leaving the easy cases where mutexes are fine once again.
The story of software engineering in the 1990s was a story of overreactions and misdiagnoses. This was one of them. Mutexes weren't the problem; misuse of them was. Using them as the only primitive was. You really, really don't want to take more than one at a time. That goes so poorly that I believe it nearly explains the entire fear of multithreading picked up from that era. (The remainder comes from trying to multithread in a memory-unsafe language, which is also a pretty big mistake.) Multithreading isn't trivial, but it isn't that hard... but there are some mistakes that fundamentally will destroy your sanity and trying to build a program around multiple mutexes being taken is one of them.
(To forestall corrections, the technical rule is always take mutexes in the same order. I consider experience to have proved that doesn't scale, plus, honestly, just common sense shows that it isn't practical. So I collapse that to a rule: Never have more than one held at a time. As soon as you see you need more than one, use a different mechanism. Do whatever it takes to your program to achieve that; whatever shortcut you have in mind that you think will be good enough, you're wrong. Refactor correctly.)
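The discipline above can be sketched in a few lines of Go. The type and names here are invented for illustration: one shared structure, exactly one lock, and the lock scope never calls out to anything that might take another lock.

```go
package main

import (
	"fmt"
	"sync"
)

// Counter follows the "never hold more than one mutex" rule: it owns
// exactly one lock, and no method does anything while holding it except
// touch the data that lock guards.
type Counter struct {
	mu sync.Mutex
	n  map[string]int
}

func (c *Counter) Inc(key string) {
	c.mu.Lock()
	defer c.mu.Unlock() // lock scope is the whole method, nothing nested
	c.n[key]++
}

func (c *Counter) Get(key string) int {
	c.mu.Lock()
	defer c.mu.Unlock()
	return c.n[key]
}

func main() {
	c := &Counter{n: make(map[string]int)}
	var wg sync.WaitGroup
	for i := 0; i < 100; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			c.Inc("hits")
		}()
	}
	wg.Wait()
	fmt.Println(c.Get("hits")) // 100
}
```

The moment two such structures need to be updated together, the rule says to reach for a channel or redesign, rather than lock both.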
Is there any shortcoming you can't apply that to? Don't malloc unless you free. If you cast in your program, make sure to cast to the correct type.
> The story of software engineering in the 1990s was a story of overreactions and misdiagnoses. This was one of them.
The problem of multiple mutexes was diagnosed well before the 90s. "Dining philosophers" was formulated in 1965.
https://www.adit.io/posts/2013-05-11-The-Dining-Philosophers-Problem-With-Ron-Swanson.html
https://www.adit.io/posts/2013-05-15-Locks,-Actors,-And-STM-In-Pictures.html

Rust doesn't solve the problem of multiple mutexes being tricky, but it does at least solve most of the other problems with sharing memory. To gain a little more assurance with Go, I do sometimes use the checklocks analyzer from gVisor, which gets you some of the way.
For the purposes of this discussion [1], Erlang is just Go except you don't have mutexes as an option at all, so anything you want locked has to be in a separate Erlang process (analog of goroutine). So if all you want is a shared dictionary to be used as a cache or something, it has to have its own process/goroutine, you don't get an option of just locking access.
Since that's how it works in Erlang, it has a bit of syntax grease around it, but not enough to make it just right; you've still got to do things like handle communication errors because it could be on a different node in a cluster whereas in Go it's just a local shared resource.
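The Erlang shape described here translates to Go roughly as follows. All types and names below are invented for illustration: the map is owned by a single goroutine, and every access is a message over a channel, the analogue of sending to an Erlang process.

```go
package main

import "fmt"

// getReq is a "message" asking the cache goroutine for a value; the
// answer comes back on the embedded reply channel.
type getReq struct {
	key   string
	reply chan string
}

// startCache spawns the owning goroutine and returns the two channels
// that stand in for its mailbox. No lock anywhere: only the inner
// goroutine ever touches the map.
func startCache() (chan<- [2]string, chan<- getReq) {
	set := make(chan [2]string)
	get := make(chan getReq)
	go func() {
		m := make(map[string]string) // owned exclusively by this goroutine
		for {
			select {
			case kv := <-set:
				m[kv[0]] = kv[1]
			case r := <-get:
				r.reply <- m[r.key]
			}
		}
	}()
	return set, get
}

func main() {
	set, get := startCache()
	set <- [2]string{"lang", "Erlang"}
	reply := make(chan string)
	get <- getReq{key: "lang", reply: reply}
	fmt.Println(<-reply) // Erlang
}
```

In Go this ceremony is optional, since you could just guard the map with one mutex; in Erlang the process-per-resource shape is the only option, which is the point being made above.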
I think the main problem is that as useful as actors are as a concept, they're not a great foundational abstraction, which is to say, the base that everything is built on and you can't go below. It works, but then it means you're paying the full actor price for everything. But you don't always want the full actor price for everything, and you don't need to pay it because in practice "lack of actor isolation" is rarely the root cause for any particular problem, because that's too big a thing to be the root cause.
[1]: If that's too glib for you: https://news.ycombinator.com/item?id=34564228
Ideally the actual rule is, never take a mutex while holding another mutex. You can take multiple mutexes simultaneously if the API supports it. (The problem is the API usually doesn't support it unless you implement the mutexes yourself using lower-level primitives, but that requires not actually using mutexes as your primitive.)
( > the technical rule is always take mutexes in the same order.
As you note, this doesn't actually work in practice, since you've given yourself the opportunity to get the order wrong every single time you do it.)
These days I develop servers on the JVM. We almost never think about mutexes or related things, libraries take care of that. I use Scala, and our entire data model is immutable, eliminating most race conditions. I think I had to declare something as volatile once or twice.
That's not a problem with mutexes but with resource management in some languages. In Rust mutexes use RAII and unlock automatically - you cannot accidentally forget to unlock.
That's a language issue though, rather than a mutex one. It's reasonably straightforward to fix that, as some languages (like Nim) do.
This is a recurring theme, not one isolated to the 90s.
I am 60, have used many languages, and used to love C and C++. I consider C and C++ archaic and Java is border-line. I thought Java was cool 10-20 years ago, but I've moved on to Scala.
Scala feels more modern than C to me too, when talking about language concepts/features. Regarding syntax they are in the same basket.
One thing that is better about the syntax that I really like is the removal of BEGIN and END everywhere except around code in MODULEs and PROCEDUREs. IF/THEN/ELSE/END, FOR/DO/END, REPEAT/UNTIL, WHILE/DO/END, CASE/ELSE/END, no longer have BEGIN/END, even if there are multiple statements in them. This makes the code less verbose than C and C++ and equally or more compact vertically than them, depending on whether you put your braces on separate lines.
Pascal syntax is non-mainstream nowadays, as are Lisps, MLs, and lots of others, but none of them fit the word "archaic".
Forgotten to the point that people thought Visual Basic was a good idea.
The real dumb move (though TBH I can only say that in hindsight) was that they made a free Linux version with Kylix whose license required any programs released with it to be under the GPL, but they didn't release Kylix itself as GPL.
This was in the very early 2000s, when the GPL wasn't the boogeyman among developers that it seems to be nowadays with all the permissive licenses, desktop software was still something people wanted, commercial software wouldn't touch the GPL, and yet a lot of new programmers were moving to Linux. Having Delphi/Kylix fully GPL with a CLA (like some other projects) would have meant that a) Kylix would become part of various Linux distributions, especially during a time when distributions were the main source of tools for Linux users, b) anyone working on FLOSS would both use and improve the tool, c) mindshare among programmers would improve, as anyone would be able to try it out for free (as long as they used Linux, but many programmers, especially younger programmers at the time, didn't mind that), and d) companies, enterprises, etc. that wanted to sell shareware or just didn't want the rules the GPL imposed would still need to buy the full program.
Sadly this seemed to be yet another case of when Borland started losing touch with programmers in the 90s.
OTOH, as many tool vendors will tell you, it's foolish to dismiss a tool simply based on price. If you're paying your developers $100 an hour, even tiny improvements in productivity can easily pay for a $1000 or more tool. The compilation speed alone vs C/etc. is probably worth the 5+ minutes a day in savings.
Visual Basic 5 was kinda awesome.
It may very well be that if I had looked at both again after, say, 5 years of programming I might have different views, but even now I still look back fondly on VB, it got shit done, esp if you needed a quick UI with not a lot of business logic.
I'm an ex-Delphi developer myself, and actually Go and Pascal are more closely related than you might think at first glance: Go code looks mostly like a C-family language, but the declaration syntax ("a int" instead of "int a") and the "package" concept which helps achieve fast compilation times are borrowed from Pascal. And both have a ":=" operator, although in Go it declares and assigns variable(s) with type inference, while in Pascal it's any assignment.
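Both inheritances are visible in a few lines. This snippet is just an illustration of the syntax points made above, with invented names:

```go
package main

import "fmt"

// "w, h int" puts the name before the type, reading like Pascal's
// "w, h: Integer" rather than C's "int w, int h".
func area(w, h int) int {
	return w * h
}

func main() {
	var a int = 3 // explicit declaration: name first, then type, Pascal-style
	b := 4        // ":=" in Go declares AND assigns, with the type inferred;
	//               in Pascal, ":=" is plain assignment to a declared variable
	fmt.Println(area(a, b)) // 12
}
```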
The article basically compares their CSV/JSON serialising library to Go's standard CSV/JSON libraries. Looking at the Go code, it's pretty clear why it has memory issues: it reads all the lines into a single object (well, `[][]string`) immediately, rather than reading line by line (which would take advantage of the stream).
I am not sure how this is remarkable and impressive for Pascal. They talk about how you don't need to use the `try..finally..Free` routine all the time, but that's only if the object in question is an interface. Interfaces are somehow handled by a reference counter in Object Pascal, so you need to know whether you're operating on objects vs interfaces, because they act very differently. Pascal is full of these quirks.
In 1980 I was a freshman at UCSC, and the professors did not like C. So most classes used UCSD Pascal. While it apparently pioneered some cool ideas, it was not at all ready for industry use. The free function was just a suggestion, it didn't deallocate anything. Arrays were fixed size, and an array of size 80 was a different type than size 255 (and 255 was the maximum).
I remember the compiler class where we built a compiler-compiler using Pascal. It was pretty cool that the professor came up with a design that worked, but also quite dumb as we had to pass around a bunch of 255 char arrays. And also insane that we couldn't use the industrial strength tools like C and yacc available on the VAX / UNIX computers...
But what about Modula-2? Well, one professor would torture the class, making them use various not-C languages. One year it was PLZ (a PL/I-based language created by Zilog Corporation). When I took the class, it was Modula-2, using a compiler developed at CMU, I think. It also implemented free() as a suggestion that did nothing, and had other warts. I was not impressed...
I realize that it is unfair complaining about shitty academic implementations, but that's what I lived through.
The interesting thing to me is that San Diego hosts the supercomputer center, and so there was a sense that the engineers there really live in Fortran; they did, and they do.
(I was in the UK system at the same time as you, and my uni had Wirth on sabbatical for a year, during the Ada/Modula specification days. We all learned Pascal on a DEC-10, unless you chose the other door and went LISP. I regret not going through the LISP door now, but hindsight is like that.)
Now I am a Scala programmer, often doing pure-FP with Cats...
Ada and Oberon are even better.
I also remember having to use some other (standard?) Pascal at university and it was much more limited and had annoying strict limitations like you describe. It seemed far less useful for anything practical. If that was my only experience with Pascal I would probably not have very fond memories.
What a terrible habit we have of speaking about our tools like they’re in competition with each other!
I don’t think I’ll ever meet a carpenter who talks about their hammer or even their manual crank drill being “still in the race”.
Tools have contexts where they might be used. Sometimes one tool will supersede another for all the day’s tasks, but tomorrow’s tasks will be different in unknown ways and whatever distinguishes one tool from another may be just the right thing there.
In programming languages, that might look like somebody setting aside a paradigm for a while because projects and architectures went a certain way, but then reviving that paradigm again when they go some other way.
Pascal has some cool stuff to it. We should be curious about that stuff and keep it in mind as new contexts emerge; but it’s never been in a race and we really don’t do ourselves much good in talking about it that way.
Standing up an established language in one of these runtimes is an upper division college level project. If you strongly felt that Algol was the clearest or most inspiring way to express your project, it’s not nearly so out of reach as it was a few decades ago.
That’s exactly why we’ve had this Cambrian explosion of new and revived languages lately.
I keep trying to figure out how to use something else, but it is really hard to break in. Those who have tried before me gave up because of the friction.
Recently, I came back to a pet project: genetic algorithms. I wrote a library for it with polymorphism, generics and some other (actually not so complicated) stuff in FPC/Lazarus, and then I had to notice that my productivity suffered quite significantly compared to other languages like Python and F#. The thing is, at first glance everything is fine, but going into the details, many small issues turn out to be big blockers.
For example, FPC introduced the constref modifier for parameters. But if you declare a parameter as const instead, the compiler still gives the green light; when running, though, the result differs in an inexplicable manner. Then there is a very subtle difference between a local procedure/function and a global one used as a comparer: the program compiled just fine without any hint or warning, but the result was inexplicably wrong and caused a great deal of debugging effort. That was the case with a generic object list and sorting. Then there is obviously the problem with documentation and the excessive use of overloading and type aliases in many libraries. For example, TFixedPoint and TPoint in the Graphics32 library are totally different, but unfortunately assignment-compatible. Thus, without good documentation, one can mistakenly pass the parameters for one function to the other, and the compiler cannot detect it, ultimately defeating the purpose of a strong static typing system. Not to mention the (not so small) quality issues with the tooling, like the internal debugger crashing or declaration information (sometimes) missing inside the editor.
All in all, I feel the Delphi/FP language is getting old and freight with many technical debts. Trying to introduce new concepts while keeping backward compatibility can make a programming language/system so bloated and hulking that maintaining quality can hardly be achieved. It still serves the purpose, but it requires IMO an urgent revamp.
"Freight" is a noun, not a verb. I can't guess the word you meant.
"Freighted"? (Weighed down.) "Fraught"? (Troubled by.)
The sad thing is that Pascal continued to evolve, but TP codified and fossilised it and that seems to be becoming a problem now.
Pascal evolved into Modula, which fairly soon became Modula-2 which is still around and enjoyed a moment in the sun.
(Modula-3 was someone else.)
Modula-2 evolved into Oberon, which is also still around.
Oberon evolved into Oberon-2, then was rebooted with Oberon-07, but also led on to Active Oberon and Zonnon.
Oberon+ is an attempt to re-unify them.
There’s a lot of assumptions and bias in here.
Great time indeed!
But TBH, nowadays' compilers do so much more; I'll happily trade compile time for that!
And what about... umm... Modula-2/3 or Oberon? They didn't gain as much industry traction as Pascal did, eh?
Anders Hejlsberg was the author of the best Pascal implementation, greatly enhancing Pascal until he was seduced by Microsoft and helped start the evil that is .NET and C#.
Delphi was great until Borland decided to abandon most of their user base and pursue the corporate market.
Lazarus/Free pascal is really good, except for the abysmal documentation. The very lumpy approach to creating help actively prevents incremental improvements. There's no way to just fix one page of the help.
Agree; that was the first thing I changed in https://oberon-lang.github.io/; besides the few academic oddities, original Oberon is a much better language than Pascal or Modula.
> He became obsessed with purity, instead of usability
There was definitely an academic bubble; for example, the claim that Oberon is a system language and can only be specified with 16 pages is demonstrably false, especially since there is a lot of code in the Oberon system that can only be implemented at all by means of (partially undocumented) backdoors to the language; unfortunately, these backdoors bypass the compiler's type checking.
> Lazarus/Free pascal is really good, except for the abysmal documentation
Unfortunately, the language is a huge, partly unspecified patchwork, where apparently all sorts of current fashionable constructs have been built in, partly even redundantly. The resulting complexity is hardly manageable with the present development approach.
https://miasap.se/obnc/oberon-report.html
Oberon also has an interesting set of features to enable object-oriented programming without the syntactic sugar. Here is an article which describes basic abstractions in Oberon:
I am curious, what's the problem with redundancy? Having more than one way to do something doesn't seem particularly ominous to me. I'm sure you've got solid reasons.
Some historical correction needed.
Many of the improvements in Turbo Pascal 4.0 onwards came from UCSD Pascal and Apple's Object Pascal initially, and then followed up with interoperability with C++ for Borland products.
He wasn't seduced by Microsoft, rather by former Borland employees working at Microsoft wanting to refer him (those referral bonuses), which he continuously refused until he himself got pissed off with Borland's management.
"Anders Hejlsberg: A craftsman of computer language" interview
https://behindthetech.libsynpro.com/001-anders-hejlsberg-a-c...
Official list of supported platforms from freepascal.org: "Intel x86 (16 and 32 bit), AMD64/x86-64, PowerPC, PowerPC64, SPARC, SPARC64, ARM, AArch64, MIPS, Motorola 68k, AVR, and the JVM. Supported operating systems include Windows (16/32/64 bit, CE, and native NT), Linux, Mac OS X/iOS/iPhoneSimulator/Darwin, FreeBSD and other BSD flavors, DOS (16 bit, or 32 bit DPMI), OS/2, AIX, Android, Haiku, Nintendo GBA/DS/Wii, AmigaOS, MorphOS, AROS, Atari TOS, and various embedded platforms. Additionally, support for RISC-V (32/64), Xtensa, and Z80 architectures, and for the LLVM compiler infrastructure is available in the development version. Additionally, the Free Pascal team maintains a transpiler for pascal to Javascript called pas2js."
Would you mind expanding on why you feel this way? What Pascal do you use? Do you use Delphi or FPC/Lazarus? What don’t you like about the language (or your particular vendor implementation)?
C/AL is a product of Microsoft and is only used in their ERP software Navision. It's a horrible experience to work with, as you don't have a lot of modern language features, all for the sake of readability.
At some point in the article the author wrote that Rust code isn't readable for example. I'd argue code shouldn't have to be readable by non-programmers. And especially not if the language sacrifices features like creating objects or dynamic arrays...
But as I wrote, I don't have actual experience with Pascal so maybe it's actually better.
How about Ada or Oberon, both much better than Pascal or Modula-2?
Wirth made a mistake by fragmenting his language development over similar, but incompatible languages under different names.
When we look at C, the story is different. ANSI C was carefully designed to be backward compatible with K&R C. C99 didn't break too much in C90: test cases to demonstrate incompatibility have to be contrived. The name of the language didn't change.
Simply not changing the name is a powerful social tactic. People overlook differences when the name has not changed. (Look at how Lisp outsiders think that Lisp is all the same.)
Modern Fortran is very different from Fortran 66 or 77. Because the name is the same, the "Frankenfortran" is accepted in the same circles (e.g. scientific computing). Had the name changed, that would be unlikely.
I can't escape the suspicion that Wirth should have continued to use the Pascal name for that entire succession of languages.
A bit of Oberon survives in Go, which is probably the only reason I somehow like the language, despite their design decisions, which I must admit are still less draconian than Oberon-07.
Argh, I am too trigger-happy.
But iirc, HN is Lisp?