Homoiconicity anyone? Lisp is one of the oldest high-level programming languages, and it's still around.
[0] Shriram is an original member of the Racket project, so he's been working in the Lisp-like domain for at least 30 years and, specifically, he works in an offshoot of Lisp that is particularly concerned with questions of syntax. I think this establishes him as a reasonable citation for this topic.
[1] https://parentheticallyspeaking.org/articles/bicameral-not-h...
> doesn't gain anything semantically
Syntactic properties create semantic affordances. The reason "code as data" matters isn't that the parentheses look a certain way - it's that the AST is a first-class citizen of the runtime data model. Those are semantic capabilities - they determine what programs can express and compute, especially at the meta level. The syntactic uniformity is the mechanism; the ability to write programs that construct and transform other programs, using the same tools as everything else, is the payoff.
Homoiconicity doesn't make Lisp programs mean something different, but it gives you a more uniform and powerful meta-programming model.
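For the meta-programming point, here is a minimal sketch of the same idea in Lean 4 (one of the languages discussed downthread), where quotation produces ordinary `Syntax` values and a macro is just a function from syntax to syntax. The `twice` macro and the literal `21` are my own illustrative choices, not from any comment above:

```lean
-- Quotation `(...) builds a first-class `Syntax` value, so this macro
-- is an ordinary program that transforms another program.
macro "twice" e:term : term => `($e + $e)

#eval twice 21  -- expands to 21 + 21, prints 42
```

The payoff described above is exactly this: the transformation is written with the same tools as any other function.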
Good thing I didn't do that?
> Syntactic properties create semantic affordances.
I don't disagree with this. Benjamin Pierce defines type-checking in the opening pages of Types and Programming Languages as an operation over syntax, for example.
My point was that the parent comment just kind of threw out "homoiconicity" when somebody talked about writing properties about a language in that language, and those are entirely separate things. I was addressing a conflation of terms. The property that people generally refer to as "homoiconicity" is useful for things like writing macros, but it does not directly grant you access to any kind of property-checking capability. I mean, Rust's macro system is far from homoiconic (it's just an AST), but it gives you semantic capabilities. You know?
Agda, OTOH, is IMO the dependently typed language for type theorists, and does weird things, but "unproductive" is applicable only for a somewhat narrow view of productivity.
I don't consider there to be a dependently typed language for writing programs that you just want to run, but I would be delighted to be corrected on that viewpoint!
Practical Lispers would like to have a word - I've been witnessing extreme productivity on some teams.
Modern Lisp dialects (Clojure and the like) have largely done away with the library fragmentation and the "not invented here" tendency that were causing real tensions in Common Lisp.
You realize that "The Lisp Curse" isn't some paper, survey or objective reflection? It's just someone's essay back from 2011 - an opinion.
You can take it word-by-word and apply it to, say, Javascript, and it would largely feel true - JS arguably has the worst fragmentation of any ecosystem: dozens of competing frameworks, build tools, bundlers, test runners; new frameworks constantly replacing old ones; "Javascript fatigue" is a real thing, etc. But nobody talks about a "Curse of Javascript".
I learned Lisp (once) and that opened up a path to Clojure, Clojurescript, then Fennel, Janet and ClojureDart, libpython-clj; there's Jank that is about to break loose. And something opposite to fragmentation happened to me - all these incompatible runtimes became unified and controllable from the same substrate. I no longer feel like I'm having to switch between languages - the syntax and thinking stay stable. The runtime is just a deployment target.
The curse essay says: "Lisp empowers fragmentation". Actual experience says: "Lisp provides unity across fragmentation that already existed"
> You can take it word-by-word and apply to say Javascript, and it would largely feel true - JS arguably has the worst fragmentation of any ecosystem; dozens of competing frameworks, build tools, bundlers, test runners; new frameworks constantly replacing old ones; "Javascript fatigue" is a real thing, etc., but nobody talks about "Curse of Javascript"
You also need to take into account the denominator of "number of users", though. Clojure, with a tiny population, had a Cambrian explosion of libraries, and now we can't argue that those libraries are dead in one breath and that they are "done" in the next. There is huge fragmentation in the Clojure world, and with small populations, that hurts. Case in point: SQL libraries. Korma, yesql, hugsql, honeysql, and those are just the popular ones. Case in point: spec vs. schema vs. malli. Case in point: leiningen vs. boot vs. deps.
> I learned Lisp (once) and that opened up path to Clojure, Clojurescript, then Fennel, Janet and Clojure-Dart, libpython-clj, there's Jank that is about to break loose.
As we lispers like to say a lot, the syntax (or lack thereof) is the smallest of the issues. There is a lot of semantic difference between all of those (except libpython-clj, which does not belong to that list; but we could add Hy instead). That's even before starting to talk about library compatibility. So I'd contest whether having a common syntax is a major productivity gain.
There's also the deficit of code we actually use day to day that is written in Lisp.
I file it under the same heading as haskell - a language that clearly has useful ideas, but...
i talked about the lisp curse in this old paper. it's rough but explicitly mentions it
Lean and most type theory-based languages don't merely preach simplicity, they demand it. A function or type with a handful of terms or constructors might be provably inhabited/total, whereas one with two handfuls might not be provable in a reasonable amount of time, due to the exponential growth of the proof space. Factoring code optimally for provability yields the simplicity that Forth programmers are striving for.
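As a small illustration of "demanding simplicity": Lean accepts a structurally recursive definition as total with no extra proof, because each recursive call is on a strictly smaller argument. (The function below is my own toy example, not from the thread.)

```lean
-- Structural recursion on Nat: Lean proves totality automatically,
-- since the argument strictly decreases at each recursive call.
def sumTo : Nat → Nat
  | 0     => 0
  | n + 1 => (n + 1) + sumTo n

#eval sumTo 4  -- 4 + 3 + 2 + 1 + 0 = 10
```

A definition whose recursion pattern is less obviously decreasing would instead require an explicit termination argument, which is where the proof burden starts to grow.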
As the saying goes, once you've seen one Forth, then you've seen one Forth.
I've mucked around with my own Forths in the past, including one that recognises lexical type, so you could build something like a parser in Forth. I didn't take it that far. Forth is normally conceived as being built from the ground up, but if you're going to implement it in C or C++ then you can be more imaginative.
I played around with colorForth for 5 minutes on a couple of occasions, but I ran away screaming. What - just what - the hell is going on? I'm sure it all works for Charles Moore, but for mere mortals it might as well be a Klingon control panel.
I think Moore effectively gave up on programming a couple of years ago? There was some strange modification in the guts of Windows and he couldn't get his environment to work any longer. He concluded that the game was not worth the candle.
I'm just leaving this here for anyone interested, seems relevant: https://github.com/replikativ/ansatz
Ansatz is a verified programming library for Clojure built on the Calculus of Inductive Constructions (CIC) — the same type theory that powers Lean 4.
Lean 3 was the least bloated theorem prover among Lean, Coq and Agda, and Lean 4 is the most bloated of the Big Three. This is very sad.
Personally, I stopped using Lean after the last update broke unification in a strange way again.
> Lean is far off the most bloated one. Isabelle most likely takes that spot.
Among these three is the operative phrase here.
I hate to be pedantic, but we are talking about theorem provers here :)
Originally Lean was coded in C++ and shipped as a dynamically linked executable, if I remember correctly.
There were some proposals like compressing all the .olean files, but (as far as I know) none of them were implemented. Well, even if some proposals were implemented, their contribution was effectively negated anyway.
i tend to stick with agda for doing mathy programming. i kinda want lean4 to replace haskell at some point in the future as the workhorse production typed fp language.
Fun challenge. Unlike the author, I have nothing really to add.
I just wanted to say that "I did NOT write it with ..."
You could start your list alphabetically with A, A+, and A++. A is derived from APL. A+ is a newer take on A. A++ is unrelated. https://a-plus-plus-devs.github.io/aplusplus/guide/getting-s...
The most widely used products of these proof assistants are probably formally verified compilers, like CompCert, which are used in some highly regulated industries like aviation.
[0]: https://isabelle.systems/zulip-archive/stream/247541-Mirror.... and https://lean-lang.org/ (Cedar)
https://github.com/dharmatech/symbolism.lean
Lean is astonishingly expressive.
Oh, what a beautiful world it would be if this were the case!
An attempt (without looking)
JavaScript QBasic PHP Haskell C C++ Ada Algol Racket Scheme Clojure Common-Lisp GOOL Fortran Awk Postscript Forth C# F# Lua Java D Odin Rust Zig Julia Python Nim MATLAB Bash Brainfuck Arnold-C Intercal Gleam Unison Ruby Crystal Erlang Go TCL
Phew!
"There are only two kinds of languages: the ones people complain about and the ones nobody uses". - Bjarne Stroustrup
Well ... that is a trend that is driven largely by people who love types.
Not everyone shares that opinion. See ruby.
It is very hard to try to argue with people who love types. They will always focus on "types are great, every language must have them". They, in general, do not acknowledge trade-offs when it comes to type systems.
So the claim "tend to grow them" ... it is not completely wrong, but it also does not fully capture an independent want to add them. It comes ALWAYS from people who WANT types. I saw this happen "live" in ruby; I am certain this happened in python too.
> inevitably, people want to push types. even Go. C++ templates are the ultimate example. if it can be computed at compile time, at some point someone wants to, like Rust's ongoing constification.
And many people hate C++ templates. But comparing that language to e. g. ruby is already a losing argument. Languages are different. So are the trade-offs.
> dependent types can get you there. hence perfectable.
So the whole point about claiming a language is "perfectable", means to have types? I don't agree with that definition at all.
> most languages have no facility for this,
How about lisp?
> this lets you design APIs in layers and hide them behind syntax.
The language already failed hard syntax-wise. This is a problem I see in many languages - 99% of language designers don't think syntax is important. Syntax is not the most important thing in the world, but neglecting it shows a lack of understanding of why syntax ALSO matters. But you cannot really talk about that - I am 100% certain alok would disagree. How many people use a language also matters a LOT - you get much more momentum when there are tons of people using a language, as opposed to the global 3 or 4 using "lean".
Also, I have to point out that of course Ruby has types. And it does type checking. It just does it when the line of code actually runs. (i.e. runtime type errors).
So the discussion here isn't should we check types or not. It's a question of when to do it.
Do you want to know you've made a mistake when you actually make it? Or do you want to find out an unknown amount of time later - e.g., in unfortunate cases, several months later, debugging an issue in prod? Not that I would know anything about that ;)
---
My own thinking on the subject is that it should be configurable.
Rust's level of correctness, for example, is probably overkill for a game jam. (As is, arguably, using a low-level language in the first place.)
But my thinking here is that correctness should be opt out rather than opt-in. If you have a good reason to make your program wrong by default, then you should be allowed to do that. But it should be a conscious choice! And every source file, at the top of the file, should remind you that you are making that choice: #JAMMODE
And if you intend to actually ship the thing, and charge money for it, in Serious Release Mode the compiler should refuse to build anything that's still in jam mode.
My point here is that some languages make jam mode the only option you have.
Who else would add them, besides people who want them? I'm confused about what you're even claiming here. It sounds like you feel that there's a vocal minority of type enthusiasts who everyone else is just humoring by letting them bolt on their type systems.
https://reservoir.lean-lang.org/@strata-org/Strata isn't done, but its goal is to let one build whole languages that can look like whatever you like. ones for little kids maybe
syntax matters for normal ppl even more than semantics in some ways since semantics can be optimized and refined on the backend, but everyone is (de)limited by their vocabulary
Highly recommended!
Could you elaborate?
> gives link to "limited evidence of benefits"
I'm not looking for absence of evidence, as absence of evidence does not mean evidence of absence. The original post claims tradeoffs, where are they? What are we trading? Even the most dynamic-language friendly results in your linked post either claim: "look at what we did without types!" or "even though there's no types in ruby, devs think about them all the time", which I mean, come on.
The only real criticism or tradeoffs on types mentioned is a person getting stuck on a single static type check compilation error, but I can easily counter that with "undefined is not a property of null".
From the link:
> if the strongest statement you can make for your position is that there's no empirical evidence against the position, that's not much of a position.
This is how hypotheses work. Maybe the problem is that we have a hard time proving anything about programming languages?
...
> Not everyone shares that opinion. See ruby.
All programming languages that have values (i.e. all of them) have types, because you cannot have a concrete value that doesn't have a type. This includes Ruby.
The only difference is whether the language lets you annotate the source code with the expected type of each value.
This is why you observe that all languages trend towards visible typing: The types are already there and it's only a matter of whether the language lets the programmer see it, or lets a linter enforce it, and everyone likes linters.
> So the claim "tend to grow them" ... it is not completely wrong, but it also does not fully capture an independent want to add them. It comes ALWAYS from people who WANT types.
Maybe you misidentified where the type declaration is coming from? It might not be coming from people who want to see types in the source code, it most probably is coming from people who want a decent linter.
In 2026, programming without type-enforcement is like programming using an LLM; it's quicker, but less safe.
I kind of think there's room for a new dynamically-typed language that is designed around being fast to execute and doesn't cost such a huge performance multiple right off the top, and starts from day 1 to be multi-thread capable, but on the whole the trend is clearly in the direction of static typing.
Other than the "new" qualifier, Lisp supports all of that - SBCL compiles to native code, ecl/gcl compile to C (IIRC), etc.
It depends; I recall programming in Tcl in the late 90s, and that has only the string and the list as datatypes, but it felt very powerful, like Lisp but without the easy syntax.
I was wondering why lisp (and forth) were omitted from the initial list of languages named in the post.
I guess Scheme is in the list because it has OK macros.
cyclone: safe C dialect preventing memory errors
zig: modern systems language with explicit control over memory
odin: another modern systems language
nim: Python-like syntax, memory safe, compiles to C/C++/JS
visual basic: event-driven language for Windows GUI apps
actionscript: language for Adobe Flash applications
php: server-side scripting for web development
typescript: JavaScript with static types
elm: functional language that compiles to JS, no runtime errors
purescript: Haskell-like language compiling to JS
haskell: purely functional, lazy language with strong types
agda: dependently typed functional language for theorem proving
idris: dependently typed language for type-driven development
coq: proof assistant based on Calculus of Inductive Constructions
isabelle: interactive theorem prover
clean: purely functional language with uniqueness typing
unison: content-addressed functional language with hashes instead of names
scheme: minimalist Lisp dialect used in academia
racket: a Scheme/Lisp dialect for language-oriented programming
prolog: logic programming with backtracking
ASP: Answer Set Programming for combinatorial search
clingo: ASP solver for logic-based reasoning
zsh: extended Bourne shell with advanced scripting
tcsh: enhanced C shell with command-line editing
awk: pattern-directed text processing language
sed: stream editor for text transformation
hack: PHP-derived language with gradual typing
verilog: hardware description language for digital circuits
whitespace: esoteric language using only spaces, tabs, newlines
intercal: esoteric language designed to be confusing
alokscript: can't find anything =(

There are very minimal versions and also huge versions with lots of libraries, batteries and the kitchen sink.
- The compile speed of Go
- The performance of Go
- The single binary compilation of Go
- The type system of Kotlin
- The ecosystem of JVM (packages for anything I could dream of)
- The documentation system/tests of Elixir
- The ability to go "unsafe" and opt for ARC instead of GC
- The result monad/option monad and match statements from OCaml/Gleam
- A REPL like Kotlin or even better, OCaml
- A GREAT LSP for NeoVim
- A package/module system that minimizes transitive dependencies
- No reliance on a VM like BEAM or JVM
I still dream about this "one size fits all" language.

In many of these other categories, clisp exceeds requirements. The REPL and Doc situation is so good it's honestly worth it for those alone. People put up with `):'(,@` soup for good reason.
You can't design an abstractly "perfect" programming language without any context. Which is why the author I think focuses on "perfectable", as in the language can be made perfect for your purpose but it's not going to be one size fits all.
I understand that this workflow can't be realized in languages whose runtime semantics are derived from type-level stuff, and while that can be quite convenient I'm personally willing to give it up to unlock the aforementioned workflow.
That, and forgoing fancy compile-time optimization steps which can get arbitrarily expensive. You can recover some of this with profile-guided optimization, but only some and my best guess based on the numbers is that it's not much compared to a more full (but much more expensive) suite of compile-time optimizations.
Do you mean actual monads or just the specific result/option containers? If you mean a fully-fledged monad abstraction then you need a more sophisticated type system than what Kotlin provides (i.e. higher-kinded types).
The existing Result type was a mistake to expose to users, IMO, as it encourages exceptions-as-control-flow and error type hierarchies which complicate error-handling even further. The convenient `runCatching` API also completely breaks reasonable error-handling on the JVM and Kotlin's structured concurrency (which happens to use exceptions-as-control-flow to signal coroutine cancellation).
Overall, Kotlin is moving away from higher-kinded types in the core language, not toward them.
I've been wanting to adopt Lean for a project but wasn't sure about the speed. Nice to hear that it should be good on that front.
#eval (UInt8.ofNat 256 : UInt8)
#eval (4 - 5 : Nat)
The first should be a compile time error, right? Because `UInt8.ofNat` is going to require that its argument is 0-255. And the second should be a compile time error because subtraction should not give a `Nat` unless the first argument is definitely more than the second. Nope! Both give 0.
On the other hand, array indices by default do require such a proof, i.e., this code produces a compile time error:
def x := #[1, 2, 3, 4]
#check x[7]
Kevin Buzzard even wrote a blog post about a similar question about division by zero: https://xenaproject.wordpress.com/2020/07/05/division-by-zer...

It definitely is a bad convention because it's highly surprising. That's what makes it a footgun.
> that would be a really annoying thing to use
Sure. So maybe provide "unchecked" versions for when people don't want to bother.
We've known this about interface design for literally decades. The default must be safe and unsurprising. You need to opt into unsafety.
You know that `Nat` represents non-negative numbers, and you see that `1 - 2` does not produce a compile error. What value do you expect then? What’s so surprising about choosing zero as a default value here? Do you expect it to panic or what?
These aren't the only reasonable semantics, and Lean will certainly let you define (for instance) a subtraction function on natural numbers that requires that the first argument is greater than or equal to the second argument, and fail at compile time if you don't provide a proof of this. These semantics do have the benefit of being total, and avoiding having to introduce additional proofs or deal with modeling errors with an `Except` type.
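A minimal sketch of that alternative (the name `subChecked` is my own; this is not a standard library function): a subtraction that refuses to elaborate unless you supply a proof that the second argument does not exceed the first, turning underflow into a compile time error rather than a silent 0:

```lean
-- Underflow-checked subtraction: callers must prove `b ≤ a`.
def subChecked (a b : Nat) (h : b ≤ a) : Nat := a - b

#eval subChecked 5 3 (by decide)   -- 5 - 3 = 2
-- subChecked 3 5 (by decide)      -- rejected: `decide` cannot prove 5 ≤ 3
```

`by decide` discharges the proof for literals; real callers would thread the hypothesis through from earlier bounds checks.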
No doubt there will be plenty of comments to your comment trying to rationalise this.
Lol
- Emacs: https://github.com/leanprover-community/lean4-mode
- Neovim: https://github.com/Julian/lean.nvim
I'm using the Emacs lean4-mode and it's pretty good.
But as someone who came of age in the AIM / ICQ / IRC days, it feels pretty normal. That's just how we wrote. I still fall into it by accident when the context is right and I'm not thinking about it (eg Slack at work). I hope youngsters aren't judging me for it.
if you wanted more than one sentence you sent one then wrote the other
it's painful to read longform
the victorians didn't give up on punctuation and regular english just because they had the telegraph
but surely its correlated
most of this comes from me noticing how funny sql looks with all the people trying to use caps all over the place as if anyones working in a place without syntax highlighting in 2026. sql is the wild west and everyones sql looks like shit there is no shame. i was told i needed to use caps more early on in sql and i lmfao'd, but i was new to the career and that scarred me. i write lower case sql just to spite others now and if you see something capitalized you know i meant it, but for the most part you have to pay me to use my shift key.
my trauma is now your trauma