On the front page there's another thread about rust-analyzer vs RLS. Aleksey's point[0] that "[RLS's] current architecture is not a good foundation for a perfect IDE long-term" feels similar to my coworker's conclusion from her effort to provide better editor support for PHP[1].
Parsers for compiling code into machine executables and parsers that serve LSP responses have different tradeoffs. For example, Anders mentioned the TS parser needs good error recovery and has to respond with completions/errors when a single file changes inside a thousand-file project. I vaguely remember TS had a goal of providing completions in < 50ms and errors in < 150ms. Such goals are hard to achieve as an afterthought. If your core compiler doesn't do error recovery, as with PHP, someone has to write a new compiler from scratch for a language server implementation. And if a tool such as RLS relies on the compiler to dump all its metadata as JSON and then figures out LSP responses from that[0], it's too slow for editors.
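To make that concrete, here's a toy sketch (nothing from TS's actual implementation, and a made-up one-statement grammar) of what error recovery buys you: instead of bailing at the first syntax error, a tolerant parser records a diagnostic, resynchronizes at the next statement boundary, and keeps producing nodes, so the IDE still has a tree to answer completions from.

```typescript
// Hypothetical toy grammar: a program is a list of `let <name> = <number>;`
// statements. A batch compiler could stop at the first bad statement; a
// tolerant parser emits an error node and continues. Resynchronization at
// `;` is simulated here by splitting on it.

type Stmt =
  | { kind: "let"; name: string; value: number }
  | { kind: "error"; message: string };

function parseProgram(src: string): { stmts: Stmt[]; diagnostics: string[] } {
  const stmts: Stmt[] = [];
  const diagnostics: string[] = [];
  for (const raw of src.split(";")) {
    const text = raw.trim();
    if (text === "") continue;
    const m = /^let\s+([A-Za-z_]\w*)\s*=\s*(\d+)$/.exec(text);
    if (m) {
      stmts.push({ kind: "let", name: m[1], value: Number(m[2]) });
    } else {
      // Error recovery: record the problem, keep the tree shape, move on.
      diagnostics.push(`cannot parse statement: "${text}"`);
      stmts.push({ kind: "error", message: text });
    }
  }
  return { stmts, diagnostics };
}

const { stmts, diagnostics } = parseProgram("let a = 1; let b = ; let c = 3;");
// The broken middle statement doesn't prevent `a` and `c` from being
// parsed, so completions for them still work.
```

A language server built on a parser like this can report the diagnostic for the bad statement while still resolving names from the good ones; a parser that aborts on the first error can do neither.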
TS's good editor support doesn't come for free. I think one of the most under-appreciated achievements of TS is how seriously it took editor support and how its compiler infrastructure was designed to do it well. That's why I don't believe in those hot new web languages that try to displace TS by designing a better type system. TS raised the average developer's expectations of a language to include fast completion, fast error reporting, editor autofix, F2-to-rename, and renaming a file in the editor to update all references. It's 2020 and people aren't going back to writing code like in Notepad.
[0]: https://ferrous-systems.com/blog/rust-analyzer-2019/
[1]: https://github.com/microsoft/tolerant-php-parser
---
EDIT: grammar.
In a way this is very intuitive: a programming language is a kind of UI for the language's runtime. The IDE is just another UI layer on top of that.
(32-bit Delphi 2 switched to a C-based compiler originally written by Peter Sollich. I maintained the front end on that compiler for 6 years; PS followed AH to MS.)
Any interesting stories to share? I read somewhere that the compiler is now a million lines of undocumented code.
I may be remembering this wrong, but it seemed virtually free - it was about $100 when its competitors and most software were $500+. That was in about 1986. The only way to get a manual was to buy the software. Another world...
Sadly, Borland abandoned that market when they decided to go after enterprises and renamed themselves Inprise; their prices skyrocketed after that.
Likewise I paid 150 euros in today's money for Turbo C++ for Windows a couple of years later.
Borland products were king for small business, but then they decided to switch focus to corporate users, and the rest is history.
As a side note, Delphi and C++ Builder are still quite loved in Germany; several companies keep using them, and there is at least a yearly conference.
I kind of hope Anders moves on from TypeScript soon, he’s done a fantastic job there considering the constraints of JavaScript, but I’d love to see him tackle something new.
I’d love to see him take lessons learned from C# for a language designed for WASM from the get go.
Anyone know of a declarative interface for describing services and their interactions that "compiles" to a complete cloud resource manager?
Discussed at the time: https://news.ycombinator.com/item?id=11685317
I don't know enough to tell whether this hunch is correct; I'm wondering if anyone better informed can chime in.
Also, it's very interesting how Roslyn had the idea of the compiler as an API. We went in the opposite direction and didn't add any extensibility there; instead we made the code editor an API.
Everything - the editor, semantic analysis, version control, the execution engine - uses the same data structures (the same abstract syntax tree).
We use functional data structures everywhere and do functional updates within the AST all the time; that's even how text entry in the editor updates the program.
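For readers unfamiliar with functional updates: here's a minimal sketch (my own illustration, not their actual data structure) of a persistent tree, where editing one node copies only the path from the root to that node and shares every untouched subtree between the old and new versions.

```typescript
// Persistent tree with path copying: `updateAt` allocates O(depth) new
// nodes and leaves the old version fully intact, so "text entry produces
// a new program" costs far less than copying the whole AST.

interface TreeNode {
  readonly text: string;
  readonly children: readonly TreeNode[];
}

const leaf = (text: string): TreeNode => ({ text, children: [] });

// Return a new root with the node at `path` (child indices) replaced.
function updateAt(root: TreeNode, path: number[], replacement: TreeNode): TreeNode {
  if (path.length === 0) return replacement;
  const [i, ...rest] = path;
  const children = root.children.slice();
  children[i] = updateAt(root.children[i], rest, replacement);
  return { text: root.text, children };
}

const v1: TreeNode = { text: "root", children: [leaf("foo"), leaf("bar")] };
const v2 = updateAt(v1, [1], leaf("baz"));
// v1 is unchanged, and the untouched subtree is shared by identity:
// v1.children[0] === v2.children[0]
```

Because old versions stay valid, the editor, undo history, and any analysis running concurrently can each hold their own snapshot of the program without copying or locking.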
Is that data structure suitable for all those purposes though?
How do you do optimisations like GVN on an AST?