(* Expressions *)
type exp =
    UnMinus of exp
  | Plus of exp * exp
  | Minus of exp * exp
  | Times of exp * exp
  | Divides of exp * exp
  | Power of exp * exp
  | Real of float
  | Var of string
  | FunCall of string * exp
  | Fix of string * exp
;;
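As a sketch of how this type is used (assuming the lowercase type name `exp` that OCaml requires, and omitting `FunCall` and `Fix`, which need a function environment), an arithmetic expression can be evaluated against a list mapping variable names to floats:

```ocaml
(* Minimal evaluator sketch for the arithmetic subset of exp.
   FunCall and Fix are omitted; env maps variable names to floats. *)
type exp =
    UnMinus of exp
  | Plus of exp * exp
  | Minus of exp * exp
  | Times of exp * exp
  | Divides of exp * exp
  | Power of exp * exp
  | Real of float
  | Var of string

let rec eval env = function
  | UnMinus e -> -. (eval env e)
  | Plus (a, b) -> eval env a +. eval env b
  | Minus (a, b) -> eval env a -. eval env b
  | Times (a, b) -> eval env a *. eval env b
  | Divides (a, b) -> eval env a /. eval env b
  | Power (a, b) -> eval env a ** eval env b
  | Real f -> f
  | Var x -> List.assoc x env

(* 2.0 * x + 3.0 with x = 4.0 evaluates to 11.0 *)
let result = eval [("x", 4.0)] (Plus (Times (Real 2.0, Var "x"), Real 3.0))
```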
let rec tokenizer s =
  let (ch, chs) = split s in
  match ch with
    ' ' -> tokenizer chs
  | '(' -> LParTk :: (tokenizer chs)
  | ')' -> RParTk :: (tokenizer chs)
  | '+' -> PlusTk :: (tokenizer chs)
  | '-' -> MinusTk :: (tokenizer chs)
  | '*' -> TimesTk :: (tokenizer chs)
  | '^' -> PowerTk :: (tokenizer chs)
  | '/' -> DividesTk :: (tokenizer chs)
  | '=' -> AssignTk :: (tokenizer chs)
  | ch when (ch >= 'A' && ch <= 'Z') ||
            (ch >= 'a' && ch <= 'z') ->
      (* Re-scan from s, since ch is still its first character. *)
      let (id_str, chs) = get_id_str s in
      (Keyword_or_Id id_str) :: (tokenizer chs)
  | ch when ch >= '0' && ch <= '9' ->
      let (fl_str, chs) = get_float_str s in
      (RealTk (float_of_string fl_str)) :: (tokenizer chs)
  | '$' -> if chs = "" then [] else raise (SyntaxError "'$' before end of input")
  | _ -> raise (SyntaxError ("illegal character: " ^ String.make 1 ch))
;;
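The tokenizer leans on a token type, a `SyntaxError` exception, and the helpers `split`, `get_id_str`, and `get_float_str`, none of which appear in the excerpt. A plausible, minimal sketch of those supporting definitions (the `'$'` end-of-input sentinel follows the tokenizer's last-but-one case):

```ocaml
(* Hypothetical supporting definitions for the tokenizer excerpt;
   the original code does not show them, so this is one plausible shape. *)
exception SyntaxError of string

type token =
    LParTk | RParTk | PlusTk | MinusTk | TimesTk
  | PowerTk | DividesTk | AssignTk
  | Keyword_or_Id of string
  | RealTk of float

(* Split a string into its first character and the rest.
   Empty input yields '$', the end-of-input sentinel. *)
let split s =
  if s = "" then ('$', "")
  else (s.[0], String.sub s 1 (String.length s - 1))

let is_alpha c = (c >= 'A' && c <= 'Z') || (c >= 'a' && c <= 'z')
let is_digit c = c >= '0' && c <= '9'

(* Consume the maximal prefix of s whose characters satisfy pred,
   returning that prefix and the remainder. *)
let take_while pred s =
  let n = String.length s in
  let rec len i = if i < n && pred s.[i] then len (i + 1) else i in
  let k = len 0 in
  (String.sub s 0 k, String.sub s k (n - k))

let get_id_str s = take_while is_alpha s
let get_float_str s = take_while (fun c -> is_digit c || c = '.') s
```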
Hint: this isn't Rust. Programming-language researchers didn't start investigating linear (or affine) types until 1989. Without the constraint that vectors, boxes, strings, etc., are linear, Rust could not deliver its memory-safety guarantees (unless Rust were radically changed to rely on a garbage-collected runtime).
>it's a damning indictment of programming culture that people did not adopt pre-Rust ML-family languages
In pre-Rust ML-family languages, it is harder to reason about CPU usage, memory usage, and memory locality than it is in languages like C and Rust. One reason is that pre-Rust ML-family languages need a garbage collector.
In summary, there are good reasons ML, Haskell, etc, never got as popular as Rust.
I'll say that for a long time I've been quite pleased with the general direction of the industry in terms of language design and trends around things like memory safety. For a good many years we've seen functional features being integrated into popular imperative languages, probably since map/reduce became a thing thanks to Google. So I'll give us all credit for coming around eventually.
I'm more dismayed by the recent AI trend of asking an AI to write Python code and then just going with whatever it outputs. I can't say that seems like a step forward.
If we are speaking of optimizing compilers, there is MLton, which delivers performance while ensuring that the application doesn't blow up in strange ways.
The problem is not people learning these features from Rust; I'm glad that they do. The issue is that they think Rust invented them.
Sure, rewrites most often come out better simply by virtue of being rewrites, but the kind of parallel processing they do may not be feasible in C.