I think it was Alan Kay who spoke about the dynamic nature of OOP; this is one example. Even in a statically typed environment, OOP demonstrates its dynamic nature.
class Expression {}
class Add : Expression {}
class Const : Expression {}
class Var : Expression {}

// A general fallback plus a specialization for the (Const, Var) pair:
void FSpecialization(Expression e1, Expression e2) { ... }
void FSpecialization(Const c, Var v) { ... }
...
void F(Expression e1, Expression e2)
{
    // Casting to dynamic defers overload resolution to runtime, so the
    // most specific FSpecialization for the actual argument types is chosen.
    FSpecialization(e1 as dynamic, e2 as dynamic);
}
How is this not exhaustive? Any new subclass of Expression will be routed to the first FSpecialization, same as it would be with a _ -> default_f in algebraic pattern matching.
Because what happens if you introduce a new expression type but don't realize this means the implementation of F should be updated?
Regarding algebraic pattern matching: you can drop the default case and let Haskell check exhaustiveness. The compiler will then warn you if you forgot a case.
To give you an example: in our codebase we have an enum that is used 3068 times in one solution. A very similar one (you would need domain knowledge to understand the difference) is used 2985 times. Neither is defined in that solution, by the way.
It's not out of the question that they will receive a new enum member down the line. It would be very useful if the compiler could tell you that a switch over either enum is no longer exhaustive.
If you could exhaustively switch on an enum, you could skip the default: case and have the compiler enforce that you covered all cases. (I know, C#'s enum == int semantics would make that impossible.)
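For comparison, this is exactly what Java's switch expressions (Java 14+) give you for enums: a switch expression must be exhaustive, so you can omit the default and a missing constant becomes a compile error. A minimal sketch with a made-up Op enum:

```java
enum Op { ADD, SUB, MUL }

class Eval {
    // A switch *expression* over an enum must be exhaustive. If a new Op
    // constant is added later, this method stops compiling until the new
    // case is handled -- no silent fall-through to a default.
    static int apply(Op op, int a, int b) {
        return switch (op) {
            case ADD -> a + b;
            case SUB -> a - b;
            case MUL -> a * b;
            // no default: leaving one of the cases out would be a
            // compile-time error, which is the check asked for above
        };
    }

    public static void main(String[] args) {
        System.out.println(apply(Op.MUL, 6, 7)); // 42
    }
}
```

(Java enums are real classes rather than bare ints, which is what makes this feasible there.)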
If you removed FSpecialization(Expression e1, Expression e2), a.k.a. the wildcard pattern, the compiler would not warn you that your dispatching is no longer exhaustive. And if you added another type of expression, all your dispatchers and visitors without a wildcard would become non-exhaustive while remaining valid code from the compiler's perspective.
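The classic OOP way to recover that compile-time check is a visitor interface with one method per subtype and no catch-all. Sketched in Java with hypothetical names: adding a new Expression subtype means adding a method to the interface, which breaks every existing visitor at compile time until it handles the new case.

```java
interface ExprVisitor<R> {
    R visitAdd(Add a);
    R visitConst(Const c);
    R visitVar(Var v);
    // Adding e.g. `R visitMul(Mul m);` here forces every implementing
    // visitor to handle it -- the compile error *is* the exhaustiveness check.
}

abstract class Expr {
    abstract <R> R accept(ExprVisitor<R> v);
}
class Add extends Expr {
    <R> R accept(ExprVisitor<R> v) { return v.visitAdd(this); }
}
class Const extends Expr {
    <R> R accept(ExprVisitor<R> v) { return v.visitConst(this); }
}
class Var extends Expr {
    <R> R accept(ExprVisitor<R> v) { return v.visitVar(this); }
}

// A concrete visitor: it must implement all three methods to compile.
class NameVisitor implements ExprVisitor<String> {
    public String visitAdd(Add a) { return "add"; }
    public String visitConst(Const c) { return "const"; }
    public String visitVar(Var v) { return "var"; }
}

class VisitorDemo {
    public static void main(String[] args) {
        Expr e = new Const();
        System.out.println(e.accept(new NameVisitor())); // const
    }
}
```

The trade-off is the usual expression-problem one: the visitor makes adding subtypes loud and adding operations cheap, while the dynamic-overload version makes adding subtypes silent, as noted above.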