The reason the printing press became so revolutionary in countries using the Latin alphabet, even though paper, printing, and movable type were invented in China, is that it is easy to build a machine around such a small character set. The English alphabet is more or less a common denominator of all the other Latin-alphabet-based languages. Any script outside the Latin/Greek/Cyrillic family has been very hard to adapt to the various forms of text technology throughout history.
But both papers illustrate a very real problem. Mathematical notation is horribly, infuriatingly, idiotically overloaded. And, no, context is not enough to divine the meaning. There are plenty of examples where a symbol is used for multiple meanings in the same paper or the same lesson.
My opinion is that the root cause of all these problems is a brain-dead decision in mathematical notation: allowing the adjacency of symbols to signify multiplication. This makes it impossible to use multi-character names for things (variables, constants, etc.), and that restriction is crippling. It drives the use of modified characters as symbols, the use of characters borrowed from other languages as symbols, and now the proposal to use emoji as symbols. Because why not: emoji are plentiful and are now easier to input, store, and display than ever.
The same issue (the need for a diversity of names) was solved by almost every programming language by requiring separator characters (space, tab, comma, semicolon, etc.) that are not allowed inside names. Imagine a programming language where it is impossible to tell at first glance what "TotalWeight" means. Is it one variable? Is it Total * Weight? Is it TotalW * eight? Is it Total * W * eight? Is it T * o * t * a * l * W * e * i * g * h * t? We can rewrite this last one as aeghilo(t^2)TW. This is mathematical notation. And it will not change, because it is entrenched.
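To make the ambiguity concrete, here is a quick sketch (the function names are mine, purely for illustration). The first function counts how many distinct parses an n-character name admits when every gap between characters may or may not be a token boundary; the second derives the canonical sorted-and-powered form of the "every letter is its own variable" reading:

```python
from collections import Counter

def parse_count(name: str) -> int:
    """Number of ways to split an n-character name into non-empty
    variable names: one binary choice (boundary or no boundary)
    at each of the n-1 gaps between characters."""
    return 2 ** (len(name) - 1)

def fully_split_product(name: str) -> str:
    """Canonical form of the fully split reading. Multiplication
    commutes, so sort the single-letter factors (lowercase first,
    matching the convention above) and fold repeats into powers."""
    counts = Counter(name)
    lower = sorted(c for c in counts if c.islower())
    upper = sorted(c for c in counts if c.isupper())
    parts = [c if counts[c] == 1 else f"({c}^{counts[c]})"
             for c in lower + upper]
    return "".join(parts)

print(parse_count("TotalWeight"))          # 1024 possible parses
print(fully_split_product("TotalWeight"))  # aeghilo(t^2)TW
```

An 11-character name already has 2^10 = 1024 readings before you even consider which splits correspond to names actually in scope; a single mandatory separator rule collapses all of that to one.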