> And I was countering that argument by saying that a large alphabet (a large set of pictograms) is hard to learn, or at least takes a lot of time, as can be seen from trying to learn Chinese or other Asian scripts.
And my counter to that is that just as you do not learn Greek in order to use some Greek characters in math, you do not learn Chinese in order to use some Chinese characters in math.
There are cases where "Δ" read as "delta" makes perfect sense for the delta between two values. Whereas "η" read as "eta" is not an obvious synonym for efficiency, or for the coefficient of viscosity, or for the metric tensor in Quantum Field Theory. And "c" read as "c" is not an obvious synonym for the speed of light, or for 100, or for the space of convergent sequences. But "光", read as "guāng" (Chinese) or "hikari" (Japanese) or "light" (English), might be obvious for something related to light. You do not need to learn the original pronunciation of a character to use it. Just have a one-to-one mapping between notation and notion. In the same way, Chinese and Japanese have completely incompatible pronunciations for the same characters.
And even better, just use words instead of characters.
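As a concrete aside: the one-to-one mapping idea already works in programming languages that allow Unicode identifiers. A small sketch in Python (which accepts non-ASCII identifiers per PEP 3131); the variable names and values here are just illustrative:

```python
# Python accepts Unicode identifiers (PEP 3131), so a symbol can map
# one-to-one to a notion regardless of how the reader pronounces it.

光 = 299_792_458       # speed of light in m/s, written as the character for "light"
Δt = 2.0               # a time interval in seconds
distance = 光 * Δt     # the notation -> notion mapping is what matters, not the reading

# And, as suggested, a plain word works just as well:
speed_of_light = 299_792_458
assert 光 == speed_of_light
```

Whether you read `光` as "guāng", "hikari", or "light", the code means the same thing, which is exactly the point.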