---
They're bad for accessibility: they don't work with screen readers, they're hard to make out for people without perfect vision, and they're harder to type out.
They don't render well on many systems. They can't be handwritten. And how are they to be pronounced? "croissant emoji squared plus girl wearing hat"?
Conventions exist for what symbols to use where in science and math. Don't mess with this. Kids won't magically find math easier if you use emoji in place of symbols.
Instead, please focus your efforts on improving teaching methods.
I find it very hard to internalize math without going through the derivations with pen and paper. Though I must admit that I have slowly gotten to the point of preferring to proofread calculations for publication by typing out the intermediate steps in LaTeX and then commenting them out.
I knew a girl named 胡伊人 Hú Yīrén, who for a long time had her wechat display name as three emoji, a tiger face, a hand holding up one finger, and a human girl's face. (The intent would have been to read these as "tiger", "one", and "person", 虎一人 hǔ yìrén.)
But emoji didn't appear in notifications, and my phone was set to English rather than Chinese, so whenever I got a message from her the notification would read "[Tiger][No. 1][Girl] ...". I started thinking of her as "tiger number one girl", a bit.
Sure, "girl wearing hat" might be a bad choice, but I think simple emoji like tree, fire, and snowflake (for example) do a good job of replacing Greek letters.
Another place I use emoji is for a set of labels which should have no order. Mathematicians often say they are naming some things 1, 2, 3, but please forget that the integers have arithmetic and an ordering. If you use fire, tree, and snowflake as the names, no one assumes an ordering or arithmetic.
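A quick sketch of this idea in Python (emoji aren't valid Python identifiers, so they appear here as string labels; the names and values are made up):

```python
# Emoji strings as unordered labels. Integer labels invite accidental
# arithmetic and ordering assumptions; opaque labels carry no such structure.
values = {"🔥": 3.2, "🌳": 1.7, "❄️": 2.9}

# Nothing here suggests 🔥 < 🌳 or 🔥 + 🌳. Only the values support
# arithmetic, never the labels themselves.
total = sum(values.values())
print(round(total, 1))  # 7.8
```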
Not sure what the problem is here. Even without knowing Greek or the Greek alphabet, its letters are sufficiently diverse and can easily be told apart.
I can understand that emoji can make math more enticing for young children, but if high-school-and-up students cannot get past foreign-looking Greek letters, I doubt they would find the motivation to progress much further in understanding the concepts involved.
Are you saying kids will not be able to distinguish between the two symbols? I highly doubt that. They may not remember the names of the symbols, but emojis have the same problem.
I was inclined to agree with you, but then I remembered the zodiac sign emoji. Capricorn is basically η with a looped tail and taurus is gamma ɣ with a bigger loop. I'm pretty sure kids wouldn't be able to write them by hand (neither would I), but they have the advantage that if you type their names on a phone keyboard, you just need to recognize the symbol and tap it. Maybe all that's needed for Greek letters to compete with emoji is equal treatment by keyboards.
As for the criticism that emoji can't be handwritten or pronounced: show me anyone writing a proper xi, and ask the Greeks what they think of how foreigners pronounce their letters (or indeed how one pronounces boldface letters). Clearly these are problems that have been solved before, and so can be solved again to the same level of quality.
Further, when handwriting you really have far more freedom than when typesetting. I had a friend who taught me the useful shorthand of just drawing a sphere to indicate that the surface integral was over a sphere. I tend to name partial results things like boxes or triangles when working things out, because lugging around a second or even third "a" is just not as clear. Importantly, it's also much more fun to call your partial integral "small house" than "iii".
All that being said, the example choices in the article seem like straight up bad choices: naming the sides of a triangle a specific direction seems unhelpful in the fairly common case of several triangles with opposite orientation for example.
I can't disagree with you, though. Curious what makes it fine there, but bad here.
The reason the printing press became so revolutionary in latin alphabet using countries even though paper, printing and movable type were invented in China, is because it is very easy to make a machine that uses it. The English alphabet is more or less a common denominator of all other latin alphabet-based languages. Any non-latin/greek/cyrillic script is very hard to adapt to the various forms of technology throughout history.
But both papers illustrate a very real problem. Mathematical notation is horribly, infuriatingly, idiotically overloaded. And, no, context is not enough to divine the meaning. There are plenty of examples where a symbol is used for multiple meanings in the same paper or the same lesson.
My opinion is that the cause of all these problems is a very brain dead decision in math to allow adjacency of symbols to signify multiplication. This results in no longer being able to use multiple character names for things (variables, constants, etc.). And this is crippling. It results in the use of modified characters as symbols, use of characters from other languages as symbols and now the proposal of use of emoji as symbols. Because why not, emoji are plentiful and are now easier to input, store and display than ever.
The same issue (diversity of names) was solved by almost all programming languages by having mandatory separator characters (space, tab, comma, semicolon, etc.) that are not allowed in names. Imagine a programming language where it is impossible to tell at first glance the meaning of something like "TotalWeight". Is it one variable? Is it Total * Weight? Is it TotalW * eight? Is it Total * W * eight? Is it T * o * t * a * l * W * e * i * g * h * t? We can rewrite this last one as aeghilo(t^2)TW. This is mathematical notation. And it will not change, because it is entrenched.
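A sketch of the same point in Python (all names made up): because the multiplication operator is mandatory, `TotalWeight` can only ever mean one identifier.

```python
# With mandatory separators, multi-character names are unambiguous.
# In mathematical notation, "TotalWeight" could mean Total*Weight,
# T*o*t*a*l*..., and so on. In code, it is exactly one variable.
Total = 4.0
Weight = 2.5
TotalWeight = 10.0

print(TotalWeight)     # one variable: 10.0
print(Total * Weight)  # an explicit product: 10.0
# "TotalWeight" can never silently mean Total * Weight; the `*` is required.
```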
I would have struggled during my math and stats degrees if I had to distinguish between emoji and mathematical notation.
It doesn't really matter if you can't write down an emoji if you don't write anything down.
Because of the concepts used.
>I have to translate every character back to the concept meant by this character
Using emojis for variables won't help with that.
>Every scientific domain has its usual notation for specific concepts
Does that imply each emoji would be used for a single concept across all domains? Not only would that not be possible (considering people don't even always use the same variable for the same thing, e.g. the Pythagorean theorem being a^2+b^2=c^2, α^2+β^2=γ^2, x^2+y^2=c^2, ...), but it is also a bad idea even if it could work, as it would imply you have to remember every single emoji used.
Again, math is about concepts not what symbol you use for a variable. Using emojis won't make anything easier to teach or to understand.
I wouldn't use pictograms everywhere, but they could be elegant in certain contexts, e.g. a dedicated pictogram for each of the base physical quantities (mass, length, time, etc.).
This is like asking why I'm writing in English even though it's not my native language, I do that because almost every educated person understands English, while almost nobody would understand if I wrote in Italian.
Pictograms would be difficult to standardize and hard to reproduce, unless you commit to a small set of them, which would just become an alphabet in disguise.
What if you have two different lengths, one being the distance between two objects and the other being the position of the center of mass over time? What about their two masses? And, in relativity, their two proper times?
Now you're back to using subscripts and the Latin alphabet because what better way to describe the precise object you're talking about in a given language (here: English) than the language itself?
>Using emojis for variables won't help with that.
Yes, they will, because there is an amount of meaning embedded in the graphical shape of the emoji used, whereas there is none for a Greek/Cyrillic/Phoenician/etc. character.
But the real solution is to not limit names to a single character.
See my comment here: https://news.ycombinator.com/item?id=25207527
This is also the reason why notation is often terse, and highly domain/author specific. You get tired of writing long_variable_names very quickly if you do it OVER AND OVER by hand. Now think about how much work and confusion it would be to replace those symbols with little paintings...
I assure you there is plenty of exploration done in tools like Jupyter.
[1] https://www.jefkine.com/general/2016/09/05/backpropagation-i...
Something I strongly disagree with in both the Tau manifesto and this emoji manifesto is the notion that tau and emoji should be incorporated into the official literature. Pi is 'wrong' - always use tau. Einstein's papers would be easier to understand if you used the fire emoji instead of E.
Of course not. But on the other hand, the responses to these arguments attack that aspect of the claim rather than the intent behind it. "You can't use tau because all textbooks use pi; you would need to reprint every textbook in the world." "You can't use emoji because the support is bad, you can't draw them, and you can't pronounce them."
What we're missing, and where all of these belong, is in a formal explanation format. Most attempts to break down and make concepts more palatable tend to be blogs, YouTube or even broadcast media. We don't see anything in between 'formal paper or textbook' and 'colourful diagram aimed at beginners'. Where are the colourful diagrams for your latest paper on Flat Chains in Banach spaces?
[1] https://betterexplained.com/articles/colorized-math-equation...
Thank you for connecting all these concepts together. And I agree that this is one of the hurdles preventing people from becoming more than beginners. We need a gentler progression in abstraction. For many people, learning math is like learning Vim.
I almost thought the entire article was sarcasm until I looked through Chrome.
[1] http://tug.ctan.org/info/symbols/comprehensive/symbols-letter.pdf
[2] http://www.ctan.org/pkg/halloweenmath
Or go back in history I suppose (𐆖, 𐆗, ₶):
* https://en.wikipedia.org/wiki/Currency_symbol#List_of_histor...
energy = mass · lightspeed²
↑ As you would do in programming.
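The line above translates directly into code; a minimal sketch (the mass value is an arbitrary example):

```python
# Spelled-out names instead of single-letter symbols, as in the line above.
lightspeed = 299_792_458          # m/s, exact by definition of the metre
mass = 1.0                        # kg, arbitrary example value

energy = mass * lightspeed ** 2   # joules; E = mc^2 written out in words
print(energy)                     # ≈ 8.99e16 J
```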
https://www.amazon.com/Algebraic-Geometry-Projective-Varieti...
This gets very tedious to write if you use it over and over in proofs.
See my comment here: https://news.ycombinator.com/item?id=25207527
And in the case of some languages we can even use the full range of Unicode for variable names (Julia). The only restrictions are that we have mandatory separators and that adjacency does not automatically mean multiplication.
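For instance, Python also accepts Greek letters as identifiers (though, unlike Julia, not emoji); a quick sketch with made-up values:

```python
# Greek letters are valid Python identifiers, so the notation from the
# thread's Pythagorean example carries over directly.
α = 3.0
β = 4.0
γ = (α ** 2 + β ** 2) ** 0.5  # Pythagorean theorem with Greek names

print(γ)  # 5.0
```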
See for example VS Code Live Share for the last one if you are not aware of it.