A couple of years ago, randomly browsing YouTube, I came across this homemade video asking how people figured out the distance to the moon before modern technology. The host starts out small scale, showing he can calculate the distance to things in his back yard using trigonometry, and then scales it up to the moon.
My mind was blown, because no one ever told me that. It was simple; anyone could understand it. When I was in school, all I was told was to memorize abstract formulae, like calculating the lengths of a triangle's sides from the angles and the known length of one side. It was never contextualized to any actual, let alone interesting or fascinating, applications.
That is the main struggle for many math teachers: they try all these fun and interesting explanations, and the kids just ignore them, go straight for the formulas, and forget everything else. The problem seems easy to solve until you have tried applying the idea yourself to a real class of kids who need the material for real grades. It can be done, but a teacher who could do it could make way more money in entertainment etc., since that is what is required to get kids to pay attention.
That was novel. That is how research is done…you start with a problem and figure out a way forward.
That is not how textbooks are written. It’s like, why waste our valuable paper to print the wrong way to do something even if it helps people learn? They only print the right way to do it. The student loses the ability to participate in the discovery process and just becomes a dumb initiate who is forced to believe whatever is written down. It’s more like a degenerate religion you’re forced to memorize without the inspiring examples of all the saints and martyrs who showed others the way before you.
I'm all but certain mine didn't. For sure there was the odd contextualization and examples of real world applications here and there but definitely not enough and definitely not interesting or fascinating ones.
E.g. if there was one for trigonometry it would have been something lame like "Alice is standing in the field some distance from Bob and Billy. Calculate the distance based on <trigon blabla>"
Hardly riveting stuff :P
They don't have the time, attention span, or memory to take in both the contextualization and the formulas. After all, it's only the formulas that really matter in the end, when performing the calculations in tests/exams where you have to show your working. I also find that I can understand the context quite easily, but it's another thing to then apply it through formulas.
This is coming from a non-teacher but past student.
At University I had more time to think about the contextuals during my degree, but not in high school jumping from subject to subject.
I did not do well in either set of subjects in K-12 -- even to the point that my graduation from high school was threatened. In college I forced my way back through it all by sheer force of will and got my A's.
There's something fundamentally broken in math/science pedagogy as these subjects aren't really all that difficult. There's far too much time spent memorizing things that are trivial to look up and way too little time understanding how to use them.
An analogy might be learning to cook, and spending all of your time remembering precisely how many spoons, bowls, cups, and cloves of garlic or other ingredients you have. And doing that kind of thing for years, and maybe seeing a demo once of pouring water into a cup. And tests might contain problems like "a party of 5 is coming over for dinner, are you able to set places for all attendees for a 7 course meal?"
The real message being sent is this: "Sorry kids, actually cooking from recipes is only for academics, and to get your PhD and be allowed into the hallowed halls of these academic cooks you must come up with one original recipe (edibility will be determined by peer review)".
In college I retook everything from Algebra up and found the math pedagogy focused more on symbolic manipulation and getting used to how that works in each subject, rather than drilling arithmetic in various guises. Tests that required various pre-derived formulas were usually just open-book problems. And what mattered was how one went about solving the problem, not the rightness or wrongness of it. Calculators were absolutely expected so you didn't waste time fighting with trivial mistakes.
The sciences usually had a mandatory lab portion that forced application of math to the problem space. Because the labs typically had you collecting your own measurements, it forced you to work through the calculations yourself anyways since there was nowhere else to look up the answer. Again the methods and approaches were where the grade came from, not the slavery to memorization.
Still, while I think the approach I encountered in college was much better than grade school, it still wasn't as good as it could be.
I always wonder why editors don't understand the importance of these things and don't enforce them.
(Posted from a parallel universe.)
I know you mean this as a joke, but many cooking books are like that, and I for one value knowing the history (and science, as in "The Food Lab", highly recommended) of what I'm cooking or eating
You've just described practically any modern recipe website.
Makes me wonder if that's the difference, do people not check math papers unless they are specifically interested in that very proof or such?
An example is a paper with several mouse experiments. In the results it'll often have a single section saying that mice were raised such and such under conditions A or B. They were treated with X or Y, and samples were collected after certain times. But was Y applied to A or B mice? From the results it's clear that X was done on both, but there's no mention for Y. I guess A is more the default and they'd have specified B if that were the case, so probably A.
Not saying the status quo is optimal of course just why it is the way it is now and probably won't change soon
A beautiful piece of advice that I received as a student was to write mathematics as a series of definitions, propositions, and proofs. No text is allowed to exist outside of these three. In practice it is difficult to enforce, but it is helpful to keep this as an aim.
Oh my goodness! So there are actual people who prefer that? I had to chew my way through a fair share of books like that and I can't stand them. Clearly the person who thought of these definitions, propositions, and proofs had some reason to think of them. Sometimes they were trying to solve a problem, sometimes they were combining ideas, sometimes they were looking for structures with certain aesthetic properties. There was always a reason why they thought about this and not something else. Sometimes there are multiple possible such reasons; that is fine. In that case the author can select whichever they fancy the most. Every time I had the misfortune to read a book like what you describe, it felt like I was eating powdered milk without reconstituting it. There was a clear chain of thought between ideas, and they chose to just hide it.
I understand that math requires work. One needs to get a paper and a pencil and work out examples, check proofs, play with definitions. But why wouldn’t the author write down what made them care about the next item?
> it is helpful to keep this as an aim.
Why?
Until mid-20th century, mathematics had never been communicated in this austere manner.
When you are a working mathematician, you never start with a definition. You start with a context in which your exploration begins.
It might be a question which someone else asked that interests you (and there is a story as to why). It might be that you don't even have a question, but the objects of your study are not studied enough, so you hope you stumble into one. You can easily tell why this is an interesting thing to look at.
You do some calculations, take a look at a few examples, see if you can make a stab at the chaos in front of your eyes and find a pattern, which we formally call a conjecture.
Then you see if this pattern holds in other cases, and why. This shapes the backbone of the proof.
Once you have a basic idea of a proof, you can formulate a theorem. In the formulation, you list all the conditions to which your proof applies. The pattern might be much more general, but your proof might work e.g. "only in cases when the order of the group is invertible in the base field", or some other condition.
After all the work is effectively done, you decide that a concept that appears repeatedly in your reasoning deserves to have a name. Something convenient to call it by, so you don't have to repeat yourself. So you make a definition.
You then decide to share your joy with this world, and write a paper.
You listen to the "beautiful advice", and throw out anything that makes your paper interesting.
Context goes out of the window, along with any hope for the reader to have any idea why your paper is worth looking at. At best, you'll advertise this in talks, or explain over beers. Side note: you will have to drink a lot of beers to make it in math.
Then, you follow the advice again, and lay things out in the order exactly opposed to the one you were thinking in: definition - theorem - proof - conjectures - examples - context.
Wait, you already scrapped context, and the examples you started with aren't illustrating the results you ended up proving.
So you tidy up your paper, come up with more specific examples, and remove anything that wasn't on the direct path to your result.
Having climbed to a place where you can see better, you pull the ladder up. Good luck to anyone outside the group of five people who are actively working in this niche!
And finally, you write an abstract to your paper, where you mention the things you defined. The abstract doesn't make any sense by itself; one needs to be in-the-know to get half of it, and to read your paper to understand it.
In practice, it acts as a "No Trespassing" sign for the outsiders (i.e. anyone not in direct contact with the five people you have beers with at the Annual Niche Field Conference).
Satisfied, you lean back and post it to arXiv.
It's been a beautiful day, you think. This practice is difficult to enforce, but you kept it as an aim, and got pretty close to perfection (as exemplified by a Bourbaki text, or anything by Serge Lang, but I repeat myself).
Somewhere not too far away, a student in the class you're teaching cries.
----------------
I said "you", but as someone who's written a couple of math papers, that's really me too. We are all taught in a horrendously backwards (literally!) manner.
This perversion of the beautiful art isn't a new observation. I can't write better about it than Vladimir Arnold[1] (a titan whose name is, I hope, familiar to you).
It's worth a read to anyone who has ever studied mathematics:
[1] https://www.uni-muenster.de/Physik.TP/~munsteg/arnold.html
I have read plenty of science papers and this is extremely far from universally true.
That's there for playing the grant funding game. It's a waste for people who care about the content of the research.
It doesn't have any of the pointless bloat of most "modern" web design, and is all the better for it.
Sometimes papers are technical and don't need to pretend that they are telling an exciting story of interest to a general audience. It isn't just a math issue.
This does a considerably better job of context/interest than the math example did.
What's the mathematician supposed to do, say "group theory is cool and important"?
I also hear people say programming is boring. This is absurd.
Expression and personal power over reality are where I get the joy.
A painter can create a world and share a feeling. An author can manifest a memory. A musician can transmit a human experience without language.
Human imaginings about magic are immemorial.
Math describes reality. Math also can describe an extrapolation further.
Programming can manifest from the descriptive language of math into the real. It can use it as a pigment for a new kind of picture. Programming helps with everything below, and it enables new things slightly above.
By "above" I mean the layers of abstraction. A programmer isn't a painter, but a programmer/painter has an additional axis of art.
Programming is our species' apex of material transcendence. I don't believe it's the top of the pile, but I have no conception of what's above. Its capacity for encapsulation seems to grow as the dreams do.
Programming grows to predict, programming grows to create. Everything is just a new library, a new framework, a new environment. How long before its limits are found, so we can find the next epiphanies?
More interestingly, before symbols were used as variables in math, the Egyptians were capable of solving quadratic equations without these variables.
If I were teaching math today I would probably teach this. I would try to do algebra without using variables, and then I would use funny words like "blue" as Brahmagupta used to. I think that this would probably stop a lot of the questions like "why are there letters in math!?"
Wouldn't people just instead ask "Why are there colours in maths?"?
I don't think that this:
blue + blue = 2*blue
is any more meaningful or edifying than:

x + x = 2*x

I can't help but see a parallel with magicians, who can dazzle us because they are willing to go further than most of us, in terms of practice. In the same way, math gives you the ability to dazzle with surprising answers, to do a lot with a little.
> 3 + circle = 7
> 10 - cloud = 4
It seems like a perfectly reasonable exercise for a first grader. Clouds and puppy faces and circles. But I have to admit if I’d seen “x” in place of the symbols I’m not sure I would’ve thought it was so reasonable.
The Teaching of all Maths/Sciences should always start with a Real-World motivating example and then introduce the Maths as necessary to Solve it.
In this context see V. I. Arnold's essay; On Teaching Mathematics - https://www.uni-muenster.de/Physik.TP/~munsteg/arnold.html
Quote from the above article:
* Attempts to create "pure" deductive-axiomatic mathematics have led to the rejection of the scheme used in physics (observation - model - investigation of the model - conclusions - testing by observations) and its substitution by the scheme: definition - theorem - proof. It is impossible to understand an unmotivated definition but this does not stop the criminal algebraists-axiomatisators.
* What is a group? Algebraists teach that this is supposedly a set with two operations that satisfy a load of easily-forgettable axioms. This definition provokes a natural protest: why would any sensible person need such pairs of operations? "Oh, curse this maths" - concludes the student (who, possibly, becomes the Minister for Science in the future).
* We get a totally different situation if we start off not with the group but with the concept of a transformation (a one-to-one mapping of a set onto itself) as it was historically. A collection of transformations of a set is called a group if along with any two transformations it contains the result of their consecutive application and an inverse transformation along with every transformation.
- mathematics is boring to everyone right up until the moment you need it. Then suddenly it becomes very interesting.
The way mathematicians typically read papers is not by randomly picking through recent submissions to the arxiv and dutifully reading everything they come across. Instead, they stumble on a hard problem in their own research which they don't know how to solve, and they search to see if anyone else has worked on it before. The paper you would have discarded as pointlessly abstract or ridiculously overspecialized just yesterday suddenly reads like a riveting novel today. No amount of creative writing tips would have made it any more interesting to you yesterday - unless the writers happened to anticipate the exact reason you would end up becoming interested in it ahead of time.
That might be true for higher level math, but anything at the graduate or undergraduate level has been already curated to be interesting.
You have to learn a whole new alphabet and signs.
This is done for the sake of quick communication between mathematicians, but it's necessary to make a study and see the pros and cons.
While it's true that it makes communication faster and straightforward it keeps so many people outside of the field.
Maybe the field would benefit to go more towards philosophy and logic, explaining it with words.
In other competitive fields such as banking, basketball and football you have the higher ups caring about the pyramid below them, if only as a place to recruit new talents.
Among the math higher ups, only Jim Simons cares about the "math pyramid" so to speak.
One has to be pragmatic, the goal of getting the population interested in math is GDP and median quality of life.
I know those things are very mundane for mathematicians who are absorbed in their world trying to be the ones cracking the Riemann hypothesis, but even as that individual you have slightly better odds of making it if your surroundings look like Zurich or Cambridge vs. Baltimore or Mobile.
Matter of fact you have better odds if your country can extend the areas looking like Zurich and Cambridge and reduce the areas looking like Baltimore or Mobile.
Interesting perspective.
I studied Philosophy and Logic in university:
https://en.wikipedia.org/wiki/List_of_logic_symbols
Much of it was familiar to me because, earlier, I took a class my Physics professor insisted we should take, as he put it, "if you want to get out of the dark ages": Programming in APL.
As it turns out, many of the symbols used in APL come from Logic.
To this day I find it disturbing that Python uses "^" for bitwise XOR, because in both Logic and APL that is the symbol for AND. Anyone who studied Logic instantly recognizes the APL logic operators.
I say "interesting perspective" because the reality of what you are asking is precisely opposite what you think the outcome would be.
Among all those symbols, only ">" and "<" are somewhat intuitive; for all the others you have to learn what they mean.
Even "=" is derivative of "<" and ">": by reasoning you can see that you get to it by rotating the two lines about 30 degrees, after realizing that you are dealing with two numbers which are in fact the same, not one being bigger than the other.
def velocity(time_ms): return ...
vs.
v(t) = ...
Like nearly every operation and variable is one character or symbol long (with the puzzling exception of trig where you get a whopping 3 characters - sin/cos/tan/etc.)
Not the case for most CS papers.
A big problem for CS papers, particularly in PL (programming language research), seems to be heavy reliance on assumed knowledge of Greek letters and notation in the very field-specific way you like to use them. People would understand your paper if only they had read your previous three, which alluded to what you might have meant by these Greek letters, but only by figuring out the two citations they each have in common. If you are bad at pronouncing Greek letters and it’s a PDF so you can’t copy them, you can’t even google what you see. Even if you could, it wouldn’t help. Notation is ten times harder to search for.
(I have never, ever had this problem reading a law paper, not even slightly, not even once.)
There’s an interesting demo here from Will Crichton about how to prepare better documents for conveying understanding in PL. He has a thing to show you the “read as” on hover. https://twitter.com/wcrichton/status/1442891297333800966 https://willcrichton.net/nota
Oh, Lord; NO!
I would like to see all Human Narratives/Unnecessary frivolities/Assume-reader-is-an-Idiot language banished from the Teaching of ALL Maths/Science.
What we need is a focus on the direct teaching of Principles along with their Real World Applications.
Let's have some examples!!
(1) Dimension.
So, suppose we are in the first class in linear algebra:
"Maybe you have heard that the real line has 1 dimension, is 1 dimensional, the plane is 2 dimensional, and the space we live in is 3 dimensional. Well, that's all true enough, but in linear algebra we do better and have more: For one, we get to say clearly what is meant by dimension, and, in particular, why the line, plane, and space are 1, 2, 3 dimensional. For much more, for any positive integer n we have n-dimensional space.
Next, in linear algebra n-dimensional space is a relatively easy generalization of what we already know well in dimensions 1, 2, 3.
Why might we care? For example, we know well what distance is in dimensions 1, 2, 3, and distance in n dimensions is a straightforward generalization. In dimensions 2 and 3 we understand angle, and that also carries over to n dimensions. For more, with computing it is common to have a list of, say, 15 numbers. Well, for just one benefit, with linear algebra we get to regard that list as a point in n = 15 dimensional space, and doing so lets us do some powerful things with representing and approximating that list."
So, we get some sense of previews of coming attractions and some invitation to higher dimensions.
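The n-dimensional distance just mentioned really is that easy a generalization: the same sum-of-squares formula from dimensions 2 and 3, just with more terms. A minimal sketch (the function name is mine, chosen for illustration):

```python
import math

def distance(p, q):
    """Euclidean distance between points p and q in n-dimensional space:
    the n-dimensional generalization of the Pythagorean theorem."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

# The familiar 2-dimensional case: a 3-4-5 right triangle.
assert distance([0, 0], [3, 4]) == 5.0

# A list of 15 numbers is a point in 15-dimensional space, and the
# same formula measures how far it is from the origin.
point = [1.0] * 15
print(distance(point, [0.0] * 15))  # sqrt(15)
```

The one-line formula works unchanged for any n, which is the whole point: nothing new has to be learned to move from 3 dimensions to 15.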
(2) Optimization.
"There is a subject, with a lot of development just after WWII, called linear programming (LP). The programming is in the English sense of operational planning as in war logistics and planning as was crucial in WWII. The linear is the same as in linear algebra.
The main goal, point of LP is to find how to exploit the freedom we have in doing the operations, the work to be done, to get the work done as fast or cheaply as possible, that is, to find an optimal way to do the work.
So, the subject LP is part of optimization. There have been some Nobel prizes from applications of LP and other math of optimization to economics. There have been applications of LP to feed mixing, oil refinery operation, management of large projects, and parts of transportation."
(3) The Simplex Algorithm.
"Maybe in high school algebra you saw the topic of systems of linear equations. Well, it is fair to say that the standard way to solve such a system is Gauss elimination due to C. F. Gauss.
The idea is simple: Multiplying one of the equations by some non-zero number and adding the resulting equation to another of the equations does not change the set of solutions. So, doing that in a slightly clever way results in the system of equations with a lot of zeros, about half all zeros, so that the set of solutions is obvious just by inspection.
Then for linear programming, in practice the main solution technique is the simplex algorithm, and it is essentially Gauss elimination, but done with optimization in mind."
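The elimination idea described above fits in a few lines of code. A minimal sketch, with no pivoting, so it assumes the pivots it encounters are nonzero (real implementations swap rows to avoid that):

```python
def gauss_solve(A, b):
    """Solve A x = b by Gauss elimination plus back substitution.
    A is a list of n rows of n numbers; b is a list of n numbers."""
    n = len(A)
    # Augment A with b so each row operation applies to both sides.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for i in range(n):
        for j in range(i + 1, n):
            # Subtract a multiple of row i from row j: this zeros out
            # column i below the pivot without changing the solution set.
            factor = M[j][i] / M[i][i]
            for k in range(i, n + 1):
                M[j][k] -= factor * M[i][k]
    # The system is now triangular (lots of zeros), so the solution
    # falls out by inspection, working from the last equation up.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][k] * x[k] for k in range(i + 1, n))) / M[i][i]
    return x

# 2x + y = 5 and x + 3y = 10 have solution x = 1, y = 3.
print(gauss_solve([[2.0, 1.0], [1.0, 3.0]], [5.0, 10.0]))
```

The "slightly clever way" from the description is exactly the factor computation: each multiple is chosen so the subtraction leaves a zero in the target position.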
(4) Completeness.
A rational number can be written as p/q for integers p and q. We will see, easily, that the rational numbers are not up to carrying the load, are not up to doing the work we need done. So we need a more powerful system of numbers -- we need the real numbers.
Here is a really simple place the rational numbers fail to do what we want: At times we consider square roots. E.g., the square root of 9 is 3. Well, what is the square root of 2? Suppose that square root were a rational number, i.e., so that
(p/q)^2 = 2
Then we have
p^2 = 2q^2
so that the left side has an even number of factors of 2 while the right side has an odd number. Tilt. Bummer! That can't be. That's a contradiction.
So, there is no rational number that is the square root of 2. So, for something really simple, just finding a square root, the rational numbers fail us, can't carry the load or do the work.
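The parity argument above can even be checked by brute force for small p and q. A quick sketch (the helper name is mine, not standard):

```python
def count_factors_of_2(n):
    """How many times 2 divides the positive integer n."""
    count = 0
    while n % 2 == 0:
        n //= 2
        count += 1
    return count

# p^2 always has an even number of factors of 2, while 2*q^2 always
# has an odd number -- so p^2 == 2*q^2 can never hold, which is why
# no rational p/q can square to 2.
for p in range(1, 100):
    for q in range(1, 100):
        assert count_factors_of_2(p * p) % 2 == 0
        assert count_factors_of_2(2 * q * q) % 2 == 1
        assert p * p != 2 * q * q
```

Of course the loop only checks finitely many cases; the proof in the text covers all of them at once, which is the whole advantage of the argument.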
The real numbers will let us find the square root of 2 and much more. With the real numbers we get what we call completeness. A joke, basically correct, is that calculus is the elementary consequences of the completeness property of the real numbers. Then we generalize: Banach space is a complete normed linear space. Hilbert space is a complete inner product space. The Fourier transform works because of completeness. So, we move on and see how the real numbers are complete ...."
"Tedious"? Yup! Can use Dedekind cuts or maybe something called the normal completion, and especially the first is darned tedious!
There is a good math writer G. F. Simmons, of Introduction to Topology and Modern Analysis, who stated that the two pillars of analysis were linearity and continuity -- nice remark. He also stated that really to understand, have to chew on all the arguments, etc. or some such.
Then I decided to study the proofs really carefully, so as to "chew", and in hopes of finding techniques I could use elsewhere. When I mentioned that study technique and objective to my department Chair, his remark was "There is no time." -- he also had a point.
Commonly there is an intuitive explanation of what is going on and some views that can provide motivation to study the stuff at all.
There are a lot of books and papers. As a student, I saw a lot of the books, got copies of some of them, put them on my TODO reading list, etc. Eventually, after falling far enough behind on the list, I wondered just where all those books were coming from? It dawned on me, profs need to publish so they do. They are also supposed to have grad students and do. Then the grad students take the advanced course by their major prof and end up with a big pile of notes. Then the grad student, as an assistant prof, wants to publish so cleans up the pile of notes and contacts the usual publishers to publish a book. The top university libraries are essentially required to buy the books, so they get published and bought. And, then, often, there the books sit, gathering dust. I won't say that writing those books was a total waste, and I won't say that students should spend more time reading those books. Or, the books are there on the shelves. They are not really difficult to find. The books have work that was done. Maybe the work is useful now; maybe someday it will be useful; whatever, the work is done, the results found, and there in case they do become useful.
In the meanwhile, back to the mainline of math education, research, applications, usually there can be some helpful intuitive explanations and motivating example applications!
Apparently some authors just give up and assume that their books will mostly just gather dust. But once I wrote Paul Halmos, likely my favorite author, and got back a nice letter from him with "It warms the heart of an author actually to be read, and clearly understood, by ordinary humans." -- at the time I had no academic affiliation and was just reading his book on my own. So, Halmos was surprised that an ordinary human would be reading and understanding his book.
Ah, in what I wrote, I left out that also in linear algebra in n dimensional space, the Pythagorean theorem still holds, that is, an n dimensional version holds!
Adding unnecessary complexity will take away from its "pureness" and terseness. Language is not a barrier to entry.
You will then be graded on incomplete formulas but great storytelling.
Let's leave the storytelling to all the other fields of life.