(Closely related is the idea of invariants: properties that are preserved by particular operations or functions. Invariants are often tied to some algebraic structure of the system, but they can be easier to identify and they support many of the same insights. Reasoning about a system's invariants is another great way to make progress on hard problems.)
I find that very few engineers (especially in hardware) have had exposure to this stuff. Being the only one in the room who's had an abstract algebra course means I've occasionally been able to provide a completely different line of attack on hard problems. This has been good for my career!
As an example, I once helped a friend debug a complex system that was not behaving correctly. There was an input state, a nasty sequence composed of simple operations applied to that state, and an output state, which was not behaving as expected. A bit of algebraic thinking showed that each of the simple operations preserved an invariant, so no sequence of valid operations, no matter how complex, could produce that output. This meant that debugging attention could be directed at the implementations of those simple operations, which led to finding bugs in short order. This saved a lot of work because the actual sequence came from elsewhere and would have been difficult to audit!
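The shape of that argument can be sketched in a few lines (a toy reconstruction of my own, not the actual system; here the hypothetical invariant is the parity of the state's sum):

```python
# Toy model: the state is a list of integers, and every "simple
# operation" preserves the parity of the state's sum. Then no sequence
# of correct operations, however long, can change that parity.

def op_swap(state, i, j):
    # Swapping two slots leaves the sum (and its parity) unchanged.
    s = list(state)
    s[i], s[j] = s[j], s[i]
    return s

def op_transfer(state, i, j, k):
    # Moving k units between slots preserves the sum exactly.
    s = list(state)
    s[i] -= k
    s[j] += k
    return s

def parity(state):
    return sum(state) % 2

start = [3, 5, 8]            # sum = 16, parity 0
assert parity(start) == 0

# Any composition of valid operations keeps the invariant:
after = op_transfer(op_swap(start, 0, 2), 1, 2, 4)
assert parity(after) == 0

# So an observed output like [3, 5, 9] (parity 1) cannot come from any
# sequence of correct operations -- one of the implementations of the
# simple operations must be buggy.
assert parity([3, 5, 9]) == 1
```

Once the invariant is identified, the (possibly huge) operation sequence never needs to be audited at all.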
I relate its value in programming to the Torvalds adage "Bad programmers worry about the code. Good programmers worry about data structures and their relationships." Taking some abstract algebra really helps you think about data structures and their relationships, and to architect "good bones" in your code.
I had much better intuition for analysis than for algebra. The results in algebra are just more surprising than those in analysis for me. My attempts at proofs in algebra were kind of like random-walks where I would eventually stumble upon the answer and then I would have to reconstruct the logical steps to get there without all of the unnecessary circumlocution.
Years later, as a much better student, I took a graduate course in group theory and really enjoyed it because I actually spent some time studying the subject.
I really love the way that abstract algebra deals with such simple, almost meager entities: sets with just a few basic operators. The theorems about these completely abstract, virtual rather than actual things reveal properties that are foundational for all of math and somehow underlie our reality.
The problem is the embarrassment of riches [1].
Compounding the problem is the awful marketing some of these products have for casual and hobby learners. At this point, I'm willing to spend a few hundred dollars on a personal edition of a product like Wolfram's Mathematica, MATLAB, or Maple, but I'm not sure what the best investment of my time and money would be.
Could you recommend any courses or books that use a CAS to teach math concepts and applications?
1: https://en.wikipedia.org/wiki/List_of_computer_algebra_syste...
Opinion: There’d be a lot more people interested in math if it were taught with greater emphasis on visualization, experimentation, and self-verification (i.e. via CAS/programming).
Speaking for myself, it vastly enhances the value of my “pen and paper” and “stare at book” math time (the old fashioned way—also valuable and necessary, but [for me] not sufficient, for deep understanding).
For me, a group is the set of automorphisms (self-isomorphisms) of a graph. If you expand the definition of "graph" a little to include continuous spaces, that is sufficient to define all groups. And yet, this nice, intuitive definition of a group might show up at the end of a course in group theory - if you're lucky.
It really is a shame how much intuition is stripped from mathematics teaching in the name of formalism.
Groups are taught in the context of addition/subtraction, permutations, or symmetry groups, which are far more intuitive than graphs.
Eventually, you can justify the study of permutations using the fact that if you take any algebraic structure and consider the group of automorphisms, you get a group.
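A tiny concrete instance (my own example, brute-forced rather than proved): the automorphisms of the additive group Z/5 are the maps x ↦ kx for k = 1..4, and under composition they satisfy the group axioms themselves:

```python
# An automorphism of (Z/5, +) is determined by where it sends 1, which
# must be a unit mod 5. Represent each map as a tuple of images.
n = 5
autos = [tuple((k * x) % n for x in range(n)) for k in range(1, n)]

def compose(f, g):
    # (f o g)(x) = f(g(x))
    return tuple(f[g[x]] for x in range(n))

identity = tuple(range(n))          # the k = 1 map
assert identity in autos            # identity element
for f in autos:
    for g in autos:
        assert compose(f, g) in autos                      # closure
    assert any(compose(f, g) == identity for g in autos)   # inverses
```

Associativity comes for free, since function composition is always associative.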
Well, then you are missing the entire (additive) group of the integers... But you are right in that automorphisms are the most important example, and they also play a huge theoretical role (as representations of abstract groups). This is what gives group theory its importance in math and physics.
Alternatively, take a directed graph with:
V = The integers
E = {(x, y) | x - y = 1}
and that graph has the same isomorphism group.
Replace the nodes of that graph with asymmetric graphs, and the resultant undirected graph again has the same isomorphism group.
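You can check a finite analogue of that directed graph by brute force (my own sketch; the directed n-cycle with edges (x, x+1 mod n) plays the role of the integers with successor edges, and its automorphisms turn out to be exactly the n rotations, i.e. the group Z/n):

```python
from itertools import permutations

# Directed cycle on n vertices: an edge from each x to its successor.
n = 6
edges = {(x, (x + 1) % n) for x in range(n)}

# A graph automorphism is a vertex permutation that maps the edge set
# onto itself (direction included, so reflections don't qualify).
autos = [p for p in permutations(range(n))
         if {(p[x], p[y]) for (x, y) in edges} == edges]

# The rotations x -> x + c (mod n):
rotations = [tuple((x + c) % n for x in range(n)) for c in range(n)]
assert sorted(autos) == sorted(rotations)   # exactly the n rotations
```

Dropping the edge directions would add the n reflections back in, giving the dihedral group instead.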
I don't disagree that working straight from the graph-theoretic definition might make things harder. My complaint is that maths is taught as formal definition -> theorems. What I would like to see is intuitive definition -> formal abstraction of intuition -> theorems.
In my maths degree I spent far too long asking myself why the definition of this or that thing was the way it was.
That is a really helpful theorem for intuition. Holy shit!
Also, most treatments do not cover category theory, which seems to have its own separate literature.
Perhaps having some geometric intuition will help greatly, viz. coordinate systems in 3D Euclidean space, answering interesting questions like "when is it legal to take the dot product of two vectors?" and "if two unit vectors are parallel, are they the same vector?"
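Both questions have quick numeric answers; a minimal sketch in plain Python (my own illustration, no library assumed):

```python
def dot(u, v):
    # The dot product is only defined when the two vectors live in the
    # same dimension.
    assert len(u) == len(v), "dimension mismatch"
    return sum(a * b for a, b in zip(u, v))

u = [1.0, 0.0, 0.0]
v = [-1.0, 0.0, 0.0]

# Both are unit vectors, and they are parallel (scalar multiples of
# each other), yet they are not the same vector:
assert dot(u, u) == 1.0 and dot(v, v) == 1.0
assert dot(u, v) == -1.0   # antiparallel -- so the answer is "no"
```
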
I feel that mathematical topics almost always benefit from positing some problem, and then "inventing" the mathematics that allow you to answer the question, followed immediately by some examples of other questions this topic can help you answer (as at least an informal justification of some of the seemingly arbitrary choices made).
Then show a similar topic you cannot quite answer and build on it with that. Eventually you will have built up the majority of the topic with motivations for each part.
Abstract algebra might be trickier to motivate than many subjects, but it should still be possible. Yet given how much trouble there seems to be in writing out motivation for more concrete topics, finding a textbook that provides motivation throughout for this area seems tricky at best.
(R, ∗, 1) isn't a group; (R_+, ∗, 1) and (R\{0}, ∗, 1) are. But that doesn't really work as an example of two different algebras defined on the same set. Having your operations obey the distributive property is incompatible with that kind of structure, except in the zero ring (where 0 = 1).
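For the record, the standard one-line argument for why distributivity forces this (a textbook fact, not from the thread): in any ring, 0 · x = 0, so 0 can only be invertible in the zero ring.

```latex
0 \cdot x = (0 + 0) \cdot x = 0 \cdot x + 0 \cdot x
\;\Longrightarrow\; 0 \cdot x = 0 \quad \text{for all } x,
\qquad \text{so if } 0^{-1} \text{ exists, } 1 = 0 \cdot 0^{-1} = 0 .
```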