If I had to teach someone OOP, I'd say: OOP is like the real world if it were made of puppets, and every puppet could pull another puppet's strings. Pull a string and a dog barks and wags its tail, causing a cat to meow. The key is just not tangling up your strings.
I really don't understand why people have such issues with monads. I don't have a Haskell background and originally built up most of my functional programming skill in Python (going somewhat against the grain), and I find "monad" to be a fancy, scary term for an extremely simple concept. I think if they were named something friendlier, perhaps people's eyes wouldn't immediately glaze over and they could realise that, actually, monads are just a simple strategy for controlling how things are executed/computed. Computational or execution strategies, if you will. Let's just call them strategies and see if they are less confusing and scary?
So I think there's a spectrum of understanding and effective use when it comes to monads and other algebraic structures like these, and I'm skeptical when people say, "oh yeah, a monad is just X, it's simple." I've said that myself in the past and I was wrong, and so were most of the people I've heard say that.
To put it another way: when I hear people who know what they're talking about say it's simple, they are talking about its structure, not about understanding what it is and how it is used. In that sense it is quite simple. But as Euclid said, there's no royal road to geometry (er, or monads...).
That is, the basic ideas for things like a Maybe or similar were obvious and intuitive to me, and every software dev who's been at it for long has done such things, even if they never think about which parts they could refactor out into something reusable.
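As a concrete sketch of that refactoring (the lookup table and names here are invented for the example), compare the hand-rolled check-and-bail pattern with letting Maybe's Monad instance do the plumbing:

```haskell
-- Hypothetical lookup: the table and names are made up for the example.
lookupAge :: String -> Maybe Int
lookupAge name = lookup name [("ada", 36), ("grace", 45)]

-- Written by hand, the check-for-failure pattern repeats at every step:
bothAgesManual :: Maybe (Int, Int)
bothAgesManual = case lookupAge "ada" of
  Nothing -> Nothing
  Just a  -> case lookupAge "grace" of
    Nothing -> Nothing
    Just b  -> Just (a, b)

-- The Maybe monad is exactly that pattern factored out into (>>=):
bothAges :: Maybe (Int, Int)
bothAges = do
  a <- lookupAge "ada"
  b <- lookupAge "grace"
  return (a, b)  -- a Nothing at any step short-circuits the whole thing
```

Both versions compute the same value; the monadic one just names the plumbing once instead of repeating it.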
But the leap from that to how it makes IO possible without side effects I just did not grok. It only clicked when I read something or other that described (while admitting it was lying a little bit) the IO monad as something that threads the state of the world through the program; that is, the world state becomes both an (invisible) input and output of each function, so effects on it are fully captured by the parameter(s) and return type of the function.
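A toy sketch of that "little lie" (the World type here is just a list of pending input lines, an assumption made up for the demo; real GHC IO is not implemented this way):

```haskell
-- Toy model: the "world" is a list of pending input lines (made up for the demo).
type World = [String]

-- An "IO action" is a pure function from a world to a result plus a new world.
type Action a = World -> (a, World)

-- Reading a line consumes one line of the world.
readLine :: Action String
readLine (l:ls) = (l, ls)
readLine []     = ("", [])

-- Sequencing threads the world state through; this is essentially monadic bind.
andThen :: Action a -> (a -> Action b) -> Action b
andThen act f = \w -> let (x, w') = act w in f x w'

-- Read two lines and concatenate them, as a pure function of the world.
twoLines :: Action String
twoLines = readLine `andThen` \a ->
           readLine `andThen` \b ->
           \w -> (a ++ b, w)
```

Nothing here is impure: effects only show up as the difference between the world that goes in and the world that comes out.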
No... see... you understand a monad. That is not the same thing as grokking "Monads" in the abstract sense.
For instance, I don't think the State monad really has anything to do with how things are 'executed/computed'.
As for calling them strategies… if we called them that it would make learning and understanding the intermediate and advanced concepts significantly less accessible and more difficult. We should call things what they are.
The reason could be that the term "monad" comes from category theory, and that is a topic not everybody is too familiar with...
Sure - I wasn't seriously suggesting we rename them, just that in my personal experience, newcomers get scared off seemingly by the name alone before ever actually finding anything out about them. A friendlier name may avoid this for newcomers. It definitely wouldn't be worth it for more advanced practitioners where referring to them exactly and in the context of their category theory roots is much more useful than friendliness.
It's an abstraction, and abstractions are hard. In a way, it's similar to why beginners tend to have trouble with recursion (if you try to understand a recursive program by stepping through it without using abstractions like preconditions and postconditions).
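For instance (a made-up example), the abstraction that tames recursion is trusting the function's contract at the recursive call instead of mentally unrolling it:

```haskell
-- Precondition:  n >= 0
-- Postcondition: returns 0 + 1 + ... + n
--
-- To check the recursive case you only need the contract, not a trace:
-- if sumTo (n - 1) really is 0 + ... + (n - 1), then n plus that
-- is 0 + ... + n, so the postcondition holds.
sumTo :: Int -> Int
sumTo 0 = 0
sumTo n = n + sumTo (n - 1)
```

Stepping through `sumTo 5` call by call works, but it doesn't scale; reasoning from the pre/postcondition does.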
Here's a breakdown of the difference:
The first class in the major is in Racket (Standard) or Haskell (Honors).
The second class in the major is in C, and also covers basic UNIX tools.
The third class is in x86 assembly and C, focused on understanding low-level systems.
1. Intro to CS in Python ending with a taste of Java
2. Java
3. C (intro to UNIX)
4. C++ (design first, implement after, waterfall, etc.)
Along with 3 there is data structures taught in Java [0], and with 4 there is web dev taught in Node.js.

[0]: Following this text: http://opendatastructures.org/ Taught by its author, Dr. Pat Morin.
"Functional" programming is taught in third year (COMP 3007), but you don't get too deep into it.
My issue with the listed classes isn't the chosen languages. A commenter mentioned that you can learn a lot of functional (and other) concepts in a procedural or object oriented language. My issue is that I basically learned the same concepts over and over again, just using different programming languages. You learn to iterate over a multidimensional array in Java, and then you do so again with C, but you have to deallocate and reallocate, and then you do it in C++, but you create an object for the array. It's all the same stuff.
What I'm getting at is that computation exists on several planes of abstraction, and while covering C, x86, and Haskell might give you a good overview, it will still leave things hidden, because there is also Prolog and the branch of computation that Prolog leads to. Most schools settle for teaching their students the theory and some marketable skills by sticking to well-known languages like Java and Python.
"Grokking" a programming languages is just a means to an end, and far from being the most important means (well, for interesting problems, at least). Most (though not all) of important concepts in CS have absolutely nothing to do with the choice of language.
At my university they taught Scheme (SICP) in Intro to Comp Sci, and C -- both in the first year -- and all the rest was just data structures, algorithms, computational complexity, operating systems, numerical algebra, compilation, and electives in vision/graphics/PLs/DSP/whatever. We could do the exercises in whatever language we chose (usually C). In short they taught us just theory and one practical PL (maybe Java, too, for concurrency problems) so that we could actually do the exercises, and boy did we have a lot of good programmers who came out of that school. I owe to my university the realization that the programming language is one of the least important things in CS and software engineering.
At my university in Portugal back in the mid-'90s, we had:
Pascal, C, C++, Caml Light, Smalltalk, Prolog, Java, PL/SQL, x86 and MIPS Assembly
Additionally we also got lots of CS stuff like lambda calculus, relational algebra, language semantics and so on.
It always feels strange so to me that there are universities out there focusing on single languages, or paradigms.
Keep in mind it should be stuff that will have staying power, and not the latest fad, because you tend to go to school for at least 4 years.
Aside from that, learning Haskell at university has eventually made me a better programmer. It gives you a whole other world outlook and that's useful once you finally start figuring out how to build reasonable software.
Here's how I just read this: "The pursuit of what makes me look good shouldn't be dampened by silly things like serving people other than myself".
And then, just to dampen that statement, you added "Haskell made me a better programmer", which clearly implies you got a job serving the needs of others.
Would it be fair to say you are more concerned with looking good than serving others, based solely off what you said? Correct me if I'm wrong please.
There's really not much to it: get value records right, get method records right. Sadly, none of C, x86, or Haskell does this.
It's overall badly (at least foggily) written, but I find this quote really on point:
>The moral of the story is clear: real programmers don't reason about their programs, for reasoning isn't macho. They rather get their substitute for intellectual satisfaction from not quite understanding what they are doing in their daring irresponsibility and from the subsequent excitement of chasing the bugs they should not have introduced in the first place.
And that, ladies and gentlemen, is why I think people consider C/C++ an acceptable tool for anything besides drivers and kernels, for which they're only acceptable because there's no better alternative. Sure, pointers-to-pointers are a magnificent idea! I'll totally keep that under control!
People who can use C++ need to reason about their programs because otherwise they're going to shoot their foot off. You can be irresponsible in a language like python and get away with it because you're not going to cause the issues you would in C++.
I think that, to some extent, this is true. If you don't know what you're doing I find it's easier to code yourself into a corner in C++ than something like python.
(Disclaimer: I haven't used C++ in a decade)
The nontraditional categories I've encountered are something like:
The macro facilities in Lisp, Scheme, Forth (or Factor), Template Haskell, and C++ templates.
Array languages like APL, K, or R.
Prototype inheritance with Self or Lua.
Avoiding side effects with Haskell, OCaml, or C#'s LINQ.
The "we're serious about type theory" languages like Haskell, Coq, Agda, and Idris.
Perhaps it would be better to create a separate intro course for CS majors. On the other hand, most CS curricula demand that students learn about languages like Prolog and Haskell, so it's not as if students will never be exposed to them.
The department only has 7 professors and is struggling to even offer enough intro classes for their own majors, much less for students in other majors who only need a single CS credit. The engineering department is upset because they want their students to learn C. Physics wants them to know something else, the business department wants them exposed to a tiny bit of Java and so on.
When I started in 2008, there were about 30 freshmen taking CS1. Now it's >150, last I heard. They really had no choice but to (diplomatically) tell the other departments to go pound sand. They can't compromise their own majors in favor of those from another department.
My guess is they aren't the only CS department facing this problem.
It's no surprise that this attitude can discourage students without programming experience... A parallel intro track seems like a nice solution that has worked in a number of schools.
He states:
- Most students taking CS courses are already familiar with an imperative language.
- We are all shaped by the tools we train ourselves to use, and in this respect programming languages have a devious influence: they shape our thinking habits.
This would imply that most CS students already come pre-ruined. At that point, does it matter what language is picked?
Another alternative would be to start functional languages in the data structures and algorithms courses (generally taken sophomore year, and among the first CS-specific classes taken), and use that to fix broken habits. I don't know whether the professors would want to take the time to teach a new language if students already know one, though.
"It is practically impossible to teach good programming to students that have had a prior exposure to BASIC: as potential programmers they are mentally mutilated beyond hope of regeneration."
and
"The use of COBOL cripples the mind; its teaching should, therefore, be regarded as a criminal offense."
and
"FORTRAN, 'the infantile disorder', by now nearly 20 years old, is hopelessly inadequate for whatever computer application you have in mind today: it is now too clumsy, too risky, and too expensive to use."
Basically he's a mathematician who codes, and who wants code to be math.
Unfortunately in the real world code is also politics, business, history, sociology, and psychology. The math part is important, but it's not the whole story.
He's often funny, in a dry way, but it's useful to understand that he's actually not very insightful about the psychology of effective programming.
He praises mathematical intuition, and he wants to develop it, but he doesn't seem to have considered the possibility that simply throwing Haskell at people, or asking them to work through formal methods, may not be the best way to do that.
There's almost no useful research I know of that examines what mathematical intuition is, and even less about whether it's teachable. So Haskell and formal methods may help some people some of the time. But it's not in any way a given that they're the best of all possible teaching methods.
He also misses an entire class of UI/UX bugs where the code works exactly as it's designed to, and is provably formally correct, but an application is useless because the UI is incomprehensible and/or misleading and/or actively distracting to the task at hand.
Ironically, it's exactly that interface between fuzzy human expectations and implicit modelling tendencies and the rigidity of code that causes the most issues for practical projects.
This doesn't just apply to applications - it also applies to languages, operating systems, and development environments, some of which add unnecessary cognitive loads which make programmer mistakes more likely.
His solution - formal rigour - only works if you can formulate a problem precisely in the first place.
The set of problems where it's possible to do that doesn't come close to covering all the problem spaces touched by computing.
I'm not sure about Haskell as a beginner's language, but I consider Scheme to be almost ideal for that purpose since it's so easily digestible.
My intro class used Dr. Scheme, and I think it worked very well for introducing new concepts. The following classes were mostly C++ with some C and assembly mixed in. When they finally introduced Java, it took students a couple of days to learn it, since they all had decent C++ backgrounds from previous classes. I don't think that would work in reverse; a Java background is not going to let you pick up C++ in a couple of days. That is what the department did not understand: not knowing Java isn't that much of a detriment to getting an entry-level Java job. With a solid foundation, a junior programmer can pick up Java quite fast.
Seeing QuickSort in just one line was what really drove it home.
I remember thinking at the time that this was really incredible, but that Haskell had no future. This was at the time of Hugs 98, although I remember hearing about GHC. I'm glad to be wrong about this.
Edit: this is what I had in mind: http://augustss.blogspot.com/2007/08/quicksort-in-haskell-qu...
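For anyone who hasn't seen it, this is presumably the snippet in question: the classic short Haskell "quicksort" (which, as the linked post discusses, is not a true in-place quicksort, just the same partitioning idea written on lists):

```haskell
-- Partition around the pivot, sort each side, glue back together.
qsort :: Ord a => [a] -> [a]
qsort []     = []
qsort (p:xs) = qsort [x | x <- xs, x < p] ++ [p] ++ qsort [x | x <- xs, x >= p]
```

It allocates new lists at every step, so it's a pedagogical illustration rather than Hoare's algorithm, but that's exactly why it fits on one or two lines.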
Btw, I took AP Compsci with C++ in Austin around that time, so we might know each other.
Most of the bloat comes from the fact that when you implement it in Haskell, the resulting program doesn't really depend on being executed in the IO monad, and can be easily modified to run in many other monads too (like ST, or some transformed monad). The result is that the Haskell imperative version is far more general than a similar C implementation.
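A small illustration of that point (a minimal sketch, not the in-place quicksort itself): imperative-looking Haskell can run in ST instead of IO, and runST seals the mutation off so the result is an ordinary pure function.

```haskell
import Control.Monad.ST
import Data.STRef

-- An imperative-style sum: allocate a mutable cell, loop, read it back.
-- Because it runs in ST rather than IO, runST can discharge the effect
-- and sumST is a pure function despite the mutation inside.
sumST :: [Int] -> Int
sumST xs = runST $ do
  ref <- newSTRef 0
  mapM_ (\x -> modifySTRef' ref (+ x)) xs
  readSTRef ref
```

The same monadic code, written against a more general interface, could also run in IO or a transformed monad; a C implementation is pinned to one execution context.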
There are plenty of other classes at the grad and undergrad level that teach you about lower level concepts such as pointers and memory (system, C, OS, ...). The Java classes teach different concepts.
I think teaching the motivation for learning it before beginning would have been better than just throwing it at freshmen, but I can't truly blame anyone other than myself.
It is not only that functional languages are more succinct, easier to reason about, and more readable; they also often come with significantly better type systems. Some have type systems sophisticated enough to specify nearly all the legal states and to guarantee, at compile time, that those are the only states the program can be in. Look at languages with dependent types for that.
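A minimal sketch of that "illegal states are unrepresentable" idea in Haskell (not full dependent types, just GADTs with a promoted type-level natural): taking the head of an empty vector is rejected by the compiler rather than failing at run time.

```haskell
{-# LANGUAGE DataKinds, GADTs, KindSignatures #-}

-- Type-level naturals for tracking length.
data Nat = Z | S Nat

-- A list whose length is part of its type.
data Vec (n :: Nat) a where
  VNil  :: Vec 'Z a
  VCons :: a -> Vec n a -> Vec ('S n) a

-- Accepts only provably non-empty vectors; `vhead VNil` does not compile.
vhead :: Vec ('S n) a -> a
vhead (VCons x _) = x
```

Languages like Idris or Agda push this much further, but even this fragment shows the flavor: a whole class of run-time errors becomes a type error.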
On the more practical front, Erlang proved itself over three decades to be one of the best choices when it comes to fault-tolerant, highly concurrent systems. Jane Street is using OCaml for all of their trading software, Morgan Stanley has moved to Scala, and so on and so forth.
Purely functional programming still remains a purely academic exercise because it fetishizes type systems to the detriment of all other concerns in software engineering. Although I do enjoy some of the things that come out of that kind of work, e.g. parser combinators.
This is false.
Also, as far as I can see, only you mentioned pure functional programming. Sophisticated type systems do not mandate purity; see Scala for example.
And I realise that is an unfair target, but there is very little user-facing FP software. The only example I can think of that I have used is xmonad, which is both hard to use and fairly buggy.
Or maybe Genera.
Or maybe Remote Agent software used by Nasa Deep Space 1?
Or eventually the train control systems running on software from Siscog?
(reading http://clean.cs.ru.nl and looking at the archives of the mailing list may give the impression the project died at the end of 2011, but http://clean.cs.ru.nl/Download_Clean has binaries from November 2014)
Now I write mostly in Python and Go, and I'm learning a bit of Haskell.
"A fundamental reason for the preference is that functional programs are much more readily appreciated as mathematical objects than imperative ones, so that you can teach what rigorous reasoning about programs amounts to. The additional advantage of functional programming with “lazy evaluation” is that it provides an environment that discourages operational reasoning."
And of course we got C, Java, Matlab, and VHDL, besides a bunch of assembly. VHDL or Verilog would maybe also be a nice eye-opener for CS students. It's again another mindset.
Well yes, it has some value. However, just because a lot of people like/believe something doesn't make it good/true.
It's not like corporations have some vast conspiracy in place to encourage the use of Java; it's simply that Java, for a number of reasons, is attractive to middle-management types, even though it's not the best language to make good software. That's not necessarily a criticism of Java; from many perspectives, making good software is not the primary goal.
Dijkstra's big thing was prioritizing good software over cheap software. This may not be a realistic goal, but it certainly explains why he preferred Haskell to Java.
He wrote some really good essays on why you need languages like Haskell to raise the overall quality of software floating about. The more you can shift the burden of guaranteeing correctness away from humans and towards infallible mechanical systems (like Haskell's relatively powerful type system), the more likely you are to end up with good/correct software.
Like cigarettes?
Sounds an awful lot like the bandwagon fallacy.