(Skip to 0:30 in the first video, it's full of this kind of "context".)
How do you transform someone's voice like that?
Kyle wrote the You Don't Know JS series. This book is just as good.
As I understand it, the author is using "Hindley-Milner" as if it were the name of a notation, like Backus-Naur form. But that's a usage I've never heard, and I wonder whether credit for the Haskell type-signature style belongs just as much to Rod Burstall or David Turner; in other words, I'm inclined to believe it's a convention that evolved over the years.
It helps bridge the gap between looking at all of this from the more math/Haskell perspective and how it's implemented in javascript, without sacrificing definitions (as much as possible).
Also is his explanation of monads as 'functors that can flatten' a simplification for the purposes of teaching, or is that more or less what they are?
This is not a criticism of any kind; this is a point about definitions. There are definitions of functional where Erlang is functional, and IIRC Elixir can be said to support it.
(And there are definitions of "functional" where almost every language in current use is "functional". There's even some so weak that C is "functional" because it has "function pointers", though this one is now out-of-date and not currently being used by anyone. But, yes, there was once a time in which C would have been considered "unusually strong" in its "functional programming" support, because other contemporary languages didn't even have function pointers.)
> Also is his explanation of monads as 'functors that can flatten' a simplification for the purposes of teaching, or is that more or less what they are?
A little of both. Technically it is correct, but the "flattening" in question applies to many things that most programmers wouldn't consider "flattening". For instance, consider monadic IO as Haskell uses. There is a way in which you can consider the execution of an IO value as "flattening" it, and it corresponds to the mathematical term, but it's not what most people have in mind. There's more to "flattening" than "simplifying data structures in some manner"; it doesn't even always involve what we'd traditionally think of as data structures at all, such as, again, IO.
Personally I think it is an actively unhelpful metaphor for these reasons, as it is very prone to leading to false understanding, but YMMV.
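For what it's worth, the one case where "flatten" really is flattening is a container like an array; a few lines of JS make the mechanics concrete (using the standard `flat`/`flatMap` methods):

```javascript
// Array is a functor: map applies a function inside the structure.
// If that function itself returns an array, you end up with nesting.
const nested = [1, 2, 3].map(n => [n, n * 10]);
// nested: [[1, 10], [2, 20], [3, 30]]

// The monadic extra is "join", which here literally flattens one level.
const joined = nested.flat();
// joined: [1, 10, 2, 20, 3, 30]

// flatMap fuses map + flatten, which is "bind" for arrays.
const bound = [1, 2, 3].flatMap(n => [n, n * 10]);
```

For IO there is no data structure being collapsed at all; the "join" is conceptually running the outer action to get at the inner one, which is exactly where the metaphor starts to strain.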
It would be very helpful to see an explanation of this spectrum you describe for someone who is not really familiar with the definitions. I would love to read an explanation of the various “functional” paradigms as they diverge from “conventional” (ie. C) programming languages.
This was also one of the reasons that assembly programmers were always banging on about the power of assembly back in the day. Nowadays the only remnant of that argument is the claim that you can write more optimal assembly than the compiler. But back in the day, assembly programmers enjoyed the ability to have a pointer to a function and jump to it and/or call it (it's a bit fuzzier in assembler) and people using high-level languages were definitely getting a "weaker" experience. Today we expect our high level languages to also provide us significant power advantages over assembler. (Of course you can do anything in assembler, but not as concisely necessarily.)
When I got into computing, the definition of "functional" that excluded C included having "closures". This is a function pointer + an environment for the function to run in. C only has the function pointer; you don't get an environment. It is more convenient than nothing, but vastly less useful than a full closure. (You can do them manually, but they become problematic fast.)
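A quick JS sketch of the difference (the names here are mine, purely for illustration):

```javascript
// A closure is a function plus the environment it was created in.
// The returned function keeps access to `count` even after
// makeCounter has returned.
function makeCounter() {
  let count = 0;
  return function () {
    count += 1;
    return count;
  };
}

const next = makeCounter();
next(); // 1
next(); // 2

// With only a C-style function pointer, there is no captured `count`;
// you'd have to define a context struct and thread it through every
// call by hand, which is the "you can do them manually" workaround.
```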
Stepping up from there, you got languages that generally permitted an imperative style of programming, but "preferred" what we would today call a functional style, when you use map, filter, and such to perform operations. These languages loved them some linked lists; linked lists everywhere. With their own special names like "cons lists". They also tended to be garbage collected, which for a while was a "functional programming" thing, but is now also simply an accepted thing that a language may be, regardless of how "functional" it is.
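In modern JS terms, the stylistic shift looks something like this (a small illustrative example):

```javascript
const numbers = [1, 2, 3, 4];

// Imperative style: build the result by mutating an accumulator.
const doubledEvensLoop = [];
for (const n of numbers) {
  if (n % 2 === 0) doubledEvensLoop.push(n * 2);
}

// "Preferred" functional style: describe the result with filter/map.
const doubledEvens = numbers
  .filter(n => n % 2 === 0)
  .map(n => n * 2);

// Both are [4, 8]; the second version never mutates anything.
```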
This definition is still in some usage today, though some improvement in understanding the structure of the relevant code ("iteration" as a concept you can abstract, rather than accidentally conflating "a linked list" with "the thing you can iterate on") and the fact that hardware is getting ever-more grumpy about non-locality have erased the linked list obsession. You can either have a "functional language" like Lisp, or you can program in a "functional style" in a multi-paradigm language like Javascript. In the latter case, you can do a lot of work with the functional paradigm, but technically you always end back up at structured programming with some flavor of OO, which is the dominant language paradigm. (Languages can be multi-paradigm, but there is always a dominant paradigm, the one that wins when the paradigms conflict. And my personal default definition of OO includes Javascript, considering the prototype system a detail relative to the fact that you still have "collections of data with associated methods".) People who say "Javascript is a functional language" mean this definition.
Finally, there's the Haskell definition. Here, immutability is the default, and possibly the only option. Type systems are very strong, able to express types like "a block of code that does not do any IO" that other languages cannot express, or can express only very laboriously. You get "functor" and "monad" and such being not just demonstrated on a one-off basis, but serving as the foundational abstractions of libraries and entire applications. People argue over how much category theory you have to know to practically use these languages. F#, OCaml, and Haskell live here. Haskell is as far as you can currently go in this direction and still get a language usable for practical tasks, work that you can build a business on.
(As an interesting side bar, I think Erlang made an error here, although a perfectly understandable one. When it was written, one of the reasons immutability was favored at the academic level was that it helped write multi-core code. At the time, only big industry and academia had multi-core systems. But you only really need isolation between threads. Immutability is one way to achieve this, but you can also make it so that it is impossible to communicate "references" between processes/threads, so everything is a copy. Within an Erlang process there's no reason not to allow one to "assign" to existing variables. But at the time, "access control" and "immutable" were sort of conflated together. Rust is the first big language that seems to be levering those concepts apart in a really systematic way.)
However, the spectrum keeps going from here. Past Haskell there are functional languages that get really mathematical, and are focused on proving code, creating even more elaborate type systems such as dependent types ("this function takes a string whose length is a prime number", to give a silly example), and constraining the abstractions even further for things like total functional programming, which is one of the most interesting possible ways to limit programming so that it is not Turing Complete, but can still do real work. Here you can get such exotica as ways of using the type system to separate out what code is total, and what is not, in various principled ways. One of the common "I've been in this industry for 20 years and it sucks and here's what we all need to do to fix it" posts is to extol the virtues of one or more of these things. However, while there has been some interesting progress on many of these fronts in the past couple of decades, they remain impractical.
> ...
>> ...monads...
There are also a couple of definitions of "monad" going around -- in array languages (J, APL, Q) a "monad" is something with arity 1 (like unary negate), to be contrasted with "dyads" which take two parameters (infix operators etc.)
That said, they do occasionally form useful abstractions.
Sure.
> Lisp and Elixir are dynamically-typed, which is why you don't see monads in those.
More to the point, Lisp and Elixir are impure functional languages, which is why you don't have some pure construct, like monads, that isolates IO.
In Haskell IO is isolated, and then the Monad interface is used for some particularly important ways of interacting with values of that type. That's not quite the same thing as "monads isolate IO" - the Monad interface is useful for quite a few other types of values.
This is the canonical source: https://wiki.haskell.org/Typeclassopedia
In my understanding however, it's valuable to note that the chain-ability of the bind operation also sets up a continuously nested set of closures, which is where the real power comes into play to give you a useful approximation to imperative programming. (This can easily be abused, of course, to circumvent thinking and structuring code functionally.)
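To make that concrete, here's a toy Maybe in JS (my own minimal sketch, not an implementation from the book); note how each chained callback closes over everything bound before it:

```javascript
// Just(x) carries a value; Nothing short-circuits every later step.
const Just = x => ({ chain: f => f(x), value: x, isNothing: false });
const Nothing = { chain: _f => Nothing, isNothing: true };

const safeDiv = (n, d) => (d === 0 ? Nothing : Just(n / d));

// Each callback closes over x (and then y), so the nested closures
// read almost like a sequence of imperative assignments.
const result = safeDiv(100, 5).chain(x =>
  safeDiv(x, 2).chain(y =>
    Just(x + y)));        // Just(30): x = 20, y = 10

// Division by zero short-circuits; the later closures never run.
const failed = safeDiv(100, 0).chain(x => Just(x + 1));
```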
Related to this, SPJ (Simon Peyton Jones) stressed in a talk some years back how monads conveniently encapsulate the unavoidable messiness of side effects in the least painful way yet discovered.
There are lots of ways to conceptualize monads. I think "functors that flatten" is accurate, but is just one way.
I'm not expecting the whole engineering field to start exploring the question of what pure mathematics (and not some watered-down, for-engineers version) can do to fundamentally transform the way programmers think and talk about what it is they do. But there are almost zero people from the geometric side of pure mathematics (though, again, there are plenty of logicians) working together with everyday programmers, and that's what I wish I saw more of every time I see one of these explanations of functional programming, which seem almost always to be pedagogically colored by logicians' hands.
- https://egghead.io/instructors/brian-lonsdorf
- https://www.youtube.com/watch?v=h_tkIpwbsxY
I'm pretty sure he's the only person to make this topic so approachable. I'm glad he's into having fun with it. Quite refreshing!
FP Resources: https://github.com/functionalflow/brains/projects/9
That seems extremely dishonest. The reason we name variables is so that they can hold different values. There is no guarantee that every run of the script will have the same initial values. If there were, you might as well just type in the result.
Context: self-taught programmer in the data science/statistical modeling world.
I lean towards organizing my applications into very dumb objects which are supported by FP-style business logic. At a glance you can infer what's going on quite easily due to the idiomatic use of the objects, but the object orientation mostly ends there. My business logic is organized into isolated modules that are as pure as I can manage without being a nut about it. The objects recruit or are operated on by that logic, so their implementation is very light and clean as a result. Like I mentioned, tests for this kind of code are really nice. They tend to be concise.
It's not perfect, but I feel like it's a way FP has greatly improved my code and what I deliver to my team in general. It's an attempt to merge the benefits of two paradigms, I suppose.
A pure function is more of a declaration of truth than a list of steps. When your whole application is made up of declarations of truth, it is simpler to reason about and rearrange.
It's simply the fact that the function depends on nothing but its inputs (but includes all the surprising ramifications of that).
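A tiny JS illustration (the example names are mine):

```javascript
// Pure: the result depends only on the arguments. Same input,
// same output, no matter when or how often you call it.
const total = items => items.reduce((sum, item) => sum + item.price, 0);

// Impure: the result also depends on outside state, so the same call
// can mean different things at different times.
let taxRate = 0.1;
const totalWithTax = items => total(items) * (1 + taxRate);
```

The first is a "declaration of truth" about its inputs; the second is only meaningful relative to whatever `taxRate` happens to be right now.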
On one level, there is no such thing as FP. All there is, is assembly-language (or binary) instructions being executed in a CPU that has access to some memory. (Almost) every instruction creates some kind of side effect (including changing the flags).
But nobody wants to program at that level, so we build abstractions on top of it. All higher-level languages create an abstraction. Even C creates an abstract machine, even though it's very close to the hardware. If the abstraction doesn't leak, you can just think about the abstraction, and ignore what's going on at the level(s) below it.
FP creates an abstraction that's at a higher level than many other abstractions. Within that abstraction, (almost) all you have are functions and values. Memory and registers are below that level. The changes to the call stack when you call a function are below that level. Those things are therefore not seen as side effects, because they are below the level of abstraction you're working at.
But disk is not. Therefore writing to disk is seen as a side effect, and those other things are not.
Not sure what exactly you mean by writing a register; if you directly mutate a location, it is considered a side effect. Otherwise, any register rewrites that happen are just implementation details, and on an architecture other than von Neumann the same program could be implemented without writing to a register at all.
Calling these side effects is like calling Python a low-level language because programs you write still use registers and raw memory access.
Can anyone explain?
JavaScript is a language in which one can apply the FP paradigm, with some (considerable) effort. This book explains both the underlying FP paradigm and how to apply it in JS.
Other languages (e.g. Haskell, Scala, F#) are designed for the FP paradigm and make it much easier to apply. But the paradigm itself is the same for all of them, as it arises from fundamental mathematical laws.
disclaimer: I haven't read the book and am not sure if this is mentioned anywhere.
What helped me when thinking about functional design in JavaScript was realizing that all JS functions effectively receive a single array-like `arguments` object holding whatever the caller passed:
    function add(x, y) {
      return x + y;
    }

is effectively syntactic sugar for

    function add() {
      const x = arguments[0];
      const y = arguments[1];
      return x + y;
    }
`add(1,2,3,4)` ignores 3 and 4 instead of being an error. It may seem obvious that `add1(1)(2)` and `add2(1,2)` would have different definitions, but thinking about it in types helped me work out how they are actually written:

add1 :: [Number] -> [Number] -> Number
add2 :: [Number, Number] -> Number

Edit: Also fine to not have my opinion on this, but a lot of people shared my opinion; check the Egghead comment section. I would've loved to watch that video with a more professional voice and walkthrough.
I've watched this gradual change in the users of my own forum I started over 10 years ago. People with no skin in the game think so highly of their opinion that they use it to discredit and dismiss something as terrible garbage.
Seems related to the rise of self-entitlement culture: In this case, a free video series is terrible because you didn't like the voice.
Internet discussion has become very toxic (well, perhaps it always has been). The other day I was on IRC and asked a question about Eclipse which was doing some quirky things that I didn't know how to disable and one of the people in the chat (not a huge open freenode chat but the chat of a smaller, private community) responded "Java is horrible" or something along those lines. I hadn't even mentioned that I was writing Java! What's funny is that I can't imagine anyone responding in this way were it an in-person conversation. You get much more "I don't care much for Java" or even "What are you using Eclipse for?"
I guess because if you're a nasty person AFK people will avoid you and you do face the risk of some social ostracization. However, with the internet, it's harder to prune these people from your social circle.
The internet has made it so easy to share negativity or whatever opinion you're able to belt out from the hip that people mistake that for the need to do it. "What, I can't have an opinion, now?" It's almost comical how toxic Youtube and Facebook comments are for this reason.
But I also think there's some element of human nature where you see someone having fun and feel the need to snipe them with something negative. I remember a lot of that from my childhood.
I'll never forget in Boy Scouts when we were climbing down some boulders and one of my mates decided to jump down each boulder instead of climbing down. He was soon sniped with "Quit trying to be cool, Jacob" which put him in line, climbing down like the rest of us.
Not to double down so hard on a fellow HNer, but it reminds me of that:
- Person A: "Here's a link to their free video course."
- Person B: "Those videos are terrible, Jacob. Edit: What, I have opinions, too."
This sort of persona is not something to aspire to, and the world really doesn't need more of it, but in a culture of mimicry, that's what gets honored as people try to differentiate themselves from the bland masses. Cultural erosion.
Imagine if your approach, instead of knee-jerk negativity and dismissal, was to enumerate those videos you thought were better if your intention was actually to communicate that you've seen better ones. That would've been a great contribution.
There's no point in attacking the course based on this aspect of its presentation. If it were super polished but made a bunch of mistakes, that's when you need to shout loudly to anyone you care about "don't watch that course! It's full of mistakes!"
I learned a lot from the course content and the classroom stop-motion animations were refreshing; it kept me watching with a smile on my face.
It's nice to have some personality and silliness in an online classroom environment; very little differentiates the various JS teaching sites other than their media players and subject matter coverage right now.
Daniel Shiffman's unique style also breaks the “person tonelessly narrating to occasional cursor movements over a screencap” standard that lots of online courses have settled into, and is similarly engaging as a result: https://www.youtube.com/channel/UCvjgXvBlbQiydffZU7m1_aw
Just for entertainment value, specifically https://www.youtube.com/watch?v=BfS2H1y6tzQ
But this is the great thing about the internet - there's so much information on how to make more of it, you'll definitely be able to find videos which are more to your taste.
I recently watched Andrew Van Slaars' series on the Maybe type: https://egghead.io/courses/safer-javascript-with-the-maybe-t... - that was much more traditional and slower paced, and still very good. Maybe give that a shot if you found the Frisby stuff too frenetic.
A fast pace places the viewer in control. Too slow a pace is like keeping them prisoner, or feeding them via drip line.
May I suggest a perspective modification? I believe you can start your journey to recovery with a healthy dose of this: https://www.youtube.com/watch?v=hUes6y2b--0 - more of the same, twice a day, until you complete the course, or until the pressure in your loins subsides.
Seems misleading at best, as it mimics only some parts of functional programming. For example, for-loops are not used, but neither recursion nor tail calls are mentioned.
> [T]yped functional languages will, without a doubt, be the best place to code in the style presented by this book. JavaScript will be our means of learning a paradigm, where you apply it is up to you.
Surprising how they teach the typed functional programming paradigm in a language which does not support you in it. Going from JavaScript to Haskell, wouldn't PureScript be a better stepping stone than this? Consider tail call elimination or all the support that type checking gives you to get the type nestings right, especially when you are a beginner and may have issues even with String being [Char] (unlike JavaScript) let alone Monads etc.
(EDIT: In case you didn't check the contents of the book: Yes, this is a book that teaches Monads, type classes, pointfree style, Hindley-Milner(!) etc., not a form of FP that would be natural in JS.)
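To illustrate the tail-call point (a quick sketch; the behavior depends on the engine, since most JS engines never shipped the tail-call elimination that ES2015 specified):

```javascript
// A tail-recursive sum. With tail-call elimination this runs in
// constant stack space; without it, each call adds a stack frame.
const sumTo = (n, acc = 0) => (n === 0 ? acc : sumTo(n - 1, acc + n));

sumTo(10); // 55: fine at small depths

// At large depths, engines without tail-call elimination (e.g. V8)
// throw a RangeError ("Maximum call stack size exceeded") instead.
let blewTheStack = false;
try {
  sumTo(1e7);
} catch (e) {
  blewTheStack = e instanceof RangeError;
}
```

PureScript sidesteps this by compiling self-recursive tail calls into loops, which is part of why it might be a gentler stepping stone.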
Most of the book isn't really a beginner's guide. A more accurate title might be "JavaScript for Haskell Programmers".
And to the point... how would one go about taking "imperative Haskell" code and making it more functional? You can't. Either you write Haskell functionally or whatever you wrote will refuse to compile. I love this book and the egghead.io videos that followed; that content made me curious about Haskell and Tacit Programming.