My perspective on Julia is that it has 3 ingredients:
1. A principled design that derives from the experiences of past programming languages, and particularly the creators' experiences with Lisps. This is where a lot of the "magic" comes from: multiple dispatch, the type system, metaprogramming, etc. The article covers this aspect.
2. A need to be accessible to those transitioning from other languages, like MATLAB and Python. MATLAB, for example, has guided function naming (although NumPy also has similar names, for similar reasons). The author mentions the lack of distinction between creating a variable and changing its binding: I'd suggest this is an example of something affected by this design point.
3. A need to be fast. The author brings up the Int vs. BigInt distinction. Python, for example, allows ints to grow as big as you want, but at a cost: adding two ints is not simply an add instruction; a lot more work is involved. Julia, falling on the side of performance, elects to distinguish between arbitrary-precision BigInts and machine Ints.
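To make the difference concrete, here's a small Python sketch (Python chosen because its built-in int is arbitrary-precision, i.e. the BigInt side of the trade-off); the `machine_add` helper is purely illustrative, emulating what a 64-bit two's-complement add instruction does:

```python
def machine_add(a: int, b: int, bits: int = 64) -> int:
    """Add two integers with two's-complement wraparound, like a CPU add."""
    mask = (1 << bits) - 1
    result = (a + b) & mask
    # Reinterpret the bit pattern as a signed value.
    if result >= 1 << (bits - 1):
        result -= 1 << bits
    return result

INT64_MAX = 2**63 - 1

# Python's built-in int just keeps growing (BigInt behavior, extra bookkeeping):
print(INT64_MAX + 1)              # 9223372036854775808
# A 64-bit machine Int overflows and wraps (Julia's default Int behavior):
print(machine_add(INT64_MAX, 1))  # -9223372036854775808
```

Julia's default Int gives you the raw machine add, and you opt into BigInt only when you need it.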
Parallelization in Julia is still a little scary, though; I'm waiting a bit before delving into that.
NB. I'm very much in favour of a principled (qua philosophical) approach to language comparison (etc.), but it's rarely done well.
Why did these languages not take off (at least pre-Julia)? I have heard other people "debate" (and I use the word here to mean disagreement on principle, not on the details of said debate) that Ruby and other languages are Lisp-like but fall short. Dylan seems to have been Lisp (proper) without Lisp syntax, on purpose (the s-expression syntax was intentionally dropped partway through its design). So why do languages with such powerful expressiveness (for your value of the word; I do not want to start that discussion either) never take off, Dylan or otherwise? It seems that is what all programmers, at least the ones more advanced than me, clamor for.
There's a renaissance in native-compiled languages now, mainly thanks to LLVM and the JVM. Having a fast optimizing compiler back end that generates binaries on many platforms is a huge head start, and goes a long way to making the language immediately useful. The JVM gives you those and then some.
No, technology just goes in circles.
Like 30 years ago, when people started to realize that P-Code and other VM approaches were too slow and resource-hungry to be useful when targeting minicomputers.
Now mobiles and high electricity costs are making developers reach the same conclusions again.
A lot of the comparisons in this article seem like that to me. Julia and Common Lisp are apparently just close enough to make a point-by-point comparison like this plausible, but they are not aligned closely enough to make it work. It's still a good article with a lot of solid meat in it, but I think the topic would have been better served by going up the abstraction ladder a bit and talking about how the different paradigms of each language motivated the differences between them.
Disclaimer: I'm only somewhat familiar with Julia and not at all with Common Lisp.
In the message-passing approach you dispatch based on the type of the first argument to the method. Because it would be redundant to write that argument down explicitly, it is commonly elided syntactically (though not fully in Python, where `self` is explicit), and you get coupling between the methods and the object. This is not a fundamental property of OO but an accidental feature found in most OO languages.
In the generic-function approach, dispatch is extended so one can dispatch on the types (among other things) of (ideally) all of the arguments. Julia follows this approach AFAIK, and if one is familiar with the generic-function approach it is not a controversial claim at all.
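A toy sketch of the generic-function idea in Python (the `Generic` class here is purely illustrative, not a real library; Julia's actual dispatch also handles subtyping, ambiguity resolution, and specialization, which this ignores):

```python
class Generic:
    """A toy generic function: dispatches on the types of ALL arguments,
    CLOS/Julia style, instead of only the first (message-passing style)."""

    def __init__(self):
        self.methods = {}

    def register(self, *types):
        def deco(fn):
            self.methods[types] = fn
            return fn
        return deco

    def __call__(self, *args):
        key = tuple(type(a) for a in args)
        fn = self.methods.get(key)
        if fn is None:
            raise TypeError(f"no method for {key}")
        return fn(*args)

collide = Generic()

@collide.register(int, int)
def _(a, b):
    return "int/int"

@collide.register(int, str)
def _(a, b):
    return "int/str"

print(collide(1, 2))    # int/int
print(collide(1, "x"))  # int/str
```

The methods live on the function, not on any one class, which is exactly the decoupling described above.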
Erik Naggum explains it here in more depth: http://www.xach.com/naggum/articles/3243735416407529@naggum....
There also isn't OO-style inheritance and polymorphism. Julia's multiple dispatch reminds me most of Clojure's protocols, which are a great feature: they achieve polymorphic behavior of functions without having to do ugly things to the objects. That Julia can make this fast and ubiquitous in the language is pretty awesome.
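Python's standard library actually ships the single-dispatch version of this idea in `functools.singledispatch`, which already shows the "extend the function, not the class" flavor (the example names are mine):

```python
from functools import singledispatch

@singledispatch
def describe(x):
    # Fallback for types with no registered method.
    return "something"

# Behavior for existing types is added externally -- no class is modified.
@describe.register(int)
def _(x):
    return f"an int: {x}"

@describe.register(list)
def _(x):
    return f"a list of {len(x)}"

print(describe(3))       # an int: 3
print(describe([1, 2]))  # a list of 2
print(describe(3.5))     # something
```

Protocols and multiple dispatch generalize this: you can retrofit behavior onto types you don't own, without monkey-patching them.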
I'm also new to Julia, but I really like the above design decisions, among other features.
I suspect the author's main problem with Clojure is that it is a mostly functional language which heavily emphasizes doing things the functional way and discourages imperative programming, whereas CL is more of a true multiparadigm language.
I used to think that multiparadigm is best, but after migrating from Scheme (which is mostly functional but has a lot of mutation and a sad lack of interesting data structures apart from lists) to Clojure (which has good support for dicts and persistent data structures), I think I prefer a community that is more focused on one approach.
That's not a very nice thing to do, suspecting people without any kind of evidence. Not to mention the fact that there is a `set!` form in Clojure, which makes it entirely possible to write very imperative code (and thread-local semantics don't matter in single-threaded programs).
Anyway, "problems with Clojure" can be very different for different people. I like Clojure's design as a language - even its interop with the OO host's features is very neat - but when I want to hack up some simple script in a REPL I not only need to write this:
$ rlwrap java -cp "clojure-1.5.1.jar" clojure.main
but then I need to wait for freaking 6 seconds for the prompt to appear. 6 seconds. I don't know what more I could write here, so I'll just paste this (Chicken Scheme):

$ time csi -e '(exit)'
csi -e '(exit)'  0,01s user 0,00s system 81% cpu 0,007 total
So that's my problem with Clojure, nothing to do with the "functional way", right?

I couldn't find any specifics on how it's done in the linked PDF (modules are described at the end, in section 11), but I think both Clojure and Racket do this already. The `require` mini-language in Racket is very rich and allows for prefixing, renaming, selective import of identifiers, and so on: http://docs.racket-lang.org/reference/require.html#%28form._...
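For comparison, here's roughly what those three features look like in Python's import system (a loose analogy only; Racket's `require` mini-language is far more general and composable):

```python
# Selective import of identifiers:
from math import sqrt, pi

# Renaming on import:
from math import sqrt as square_root

# Prefixing: everything is accessed through a chosen name.
import math as m

print(square_root(pi) == sqrt(pi))  # True
print(m.pi == pi)                   # True
```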
It is an amazing language for its intended use, which is algorithmic code. From a language perspective, I would choose it over MATLAB, R, NumPy, etc. any day. It is approachable both for grizzled library writers (types, optimizations, introspection at many levels, macros, etc.) and for more "casual" untrained scientist types (who just want to punch in their algorithm and call it a day). But if you want systems programming, there are plenty of other languages which fill that niche better.