This should also help with optimization: a larger amount of code can be optimized together, whereas (C)Python can only optimize up to the Python/C boundary.
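For what the "Python/C boundary" claim means concretely, here is a minimal pure-Python sketch (the function name `transform` is mine, purely for illustration):

```python
import math

# In CPython, math.sin and math.cos are implemented in C. The
# interpreter dispatches each call across the Python/C boundary one
# at a time; it cannot inline the C code into the loop or fuse the
# two calls below, so every iteration pays two boundary crossings
# plus boxing/unboxing of each float.
def transform(xs):
    return [math.sin(x) + math.cos(x) for x in xs]

# The claim made for Julia above is that a whole-program JIT could
# compile this loop and the trig calls as one unit, eliminating the
# per-call dispatch and the boxing.
result = transform([0.0, math.pi / 2])
print(result)
```

Whether that cross-boundary optimization is worth the complexity described below is exactly what the rest of this comment disputes.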
Again redefining these things here... if the language has tools for all of: inline raw chip-specific instructions, compiler-optimized versions of those instructions, virtualized and then optimized instructions, a JIT compiler, and also high-level interpreter operations, then how can it be in any way an encompassing system? And how can it stay coherent across all those levels without requiring someone to know all of them? At which point, I'm going to say no thanks and stick to tools at their respective levels of abstraction, where I can reason about them as coherent, rather than deal with a dynamically allocated string that sometimes only works on amd64 because someone wanted an assembler-style way of pattern matching for some reason.
Maybe within the scope of scientific computing this seems logical, if you restrict yourself to matrix multiplication or whatever, but I don't see how that makes a "language" that is coherent, composable, etc...
I've seen a ton of examples over the years of "stunt-driven" technological advances that people thought would be interesting but turned out to be the wrong solution for the wrong problem, but none whose evangelism claims to break the laws of physics and reason quite like Julia's. Even this article starts with "you must understand the quirks", and I hope that isn't a goal of their design, because they are also claiming that I shouldn't have to know the quirks, so which is it?