I'd say that even once Go has generics, that will simply take it from being an awful scientific programming language to a mediocre one. I don't really understand why there are a few people who seem to think it's a good idea to try to do their scientific computing in Go when so many better options already exist.
Post-generics, though, if you really insist, Go can probably be upgraded from "mediocre" to "tolerable" with a lot of library work. (Part of the "mediocre" is lacking libraries. That aspect can be fixed.) But it'll be hard to start that work without being able to run generic code. Even if you assume the current documentation is the final spec, you still can't guess the performance implications of anything you'd be blindly writing, and performance is very important for this sort of code.
I didn't mean "now" as in right this second. Generics will indeed be key.
What would be the disadvantage of using a syntactic preprocessor to accomplish the same thing? We have an API for parsing golang. What if there were a way of marking certain files to be parsed and rewritten with function calls? It could be done with a file suffix, and this could be made fairly convenient.
I don't think this solves the issues. For example, I have yet to think of a way to overload `+` in a consistent and performant manner. E.g. think of `a += b` vs. `a = a + b`. Either you make them separate operators (in which case there's no guarantee they do the same thing - see for example Python, where they commonly differ) or you are overallocating in the common case.
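To make the allocation point concrete, here's a sketch with a hypothetical `Vec` type (the names are illustrative, though real Go numeric libraries take the same shape, with explicit destination-style methods instead of operators):

```go
package main

import "fmt"

// Vec is a hypothetical dense vector type.
type Vec []float64

// Add is the moral equivalent of `a = a + b`: it allocates a
// fresh result on every call, even inside a hot loop.
func (a Vec) Add(b Vec) Vec {
	out := make(Vec, len(a)) // new allocation each time
	for i := range a {
		out[i] = a[i] + b[i]
	}
	return out
}

// AddInPlace is the moral equivalent of `a += b`: it mutates
// the receiver and allocates nothing.
func (a Vec) AddInPlace(b Vec) {
	for i := range a {
		a[i] += b[i]
	}
}

func main() {
	a := Vec{1, 2, 3}
	b := Vec{10, 20, 30}

	c := a.Add(b)   // a untouched, c freshly allocated
	a.AddInPlace(b) // a overwritten, zero allocations

	fmt.Println(c) // [11 22 33]
	fmt.Println(a) // [11 22 33]
}
```

If `+` and `+=` desugar to two independent methods like these, nothing forces them to agree; if `a += b` instead desugars to `a = a + b`, every accumulation in an inner loop pays for a fresh allocation. That's the dilemma.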