If you really can predict the future, please tell me the lotto numbers rather than being abrasive.
All of the original creators were clearly of the "best tool for the job, even if you have to build it from scratch" persuasion. How many different languages and programming constructs can you find on Unix or Plan 9? The notion of centralizing all development around a single all-purpose language, a la C++ or Java, is anathema to their stated engineering principles and the opposite of the design elements of the many computing environments they spent their careers designing and using.
One can quite easily disagree with their design choices and tradeoffs. I find static compilation short-sighted. But I do appreciate how it fits into their overall approach; and in that way I find static compilation less short-sighted in Go than in Rust. Some aspects of goroutines are problematic. Hassles with dealing with timeouts exposed flaws in their scheduling and message passing abstractions. But those flaws are in the details, not the fundamental model; and the better way to implement those details is more difficult to determine. (I've had to wrestle with such questions myself as I've written multiple async I/O libraries and frameworks. If Go had come out 10 years earlier, my entire career trajectory might have been different.)
FWIW, I don't write any Go software. I admire it from a distance, the same way I admire C++, Rust, and Java[1], while I slave away in C and several other languages. Though, there's an honesty and clarity in Go that is lacking in other languages, precisely because Go isn't trying to be an all-purpose language, and because the authors have an understanding of (and deliberately leverage) the interplay between design goals and implementation constraints that can only come with having designed and implemented umpteen different languages before.
[1] I did have a short tryst with Java a long time ago, but it actually ended up driving me into the arms of C. Not because of the language, but because the ecosystem--licensing, tooling, interop, etc.--was an extremely poor fit for Linux, and in many ways still is.
As per the creators of the language, who said they got the idea of writing a new language while waiting for C++ to compile, it was. You can read up on this from them; they were quite clear about it. They even expressed surprise that they didn't get the migration of C++ programmers to golang they were anticipating, but instead got Python programmers.
It's not like the lack of green threads stopped people from writing highly concurrent code. Libraries like Vertx, Reactivex, and Akka have existed for a while now. Also, all major platforms have a notion of green threads/coroutines (Java is getting the former and C++ is getting the latter). This was practically the only thing golang had going for it, and once they're implemented for those platforms, it's hard to make a case for golang given its bad design decisions.
> Hassles with dealing with timeouts exposed flaws in their scheduling and message passing abstractions.
That's a good point, and it's why Java's green thread implementation is superior to golang's: timeouts and cancellations are designed in from the start, as are hierarchies of green threads (similar to Erlang's).
Again, it just shows that golang does the minimum and pushes complexity onto the user under the guise of simple language design.
> licensing, tooling, interop, etc, was an extremely poor fit for Linux, and in many ways still is.
The JVM is open source and has been for a while now. Could you elaborate on why it's not a good fit for Linux?
So your argument is that the future invention of something that doesn't currently exist is going to be better than something that does exist?
If we're comparing what the languages expose, we've got pprof, which can profile and generate flame graphs etc., all as standard. That blows your argument away IMO [0]