You disagree that the basic functionality of Go, including type and memory safety, has been proven by prior languages and production deployments? That's my basic premise. If Pascal/Oberon-like languages and GCs are proven, then a language very similar to them will likely maintain those properties and deliver the same results. That's supported by the number of amateurs at ETH who successfully ported and re-implemented the Oberon OS and compilers in six months to two years.
Whereas Rust uses a combination of proven primitives and exotic techniques to achieve its goals. The exotic stuff, the integration strategy, and the implementation are new territory that exposes a level of risk a tried-and-true method (e.g. a knock-off Oberon) doesn't have. It's unfamiliar territory for programmers at large, even if some of the methods were seen in academia (e.g. Cyclone). Because of that, it must be shown to be effective in terms of daily usage, standard libraries, and tooling.
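To make concrete what I mean by "exotic techniques": the core of it is Rust's ownership and borrowing model, the part with roots in academic work like Cyclone rather than in decades of mainstream deployment. A minimal sketch (my own toy example, not from any particular codebase):

```rust
// Ownership: each value has exactly one owner; assignment of a
// non-Copy type moves ownership rather than aliasing or copying.
fn main() {
    let s = String::from("hello");
    let t = s; // ownership of the String moves from `s` to `t`
    // println!("{}", s); // rejected at compile time: use after move
    println!("{}", t);

    // Borrowing: `iter()` takes a shared borrow, so `v` is still
    // usable afterward; the compiler tracks all of this statically,
    // with no GC involved.
    let v = vec![1, 2, 3];
    let total: i32 = v.iter().sum();
    assert_eq!(total, 6);
    println!("sum of {:?} is {}", v, total);
}
```

That static checking is exactly the unproven-at-scale part: the memory safety it buys is real, but whether working under those rules stays practical across daily usage, libraries, and tooling is what still has to be demonstrated.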
When it is, and I have confidence in the team(s), it will then have a different argument: it eliminates far more risks than it introduces. It already has that argument at the application level, assuming the compiler is robust. The overall platform needs more deployment, though, so people can see whether it lives up to the claims, by how much, and in what scenarios. That's the risk perception I'm talking about. Right now, it's a big unknown to many outsiders compared to traditional languages and platforms.