I have been wondering for quite some time how one could implement this — not in the sense of how to code it, but rather how few changes one would have to make to the language to get as many of the effects of genericity as possible. There is some inspiration in Lua, Scheme48 and some dialects of ML (functors, I believe?) in the form of "higher-order modules", where a module (a package, in Go terms?) can take parameters that the importer has to supply. The obvious things to supply would be at least constants, functions and types. (One might look at functions as types of computational processes, though, and at function signatures/interfaces as their respective type classes. This perspective could subsume functions under types, and perhaps integers under nullary functions returning an integer.)

The question is how to do the import strings reasonably. A good thing about Go is that the language already has provisions for this: the import string can technically be arbitrary. Additionally, a subset of the reflection interface could be evaluated at compile time to provide for ad-hoc specialization of generic code — you would write straightforward code (loops over struct fields, etc.) that is easily eliminated/specialized when the module is instantiated.
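To make the last point concrete, here is a minimal sketch of the kind of straightforward code meant above: a loop over struct fields written against the reflection interface. Today this runs via `reflect` at run time; under the idea sketched here, a compiler evaluating this subset of reflection at compile time could unroll the loop and emit straight-line code specialized for each concrete struct type. `FieldNames` and `Point` are illustrative names, not anything from an existing proposal.

```go
package main

import (
	"fmt"
	"reflect"
)

// FieldNames returns the names of all fields of a struct value.
// The loop over t.NumField() is exactly the sort of code a compiler
// could evaluate at module instantiation instead of at run time.
func FieldNames(v interface{}) []string {
	t := reflect.TypeOf(v)
	names := make([]string, 0, t.NumField())
	for i := 0; i < t.NumField(); i++ {
		names = append(names, t.Field(i).Name)
	}
	return names
}

type Point struct{ X, Y int }

func main() {
	fmt.Println(FieldNames(Point{X: 1, Y: 2})) // [X Y]
}
```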
The interface to the feature is perhaps more important than the complexity of the implementation, because it will affect many more people: only a few programmers will work on the compiler, but tens of thousands will write code using it. I make no claims as to how complex this would be to implement, but it probably wouldn't stand out. The interesting thing is that this shouldn't necessitate any changes in the language of the generic modules themselves (no <>s and such). It merely parameterizes some types and constants in a module. As such, after a certain phase of compilation, the process is the same as for a non-generic module, so perhaps it's a low-complexity change in the implementation, too (not just in the language spec).
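A hypothetical sketch of what this could look like — the `param` marker and the query-string form of the import path are both invented here for illustration, not valid Go and not part of any proposal:

```go
// stack/stack.go — an ordinary package, except that T is left unbound
// and is supplied by the importer. Note: no <>s, no new syntax inside
// the package body itself.
package stack

type T param // hypothetical: declares T as a module parameter

type Stack struct{ items []T }

func (s *Stack) Push(v T) { s.items = append(s.items, v) }
```

```go
// client code: the otherwise-arbitrary import string carries the
// parameters, instantiating the module for a concrete type.
import intstack "example.com/stack?T=int" // hypothetical
```

After the instantiation phase substitutes `int` for `T`, the package compiles exactly like any non-generic package.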