It's worth noting that go modules use a novel dependency resolution algorithm which is extremely simple to reason about/implement, fast, and produces more reliable builds than npm/bundler/cargo. That's why I was excited about it, anyway. It removes the ever-present NP-complete assumptions in this space, so from a computer science perspective it's extremely interesting.
I've heard/read this, but I can't tell what is necessarily novel about it...to me, it reads like old-school/boring Maven transitive dependency resolution.
(Not holding out Maven as best practice; it's just what I know best in terms of pre-version-range, pre-lock-file dependency management, before those features became state of the art around 2010.)
...that said, Maven does actually support version ranges; ~10 years ago when I last used it, either it didn't support them yet, or we didn't use them, so perhaps that is why vgo seems so familiar. Or I just have a terrible memory.
Anyway, if anyone can correct me on my fuzzy assertion that "vgo is like maven w/fixed versions", I'd appreciate it!
vgo's more-constrained specification for dependencies means there is exactly one right answer, and it can be easily and quickly calculated by both computer and human.
Whether or not this will turn out to matter in practice is still an open question.
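To make the "one right answer" concrete: vgo's minimal version selection can be sketched in a few lines of Go. Each module states minimum versions for its dependencies, and the build list is simply, for each module, the maximum of the minimums reachable from the main module. The module graph below is invented for illustration; real vgo also does proper semver comparison rather than the lexical compare used here.

```go
package main

import (
	"fmt"
	"sort"
)

// A req names a module and a minimum required version.
type req struct{ module, version string }

// mvs computes the build list: walk the requirement graph from the
// main module's requirements and keep, for each module, the highest
// minimum version any reachable module asks for. No SAT solving, no
// backtracking. (Versions compare lexically here, which is enough for
// this toy example.)
func mvs(graph map[string][]req, roots []req) map[string]string {
	selected := map[string]string{}
	work := append([]req(nil), roots...)
	for len(work) > 0 {
		r := work[0]
		work = work[1:]
		if cur, ok := selected[r.module]; ok && cur >= r.version {
			continue // already satisfied by an equal or newer version
		}
		selected[r.module] = r.version
		work = append(work, graph[r.module+"@"+r.version]...)
	}
	return selected
}

func main() {
	// Invented module graph: B 1.2 needs D>=1.3, C 1.1 needs D>=1.4.
	graph := map[string][]req{
		"B@1.2": {{"D", "1.3"}},
		"C@1.1": {{"D", "1.4"}},
	}
	build := mvs(graph, []req{{"B", "1.2"}, {"C", "1.1"}})
	mods := make([]string, 0, len(build))
	for m := range build {
		mods = append(mods, m)
	}
	sort.Strings(mods)
	for _, m := range mods {
		fmt.Println(m, build[m]) // D resolves to 1.4, the max of the minimums
	}
}
```

Note there is no search at all: the answer is deterministic and computable in a single graph walk, which is the whole contrast with range-based solvers.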
Overall, I am trying to be optimistic about the future of Go dependency management, but I am not planning to switch the projects I work on in my company from dep to Go modules until most of those rough edges are either smoothed, or officially recognised as "works as intended", with viable workarounds.
[1] https://github.com/golang/go/issues?q=is%3Aissue+is%3Aopen+l...
With that said, a non-negligible chunk of the issues you linked are cosmetic (better error messages), proposals, works-as-intended (though the docs could improve), or self-inflicted misconfigurations from running a Go beta. 53 sounds worse than it is.
I don't know why this had so much drama around it, to be honest. And yes, it is far from perfect; it will be approximately as crummy as python/ruby/node, which is still an improvement.
(all praise mwhudson for maintaining the Go snaps -- `snap info go` for the whole story).
It lists the channels and some versions, and has a short description: "This snap provides an assembler, compiler, linker, and compiled libraries for the Go programming language."
When you said "the whole story" I expected there to be some sort of story but I guess I might have misunderstood what you meant.
I'm not sure if I'd use it for deployment - but for development it's quite versatile.
It was not fun when VS Code (with the Go plugin) would automatically remove my imports every time I saved the file because it couldn’t find it.
./example.go:8:9: imported and not used: "net/http"
So VS Code's Go plugin removes the unused import because otherwise the code is invalid. I'm a vim user and `vim-go` has the same behavior.[1] I know it's not the official upstream remote, but I find this one the easiest to remember.
The rather late support for "out-of-tree" building might stem from a conflict between the group pushing to move away from the $GOPATH approach (a group growing along with Go's popularity itself) and the Go dev team and early adopters, especially since the $GOPATH approach is a central part of Go.
While I can see the intention of Go modules, I believe that many people who belong to the first group migrated from other languages like JavaScript and expect to migrate their workflow as well. I highly encourage everyone to try the $GOPATH approach before rooting for either side.
I've tried it. Can we now get real modules?
Go is a Google project, and Google has a very unique approach to package management: commit everything to the monorepo. The GOPATH is, in essence, a monorepo. If you want to change the API of a library, well, you can just change all its callers across your GOPATH, too. And so for a long time the Go team was unconvinced that package management was a problem.
For example, the Go FAQ [0] still has this to say:
> How should I manage package versions using "go get"?
> "Go get" does not have any explicit concept of package versions. Versioning is a source of significant complexity, especially in large code bases, and we are unaware of any approach that works well at scale in a large enough variety of situations to be appropriate to force on all Go users...
> Packages intended for public use should try to maintain backwards compatibility as they evolve. The Go 1 compatibility guidelines are a good reference here: don't remove exported names, encourage tagged composite literals, and so on. If different functionality is required, add a new name instead of changing an old one. If a complete break is required, create a new package with a new import path.
It is true that if you write perfectly backwards compatible code, then you don't have a versioning problem, but if you think that's a viable solution you're ignoring certain realities of software engineering.
It wasn't until early last year that Russ Cox [1] publicly declared that versioning was a problem and set out to introduce a package manager into the Go toolchain. As it turns out, GOPATH is entirely incompatible with the approach to package versioning that the Go team settled on. You simply can't have two versions of the same package in your GOPATH, unless you're willing to rename one and rewrite all the import paths. Given that public opinion had turned against GOPATH [2], it was finally time to do away with it.
So it took about a year and a half from the time the Go team admitted GOPATH was a problem to shipping a release that made it unnecessary. That's really not too bad. The frustrating part of this saga was the first seven years, during which the Go team refused to admit there was a problem at all.
[0]: https://golang.org/doc/faq#get_version [1]: https://research.swtch.com/go2017 [2]: https://github.com/golang/go/issues/17271
My personal theory is that what made Russ Cox cave in was his discussions with Sam Boyer. Cox thought Boyer was going down the wrong path, and thought he had a better solution. Unfortunately, the Go community didn't seem to have read the discussions the two were having, because pretty much everyone thought Dep (Boyer's tool) was blessed by the Go team and was going to be the official package management tool. I can forgive the drama if the end result is a real, non-Google package management system, though.
(While I didn't appreciate the drama, I'm somewhat relieved Dep is not going to be the official solution. Dep is okay when it works, but inherits pretty much all the warts of Glide, which Boyer also worked on. Glide has been an absolute nightmare to work with. Dep is in fact worse than Glide in some respects -- due to weaknesses in its solver, it's completely incompatible with certain significant community packages such as the Kubernetes client. Of course, Dep is not yet 1.0, but I would not say things were looking that promising.)
That's fine and works perfectly well in the appropriate environment (e.g. a large, structured work environment with processes etc.), but for my personal work I prefer to just check out wherever the hell I want and go from there. I'm really looking forward to module support so I can easily use Go for some small-scale personal projects without a lot of the ceremony of setting Go up on, say, a Raspberry Pi -- just check out and go (no pun intended) will be a breath of fresh air.
It's true that it encourages working with all the code in GOPATH. This is a good thing. Your GOPATH is a view of the whole Go ecosystem. You fix a bug where it makes sense and it is picked up by all users. Sadly, vendoring already messed this up.
I think it's insanity when every program has a different idea of what code a given import path refers to (like is often the case with project-based package managers and vendoring as well). It's no fun to juggle the version differences in your head while working on multiple projects.
Go modules have some good ideas here. Semantic import versioning hopefully reduces the number of different versions you have to consider.
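As a sketch of what semantic import versioning looks like in practice (the module path and aliases below are hypothetical, for illustration only): a breaking v2 release gets a new import path ending in /v2, so two major versions are simply two different packages and can coexist in one build.

```
// go.mod of the library at v2: the module path itself carries the major version.
module github.com/example/mylib/v2

// A consumer can then import both majors side by side,
// because they are distinct import paths:
//
//     import (
//         mylibv1 "github.com/example/mylib"
//         mylibv2 "github.com/example/mylib/v2"
//     )
```

Within one major version, minimal version selection picks a single copy, so the only duplication you ever juggle is across intentional breaking changes.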
Doing the version selection not per-project but globally for all of GOPATH should still result in a working (but not necessarily reproducible or high-fidelity) build. It definitely reduces the amount of code you need to deal with.
Modules and vendoring
When using modules, the go command completely ignores vendor directories.
By default, the go command satisfies dependencies by downloading modules
from their sources and using those downloaded copies (after verification,
as described in the previous section). To allow interoperation with older
versions of Go, or to ensure that all files used for a build are stored
together in a single file tree, 'go mod -vendor' creates a directory named
vendor in the root directory of the main module and stores there all the
packages from dependency modules that are needed to support builds and
tests of packages in the main module.
To build using the main module's top-level vendor directory to satisfy
dependencies (disabling use of the usual network sources and local
caches), use 'go build -getmode=vendor'. Note that only the main module's
top-level vendor directory is used; vendor directories in other locations
are still ignored.

OK, so that makes it sound like it has nothing to do with GOPATH.
But interesting (annoying?) that vendor won't kick in unless you specifically ask for it.
> Very nice, go build ignored the vendor/ folder in this repository (because we’re outside $GOPATH)
and
> Oddly these are stored in $HOME/go/src/mod not the $GOCACHE variable that was added in Go 1.10
Maybe Go 2.0 will be stable.
Or you're using the isolated-GOPATH trick.
$ mkdir -p .gopath/src/github.com/foo
$ ln -s ../../../.. .gopath/src/github.com/foo/bar
$ GOPATH=$PWD/.gopath go install github.com/foo/bar
For example, this is one of my Go projects where you can unpack the release tarball anywhere and `make && make install` just works: https://github.com/sapcc/swift-http-import