Current job: there are about 10-20x as many integration points with downstream services, many of them owned by other teams. There's been no investment in making it possible to spin up an isolated service graph on a dev machine. Heavy use of persistent shared testing environments in the company data centre; these environments are usually in one or more states of brokenness. Some use of mountebank to replace troublesome downstream services with stubs returning canned data.
http://www.smashcompany.com/technology/docker-is-a-dangerous...
When I need to run multiple databases, or want a database with a copy of production data, we sometimes run that remotely. Running a few servers that all developers can use seemed to be standard practice 10 years ago, but the practice seems to be in retreat, replaced by the notion of "use Docker to run every service on your own machine." I have not seen that Docker actually makes that easy to do, and if your team has a full-time devops person who can spin up some extra machines, it usually saves the whole team a lot of time to simply rely on a few central servers that the devops person has set up for development.
Also: `go fmt` is incredible (and built into literally every Golang environment). We used to work primarily in Node.js, but since making the switch last year to Golang we've noticed an increase in code-review productivity. Everyone's code looks the same, so we don't have to spend time worrying about dev environments.
docker-compose up -d
What’s not easy about that?
However, if your application's docker container requires compilation and you need to restart docker compose whenever you make changes to the codebase (and/or want to run integration tests), the whole docker compose flow might add an extra minute or two to your feedback loop and slow down development.
In those cases I'd rather run everything locally (though I'd still include a docker-compose.yaml file for developers who prefer that and are not comfortable running all the required services locally).
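For context, a minimal sketch of what such a docker-compose.yaml might look like; the service names, images, and ports here are hypothetical, not taken from the thread:

```yaml
version: "3.8"

services:
  # Hypothetical app container, built from the local Dockerfile.
  # This is the part that forces a rebuild/restart on code changes.
  app:
    build: .
    ports:
      - "8080:8080"
    depends_on:
      - postgres

  # A backing service the app depends on; pinned to a specific tag
  # so every developer runs the same version.
  postgres:
    image: postgres:15
    environment:
      POSTGRES_PASSWORD: dev
    ports:
      - "5432:5432"
```

With this file checked in, `docker-compose up -d` works for those who want it, while others can run only the backing services in compose and the app itself natively.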
Developing Go applications, I have found my development flow much more agile without docker; it was decreasing my productivity a bit, so I stopped using it.
Our current integration test suite normally runs entirely through the docker compose flow (in CI or locally), but launching it outside of docker, if you have all dependencies running locally, is considerably faster and saves a lot of time, which increases my productivity.
http://www.smashcompany.com/technology/docker-protects-a-pro...
And then scroll down to where you see this:
UPDATE 2018-07-09
And read that Slack conversation. That is a real conversation that actually happened. We lost 3 days trying to fix that bug, which in the end came down to a complex interplay between the Docker cache and the way the Dockerfile was written.
This is the sales pitch for Docker:
docker-compose up -d
But that Slack conversation is the reality.
For people who prefer to use personal computers, there are a variety of options:
1. Virtualization. Previously we used VMware or VirtualBox, but we now offer Docker installs that replicate what CI runs.
2. Following the installation documents. Our stack is complicated but doesn't change very much, so one can run a script to install everything on a standard *nix box that we support, or edit it to support their preferred configuration. People who have nonstandard *nix boxes tend to know what they're doing when it comes to compiling from source.
3. Using the dev machine you're given as a server, and working on your personal machine remotely.
All have their pros and cons, but it seems to work out.