With static, using-the-language dependency injection, isn't the question of "how does ABC component get access to DEF?" answerable with the normal IDE/language tooling, rather than some magical library's way of doing it? You can just find the calls to a constructor and look at the arguments.
My experience is based on my bad experience with runtime DI libraries, and is definitely biased against them, but I must be missing something here.
* Static dependencies make testing harder, no question about it. This is mitigated in dynamic languages like Ruby by mocking the statics. While you can actually do this in Java with PowerMock, avoiding mocks entirely is even better: if you can't use a real object, use a fake that implements the relevant interface.
* Statics mean singleton, and that invariant often changes as a product matures. It's very easy to go from "the database" to "a database", and when you have 500 places getting "the database" it's very hard to make that evolution.
* Statics make it very hard to maintain module boundaries, because every static is its own exported interface. In a long-running project, binding tends to get tighter and tighter as every module reaches out for the statics of other modules.
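To make the "the database" vs. "a database" point and the fake-over-mock point concrete, here's a minimal sketch. All the names (`Database`, `GlobalDatabase`, `UserStore`, `FakeDatabase`) are hypothetical, not from any real codebase:

```java
import java.util.HashMap;
import java.util.Map;

interface Database {
    String get(String key);
}

// The static style: every caller is hard-wired to "the database".
// Evolving to "a database" (per-tenant, replica, test fake) means
// touching every call site of GlobalDatabase.getInstance().
class GlobalDatabase implements Database {
    private static final GlobalDatabase INSTANCE = new GlobalDatabase();
    static GlobalDatabase getInstance() { return INSTANCE; }
    public String get(String key) { return "real:" + key; }
}

// The injected style: "which database" is decided exactly once,
// at construction time, by whoever builds the UserStore.
class UserStore {
    private final Database db;
    UserStore(Database db) { this.db = db; }
    String userName(String id) { return db.get("user:" + id); }
}

// A fake implementing the interface -- no mocking library needed.
class FakeDatabase implements Database {
    private final Map<String, String> rows = new HashMap<>();
    void put(String key, String value) { rows.put(key, value); }
    public String get(String key) { return rows.get(key); }
}

public class Demo {
    public static void main(String[] args) {
        FakeDatabase fake = new FakeDatabase();
        fake.put("user:42", "alice");
        UserStore store = new UserStore(fake);
        System.out.println(store.userName("42")); // prints "alice"
    }
}
```

Note that nothing here needs a DI container; the bullet points above are about statics vs. constructor arguments, which is independent of whether a framework does the wiring.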
Sure, folks can write bad code with DI systems too. And I'm no fan of Spring - not because of the DI, which is fine, but because of the need to wrap everything else in the universe and now you have to understand both how the underlying thing works and the way Spring changes it. But something like Guice or Dagger is just the right amount of glue to hold a system together, without getting in your way.
Do you object to passing interfaces vs concrete types? That is a wholly orthogonal concern; you make the choice to extract interfaces with or without DI.
Maybe you have an example?
As a Java developer for my whole career, here is my take. In the Java world there is this cultish cottage industry of "frameworks" for all sorts of work. Most Java developers are not expected to write plain code with the JDK standard library plus a few external libraries. Creating an object via "new Obj()" might cause the programming universe to collapse, so a DI framework is a must for enterprise Java developers.
If I were to tell people at work that Go has a built-in HTTP server, so we could implement a few handlers and have a basic service running, they would not be shocked that an HTTP listener could be this simple, but would rather ask: "Does Go bundle weblogic/websphere/tomcat/netty with it, or else how can it work?" Same with testing: no understanding of how or what is to be tested, but everything about JUNIT/Mockito/Mockster/SpringUnit or whatever.
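The irony is that the JDK itself has bundled a small HTTP server (`com.sun.net.httpserver`) since Java 6. A minimal sketch — `MiniServer` is a made-up name, and port 0 is used just to let the OS pick a free port:

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;

public class MiniServer {
    static HttpServer start() throws IOException {
        // Port 0 lets the OS pick a free port; use a fixed port in practice.
        HttpServer server = HttpServer.create(new InetSocketAddress(0), 0);
        server.createContext("/", exchange -> {
            byte[] body = "hello".getBytes();
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });
        server.start(); // serves requests on a background thread
        return server;
    }

    public static void main(String[] args) throws IOException {
        HttpServer server = start();
        System.out.println("listening on http://localhost:"
                + server.getAddress().getPort() + "/");
    }
}
```

No app server, no WAR, no classpath archaeology — `java MiniServer` and it answers requests.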
There is no perceived need to understand the basic concepts of testing, client/server, dependencies, error management, etc. Even basic functionality is understood in terms of a branded framework. That is their main frame of reference.
It is hard for them to open a terminal and run their application JAR from the command line--they only ever run it from the IDE. Oh wait--they'd need Tomcat/Apache and a few hours of dealing with classpath issues.
On the other hand there are the enthusiasts (like you, I presume) who like tinkering and using the language to the fullest. Any successful project needs at least a few of these people, but they can also go overboard, building a lot of custom functionality where a standard library could have been used.
While I'm also an advocate for deepening knowledge of the systems you're working with, it's no 'sin' to use some libraries. For instance, take your HTTP server example: it's quite easy to just listen on a socket and respond to a request. But you want parallelism, so you need a thread pool. And queues, configuration, and error handling. That escalates quickly, so why not pick whichever Java servlet implementation already handles most of that complexity and -- more importantly -- is production-tested, so it won't fall over when you deploy it live. And then there's stuff like OpenID Connect or SOAP (yes, it still exists), where you can 'plug in' an implementation on some servers and get work done instead of worrying about getting all the details of a complex protocol right.
Well, I am kind of recommending using standard libraries. And for the servlet implementation I am using embedded Tomcat, as I mentioned in another comment. What I am not doing is generating gratuitous scaffolding of a dozen packages and innumerable classes because that is the "best practice".
The dumb thing is that most of the time only one type is ever injected. It's all hypothetical flexibility, which has a cost but no benefit.
Ha. Calling `new` would be either an absolute enterprise Java sin or obscure arcana. Some people are quite proud of converting compile-time errors into runtime exceptions. Because, you know, "best practices" and all.
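The compile-time vs. runtime trade can be sketched with plain reflection, which is roughly how containers instantiate classes named in configuration. `com.example.Missing` is a deliberately nonexistent class name:

```java
import java.lang.reflect.Constructor;

public class NewVsReflect {
    public static void main(String[] args) {
        // Plain `new`: a typo in the class name fails at compile time.
        StringBuilder sb = new StringBuilder("ok");
        System.out.println(sb); // prints "ok"

        // Reflective instantiation, as a container does from config:
        // the same typo only surfaces when this line actually runs.
        try {
            Class<?> cls = Class.forName("com.example.Missing"); // made-up name
            Constructor<?> ctor = cls.getDeclaredConstructor();
            System.out.println(ctor.newInstance());
        } catch (ReflectiveOperationException e) {
            // prints "runtime failure: ClassNotFoundException"
            System.out.println("runtime failure: " + e.getClass().getSimpleName());
        }
    }
}
```

Real containers add classpath scanning and annotations on top, but the failure mode is the same: a wiring mistake the compiler could have caught shows up as an exception in a running process.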
I think what you are referring to is just doing DI manually, i.e. you define constructors that accept dependencies and then call them all in a main function. I think this is tolerable if your codebase is structured for it; in typical Java codebases it gets ugly really fast. IMO this is caused by a general proliferation in the number of classes (due to class-per-file, among other reasons), as well as a tendency to never use "static" DI. As an extreme example, if you needed to inject a logging dependency, then almost all code would need to be part of the DI graph. In a typical web backend, you might DI the SQL connection pool. That drags basically all code into DI, since everything either uses SQL or has a transitive dependency that does. IMO injecting the connection pool isn't useful, since it's not useful to write tests that inject anything other than a real SQL connection.
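For concreteness, manual DI as described above looks something like this — constructors declare their dependencies and main() is the single place the graph is assembled. All the names (`Logger`, `OrderRepository`, `OrderService`) are made up for the sketch, and the repository stubs out its query:

```java
interface Logger { void log(String msg); }

class ConsoleLogger implements Logger {
    public void log(String msg) { System.out.println("[log] " + msg); }
}

class OrderRepository {
    private final Logger logger;
    OrderRepository(Logger logger) { this.logger = logger; }
    // Stubbed query; a real version would take a connection pool too,
    // which is exactly how the pool drags everything into the graph.
    int count() { logger.log("counting orders"); return 3; }
}

class OrderService {
    private final OrderRepository repo;
    OrderService(OrderRepository repo) { this.repo = repo; }
    String summary() { return repo.count() + " orders"; }
}

public class ManualWiring {
    public static void main(String[] args) {
        // The whole object graph, wired explicitly: no container,
        // no reflection, and the compiler checks every edge.
        Logger logger = new ConsoleLogger();
        OrderRepository repo = new OrderRepository(logger);
        OrderService service = new OrderService(repo);
        System.out.println(service.summary()); // prints "3 orders"
    }
}
```

With three classes this is pleasant; the ugliness you describe shows up when the graph has hundreds of nodes and main() becomes a page of constructor calls.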
Go doesn't require one class (well, type or struct in Go) per file, and has much more flexibility in how you build packages as a result. I think it's a good thing that dependencies like the logger and the database are passed around explicitly: I've learned the hard way that "explicit is better than implicit" even when it means a bit more boilerplate.