> Not sure I understand, are you implying Deno does automatic tree-shaking on package imports? If not, how does "deno download exactly the code that is imported" and not just a whole package?
npm install copies into your node_modules an entire zip/tarball package. Deno uses ES2015+ module syntax and only grabs the JS files that are imported (or imported by imports). So it "naturally" tree shakes at the file level, and doesn't have a concept of a "package" in the same way. It's not going to tree shake inside single-file boundaries the way a major bundler/packager might (though the V8 JIT should still sort of indirectly compensate).
So yeah, if the package is published as just a single (M)JS file it will still get entirely downloaded by Deno, but if the modules are split across multiple files, Deno will only download the ones that are directly or indirectly imported (and will have no idea of the remainder of the files in the "package").
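To make that concrete, the download behavior is essentially a walk of the import graph. This is a sketch, not Deno's actual resolver; the file names are made up, but the traversal mirrors the idea that only files reachable from the entry point ever get fetched:

```typescript
// Hypothetical module graph for a multi-file package: each entry maps a
// file to the files it imports. All names here are invented for illustration.
const graph: Record<string, string[]> = {
  "mod.ts": ["util/strings.ts", "util/arrays.ts"],
  "util/strings.ts": [],
  "util/arrays.ts": ["util/compare.ts"],
  "util/compare.ts": [],
  "extras/unused.ts": [], // never imported, so never downloaded
};

// Collect every file imported directly or indirectly from the entry point --
// roughly the set of URLs Deno would fetch.
function reachable(entry: string, g: Record<string, string[]>): Set<string> {
  const seen = new Set<string>();
  const stack = [entry];
  while (stack.length > 0) {
    const file = stack.pop()!;
    if (seen.has(file)) continue;
    seen.add(file);
    for (const dep of g[file] ?? []) stack.push(dep);
  }
  return seen;
}

const downloaded = reachable("mod.ts", graph);
console.log([...downloaded].sort());
// extras/unused.ts is not in the set: Deno never even learns it exists.
```

Note that `util/compare.ts` is fetched even though `mod.ts` never names it, because an import of an import still counts.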
> I disagree with this. In my opinion, this is done by using a pull-through cache that caches every package developers request and so inherently has a cache of the packages that will go to production.
> Is it possible to do this in deno today? I don't really get that sense.
Yes, because URLs are just URLs, you could always have a proxy service running at a URL that knows how to request packages from upstream. https://jsproxy.mydomain.example.com/lodash/v1.1/module.js could be a simple caching proxy that fetches lodash from upstream if it doesn't have the requested version cached (or returns a 404 if it isn't allowed to cache that specific version, or whatever).
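Here's a minimal sketch of that pull-through cache in TypeScript. The path scheme and the allowlist policy hook are assumptions for illustration, not a real registry protocol, and the upstream fetch is injected so the logic runs without a network; a real service would sit behind an HTTP server and stream bytes rather than strings:

```typescript
type Fetcher = (path: string) => Promise<string | null>;

class PullThroughCache {
  private cache = new Map<string, string>();

  constructor(
    private upstream: Fetcher,                  // how to reach the real registry
    private allowed: (path: string) => boolean, // e.g. a version-pinning policy
  ) {}

  // Serve from cache when possible; otherwise pull from upstream once and
  // remember the body. Returning null models the 404 case.
  async get(path: string): Promise<string | null> {
    if (!this.allowed(path)) return null; // version not permitted: 404
    const hit = this.cache.get(path);
    if (hit !== undefined) return hit;
    const body = await this.upstream(path);
    if (body !== null) this.cache.set(path, body);
    return body;
  }
}

// Demo with a fake upstream: the second request never leaves the cache.
(async () => {
  let upstreamCalls = 0;
  const proxy = new PullThroughCache(
    async (path) => {
      upstreamCalls++;
      return `// contents of ${path}`;
    },
    (path) => path.startsWith("/lodash/"),
  );
  await proxy.get("/lodash/v1.1/module.js");
  await proxy.get("/lodash/v1.1/module.js");
  console.log(`upstream fetched ${upstreamCalls} time(s)`);
})();
```

A nice side effect of this setup is exactly the point above: the cache ends up holding precisely the set of modules your developers actually requested, which is the set that goes to production.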