Sindre Sorhus wrote a gist "Pure ESM modules"[0] and converted all his modules to pure ESM, breaking anyone who attempted to `require` the latest versions of his code; he later locked the thread to prevent people from complaining. node-fetch released a pure ESM version a year ago that is ~10x less popular than the CommonJS version[1]. These changes broke a lot of code and cost developers many hours figuring out how to make their projects compatible with pure ESM modules (or deciding to ignore them and stick with old CommonJS versions), not to mention the tons of pointless drama in GitHub issues.
Meanwhile, Node.js TSC member Matteo Collina advocated a moderate approach that depends on where your module will run [2]. So the crusade is led not by the Church but by a handful of zealots dedicated to establishing ESM supremacy for unclear reasons (note how Sindre Sorhus' gist lacks any justification and how weak TFA's justifications are). It's kind of like the Python 2 to 3 move, except with even less rationale and not driven by the core devs.
0 - https://gist.github.com/sindresorhus/a39789f98801d908bbc7ff3...
1 - https://www.npmjs.com/package/node-fetch?activeTab=versions
2 - https://github.com/nodejs/node/issues/33954#issuecomment-924...
This crusade is nowhere near zealous nor righteous enough against the infidels & non-believers.
But it also hasn't been effective enough at supporting/supplying the crusade either.
Matteo's point was that Node hasn't stabilized its loader support, so tools have a hard time migrating to ESM. IMO it's a pity ECMAScript never standardized a module registry; when ESM 1.0 shipped, most people thought one would follow, and it's long felt like a bait & switch. But it wasn't a feature browsers needed or really wanted, so that unfulfillment was unsurprising. Anyhow, IMO Matteo is making a technical point that it's still hard to finish the move, which is a different spin than "advocated a moderate approach".
Given the hurt we legitimately experience, I really wish Node and/or WinterCG or someone would prioritize figuring out & implementing whatever needs to go into a module registry/loader. And then beg the big tool chains that need this stuff to expedite their migrations, pretty pretty please.
Okay, but let's not resist inconveniencing ourselves with the facts.
ESM got browser support in 2017 and stable Node.js support in 2020.
It only became awful for me when people started publishing pure ESM packages; `npm i node-fetch` suddenly began resulting in broken scripts and I had to learn why. Prior to that, I happily used CJS outside of the browser and what I suppose is ESM in the browser (the `import` syntax provided by bundlers).
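For anyone who hasn't hit it, the failure mode looks roughly like this (a sketch; the helper name `getFetch` is mine, and `node-fetch` v3+ would need to be installed for the import to actually resolve):

```javascript
// In a CommonJS file, requiring a pure-ESM package throws at load time:
//   const fetch = require('node-fetch');
//   // Error [ERR_REQUIRE_ESM]: require() of ES Module ... not supported
//
// The usual workaround is a dynamic import(), which forces the caller
// to go async:
async function getFetch() {
  const { default: fetch } = await import('node-fetch');
  return fetch;
}
```

So a previously synchronous script suddenly needs async plumbing just to keep its HTTP client.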
> a different spin IMO than a "advocated a moderate approach".
He said, "If your module target the Browser only, go for ESM. If your module target Node.js only, go for CJS."
This is moderate compared to the "Pure ESM" approach. The fetch API is built into browsers so I don't see why anyone would use `node-fetch` outside of Node.js, and yet the maintainers of `node-fetch` went Pure ESM anyway. Also that GitHub issue is titled "When will CommonJS modules (require) be deprecated and removed?" and his response was "There is no plan to deprecate CommonJS"[0].
0 - https://github.com/nodejs/node/issues/33954#issuecomment-776...
That preference has had an impact on the Node ecosystem at large, given how prolific an OSS contributor Sindre is, but IMO that influence has been earned by the large body of work he's contributed.
I'm working on greenfield projects that leadership is still insisting we avoid ESM-only packages for. The move that Sindre (and others like wooorm) made has not been well-received.
I'm all for ESM, but alienating your own users is stupid. Create node-fetch-esm and support both versions until CommonJS popularity is low enough to drop it.
I've definitely worked around my fair share of CommonJS issues but until ESM "just works" I'm slightly pained by how aggressive the tone of this article is.
In fact it’s a very illustrative example of why CommonJS should be taken out back and shot.
Re mocks, an ecosystem should not be held back solely due to an arcane edge case. The apps that use test doubles can be rearchitected to support test doubles to the programmers' satisfaction. Most people do not use test doubles & the benefits of ESM & not having to deal with CJS outweigh the downsides of losing some convenience in mocking modules.
This is something that will not be popular with clingers to CJS, but it is something that will benefit the wider ecosystem. The vocal minority which is holding back the ecosystem should start making plans to migrate because it is in the process of happening right now...
So I think the general resentment is having to do extra work to support CJS which is not standard across all JS platforms...and some legacy libraries are still written in CJS requiring an interop. So it would be great to not have to do this extra work to support the legacy CJS on Node.js when every other JS platform is using ESM. In this case, one person's convenience comes at a cost to everyone else & at some point, that one person is going to have to suck it up & do the work to support his use case...just as everyone else had to do the work to support the legacy CJS for years now.
My code works fine and the article isn't doing itself any favors by mocking me (heh) for thinking so.
We would be much better off if it had never been created.
I think the biggest miss was that Node's mixed (default) mode doesn't do interop the way webpack/babel etc. did by default. I get that they wanted calling CJS from ESM to be more implicit, but in the end it just inhibits conversion of existing libraries, as their dependencies are now a bigger hurdle.
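For reference, the webpack/babel-style interop mostly boils down to a tiny helper; this is essentially Babel's `_interopRequireDefault` (the example module objects below are made up):

```javascript
// If the loaded module was transpiled from ESM, Babel marks it with
// __esModule and its default export lives at mod.default. If it's
// plain CJS, synthesize a default export pointing at the whole
// exports object.
function interopRequireDefault(mod) {
  return mod && mod.__esModule ? mod : { default: mod };
}

// A plain CJS exports object gets wrapped...
const cjs = interopRequireDefault({ readFile: () => {} });
// ...while a transpiled-ESM module passes through untouched.
const esm = interopRequireDefault({ __esModule: true, default: 42 });
```

Node's interop chose stricter semantics instead of this kind of synthesis, which is part of why converted libraries trip over their CJS dependencies.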
Frankly, I like the Deno way of doing things better. I find it annoying, to say the least, that the TypeScript team won't consider allowing you to import with a .ts(x) extension and convert to .js(x) as part of the build process... no, you must omit the extension.
I've been using the import/export syntax since well before it was standardized via babeljs, these days I kind of want to remove webpack/babel from my pipelines altogether and mostly just rely on esbuild. I've also been using/following rome.tools development, having switched over several projects from eslint already, and will probably start with their bundler when it's ready.
I think there's a way to go with tree shaking and static analysis in that direction to reduce load. I also would not mind seeing the efforts to treat TS extensions as comments in the JS engines in that it would be easier to serve up straight TS/JS without bundling/minifying. I'm not sure we'll ever see a return to that in practical terms.
In the end, it's evolving. I'd also like to see Cloudflare, Deno and others come together on a more common server interface so that you can target multiple clouds with a single codebase. I don't know how well that would ever work out at this point though. There's aspects that I definitely like to all of them.
Huge huge agreement.
I forget the specifics, but there was some super-tiny corner case, around default exports maybe, that could potentially create ambiguity, and that spawned multi-year bellyaching about doing anything at all for interop. What Node got was incredibly hard-fought, against much resistance to interop.
But the final compromises made everything so much more painful for everyone. So many esm projects but oh look a .eslintrc.cjs, how unsurprising & sad.
It's extra maddening because Node had a wonderful just-works (except that tiny, tiny corner case) interop via @standard-things/esm, which seamlessly let the two worlds interoperate. It had been around for years before Node started shipping support, and it was no-ceremony, just-works bidirectional interoperability that took basically no effort or thought from the developer's point of view to use. It sucked seeing us walk back from great, mired by frivolous over-concern for an obscure corner case.
I do much like the `import` syntax personally, and it's a little cleaner to read, but CommonJS and AMD were the undisputed winners among module formats until ES Modules were born. Not that I have a problem with ES Modules, I don't; however, I am interested in what was so insufficient about the preceding formats that we couldn't have standardized on them
EDIT: I know about the deal with CommonJS being synchronous. That isn't per se an issue, I don't think, especially because AMD built on top of CommonJS primitives, and with minimal refactoring CommonJS code could be used in the browser when defined this way if asynchronicity is a must. Generally, what I "imagine" browsers doing with CommonJS is making the `require` calls async in the background (i.e. not visible to developers) so they can resolve the modules and then parse the code. This isn't terribly different from how import statements work today.
I'm wondering why we didn't undertake the work to just improve the existing format, more or less.
EDIT 2: I'm interested from a historical perspective. I think ESM is the right choice and 100% the future.
This is because of a much deeper issue: static analysis is highly complex with the near-free-for-all that is CommonJS require & module.exports syntax. ES Modules is stricter and much easier to statically deal with.
At a high level, why? You can throw just about anything into a `module.exports` statement, and the syntax to `require` it also has a lot of leeway. You can actually see the code for this in the Node codebase: module resolution is handled in JavaScript at /lib/internal/modules/cjs/loader.js vs /lib/internal/modules/esm (heads up, both approaches are a lot to grok).
Understand that with the CJS approach, you can dynamically export modules at runtime under whatever name you wish, with whatever value you want, which may even include dynamic require statements themselves. Nightmare for static analysis.
It makes a lot more sense if you try it for yourself. Build a module resolution algorithm including: determining all the imports, all the files those imports are from, mixing with 3rd party and local imports, and building that chain recursively.
You can do it, but the edge cases surrounding CommonJS make it super difficult. I'd go so far as to say it's basically impossible to get 100% success in all the desired scenarios without directly invoking the code.
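To make that concrete, here's the kind of perfectly legal CJS whose export surface can't be known without running it. (Sketch only: `fakeModule` stands in for the ambient `module` object so this runs anywhere, and `LEGACY_API` / the commented-out path are made up.)

```javascript
// Simulating a CJS module object; in a real module this would be the
// ambient `module` provided by the wrapper.
const fakeModule = { exports: {} };

// Export names computed at runtime: a static analyzer cannot know
// which keys will exist without executing the module.
const mode = process.env.LEGACY_API ? 'legacyImpl' : 'modernImpl';
fakeModule.exports[mode] = () => 'picked at runtime';

// The require target itself can be dynamic too (commented out because
// the path is invented):
//   const impl = require('./impls/' + process.env.IMPL_NAME);

// And the whole exports object can be swapped out mid-file:
if (process.env.LEGACY_API) {
  fakeModule.exports = function legacyEntryPoint() {};
}
```

None of this is exotic; plenty of real packages do conditional exports like this, which is exactly what defeats static resolution.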
I think dynamic imports have some of the same footguns here, to be honest. Can't deny ESM is easier to statically analyze though, that much appears to be true across the board based on available evidence.
I mean, I guess people will directly write AMD modules, and make modules using some giant script that uses cat, but the future of JavaScript lies with making each source file a valid, correct piece of JavaScript. When each source file is valid and correct, and doesn’t need to be preprocessed in order to work, your tooling will work a lot better.
The browser authors know you can’t un-ship JavaScript features. ES6 import/export is damn good stuff, and people in the browser aren’t saddled with some weird compatibility shim like AMD.
The adoption of ES6 modules in the client-side landscape has far outstripped its adoption in Node.js. I honestly can’t wait for require() to die, in both its cjs and AMD variations. The tooling support for ES6 modules is miles better.
I don't know if that's the entire story -- probably not -- but I do know that is one major differentiator for things like generating import-graphs and performing tree shaking.
(you can still do like `import('foo' + someVar)` which will only invoke dynamically at runtime, so I'm not sure how that case is dealt with)
That case is dealt with more like `fetch('foo' + someVar).then(r => eval(r.text()))` or similar (though of course it is not just an eval; it instead returns the exports of the module).
Dynamic imports and static ones behave very differently and static analysis generally ignores dynamic imports IIRC.
You also need to treat dynamic imports as async including everything that comes with that (error checking, awaiting, etc.)
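As a sketch of that (the locale path and module shape here are hypothetical):

```javascript
// Every dynamic import is a promise, so each call site has to handle
// awaiting and failure explicitly, unlike a static import, which fails
// the whole module load up front.
async function loadLocale(lang) {
  try {
    const mod = await import(`./locales/${lang}.js`); // made-up path
    return mod.default;
  } catch {
    // the chunk may simply not exist at runtime; fall back
    return { greeting: 'hello' };
  }
}
```

A bundler seeing `./locales/${lang}.js` can at best guess a directory glob; it cannot know the exact set of modules the way it can with static imports.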
1. Because `require()` is "just a magic function", it can't be statically analyzed by a JS runtime prior to actually running the code. This leads to limitations with regards to tree-shaking and other optimizations.
2. The last point leads to the even bigger (and probably "deal-breaker") reason for the change, the desire to fetch packages from URL sources. Since the syntax cannot be parsed efficiently, runtimes like Deno and Bun would have a much harder time fetching resources from URLs prior to running the code. The idea here, IIRC, was to eliminate the install step, the need for centralization on a single package manager and registry, and a general "non-Web" approach to the idea of packages and modules in JS.
I believe the `import` syntax was chosen to allow transitions away from `require()`, so that your programs wouldn't just stop working if ESM was enabled.
The second point actually isn't strictly valid. I've written my own "all-in-one" async custom loader [0] that can require() CommonJS/AMD includes, regular "add a script tag" includes w/out any exports, or even css stylesheets all asynchronously, with asynchronous dependency trees for each async dependency in turn. You can define in the HTML source code a "source map" that maps each dependency name to a specific URL, so that you don't need knowledge of the filesystem tree to load dependencies.
Ideally, this source map can be generated via the tooling you use to compile the code (e.g. `tsc` is aware of the path to each dependency) but I haven't written my own tool to generate the require path to url map.
I’m going to guess the good faith answer really involves some version of “CommonJS has some shortcomings and we didn’t want to confusingly write mostly-same syntax so we designed something new based on ideas from numerous languages.”
A bare-bones implementation of AMD could be put together with less than a kilobyte of JavaScript (this is what we used at Mozilla for a minute circa 2012). Meanwhile, the ECMAScript folks were working on ES6, which was going to have a module system. Why would the browser build in support for a highly-opinionated system that you could implement yourself so trivially, all while a TC39-blessed standard was in the works?
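To illustrate the "less than a kilobyte" claim, here's a toy AMD-style loader. (Names like `requireMod` are mine; a real loader like RequireJS also fetches scripts over the network, while this sketch only resolves modules defined in the same file.)

```javascript
const registry = new Map(); // name -> { deps, factory }
const cache = new Map();    // name -> resolved exports

// define(name, deps, factory): register a module without running it.
function define(name, deps, factory) {
  registry.set(name, { deps, factory });
}
define.amd = {}; // AMD loaders advertise themselves this way

// requireMod(name): resolve a module, recursively resolving its deps
// first, and memoize the result.
function requireMod(name) {
  if (cache.has(name)) return cache.get(name);
  const mod = registry.get(name);
  if (!mod) throw new Error('module not found: ' + name);
  const resolved = mod.deps.map(requireMod);
  const exports = mod.factory(...resolved);
  cache.set(name, exports);
  return exports;
}

// Usage: classic define() calls with a dependency array.
define('math', [], () => ({ square: (x) => x * x }));
define('app', ['math'], (math) => ({ answer: math.square(7) }));
```

The hard parts a real loader adds are script fetching, plugin hooks, and configuration, not the core resolution algorithm.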
> what I "imagine" browsers doing with CommonJS is making the `require` calls async in the background (IE non visible to developers) so they can resolve the modules then parse the code
That's not possible. You need to run the code to know what's being required: if I call `require('./' + getModuleName())`, you don't know what's being required until `getModuleName()` is evaluated. So you actually need to start running the JS, pause execution of the code calling `require()` (a la `alert()`), download the required file, then parse and execute it before resuming. Each file would need to be downloaded/parsed/executed _synchronously_ in the order each `require()` appears: it's only async insofar as the JS pauses execution and picks up later.
> This isn't terribly different from how import statements work today.
Not so. You can find and resolve `import` statements (note: not `import()` calls, though these return Promises) without executing a JS file. You can parse the imports out of a file in one pass and fetch/parse/repeat for each import in the dependency tree before anything starts executing. Since "native" imports are static and declarative, you can resolve all of them without ever executing any code. And any dynamic imports return promises that the programmer needs to explicitly handle the behavior of at runtime.
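A deliberately naive sketch of that property: the regex below is nowhere near a real parser, but it makes the point that the dependency list falls out of the source text alone, with no execution.

```javascript
// Example module source; we never run it, we only read it.
const source = `
import { readFile } from 'node:fs/promises';
import config from './config.js';
export const x = 1;
`;

// Crude pattern for top-level static imports (real tools use a full
// ES parser, this is illustration only).
const importRe = /^import\s+(?:[\w{},*\s]+\s+from\s+)?['"]([^'"]+)['"]/gm;
const deps = [...source.matchAll(importRe)].map((m) => m[1]);
```

That's the whole trick: static `import` is restricted syntax, so the module graph is computable before any code runs.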
> just improve the existing format
1. You'd have to kill dynamic requires (passing anything other than a string literal to `require()`), which would be impossible to do without breaking compatibility and couldn't be polyfilled.
2. AMD allowed a callback syntax for `require()` (it came out years before promises), which is cumbersome. Adding promises later would have been challenging and would have left technical debt.
I wrote an "aio loader" many years ago that can load (in the browser) AMD/CommonJS/node or just "include this script in your html" dependencies that asynchronously loads dependencies (and their own dependencies) with support for use via plain `require()` without callbacks, `require(foo, foo => {})` callback support, and even dynamic async loading (`var App = await requireAsync("foo")`).
I never published it publicly (it's just ticking away on our production sites) but I was motivated to push it to GitHub just now [0].
I only started my career in earnest in 2012, but even then compatibility with old versions of IE was a major concern, due to their high market share.
IE6 was officially retired in 2014, but even then it still accounted for 4.2% of the traffic:
https://www.computerworld.com/article/2488448/ie6--retired-b...
Then there were IE8-11, but it was IE6 which lingered way past its welcome, considering it was originally released in 2001.
This existed: the UMD module format was the turducken you got if you built modules to work both as AMD and CommonJS at the same time. AMD wrappers, async require, and a bunch of boilerplate to determine if the module was being loaded by an AMD loader or in a CommonJS environment (or worst of all, a CommonJS environment with AMD loader primitives).
It was a lot of ugly boilerplate. I don't think I ever saw a project intentionally write UMD modules by hand. I do recall some Typescript projects that distributed as UMD modules for a while, because that was boilerplate Typescript was always good at streamlining.
> I do much like the `import` syntax personally, and it's a little cleaner to read, but CommonJS and AMD were the undisputed winners among module formats until ES Modules were born. Not that I have a problem with ES Modules, I don't; however, I am interested in what was so insufficient about the preceding formats that we couldn't have standardized on them
I think it is absolutely the syntax that needed standardizing. AMD was always a hack for module loading using available browser tech as best as it could and screaming for better syntax. There was so much pain every time working with AMD in making sure that define() wrappers were correct and the list of dependencies correctly matched the names and order of those as parameters of the module's function wrapper. AMD was always in desperate need of an import syntax. (One of the reasons Typescript was built was to provide such an import syntax ahead of ESM standardization. It's why I started using Typescript in the 0.x wilds.)
In many ways ESM was always the natural improvement of the AMD format. One of the things that hung up browser standardization at various stages was debate about how compatible to be with AMD. There were multiple attempts at, and a lot of debate about, "Loader APIs" that could be extension points to directly interface classic AMD loaders such as Require.js with the browser's. Had one of those Loader APIs made the final cut, it likely would have been possible to "natively" import legacy AMD directly from ESM.
Loader APIs lost to a number of factors including complexity and I think also the irony that CommonJS won the "bundler war" while those debates were going on. I think it must have seemed that the writing was on the wall that AMD compatibility was no longer that useful and Loader APIs were never going to be great for CommonJS compatibility (again, because of those synchronous assumptions that doomed CommonJS to always be the nemesis of browser modules).
(The dying compromises of the "Loader APIs" tangents is what eventually delivered importmaps.)
AMD compatibility without "Loader APIs" is basically impossible. Even though Require.JS was quite dominant, it was never the only loader, and part of its dominance was it was an extremely configurable loader with tons of plugins. There wasn't an "AMD loader standard" that browsers could emulate.
I generally do think that ESM is what we got trying to fix the syntax needs of AMD and clean up and actually standardize the AMD loader situation. In the end it didn't end up backwards compatible with AMD like it tried to do, but from my impression it certainly tried and that was unfortunately part of why ESM standardization was so slow and what led to such a larger mess of CommonJS modules in the wild in the time that took.
Instead, I had another idea: import @azure/service-bus dynamically, using ES dynamic `import`, in place of static imports. But since I'm using Node.js, I would have to set the package.json type to module so I could use top-level `await` with dynamic `import` to load, with the help of a ternary, the correct implementation on the fly.
I'd had an extremely bad experience trying to convert a CommonJS project to ES Modules before, so I did not go through with the plan.
Finally, after spending some time trying to not use CommonJS I gave up, and in place of dynamic import I used the "good" ol' `require()`, ending up with something like this:
const { ServiceBus } = (isDev ? require('./my-service-bus') : require('@azure/service-bus')) as import('@azure/service-bus');
And that was that.
Project maintainers have to make ES Modules practical before it's pretty.
I think there should be more praise on these guys for what they accomplished given the state of JavaScript when they started. They saw a problem and came up with a solution. Was it perfect? No, but it's not this abominable creation.
Much like John Resig's work on jQuery nudged JavaScript forward, so did the work on CommonJS/Node.
CJS was doing just fine in Node.js for nearly a decade before ESM came along and made everything more difficult by shoving browser constraints into a server-side runtime. ESM may be the right direction for the whole ecosystem in the long run, but it's a little backwards to say the perfectly good incumbent system is "hurting" the language because everyone who invested in it doesn't want to go through the pain of migrating to a new fashionable system that is worse in many ways.
"In 2009, CommonJS was exactly what JavaScript needed. The group took a tough problem and forced through a solution that continues to be used millions of times a day.
But with ESM as the standard and the focus shifting towards cloud primitives — the edge, browsers, and serverless compute — CommonJS simply doesn't cut it. ESM is a better solution for developers, as they can write browser-compliant code — and for users, who get a better end experience."
We all lose the ability to simply have a local index.html file, and have it Just Work (TM):
<script src="script.js"></script>
This ability is amazing for demos, fast iteration, onboarding new devs, and developing without a ton of layers of JS ecosystem machinery. Deno doesn't care about retaining this level of developer experience because Deno is marketing its own runtime / build step / ecosystem.
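For contrast, the ESM equivalent of that tag needs `type="module"`, and module scripts are fetched with CORS, so the local-file workflow that Just Works with the classic tag breaks:

```html
<!-- Works over http://, but from a file:// URL most browsers refuse
     to load it (module scripts are subject to CORS), unlike the
     classic script tag above. -->
<script type="module" src="script.js"></script>
```

So even the smallest ESM demo tends to need a local dev server.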
Shouldn't that be a relative path for it to actually work as you intend?
I think it may be necessary/prudent to get some level of JSX support into browsers, much like the ts as comments efforts. Not sure how that will/would land. I was a pretty big fan of E4X, and had a prototype similar to React about a decade before it. In the end, who knows.
I can't think of a concrete benefit to a developer that ESM brings (just pain, but maybe I'm biased by what I'm exposed to). Probably why it's so slow to be adopted.
Also, even in the bundler world, some of the modern bundlers (esbuild and swc) now bundle directly to ESM as the target. Lazy-loading boundaries and common/shared-code boundaries are just ESM imports, and there's no "runtime emulation" there, just native browser loading at that point. They are just taking "small" ESM modules and making bigger ones.
ESM may very well be the module system designed for a world that'll actually never exist, and will mostly just be an ill defined compilation target. But hey, maybe the next web module system will do better - those wasm working group people are working hard on their module system - and it's intended as a compilation target from the start, so shortcomings in it can be patched over by tools from the start :)
That said, we aren't that far off. Many sites are spewing several MB of JS on load, and it's relatively well-performing even on modest phones these days, at least relative to '90s dialup, when load times were commonly measured close to 15s. Things are absolutely snappy (mostly). I think the biggest hurdle today is sheer entropy. React+MUI+Redux is IMO pretty great, and getting to something similar in pure JS would take a lot of effort. Not insurmountable, but significant. There's still a new framework of the month nearly every month in the JS space.
Getting movement is hard. It'll take time and persistence.
This sentence sounds OK until Python and Ruby are held up as the apparent gold standard of server development. That's not really the case, I think?
But I can confirm that deploying Python or Ruby apps is generally a bigger PITA.
Node has CJS and ESM; the browser has legacy (creating sub-objects on the global or window object) and ESM.
I tried to write a polyglot, but was not successful. https://stackoverflow.com/questions/48396968/72314371 proposes a clever polyglot exploiting the fact that `await` is parsed differently at the top level depending on whether you're in ESM. However, this doesn't help, because the import and export keywords always fail hard (not catchable by try-catch), and eval (yuck!) doesn't help because inside eval you are legacy.
So you have to bundle if you want to provide for everybody... (shrugs)
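FWIW, the non-bundling escape hatch today is a dual package: ship both builds and let package.json conditional exports pick one per consumer. A sketch (the package name and file paths are illustrative):

```json
{
  "name": "my-lib",
  "exports": {
    ".": {
      "import": "./dist/index.mjs",
      "require": "./dist/index.cjs"
    }
  }
}
```

It still means two builds, but each consumer gets a file in the flavor its loader understands, without a single-file polyglot.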
As a language, JavaScript may support loading from the file system or the web. But code reuse on both client and server should not be one of its use cases.
If I were to write a project this way, what would it look like?
Assume I write some server code in a server project that I want to use in the client as well. How would I access that code from the client? I'd need to make it accessible over the internet. But it's in a project that has other code as well, some of which I do not want accessible over the internet. What do I do? I'd write rules in my web server to only allow access to specific files. From here on, this process becomes complex and only gets worse.
I think ESM modules are most of the time desirable for developers of small to mid-size packages. These developers want their packages used both on the server and in the browser.
On the other side, heavy-duty packages built to handle production loads (see Fastify) should use CommonJS. In my opinion production loads should not have importable elements; they should stay private and just run and get the job done. At the same time, it is really not that important for such loads to have good loading times for the micro-packages they use.
This is debatable, but yet another aspect to consider: server side production loads must use CommonJS.
What's actually preventing this from happening is the fact that not all official packages are ESM ready.
So the solution might be actually very easy. Set a deadline for the implementation of ESM and send frequent notifications to developers.
After the deadline, all those running production loads would have to switch to ESM syntax when updating. This seems simple and organized.
The problem has always been, and continues to be, stubborn backward compatibility. Node is making the same mistake that Microsoft made with Windows, that Apple did not make with OSX - rather than letting go of a system that's been outgrown and forcing the userbase to grow, they cling to the old way and the old API, allowing it to ferment.
This is the fault of the team working on Node and leadership making poor choices, including TC39.
I think this forced ESM migration is causing a lot of vulnerabilities in code bases, given how painful converting projects to ESM is. I am busy enough trying to get tests to pass for the new version of a package that went ESM-only.
I wish they would publish both CJS and ESM for packages that target Node. Fine, go ESM-only for browser-only packages.
It's rare that standards don't beat out non-standards in adoption.
ESM is the standard; there is only one way this ends, and it's not in favour of CJS.
Your "everything" somehow doesn't account for stuff that wasn't written in NodeJS's standards-incompatible way to begin with.
// an async function already returns a promise, so wrapping require() is enough:
async function esmodule(commonjsModule) { return require(commonjsModule); }
?
There are also some unsolved mysteries surrounding the path resolution defined by a package.json file, but at least there's now a proper way to have a package use project-root-relative imports. Things usually go well until you get back to the browser, which now needs an Import Map to bridge the two worlds. I still haven't figured out how to wean off NPM either, since all the magic compiling CDNs use its namespace to create browser-friendly bundles…sort of where we started from.
There are a few footguns with bundles now too, like deduping React so hooks work, along with some surprises about modules being stateful. And while Deno is pushing the dream forward, I can't help but feel they compromised the vision too far for Node compatibility. At this rate, I could see Node v30 being a merge of the two projects.
I’m honestly happy it’s all coming along. It seems like this is JavaScript’s Python 3 moment, where everyone has to rewrite code to slightly new paradigm for the next generation of apps to fully appreciate. I’m most thankful for async imports operating like ordinary Promises!
That's just not good enough, it needs to strangle, crush and bury javascript...
But now, I get to fight importing dependencies with cancerous ESM design and think about language basics, while spec writers revert us all to CS 101 students trying to figure out how to do elementary things.
Which isn't going to change any of my mature code, either, I'm going to wrap the entire script in:
(async () => {
  // ...the entire existing script...
})();
and then, inside it: const { default: foo } = await import('bar');
So, has anything meaningful changed? No. I don't need to tree-shake on the server, and if you're tree-shaking on the client side, you've already screwed up and you're too inexperienced to realize it. /s