(I tried opening an swc issue about optionally using typescript ast info (via a plugin, not in swc core) to have more correct usage-based polyfill detection, but that was closed as unlikely to be acted upon.)
That mirrors my experience working in various projects too. The automatic polyfilling story is such a good thing in theory, but reality isn't as rosy, and many more polyfills than necessary get included.
Genuinely curious why anyone would target IE6 in 2023. Is it a personal goal to have massive coverage for your library?
The reason those PRs were never opened/merged is the maintainer of many of those libraries [has a strong stance on "breaking" changes][2] in software:
> I have developed an intense avoidance for breaking changes in all of my packages, because I don't want to inflict hundreds of millions of dollars of person-hour cost on the entire industry unnecessarily.
IMO this argument ignores the opposite cost: people then spend a ton of time (and money) trying to make old tech work with newer tech, since not everyone maintains to the same standards of backwards compatibility.
But regardless, no one is required to stick to a particular way of creating open source software, so the one benefit here is that you are free to [fork the library][3] (assuming its license allows for that) to remove some backwards compatibility that isn't relevant to you.
[1]: https://twitter.com/ljharb/status/1704912065486618915
[2]: https://github.com/import-js/eslint-plugin-import/pull/2447#...
[3]: https://www.npmjs.com/package/react-outside-click-handler-li...
If you do some simple digging into these libraries, you will find that these types of commits are quite common within them.
https://github.com/jsx-eslint/eslint-plugin-react/commit/e1d...
https://github.com/jsx-eslint/jsx-ast-utils/commit/bad51d062...
https://github.com/jsx-eslint/eslint-plugin-jsx-a11y/commit/...
He would rather see the download counts of these polyfill libraries (https://github.com/ljharb/ljharb#projects-i-maintain) increase than assess the health of the JavaScript ecosystem.
However, there are some things he does that are incomprehensible.
For example, Enzyme. Over three years ago this issue was opened for Enzyme on React 17: https://github.com/enzymejs/enzyme/issues/2429
Nothing moved for a while, and I think he said something along the lines of "if you want React 17 support, stop complaining and help". So the community got involved. There are multiple PRs adding React 17 support, and many unofficial React 17 adapters. A lot of people have put a lot of work into this, ensuring compatibility, coverage, etc. Yet to this day, none of them have been merged. E.g. https://github.com/enzymejs/enzyme/pull/2564
Given the amount of time that has passed, and the work the community has put in, something is amiss. It feels like he’s now intentionally avoiding React 17+ support. But why? I don’t understand why someone would ask for help then ignore the help when it comes in. That isn’t much better than the swathe of rude/entitled comments he was getting on the issue before he locked it.
I ended up migrating to RTL, but this made many of my tests more complicated (especially compared to shallow rendering).
There are a handful of rules that are nice to have in a TypeScript project to make sure devs don't do things that break type safety. Plus some that keep mistakes from slipping through (even when the code is reviewed).
One thing I've found super useful is to have @typescript-eslint/ban-ts-comment enabled, but configured so that you can still use @ts-expect-error and the others as long as you provide a comment when doing so. This is so nice when doing code reviews, either someone has provided a very good reason for the exception, or it is clear that the same result could have been achieved with a better approach. Same goes for disabling eslint rules inline in a file, also allowed if commented. I feel that this is a very good compromise between being strict with linting, but also not letting linting get in the way.
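A minimal sketch of such a setup, using the documented options of the `@typescript-eslint/ban-ts-comment` rule (verify option names against your installed version):

```javascript
// .eslintrc.cjs — allow @ts-expect-error only with a justification comment
module.exports = {
  plugins: ['@typescript-eslint'],
  rules: {
    '@typescript-eslint/ban-ts-comment': [
      'error',
      {
        'ts-expect-error': 'allow-with-description',
        'ts-ignore': true,   // still banned outright
        'ts-nocheck': true,  // still banned outright
        minimumDescriptionLength: 10,
      },
    ],
  },
};
```

With this, `// @ts-expect-error: upstream types are wrong for X` passes, while a bare `// @ts-expect-error` is flagged.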
If you coded alone for 10 years and then add a strict linting config you're going to have a really bad time.
If you actually follow the advice a linter gives you, you come out a 10x better developer.
Of course not all lint rules are created equal, but some are arguably existential.
I think that's the most important takeaway here. The library author decided not to make breaking changes and to keep compatibility wherever he can. I don't think that's a requirement many people have (not to this level, anyway), but it's not unreasonable either.
No project is required to use or accept his code. People want qs, resolve, and nvm.sh, and this one person is willing to provide it to everyone for free.
I don't care if he refuses suggestions because of "breaking changes" or because "they don't spark joy". It's his project, you can disagree with him all you want, but you can't complain that the free work he's doing isn't to your liking.
I think it's telling that a lot of people are willing to argue with the maintainer but very few people are willing to step up to provide and maintain a better fork.
Docs should show the recommended version (modern) and show what options are available to go deeper.
Obviously adding those settings for every polyfill is non-trivial, but burdening everyone with every polyfill ever is also suboptimal. If anything, this would make cleanup easier going forward since it would all be classified.
Usually it's the top-level application's author who chooses and configures polyfills.
Now one may reasonably ask, why doesn't the library just call Object.defineProperties directly, and tell the user to install the appropriate polyfill?
I'm going to guess that a library that Just Works after an npm install will see much better adoption than one that requires each user to configure their babel/swc/etc. correctly, especially since the library can be a dependency of another library.
There's currently no standardized mechanism in the npm ecosystem to do the equivalent of "Install this library, and also configure your environment to pull in all required polyfills" so that the required functionality is available in global scope. One reason is because the transpilers that automatically polyfill into global scope are third-party tools.
Maybe a standard mechanism like this should exist, but it doesn't today, hence the quite reasonable choice of library authors to directly use polyfills because doing so:
1. Avoids polluting the global namespace, since no polyfill is applied globally
2. Works as a dependency without additional configuration by the user
3. Preserves backwards compatibility
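The "call the polyfill directly" pattern described above might look something like this sketch (the fallback implementation here is hypothetical, not any particular package's actual code):

```javascript
// A hypothetical fallback for environments lacking Object.defineProperties.
// Note it leans on Object.defineProperty, which old engines supported earlier.
function definePropertiesFallback(obj, props) {
  for (const key of Object.keys(props)) {
    Object.defineProperty(obj, key, props[key]);
  }
  return obj;
}

// Prefer the native implementation when it exists; otherwise use the fallback.
// Nothing global is patched — callers import this binding directly.
const defineProperties =
  typeof Object.defineProperties === 'function'
    ? Object.defineProperties
    : definePropertiesFallback;

// (A real package would export `defineProperties` here.)
```

Because the function is imported rather than installed on `Object`, two libraries can use different versions without stepping on each other.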
A somewhat cheap fix to at least reduce duplication of polyfills would be for libraries that need polyfills to accept a wide version range. That would give the package manager room to pick a version that's compatible across call sites.
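Concretely, that means declaring a semver range rather than a pinned version (package name illustrative):

```json
{
  "dependencies": {
    "some-polyfill-package": "^1.0.0"
  }
}
```

A caret range like `^1.0.0` lets npm/pnpm/yarn deduplicate to a single 1.x copy shared by all dependents, whereas an exact pin like `1.2.3` forces duplicates whenever dependents disagree.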
It wasn’t and isn’t uncommon to pull down a dependency from npm and expect it to work in multiple runtimes.
Maybe a step to a more sane situation would be reducing redundancies between polyfill libraries to ensure they don't step on each other's toes.
> The new dependencies were all polyfills for JavaScript functions that have long been supported everywhere. The Object.defineProperties method for example was shipped as part of the very first public Node 0.10.0 release dating back to 2013. Heck, even Internet Explorer 9 supported that. And yet there were numerous packages that depended on a polyfill for it.
Unfortunately, that, compounded with browsers of the era (Internet Explorer...) having basically zero support for modern JavaScript, led to a proliferation of dependencies, polyfills, etc. that are nearly impossible to remove from the ecosystem.
I've not seen a lot of Node apologists that are fine with the current ecosystem. The problem is that righting the ship is going to be terribly hard. Either existing frameworks/libraries need to go through the effort of saying "OK, do I really need is-even? Let's remove it", or we need new frameworks/libraries to abandon those tools and that ecosystem in favor of fatter and fewer dependencies.
I think the issue all stems from the fact that before 2010ish, there was one library and one framework, jQuery (OK, there were others... but were there really?), and it added a good 1MB to any webpage. The notion was that we could do more with less if we had a bunch of smaller deps that didn't all need to be brought in.
I remember this era. I've been using Node.js since before npm existed, and so many silly things have happened in that time.
I think the core problem the JS ecosystem has always had is that most JS developers are relatively inexperienced. (JS is very beginner friendly and this is the price we pay). I still vividly remember being at nodecamp in 2012 or something listening to someone tell me how great it would be if the entire OS was written in javascript. It didn’t matter how much I poked and prodded him, he couldn’t hear that it might not be an amazing idea. I think he thought it would be easier to reimplement an OS kernel in JS than it would be to just learn C. And there were lots of people around with a sparkle in their eye and those sort of wacky ideas - good or bad. It was fun and in hindsight a very silly time.
So yeah, of course some idiot in JS made is-odd. And is-even (which depends on is-odd). I see all of this as the characteristic mistake of youth - that we go looking for overly simple rules about what is good and bad (JS good! C bad!) and then we make a mess. When we’re young we lack discernment about subtle questions. When is it better to pull in a library vs writing it yourself inline? When is JS a good or a bad idea? When should you add comments, or tests? And when do you leave them out?
Most of the best engineers I know made these sort of stupid philosophical mistakes when they were young. I certainly did. The JS ecosystem just suffers disproportionately from this kind of thing because so many packages in npm are written by relatively new developers.
I think that’s a good thing for our industry as a whole. But I also get it when Bryan Cantrill describes JS as the failed state of programming languages.
Stating that you maintain 800 NPM libraries brings more clout and money than maintaining a foundational one.
Even with foundational packages things tend to go wrong. Why add features to an existing package if I can write several plugins? Or even worse in some cases: why use the existing configuration file if I can instead just ask users to install dozens of dummy packages that only exist to trigger a feature in my Core package?
I was sympathetic to that idea then, it sounded good in theory, however in practice it was horrible.
Today I enjoy coding in "bloated" languages with only a few external dependencies.
I always thought about something like this; on-the-fly manipulation of packages via SWC would be pretty fast, I think.
How should NPM prevent archaic dependencies, or the "even more bizarre" (author's words) problem of developers calling polyfills directly instead of the function that the polyfill fills?
You can always fork the projects with this guy's polyfills if you want, but you'll end up forking quite the collection of projects. Most of them are very minor and only end up polyfilling anyway, so you can probably get rid of the packages you don't want in an afternoon. Fork them and maintain them yourself if you're so inclined, don't complain that the free work he's doing for you isn't to your specifications.
One thing I've noticed is the rampant duplication of polyfills and babel helpers. To the point that I now have overrides setup via pnpm and I re-write many imports of polyfills to point at my own shims, which simply re-export existing functionality native to the language, most of the time.
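As an illustration of that setup (the `@myorg/*` shim package names are hypothetical), pnpm supports an `overrides` field in `package.json` that redirects every transitive dependency on a package to something else:

```json
{
  "pnpm": {
    "overrides": {
      "object.values": "npm:@myorg/object-values-shim@^1.0.0",
      "array.prototype.flat": "npm:@myorg/array-flat-shim@^1.0.0"
    }
  }
}
```

The shim packages can then be one-liners that simply re-export the native built-in (e.g. `module.exports = Object.values;`), so every dependent ends up on the engine's own implementation.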
For smaller utility packages, I often simply clone the repo and copy things over that we need, or copy the src right out of the node_modules folder if possible, then I strip away all the superfluous imports (and often convert from commonjs to ESM if needed)
Saves so much headache; it's better for users, smaller builds, etc.
You wouldn't happen to have an example of what you're doing laying around would you? I'd be genuinely curious to try stuff like that out.
Thanks for the kind words! It's feedback like this that encourages me to keep writing about it.
I share your experiences regarding babel helpers and haven't found a good solution myself. Similar to you, I often patch unnecessary stuff out via patch-package, but that approach doesn't scale well.
Patching packages is definitely something I still have to do to strip polyfills and convert CJS to ESM if I can’t simply re-compile source
This is what's sometimes called a "ponyfill". The idea is to avoid messing with global scope (monkeypatching), which could be problematic if you have multiple polyfills for the same API or polyfills that don't perfectly match the native behavior.
This can be a good thing in some situations, but in general it's probably best to leave polyfill decisions to the bundler so you can decide which browsers you want to support. Or even produce multiple versions, a lightweight one for modern browsers and one with tons of polyfills that gets served to ancient ones.
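For instance, many bundler toolchains (Babel's preset-env, swc, and friends) read a Browserslist config to decide what to downlevel and which polyfills to inject; a sketch of such a config (targets are examples, not a recommendation):

```
# .browserslistrc — adjust to your actual support matrix
defaults
not IE 11
not op_mini all
```

The same targets can then drive both the transpilation output and the polyfill set, instead of each library guessing on its own.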
Good point. Agree that the ideal scenario would be that the end user (or the tools they use) have the final say in which polyfills to load. It's a bit of a bummer that they are shipped as part of npm packages without an easy way to get rid of them.
I wonder if our industry will move to publishing the original source files to npm in the long run. Only the last piece of the chain, the developer using these dependencies, knows what their target environments are. So the bundler could then downlevel or polyfill the code for the specified targets.
Ok, maybe someone else monkeypatched it. But at least you’d end up using the native functionality if it was there.
I suppose in that case you could argue the real bug is in the XHR API, but it only affected the extension because the extension was using a fetch polyfill that relied on it in functions that could be triggered by an external page.
I was surprised to learn that Object.values is only supported in Node v7+, Object.fromEntries was added in v12, etc. So for this project maybe the polyfills are needed.
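The guarded pattern alluded to above (patch the global only when the native implementation is missing, so modern runtimes keep native code) might look like this sketch:

```javascript
// Guarded global polyfill: only installed when absent, so modern
// engines keep their native Object.fromEntries.
if (typeof Object.fromEntries !== 'function') {
  Object.fromEntries = function fromEntries(iterable) {
    const obj = {};
    for (const [key, value] of iterable) {
      obj[key] = value;
    }
    return obj;
  };
}
```

This still mutates global state, which is exactly the hazard the sibling comments describe; it just limits the blast radius to runtimes that actually lack the feature.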
[1] https://github.com/jsx-eslint/eslint-plugin-react/pull/1038
"Should libraries still be polyfilling to support ancient runtimes?" and "How should that polyfilling be implemented?"
Even if a library wants to maintain backwards compatibility, we can still argue this method of polyfilling (especially the phony polyfills) is damaging to the wider JavaScript ecosystem.
In an ideal world, the cost of polyfills for developers who don't need them should be zero.
For developers using bundlers, the bundler is expected to implement any required polyfills for the developer's targeted runtimes, and having the library ship its own polyfills is counterproductive at best. However, I suspect these libraries wish to maintain compatibility for developers not using bundlers.
Maybe npm should be upgraded to support multiple variants of packages? That way these libraries could ship both polyfilled and non-polyfilled versions of their packages in parallel.
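Something in this direction already exists via conditional exports: Node (and most bundlers) resolve the `exports` map in `package.json` based on conditions, and custom conditions can be selected with `node --conditions` or a bundler's resolve settings. A sketch (the `legacy` condition name and file paths are made up):

```json
{
  "name": "some-lib",
  "exports": {
    ".": {
      "legacy": "./dist/index.polyfilled.js",
      "default": "./dist/index.modern.js"
    }
  }
}
```

A consumer that needs the polyfilled build would opt in explicitly (e.g. `node --conditions=legacy app.js`), while everyone else gets the lean default.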
I wish we as an industry would find a better way to adapt to this. It's a bit unfortunate that the polyfills are part of the library code itself, which makes it difficult to get rid of them once they're no longer needed.
If the package-lock.json file gets deleted or someone runs a global npm update, npm will update packages while respecting semantic versioning.
It's possible an organisation forgot to include the package-lock.json file in their deployment image, so they get updated npm packages every time they redeploy. It's also possible a developer making minor changes to a legacy system triggers package updates, perhaps without even noticing.
Each tiny module did just a bit more than it should have done or included just one more dependency than was necessary, sometimes the scope of the module would grow over time and all this added up. Also, different sub-modules used different sub-sub-modules for the same functionality so this caused a lot of duplication in the higher level modules.
For my own open source project, I've always been very careful about which dependencies I use. I favor module authors who try to keep their number of dependencies to a minimum. A lot of times, it comes down to figuring out the correct scope of the module... Most low level libraries should not need to do their own logging; therefore, they should not need to include sub-modules to colorize the bash output; instead, they should just emit events and let higher level modules handle the logging. Anyway there are many cases like that where modules give themselves too much scope.
You should seriously think about consolidating them into a book. Something I notice other engineers struggle with is how to properly assess performance, read heap snapshots, or even understand how to read a flamegraph for stack tracing tools. It would be nice to point them to, or buy them, a resource showing this.
I'd definitely buy a copy.
People really have no mercy upon themselves, to deal with the bloated crap of the [struggle-stack™](https://twitter.com/brianleroux/status/1643337745463644160)
All of the cruft that you don't use will get optimised away by the compiler, right?
I'm not aware of any production ready WASM frameworks, but I'm ready for it.
2. Whatever dependency hell exists in the source language still exists at WASM compilation time
3. There will never be WASM frameworks because they're generally not the bottleneck.
The closest thing to a WASM framework was Cappuccino, which let you compose a whole application in a language close to Objective-C.
2. True, but compilers are generally better than transpilers.
3. Have you seen https://yew.rs/ ?
Are Bun and Deno solving this problem to some extent?
To me, node.js/bun/deno need a batteries-included stdlib like what Python provides.
Deno encourages you to publish the original sources, which can even be in TypeScript if you want. Users stay very close to the newest Deno release, and barely anyone remains on old versions. This works because Deno takes semver very seriously, which in turn encourages folks to upgrade. It removes the need for polyfills and lets you always use the latest JS features.
Disclaimer: I work at Deno