They've added cryptographic integrity hashes (SRI) and the async/defer attributes to the script tag, but something as essential as a fallback when a script or stylesheet fails to load (which the browser is best placed to detect) has no built-in support.
Instead you're left doing JavaScript tricks, which for missing CSS get a little ugly[0]. CDN-with-local-fallback (or vice versa) has been common for decades now, yet there's no official support at all. Honestly, if the integrity attribute is specified, the browser should just be able to fall back to a cached copy it already has (e.g. jquery-1.2.3.min.js has a crypto hash of ABC123, and I already have a file with that hash).
[0] https://stackoverflow.com/questions/7383163/how-to-fallback-...
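For context, the usual workaround looks roughly like this (a sketch only: the local paths, the placeholder hash, the example CDN host, and the `lib-known-class` probe are all assumptions, not anyone's real setup):

```html
<!-- CDN copy with SRI; if it fails to load, fall back to a local copy -->
<script src="https://code.jquery.com/jquery-X.Y.Z.min.js"
        integrity="sha256-placeholder" crossorigin="anonymous"></script>
<script>
  // If the CDN script failed (offline, blocked, bad hash), window.jQuery is unset
  window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>');
</script>

<!-- CSS exposes no global to test, hence the ugliness: probe a known rule -->
<link rel="stylesheet" href="https://cdn.example.com/lib/X.Y.Z/lib.min.css">
<script>
  var probe = document.createElement('div');
  probe.className = 'lib-known-class';   // a class the stylesheet is known to style
  document.body.appendChild(probe);
  if (getComputedStyle(probe).position !== 'absolute') {  // expected value assumed
    var link = document.createElement('link');
    link.rel = 'stylesheet';
    link.href = '/css/lib.min.css';      // local fallback copy
    document.head.appendChild(link);
  }
  document.body.removeChild(probe);
</script>
```

The CSS half is exactly the kind of fragile probing the linked Stack Overflow answer describes: you have to know a rule the stylesheet applies and check its computed value by hand.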
A lot of libraries (jQuery, Lodash, Angular, Vue, React, Bootstrap's JS, module loaders, etc.) aren't simply offering "improved interactivity"; they're offering core functionality. In essence the site runs on these libraries; if you remove them there's nothing left to regress to.
I've worked in several companies and never seen progressive enhancement used. It might have made sense back in the IE6 era when JavaScript was just for whiz-bang; these days JS libraries hold the whole site's data context/state and issue Ajax requests as needed (Vue, Angular, React, etc.). That's core; there's nothing progressive that can be removed.
Progressive Enhancement only makes sense for small toy sites or for academics to play with. Even Netflix's famous examples are about web services going offline, not losing core JavaScript libraries.
I really want this feature. I think there might be some cross-origin issues that need to be handled correctly (e.g. you might be able to fingerprint a user by probing for parts of their cache), but for common things like exact copies of jquery, this would be super useful.
I understand that the CDN version of the library may have already been cached by the browser while visiting other websites, but does it really save that much time/traffic compared to self-hosting?
Say every site you visit has ~350 KB of stuff that would benefit from a CDN: JS, but also some CSS and fonts (Google Fonts, Bootstrap, etc.). If you visit 50 pages a day over a 30-day month, that's a little over 500 MB of data.
0.35 MB × 50 sites × 30 days = 525 MB
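As a quick sanity check of that arithmetic (the per-page figure is the rough estimate above, not a measurement):

```javascript
// Back-of-envelope check of the monthly data estimate.
const perPageMB = 0.35;    // ~350 KB of CDN-able JS/CSS/fonts per page (assumed)
const pagesPerDay = 50;
const daysPerMonth = 30;

const totalMB = perPageMB * pagesPerDay * daysPerMonth;
console.log(totalMB);      // 525 MB per month
```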
That would be a ton of easily avoidable data for mobile plans, depending on where you are. This number isn't 100% accurate though; many "normal" users (read: not techy Hacker News readers) might only visit, say, a dozen sites a day or less (let's ignore apps like Facebook/Snapchat/etc.). Even that might be a stretch.
Then again, students and other "savvy" users might visit hundreds of new sites a day.
For you, the host? Unless you're a massive beast, most of us "hobbyists" fit within the free bandwidth of a $5 VPS anyway.
Original, jQuery CDN:
https://code.jquery.com/jquery-X.Y.Z.min.js
Google:
https://ajax.googleapis.com/ajax/libs/jquery/X.Y.Z/jquery.min.js
Microsoft:
https://ajax.microsoft.com/ajax/jquery/jquery-X.Y.Z.min.js
Microsoft ASP.NET:
https://ajax.aspnetcdn.com/ajax/jquery/jquery-X.Y.Z.min.js
jsDelivr:
https://cdn.jsdelivr.net/npm/jquery@X.Y.Z/dist/jquery.min.js
cdnjs:
https://cdnjs.cloudflare.com/ajax/libs/jquery/X.Y.Z/jquery.min.js
Yandex.ru:
https://yastatic.net/jquery/X.Y.Z/jquery.min.js

KeyCDN has an online asset performance tool that we can use to compare the hosted jquery.min.js files. The numbers included here are the results received (to the San Francisco location), in ms, of [DNS lookup time] / [time to connect to server] / [overhead of TLS connection on individual asset] / [time from client HTTP request to receiving first byte from server]:
Original, jQuery CDN:
https://tools.keycdn.com/performance?url=https://code.jquery...
8 / 2 / 79 / 85
Google:
https://tools.keycdn.com/performance?url=https://ajax.google...
32 / 2 / 132 / 155
Microsoft:
https://tools.keycdn.com/performance?url=https://ajax.micros...
128 / 3 / 122 / 130
Microsoft ASP.NET:
https://tools.keycdn.com/performance?url=https://ajax.aspnet...
128 / 3 / 114 / 120
jsDelivr:
https://tools.keycdn.com/performance?url=https://cdn.jsdeliv...
64 / 3 / 118 / 129
cdnjs:
https://tools.keycdn.com/performance?url=https://cdnjs.cloud...
64 / 2 / 118 / 125
Yandex.ru:
https://tools.keycdn.com/performance?url=https://yastatic.ne...
32 / 139 / 667 / 993
When I tested them, only jsDelivr and cdnjs/Cloudflare received green results (under 200ms time to connect and under 400ms time to first byte) from all 16 worldwide test locations. Averaging the results between these two across the 16 locations, I would go with jsDelivr, which had the faster average TTFB. The fact that they combine Cloudflare, Fastly, StackPath, and Quantil (whom I had never heard of until today) might explain their global results.
Nowadays, with all the Node.js tooling around modern front-end work, I don't see the point of embedding a JavaScript library from a CDN, unless that library depends on a remote service, e.g. Google Analytics, Google Maps, etc. That being said, if you are still maintaining a legacy website that depends on jQuery, you should consider embedding the library like this instead:
<script>window.jQuery || document.write('<script src="/js/jquery.min.js"><\/script>')</script>
What does Node.js have to do with deciding whether to get your static assets from a public CDN or not? I hope you're not serving your static assets with Node.js.
There are tools like Grunt and webpack (which depend on Node.js) that can bundle all your dependencies.
I cannot provide details about how they work because I don't do front-end development, but I can tell you that years ago I had to copy & paste both code and links to jQuery and other libraries like Backbone or Ember.js (relevant at the time) into my projects. Nowadays, web developers seem to prefer tools from the Node.js ecosystem that handle these dependencies in a more "engineer-ish" way, as npm packages.
Many times that 100 KB of JavaScript loads faster when minified, combined with the rest of the site's code, and served compressed over a single HTTP request or streamed via HTTP/2. It's almost always faster to use an existing connection than to open a new one.
Also, there isn't one canonical version of jQuery. There are dozens of potential versions available[1], so it's not immediately clear that a user will have cached the exact version a site depends on.