By what metric?
Taking all your above examples, I (and many others) could argue that the move to web brought new techniques that overall improved software for developers and users. That's not to say I'm right, or you are, but to point out that everything you put forward is purely subjective.
What has objectively gotten worse in the past 10 years?
On the user's side: Pick any set of well-established best practices, such as Shneiderman's Eight Golden Rules or Nielsen & Molich's 10 Usability Heuristics, and then pick a typical 2024 Electron app that has an equivalent from the 2003-2013 era written with a typical UI technology of the time (such as Windows Forms), and compare the two UIs against those best practices. -- I'm pretty sure you will find usability blunders in today's software that you simply couldn't have committed back then, even if you had tried. -- One thing that immediately comes to mind: essential UI elements being hidden away based on viewport size, with no indication that any hiding is taking place, leaving the user unable to perform their task. Another example I happened to experience just yesterday: UI elements disappearing from underneath my mouse cursor the moment the cursor starts to hover over them.
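To make the viewport complaint concrete: the hiding usually boils down to logic like this (a hypothetical sketch; the function name and the 900px breakpoint are made up for illustration). The blunder is that visibility is derived solely from viewport width, with no overflow menu or other affordance telling the user where the commands went.

```javascript
// Hypothetical sketch of the responsive-hiding anti-pattern:
// below the breakpoint the toolbar simply disappears, and
// nothing in the UI hints that its functionality still exists.
function toolbarVisible(viewportWidth, breakpoint = 900) {
  return viewportWidth >= breakpoint;
}
```

A desktop-era toolkit would instead keep the commands reachable (e.g. collapse them into a chevron/overflow menu) rather than silently dropping them.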
Also: Just look at the widget gallery in Windows Forms, which provides intuitive metaphors for even quite subtle patterns of user interaction, and check how many of those widgets you find implemented in modern web-based design languages and web component systems. ...usually you don't get much beyond input fields, buttons, and maybe tabbed views if you're lucky. So today's software is relegated to using just those few things, whereas ten years ago you had so many more widgets to pick and choose from to get the interaction just right.
On the developer's side: Was JavaScript ever actually designed to do the things it's being used for today? Is dependency hell, especially in the web ecosystem, worse today than it was 10 years ago?
Excellent, we have something objective to look at. Now, where are the studies, reports, etc. showing that this has declined in the past decade? I'm not asking for a double-blind, peer-reviewed study, just something a bit more concrete than "stop the world, I want to get off."
> Was JavaScript ever actually designed to do the things it's being used for today?
Was anything?
> [...] Now, where's the studies, reports, etc. [...] "stop the world, I want to get off."
This argument is getting a bit tedious. It started with you offering an opinion. I offered a counter-opinion, while clearly marking it as such with language like "I think ...", "I would say ...", and "If I were to speculate ...".
I'm clearly not alone in my opinion (see the original post), and you're trying to undermine your opponents' credibility by going ad hominem and pointing out that their position lacks the kind of research which you yourself did not provide either.
> > Was JavaScript ever actually designed to do the things it's being used for today?
> Was anything?
Hyperbole. Many things were designed to do the things they now do. Lua was designed as a language for embedding. SQL was designed as a language for querying databases.
I've never seen a chat app taking gigabytes of RAM before Electron, for example.
I've extremely rarely seen applications go nuts, eat several CPU cores, and drain my battery in 20 minutes before Electron, for example. Now it's a weekly occurrence.
It's improved only for developers who only know web development. And we users pay for it in hardware costs, electricity costs, etc.
Is that a general software problem or a problem specific to Electron? Is that a permanent problem or a problem right now because of the technology and your attitude towards it?
I say this because I do recall seeing complaints about Java being bloated in the 2000s. I briefly used Swing in my university days and it was pretty awful compared to HTML at the time. In 2044, maybe I'm going to be shaking my fist at the new-fangled tech and telling everyone how nice Electron apps were in comparison.
It's bloated in the 2020s too. Last year I caught Android Studio (which I wasn't even using at the moment; I just had it open from a quick fix a few days earlier) using over 4 GB of RAM. I had two projects open, under 20k lines of code total (OK, maybe I should count again).
But why bring Java in? We're talking about native applications vs applications that pull in a copy of Chrome and half of the npm registry. Java isn't native either.