The thing that really bothers me is that what regular home and office users were doing on their computers in the '90s is almost identical to what those users are doing right now, except that in the '90s they had orders of magnitude slower computers with orders of magnitude less memory. Yet in many cases those '90s computers were MORE RESPONSIVE than what they have today.
All of those countless billions of investment in technology haven't done a damn thing for the productivity of Sally the office worker or Billy the 6th grader. Arguably, they've made their lives worse (viz. social media's deleterious effects on mental health). Now everyone's pushing the heck out of AI, and all I see is high schoolers using ChatGPT to cheat themselves out of an education. They can't read critically, they can't write, they can't even spell!
So in light of all that, why should we be pushing more and more computing power (and memory, the original issue) on regular users who aren't getting any broad benefit to their way of life out of it?
Gosh, now I sound like a luddite!
Certainly not, but you can't fix it by putting less RAM in the machines of people with budget constraints. The developers will just pay for more themselves and then not care about those people, because people who can't afford RAM generally aren't lucrative customers.
And it's also worth considering what actually causes this.
Developers want their code to work on every platform. They don't want to write different code for each platform. But each platform vendor wants them to have to, because that makes it more likely there will be software that only works on their platform, or that doesn't work on some new competing platform. So the platform vendors refuse to develop or implement cross-platform standards.
Then someone else has to do it, and that's rather a lot of work. It turns out the easiest way is to piggyback on the work already done to make browsers run on every platform. That's Electron. It's terribly inefficient, but it saves the developer a lot of porting work, so it's widely used.
If Apple doesn't like this, they should provide cross-platform native APIs for developing applications.
It's not in Apple's interest to do that. It would cost a lot of money to develop and only benefit the competition. It would also slow down Apple's own ability to innovate on the APIs until the competitors catch up.
Or are you saying Apple should develop the APIs for Windows and Linux as well? Why would they do that?