Meanwhile Ventrilo 3 has a 5.4 MB installer, consumes 4 MB of RAM, and does none of those things. The newer, bloated version has a 7.9 MB installer.
Though honestly, cross-platform software too often just bundles Electron rather than relying on the OS layer (on Mac and Windows, anyway) having all of this built in. For bugs and reproducibility that's nice, but it really sucks for "downloading the same bytes over and over again". This is downstream of OS vendors historically just not fixing bugs; in an alternate universe people would have working OS stacks and we would use those.
1. does not run on Linux
2. the macOS client is incompatible with many servers
3. has been in development for twenty-one (21!) years
As much as any company is willing to load their customer-facing or retail software with a million kinds of garbage, that's the last thing they want in their own toolkit.
One of the things that inspired that was seeing how much better the mobile versions of websites often are than their apps. Think of Facebook, Twitter, Reddit and the like. Their apps don’t have tabs, often drop state, and seem to occasionally favor shiny over usable. My local newspaper’s app prevents you from copying any text. The mobile websites, on the other hand, mostly just have an annoying “our app provides a better experience” banner to dismiss; otherwise they’re engineered to work, not to push the envelope. They’re important enough to get some resources, but there doesn’t seem to be quite as much noise during development of the ‘fallback version’. And I think that makes all the difference.
It's been around for 21 years, has a wikipedia page [0], and is mentioned in at least one old meme-worthy DOTA song [1], so the OP likely didn't feel it needed an introduction.
[0] https://en.wikipedia.org/wiki/Ventrilo [1] https://youtu.be/qTsaS1Tm-Ic?feature=shared
It certainly has its place, and I laud the authors for their efforts, but seeing how every startup is using Electron for their native applications, I have little hope for lean software.
At the end of the day, developers need to finance their projects. No other toolchain out there [1] is going to give you the flexibility, development speed, and freedom to develop beautiful-looking desktop apps using the muscle memory you trained while writing webpages. Of course, you can write the same application in Qt, GLFW, whatever, but I don't think anyone will disagree that it's much slower to build and prototype responsive UIs with those tools.
[1] Wry and Tauri (https://tauri.app/) might be noteworthy, but I don't know how much of a difference they make, as the runtime is still JavaScript, HTML, and CSS.
1. https://federicoterzi.com/blog/why-electron-is-a-necessary-e...
I have been using Linux for 20 years and I have yet to use a single piece of Electron-based software.
There is no problem with LGPL. From your link: " QT is free to use as long as you release your code as GPL"
That is false. It is free as long as you don't statically link Qt, and you distribute your eventual changes to Qt itself (which aren't needed in most cases).
Of course if we spread misinformation, we might draw different conclusions, which is why it is important to start from correct non-made-up premises.
Why do you restrict yourself to only knowing how to write webpages?
> Of course, you can write the same application in Qt, GLFW, whatever, but I don't think anyone will disagree that it's much slower to build and prototype responsive UIs with these tools.
Maybe because you don't have the "muscle memory" to write Qt?
But since you mentioned Qt, one of the greatest blows to cross-platform application development was the Nokia/Microsoft disaster.
Nokia bought Qt from Trolltech and made it LGPL, because their plan was to make money from the hardware, not the software. Then they died, for reasons that have been commented on endlessly.
From the ashes of Nokia rose Digia or whatever it's called this week, a company that maintains Qt badly and thinks it's a good idea to threaten developers that download their LGPL product.
RIP Qt, RIP cross platform development.
JavaScript might be a disaster of a language, but it is faster to build a UI with HTML and CSS than with most native toolkits. I can totally see why startups pick web programming to ship desktop apps.
That’s just how the market is. If you want to build your app with Electron, you’ll find mountains of skilled developers everywhere. If you use Qt, you’ll either have to pay an absolute fortune for the 10 people who know it, or accept hiring people who have never used it before.
The web is the best cross-platform environment we have as evidenced by the fact that developers flock to stuff like Electron at all, but then you end up needing to ship your own entire browser engine to achieve a reasonable level of control. If PWAs didn't threaten App Store business models we'd probably be in a better place RE web app distribution (just use the browser you're already using anyway).
This doesn't address the issue that web app development is flooded with bad choices and opinions that lead people to conclude the whole ecosystem is overcomplicated and bloated, but that's not an opinion I hold very dear, as someone who feels pretty comfortable with web tech stacks and understands where they came from.
I do think the skill bar is too high right now, where most engineers are likely to do a bad job RE performance and security with what we have unfortunately. But I'm not confident your average engineer would do any better if it were a different ecosystem.
I suspect that web developers who only know web development flock to electron because they think it's easier than learning a new technology.
In my professional experience, it is perhaps easier for a web developer who only knows JS to get a prototype working, but when you want a nice application you end up having to re-implement a number of things that any regular widget toolkit would already offer. So in the long term I don't think it's cost-efficient at all, but at that point you've already spent resources on your Electron GUI, so you keep going forever.
JavaFX, even though it's outdated, is quite up to the job of replacing most Electron-based UIs. Qt is definitely extremely powerful, but has the drawback of being tied into the C++ ecosystem which seems rather dated now. Even some hobbyist efforts are worth mentioning in this category: AvaloniaUI (in the C# ecosystem), HaxeUI & FeathersUI (both in the Haxe ecosystem and building on game engines).
I think the bigger problem is sourcing developers. Web developers are comparatively cheap and abundant, so a commercial entity is always going to have trouble justifying hiring a comparatively expensive and difficult-to-find developer in the C#, ObjC/Swift, or Java ecosystem, when the job can also be accomplished by a web developer.
My non-trivial app can be built with jlink into a self-contained binary (no dependency on an installed JVM) for every operating system, each of which sits at just 30 MB. When run, it needs around 60 MB of RAM, which is a lot, but I have yet to find a multi-platform UI toolkit that delivers much less than that, except for some toy frameworks that can't really be used for real-world apps.
Of course I’m simplifying a bit, since if you use Firefox as your browser and Tauri uses Chromium, that’s two browsers’ worth of memory usage, but still not 5 or 10.
The average software project sits on a ginormous mountain of existing software: libraries, components, tools, operating systems, etc. As a percentage of the overall source code, the tiny bit you add is vanishingly small. All this stuff exists, is being maintained by someone, and replacing it with something else has very low economic value. It adds negative value when it doesn't work, because then you have to fix it or deal with the problems it causes. But if it works as advertised, it just levels the playing field, because everybody else is at that level as well.
Your attention as a software engineer should be focused mainly on things that others don't have and that are valuable. It's always been like that. What has changed over time is the amount of stuff that you no longer have to build or worry about. That's the value of cloud-based services: you pay a premium for a lot of decent-quality stuff that would be very expensive to match with in-house development. Reinventing wheels like that is not lean but stupid.
So your server binary is tiny, but you're serving full size unoptimized images and wasting bandwidth for every single user that visits the site, as well as the uploader. Even thumbnails are served full size.
In this case you're optimizing for the wrong thing.
> This article is a bit hypocritical. The example is an image sharing tool, but you don't resize images at all.
Can you explain how serving full-sized images opens up additional security vulnerabilities?
I don't see the connection between your argument about bandwidth and the OP's argument about attack surface.
More lines of code don't necessarily correlate with less security - in fact, the author's tool makes a big security mistake: it doesn't strip EXIF.
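For context, EXIF stripping doesn't need a heavyweight dependency. A minimal pure-Python sketch of the idea (not the author's tool): JPEG files are a sequence of marker segments, and EXIF lives in an APP1 segment whose payload starts with `Exif`, so you can drop just that segment and copy everything else through.

```python
def strip_exif(jpeg: bytes) -> bytes:
    """Remove APP1/EXIF segments from a JPEG byte stream (sketch, not hardened)."""
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            break                      # malformed stream; stop copying segments
        marker = jpeg[i + 1]
        if marker == 0xDA:             # SOS: entropy-coded data follows, copy rest
            out += jpeg[i:]
            break
        length = int.from_bytes(jpeg[i + 2:i + 4], "big")  # includes length bytes
        segment = jpeg[i:i + 2 + length]
        # Keep every segment except APP1 segments carrying an Exif payload.
        if not (marker == 0xE1 and segment[4:8] == b"Exif"):
            out += segment
        i += 2 + length
    return bytes(out)
```

A production version would also want to handle XMP (also APP1) and embedded thumbnails, but this is enough to drop GPS coordinates and camera metadata from uploads.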
To be clear, I brought up GNU/Linux distributions as examples of container-free packaging, and as notable collections of relatively lean programs, but not necessarily as an example of the combined systems being particularly lean themselves. Though then again, compared to something like recent Windows versions, perhaps even the Linux-based systems with larger DEs would seem lean.
Not better, just faster.
I wish things were like the 90s and software dev environments like Rebol [ http://www.rebol.com ] were still king.
It was simple to use and expressive, with small executables, but it's no longer maintained :(
It's not always that we need better tooling. Here I think we need better developers, as in "developers who care about this issue".
Developers are the ones including the bloat, right? If all developers started working more slowly but including less bloat, what would the managers do? Probably the managers have no clue about the complexity of what their devs do: they just size tasks with t-shirt sizes (estimations that are generally completely wrong, but they still use them).
The problem is that if one developer works slowly, writing less bloated code, while their 3 coworkers keep adding bloat, then not only is the product still getting bloated, but the first dev appears less productive.
It's a kind of competition between developers, where those who do the better job lose.
Minimize the use of JS-based libs/tools, perhaps? Yes, I'm looking at my daily tools like VSCode, Postman, etc., which are Electron-based. Perhaps rewriting them in Go/C++/Pascal could shrink the bloat.
Does every program need all of that code? There are libraries in there to handle OpenGL (in half a million different versions), Vulkan, AMD, Intel, Nouveau, etc. Nope, you usually need just the tiny bits relevant to your application and hardware. But what's easier: figuring out which bits you don't need, or making the stack more portable and future-proof by always shipping everything?
A lot of complexity is accidental, but most of it comes from conscious choices to make life simpler for everyone. Of course taken to the logical extreme, we do end up with Electron, but where are we supposed to draw the line?
Windows 95 was almost 30: https://ia803207.us.archive.org/view_archive.php?archive=/22...
Another fun article currently on the front page, diving into the insanity that is 8086/286/386 addressing modes: https://blogsystem5.substack.com/p/from-0-to-1-mb-in-dos
Even these "simple" operating systems managed to pack an incredible amount of complexity - again, just to deal with hardware, portability, different APIs, etc. Consumers - we, we demanded all of that.
If we really wanted/needed simpler software, OpenBSD is right around the corner. I've used it on and off as a daily driver, and it has an incredibly high ratio of code quality and readability to how practical it is for everyday things (while remaining very portable). But simplicity is an uphill battle.
It creates a motivation system aimed at just getting something working, or, even more cynically, at getting something that merely looks like it's working. The half-dozen intermittently en vogue development acronyms further this mindset. I don't see how to overcome this issue, because it's something like a tragedy of the commons: nobody (at the top) wants to reduce bloat, because it would likely reduce rather than increase profit on short time frames. Yet, at scale, it's leading to a complete enshittification of all software.
And also, because they were probably "the first guy", chances are they don't even know about the bloat at all.
A while back, I wanted to host a pastebin for sharing bits of code and other text. Then I realized that about 10 lines of (compile-time) elisp gave me everything I needed to turn any webserver I could ssh to into a pastebin with no runtime dependencies aside from nginx: https://fwoar.co/pastebin/3daaf7ce49ca221702c70b0d10ac5caec8...
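The same trick works outside of elisp: all you need is a content-addressed filename and any box you can ssh to that serves static files. A rough Python sketch of the idea (the host name and document root here are hypothetical, and the original's elisp at the link above is the real implementation):

```python
import hashlib
import subprocess

def paste_name(text: str) -> str:
    # Content-addressed filename: identical pastes dedupe for free.
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def publish(text: str, host: str = "example.com",
            docroot: str = "/var/www/paste") -> str:
    """Pipe the paste over ssh into the server's static directory."""
    name = paste_name(text)
    # nginx (or any static file server) pointed at docroot does the rest.
    subprocess.run(["ssh", host, f"cat > {docroot}/{name}.txt"],
                   input=text.encode("utf-8"), check=True)
    return f"https://{host}/paste/{name}.txt"
```

No database, no application server, no runtime dependencies on the remote end beyond the web server itself.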
I remember using Windows 9x and the running jokes about the poor quality and security of all MS products. Adobe’s formats came from those early days and are roundly mocked. Hell, I’ve built replacements for 90s software and, I can assure you, what I replaced was not high quality or robust at all.
On this very site, we discussed Horizon: a project started in the 90s and 2000s that was so badly built that it led to hundreds of innocent sub-postmasters being imprisoned or bankrupted, and a number committed suicide.
Is the author just romanticising the “good old days”?
In the 90s, the economics around software had already heated up to the point where there was an insatiable appetite for software engineering manpower, but the university system wasn't yet geared to churning out specialists in this field in such large numbers, so a lot of software engineers back then were people coming from other professions who picked it up autodidactically and were just not very good. At the same time, programming languages and tooling weren't yet at a point where they were good at guiding people towards sound software engineering practice, and this led to a kind of software quality crisis.
But this situation changed fast. I would say from roundabout 2003 to roundabout 2013 there was a bit of a "golden period" where we had good reason to be optimistic about the future of software quality. The software quality crisis of the 90s was largely overcome through better education, better software engineering methodology, and better programming language ecosystems and toolchains. Back in those days we still had purpose-built tooling for things like desktop UIs. Windows Forms in C# and Aqua-era Mac OS X GUI programming in ObjC were actually quite a good experience for both developers and users. We also had cross-platform ways of doing GUI programming, like Swing on Java.
In the next ten years, i.e. the ten years leading up to now, things took a decided turn for the worse. If I were to speculate about the reasons, I would say it was related to the rise of mobile, and the continued rise in the importance of the web platform over the desktop platform, meaning that application development now had to straddle web, mobile, and desktop as three distinct development targets. This created a need for truly cross-platform application development, while Apple and Microsoft continued to make plays to fortify their monopoly power instead of giving the world what it needed. Swing/JavaFX lost its footing when enterprises decided that web was all they really needed.
So, to answer your initial question: Has software quality really gotten worse? I would say, yes, over the last 10-15 years definitely. If you compare now to the mid-90s, then maybe, maybe not.
By what metric?
Taking all your above examples, I (and many others) could argue that the move to web brought new techniques that overall improved software for developers and users. That's not to say I'm right, or you are, but to point out that everything you put forward is purely subjective.
What has objectively gotten worse in the past 10 years?
Bloat does increase attack surface, as mentioned.
There was reliable software in the old days as well. VAX/VMS, Windows NT 3.5x, my SGI workstation. Few wanted to pay for it. Today we have FLOSS.
It's 2024, why are we still blaming everything except the Operating Systems?
> simple products importing 1600 dependencies of unknown provenance.
Put yourself back in 1984... you've got an IBM XT with 2 floppy disks. You made write protected copies of all your important disks, and even more copies of your boot disk.
You'd go to a computer show, or your user group, and come home with stacks of software of unknown provenance, and then just try everything out over the next few weeks.
You were safe because your system made it easy to know what you were risking when you ran a program. There was one simple rule that was easy to understand:
Only non-write-protected floppy disks in the drives were at risk.
That quite limited computer system was, in effect, a capability based security system. Crude, but extremely effective.
Here it is 40 years later, and the ability to just run code with abandon like we used to seems to be a fantasy to younger people. Because we don't expect our operating systems to be at least as safe as MS-DOS on an IBM-XT.
What about suffering software users?
"I want to end this post with some observations from Niklaus Wirth's 1995 paper.
"To some, complexity equals power. (...) Increasingly, people seem to misinterpret complexity as sophistication, which is baffling - the incomprehensible should cause suspicion rather than admiration.""
Who were the "some" to whom Wirth referred? A wild guess: software developers.
I know many people push for interoperability, but that is a very hard problem. Open APIs are easy; I should be able to write e.g. my own Slack client, for my specific platform.
It is not a problem that there exists an (official) Electron Slack app. The problem is that I am forced to use it. And what does it bring Salesforce, except an opportunity to add telemetry to the app? With an open API, they would still make companies pay $5 per account per month.
I believe that open APIs would enable better clients for popular services.
Yes, Qt is quite bloaty - the binary size is 139.2 MB currently, but I think with static linking and some trimming I can get it much lower.
Who maintains the 1600 dependencies of a project? Pretty sure some of those expose vulnerabilities. Not counting those that are downright malware.
It's unrealistic to expect companies the size of Microsoft to take a break from putting spyware in your operating system and, for example, revert the piece of shit Skype has become back into a native app.
The whole image is ~80MB, compressed. It is, indeed, impressively lean.
And recently, I've installed `clickhouse-client` (the client for ClickHouse, a newer SQL database), which needs almost 900 MB for just a CLI client!!! Absolutely insane!
I use QGIS, which is an open source alternative to ArcGIS, and a non-IT friend asked something to draw maps and see imagery -- I recommended QGIS, and he wrote: "1 GIG download? WTF IS THAT?" Oops. We didn't notice the little alternative open-source app turned into such a behemoth. (https://download.osgeo.org/qgis/windows/weekly/?C=M&O=D -- actually, since last year, it grew by 20%!)
The reason for this kind of bloat seems to me to be the race for version updates. It probably did make sense in the late '00s, when you could claim the Linux ecosystem was underdeveloped. But 15 years later, it's still here. Every package is updated at a high pace, breaking stuff downstream, and now, instead of settling on compatibility, everybody has just started to ship Docker containers.
clickhouse-client is just a symlink to the main ClickHouse binary. That binary also includes the server and a lot of useful utilities.
It's large, yes, but it's on purpose: it's super useful when you need a single binary for server, client, Zookeeper server, Zookeeper client, local data analysis tool, etc.