Linux with glibc is the complete opposite; there really does exist old Linux software that static-links in everything down to libc, just interacting with the kernel through syscalls—and it does (almost always) still work to run such software on a modern Linux, even when the software is 10-20 years old.
I guess this is why Linux containers are such a thing: you’re taking a dynamically-linked Linux binary and pinning it to a particular entire userland, such that when you run the old software, it calls into the old glibc. Containers work, because they ultimately ground out in the same set of stable kernel ABI calls.
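The "ground out in the same stable kernel ABI" point can be sketched in Python via ctypes (Linux/x86-64 only; the syscall number 39 for getpid is an assumption about the architecture): whether you go through glibc's wrapper or the raw syscall(2) trampoline, you hit the same stable kernel interface.

```python
import ctypes
import os

# getpid() via glibc's wrapper vs. via the raw syscall(2) trampoline.
# The syscall number is the part of the interface the kernel promises
# never to break -- which is what containers ultimately rely on.
libc = ctypes.CDLL(None, use_errno=True)
SYS_getpid = 39  # x86-64 syscall number; other architectures differ

via_libc = libc.getpid()
via_raw = libc.syscall(SYS_getpid)
print(via_libc, via_raw)
```

Both calls return the same PID; the container runtime only has to guarantee that the second layer (the syscall numbers and semantics) stays fixed, and it does.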
(Which, now that I think of it, makes me wonder how exactly Windows containers work. I'm guessing each one brings its own NTOSKRNL, which gets spun up under Hyper-V if the host kernel ABI doesn't match the guest?)
Turns out that Nix is built against a different version of glibc than SteamOS, and for some reason, that matters. You have to make sure none of Steam's libraries are on the path before the Nix code will run. It seems impractical to expect every piece of software on your computer to be built against a specific version of a specific library, but I guess that's Linux for you.
Honestly I might buy a T-shirt with such a quote.
I think glibc is such a pain that it's the reason we have so many vastly different package managers. Non-glibc builds would really simplify the package-management approach on Linux. The approach feels solved, but there are definitely still issues with it, and I think we should all keep looking for ways to solve the problem.
To be honest, I think OSes are boring, and should have been that way since maybe 1995. The basic notions:
multi-processing, context switching, tree-like file systems, multiple users, access privileges,
haven't changed since 1970, and the more modern GUI stuff hasn't changed since at least the early '90s. Some design elements, like tree-like file systems, WIMP GUIs, per-user privileges, the fuzziness of what an
"operating system" even is and its role,
are perhaps even arbitrary, but can serve as a mature foundation for better-conceived ideas. For example, ZFS (which implements, in a very well-engineered manner, a tree-like data storage model that's been standard since the '60s) can serve as a foundation for
Postgres (which implements a better-conceived relational design)
I'm wondering why OSS - which according to one of its acolytes, makes all bugs shallow - couldn't make its flagship OS more stable and boring. It's produced an anarchy of packaging systems, breaking upgrades and updates,
unstable glibc, desktop environments that are different and changing seemingly
for the sake of it, sound that's kept breaking, power management iffiness, etc.

There's a lot to like about BSD, and many reasons to prefer OpenBSD to Linux, but ABI backward compatibility is not one of them!
One of Linux's main problems is that it's difficult to supply and link versions of library dependencies local to a program. Janky workarounds such as containerization, AppImage, etc. have been developed to combat this. But in the Windows world, applications literally ship, and link against, the libc they were built with (msvcrt, now ucrt I guess).
Why should everything pretend to be a 1970s minicomputer shared by multiple users connected via teletypes?
If there's one good idea in Unix-like systems that should be preserved, IMHO it's independent processes, possibly written in different languages, communicating with each other through file handles. These processes should be isolated from each other, and from access to arbitrary files and devices. But there should be a single privileged process, the "shell" (whether command line, TUI, or GUI), that is responsible for coordinating it all, by launching and passing handles to files/pipes to any other process, under control of the user.
Could be done by typing file names, or selecting from a drop-down list, or by drag-and-drop. Other program arguments should be defined in some standard format so that e.g. a text based shell could auto-complete them like in VMS, and a graphical one could build a dialog box from the definition.
I don't want to fiddle with permissions or user accounts, ever. It's my computer, and it should do what I tell it to, whether that's opening a text document in my home directory, or writing a disk image to the USB stick I just plugged in. Or even passing full control of some device to a VM running another operating system that has the appropriate drivers installed.
But it should all be controlled by the user. Normal programs of course shouldn't be able to open "/dev/sdb", but neither should they be able to open "/home/foo/bar.txt". Outside of the program's own private directory, the only way to access anything should be via handles passed from the launching process, or some other standard protocol.
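The "access only via handles passed from the launcher" idea can be sketched with an ordinary pipe (a toy illustration, not any real sandboxing framework): the parent opens the resource and hands the child only a file descriptor, so the child never sees a path and needs no ambient filesystem access.

```python
import os
import subprocess
import sys

# The launcher opens a pipe, writes to it, and grants the child only
# the read end. The child's world is just the descriptors it was given.
r, w = os.pipe()
os.write(w, b"hello from the launcher")
os.close(w)

result = subprocess.run(
    [sys.executable, "-c",
     "import os, sys; print(os.read(int(sys.argv[1]), 100).decode())",
     str(r)],
    pass_fds=(r,),  # explicitly grant the child this one handle
    capture_output=True, text=True,
)
os.close(r)
print(result.stdout.strip())
```

The same mechanism generalizes: a capability-style shell would open "/home/foo/bar.txt" or "/dev/sdb" itself and pass only the resulting handle down.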
And get rid of "everything is text". For a computer, parsing text is like for a human to read a book over the phone, with an illiterate person on the other end who can only describe the shape of each letter one by one. Every system-level language should support structs, and those are like telepathy in comparison. But no, that's scaaaary, hackers will overflow your buffers to turn your computer into a bomb and blow you to kingdom come! Yeah, not like there's ever been any vulnerability in text parsers, right? Making sure every special shell character is properly escaped is so easy! Sed and awk are the ideal way to manipulate structured data!
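The text-versus-structs contrast can be made concrete with Python's `struct` module (the record format here is made up for illustration): the text route has to scan, split, and re-convert every field; the binary route decodes a fixed layout in one step.

```python
import struct

# Two encodings of the same made-up device record.
record = ("/dev/sdb", 4096, 1)

# Text route: serialize, then split and convert field by field --
# and hope no field ever contains the delimiter.
text = "{}\t{}\t{}".format(*record)
name, size, flag = text.split("\t")
parsed_text = (name, int(size), int(flag))

# Struct route: 16-byte name, unsigned 32-bit size, unsigned 8-bit flag,
# little-endian. One fixed-size decode, no escaping rules.
packed = struct.pack("<16sIB", record[0].encode(), record[1], record[2])
bname, bsize, bflag = struct.unpack("<16sIB", packed)
parsed_struct = (bname.rstrip(b"\x00").decode(), bsize, bflag)
```

Both routes recover the same record, but only one of them has a quoting-and-escaping attack surface.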
I wish either of those systems had the same hardware & software support. I’d swap my desktop over in a heartbeat if I could.
AppImage has some issues/restrictions, like an AppImage can't run on an older Linux than the one it was compiled on, so people build them on the oldest PCs they can find. And there are a few more quirks like that.
AppImages are really good, but zapps are good too. I once tried to build something on top of zapp, but it's a shame the project went down the crypto/IPFS route or something, and now I don't really see any development on it. It would be interesting if someone could add zapp's features to AppImage, or pick up the project and build something similar.
It makes sense. Every distribution wants to be in charge of what set of libraries are available on their platform. And they all have their own way to manage software. Developing applications on Linux that can be widely used across distributions is way more complex than it needs to be. I can just ship a binary for windows and macOS. For Linux, you need an rpm and a dpkg and so on.
I use davinci resolve on Linux. The resolve developers only officially support Rocky Linux because anything else is too hard. I use it in Linux mint anyway. The application has no title bar and recording audio doesn’t work properly. Bleh.
Who needs ABI compatibility when your software is OSS? You only need API compatibility at that point.
surely forced versioning of GLIBC didn't help.
"This program requires GLIBC_2.33"
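The version that error message refers to can be queried at runtime via ctypes (glibc-specific: `gnu_get_libc_version` does not exist on musl or non-Linux systems, so this is a sketch for glibc hosts only):

```python
import ctypes

# Ask the running C library which glibc version it is -- the same
# number a "requires GLIBC_2.33" loader error is comparing against.
libc = ctypes.CDLL(None)
libc.gnu_get_libc_version.restype = ctypes.c_char_p
version = libc.gnu_get_libc_version().decode()
print("running glibc", version)  # e.g. "2.39"
```

A binary built against a newer glibc records versioned symbol references like `GLIBC_2.33`; if the runtime glibc reported here is older, the dynamic loader refuses to start it.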
Windows kept bogging down the system trying to download a dozen different language versions of Word (for which I didn't have a licence and didn't want regardless). Steam kept going into a crash-restart cycle. The virus scanner was... being difficult.
Everything just works on Linux except some games on proton have some sound issues that I still need to work out.
Is this 1998? Linux is forever having sound issues. Why is sound so hard?
What are some examples?
A recent example is that in San Andreas, the seaplane never spawns if you're running Windows 11 24H2 or newer. All of it due to a bug that's always been in the game, but only the recent changes in Windows caused it to show up. If anybody's interested, you can read the investigation on it here: https://cookieplmonster.github.io/2025/04/23/gta-san-andreas...
I see there are guides on Steam forums on how to get it to run under Windows 11 [0], and they are quite involved for someone not overly familiar with computers outside of gaming.
0: https://steamcommunity.com/sharedfiles/filedetails/?id=29344...
It's a great game, unfortunately right now I am not able to play it anymore :( even though I have the original CD.
Unfortunately, Wine is of no help here :(
Also original Commandos games.
One more popular example is Grid 2, another is Morrowind. Both crash on launch, unless you tweak a lot of things, and even then it won't always succeed.
Need for Speed II: SE is "platinum" on Wine, and pretty much unable to be run at all on Windows 11.
Whether that was a Windows compatibility issue or potentially some display driver thing, I'm not sure. (90's Windows games may have used some DirectDraw features that just don't get that much attention nowadays, which I think may have been the issue, but my memory's a bit spotty.)
1. The exact problem with the Linux ABI
2. What causes it (the issues that makes it such a challenge)
3. How it changed over the years, and its current state
4. Any serious attempts to resolve it
I've been on Linux for maybe two decades at this point. I haven't noticed any issues with the ABI so far, perhaps because I use everything from the distro repo or build and install things through the package manager. If I don't understand it, there are surely others who want to know it too. (Not trying to brag here; I'm just referring to the time I've spent on it.)
I know that this is a big ask. The best course for me is of course to research it myself. But those who know the whole history tend to have a well organized perspective of it, as well as some invaluable insights that are not recorded anywhere else. So if this describes you, please consider writing it down for others. Blog is probably the best format for this.
Together this means that basically nobody implements applications anymore. For commercial applications that market is too fragmented and it is too much effort. Open-source applications need time to grow and if all the underpinnings get changed all the time, this is too frustrating. Only a few projects survive this, and even those struggle. For example GIMP took a decade to be ported from GTK 2 to 3.
I wish websites weren't allowed to know what site a user is coming from.
My understanding is that very old statically linked Linux images still run today because paraphrasing Linus: "we don't break user space".
Good operating systems should:
1. Allow users to obtain software from anywhere.
2. Execute all programs that were written for previous versions reliably.
3. Not insert themselves as middlemen into user/developer transactions.
Judged from this perspective, Windows is a good OS. It doesn't nail all three all the time, but it gets the closest. Linux is a bad OS.
The answers to your questions are:
(1) It isn't backwards compatible for sophisticated GUI apps. Core APIs like the widget toolkits change their API all the time (GTK 1->2->3->4, Qt also does this). It's also not forwards compatible. Compiling the same program on a new release may yield binaries that don't run on an old release. Linux library authors don't consider this a problem, Microsoft/Apple/everyone else does. This is the origin of the glibc symbol versioning errors everyone experiences sometimes.
(2) Maintaining a stable API/ABI is not fun and requires a capitalist who says "keep app X working or else I'll fire you". The capitalist Fights For The User. Linux is a socialist/collectivist project with nobody playing this role. Distros like Red Hat clone the software ecosystem into a private space that's semi-capitalist again, and do offer stable ABIs, but their releases are just ecosystem forks and the wider issue remains.
(3) It hasn't changed, and it's still bad.
(4) Docker: "solves" the problem on servers by shipping the entire userspace with every app, and being itself developed by a for-profit company. Only works because servers don't need any shared services from the computer beyond opening sockets and reading/writing files, so the kernel is good enough and the kernel does maintain a stable ABI. Docker obviously doesn't help the moment you move outside the server space and coordination requirements are larger.
Never happens for me on Arch, which I've run as my primary desktop for 15 years.
Alternatively, RemObjects makes Elements, also a RAD programming environment in which you can code in Oxygene (their Object Pascal), C#, Swift, Java, Go, or Mercury (VB) and target all platforms: .NET, iOS and macOS, Android, WebAssembly, Java, Linux, Windows.
Java: Build code for any of the billions of devices, PCs and servers that run JavaSE, JavaEE or the OpenJVM.
.NET Core: The cross-platform .NET Core runtime is the future of .NET and will fully replace the current classic .NET 4.x framework when .NET Core 5 ships in late 2020.
It really seems like it was last updated sometime in the last decade. Not sure I want to base a future project on it.
Wait, you can make Android applications with Golang without too much sorcery??
I just wanted to convert some Golang CLI applications to GUIs for Android, and I instead ended up giving up on the project and just started recommending people use Termux.
Please tell me if there is a simple method for Golang that can "just work" as the Visual Basic-like glue code between CLI and GUI.
We might take it for granted but React-like declarative top-down component model (as opposed to imperative UI) was a huge step forward. In particular that there's no difference between initial render or a re-render, and that updating state is enough for everything to propagate down. That's why it went beyond web, and why all modern native UI frameworks have a similar model these days.
Personally I much prefer the approach taken by solidjs / svelte.
React’s approach is very inefficient - the entire view tree is rerendered when any change happens. Then they need to diff the new UI state with the old state and do reconciliation. This works well enough for tiny examples, but it’s clunky at scale. And the code to do diffing and reconciliation is insanely complicated. Hello world in react is like 200kb of javascript or something like that. (Smaller gzipped, but the browser still needs to parse it all at startup). And all of that diffing is also pure overhead. It’s simply not needed.
The solidjs / svelte model uses the compiler to figure out how variable changes result in changes to the rendered view tree. Those variables are wrapped up as "observed state". As a result, you can just update those variables, and exactly and only the parts of the UI that need to change will be redrawn. No overrendering. No diffing. No virtual DOM and no reconciliation. Hello world in Solid or Svelte is minuscule - 2kb or something.
Unfortunately, swiftui has copied react. And not the superior approach of newer libraries.
The rust “Leptos” library implements this same fine grained reactivity, but it’s still married to the web. I’m really hoping someone takes the same idea and ports it to desktop / native UI.
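The fine-grained reactivity model described above can be sketched in a few lines (a toy illustration, not the API of Solid, Svelte, or Leptos): reads inside an effect register a subscription, and writing a signal re-runs only the effects that actually read it. No diffing, no virtual DOM.

```python
# Minimal signal/effect sketch of fine-grained reactivity.
_current_effect = None

class Signal:
    def __init__(self, value):
        self.value = value
        self.subscribers = set()

    def get(self):
        # Reading inside an effect records the dependency.
        if _current_effect is not None:
            self.subscribers.add(_current_effect)
        return self.value

    def set(self, value):
        # Writing re-runs exactly the effects that read this signal.
        self.value = value
        for fn in list(self.subscribers):
            fn()

def effect(fn):
    global _current_effect
    _current_effect = fn
    fn()  # first run records which signals fn reads
    _current_effect = None
    return fn

count = Signal(0)
rendered = []

@effect
def render():
    rendered.append(f"count is {count.get()}")

count.set(1)  # re-runs only render(); nothing else is touched
```

The frameworks do this with a compiler rather than a runtime registry, but the dependency-tracking idea is the same.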
It's more the other way around, this model started on desktop (eg WPF) and then React popularized it on the web.
But if you liked that, consider that C# was in many ways a spiritual successor to Delphi, and MS still supports native GUI development with it.
The web was a big step backwards for UI design. It was a 30 year detour whose results still suck compared to pre-web UIs.
I might seriously recommend it to newbies. There's just this love I have for Windows 7; even though I didn't really use it for much, it's so much more elegant in its own way than Windows 10.
It could be a really fun experiment, and I'd be interested to see how it would pan out.
Rough approximations have been possible since the early 2000s, but they’re exactly that: rough approximations. Details matter, and when I boot up an old XP/7 box there are aspects in which they feel more polished and… I don’t know, finished? Complete? Compared to even the big popular DEs like KDE.
Building a DE explicitly as a clone of a specific fixed environment would also do wonders to prevent feature creep and encourage focus on fixing bugs and optimization instead of bells and whistles, which is something that modern software across the board could use an Everest sized helping of.
I think one source of friction could be ideological, if nothing else: most Linux users love open source and hate Windows, so they might not want to build anything that even replicates its UI.
Listen, I hate Windows as much as the next guy, but I have to give props: I feel nostalgic for Windows 7. If something offered both perfect .exe support and perfect Linux binary support, things could be really good. I hope somebody does it, and perhaps even adds it to loss32; that would be an interesting update.
The screenshots could easily fool me into believing it actually is Windows 7 :p
There's also AnduinOS, which I think doesn't try to replicate Windows 7, but it definitely tries to look like Windows 10, or perhaps 11, IIRC.
Pro tip: if someone wants to create their own ISO, they can just customize things imperatively in MX Linux, even by booting it into RAM, and then use its magnificent option of snapshotting the system and converting that into an ISO. So it's definitely possible to create an ISO tweaked down to your exact configuration without any hassle. (Trust me, it's the easiest way to create ISOs; if you want the hassle, Nix or bootc seems to be the way to go.)
Regarding why it wouldn't hit: I don't know. I already build some of my own ISOs, and I could build one of these (on the MX Linux principle) and upload it for free on Hugging Face, but the point of the idea is mass appeal.
Yes, I can do that myself, but I'd prefer an ISO that just did it out of the box, so I could share it with someone new to Linux. And yes, I could have the new person make the changes themselves, but why? There's really no reason to, IMO. This feels like low-hanging fruit that nobody has touched, which is why I was curious.
But as the other comment pointed out, while we could certainly build this thing, there are also genuine reasons why it hasn't happened; they make some good points, and I largely agree with them.
Like if you ask me, it would be fun to have more options especially considering this is linux where freedom is celebrated :p
It would fail, and just be another corpse in the desktop OS graveyard.
https://en.wikipedia.org/wiki/Hitachi_Flora_Prius
https://www.osnews.com/story/136392/the-only-pc-ever-shipped...
https://en.wikipedia.org/wiki/Linspire
Unless you ship your own hardware or get a vendor to ship your OS (see the above), and set up so the user can actually use it, you have to get users to install it on Windows hardware. So now your company is debugging broken consumer hardware without the help of the OEM. So that hopefully someone will install it on exactly that configuration for free.
This is not a winning business model.
Loss32 is itself a Linux distro, so there should technically be nothing stopping it from shipping everywhere.
I think you assumed I meant creating a whole kernel from scratch or something, but I'm merely asking for a loss32 reskin that looks like Windows 7. That's definitely possible without any company debugging consumer hardware, or even without a company at all, since I was proposing an open-source desktop environment that simply behaves like Windows 7 by default, as an example.
I don't really understand why we need a winning business model out of it. There isn't one for niri, Hyprland, Sway, KDE, Xfce, LXQt, GNOME, etc. either; they're all open-source projects run with the help of donations.
There might be a misunderstanding between us but I hope this clears up any misunderstanding.
Or maybe ReactOS - the actual windows clone - gets finished. Rumours put a first release date some time after Hurd.
It's a Unix underneath, though. A strange modernised Unix written in C++ but it's definitely Unix-like.
It's a Unix-like with a Win2K GUI, which is a pretty attractive combination, TBH...
It's my strong opinion that Windows 2000 Server, SP4 was the best desktop OS ever.
Source: I reviewed Cutler's lock-free data structure changes in Vista/Longhorn to find bugs in them, failed to find any.
I haven't yet gone more than a decade into the past, so I can't promise it holds forever, and GPU-accelerated things probably still break, but X11 is very backwards compatible.
Meanwhile, in 2025, with 64GB RAM and solid state drives, we hear, "Windows 11 Task Manager really, really shouldn't be eating up 15% of my CPU and take multiple seconds to fire up."
I meant to agree entirely with the parent comment by showing one specific way in which Win2K SP4 is far superior to Windows 11.
In Win2K, Task Manager takes less than a second to start on a 200 MHz, single core Pentium II with 64MB of RAM and a 5400 RPM IDE HDD.
(also Microsoft has been heavily embracing Linux and open source in the last decade)
Nowadays, with the Windows team barely able to produce a functional UI, what's happening with the NT kernel? Is it all graybeards back there? When they retire, the stability of Windows is going to be in trouble, which matters for the things that really pull in the money. It'll get real bad, then they'll give up and move to an open-source base, just like Edge.
This is something that is very much needed to make Linux much more user friendly for new users.
What boggles my mind is why Google hasn't gotten more serious about making Android a desktop OS. Pay the money needed to get good hardware support, control the OS, and now you're a Microsoft/Apple competitor for devices. Yes there is the Chromebook, but ChromeOS is not a real desktop OS, it's a toy. Google could control both the browser market and the desktop computing market if they seriously tried. (But then again that would require listening to customers and providing support, so nevermind)
> What boggles my mind is why Google hasn't gotten more serious about making Android a desktop OS.

Google is seriously working on making Android a desktop OS; Android 16 is only the first step towards it.

> Yes there is the Chromebook, but ChromeOS is not a real desktop OS, it's a toy.

ChromeOS is very much not a toy; it's pretty great if it can facilitate your work.

> But then again that would require listening to customers and providing support, so nevermind

Google has consistently provided good support for all their hardware products. Listening to customers is not their cup of tea, though.
Google is absolutely no saint. I don't like their business model, how they're closing more and more of Android, how they keep killing services, how GCP can nuke you, that they "own" web standards... But they're not all bad; they've also contributed greatly to much of the web and surrounding technologies.
ChromeOS is a better development environment than macOS in many ways. When was the last time you actually used one of these things, 2013?
What are you talking about? The majority of hardware is supported by only Linux at this point.
What are you talking about? Everything for desktops work out of the box unless you have something weird and proprietary, and even then most distros have support anyway.
This will never work, because it isn't a radical enough departure from Linux.
Linux occupies the bottom of a well in the cartesian space. Any deviation is an uphill battle. You'll die trying to reach escape velocity.
The forcing factors that pull you back down:
1. Battle-testedness. The mainstream Linux distros just have more eyeballs on them. That means your WINE-first distro (which I'll call "Lindows" in honor of the dead OS from 2003) will have bugs that make people consider abandoning the dream and going back to GNOME on Fedora.
2. Cool factor. Nobody wants to open up their riced-out Linux laptop in class and have a classmate look over and go "yo, this guy's running Windows 85!" (So, you're going to have to port XMonad to WINE. I don't make the rules!)
3. Kernel churn. People will want to run this thing on their brand-new gaming laptop. That likely means they'll need a recent kernel. And while kernel devs "never break userspace" in theory, in practice you'll need a new set of drivers, Mesa, and other add-ons that WILL break things. Especially things like 3D apps running through WINE (not to mention audio). Google can throw engineers at the problem of keeping Chromium working across graphics stacks. But can you?
If you could plant your flag in the dirt and say "we fork here" and make a radical left turn from mainline Linux, and get a cohort of kernel devs and app developers to follow you, you'd have a chance.
I might unironically use this. The Windows 2000 era desktop was light and practical.
I wonder how well it performs with modern high-resolution, high-dpi displays.
It sure was, if you were already bored by Windows 3.11/95 and were getting into Linux, it was fantastic. You were getting skills at the ground floor which could help keep you in good career for most of the rest of your life.
'unfortunate rough edges that people only tolerate because they use WINE as a last resort'
Whether those rough edges will ever be ironed out is a matter I'll leave to other people. But I love that someone is attempting this just because of the tenacity it shows. This reminds me of projects like asahi and cosmopolitan c.
Now, if we want to actually solve GNU/Linux desktops not having a stable ABI, I think one solution would be a compatibility layer like Wine's, but implementing Ubuntu's ABIs. Then, as long as an app runs on supported Ubuntu releases, it would run on any system with this layer. I just hope it wouldn't be a buggy mess like Flatpak is.
We have gone through one perceived reason after the other to try and explain why the year of the Linux desktop wasn’t this one.
Uncharitably, Linux is too busy breaking and deprecating itself to ever become more than a server OS, and even that only works because companies sponsor most of the testing and code that make those parts work. Desktop in all its forms is an unmitigated shit show.
With linux, you’re always one kernel/systemd/$sound system/desktop upgrade away from a broken system.
Personal pains: nvidia drivers, oss->alsa, alsa->pulse audio, pulse audio->pipe wire, init.d to upstart to systemd, anything dkms ever, bash to dash, gtk2 to gtk3, kde3 to kde4 (basically a decade?), gnome 2 to gnome 3, some 10 gnome 3 releases breaking plugins I relied on.
It should be blindingly obvious; windows can shove ads everywhere from the tray bar to start menu and even the damned lock screen, on enterprise editions no less, and STILL have users. This should tell you that linux is missing something.
It’s not the install barrier (it’s never been lower, corporate IT could issue linux laptops, linux on laptops exist from several vendors).
It’s also not software, the world has never placed so many core apps in the browser (even office, these days).
It’s not gaming. Though it’s telling that, in the end, the solution from Valve (Proton) incidentally solves two issues: porting (stable) Windows APIs to Linux, and packaging a complete mini-Linux because we can’t interoperate between distros or even releases of the same distro.
I think the complete and utter disdain in linux for stability from libraries through subsystems to desktop servers, ui toolkits and the very desktops themselves is the core problem. And solving through package management and the ensuing fragmentation from distros a close second.
From there, popularity outside the organization is irrelevant, internal support and userbase is for and on some version of Linux.
As this would spread, we would eventually see global usage increase and global popularity become a non-issue.
Wine and Proton should have levelled the playing field. But they haven't. Also, if you've only just started using Linux, I recommend you wait a few years before forming an opinion.
I used to be a pretty happy Windows camper (I even got through Me without much complaint), but I'm so glad I moved to Linux and KDE for my private desktops before 11 hit.
Competition. In the first half of the 90s Windows faced a lot more of it. Then they didn't, and standards slipped. Why invest in Windows when people will buy it anyway?
Upgrades. In the first half of the 90s Windows was mostly software bought by PC users directly, rather than getting it with the hardware. So, if you could make Windows 95 run in 4mb of RAM rather than 8mb of RAM, you'd make way more sales on release day. As the industry matured, this model disappeared in favor of one where users got the OS with their hardware purchase and rarely bought upgrades, then never bought them, then never even upgraded when offered them for free. This inverted the incentive to optimize because now the customer was the OEMs, not the end user. Not optimizing as aggressively naturally came out of that because the only new sales of Windows would be on new machines with the newest specs, and OEMs wanted MS to give users reasons to buy new hardware anyway.
UI testing. In the 1990s the desktop GUI paradigm was new and Apple's competitive advantage was UI quality, so Microsoft ran lots of usability studies to figure out what worked. It wasn't a cultural problem because most UI was designed by programmers who freely admitted they didn't really know what worked. The reason the start button had "Start" written on it was because of these tests. After Windows 95 the culture of usability studies disappeared, as they might imply that the professional designers didn't know what they were doing, and those designers came to compete on looks. Also it just got a lot harder to change the basic desktop UI designs anyway.
The web. When people mostly wrote Windows apps, investing in Windows itself made sense. Once everyone migrated to web apps it made much less sense. Data is no longer stored in files locally so making Explorer more powerful doesn't help, it makes more sense to simplify it. There's no longer any concept of a Windows app so adding new APIs is low ROI outside of gaming, as the only consumer is the browser. As a consequence all the people with ambition abandoned the Windows team to work on web-related stuff like Azure, where you could have actual impact. The 90s Windows/MacOS teams were full of people thinking big thoughts about how to write better software hence stuff like DCOM, OpenDoc, QuickTime, DirectMusic and so on. The overwhelming preference of developers for making websites regardless of the preferences of the users meant developing new OS ideas was a waste of time; browsers would not expose these features, so devs wouldn't use them, so apps wouldn't require them, so users would buy new computers to get access to them.
And that's why MS threw Windows away. It simply isn't a valuable asset anymore.
The answer to maintaining a highly functional and stable OS is piles and piles of backwards compatibility misery on the devs.
You want Windows 9? Sorry, some code checks the string for Windows 9 to determine if the OS is Windows 95 or 98.
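The check being described is trivially reproducible (a simplified, hypothetical sketch of the reported pattern, not any specific product's code): matching on the prefix "Windows 9" catches Windows 95 and 98, and would accidentally have caught a "Windows 9" too.

```python
# The kind of prefix check that reportedly made the name "Windows 9"
# impossible: written to match both Windows 95 and Windows 98.
def is_win9x(os_name: str) -> bool:
    return os_name.startswith("Windows 9")

for name in ("Windows 95", "Windows 98", "Windows 9", "Windows 10"):
    print(name, is_win9x(name))
```

Which is, as the comment says, the kind of backwards-compatibility misery that ends up constraining even product naming.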
A 1:1 recreation of the Windows XP or Windows 7 user experience with the classic theme would be killer.
I say this with love, I have used KDE extensively and I still find it more janky than Windows XP. Gnome is "better" (especially since v40) in that it's consistent and has a few nicer utilities, but it also has worse UX (at least for power users) than Windows XP.
I wanted to be nice and entered a genuine Windows key still in my laptop's firmware somewhere.
As a thank you Microsoft pulled dozens of the features out of my OS, including remote desktop.
As soon as these latest FSR drivers are ported over I will swap to Linux. What a racket, lol.
Perhaps that could be mitigated if someone could come up with an awesome OSS machine code translation layer like Apple's Rosetta.
Your average user might not even know it's Linux.
There were some great efforts to build these out in ReactOS a few years ago.
There is a ton of useful FOSS for Windows and maybe it is a good push to modernize abandoned projects or make Win32 projects cross-compilable.
From a quick glance at the feature lists it looks quite comparable.
Just target Windows, business as usual, and let Valve do the hard work.
But they do test their Windows games on Linux now and fix issues as needed. I read that CDProjekt does that, at least.
The idea of "fuck it, let's do Windows everywhere" was introduced by Justine Tunney as an April Fools' joke in the Cosmopolitan repository.
That's it. An April Fools' joke.
googles
Ah, no, that was FreeWin95. What on earth is Free95, it feels like history repeating itself…
Love this idea. Love where it is coming from.
The really cool thing about Win32 is it's also the world's stable ABI. There's lots of fields of software where the GNU/Linux and POSIX-y offerings available are quite limited and generally poor in quality, e.g. creative software and games. Win32 gives you access to a much larger slice of humanity's cultural inheritance.
What a pile of bullshitting.
(That and Linux doesn't implement win32 and wine doesn't exclusively run on Linux.)
If you make a piece of software today and want to package it for Linux, it's an absolute mess. I mean, look at Flatpak or Docker: a common solution is to ship your own userspace. That's just insane.
Not talking about the cross-platform versions of .NET and VS-Code. I'm specifically talking about the Windows-specific software I mentioned above.
I don't see this happening, despite the fact that by now, these types of porting efforts were supposed to be trivial because of AI. Yeah, I'll wait.