To me the biggest issue with Wayland is that it aimed, on purpose, to imitate Windows or OS X or any other GUI that is not built on the idea of a client/server protocol.
From TFA:
> I’ll also need a solution for running Emacs remotely.
If only there was something conceived from the start as a client/server display protocol...
The big thing to me is, Wayland servers have far, far less responsibility than X. X had a huge, Herculean task: doing everything the video card needed. It was a big honking display server because it took up a huge chunk of the stack needed to run a desktop.
Wayland servers all build on kernel mode setting and kernel-managed buffers, so much of the job is already done. There is a huge shared code base that Wayland benefits from that X never had: good kernels with actual drivers for GPUs.
If we wanted one stable platform that we could not innovate on, that was what it was and we all had to deal with it... we'd all just use Mac. punchyHamster is saying the Cathedral is the right model and the Bazaar the bad model, per the famous Cathedral vs. Bazaar essay.
But that model really does not enable fast iteration and broader exploration of problem spaces. The ask doesn't even make sense: there are incredibly good libraries for building Wayland servers (wlroots, smithay, more). They're not even huge, yet they cover all the core protocols. Some people really want professional, industrial-grade software that they never have to think about, that only works one way, and that will only evolve slowly and deliberately. I'm thankful as fuck the Wayland developers aren't catering to those people; I think that's the wrong abstraction for open source and the wrong attitude if we want timeless systems to be built, grown, and evolved. We should avoid critical core dependencies, so that what we send into the future isn't tied to particular code bases. That seems obvious, and proposing otherwise is to consign ourselves to small, limp fates.
But... why? How many WMs is it feasible to have? Is it really an area where we want a "wide waist"? Should a WM be its own thing? Why not just make it an extension/plugin of a desktop environment, and have only a few of the latter? A library call is more efficient and easier to maintain than an IPC API (especially given that with XWayland, X is just a dumb proxy to the compositor).
> And it shows in complexity and even in power use
You surely mean that X is more complex and has a higher power use, right?
People who are thinking of a Wayland replacement at this stage, mostly because they don't like it, will waste their time reinventing the mature parts instead of thinking about how to solve the remaining problems.
There is also a misunderstanding of the ideology the Wayland developers subscribe to. They want Wayland to be display only, but that doesn't mean they would oppose an input protocol or a window protocol. They just don't want everything to be under the Wayland umbrella like systemd.
Now, if only people deciding to replace X11 with Wayland heeded your suggestion...
So, as Douglas Adams put it: Somebody Else's Problem.
It's 2026 and you will still run into plenty of issues with random behaviour, especially if you run anything based on wlroots. Wine apps will randomly have pointer-location issues if you run multiple displays. Crashes, video-sharing issues with random apps, 10-bit color issues. Maybe in 2027 we'll finally make it. But I feel like these 20 years of development could have been better spent on something that doesn't end up with four or more implementations.
I've noticed it's far, far more work to build a WM for Wayland than it is for Xorg.
Because both have their own portal implementation/compositor with their own issues and service spec implementations. KDE has xdg-desktop-portal-kde, and GNOME has xdg-desktop-portal-gnome. On top of that each (still) has their own display server; KDE has KWin, and GNOME has Mutter.
> The reference compositor, Weston, is not really usable as a daily driver.
Weston is probably good for two things: Running things in Kiosk mode and showcasing how to build a compositor.
That's why you should at least run xdg-desktop-portal if you are not running KDE or GNOME. But on its own it is only the frontend (without implementations of any of the freedesktop portal interfaces), and as-is it has no knowledge of things like screenshots or screen sharing.
If you run any wlroots-based compositor except Hyprland, you should run xdg-desktop-portal-wlr, which implements the portal interfaces org.freedesktop.impl.portal.Screenshot and org.freedesktop.impl.portal.ScreenCast.
If you use Hyprland you should instead run its fork, xdg-desktop-portal-hyprland, which additionally has things like file picking built in. On top of that you can/should run xdg-desktop-portal-gtk and/or xdg-desktop-portal-kde to get the GTK ("GNOME") and Qt ("KDE") specific implementations of the portal interfaces, respectively. And you absolutely should use xdg-desktop-portal-gtk instead of xdg-desktop-portal-gnome, because xdg-desktop-portal-gnome really doesn't like to share with others.
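For what it's worth, which backend handles which interface can also be pinned explicitly instead of letting the frontend guess. A sketch of ~/.config/xdg-desktop-portal/portals.conf for a wlroots setup (this file is honored by xdg-desktop-portal 1.17 and later; the exact values here are illustrative):

```ini
[preferred]
# Fall back to the GTK backend for everything not listed below.
default=gtk
# Route screenshots and screencasts to the wlroots backend.
org.freedesktop.impl.portal.Screenshot=wlr
org.freedesktop.impl.portal.ScreenCast=wlr
```

That avoids the nondeterminism you can get when several -gtk/-kde/-wlr backends are installed side by side.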
> With Wayland each desktop is reinventing the wheel
Not really true. As I mentioned earlier, there's still a DE-specific display server running in the background (just as Mutter and KWin-X11 were for X11), and graphics in each compositor are driven directly by the graphics driver in the kernel (through KMS/DRM).
In fact, on paper and in theory, the architecture looks really good: https://wayland.freedesktop.org/architecture.html. However, in practice, some pretty big chunks of functionality at the protocol level are missing, but the freedesktop contributors and the GNOME and KDE teams will get there eventually.
Outside of Firefox/Chromium, screencasting is much more seamless. But 90% of screen sharing happens in browsers.
There isn't any technical reason we couldn't have a single standardized library, at the abstraction level of wlroots.
This should ideally be solved at the kernel level, and I would argue it is solved there. Linux has the DRM abstraction for this very reason. Wayland builds directly on top of that abstraction, while Xorg sat in a strange position, doing work in userspace that belongs in the kernel.
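To make that concrete, here's a minimal Python sketch (assuming a Linux box; on machines without a GPU the lists are simply empty): the kernel exposes DRM as plain device nodes, and both Wayland compositors and modern Xorg talk to those nodes rather than programming hardware directly.

```python
import glob

# KMS/primary nodes: one per GPU, used by the compositor for
# modesetting and page flipping.
cards = sorted(glob.glob("/dev/dri/card*"))

# Render nodes: unprivileged access for clients doing GPU rendering.
render_nodes = sorted(glob.glob("/dev/dri/renderD*"))

print("KMS nodes:   ", cards)
print("render nodes:", render_nodes)
```

Everything hardware-specific lives behind those nodes in the kernel driver, which is exactly the shared code base the Wayland design leans on.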
You don't always have to replace something that works with something that doesn't but is "modern."
My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
OTOH, there are a lot of reasons for projects like GNOME and KDE to want to switch to Wayland, and especially to drop X11 support rather than maintain it indefinitely. So it helps if we can at least get a handle on which issues are still holding things up, which is why efforts like the ones outlined in this blog post are so important: it's hard to fix bugs that are never reported. And I doubt NVIDIA has been going out of their way to find such bugs, so I can only imagine the reports are pretty crucial for them.
So basically, this year the "only downsides" users need to at least get to "no downsides". The impetus for Wayland itself mainly hinges on features that can simply be done better in a compositor-centric world, but the impetus for the great switchover is reducing the maintenance burden of supporting both X11 and Wayland forever, everywhere. (Support for X11 apps via XWayland, though, should basically exist forever, of course.)
I don't get why X11 shouldn't work forever. It works today. As you said, there's no obvious reason for an end user to switch to Wayland if there aren't any particular problems with their current setup. "Because it's modern" and "because it's outdated" just aren't compelling reasons for anyone besides software developers. And "because we're going to drop support so you have to switch eventually" is an attitude I'd expect out of Apple, not Linux distributions.
But, I am stuck on Xorg only because of one app that I have to use to work.
> My guess is that we'll only start seeing Wayland adoption when distributions start forcing it or making it a strong default, like what happened with systemd.
This is already happening. To my knowledge, Arch Linux and Ubuntu have already switched to GNOME 49, which does not support X without recompilation. So most likely any distro shipping GNOME 49 or later will not provide Xorg by default. KDE is also going to do it soon.
Xorg is going away pretty soon.
I believe it's a step in the right direction; the only issue is a few annoying apps holding us back.
It doesn't really matter if you like or dislike wayland, the major DE have decided they don't like X11 and they are making the switch to wayland. X11 code is actively being removed from these desktop environments.
If you want to use X11, you can either stay on an old unmaintained DE or switch to a smaller one that supports X11. But you should realize that with wayland being the thing major DEs are targeting, your experience with X11 will likely degrade with time.
Does XWayland help?
I don't know what to do. The outpouring of negative energy is so severe. But I think it's so incredibly unrepresentative, so misleading. The silent-majority problem is so real. Come to the better place.
Right now with X11, IIRC, if an application has access to your display, it can read what is going on in other applications running on the same display (any client can snoop keystrokes or capture the whole screen, for instance).
If browser tabs were able to do that, all hell would break loose. So why do we accept it from applications?
Anyway, despite this, I still use X11 instead of Wayland because of all the shortcomings.
Because I don't run random untrusted apps all the time. Whereas I do visit random untrusted websites all the time.
Windows:
- Per-display scaling can be set to a multiple of 25% (you can set more precise scaling ONLY if you apply it to all displays)
- Windows across monitors with different scaling look weird
- Mouse cursor movement across monitors is based on screen pixels rather than scaled coordinates, which is inaccurate to physical distances when using monitors with different DPI
- Under the hood, all monitors are handled as one rectangular logical surface, which renders a lot of unnecessary pixels in any multi-monitor setup where the monitors don't form a perfect rectangle
- All monitors use the underlying refresh rate of the primary monitor for everything other than the mouse cursor (for example, if you have a secondary 60Hz monitor, timings on it will not be smooth unless your primary monitor's refresh rate is a multiple of 60)
Wayland:
- Per-display scaling can be set to a multiple of 5%
- Windows do not span across monitors. This can be a downside if you want to span a window across multiple monitors with the same scaling, but is mostly an upside
- Mouse movement across screen boundaries is based on actual scaled distance, so you can tune it perfectly to the physical screen distances
- Each monitor can have its own refresh rate, and windows in each monitor actually update at the refresh rate of that monitor
- Each monitor is logically separate, no unnecessary pixels rendered
MacOS:
- Similar to Wayland but per-display scaling is much more restrictive for external displays, sometimes there isn't any way to set a scale between 100% and 200% without blurring the screen
- Apple Silicon hardware also limits number of total monitors supported, so it is impossible to use big multi-monitor setups on all but the most expensive hardware
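The cursor-mapping difference in the lists above is easiest to see with numbers. A hedged Python sketch, with made-up geometry: two same-height monitors side by side, A at 1920x1080/scale 1.0 and B at 3840x2160/scale 2.0. Both models are simplified for illustration.

```python
# Both monitors are 1080 logical units tall (1080 px / 1.0 and 2160 px / 2.0),
# so "halfway down A" should land "halfway down B" when the cursor crosses.

def crossing_logical(y_logical: float) -> float:
    """Scaled/logical model (Wayland-style): the logical coordinate is
    preserved across the boundary, so physical height is preserved too."""
    return y_logical

def crossing_pixels(y_pixels: float, scale_b: float) -> float:
    """Raw-pixel model (the Windows behavior described above): the pixel
    coordinate is carried over and reinterpreted at B's pixel density.
    Returns the resulting logical height on B."""
    return y_pixels / scale_b

# Cursor leaves monitor A halfway down its right edge (y = 540 of 1080):
assert crossing_logical(540) == 540        # still halfway down monitor B
assert crossing_pixels(540, 2.0) == 270.0  # only a quarter of the way down B
```

In the pixel model the cursor visibly "jumps" vertically when crossing between monitors of different DPI; in the logical model it tracks physical distance.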
Well, I will be honest, I have had enough "edit this xorg.conf files to boot to a black screen" for a lifetime, so that's not the rebuttal you think it is.
If anything, the (GNU/)Linux desktop has certainly matured over the years, and on well-selected hardware it more often than not "just works" nowadays, which is certainly not something you could say before.
This is a common mischaracterization of what happened. The API in question, GBM, was a Mesa-specific API. Nvidia couldn't add GBM to their own driver, as it was a Mesa concept. So instead Nvidia tried to make a vendor-neutral solution that any graphics driver could use, which is where EGLStreams comes into the picture. Such an EGL API was also useful for other, non-Wayland embedded use cases. As for the proprietary Nvidia driver's GBM support: Nvidia themselves had to add support to the Mesa project for dynamically loading backends that weren't precompiled into Mesa. Only then were they able to make their own backend.
For some reason when this comes up people always phrase it in terms of Nvidia not supporting something instead of the freedesktop people not offering a way for the Nvidia driver to work, which is a prerequisite of Nvidia following such guidance.
Even today if you use the API your program has to link to Mesa's libgbm.so as opposed to linking to a library provided by the graphics driver like libEGL.so.
Arguably my hardware is a lot simpler and I don't use Nvidia. But I just want to point out that, for all the flak wayland receives, it can work quite well.
I'm having more issues with games/websites/programs that didn't take high display refresh rate into account, than Wayland, at this point.
You will also note that many items in the post above are papercuts that might go unnoticed, like input feeling a little worse, or font issues.
Wayland fixes that, so that part is a huge improvement to me. Unfortunately this also limited my choice of distros, as not all of them use Wayland. I landed on Ubuntu again, despite some issues I have with it. The most annoying initially was that the Snap version of Firefox didn't use hardware acceleration, which made it just barely usable.
I don't entirely love MacOS (mostly because I can't run it on my desktop, lol). But it does fractional scaling so well, I always choose the "looks like 1440p" scaling on 4K resolution, and literally every app looks perfect and consistent and I don't notice any performance impact.
On Windows it's the same thing, except some things are blurry.
On Linux, yeah, I just have to bear huge UI (2x scaling) or tiny UI (1x), or live with a noticeable performance delay that's just too painful to work with.
But I'm also running all AMD hardware, that may be a factor. Life is too short for nvidia bullshit on Linux.
I switched to get support for different scaling on different outputs and I have gone back.
So much NVidia hate, but in 23 years the only problems I've had with NVidia on Linux were when they dropped support for old GPUs. Even on proprietary hardware like iMacs and MacBooks.
But to each their own.
The better path on Linux was always AMD, and still is, to this day, since it simply works without me needing to care about driver versions, or open vs closed source, at all.
>So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
Kudos to Michael for even attempting it. Personally, nowadays, unless my working stack stops, well, working, or there are significant benefits to be found, I don't really feel like even putting in the effort to try the shiny new things out.
And for taking the time to thoroughly document real issues.
I had an old Chromebook which had Lubuntu on it - screen tearing was driving me crazy so I switched to Wayland and it is buttery smooth. No mean feat given the decrepit hardware.
I'm sure someone will be along to tell me that I'm wrong - but I've yet to experience any downsides, other than people telling me I'm wrong.
That's fine as long as it goes both ways. If Wayland works for you, great. Equally, for some of us it doesn't work.
I feel like the biggest issue for Wayland is the long tail of people using alternative WMs. A lot of those projects don't have manpower to do what amounts to a complete rewrite.
I honestly don't have a preference between Wayland and X, but I feel very strongly about keeping my current WM. XWayland supposedly works, but I'm not in any hurry to add an extra piece of software and extra layer of configuration for something I already have working exactly the way I want. If Wayland offered some amazing advantages over X, it might be different, but I haven't seen anything to win me over.
Looking at your github, it seems you use StumpWM. It seems they are also working on a wayland version under the name Mahogany. Development seems pretty active: https://github.com/stumpwm/mahogany
> I'll probably switch, grudgingly, to XWayland whenever X gets removed from Debian.
FWIW I think "wayback" is the project for this. It seems to be trying to use XWayland to run full X11 desktop environments on top of Wayland: https://gitlab.freedesktop.org/wayback/wayback
Don’t know what the deal is with Linux desktop experience. I have encountered various forms of perfection and had them taken away.
Once on my XPS M1330 I clicked to lift a window and then three finger swiped to switch workspace and the workspace switched and I dropped the window. It was beautiful. I didn’t even notice until after I’d done it what an intuitive thing it felt like.
Then a few years later I tried with that fond memory and it didn’t work. Where did the magic go?
Probably some accidental confluence of features broken in some change.
I have a 7,000 word blog post and demo videos coming out this Tuesday with the details but I think I uncovered a driver bug having switched to native Linux a week ago with a low GPU memory card (750 Ti).
Basically, on Wayland, apps that request GPU memory will typically crash if there's no more GPU memory to allocate, whereas on X11 those requests are transparently offloaded to system memory, so you can open up as much as you want (within reason) and the system stays completely usable.
In practice this means opening a few hardware-accelerated apps in Wayland, like Firefox and most terminals, will likely crash your compositor, or at the very least crash those apps. It can crash or destabilize your compositor because if the compositor itself gets an error allocating GPU memory to spawn a window, it will do whatever weird thing it was programmed to do in that scenario.
I reported it here: https://github.com/NVIDIA/egl-wayland/issues/185
Some end users on the NVIDIA developer forums looked into it and determined it's likely a problem for everyone; it's just less noticeable if you have more GPU memory, and especially if you reboot daily, since that clears the GPU memory leaks that are also apparent in a lot of Wayland compositors.
There's just no way to make that make sense.
It's absolutely an essential characteristic for long-term survival and long-term excellence: not being married to one specific implementation forever.
Especially in open source! What is the organizational model for this authoritarian path? How are you going to, as Wayland successfully has, get every display-server person on board? Who would have the say on what goes into The Wayland Server? What would the rules be?
Wayland is the only thing that makes any sense at all. A group of peers, fellow implementers, each striving for better, who come together to define protocols. This is what made the internet amazing, what made the web the most successful media platform, and what creates the possibility for ongoing excellence. Not being bound to fixed decisions is an option most smart companies lust after, but somehow, when Wayland vs. X comes up, everyone super wants there to be one and only one path, set forth three decades ago, that no one can ever really overhaul or redo.
It's so unclear to me how people can be so negative, so short, and so mean about Wayland. There is no viable organizational model for an authoritarian display server. And if somehow you did get people signed up for this fantasy, there's such a load of pretense that it would have cured all ills. I don't get it.
So, good or bad idea, Wayland is slowly becoming the default by virtue of being the most actively maintained, up-to-date stack.
What used to be maintained in one codebase by the Xorg devs is now duplicated in at least three major compositors, each with their own portal implementation and who knows what else. And those are primarily maintained by desktop environment devs who also have the whole rest of the DE to worry about.
Actually, GPU acceleration was why I initially switched. For whatever reason, this GPU (Radeon VII) crashes regularly under X11 nearly every time I open a new window, but is perfectly stable under wayland. Really frustrating! So, I had some encouragement, and I was waiting for plasma-wayland to stabilize enough to try it properly. I still have the X11 environment installed as a fallback, just in case, but I haven't needed to actually use it for months.
Minor pain points so far mostly include mouse acceleration curves being different and screen capture being slightly more annoying. Most programs do this OS-level popup, and then so many follow that up with their own rectangle-select tool after I already did that. I had some issues with sdl2-compat as well, but I'm not sure that was strictly Wayland's fault, and it cleared up on its own after a round of updates. (I develop an SDL2 game that needs pretty low-latency audio sync to run smoothly.)
I use it extensively, it's easy to use, UI is compact but clear, works perfectly all the time. I honestly don't care that it is unmaintained at this point.
FWIW, I have a KDE Wayland box and OBS works for screen recording. Slightly more complex than simplescreenrecorder, but not bad.
At some point I'll get irritated enough to seek out more alternatives and give them a whirl. Such is fate :)
I know that it wasn't originally conceived to do what it does today, but I've never had any problem using it, and when I tried Wayland I didn't notice any difference whatsoever.
Is it just that it's a pain to write apps for it..?
https://www.youtube.com/watch?v=GWQh_DmDLKQ
https://people.freedesktop.org/~daniels/lca2013-wayland-x11....
It makes sandboxing security impossible. The moment a process has access to the Xorg socket, it has access to everything. It is weird that this is oftentimes missing from the discussion, though.
May I suggest Xephyr, which will give you the X11 sandboxing that wayland people like to claim is impossible under X, using tech that's been around for about 20 years.
And you won't even need to replace your entire software stack with incompatible beta-quality software.
Other way around: Maintaining Xorg itself is awful.
For example, there's tons of legacy cruft in there intended for working with hardware that hasn't been in use since circa 1992: things like monochrome 3D displays with weird resolutions like 1200x240 and non-square pixels. Having that stuff in there makes supporting modern hardware more difficult than it needs to be (it's also part of the reason why, e.g., eliminating tearing is very difficult), and it adds huge complexity to the codebase for no benefit on modern systems, which makes it much more difficult (but NOT impossible, as some love to claim) to maintain.
There's also the Wayland fanboys' go-to criticism: some security shortcomings in the protocol. You can find details on this shortcoming, which I have never in 30 years seen exploited, in the opening paragraphs of every pro-Wayland article on the internet. (It is a legit shortcoming. There have been multiple suggestions over the decades on how to address it without starting over from scratch; XLibre is working on one of these.)
But over the years I've slowly become more and more convinced that the biggest issue people have with X is that it's not shiny and new.
I'm expecting them to announce a rewrite in rust any day now ;)
Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026. If you are a Linux on desktop advocate, read the comments and see why so many are still hesitating.
>Windows and Mac Os, for all their faults, are unquestionably ready to use in 2026.
Quite ironically, there are people refusing to leave Windows 7, which has been EOL since 2020, because they find the modern Windows UI unbearable. Windows 11 is considered so bad that people are actually switching OSes because of it. I've seen similar comments about OS X/macOS.
The big difference between those and Linux is that Linux users have a choice: to reject forced "upgrades" and build very personalized environments. If I had to live with Wayland I could do it, really, even if there are issues, but since my current environment is fine I don't really need/care to. And it's precisely with a personalized environment that such a change is a chore. If I were using a comprehensive desktop environment like GNOME (as many people do), maybe I wouldn't even notice something had changed underneath.
LOL
I installed a fresh Windows 11 yesterday on a fairly powerful machine, and everything lags so much on a brand-new install it's unreal. Explorer takes ~2-3 seconds to become usable. Any app that opens in the blink of an eye under Linux on the same machine takes seconds to start. The Start menu lags. It's just surreal. People who say these things work have just never used something that is actually fast.
Linux is faster in some places, maybe. But it still has many issues, like some applications not being drawn properly, or some applications just not being available (a nice GUI for monitor control over DDC, say).
Everything actually feels significantly more solid/stable/reliable than modern Windows does. I can install updates at my own pace and without worrying that they'll add an advert for Candy Crush to my start menu.
I also run Bazzite-deck on an old AMD APU minipc as a light gaming HTPC. Again, it's a much better experience than my past attempts to run Windows on an HTPC.
As with everything, the people having issues will naturally be heard louder than the people who just use it daily without issues.
For me, Wayland seems to work OK right now, but only since the very latest Ubuntu release. I'm hoping at this point we can stop switching to exciting new audio / graphics / init systems for a while, but I might be naive.
Edit: I guess replacing coreutils is Ubuntu's latest effort to keep things spicy, but I haven't seen any issues with that yet.
Edit 2: I just had the dispiriting thought that it's about twenty years since I first used Ubuntu. At that point it all seemed tantalizingly close to being "ready for primetime". You often had to edit config files to get stuff working, and there were frustrating deficits in the application space, but the "desktop" felt fine, with X11, ALSA, SysV, etc. Two decades on, we're on the cusp of having a reliable graphics stack.
I feel the same, and find it a bit strange. I've been happy with Hyprland on Wayland for a few months now, but somehow it reminds me of running Enlightenment or AfterStep in the '90s. My younger self would have expected at least a decade of "this is how the UI works on Linux, and it's great" by now.
Docker and node both got started after wayland and they are mature enterprise staples. What makes wayland such a tricky problem?
But then I try and focus on what each author thinks is important to them and it’s often wildly different than what’s important to me.
But a lot of internet discussion turns into very ego-centric debate, including on here, where a lot of folks who are very gung-ho on the adoption of something (let's say Linux, but it could be anything) don't adequately try to understand that people have different needs, and push the idea of adoption very hard in the hope that once you're over the hump you might not care about what you lost.
But with Linux being mostly hobbyist-friendly a number of folks have custom setups and do not want to be forced into the standardized mold for the sake of making it super smooth to transition from Windows.
I have such a setup (FVWM with customized key bindings and a virtual layout that I like, which cannot work under Wayland), so can I donate some money to Microsoft to keep Windows users less grumpy and avoid bringing yet another Eternal September to Linux? I like my Xorg, thank you very much :).
Windows uses _you_.
This stuff has been flawless on AMD systems for a couple of years now, with the exception of the occasional archaic app that only runs on X11 (and thus gets shoved in a container).
Hopefully AnyDesk and Remmina will address this issue before KDE ends its mainline X11 support next year.
It’d be very handy if we had a performant remote desktop option for Linux. I could resume desktop sessions on my workstation from my laptop and I could pair program with remote colleagues more effectively.
In the past I’d boot into Windows and then boot my Linux system as a raw disk VM just so I could use Windows’s Remote Desktop. Combined with VMware Workstation’s support for multiple monitors, I had a surprisingly smooth remote session. But, it was a lot of ceremony.
Good news: My laptop (Lenovo P53) can now suspend / resume successfully. With Ubuntu 25.04 / Wayland it wouldn't resume successfully, which was a deal breaker.
Annoying thing: I had a script that organized workspaces using wmctrl, which doesn't work anymore, so I had to write a GNOME Shell extension. Which (as somebody who's never written a GNOME Shell extension before) was quite annoying, as I had to keep logging out and in to test it. I got it working eventually, but am still grumpy about it.
Overall: From my point of view as a user, the switch to Wayland has wasted a lot of my time and I see no visible benefits. But, it seems to basically work now and it seems like it's probably the way things are headed.
Edit: Actually I've seen some gnome crashes that I think happen when I have mpv running, but I can't say for sure if that's down to Wayland.
This post is a lot more relatable.
As an aside, regarding remote Emacs - I can attest that Waypipe does indeed work fantastically for this. Better than X11 ever worked over the network for me.
I, too, suffer from the "pgtk is slow" issue (only on a 4K monitor, though, so it's mitigable and manageable for me).
I imagine there's a "real" Hyper modifier, but I haven't attempted to use it. For my part, since I use GUI Emacs, I find that I have enough mappings without it (I'm also a dirty evil user). A friend makes extensive use of it because he primarily uses Emacs in a terminal, and Hyper avoids all the other terminal keybinding conflicts he might otherwise run into. But he uses X11 too, so no PGTK Emacs even if/when he does run GUI Emacs.
I'll try to dig into this some though and see if I can (a) determine a way to map a "true" hyper to my keyboard and (b) use it in PGTK Emacs and follow up with you.
You might want to give wayshot a "shot"? https://github.com/waycrate/wayshot
Maybe in another decade or so.
After an Nvidia graphics driver release everything cleared up to be very usable (though occasionally stuff still crashed, like once or twice a week). I heavily dislike Nvidia and went with AMD just around a month ago, zero issues.
I'm curious to hear about what hardware you have.
1) Hugely enjoyable content - as usual - by Michael Stapelberg: relevant, detailed, organized, well written.
2) I am also an X11 + i3 user (and huge thanks to Michael for writing i3, I'm soooo fast with it), and I also keep trying Wayland on a regular basis because I don't want to get stuck using deprecated software.
I am very, very happy to read this article, if only because it proves I'm not the only one and probably not crazy.
Same experience he has: every time I try Wayland... an unending succession of weird glitches and things that plain old don't work.
Verdict: UNUSABLE.
I am going to reiterate something I've said on HN many times: the fact that X11 has design flaws is a well-understood and acknowledged fact.
So is the fact that a new solution is needed.
BUT, the fact that Wayland calls itself the new shiny that is supposed to be that solution DOES NOT AUTOMATICALLY MEAN it actually managed to solve the problem.
As a matter of fact, in my book, after so many years, they completely and utterly failed, and they should rethink the whole thing from scratch.
And certainly not claim they're the replacement until they have reached feature and ease of use parity.
Which they haven't as Michael's article clearly points out.
So you've managed to get redhat to commit to continue to package an X server, then? I'm impressed by this achievement and send my thanks for your efforts - they've been trying to drop it for a while now, it's only wayland's continuing status as unusable vapourware that stops them.
I'm very happy with Wayland, but what a strange comparison to make if you're not. IPv6 is objectively an enormous improvement over IPv4, and the only gripe with it is that it's still not ubiquitous.
However, my comparison is end-user focused (ie. the Linux desktop experience). I should have been more clear about the scope perhaps.
Both IPv6 and Wayland have increased complexity and surface area for pain (cost) without an obvious benefit for the end-user.
Also: wrt IPv6 specifically, I don’t believe every device on a private network should be publicly addressable/routable. To me that’s a bug, not a feature, and again does not serve the consumer, only the producer.
I screen share and video call with Slack and Google Meet.
I use alacritty/zsh/tmux as my terminal. I use chromium as my browser, vscode and sublime text as code editors.
Slack, Spotify, my studio mic, my Scarlett 2i2, 10gbe networking, thunderbolt, Logitech unifying receiver…. Literally everything “just works” and has been a joy to use.
The only issues I've ever faced have been forcing an app to run native Wayland instead of XWayland (varies from app to app, but usually a CLI flag is needed) and Bluetooth pairing with my Sony noise-cancelling headphones, which is unrelated to Wayland. Periodically I get into a dance where they won't pair, but most of the time they pair fine.
One of the obstacles I faced was a wrong resolution. On Xorg I could just add a new mode and get up and running quickly. On Wayland, I have to either make EDID changes or go through something even worse.
I'd be investigating that issue instead, should have errors in systemd/journalctl or whatever you use for managing daemons. I'm using ydotool on Arch, pretty much all defaults, together with a homegrown voice dictation thing, and it's working 100% of the times.
It works nicely for me, but since it uses /dev/input/eventX, I don't know if it's consistent across reboots for hardcoded scripts (although so far it has worked without issues).
For instance, a compositor may not support a clipboard, so the "data"-related interfaces must be queried for availability (those interfaces are stable in core) and the client must disable such functionality if they are not there (for instance, the wterm terminal is faulty because it forces a compositor to have such interfaces... but the havoc terminal does it right). I don't know yet whether libSDL3's Wayland support "behaves" properly. The wterm fix is boring but should be easy.
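The "query before use" pattern described above can be sketched roughly like this: a client collects the interface names the compositor advertises, enables optional features only when the matching global exists, and degrades gracefully otherwise. The interface names are real core-protocol globals, but the `Client` class itself is purely illustrative, not a real libwayland binding.

```python
# Illustrative sketch: enable clipboard support only if the compositor
# advertises the corresponding global, instead of assuming it exists
# (the wterm bug mentioned above).

REQUIRED = {"wl_compositor", "wl_shm"}  # hard requirements for drawing
OPTIONAL = {"wl_data_device_manager"}   # clipboard / drag-and-drop

class Client:
    def __init__(self, advertised_globals):
        missing = REQUIRED - set(advertised_globals)
        if missing:
            raise RuntimeError(f"compositor lacks core globals: {missing}")
        # Degrade gracefully: clipboard is switched off when the
        # compositor never advertises the data-device interfaces.
        self.clipboard_enabled = "wl_data_device_manager" in advertised_globals

full = Client(["wl_compositor", "wl_shm", "wl_data_device_manager"])
minimal = Client(["wl_compositor", "wl_shm"])
print(full.clipboard_enabled, minimal.clipboard_enabled)  # True False
```

In real code the equivalent check happens in the `wl_registry` global listener, where the compositor announces each interface by name before the client binds to it.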
As for Wayland usage, it is probably almost everywhere (and Xwayland is there for some level of legacy compatibility).
(I am currently writing my own compositor for AMD GPUs... in RISC-V assembly running on x86_64 via an interpreter)
No. Wayland still has hilariously terrible bugs and can't adequately do very basic things we have been doing for ~30 years on X, even in 2026, and because it's not compatible with anything it requires ditching all your stable, well-tested, solid software that you've been using for decades to replace it with incompatible software that doesn't work half the time.
To summarise the summary: it's in pretty much the same place as it was in 2025. And 2020. And 2016.
Turns out: absolutely no problem. The tooling around Wayland and adjacent programs is way more modern and functional. Great experience. Can recommend switching in 2026.
For me, the unintuitive truth with Linux is: once you’re on a rolling release, most problems just vanish. I still love Debian for prod, but for experimenting and development, rolling release any day.
Sounds like someone made a listener for key events but didn't bother to check the state of the event, meaning it fires on releases as well. Should be easy to verify by keeping a key pressed long enough to trigger the key-repeat events.
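The suspected bug and its fix can be sketched as follows. The values follow the Linux evdev convention (0 = release, 1 = press, 2 = autorepeat); the event tuples here are hand-made stand-ins, not a real input library.

```python
# Sketch: handle a key action only on press, ignoring release and
# autorepeat events. The buggy listener described above would fire
# unconditionally, so it also triggered on releases.

KEY_RELEASE, KEY_PRESS, KEY_REPEAT = 0, 1, 2

def on_key_event(keycode, value, actions):
    # Checking `value` means holding the key down (repeat) or
    # releasing it triggers nothing.
    if value == KEY_PRESS:
        actions.append(keycode)

actions = []
for keycode, value in [(30, KEY_PRESS), (30, KEY_REPEAT),
                       (30, KEY_REPEAT), (30, KEY_RELEASE)]:
    on_key_event(keycode, value, actions)
print(actions)  # one key held down and released fires exactly once: [30]
```

Holding the key long enough to generate repeats, as suggested above, would make the unfiltered version fire several extra times and expose the missing check.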
> I also noticed that font rendering is different between X11 and Wayland! The difference is visible in Chrome browser tab titles and the URL bar, for example:
Tab title display is not owned by Wayland unless you are running with the client-side decoration extension, which GNOME is not. So the application or the GUI framework (GTK in this case) are really the only two places to look.
https://forums.tomshardware.com/threads/nvidias-name-change....
When I got to know their products, they were nVidia.
"The name of this corporation is NVIDIA Corporation." - 1995 amendment.
Is this specific to the WM he used or does HW acceleration straight up not work in browsers under Wayland? That to me seems like a complete deal breaker.
https://forum.endeavouros.com/t/linux-des-resource-usage-com...
And is it possible to get fullscreen but within a container (e.g. get rid of the browser GUI to see more in a small container)?
The Chrome crashes when resizing a window don't make any sense, other than being a WM fault. The Xwayland scaling, again: GNOME has native scaling support. Same for the monitor resolution problem (which he acknowledged). Same for font rendering. Idk.
> By the way, when I mentioned that GNOME successfully configures the native resolution, that doesn’t mean the monitor is usable with GNOME! While GNOME supports tiled displays, the updates of individual tiles are not synchronized, so you see heavy tearing in the middle of the screen, much worse than anything I have ever observed under X11. GNOME/mutter merge request !4822 should hopefully address this.
I mean, it works a lot better than it did before; still, I wouldn't recommend it for someone who isn't ready to tinker in order to make stuff work.
The reason I mention this: while most normal desktop/coding stuff works okay with Wayland, as soon as I try any gaming it's just a sh*tshow. From stuff that doesn't even start (but works when I run it on X) to heavily increased performance demands from games that run a lot smoother on X.
While I have no personal attachment to either, and I couldn't technically care less which of them to use: if you are into gaming, at least in my experience, X is right now still the more stable solution.
Now all my computers are worse, and there's absolutely nothing I can do.
At least it's better than Windows I guess.
"Wayland is the successor to the X server "
Wayland is primarily a protocol, but most definitely not a "successor" to the xorg-server. This is why it does not have - and will never have - the same feature set. So trying to sell it as "the new shiny thing" after almost 20 (!!!!!) years is simply wrong. One should instead point out that Wayland is a separate way to handle a display server / graphics. There are different trade-offs.
> but for the last 18 years (!), Wayland was never usable on my computers
I can relate to this a bit, but last year, or perhaps even the year before, I used Wayland via Plasma on Manjaro. It had various issues, but it kind of worked, even on nvidia (using the proprietary driver; for some reason the open-source variant nouveau works less well on my current system). So I think Wayland was already usable even before 2025, even on problematic computer systems.
> I don’t want to be stuck on deprecated software
I don't want to be stuck on software that insinuates it is the future when it really is not.
> With nVidia graphics cards, which are the only cards that support my 8K monitor, Wayland would either not work at all or exhibit heavy graphics glitches and crashes.
I have a similar problem. Not with regards to a 8K monitor, but my ultra-widescreen monitor also has tons of issues when it comes to nvidia. I am also getting kind of tired of nvidia refusing to fix issues. They are cheap, granted, but I'd love viable alternatives. It seems we have a virtual monopoly situation here. That's not good.
> So the pressure to switch to Wayland is mounting!
What pressure? I don't feel any pressure. Distributions that would only support wayland I would not use anyway; I am not depending on that, though, as I compile everything from source using a set of ruby scripts. And that actually works, too. (Bootstrapping via existing distributions is easier and faster though. As stated, trade-offs everywhere.)
> The reason behind this behavior is that wlroots does not support the TILE property (issue #1580 from 2019).
This has also been my impression. The wayland specific things such as wlroots, but also other things, just flat out suck. There are so many things that suck with this regard - and on top of that, barely any real choice on wayland. Wayland seems to have dumbed down the whole ecosystem. After 20 years, having such a situation is shameful. That's the future? I am terrified of that future.
> During 2025, I switched all my computers to NixOS. Its declarative approach is really nice for doing such tests, because you can reliably restore your system to an earlier version.
I don't use NixOS myself, but being able to have determined system states that work and are guaranteed to work, kind of extends the reproducible builds situation. It's quite cool. I think all systems should incorporate that approach. Imagine you'd no longer need StackOverflow because people in the NixOS sphere solved all those problems already and you could just jump from guaranteed snapshot to another one that is guaranteed to also work. That's kind of a cool idea.
The thing I dislike about NixOS the most is ... nix. But I guess that is hard to change now. Every good idea gets ruined by the horrible joke of an underperforming programming language ...
> So from my perspective, switching from this existing, flawlessly working stack (for me) to Sway only brings downsides.
I had a similar impression. I guess things will improve, but right now I feel as if I lose too much for "this is now the only future". And I don't trust the wayland-promo devs anymore either - too much promo, too few results. After 20 years guys ...
There's Nickel, if it's only about the language, and Guix (Guile Scheme) which goes beyond just the language.
I don’t get the hate for Nix, honestly. (I don’t get the complaints that it’s difficult, either, but I’m guessing you’re not making one here. I do get the complaint that the standard library is a joke, but you’re not making that one either that I can see.) The derivation and flake stuff excepted, Nix is essentially the minimal way to add lazy functions to JSON, plus a couple of syntax tweaks. The only performance-related thing you could vary here is the laziness, and it’s essential to the design of Nixpkgs and especially NixOS (the only config generator I know that doesn’t suck).
I’ll grant that the application of Nix to Nixpkgs is not in any reasonable sense fast, but it looks like a large part of that is fairly inherent to the problem: you’ve got a humongous blob of code that you’re going to (lazily and in part) evaluate once. That’s not really something typical dynamic-language optimization techniques excel at, whatever the language.
There’s still probably at least an order of magnitude to be had compared to mainline Nix the implementation, like in every codebase that hasn’t undergone a concerted effort to not lose performance for stupid reasons, but there isn’t much I can find to blame Nix the language for.