What really frustrates me about Wayland is we're replacing an old system with lots of structural problems with a new system with lots of structural problems, and when it comes down to it X and Wayland are problemy for the same reason: They both farm out way too much core functionality to extensions, leaving a fragmented vendor ecosystem and disjointed developer experience. We got to do it all over and we made the same mistake! We don't even get to trade up to a DIFFERENT fundamental flaw!
In the current twilight moment, the proposition with Wayland is that Linux gives up built-in GUI network streaming for the supposed benefit of a system that structurally works better on "unusual" display devices such as phones and VR helmets. But it seems unlikely we'll ever get to USE that benefit, because the hardware you're hoping to target there is hard-locked into Android. Not Linux: Android. Unless you can appropriate Android GPU drivers or build Wayland atop SurfaceFlinger, who cares?
Like I think we absolutely need to make Wayland work, I've been yelling for a GPU-centric compositor in Linux since Rhapsody, but at the moment I am *disappointed*.
@mcc Quartz Compositor also does not directly support network transparency. I don't think a display server needs to support this, and it is most likely best left to something more suited for the purpose of network extension.
@mcc well, it's niche, very niche, but my Linux phone (Sailfish OS) is Wayland-based and, though deep down there are binary blobs from Android land, it's Linux Wayland on a mobile. Some of us do funky things https://forum.sailfishos.org/t/fun-with-remote-wayland-waypipe/16997 so, there ARE convergent moments.
@mcc I'm hoping to live long enough to see one or two more "this is the windowing system that is going to get it right"
@mcc you can tell wayland was designed at a time when there still seemed to be hope of it running on systems that weren't desktops. somehow a 2010 design just took 15 fucking years to become The Default Linux Graphics Stack
@mcc https://mer-project.blogspot.com/2013/05/wayland-utilizing-android-gpu-drivers.html
the reason it's called hybris is it involves using two different libc's in the same address space, which is horrifying, but it does work
@mcc in many ways, it's even worse because for the last 20+ years or so server extensions (as opposed to client-to-client protocols) were largely usable even without proper support from the WM/DE, meaning they were de facto universal (if clumsy to use). With the Wayland approach, instead, everything is meaningless until/unless all compositors agree on implementing it the same way. This has effectively rolled back the ecosystem to the state it was 30+ years ago
@ariadne I think my point is that the benefits of Wayland feel, in the short term, relatively virtual.
@mcc my read is that the "desktop" is a fundamentally broken abstraction. I don't know how we will ever provide the required user control, at the same time as the necessary stability in the abstraction layer for applications to build on.
@ekg At least three different commercial OS companies have built out fully adequate API stacks for this at least five different times. The open source community is doing something wrong.
@ariadne Also it does seem to me that efforts to make a good network streaming extension will be held back by the same thing that makes a good accessibility extension* difficult: Wayland just pushes framebuffers and has no sense that there are deeper semantics something might want to introspect. Right? A single abstraction could have been used for both screen reader areas and streamable rectangles (something like the sketch below), but Wayland doesn't include abstractions.
* All I know about the status of this is it's going badly
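(To be clear about what I mean by "a single abstraction", here's a purely hypothetical sketch. Nothing like this exists in the Wayland protocol; it's just the shape of the thing I'm wishing for.)

```c
/* Purely hypothetical -- not a real Wayland interface. The idea: a client
 * tags rectangles of its surface with semantics, then a screen reader can
 * read the role/label while a streaming tool captures the same rectangle. */
#include <stdint.h>

struct semantic_region {
    int32_t x, y, width, height;  /* rectangle within the client's surface */
    const char *role;             /* e.g. "button", "text", "video" */
    const char *label;            /* human-readable description for a screen reader */
};
```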
@mcc X is just framebuffers being pushed around these days, plus a whole bunch of legacy stuff that no real-world application has used in decades. accessibility is handled by the toolkits directly.
that someone is trying to improve on this by integrating screen reader hints and other accessibility features into wayland itself is an improvement over X11.
@mcc also you can already forward Wayland messages using Waypipe.
> that someone is trying to improve on this by integrating screen reader hints and other accessibility features into wayland itself is an improvement over X11.
Hi. I think you're talking about my project. It's been on hold for a year now; the last status update was: https://blogs.gnome.org/a11y/2024/06/18/update-on-newton-the-wayland-native-accessibility-project/
What I like about my approach is that accessibility tree updates are serialized, unlike any existing platform accessibility API I know of... so they could efficiently be pushed over a network.
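To show the shape of the idea (a simplified illustration, not Newton's actual wire format): each update is a self-contained description of what changed in the tree, so a consumer, local or remote, can just apply the stream in order instead of making a round trip for every property.

```c
/* Simplified illustration only, not Newton's actual wire format. */
#include <stdint.h>

enum node_op { NODE_ADD, NODE_UPDATE, NODE_REMOVE };

struct node_update {
    enum node_op op;
    uint64_t node_id;     /* stable id within this app's accessibility tree */
    uint64_t parent_id;   /* 0 for the root */
    const char *role;     /* e.g. "button" */
    const char *name;     /* accessible name/label */
};

struct tree_update {
    uint64_t serial;              /* lets a consumer apply updates in order */
    uint32_t node_count;
    struct node_update *nodes;    /* a batch of changes, shipped as one message */
};
```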
@never_released @mcc dbus does not get forwarded over X11 either…
@ariadne You're right that AT-SPI, the current accessibility protocol, is independent of both X and Wayland. And you _really_ wouldn't want to run that chatty interface over a network with any significant latency, though you theoretically could, since it's D-Bus-based. In certain scenarios it's already bad enough doing chatty IPC between local processes, and other platform accessibility APIs have the same problem.
@matt yes i do not personally know the specifics as the toolkits deal with AT-SPI for me. my point is that it’s not part of X11, but clearly people think it is.
@mcc I switched from Windows specifically because of the lack of perceived control. The Linux desktop attracts people with weird and unusual requirements.
Iuno - the streaming and VDI solutions that actually get used all basically just create a virtual local abstraction, use hardware encode offload, and pipe the whole thing to the client. This 'just works' and can be done on Wayland much more easily than on X, and it's the basis of e.g. Sunshine. It's not going to scratch the itch of streaming primitives, but it's functional beyond what was ever easily doable otherwise.
I do this on Wayland with containers with device/client specific configs for a dozen different random retrohandheld form factors and it works very well.
@mcc one of the few advantages of niche phones. Always interesting. My Godot 3.5 and SDL2 foo have been 2D to date, so no insights there.
@mcc @ariadne
tl;dr: random meandering thoughts ... I dunno if I've said anything of value, but writing it out helps me collect my thoughts 😅
I agree that for an app developer using a toolkit, or for an end user, Wayland vs Xorg is kind of an implementation detail of the platform and not usually something that provides many tangible benefits, just annoyances because it's different. Most of the benefits are small details which had effective workarounds people were used to: making the window manager/compositor/display server one piece of software reduces context switches; all pixel buffers can stay in GPU RAM and be composited with shaders; and it allows for security where no app can snoop on another's input/output without the compositor's direct involvement, or register keybindings outside its own window, and where screen locks and login screens can have their own private context that apps can't interfere with. That's one thing that makes things like Flatpak sandboxes effective and useful.
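To make the "compositor mediates everything" point concrete, here's a minimal sketch (assuming libwayland-client is installed; build with something like `cc globals.c -lwayland-client`) of the only way a client learns about anything: it asks the registry and gets told whatever the compositor chooses to advertise, and other clients' input and output simply aren't reachable.

```c
#include <stdio.h>
#include <wayland-client.h>

/* Every capability a client can use arrives through this callback;
 * the compositor decides what to advertise. */
static void on_global(void *data, struct wl_registry *registry,
                      uint32_t name, const char *interface, uint32_t version)
{
    printf("global %u: %s (version %u)\n", name, interface, version);
}

/* The compositor can also withdraw capabilities at runtime. */
static void on_global_remove(void *data, struct wl_registry *registry,
                             uint32_t name)
{
}

static const struct wl_registry_listener listener = {
    .global = on_global,
    .global_remove = on_global_remove,
};

int main(void)
{
    struct wl_display *display = wl_display_connect(NULL);
    if (!display) {
        fprintf(stderr, "no Wayland display\n");
        return 1;
    }
    struct wl_registry *registry = wl_display_get_registry(display);
    wl_registry_add_listener(registry, &listener, NULL);
    wl_display_roundtrip(display);  /* wait for the initial burst of globals */
    wl_registry_destroy(registry);
    wl_display_disconnect(display);
    return 0;
}
```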
None of these are *WOW* features for someone who just wants to use their computer.
I think the decision to build Wayland was not driven by user demand; it was driven by the Xorg and toolkit/desktop developers who just wanted to push pixels around with the minimum amount of fuss, because that's all toolkits like GTK and Qt have done since the mid 2000s. They didn't want to work on Xorg anymore and have to create new features/extensions in the X11 framework while carefully maintaining compatibility with the X11 design and internal Xorg architecture, which toolkits didn't use anymore for most of their functionality. The biggest benefit was using the Linux kernel GPU APIs directly; although Xorg did get the generic modesetting driver too, which replaced all the various chipset-specific DDX drivers, Xorg still carries all the infrastructure for abstract GPU acceleration, which is part of the maintenance burden they want to drop. This kind of thing isn't driven by project managers and focus-group testing; the paid developers and volunteers who work on graphics just *decided* to rearchitect. I expect Xwayland on Wayback to provide mostly the same experience as Xorg today, but using the same technical underpinnings as Wayland so the whole stack is fully maintained.
@noisytoot @mcc that can be fixed easily enough…
@mcc @ariadne I must say that at this point the benefits for me are very much tangible. Good HDR implementation, perfect handling of multiple monitors at different scaling and different Hz, touchpad gesture support, and it generally feels lighter and snappier.
There are some regressions that are slowly being fixed, like the ability to set custom icons for programs without using .desktop files.
But network transparency was not one of them. I once tried to boot up an entire KDE session over ssh -X and it was utterly unusable. If you were serious about remote desktop you would already have been using some other protocol for more than a decade. Even single applications that were not from the last millennium worked very badly. I know because I tried.
Overall I'm very happy with how Wayland and the Linux desktop are moving, although it took a really long time to get here for sure.
@mcc @ariadne I don't have a good technical comparison between wpra, RDP, waypipe, etc. to know if the Wayland design provides any special challenges to remote display of single apps/full desktops, but those things have implementations.
As far as accessibility, my take on it is that until Redhat announced they were shipping a Wayland-only desktop for RHEL10, there was only one volunteer maintainer for GNOME keeping accessibility alive (a stack originally created by Sun when they wanted to sell GNOME on SunRay terminals in the 2000s), so there just weren't the resources allocated to implement accessibility on Wayland, and the response was always "fall back to Xorg and the stack that has an accessibility implementation". Now that there is a clear sunset for that approach, resources are being assigned to figure out and negotiate the needed protocols for the major desktop environments. A big thing is global keybindings, which any random app can't just install like on X11; the compositor is the only app which can intercept and generate key/mouse events, so any accessibility tool needs a mechanism to register with the compositor and have the compositor do some things on its behalf.
I expect in the next few years that the people who have been hired by Redhat and others to figure this out will complete their work, and it'll go back into maintenance mode for another 20 years. It's not like there is some fundamental design flaw that needs to be overcome, somebody just needs to do the technical *and social* work between a bunch of wholly independent projects.
@oblomov @mcc Yeah, that's very true, and the biggest problems are social and resources, not technical. 30+ years ago, when CDE/Motif and the ICCCM were being hammered out, it was between a handful of large vendors who were serious about the high-end workstation market and were willing to throw people at the problem to make their ecosystem work, quickly. That isn't true today; while there are some vendors selling workstations, it's not a huge and growing market that is attracting investment (all the money is going to "AI"), so there isn't the same pressure to hammer out protocols and code across the desktops as there was 30+ years ago when X11 had the same problems. It is happening, but much more slowly and with fewer people, e.g. IIUC accessibility only had one or two volunteer maintainers until like a year ago when RH hired an FTE to be able to effect architectural changes. It's all XKCD 2347 all the way.
It's honestly amazing that the Linux desktops are in any way competitive with MacOS and Windows as both of those have _way_ more people than any of the different Linux desktops, so they can just *do* more.
@portaloffreedom @ariadne I am terrified by the question of what will happen when I first attempt to use HDR
@portaloffreedom @mcc @ariadne VRR support and global default-on vsync was the original reason I made the switch. Now HDR as well after I got an OLED monitor. Gaming on Wayland is so much nicer than X11 ever was and you don't need to screw around with antiquated config files and constantly restart your environment to do things like toggle VRR or vsync on and off. Wayland also handles dual GPU (NVIDIA Optimus) setups way better in my experience. For gaming Wayland is plainly superior.
@portaloffreedom @mcc @ariadne Especially so with GE-Proton 10 and PROTON_ENABLE_WAYLAND and PROTON_ENABLE_HDR. With the latest few GE Proton releases most HDR games have been working properly on my OLED Alienware monitor with KDE Plasma Wayland.