Standardization of graphical environments has always had a rocky history. Not every vendor in the 20th century provided one on their machines, and those who did generally shipped an entirely proprietary solution custom-built for the given device. Some well-known consumer-grade examples of the time would include the Macintosh UI, the Amiga Workbench, and who could possibly forget Microsoft Windows. The Unix sphere was in a similar boat for a long time, with window systems like SunView, MGR, NeWS, NeXTStep and many others.
Certain graphical environments meant for use across different platforms did exist, like GEM, which saw adoption on the Atari ST (TOS) and the IBM PC (DOS). This provided a level of application portability at the GUI library level, but it was still far from universal due to various architectural differences, and it was developed exclusively for single-user sessions local to the machine.
The Unix world would see a complete shakeup in the mid-to-late 80s with the public release of the X Window System, version 11 (X11), which was specifically designed for remote, multi-session use with a client-server model that is entirely architecture independent and driven by asynchronous message passing. Seeing as how remotely operating different kinds of Unix machines, with multiple users at once, was a very common use case at the time, it quickly saw wide adoption by most of the major Unix vendors.
X11 has a very simple job: be a family of protocols that lets applications (clients) create and position windows, draw primitives and bitmaps into those windows, receive input events, and perform the various other operations that allow a graphical environment to function. What it does NOT provide is a full desktop environment with a suite of applications, or even a window management system. These are all meant to be built on top of X11 and run in an X session provided by an X server. Vendors and independent programmers could develop their own application bundles that would simply run in an X session and interact with one another as needed, which we would see with simple window managers like TWM, VTWM, FVWM and similar, and even full-blown desktop environments like OpenWindows, the IRIX Desktop, and others (modern-day equivalents would be GNOME, Xfce, KDE, etc.).
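To make that concrete, here is a minimal sketch of an X11 client written directly against Xlib. Nobody writes real applications this way anymore (toolkits wrap all of this), but it shows the division of labor: the client asks the server to create and map a window and to draw into it, and the server sends events back. The file name and build line are only illustrative; nothing beyond libX11 is assumed.

```c
/* Minimal Xlib client sketch: connect to an X server, create a window,
 * handle events, draw a rectangle. Compile with e.g.: cc xhello.c -lX11 */
#include <X11/Xlib.h>
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    /* NULL means "use the DISPLAY environment variable", which may name
     * a local socket or an X server on an entirely different machine. */
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy) {
        fprintf(stderr, "cannot open display\n");
        return EXIT_FAILURE;
    }

    int screen = DefaultScreen(dpy);
    Window win = XCreateSimpleWindow(dpy, RootWindow(dpy, screen),
                                     100, 100, 320, 240, 1,
                                     BlackPixel(dpy, screen),
                                     WhitePixel(dpy, screen));

    /* Ask the server to send us exposure and key press events. */
    XSelectInput(dpy, win, ExposureMask | KeyPressMask);
    XMapWindow(dpy, win);

    for (;;) {
        XEvent ev;
        XNextEvent(dpy, &ev);          /* blocks on the event queue */
        if (ev.type == Expose)
            XFillRectangle(dpy, win, DefaultGC(dpy, screen), 20, 20, 100, 60);
        if (ev.type == KeyPress)
            break;                     /* any key quits */
    }

    XCloseDisplay(dpy);
    return EXIT_SUCCESS;
}
```

Note that XOpenDisplay(NULL) honors the DISPLAY environment variable, which is exactly where the network transparency discussed below comes from: point DISPLAY at another machine's X server and the very same binary renders there instead.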
There was somewhat of a standardization dilemma across the major Unix vendors in regard to application GUI design and development, which resulted in the birth of toolkits like Motif and desktop environments like CDE, which would ensure a common look and feel across entirely different Unix flavors. Even in cases where the remote system was of a different architecture or even a different operating system family, you could still run its applications as long as they were made to talk X11. Want to run a GTK application from a remote x86 Solaris host within a local POWER10 AIX system's CDE session? You can do that! Have a dedicated X server application running on a Windows machine? It'll happily display the application too!
Times change, and so do common use cases for any given thing. Running graphical Unix applications remotely is now a niche; people for the most part run local sessions with various forms of GPU acceleration in multi-monitor setups with DPI scaling of UI elements on modern Linux or BSD operating systems... and yet, X11 is somehow still around. The fundamental principles of X11 operation have remained the same, but many of its surrounding layers had to change to accommodate modern use cases. Combine this with the age of the protocol and the legacy features that have to stay around, and maintenance becomes incredibly difficult. Multiple times in the 21st century, developers have tried to draw a line in the sand and start anew, shedding what came before. You probably forgot many of these attempts ever even happened due to their failure to gain traction, but one has risen above the rest: Wayland.
In some ways, Wayland tries to be what X11 isn't. It doesn't provide network transparency, as it's designed to be used locally first and foremost. In other ways it takes X11's principles to their extreme, as its protocol is even more minimal. This changes the fundamental link between applications and the environment: applications must now run under a window compositor to be properly displayed. It is not enough to simply have a display server, because the server does not perform compositing. The burden now falls on the graphical environment itself.
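For contrast, here is a minimal sketch of the Wayland side of that relationship, assuming only libwayland-client: the client connects to whatever compositor owns the Wayland socket and listens to its registry of globals, which is the complete menu of interfaces that particular compositor chooses to offer. There is no separate display server in the picture at all.

```c
/* Minimal Wayland client sketch: connect to the compositor and list the
 * global interfaces it advertises. The thing answering on the socket IS
 * the compositor. Compile with e.g.: cc wl-globals.c -lwayland-client */
#include <stdio.h>
#include <stdlib.h>
#include <wayland-client.h>

static void handle_global(void *data, struct wl_registry *registry,
                          uint32_t name, const char *interface, uint32_t version)
{
    /* Every protocol this particular compositor supports shows up here. */
    printf("%-40s version %u\n", interface, version);
}

static void handle_global_remove(void *data, struct wl_registry *registry,
                                 uint32_t name)
{
    /* Globals can also disappear at runtime; ignored in this sketch. */
}

static const struct wl_registry_listener registry_listener = {
    .global = handle_global,
    .global_remove = handle_global_remove,
};

int main(void)
{
    /* NULL means "use WAYLAND_DISPLAY", which names a local Unix socket. */
    struct wl_display *display = wl_display_connect(NULL);
    if (!display) {
        fprintf(stderr, "cannot connect to a Wayland compositor\n");
        return EXIT_FAILURE;
    }

    struct wl_registry *registry = wl_display_get_registry(display);
    wl_registry_add_listener(registry, &registry_listener, NULL);
    wl_display_roundtrip(display);   /* wait for the initial burst of globals */

    wl_registry_destroy(registry);
    wl_display_disconnect(display);
    return EXIT_SUCCESS;
}
```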
This doesn't sound too bad at a surface level, but you must realize that Wayland itself is not a standalone system. You don't run a compositor on top of a Wayland host session; the compositor is the Wayland session. "What's the big deal?", you might wonder. "I just have to run a compositor like I would a window manager or desktop environment, and applications developed for Wayland should just work." This would be the ideal reality, at which point the only real drawback would be the inability to run X11 applications, something that is tackled by the Xwayland server. The actual reality is that it's not even remotely that simple.
See, there's a reason why Wayland has grown somewhat infamous for its minimal, security-first design at the protocol level: it lacks far too many of the features that many people expect from a graphical desktop environment. X11 also lacked important features in its early days, but these could be added through the extension system. Once a needed extension was added to the X server, all applications could make use of it, and that includes window managers/desktop environments. Many crucial extensions would end up being added to the reference upstream X11 codebase (which, as of this writing, is X.Org's) over time, and any downstream flavors of the main X11 server would also have them. Better yet, if you're running an X server meant to run remote applications and your current X server implementation lacks the necessary features, you can always try another one in a matter of seconds. Maybe even run another X server nested within your existing X session if you don't want to kill it. The Wayland ecosystem doesn't really allow for this. If you're running a given compositor that lacks certain protocols/extensions that your application needs, you're out of luck on that compositor and must switch to another one, since the Wayland "server" and the compositor are one cohesive whole.
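The X11 side of that extension story can likewise be checked at runtime. A rough sketch, again assuming only libX11, with a few well-known extension names used purely as examples:

```c
/* Sketch of X11's extension mechanism: ask the running X server whether it
 * supports a given extension. Once the server has it, every client on that
 * server (including the window manager) can use it. Link with -lX11. */
#include <X11/Xlib.h>
#include <stdio.h>

int main(void)
{
    Display *dpy = XOpenDisplay(NULL);
    if (!dpy)
        return 1;

    const char *names[] = { "RANDR", "Composite", "GLX", "XInputExtension" };
    for (int i = 0; i < 4; i++) {
        int opcode, event, error;
        int present = XQueryExtension(dpy, names[i], &opcode, &event, &error);
        printf("%-16s %s\n", names[i], present ? "present" : "missing");
    }

    XCloseDisplay(dpy);
    return 0;
}
```

Compare this with the registry listing earlier: an X client asks one interchangeable server what it supports, while a Wayland client can only ever use what its particular compositor advertises, with no second server to fall back on.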
This coupling is where the major pitfalls of Wayland start to show themselves. Since displaying windows and providing many fundamental protocols is the burden of the window managers/desktop environments themselves, they have to provide those pieces one way or another. The two most popular Wayland DEs, GNOME and KDE, commonly end up implementing such features each in their own way, entirely incompatible with one another. Projects like wlroots try to standardize many such extensions, but downstream WMs/DEs are not guaranteed to implement them, since they are not part of the reference Wayland implementation.
This all sounds like an absolute disaster for the end user that can lead to nothing but severe fragmentation, which in many ways it has, because application developers now have to really think about which of the major Wayland compositors they will write for, something that was barely a concern on X11 for a long time thanks to its fundamental design and the existence of GUI toolkits. Users also have to pick their poison and decide which features they're willing to sacrifice when choosing a compositor, because no single Wayland implementation is guaranteed to cover everything the others might.
In spite of this, there's been a major push by Linux distribution makers and large companies in the OSS sphere like Red Hat to put X11 in the ground and go forward with Wayland. This is why I called it a Trojan horse: Red Hat and similar big players can use their leverage in the Linux userland to push their Wayland-powered desktop environment of choice, which is GNOME. They can use their funding and pool of engineers to add features to GNOME's compositor at a fast pace, features which they are not obligated to push to Wayland's upstream in any capacity, and the rest all have to play catch-up. Over time the feature gap between GNOME, KDE, wlroots and base Wayland will grow ever larger, so anybody who wants to start developing a Wayland compositor+window manager bundle will face an ever-steepening hill to climb. You might be the kind of person who only really cares to run GNOME or KDE and nothing else. You're probably going to have the smoothest transition in the X11 deprecation process that is underway on several major Linux distributions, although even this isn't 100% guaranteed. The rest of us are going to be shit out of luck.
The endgame of such a scenario should be obvious to anyone who has been in the computer sphere for long enough: consolidation. See, only a few years ago, many FOSS enthusiasts were more than eager to sell you on Linux/BSD because of the "freedom of choice" in the ecosystem, and nowhere was this shown off more than in the realm of window managers and desktop environments. From the most feature-rich like GNOME/KDE/Xfce/Cinnamon/MATE, through the minimal like i3/IceWM/AmiWM, to the downright ancient like CDE and GNUstep. Hell, you might not want to run a WM or DE in any capacity and just want a blank X session; it's all up to you.
You really don't have much to choose from in the modern Wayland ecosystem, and all the choices come with drawbacks that very negatively impact application usability. When push comes to shove with the choice of a Wayland environment, what are you gonna go with? How about a grassroots effort like Hyprland, which is currently developed at an incredibly rapid pace and picks up the latest features almost immediately, but isn't guaranteed to continue like this long-term, and will inevitably have to play catch-up to GNOME and KDE... mmm, those potential negatives don't sound very enticing, now do they? Maybe you want something that's more... "standard" in terms of its foundations, like the wlroots-based Sway? Probably a good choice if you mostly plan to use applications specifically designed to work with wlroots-based compositors. If your applications are instead meant to target the two big players though... then you can potentially run into pretty bad issues. If you really hate yourself and want to suffer pain at every turn, there's always COSMIC DE!
You see where I'm going with this? With so many complications arising just from your choice of a Wayland compositor, most users who value their time will just throw their hands in the air and go with whatever has the most backing behind it, rationalizing the choice by believing that it will result in the highest degree of application compatibility. These choices will naturally boil down to GNOME and maybe KDE. Who benefits the most from this consolidation? Those who actively push for Wayland, of course, like the GNOME Foundation and Red Hat. Freedom of choice is the last thing either of these entities (and those close to them) want you to have in your Linux environment, and the aggressive push to kill off X11 in favor of Wayland fits perfectly into their goal of siphoning users into their graphical environments of choice.
Keep in mind, I've so far only really talked about the Linux sphere. What about the other Unixes? FreeBSD has the Hikari compositor, but I'm willing to bet it has a ton of complications due to lagging behind the aforementioned options. The rest of the BSDs? Tough luck. As for Solaris and OpenIndiana? Not even a blip on the radar. A big reason for this is that Wayland is heavily bound to KMS (Kernel Mode Setting), which is a component of DRM (the Direct Rendering Manager). FreeBSD has a port of Linux's DRM, but none of the other modern Unixes do, meaning they cannot have a working port of Wayland without severe reworks of its foundation for communicating with graphics devices. What they do have, though, is X11. WMs/DEs really only need a little bit of patching (if any) to compile and run on these systems under X11. Hell, a few years ago trn and I patched bits of CDE to work under OpenIndiana.
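To make that DRM/KMS dependency concrete, this is roughly the first thing a KMS-backed compositor does at startup, sketched here with libdrm; the device path is the usual Linux one and is only illustrative.

```c
/* Rough sketch of a compositor backend's first step: open the DRM device
 * and enumerate connected outputs and their modes. Compile with e.g.:
 * cc drmprobe.c $(pkg-config --cflags --libs libdrm) */
#include <fcntl.h>
#include <stdio.h>
#include <unistd.h>
#include <xf86drm.h>
#include <xf86drmMode.h>

int main(void)
{
    int fd = open("/dev/dri/card0", O_RDWR | O_CLOEXEC);
    if (fd < 0) {
        perror("open /dev/dri/card0");   /* no DRM device: no KMS backend */
        return 1;
    }

    drmModeRes *res = drmModeGetResources(fd);
    if (!res) {
        fprintf(stderr, "device does not support KMS\n");
        close(fd);
        return 1;
    }

    for (int i = 0; i < res->count_connectors; i++) {
        drmModeConnector *conn = drmModeGetConnector(fd, res->connectors[i]);
        if (conn && conn->connection == DRM_MODE_CONNECTED && conn->count_modes > 0)
            printf("connector %u: %ux%u @ %u Hz\n", conn->connector_id,
                   (unsigned)conn->modes[0].hdisplay,
                   (unsigned)conn->modes[0].vdisplay,
                   conn->modes[0].vrefresh);
        if (conn)
            drmModeFreeConnector(conn);
    }

    drmModeFreeResources(res);
    close(fd);
    return 0;
}
```

On a kernel with no DRM implementation there is simply nothing to open here, which is why porting Wayland to those systems means rebuilding this entire bottom layer rather than just recompiling a window manager, as you would with an X11 one.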
"Hold on, X11 is still around. X.Org still works on it and it's the reference implementation used across the board.". Well yes and no. It's been a bit of a joke for the past several years that the Xorg display server is under maintenance mode and now purely exists to leverage Xwayland (The X11 compatibility layer for Wayland), which is currently the reality and not just a simple joke. There hasn't been a proper release in over a decade, and the "maintainers" have been very vocal about their desires to retire the Xorg display server in favor of Wayland. They're so willing to move on that they have lately been taking active efforts to pull back any efforts to revitalize the codebase upstream, as well as combating the mere existence of forks that aim to pursue this effort, like X11Libre. If this continues, X11 holdovers on Linux are gonna suffer, and non-Linux users of X11 based environments will essentially be left without a maintained display server of any kind.
Now rewind to the start of this page. Compare the GUI situation before X11 with how things are progressing right now. We're looping right back to square one, wouldn't you agree?