Unless you want the compositor to execute everything in a trusted shell, there needs to be a way to actually interact with the system.
One of the strengths of Wayland is dynamic discovery: a GUI application queries the compositor for its interfaces and features, then enables or disables functionality at runtime (for instance, the clipboard).
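To make that concrete, here is a rough sketch of that discovery step in C against libwayland-client; the clipboard check via wl_data_device_manager is only an illustration of the pattern, and the file name is made up:

    /* sketch.c - minimal Wayland dynamic discovery example.
     * Build (assuming libwayland-client is installed):
     *   cc sketch.c -lwayland-client */
    #include <stdio.h>
    #include <string.h>
    #include <wayland-client.h>

    static struct wl_data_device_manager *ddm = NULL;

    static void handle_global(void *data, struct wl_registry *registry,
                              uint32_t name, const char *interface,
                              uint32_t version)
    {
        /* The compositor announces every global it supports; a client
         * that needs the clipboard binds it, others simply ignore it. */
        if (strcmp(interface, wl_data_device_manager_interface.name) == 0)
            ddm = wl_registry_bind(registry, name,
                                   &wl_data_device_manager_interface, 1);
    }

    static void handle_global_remove(void *data, struct wl_registry *registry,
                                     uint32_t name)
    {
        /* Globals can also disappear at runtime, so features have to be
         * toggled dynamically rather than assumed once at startup. */
    }

    static const struct wl_registry_listener registry_listener = {
        .global = handle_global,
        .global_remove = handle_global_remove,
    };

    int main(void)
    {
        struct wl_display *display = wl_display_connect(NULL);
        if (!display)
            return 1;

        struct wl_registry *registry = wl_display_get_registry(display);
        wl_registry_add_listener(registry, &registry_listener, NULL);
        wl_display_roundtrip(display); /* wait for the initial globals */

        printf("clipboard (wl_data_device_manager): %s\n",
               ddm ? "available" : "not advertised");
        wl_display_disconnect(display);
        return 0;
    }

The point is that nothing is assumed: a capability exists for the client only if the compositor chooses to advertise it, which is exactly where the accessibility argument below starts.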
Which "accessibility" x11 APIs you are talking about?
Under Wayland, you also can't find your mouse. [1]
No input system can find out which window is active, so no per-application remapping. [2]
And no key rebindings, at all, yet. [3]
The issue I posted that started this thread was filed by Orca. This is not some random person saying that Wayland can't do what they need; it's one of the most popular pieces of accessibility software available on Linux.
Libinput doesn't do capability monitoring; it says that will need to be implemented by the server, so it becomes Wayland's problem, not theirs. [4]
[0] https://specifications.freedesktop.org/wm/latest/
[1] https://gitlab.freedesktop.org/wayland/wayland/-/issues/383
[2] https://gitlab.freedesktop.org/wayland/wayland/-/issues/326
[3] https://gitlab.freedesktop.org/wayland/wayland-protocols/-/m...
[4] https://gitlab.freedesktop.org/libinput/libei#capability-mon...
Then the power of Wayland has to be leveraged the right way: a set of custom, clean accessibility/instrumentation Wayland protocols, whose support GUI applications query dynamically and handle themselves. A lot of that complexity now lives directly on the client side, though.
You will probably have to maintain a set of forked/branched compositors carrying this accessibility/instrumentation code, deployed as an alternative only on demand.
I just want to use my computer. That I'm blind should not be a "fork it and maintain it yourself, you're on your own" story.
You basically just told me I shouldn't exist, for want of a different kind of keyboard.
You are asking the wrong people. Ask the AT-SPI/ATK and GUI toolkit people to design the required interfaces (leveraging the dynamic nature of Wayland, or perhaps some other custom and simple protocols) and to develop and maintain the complexity required for those interfaces to work.
Those interfaces are beyond intrusive: they defeat client application isolation and the compositor's independence from niche complexity, which is a cornerstone of Wayland. That's why such complex compositors (or "modules" of some huge compositors) should be on demand only (and they are highways for malware and spyware).
As I've already pointed out, multiple times, those are the people asking Wayland for the necessary protocols, so that they _can_ design the required interfaces.
KDE has pretty much given up, and KWin effectively forks the protocol with a ton of its own extensions [0], because upstream Wayland always says no.
Gnome does the same, as do wlroots and Sway. Which means all of them have incompatible protocols, so accessibility is sharded between desktop environments. The apps you need just to press a key are all incompatible with each other.
Accessibility is not some niche thing. It is a cornerstone of interface design, one that assists everyone who interacts with it in some way.
Your view is very simple: Security trumps accessibility. That has been obvious since the first post.
My view is simpler: I am allowed to exist, and so security must make considerations for accessibility.
As things stand, both Windows and macOS have a better accessibility story than Linux, because of this dogged approach.
[0] https://invent.kde.org/libraries/plasma-wayland-protocols/
You are not making any sense at all: it is up to the AT-SPI/ATK people to design their own set of Wayland interfaces matching their definition of accessibility, and to write and maintain the related software (which could be compositors, or modules of compositors). Wayland being a set of interfaces that are fully discoverable and dynamic at runtime makes all of that possible.
Wayland is the gutter sink here. Nobody else. Everyone else has done what they can, and continues to do what they can. Wayland has said no to the very interfaces you claim need to be designed - they already have been.
I could start tomorrow designing my own set of "Wayland interfaces for spy agencies" to give Wayland clients a way to spy on every other application and even on what the compositor is doing, and, why not, implement some super ultra giga user interactions specific to accessibility. Then I could take GTK+ and Mutter, branch them, and start developing the modules implementing such a set of interfaces.
Since Wayland is effectively "X12", starting anew is even a good opportunity for the ATK/AT-SPI people to drop legacy baggage that is no longer pertinent, i.e. do some cleanup passes.
And if you say those interfaces were already designed and implemented, what are you whining about?
So... you don't know a thing about what you're talking about? AT-SPI is a _standard_, not an _implementation_. There is no legacy cruft to remove.
So how about you either go build what you've already been shown can't be built, or just... stop talking about accessibility, since it doesn't look like you understand either it or Wayland?