Curious to hear from people who follow its development closely.

  • Which protocols are finally about to be implemented?
  • Which ones are still a struggle?
  • How many serious protocols are still missing?

https://arewewaylandyet.com/

  • makingStuffForFun@lemmy.ml · 2 months ago

    I use an accessibility tool called Talon Voice. It is X.Org only. Will the shift to Wayland kill these tools, or is it a case of the developer needing to rewrite for Wayland?

    • boredsquirrel@slrpnk.net · 2 months ago

      On X11, apps can scan and read whatever they want. That approach is not even very good, but it means developers don’t really need to implement accessibility; they just have to make all their text scannable.

      That is, if it’s a screen reader you are talking about.

      On Wayland, no app can just scan everything, so apps have to send the reader the specific text that should be read, like push notifications. That part needs to be implemented explicitly.
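
      For example, with GTK4 the app hands the accessibility layer the text to read explicitly, and a screen reader like Orca then picks it up over AT-SPI instead of scraping the screen. A rough sketch (file name, app id, icon name and strings are placeholders, error handling omitted):

        /* Sketch: give an icon-only button an explicit accessible label and
         * description, so screen readers get real text over AT-SPI instead
         * of having to scrape the screen.
         * Assumed build line: cc a11y-demo.c $(pkg-config --cflags --libs gtk4) */
        #include <gtk/gtk.h>

        static void activate(GtkApplication *app, gpointer user_data)
        {
            GtkWidget *window = gtk_application_window_new(app);
            gtk_window_set_title(GTK_WINDOW(window), "a11y demo");

            /* No visible text at all on this button... */
            GtkWidget *button = gtk_button_new_from_icon_name("mail-send");

            /* ...so push the text for the reader through the accessibility API. */
            gtk_accessible_update_property(GTK_ACCESSIBLE(button),
                                           GTK_ACCESSIBLE_PROPERTY_LABEL,
                                           "Send",
                                           GTK_ACCESSIBLE_PROPERTY_DESCRIPTION,
                                           "Send the current message",
                                           -1);

            gtk_window_set_child(GTK_WINDOW(window), button);
            gtk_window_present(GTK_WINDOW(window));
        }

        int main(int argc, char **argv)
        {
            GtkApplication *app = gtk_application_new("org.example.A11yDemo",
                                                      G_APPLICATION_DEFAULT_FLAGS);
            g_signal_connect(app, "activate", G_CALLBACK(activate), NULL);
            int status = g_application_run(G_APPLICATION(app), argc, argv);
            g_object_unref(app);
            return status;
        }

      The point is that the app declares what should be read; it never knows or cares whether a reader is actually listening.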

      • wewbull@feddit.uk · 2 months ago

        So rather than having one single app that deals with screen reading, it’s now down to every individual application to make accessibility a priority.

        Huge retrograde step.

        We can all agree that authors should value accessibility, but we also all know that they won’t.

        • boredsquirrel@slrpnk.net · 2 months ago

          GUI frameworks should implement this, the same way that any app built on GTK, Qt, Iced, and possibly others gets native Wayland support from the toolkit.

          But yes, I agree this is not a good situation. There should be something like the “accessibility permission” on Android, where an app that has been granted it can basically read everything.
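
          The closest thing we have today is the AT-SPI accessibility bus: an assistive tool connected to it can walk every registered app’s widget tree over D-Bus, on X11 and Wayland alike. Roughly like this with libatspi (just a sketch, error handling omitted, build line assumed):

            /* Sketch: list the applications visible on the AT-SPI accessibility bus.
             * This is what assistive tools already use, on X11 and Wayland alike,
             * because it goes over D-Bus rather than the display protocol.
             * Assumed build line: cc atspi-list.c $(pkg-config --cflags --libs atspi-2) */
            #include <atspi/atspi.h>
            #include <stdio.h>

            int main(void)
            {
                atspi_init();                              /* connect to the a11y bus */

                AtspiAccessible *desktop = atspi_get_desktop(0);
                gint n_apps = atspi_accessible_get_child_count(desktop, NULL);

                for (gint i = 0; i < n_apps; i++) {
                    AtspiAccessible *app =
                        atspi_accessible_get_child_at_index(desktop, i, NULL);
                    if (app == NULL)
                        continue;

                    gchar *name = atspi_accessible_get_name(app, NULL);
                    printf("registered app: %s\n", name ? name : "(unnamed)");

                    g_free(name);
                    g_object_unref(app);
                }

                g_object_unref(desktop);
                return atspi_exit();
            }

          The missing piece, compared to Android, is a real permission prompt in front of that bus.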

        • lemmyvore@feddit.nl · 2 months ago

          That’s one of the huge problems with Wayland. The core protocol is super minimalistic, so it falls to each and every individual app to (re)implement everything: accessibility, clipboard, keyboard, mouse, compositing etc. etc.

          The fact that this was done in the name of security makes it a solution looking for a problem. Inter-window communication was never a pressing security issue on Linux.

          It’s like advising people to wear helmets in their everyday life. Sure, in theory it’s a great idea and would greatly benefit anyone who slips and falls, gets a flower pot dropped on their head, ends up in a car accident, and so on. But in practice it would be a huge inconvenience 99.99% of the time.

          Most Linux apps out there will never get around to (re)implementing all this basic functionality just to deal with a 0.01% chance of a security issue. Wherever convenience fights security, convenience wins. Wayland will either come around or become a bubble where 99% of Linux userland doesn’t work.

          • Zamundaaa@discuss.tchncs.de · 2 months ago

            “it falls to each and every individual app to (re)implement everything: accessibility, clipboard, keyboard, mouse, compositing etc. etc.”

            I haven’t read so much nonsense packed into a single sentence in a while. No, apps don’t implement any of these things themselves. How the fuck would apps simultaneously “implement compositing themselves” while also having access to neither the “framebuffer” (which they don’t even have on Xorg!) nor any information about other windows on the screen?

            Please, don’t rant about things you clearly don’t know anything about.
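
            The clipboard is a good example: the app makes one toolkit call and GDK (or Qt, etc.) speaks the underlying Wayland data-device or X11 selection protocol for it. Roughly, with GTK4 (sketch only; file name, app id and strings are made up):

              /* Sketch: copy text to the clipboard from a GTK4 app. The app makes
               * one toolkit call; GDK handles the Wayland data-device (or X11
               * selection) protocol underneath, the app never touches it.
               * Assumed build line: cc clip-demo.c $(pkg-config --cflags --libs gtk4) */
              #include <gtk/gtk.h>

              static void on_clicked(GtkButton *button, gpointer user_data)
              {
                  GdkClipboard *clipboard = gtk_widget_get_clipboard(GTK_WIDGET(button));
                  gdk_clipboard_set_text(clipboard, "hello from the clipboard");
              }

              static void activate(GtkApplication *app, gpointer user_data)
              {
                  GtkWidget *window = gtk_application_window_new(app);
                  GtkWidget *button = gtk_button_new_with_label("Copy text");
                  g_signal_connect(button, "clicked", G_CALLBACK(on_clicked), NULL);
                  gtk_window_set_child(GTK_WINDOW(window), button);
                  gtk_window_present(GTK_WINDOW(window));
              }

              int main(int argc, char **argv)
              {
                  GtkApplication *app = gtk_application_new("org.example.ClipDemo",
                                                            G_APPLICATION_DEFAULT_FLAGS);
                  g_signal_connect(app, "activate", G_CALLBACK(activate), NULL);
                  int status = g_application_run(G_APPLICATION(app), argc, argv);
                  g_object_unref(app);
                  return status;
              }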