In response to Wayland Breaks Your Bad Software

I say that the technical merits are irrelevant because I don’t believe that they’re a major factor any more in most people moving or not moving to Wayland.

With only a slight amount of generalization, none of these people will be moved by Wayland’s technical merits. The energetic people who could be persuaded by technical merits to go through switching desktop environments or in some cases replacing hardware (or accepting limited features) have mostly moved to Wayland already. The people who remain on X are there either because they don’t want to rebuild their desktop environment, they don’t want to do without features and performance they currently have, or their Linux distribution doesn’t think their desktop should switch to Wayland yet.

  • demesisx@infosec.pub · 2 years ago

    I’m not leaving xmonad. It’s such a bummer that Waymonad didn’t really take off.

  • Sh1nyM3t4l4ss@lemmy.world · edited · 2 years ago

    I switched to Wayland over two years ago and these days I don’t look back at all. I don’t care if Wayland has full feature parity with X11 as long the features I actually use are supported which they are.

    Clipboard sharing in VirtualBox doesn’t work right now (though I’m relatively sure it could be implemented by VirtualBox with Wayland as it is) and neither does auto-typing in KeePassXC (not sure if there’s a mechanism for that on Wayland), though autofill in the browser works, so it’s no big deal to me.

    In return I get 1:1 touch gestures, better multi monitor support and an overall smoother desktop on Plasma Wayland so I’ll take it.

    People often still make complaints about Wayland that have been fixed months or years ago and it’s a bit tiring.

      • Pasta Dental@sh.itjust.works · 2 years ago

        Yeah? Mixing a 60 Hz monitor and a 120 Hz monitor is basically impossible on X11. Plus, Wayland has this “perfect frame every time” + vsync philosophy, which means no tearing, and it feels much smoother to use than X11.

    • marty_relaxes@discuss.tchncs.de · 2 years ago

      On the topic of auto-typing, mechanisms for variations of it do exist on Wayland, since I use them in my password scripts to automatically fill login boxes (using tools like ydotool or wtype).

      So I would guess that KeePassXC hasn’t integrated the necessary protocols/APIs for Wayland?

    • cobra89@beehaw.org · 2 years ago

      Counterpoint, I have all AMD machines (CPU and GPU) and each time I’ve tried Wayland I’ve immediately run into bugs that make it unusable. Maybe it’s because both my setups have multiple monitors with different resolutions, but I don’t see why that use case is so hard to support. And I’m running the latest versions of Wayland and KDE so it’s not an issue of me running outdated versions that already have bug fixes supplied upstream. If Wayland can’t handle just basic desktop use with multiple resolutions why would I go through the effort to use it? Fix the basics first.

      • the_weez · 2 years ago

        My experience has been the opposite. After using Wayland on AMD for years, I won’t use X; it just feels so much smoother. On Arch with GNOME, Wayland has been fantastic.

    • LeFantome@programming.dev · 2 years ago

      Xwayland has already been mentioned, but this is an important point that not everybody may be familiar with. Xwayland is an X server (actually a specialized version of Xorg itself) that runs on top of Wayland instead of talking directly to hardware.

      If you are running Xwayland, you can run X clients (X11 software dating back to 2003, for example) and they will appear on your desktop.

      There can obviously be specific considerations around advanced software, but moving to Wayland does not mean losing access to software written to target X. Qt and GTK support Wayland and will run natively. Applications using other toolkits may still run over X. As a normal user, you may not even know which applications are still using X and which are not.
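      As an aside, whether a session is Wayland (with X clients serviced by Xwayland) or plain X11 can be guessed from a couple of environment variables. This is a heuristic sketch, not an official API; the helper name and fallback value are my own:

```python
import os

def session_type(env=None):
    """Best-effort guess of the display session type (hypothetical helper)."""
    env = os.environ if env is None else env
    if env.get("WAYLAND_DISPLAY"):
        # Native Wayland session; Xwayland also sets DISPLAY for X clients,
        # so the Wayland check has to come first.
        return "wayland"
    if env.get("DISPLAY"):
        return "x11"
    return "unknown"
```

      Under Xwayland, tools like xlsclients can then list which windows are actually X clients.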

      This is for regular applications. Moving to Wayland does require a desktop environment or window manager that supports Wayland. So GNOME and KDE users are fine, but Cinnamon or WindowMaker users would need to switch.

  • WuTang @lemmy.ninja · 2 years ago

    I wouldn’t mind moving to Wayland, but really, X11/Xorg just works for me with all the features (HiDPI, multi-monitor, etc.). I don’t need fractional scaling; my 27" monitor is UHD, but with the right PPI set, everything looks good. BUT I understand the interest.

    And I do understand the need to move away from X because of Elon… just kidding. Yes, we need to move to a better architecture, but it must be 1:1 in terms of features/stability, at least.

    • WuTang @lemmy.ninja · 2 years ago

      It should not. The first reason should be that it is not open source and is 100% the cause of X black screens on upgrades.

      AMD plays the game (no pun intended), so let’s go with it. If you need Nvidia for CUDA for ML, standards are on the way to allow using any GPU.

      • TechieDamien@lemmy.ml · 2 years ago

        I already do ML on AMD, and it works great. There are usually a few extra steps that need doing since binaries aren’t always available, but that, too, will improve with time.

    • woelkchen@lemmy.world · 2 years ago

      Only reason I’m not using it is Nvidia.

      Don’t buy Nvidia GPUs. Nvidia’s broken Linux support has been a well-known fact for at least a decade.

      • Ineocla@lemmy.ml · 2 years ago

        For gaming, AMD is as good as NVIDIA or even better. For anything else, though, it’s a dumpster fire. AMF still isn’t on par with NVENC, ROCm is pure garbage, and they are basically useless for any compute task.

        • woelkchen@lemmy.world · 2 years ago

          For anything else tho it’s a dumpster fire. Amf still isn’t on par with nvenc, rocm is pure garbage and they are basically useless for any compute task

          Those specific compute tasks are not “anything else”. Pretty much every single everyday task by common people works better on GPUs with proper Mesa drivers than GeForce and there is absolutely no reason that you need to output your graphics from the NVidia GPU anyway. Do your compute tasks on dedicated Nvidia hardware if you have to. Even notebooks come with AMD and Intel iGPUs that are perfectly fine for non-gaming graphics output.

          • Ineocla@lemmy.ml · 2 years ago

            Yep, you’re right. Mesa covers almost everything. But streaming and recording, photo and video editing, 3D rendering, AI training, etc. aren’t “specific compute tasks”; they represent the vast majority of the market, with billions of dollars in revenue. And no, the solution isn’t to use another GPU. It’s for AMD to make their software stack actually usable.

            • woelkchen@lemmy.world · 2 years ago

              photo and video editing

              Which photo editor for Linux even supports special NVidia features? It’s not like Linux has Photoshop or something like that – there aren’t that many photo editors under Linux. It’s one of the areas Windows people complain most loudly about Linux. Seems to me you conflated Windows with Linux when hyping Nvidia above everything.

              ai training etc aren’t “specific compute tasks”

              AI training isn’t a specific compute task? What is it then? Why do you train your AI on the regular graphics output GPU and not on dedicated hardware like sane people?

  • ThatHermanoGuy · 2 years ago

    I won’t switch to Wayland until the compositor is separated so that when GNOME Shell crashes (as it does a few times a month), I can restart it without losing all my running apps.

  • Greyscale@lemmy.sdf.org · 2 years ago

    I’ve not even heard of what the technical merits are. It seems to just break shit like systemd.

    Eventually I’ll be dragged across by the distro, but until then I do not care.

    • GenderNeutralBro@lemmy.sdf.org · 2 years ago

      Same. I’m sure it’s great, but I’m not motivated to spend my time and energy on it. I remember when PulseAudio first came out; it had growing pains too. I jumped on board early because it solved problems I needed to solve. I was a younger nerd back then, and I don’t have the patience for the cutting edge anymore.

      I hear it does indeed work with Nvidia now, so I guess I’ll give it another shot next time I distro-hop.

      • russjr08@outpost.zeuslink.net · 2 years ago

        As someone who constantly checks in on the Nvidia + Wayland combination every time there is a Nvidia driver update, it “works” but only by the loosest definition unfortunately.

        • Pasta Dental@sh.itjust.works · 2 years ago

          Yeah, it does technically work, as in it functions, but it’s riddled with bugs and missing features. The biggest one preventing me from switching my desktop over to Wayland is the lack of GAMMA_LUT (which enables night light). The issue for this has been open for over a year and there is still no apparent progress; Nvidia really is a pain on Linux.

          Meanwhile my AMD laptop works wonders on Wayland, and it’s the best experience I’ve had using a computer by far. The touchpad gestures on GNOME + Wayland make me want to get a trackpad for my desktop when I can switch to Wayland.

          • russjr08@outpost.zeuslink.net · 2 years ago

            Yep, the lack of GAMMA_LUT has been a thorn for me as well. I’ve tried getting around it every now and then by putting on some glasses that I picked up a while ago that just have the blue light tinting built in, but eh.

            Another massive problem is that some applications, Electron-based ones especially, basically “rewind” frames every so often. I’m not even sure how to explain it, but for example you can be typing and a few letters will revert, then come back in… it’s very strange. Other applications also just have their UI stop rendering completely until restarted (KDE’s taskbar being an annoying occurrence when it happens).

            I have an older MacBook that I occasionally use with Fedora + KDE and Wayland works much better there, it’s only integrated graphics AFAIK so I keep my expectations tempered, but it’s definitely still smoother than Nvidia + Wayland which is just… sad.

            I really would love to test it with an AMD card some day, but I have way too many other things to worry about than picking up a new GPU for the time being.

        • GenderNeutralBro@lemmy.sdf.org · 2 years ago

          Oh. Womp womp.

          I need to refamiliarize myself with the state of AI libraries without CUDA. Last I checked it was still a problem. I’d love to never buy Nvidia again.

          • russjr08@outpost.zeuslink.net · 2 years ago

            I’m not too well versed in the AI/ML industry, but from what I hear, CUDA is still by far the most prevalent/preferred backend. I don’t believe it’s impossible without it, so to speak, but it definitely involves having to dig a bit deeper for alternatives.

            I hear it’s also somewhat common to use an AMD GPU for your actual desktop but also have an Nvidia GPU strictly for CUDA, though of course that’s a bit more expensive and still involves keeping up with Nvidia’s hardware.

        • pelotron · 2 years ago

          Yea, I have a 2080 and try to run a Wayland KDE session every now and then, but so far every time the desktop has ended up frozen after a couple minutes. Reboot back into X it is…

      • zwekihoyy@lemmy.ml · 2 years ago

        Just check out a compatible desktop environment/window manager. You don’t need to do a full distro change.

        If the base is the same (i.e. Debian, Arch, etc.), there is no point in changing distros anyway.

        • dnzm@kbin.social · 2 years ago

          Or, maybe, I could not go out of my way to fit my way of working to someone else’s notion of how I should be doing things? If A works and B doesn’t unless I put in a lot of effort, why exactly would I?

  • 0x0@social.rocketsfall.net · edited · 2 years ago

    X11 is, to put it simply, not at all fit for any modern system. Full stop. Everything to make it work on modern systems are just hacks. Don’t even try to get away with “well, it just works for me” or “but Wayland no worky”.

    I really don’t know if there could be a more obnoxious opening than this. I guess Wayland fanatics have taken a page from the Rust playbook of trying to shame people into using it when technical merits aren’t enough (“But your code is UNSAFE!”)

    • Auli@lemmy.ca · 2 years ago

      OK, but then what about the developers of X11 who decided it wasn’t worth fixing the issues and started a new project called Wayland, where they could start from scratch to fix them? Does that change your mind at all?

        • West Siberian Laika@lemm.ee · 2 years ago

          I don’t want to sound rude, but how old is your setup? Are you using a desktop or a laptop computer?

          Because I’m daily driving a late-2015 Dell XPS 9350, and X11 just ain’t cutting it, even though the laptop is nearly a decade old. On X11, its trackpad would be garbage, GNOME’s animations would be stuttery, and fractional scaling would be a mess, because I have a docking station with a 75 Hz ultrawide monitor, meaning that I must utilise both 125% and 100% scaling factors, as well as 60 Hz and 75 Hz refresh rates and different resolutions. Sure, not everyone uses multi-monitor setups, but those who do serious office tasks or content production work often cannot imagine their workflow without multiple monitors. Point is, X11 is too ancient to handle such tasks smoothly, reliably and efficiently.

          • 0x0@social.rocketsfall.net · 2 years ago

            It’s not rude - don’t worry. My main desktop runs 4 monitors at 1080p. GPU is an RX 580. I have a number of other laptops/tablets/desktops running similar configs, including ones with mixed resolutions and refresh rates. Gaming/video production/programming.

            I think people are really discounting the amount of value experience with a certain set of software has to the end-user. Wayland isn’t a drop-in replacement. There’s a new suite of software and tooling around it that has to be learned, and this is by design. Understandably, many people focus on getting displays working properly on mixed resolutions and refresh rates, but there are concerns for usability/accessibility outside of that.

      • duncesplayed@lemmy.one · 2 years ago

        That would be a “technical merit”, which the article author claims is irrelevant to the discussion.

      • orangeboats@lemmy.world · edited · 2 years ago

        I feel that the biggest mistake of X11’s protocol design is the idea of a “root window” that is supposed to cover the whole screen.

        Perhaps that worked well in the 1990s, but it’s just completely incompatible with the multi-display setups we commonly see today. Hacks upon hacks were involved to make multi-display support a possibility on X11. The root window no longer corresponded to a single display. In heterogeneous display setups, part of the root window is actually invisible.

        Later on we decided to stack compositing on top of the already-hacky mess, and it was so bad that many opted to disable the compositor (no Martha, compositors are more than wobbly windows!).

        And then there’s the problem of sandboxing programs… Which is completely unmappable to X11 even with hacks.

        • michaelrose@lemmy.ml · 2 years ago

          Multiple displays work fine. The only thing that needs to be drawn in the root window is an attractive background sized to your displays. I’m not sure why you think that is hacky or complicated.

          • West Siberian Laika@lemm.ee · 2 years ago

            Multiple displays only work as long as you have identical resolutions and refresh rates. Good luck mixing monitors with different scaling factors and refresh rates on X11.

            • michaelrose@lemmy.ml · 2 years ago

              This wasn’t true in 2003 when I started using Linux; in fact, the feature is so old I’m not sure exactly when it was implemented. You have always been able to have different resolutions and, in fact, different scaling factors. It works like this:

              You scale your lower-DPI display or displays UP to match your highest DPI and let X scale down to the physical size: HIGHER / LOWER = SCALE FACTOR. So with two 27" monitors where one is 4K and the other is 1080p, the factor is 2; a 27" 4K with a 24" 1080p is roughly 1.75.

              Configured like so everything is sharp and UI elements are the same size on every screen. If your monitors are vertically aligned you could put a window between monitors and see the damn characters lined up correctly.

              If you use the soooo unfriendly Nvidia GPU, you can actually configure this in its monitor configuration GUI. If not, you can set it with xrandr; the argument is --scale, shockingly enough.

              Different refresh rates also of course work but you ARE limited to the lower refresh rate. This is about the only meaningful limitation.
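              The arithmetic above can be sketched in a few lines. This is only an illustration: the monitor sizes and the xrandr output name are example values, and using diagonal sizes as a DPI proxy assumes both panels have the same aspect ratio.

```python
# Scale factor = DPI of the denser display / DPI of the lower-DPI display.
# Written as a cross-product so the 27"/27" case comes out exactly 2.0.
def scale_factor(hi_px, hi_size_in, lo_px, lo_size_in):
    """Factor to scale the lower-DPI display up to match the higher-DPI one."""
    return (hi_px * lo_size_in) / (lo_px * hi_size_in)

# Two 27" monitors, one 4K and one 1080p: the factor is exactly 2.
print(scale_factor(3840, 27, 1920, 27))            # 2.0
# A 27" 4K next to a 24" 1080p: the "roughly 1.75" case.
print(round(scale_factor(3840, 27, 1920, 24), 2))  # 1.78
# Applied with xrandr, e.g.: xrandr --output HDMI-1 --scale 2x2
```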

            • Hexarei@programming.dev · 2 years ago

              I run multiple refresh rates without any trouble, one 165 Hz monitor alongside my other 60 Hz ones. Is that supposed to be broken somehow?

      • woelkchen@lemmy.world · 2 years ago

        This is not an insult to the people behind X11.

        The people behind X11 agree and that’s why they founded Wayland.

    • Static_Rocket@lemmy.world · 2 years ago

      No, no, they’ve got a point. The architecture of Wayland is much more sane. Because of the way refresh events are driven, it’s also much more power- and memory-efficient. I’ll miss bspwm and picom, but man, there is a lot riding on simplifying the graphics stack under Linux. The X hacks, GLX, and all the other weird interactions X decided to take away from applications made things non-portable to begin with, and a nightmare for any embedded devices that thought GLES was good enough.

    • russjr08@outpost.zeuslink.net · 2 years ago

      I find that usually when people write “Full stop”, it’s best to just stop reading there in most cases.

      It comes off as “I am correct, how dare you think that for a moment I could be wrong”.

      I’d love to use Wayland, but until it works properly on Nvidia hardware the way X11 does, it’s not a viable option for me. Of course, then someone always goes “Well then use an AMD card”, but money doesn’t grow on trees. The only reason I’m not still using a 970 is because a friend of mine was nice and gave me his 2080 that he was no longer using, along with some other really nice upgrades to my hardware.

      Honestly, it’s one of the biggest issues I have with the Linux community. I love Linux and FOSS software, but the people who go around and yell at anyone who isn’t using Linux, and the people who write articles like this who try to shame you for your choices (something that is supposed to be a hallmark of using open source software), only make Linux look bad.

      There’s a difference between someone kindly telling others that X11 is not likely to receive any new major features and bug fixes (which is the right thing to do, in order to inform someone something they may not know) - and then there’s whatever the author of this quote is doing.

      • bemenaker@lemmy.world · 2 years ago

        It sounds like you need to be complaining to nvidia to do a better job with their drivers. If the drivers suck, it doesn’t matter what wayland does.

      • happyhippo@feddit.it · edited · 2 years ago

        It happens all the time in the magical world of closed source, too.

        Ever heard about the iOS vs Android fights? How people shame Android users for being green bubbles?

        It’s just the extension of the my-camp-vs-theirs mentality applied to the tech field, nothing new.

        • pelotron · edited · 2 years ago

          I laughed off reports about this kind of thing, thinking “omg who could possibly give a shit about what color their text bubble is in a group chat?” Later my gen Z office mate told me about how he uses an iPhone and cited this exact reason unironically. I was stunned into silence.

          • zwekihoyy@lemmy.ml · edited · 2 years ago

            there’s a decent amount of research into the psychology behind it and how reading white text on the light green is more difficult than on the blue bubble. it’s rather interesting.

            edit: although I would think dark mode should change that effect a little bit

        • russjr08@outpost.zeuslink.net · 2 years ago

          Oh absolutely, I am sadly all far too well aware of those cases (especially the “green bubbles” thing, I’ve never rolled my eyes harder at a silly situation).

          It’s not even strictly a tech thing either; it’s a long-standing thing in human history no matter where you look, and unfortunately I don’t see it going away any time soon.

        • HouseWolf@lemmy.ml · 2 years ago

          I’ve noticed it more and more over the years how people will fight tooth and nail to defend a product for no other reason than self validation.

          I’ve even had one person try to sell me on OperaGX as if they were reading off an ad. When I asked them technical questions about it, they just pulled the conversation back to selling up the gimmicks. I finally straight-up asked them why they were advertising something for a company they don’t work for, and they just got offended. It was kind of a surreal experience.

    • michaelrose@lemmy.ml · 2 years ago

      This is literally the exact bad attitude of your average Wayland proponent: the thing that has worked for 20 years “doesn’t work”, you just hallucinated it, along with all the show-stopper bugs you encountered when you tried to switch to Wayland.

      • orangeboats@lemmy.world · edited · 2 years ago

        It’s really not “working” per se. VRR was broken on X11, sandboxing was broken on X11, fractional scaling and mixed DPI were broken on X11.

        How did we achieve HiDPI on X11? By changing Xft.dpi (breaking old things) or adding random environment variables (terrible UX - do you want to worsen Linux desktop’s reputation even more?). Changing XRandR? May your battery life be long lasting.
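        For the record, this is roughly what that env-var approach looks like. The knobs themselves are real, but the values below are examples for 2x scaling, and note that every one of them is global to the session, not per monitor:

```shell
# Session-global HiDPI knobs on X11 (example values for 2x scaling).
export GDK_SCALE=2          # integer UI scaling for GTK applications
export GDK_DPI_SCALE=0.5    # counter-scale GTK fonts so text isn't scaled twice
export QT_SCALE_FACTOR=2    # scaling for Qt applications
# Xft.dpi is set through X resources instead, e.g. a line "Xft.dpi: 192"
# in ~/.Xresources. None of these can differ per monitor.
```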

        There’s genuinely no good way to mix different DPIs on the same X server, even with only one screen! On Windows and Mac, the old LoDPI applications are scaled up automatically by the compositor, but this just doesn’t exist on X11.

        I focus on DPI because this is a huge weakness of X11 and there is a foreseeable trend of people using HiDPI monitors more and more. There are tons of other weaknesses, but people tend to sweep them under the rug as being exotic. And please don’t call HiDPI setups exotic. For all the jokes we see about the eternal 768p screens that laptop manufacturers like to use, mainstream laptops are moving on to 1080p. On a 13" screen, shit looks tiny if you don’t scale it up by 150%.

        You can hate on Wayland, you may work on an alternative called Delaware for all I care, but let’s admit that X11 doesn’t really work anymore and is not the future of Linux desktop.

        • michaelrose@lemmy.ml · 2 years ago

          Outside of your fantasies, high DPI works fine. Modern Qt apps seem to pick it up fairly automatically now, and GTK does indeed require a variable, which could trivially be set for the user.

          Your desktop relies on a wide variety of env variables to function correctly, which doesn’t bother you because they are set for you. This has literally worked fine for me for years. I have no idea what you think you are talking about. Wayland doesn’t work AT ALL for me out of the box without ensuring some variables are set, because my distro doesn’t do that for me; this doesn’t mean Wayland is broken.

          • orangeboats@lemmy.world · 2 years ago

            They pick up “automatically” because of how your DE sets up the relevant envvars for you, there is nothing in the protocol that actually tells the applications “hey, this monitor needs X% DPI scaling!”.

            The side effect of this deficiency in the protocol is very obvious, you can’t mix DPIs, because the envvars or Xft.dpi are global and not per-application. Have you seen a blurry LoDPI X11 window sitting right beside a HiDPI X11 window? Or an X11 window changing its DPI dynamically as you move it across monitors with different DPIs?

            The fact that SDL2 still doesn’t support HiDPI on X11 when it already does on Macs, Windows, and Linux Wayland should tell you something.

            Don’t throw the “it works for me” excuse on me. Because I can throw it back on you too: “Wayland works on my machine”. X11 is utterly broken, just admit it. You are welcome to develop another X11 if you want.

            • michaelrose@lemmy.ml · 2 years ago

              Nothing is set automatically; I run a window manager and it starts what I tell it to start. I observed that at present fewer env variables are required to obtain proper scaling. I did not personally dig into the reasoning for same because, frankly, it’s an implementation detail. I just noted that Qt apps like Dolphin and Calibre are scaled without benefit of configuration, while GTK apps like Firefox don’t work without GDK_SCALE set.

              X actually exposes both the resolution and physical size of displays. This gives you the DPI, if you happen to have mastered basic math. I’ve no idea if this is in fact used, but your statement that NOTHING provides that is trivially disprovable by running xrandr --verbose. It is entirely possible that it’s picking up on the globally set DPI instead, which in this instance would yield the exact same result, because (wait for it):

              You don’t in fact even need apps to be aware of different DPIs or adjust dynamically; you may scale everything up to the exact same DPI and let X scale it down to the physical resolution. This doesn’t result in a blurry screen. The 1080p screen, while not as pretty as the higher-res screens, looks neither better nor worse than it does without scaling.
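              The DPI arithmetic being referred to is just this. A sketch only: the 597 mm width for a 27" 16:9 panel is an example figure, not something taken from the thread.

```python
MM_PER_INCH = 25.4

def dpi(pixels, millimetres):
    """Dots per inch from a pixel count and the matching physical length,
    the two values xrandr --verbose reports for each output."""
    return pixels / (millimetres / MM_PER_INCH)

# A 27" 4K panel is roughly 597 mm wide:
print(round(dpi(3840, 597)))  # 163
# The same width at 1080p is half the density:
print(round(dpi(1920, 597)))  # 82
```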

              Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine when Wayland was a steaming pile of shit nobody in their right mind would use. It probably supported it when you yourself were in elementary school.

              • orangeboats@lemmy.world · edited · 2 years ago

                Nothing is set automatically I run a window manager and it starts what I tell it to start. I observed that at present fewer env variables are now required to obtain proper scaling.

                Fun fact: zero envvars are needed for HiDPI support on Wayland.

                You do possibly need envvars to enable Wayland support though, but the latest releases of Qt6, GTK4, SDL3 etc. are enabling Wayland by default these days so in the future everything will work out of the box. By default.

                X actually exposes both the resolution and physical size of displays. This gives you the DPI if you happen to have mastered basic math. I’ve no idea if this is in fact used but your statement NOTHING provides that is trivially disprovable by runing xrandr --verbose.

                Did I say XRandR and mixed DPI in my previous comments? Yeah, I think I did. What the Qt applications currently do is choose the max DPI and stick with it. There are some nasty side effects, as I will explain below.

                You don’t in fact actually even need apps to be aware of different DPI or dynamically adjust you may scale everything up to the exact same DPI and let X scale it down to the physical resolution.

Don’t forget the side effect: GPU and/or CPU demands (depending on the renderer) increase… a lot, nearly 2x in some cases. That might not be acceptable on devices like laptops - have you used projectors in college?
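Back-of-the-envelope numbers for that claim, assuming (as an example) a 4K panel next to a 1080p panel that gets supersampled to 4K before downscaling:

```shell
# Pixels rendered per frame: both panels native vs. the 1080p one at 4K.
native=$(( 3840*2160 + 1920*1080 ))
scaled=$(( 3840*2160 + 3840*2160 ))
awk -v n="$native" -v s="$scaled" 'BEGIN { printf "%.1fx\n", s / n }'   # prints 1.6x
```

The exact overhead depends on the renderer and on the panel mix, which is why it’s “nearly 2x in some cases” rather than always.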

Anecdotally speaking, I gained 1 to 2 hours of battery life just by ditching X11. That’s impressive considering my battery life was like 4 to 5 hours back then. Now it’s actually competitive with Windows, which usually gets 6 to 7 hours.

Furthermore, scaling up and down in multiple passes, instead of letting clients do it in “one go” and having the compositor scan the result directly out to your screen, leads to problems in font rendering (antialiasing shenanigans) on top of the power consumption increase. It’s the very reason Wayland added a fractional_scale protocol.

Why would I need to develop another X11? I believe I shall go on using this one, which already supported high and mixed DPI just fine back when Wayland was a steaming pile of shit nobody in their right mind would use.

                Apparently the “nobody” includes GTK, Qt, SDL, and all the mainstream DEs (Xfce and Cinnamon included - even they are preparing to add Wayland support). 90% of the programs I use actually support Wayland pretty well. Good job lad, you managed to invalidate your own argument.

                Besides that, you still haven’t properly answered the question of mixed DPI: have you seen a properly-scaled-up LoDPI X11 application? It’s a big problem for XWayland developers. See it here. And yes… those developers are (were?) X11 developers. I think they know how unworkable X11 is, more than you do.

                • michaelrose@lemmy.ml
                  link
                  fedilink
                  English
                  arrow-up
                  0
                  ·
                  2 years ago

It doesn’t take a meaningful or measurable amount of CPU/GPU to scale my third monitor. In practice, the actual usage of real apps so dwarfs any overhead that the overhead is immeasurable statistical noise. Nearly all of the CPU power is going to the multitude of applications, not to drawing more pixels.

The concern about battery life is probably equally pointless. People worrying about scaling multiple monitors are normally in places where they have another exciting innovation available: the power cord. If you are kicking it with portable monitors at the coffee shop, you are infinitely more worried about powering the actual displays than about the GPU power required to scale them. Also, some of us have actual desktops.

                  Furthermore, scaling up and down in multiple passes, instead of letting the clients doing it in “one go” and have the compositor scan it directly onto your screen, leads to problems in font rendering

                  There are some nasty side effects

There just aren’t. It’s not blurry. There aren’t artifacts. It doesn’t take a meaningful amount of resources. I set literally one env variable and it works without issue. In order to feel justified you absolutely NEED this to be a hacky, broken configuration with disadvantages. It’s not; it’s a perfectly trivial configuration, and Wayland basically offers nothing over it save for running in place to get back to the same spot. You complain about the need to set an env var, but switching to Wayland would be a substantial amount of effort, and you can’t articulate one actual benefit, just fictional deficits I can refute by turning my head slightly.

Your responses make me think you aren’t actually listening. For instance:

X11 is utterly broken, just admit it. You are welcome to develop another X11 if you want.

Why would I need to develop another X11? I believe I shall go on using this one which already supported high and mixed DPI just fine when Wayland was a steaming pile of shit nobody in their right mind would use. Apparently the “nobody” includes GTK, Qt, SDL…

Please attend more carefully. Scaling and high DPI were a thing on X back when Wayland didn’t work at all. xrandr supported --scale back in 2001, and high DPI support was a thing in 2012. Wayland development started in 2008, and in 2018 it was still an unusable, buggy pile of shit. Those of us who aren’t in junior high school needed things like high DPI and scaling back when Wayland wasn’t remotely usable, and now that it is starting to get semi-usable I for one see nothing but hassle.

I don’t have a bunch of screen tearing, I don’t have bad battery life, I have working high DPI, I have mixed DPI, and I don’t have a blurry mess. These aren’t actual disadvantages; this is just you failing to notice features that already exist.

Imagine if, at the advent of automatic transmissions, you had 500 assholes on car forums claiming that manual transmission cars can’t drive over 50 MPH / 80 KPH and break down constantly, instead of touting actual advantages. It’s obnoxious to those of us who discovered Linux 20 years ago rather than last week.

            • michaelrose@lemmy.ml
              link
              fedilink
              English
              arrow-up
              0
              ·
              2 years ago

              Why on earth would I develop “another X11” instead of using the one that still works perfectly fine?

        • WuTang @lemmy.ninja
          link
          fedilink
          English
          arrow-up
          0
          ·
          2 years ago

          How did we achieve HiDPI on X11? By changing Xft.dpi (breaking old things) or adding random environment variables (terrible UX - do you want to worsen Linux desktop’s reputation even more?).

You seem to have dealt with Windows recently.

Regarding Linux on the desktop… as long as you don’t involve smelly gamers, it’s perfectly fine.

          • orangeboats@lemmy.world
            link
            fedilink
            arrow-up
            0
            ·
            2 years ago

            I have been daily-driving Linux for years, but I do boot into Windows from time to time. Even then, I recognize that the out-of-the-box experience of Linux desktop isn’t as good as it can be, although it’s been rapidly improving.

    • Sh1nyM3t4l4ss@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      2 years ago

      There are several remarks in that article that bothered me. I agree with their message overall and am a strong proponent of Wayland but…

      Unless your workflow (and hardware) comes from 20+ years ago, you have almost no reason to stick with Xorg

There definitely are valid use cases that aren’t 20 years old that will keep you on X11 for a little while longer. And hardware too: NVIDIA dropped driver support for Kepler GPUs and older before they added GBM support, which is effectively a requirement for Wayland, so you can’t use these older cards on Wayland with the proprietary drivers.

      Of course, NVIDIA likes to do their own thing, as always. Just use Nouveau if you want to do anything with Xwayland, and you don’t have several GPUs.

      Uh, no. Nouveau is not a serious option for anyone who likes using their GPU for useful things. And on those older cards it will likely never work well.

      The author of that article seems extremely ignorant of other people’s needs.

      • woelkchen@lemmy.world
        link
        fedilink
        arrow-up
        0
        ·
        2 years ago

        NVIDIA dropped driver support for Kepler GPUs and older before they added GBM support which is effectively a requirement for Wayland, so you can’t use these older cards on Wayland with the proprietary drivers

That’s definitely the fault of people who buy NVidia hardware, which only works fine on Windows. It’s not the fault of Wayland developers that NVidia is a shit company that doesn’t care to make their hardware run properly on Linux.

        • Sh1nyM3t4l4ss@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          2 years ago

          Can we stop shaming people who buy NVIDIA?

For one, people want to keep using what they have and not buy something new just because it may work better on Linux, and they may not even be able to afford an upgrade. They probably didn’t even know about Linux compatibility when they got it.

          And additionally, some people have to use NVIDIA because e. g. they rely on CUDA or something (which is unfortunate but not their fault).

          And honestly, NVIDIA is fine on Linux nowadays. It sucks that support for older cards will likely stay crappy forever but hopefully with the open kernel drivers and NVK newer cards won’t have to suffer that fate.

          • woelkchen@lemmy.world
            link
            fedilink
            arrow-up
            0
            ·
            2 years ago

            Can we stop shaming people who buy NVIDIA?

            Can people who buy NVidia hardware contrary to widespread wisdom just start to own up to their decisions and not complain about Wayland every time it is mentioned?

      • michaelrose@lemmy.ml
        link
        fedilink
        arrow-up
        0
        ·
        2 years ago

        The author is a Wayland fanboy which almost by definition makes them a moron. We are talking about folks who were singing the same song like 7 years ago when the crack they were promoting was outrageously broken for most use cases.

  • bookmeat@lemm.ee
    link
    fedilink
    arrow-up
    0
    ·
    2 years ago

    I still don’t know why people are willing to give up remoting so willingly. With X it was always easy to send your accelerated video securely over the network. Didn’t Wayland drop this? How are people remoting securely into Wayland desktops now?
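(For what it’s worth: per-application forwarding exists on Wayland via waypipe, which works roughly like ssh -X, while whole-desktop access typically goes through RDP/VNC servers built on the screencast portals. The host and application names below are placeholders:)

```shell
# Forward a single Wayland app over SSH with waypipe.
waypipe ssh user@remote-host foot

# Optional compression for slow links:
waypipe --compress zstd ssh user@remote-host gimp
```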

• AItoothbrush@lemmy.zip
    link
    fedilink
    English
    arrow-up
    0
    ·
    2 years ago

I switched to Wayland because of screen tearing and it fixed it. Idk if X is still glitchy on my new laptop but I don’t really care. Also Hyprland is really cool so I’m happy with Wayland.

  • calzone_gigante@lemmy.eco.br
    link
    fedilink
    arrow-up
    0
    ·
    2 years ago

    Replacing good legacy will always be a struggle. X11 works pretty well and has been stable for decades. Most of the things that suck about it already have workarounds.

The advantages of Wayland are not directly visible to the end user. The security part will be great once it’s fully integrated into distributions to give granular permissions to software. The simpler APIs and greater performance will help library creators, but most developers don’t touch X directly and won’t touch Wayland either.

    Being stable for a couple of months is not good enough. People will use it once distros trust it enough to make it default, and this will probably only happen once Wayland or its compatibility tools work with most software and major applications work significantly better on it.

  • FluffyPotato@lemm.ee
    link
    fedilink
    arrow-up
    0
    ·
    2 years ago

    Yea, I’m currently using Wayland because Manjaro comes with it but like 90% of the programs I use launch with xwayland anyway. I’m not a developer but can’t they just give it proper support for all programs? Like run those programs like they do on X11. Seems pointless if nothing works on it.

    Not to mention my laptop with Nvidia graphics, that is just so broken on Wayland I ended up switching it to X11 and I’m very lazy.

    • orangeboats@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      2 years ago

      Most programs you use (provided they are FOSS) probably already support Wayland, they just don’t do it by default. The following list of environment variables can go a long way in making your system largely Wayland-native.

      GDK_BACKEND=wayland
      QT_QPA_PLATFORM=wayland
      MOZ_ENABLE_WAYLAND=1
      #SDL_VIDEODRIVER=wayland # this one is dangerous if you play games
      
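A quick way to sanity-check those variables once exported (a POSIX sh sketch; separately, running xlsclients on a Wayland session lists whatever is still going through XWayland):

```shell
# Print each backend-selection variable, or <unset> if it isn't exported.
for v in GDK_BACKEND QT_QPA_PLATFORM MOZ_ENABLE_WAYLAND SDL_VIDEODRIVER; do
    printf '%s=%s\n' "$v" "$(printenv "$v" || echo '<unset>')"
done
```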
      • FluffyPotato@lemm.ee
        link
        fedilink
        arrow-up
        0
        ·
        2 years ago

        My desktop is basically just used for gaming so like nothing there is FOSS. My laptop is used for work which requires a ton of remoting so again Wayland would not work.

        • orangeboats@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          2 years ago

          On the other hand… if you are primarily gaming on your PC, then the moment Wine supports Wayland 90% of your programs will be Wayland-native.

          • FluffyPotato@lemm.ee
            link
            fedilink
            arrow-up
            0
            ·
            2 years ago

            And that would be great if/when it happens but currently Wayland offers very little over X11 to the average user. Better multi monitor support is like the only thing.

            • orangeboats@lemmy.world
              link
              fedilink
              arrow-up
              0
              ·
              2 years ago

              Theoretically an average user should feel nothing. The benefits of Wayland are usually more visible when more modern technologies are involved, for example VRR, high DPI, HDR, etc.

        • woelkchen@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          2 years ago

          My laptop is used for work which requires a ton of remoting so again Wayland would not work.

          Funny how a 10 second “wayland remote desktop” web search disproves that claim.

          • FluffyPotato@lemm.ee
            link
            fedilink
            arrow-up
            0
            ·
            2 years ago

            Last time I tried that it just launched through xwayland. And currently my laptop is using X11 because Nvidia GPU so I can’t re-test.

            • woelkchen@lemmy.world
              link
              fedilink
              arrow-up
              0
              ·
              2 years ago

              And currently my laptop is using X11 because Nvidia GPU

All notebooks released in the last several years with an NVidia dGPU also have an AMD or Intel iGPU.

              • FluffyPotato@lemm.ee
                link
                fedilink
                arrow-up
                0
                ·
                2 years ago

Sure, and X11 actually works with its GPU, while Wayland causes more issues than it’s worth. I’m not gonna use software that locks off my GPU, especially if that software has no advantages for me.

                • woelkchen@lemmy.world
                  link
                  fedilink
                  arrow-up
                  0
                  ·
                  2 years ago

                  Wayland causes more issues than its worth. I’m not gonna use software that locks off my GPU

                  Sure, shift the blame of NVidia’s shitty Linux and Wayland support to Wayland. How on earth could a multi-billion dollar company ever manage to make proper drivers only 11 years after Wayland 1.0 has been released…? THAT’S IMPOSSIBLE! WAYLAND BAD!

    • Rescuer6394@feddit.nl
      link
      fedilink
      arrow-up
      0
      ·
      2 years ago

      manjaro switched to Wayland??!

      i installed it a couple of years ago… should i search on how to switch it to Wayland?

      • FluffyPotato@lemm.ee
        link
        fedilink
        arrow-up
        0
        ·
        2 years ago

        I would not bother, Wayland doesn’t have many advantages, at least not from my point of view. I would honestly switch my install to X11 if I wasn’t lazy.