  • I use IPv6 exclusively for my homelab. The pros:

    • No more hole-punching kludges with solutions like ZeroTier or Tailscale: just open a port and you are pretty much good to go.

    • The CGNAT gateway of my ISP tends to be overloaded during the holiday season, so using IPv6 eliminates an instability factor for my lab.

    • You have a metric sh*t ton of address space. I have assigned my SSH server its own IPv6 address, my web server another, my Plex server yet another, … you get the idea. The nice thing here is that even if someone knows the address of my SSH server, they can’t discover my other servers through port scanning, as was typical in IPv4 days.

    • Also, because of the sheer size of the address space, people simply can’t scan your network (see the sketch below for the numbers).
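
    To put rough numbers on that, here is a quick Python sketch; the 2001:db8::/64 documentation prefix and the per-service suffixes are placeholders made up for illustration, not my real addresses:

```python
import ipaddress

# One /64, the standard size for a single LAN segment; 2001:db8::/64 is the
# documentation prefix standing in for a real delegated prefix.
lan = ipaddress.IPv6Network("2001:db8::/64")

# 2**64 host addresses, about 1.8 * 10**19.
print(f"{lan.num_addresses:,} addresses in one /64")

# Even at a wildly optimistic one million probes per second, a full sweep
# of the /64 would take roughly 585,000 years.
probes_per_second = 1_000_000
years = lan.num_addresses / probes_per_second / (3600 * 24 * 365)
print(f"~{years:,.0f} years to scan at {probes_per_second:,} probes/s")

# Hypothetical per-service addresses carved out of the same prefix.
# Unrelated interface identifiers mean that knowing one address
# reveals nothing about the others.
services = {"ssh": 0x1F3A, "web": 0x9C41, "plex": 0x77E2}
for name, suffix in services.items():
    print(name, lan.network_address + suffix)
```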








  • There are times when the original standard has zero forward compatibility built in, such that any improvement made to it necessarily creates a new standard.

    And then there are also times when old greybeards simply disregard the improved standard because they are too used to the classic way.






  • You ably demonstrate your own inability to listen.

    Or was it you?

    I’m not sure how you hallucinated that Wayland got 4 years of design and 8 years of implementation.

    2012-2021, or to clarify “Late 2012 to early-mid 2021” seems to be 8-point-something years to me. I dunno, did mathematics change recently or something?

    With graphics programming relatively in its infancy X11 didn’t require 15 years to become usable

    I hope you do understand that graphics weren’t as complicated back then. Compositing of windows was not an idea (at least not a widespread one) in the 90s. Nor was sandboxing an idea back then. Or multi-display (we hacked it onto X11 later through XRandR). Or, nowadays, HDR. Or HiDPI. Or touch input and gestures. We software-rendered everything too, so DRI and friends weren’t even thought of.

    In a way… you are actually insulting the kernel developers.


  • That is to say in practical effect actual usage of real apps so dwarfs any overhead that it is immeasurable statistical noise

    The concern about battery life is also probably equally pointless.

    some of us have actual desktops.

    There just aren’t. It’s not blurry.

    I don’t have a bunch of screen tearing

    Let me summarize this with your own statement, because you just went and disregarded everything I said:

    Your responses make me think you aren’t actually listening for instance

    Yeah, you are now just outright ignoring people’s opinions. 2 hours of battery life - statistical noise, pointless. Laptops - who neeeeeeeeds those, we have desktops!! Lack of fractional scaling, which people literally listed as a “disadvantage” of Wayland before it got the protocol - yeah, I guess X11 is magic and things are somehow not blurry on X11, which has the same problem when XRandR is used.

    Do I need to quote more?

    Also, regarding this:

    Wayland development started in 2008 and in 2018 was still an unusable buggy pile of shit.

    Maybe you should take note of when Wayland development actually started picking up. 2008 was when the idea came up; 2012 was when the concrete foundation started being laid.

    Not to mention that it was 2021 when Fedora and Ubuntu made it the default. Your experience in 2018 is not representative of the Wayland ecosystem in 2021 at all, never mind that it’s now 2023. The three years between 2018 and 2021 saw various applications either implementing their first Wayland support or maturing it. Maybe you should try again before asserting a bunch of outdated opinions.

    Wayland was effectively rebuilding the Linux graphics stack from the ground up. (No, it’s not rebuilding the stack for the sake of it. The rebuilding actually started in X.org, but people were severely burned out by the end. Hence Wayland. X.org still contains an atomic KMS implementation; it’s just disabled by default.)

    4 years of designing and 8 years of implementation across the entire ecosystem is impressive, not obnoxious.

    It’s obnoxious to those of us who discovered Linux 20 years ago rather than last week.

    Something makes me think that you weren’t actually using it 20 years ago.

    Maybe it’s just my memory of the modelines failing me. Hmmm… did I just hallucinate the XFree86 server taking down my system?

    Oh noes, I am getting old. Damn.