Not just Linux… 99% of the time you see something weird in the computing world, the reason is going to be “because history.”
The C developers are the ones with the ageist mindset.
The Rust developers certainly are not the ones raising the point “C has always worked, so why should we use another language?”, an argument that ignores the objective advantages of Rust and leans solely on C being the older language.
They very rarely have memory and threading issues
It’s always the “rarely” that gets you. A program that doesn’t crash is awesome, a program that crashes consistently is easy to debug (and most likely would be caught during development anyway), but a program that crashes only once a week? Wooo boy.
People vastly underestimate the value Rust brings by ensuring that whole classes of bugs simply never happen.
It really depends.
If I know I will never open the file in the terminal or batch process it in some way, I will name it using Common Case: “Cool Filename.odt”.
Anything besides that gets snake case, preferably prefixed with the current date: “20240901_cool_filename”.
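As a minimal sketch of that date-prefixed scheme (the helper name and the .odt extension are just for illustration):

```python
from datetime import date

def dated_snake_name(title: str, ext: str = "odt") -> str:
    """Build a YYYYMMDD-prefixed, snake_case filename."""
    slug = "_".join(title.lower().split())
    return f"{date.today():%Y%m%d}_{slug}.{ext}"

print(dated_snake_name("Cool Filename"))  # e.g. 20240901_cool_filename.odt
```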
People back then just grossly underestimated how big computing was going to be.
The human brain is not built to predict exponential growth!
I use IPv6 exclusively for my homelab. The pros:
No more hole-punching kludges with solutions like ZeroTier or Tailscale; just open a port and you are pretty much good to go.
The CGNAT gateway of my ISP tends to be overloaded during the holiday season, so using IPv6 eliminates an instability factor for my lab.
You have a metric sh*t ton of addressing space. I have assigned my SSH server its own IPv6 address, my web server another, my Plex server yet another, … You get the idea. The nice thing here is that even if someone knows about the address to my SSH server, they can’t discover my other servers through port scanning, as was typical in IPv4 days.
Also, because of the sheer size of the addressing space, people simply can’t scan your network.
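To illustrate the per-service addressing point above, here is a minimal sketch of binding a service to exactly one IPv6 address instead of the wildcard. The address is a documentation placeholder (2001:db8::/32), not my real prefix, and it would need to already be assigned to an interface for the bind to succeed:

```python
import socket

# Hypothetical per-service address carved out of one /64 (documentation prefix, not routable).
SERVICE_ADDR = ("2001:db8:1234::22", 2222)

# Bind a toy TCP service to one specific IPv6 address rather than "::" (all addresses),
# so it is only reachable on the address you hand out for it.
with socket.socket(socket.AF_INET6, socket.SOCK_STREAM) as srv:
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(SERVICE_ADDR)  # fails unless this address is configured on an interface
    srv.listen()
    print(f"listening only on [{SERVICE_ADDR[0]}]:{SERVICE_ADDR[1]}")
```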
If I remember right, the syncing issue was particularly egregious when running windowed X11 programs on Wayland. So it could be that you got lucky.
It’s the explicit sync protocol.
The TL;DR is basically: everyone else has supported implicit sync for ages, but Nvidia doesn’t. So now everyone is designing an explicit sync Wayland protocol to accommodate it.
You need to enable DRM KMS on Nvidia.
Mine is simply default KDE. The only visible thing I’ve changed is the wallpaper; my tweaks mostly concentrate on “invisible” things like shortcut keys, settings, and scripting.
Desktop? I settled on Arch and Fedora.
Server? Debian. Although technically I never distrohopped on servers, been using Debian since the beginning of time.
Ackshually you still can blame Windows for not supporting live updates.
There are times when the original standard has zero forward compatibility built in, such that any improvement made to it necessarily creates a new standard.
And then there are also times when old greybeards simply disregard the improved standard because they are too used to the classic way.
To be fair: A notebook with a bunch of strong passwords is probably more secure than a human brain memorising a bunch of weak passwords.
Can’t replicate your results here. I play on Wayland, and deliberately force some games to run natively on Wayland (SDL_VIDEODRIVER=wayland), and so far I haven’t noticed any framerate changes except statistical noise.
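If you want to try the same thing, a minimal sketch of forcing that variable when launching a game (the game path is purely a placeholder):

```python
import os
import subprocess

# Copy the current environment and force SDL to use its Wayland video backend.
env = dict(os.environ, SDL_VIDEODRIVER="wayland")

# "/path/to/game" is a stand-in; point it at any SDL-based title you want to test.
subprocess.run(["/path/to/game"], env=env, check=False)
```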
Still really good but I wanted it to melt my face like the end of Indiana Jones.
Truly c/BrandNewSentence material.
Seriously though, never challenge Indian and Thai cooks on spiciness. They can be ruthless!
It’s not a fork of wlroots. wlroots is a library to assist developers in creating Wayland compositors.
You ably demonstrate your own inability to listen.
Or was it you?
I’m not sure how you hallucinated that Wayland got 4 years of design and 8 years of implementation.
2012-2021, or to clarify “Late 2012 to early-mid 2021” seems to be 8-point-something years to me. I dunno, did mathematics change recently or something?
With graphics programming relatively in its infancy X11 didn’t require 15 years to become usable
I hope you do understand that graphics weren’t as complicated back then. Compositing of windows was not an idea (at least, not a widespread one) in the 90s. Nor was sandboxing an idea back then. Or multidisplay (we hacked it onto X11 later through XRandR). Or HDR nowadays. Or HiDPI. Or touch input and gestures. We software rendered everything too, so DRI and friends weren’t thought of.
In a way… you are actually insulting the kernel developers.
That is to say in practical effect actual usage of real apps so dwarfs any overhead that it is immeasurable statistical noise
The concern about battery life is also probably equally pointless.
some of us have actual desktops.
There just aren’t. It’s not blurry.
I don’t have a bunch of screen tearing
Let me summarize this with your own statement, because you certainly just went out and disregarded everything I said:
Your responses make me think you aren’t actually listening for instance
Yeah, you are now just outright ignoring people’s opinions. 2 hours of battery life - statistical noise, pointless. Laptops - who neeeeeeeeds those, we have desktops!! Lack of fractional scaling, which people literally listed as a “disadvantage” of Wayland before it got the protocol - yeah, I guess X11 is magic and somehow things are not blurry there, even though it has the same problem whenever XRandR scaling is used.
Do I need to quote more?
Also, regarding this:
Wayland development started in 2008 and in 2018 was still an unusable buggy pile of shit.
Maybe you should take note of when Wayland development had actually started picking up. 2008 was when the idea came up. 2012 was when the concrete foundation started being laid.
Not to mention that it was 2021 when Fedora and Ubuntu made it the default. Your experience in 2018 is not representative of the Wayland ecosystem in 2021 at all, never mind that it’s now 2023. The three years between 2018 and 2021 saw various applications either implementing their first Wayland support or maturing it. Maybe you should try again before asserting a bunch of outdated opinions.
Wayland was effectively rebuilding the Linux graphics stack from the ground up. (No, it’s not rebuilding the stack for the sake of it. The rebuilding actually started in X.org, but people were severely burned out in the end. Hence Wayland. X.org still contains an atomic KMS implementation, it’s just disabled by default.)
4 years of designing and 8 years of implementation across the entire ecosystem is impressive, not obnoxious.
It’s obnoxious to those of us who discovered Linux 20 years ago rather than last week.
Something makes me think that you weren’t actually using it 20 years ago.
Maybe it’s just my memory of the modelines failing me. Hmmm… did I just hallucinate the XFree86 server taking down my system?
Oh noes, I am getting old. Damn.
For many systems out there, /bin and /lib are no longer a thing. Instead, they are just links to /usr/bin and /usr/lib. And on some systems even /sbin has been merged into /bin (in turn linked to /usr/bin).
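A quick way to check how far your own distro has gone with the /usr merge; whether these paths are symlinks or real directories varies by distro, so treat this as a sketch:

```python
import os

# On usr-merged systems these are symlinks into /usr; on older layouts they are real directories.
for path in ("/bin", "/sbin", "/lib"):
    if os.path.islink(path):
        print(f"{path} -> {os.readlink(path)}")
    elif os.path.isdir(path):
        print(f"{path} is a real directory (not usr-merged)")
    else:
        print(f"{path} does not exist here")
```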