So I’ve been trying to install the proprietary Nvidia drivers on my homelab so I can get my fine ass art generated using Automatic1111 & Stable Diffusion. I installed the Nvidia 510 server drivers and everything seemed fine; then, when I rebooted, nothing. WTF Nvidia, why you gotta break X? Why is X even needed in a server driver? What’s your problem, Nvidia!
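For what it’s worth, on Ubuntu-family distros the `-server` driver packages are meant for headless use and don’t pull in the desktop/X components, which would explain the broken X. A sketch of the usual fix (assuming Ubuntu; the 510 version and exact package names may vary by release):

```shell
# Remove the headless server variant and install the desktop metapackage instead.
# Package names here are assumptions -- check `ubuntu-drivers list` on your box.
sudo apt purge 'nvidia-driver-510-server*'
sudo ubuntu-drivers install   # picks the recommended desktop driver
sudo reboot
```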

  • fx_@feddit.de · 1 year ago

    Nvidia doesn’t hate Linux; it just doesn’t care, and the Linux community hates Nvidia.

    • Vilian@lemmy.ca · 1 year ago

      AMD didn’t care a few years ago either, but their drivers are open, so the community could fix things even when the company didn’t care (AMD cares a lot more now, so it’s better). Nvidia is closed-source crap, and it doesn’t give a fuck either.

    • HurlingDurling@lemm.ee · 1 year ago

      And they can’t get all that sweet, sweet tracking data they get from Windows users.

  • GenderNeutralBro@lemmy.sdf.org · 1 year ago

    Linux is their bread and butter when it comes to servers and machine learning, but that’s a specialized environment and they don’t really care about general desktop use on arbitrary distros. They care about big businesses with big support contracts. Nobody’s running Wayland on their supercomputer clusters.

    I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good. I swear, 90% of my tech problems over the past 5 years have boiled down to “Nvidia sucks”. I’ve changed distros three times hoping it would make things easier, and it never really does; it just creates exciting new problems to play whack-a-mole with. I currently have Ubuntu LTS working, and I’m hoping I never need to breathe on it again.

    That said, there’s honestly some grass-is-greener syndrome going on here, because you know what sucks almost as much as using Nvidia on Linux? Using Nvidia on Windows.

    • lightstream@lemmy.ml · 1 year ago

      I cannot wait until architecture-agnostic ML libraries are dominant and I can kiss CUDA goodbye for good

      I really hope this happens. After being on Nvidia for over a decade (960 for 5 years and similar midrange cards before that), I finally went AMD at the end of last year. Then of course AI burst onto the scene this year, and I’ve not yet managed to get stable diffusion running to the point it’s made me wonder if I might have made a bad choice.

      • Ádám@discuss.tchncs.de · 1 year ago

        It’s possible to run Stable Diffusion on AMD cards; it’s just a bit more tedious and a lot slower. I managed to get it working on my RX 6700 under Arch Linux just fine. Now that I’m on Fedora it doesn’t really want to work for some reason, but I’m sure that can be fixed as well; I just haven’t spent enough time on it.
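        In case it helps anyone, the rough recipe I know of for Automatic1111 on RDNA2 cards is below (a sketch, not a guaranteed setup: the ROCm wheel version and the override value are assumptions that depend on your card and ROCm release):

```shell
# Install the ROCm build of PyTorch (the wheel index version varies),
# then launch the webui with the override that RDNA2 cards usually need.
pip install torch torchvision --index-url https://download.pytorch.org/whl/rocm5.6
export HSA_OVERRIDE_GFX_VERSION=10.3.0   # for RX 6700-class (gfx103x) cards
./webui.sh
```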

    • ProtonBadger@kbin.social · 1 year ago

      Yeah, they don’t hate Linux; they just have their own priorities. That said, I’m running Nvidia+Wayland happily. For desktop use they have done a lot more work on Wayland this year, the upcoming driver fixes a bunch of things, and my distro handles driver installation and updates, so I never have to think about it.

    • Sparking@lemm.ee · 1 year ago

      It just makes no sense to me, though: how is it sustainable for Nvidia to not have great Linux kernel support? Like, let the kernel maintainers do their job and reap the benefits. I’m guessing that Nvidia sees enterprise support contracts as an essential revenue stream, but eventually even enterprises are going to go with hardware that Linus isn’t giving the finger to, right? Am I crazy?

    • SkySyrup@sh.itjust.works · 1 year ago

      I totally agree with that whack-a-mole analogy. What, you’re on Debian or any Debian-based distro? Well, my friend, good luck ever waking up from suspend! No error logs, no crash reports, nothing. So I’m on Arch right now. Annoying, because I prefer getting stuff done instead of yelling at my Bluetooth driver, but at least the Nvidia drivers work most of the time :)

      Sorry, rant over.

  • xrun_detected@programming.dev · 1 year ago

    Nvidia has always been hostile to open source, for as far back as I can remember.

    Back when Nvidia bought 3dfx, they took down the source code for the open 3dfx drivers within days, if not the same day. I remember because I had just gotten myself a sweet Voodoo 5 some weeks before that, and the great Linux support was the reason I chose it… Of course the driver code survived elsewhere, but it told me all I needed to know about that company.

    Also: Linus’ rant wasn’t just a fun stunt; it was necessary to push Nvidia toward properly cooperating with the open source community if they wanted to keep making money from Linux running on their hardware.

  • sealneaward@lemmy.ml · 1 year ago

    Takes about 8 hours to set up properly. But once you do get your Nvidia card working with Linux, you just never update your OS and cry yourself to sleep every night.

    • BaconIsAVeg@lemmy.ml · 1 year ago

      It took a little tweaking but I have Iray and Dforce working for Daz3D under Wine, and FFXIV runs great with it as well. Also got Stable Diffusion running last night without any issues.

      All the issues I’ve had have come down to needing extra packages, some random GitHub nvlibs, and kernel parameters. So as a user, the fact that my Nvidia card didn’t work painlessly out of the box without additional configs doesn’t seem like an Nvidia problem…

      • Rhabuko@feddit.de · 1 year ago

        Yikes… I’d hoped that Daz3D would just work fine with Iray on my next Nvidia card. Add to this that I was planning to switch to Vanilla OS because I don’t want to hack my system or accidentally bork it 😑.

  • ghariksforge@lemmy.world · 1 year ago

    Companies love to use open source software to reduce their development costs. They hate to contribute back.

    • Cethin@lemmy.zip · 1 year ago

      That’s not true. Some companies contribute. AMD does a great job fostering open source software. This is an Nvidia issue. They are a plague and I hope they one day lose market share for it.

  • scorpiosrevenge@lemmy.ml · 1 year ago

    Switched to high powered AMD GPUs years ago… No regrets. Awesome graphics, better support, and a better price point usually.

    • Lemminary@lemmy.ml · 1 year ago

      I did have many regrets, mainly overheating and the card eventually failing on me. Funny how these large companies ship their shit to “third world countries” so people have a lower chance of returning their POS.

  • WasPentalive@beehaw.org · 1 year ago

    Nvidia does not ‘hate’ Linux; Nvidia simply never thinks about Linux. They need to keep secrets so people can’t buy the cheap card and, with a little programming, turn it into the expensive card.

      • WasPentalive@beehaw.org · 1 year ago

        Of course you do. Nvidia wants you to buy the expensive card instead. Since they are almost the same card in some instances, the only difference is knowing that you can change values in certain registers to make cheapcard act like expensivecard. I personally use Intel graphics and won’t have Nvidia.

    • michel@lemmy.ml · 1 year ago

      This. I bet the experience is better if you use an enterprise distro they have precompiled drivers for.

      With the boom in AI, their focus is increasingly on the data center market, so it’s a small miracle (thanks to Red Hat and others prodding them) that they even have an open driver right now for newer cards (tellingly, it’s in a better state for computational use than for rendering pixels on a screen).

  • sonymegadrive@feddit.uk · 1 year ago

    I’m gonna be that person… I rarely, if ever, have issues with Nvidia on Linux. I’ve used several 30xx-series cards for gaming over the last couple of years, and it’s been a great experience.

    Is it my distro (Void)? Is it because I’m happy staying on X11? Is it just luck? Interested to hear people’s gripes.

    • ForbiddenRoot@lemmy.ml · 1 year ago

      I’m gonna be that person…

      Well, you are not alone. While I too would prefer not to use proprietary drivers, I have had no problems on any of my Nvidia machines either. Ironically, despite the open source drivers, getting a 7900 XTX card up and running was an issue for me for months until distros caught up (with newer kernels and Mesa libs), while my 4090 installation was a breeze even on the day it was released.

      A lot of problems people have with Nvidia GPUs seem to be installation related. I think that is because the installation tends to be distro-specific, and people do not necessarily follow the correct procedure for their distro, or try installing the drivers directly from the Nvidia site as they would on Windows. For example, Fedora requires you to add RPM Fusion, Debian needs non-free added to sources, Linux Mint lets you install the proprietary drivers but only after the first boot, and so on. Pop!_OS probably makes the process the easiest with their Nvidia-specific ISO.
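      The distro-specific steps mentioned above, sketched from memory (repo setup and package names may drift between releases, so treat these as pointers rather than a recipe):

```shell
# Fedora: enable the RPM Fusion nonfree repo first, then:
sudo dnf install akmod-nvidia

# Debian: add "non-free" (and "contrib") to your apt sources, then:
sudo apt install nvidia-driver

# Ubuntu: let the detection tooling pick the recommended driver:
sudo ubuntu-drivers install
```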

    • sLLiK@lemmy.ml · 1 year ago

      Minimal issues here. Set up Arch, install the Nvidia driver, add the build hooks before the next kernel update, carry on.
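      The “build hooks” step is presumably something like the pacman hook the Arch Wiki suggests, so the initramfs is rebuilt whenever the driver or kernel updates (a sketch; adjust `Target=` for nvidia-dkms or linux-lts as appropriate):

```ini
# /etc/pacman.d/hooks/nvidia.hook
[Trigger]
Operation=Install
Operation=Upgrade
Operation=Remove
Type=Package
Target=nvidia
Target=linux

[Action]
Description=Rebuilding initramfs for NVIDIA driver change...
Depends=mkinitcpio
When=PostTransaction
Exec=/usr/bin/mkinitcpio -P
```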

    • polygon@kbin.social · 1 year ago

      I have a 3080 and it runs fine with openSUSE Tumbleweed. On first boot you do need to add the Nvidia repo and then install the driver, which I guess could be problematic for new Linux users, but it’s literally pasting one line into the terminal and then clicking the driver in YaST. Echoing what others have said, I’d prefer if Nvidia were a little less hostile to open source, but frankly the driver just works, and works well. The only thing I’ve used besides openSUSE lately is Pop!_OS, and I believe the Nvidia driver was installed automatically there. If someone is having trouble getting the driver installed, that seems to be a failure of the distro, not the user. You should be able to depend on your distro’s packaging to take care of this stuff.
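      For reference, the openSUSE one-liner is roughly this (based on the openSUSE docs; the repo URL and alias may change, so double-check the wiki):

```shell
# Add the official NVIDIA repo for Tumbleweed, then let zypper pick the driver
sudo zypper addrepo --refresh https://download.nvidia.com/opensuse/tumbleweed NVIDIA
sudo zypper install-new-recommends --repo NVIDIA   # or pick the driver in YaST
```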

    • cybersandwich@lemmy.world · 1 year ago

      There is definitely some substance behind the complaints, but I think they are overblown, or just the typical Linux-user thing of parroting something they heard other people say.

      On Pop!_OS my 3070 Ti was always stable. I ran into occasional stuttering in the DE, but the biggest thing was I had to manually compile shaders using some guy’s GitHub repo to play Apex Legends without it being a stuttery mess. But like you said, Pop is on X11, so maybe that makes a difference?

      I bought into the “if you are going to use Linux, especially for gaming, you need an AMD GPU” line. So I bought a 6900 XT. I’ve had as many issues with my 6900 XT; they are just different types of issues. Nothing insurmountable, but it’s not like it’s some panacea.

    • lightstream@lemmy.ml · 1 year ago

      Same. I had an Nvidia 960 for about 5 years on Arch with very few problems. Maybe twice over that time I had to roll back to an older driver temporarily due to some incompatibility with Wine or suchlike.

      Towards the end of last year I finally decided to upgrade (mostly to play RDR2) and I went with AMD. I love the feel of using a pure open source gfx stack, but there is no real functional advantage to it.

    • russjr08@outpost.zeuslink.net · 1 year ago

      is it because I’m happy staying on X11?

      I think this is a big part of it. I have no issues with Nvidia + X11; however, if I try to use Wayland with my 2080, I get numerous issues that have me running back to Xorg very quickly.

    • Hairyblue@kbin.social · 1 year ago

      I use Ubuntu and Nvidia 3080 and the only issue I have had was when Steam updated their Big Picture Mode. I was using Wayland and it broke with the new Big Picture Mode. I had to switch back to x11 and it works well with that. I do hope Nvidia and Steam fix the Wayland issue. I’d rather use Wayland.

      I have been using my Linux gaming PC for a couple of years now. I jumped ship because of the ad-riddled Windows 11, and I have been very happy with Steam/Proton gaming.

  • Sparking@lemm.ee · 1 year ago

    What I don’t get is how Nvidia stock is exploding when using their hardware for AI is a nightmare on Linux. How are companies doing this? Are they just offering enterprise support to insiders or something?

    • Crayphish@sh.itjust.works · 1 year ago

      For what it’s worth, Nvidia’s failings on Linux tend to be mostly in the desktop experience. As a compute device driven by CUDA and not responsible for the display buffer, their cards work plenty well. Enterprises will not be running GUIs or DEs on the machines that do the AI work, if they run them at all.

      • Aasikki@lemmy.ml · 1 year ago

        Even the old 1060 in my TrueNAS SCALE server has worked absolutely flawlessly with my Jellyfin server.

      • Diplomjodler@feddit.de · 1 year ago

        They don’t give a fuck about consumers these days, and with Linux being just a tiny fraction of the userbase, they give even less of a fuck.

      • Klara@lemmy.blahaj.zone · 1 year ago

        I’ve had a bunch of issues with my GTX 1080 before I switched to an AMD RX 5700 XT. I love it, but I recently put the 1080 back in use for a headless game streaming server for my brother. It’s been working really well, handling both rendering and encoding at 1080p without issue, so I guess I’ve arrived at the same conclusion. They don’t really care about desktop usage, but once you’re not directly interacting with a display server on an Nvidia GPU, it’s fine.

    • cybersandwich@lemmy.world · 1 year ago

      Nvidia is a breeze on Linux vs AMD. CUDA is the only thing meaningfully supported across Windows and Linux. I fought with my 6900 XT for so long trying to get ROCm working that I eventually bought a used 1080 Ti just to do the AI/ML stuff I wanted to do. I threw that into a server and had everything up and running in literally 10 minutes (and 5 of those were making Proxmox pass the GPU through to the VM).
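      The Proxmox passthrough part is roughly this (a sketch only; the PCI address, VM ID, and `intel_iommu` vs `amd_iommu` are placeholders that depend on your hardware):

```shell
# 1) Enable IOMMU via the kernel cmdline (edit /etc/default/grub):
#    GRUB_CMDLINE_LINUX_DEFAULT="quiet intel_iommu=on iommu=pt"
# 2) Load the vfio modules and rebuild the boot config:
echo -e "vfio\nvfio_iommu_type1\nvfio_pci" >> /etc/modules
update-grub && update-initramfs -u -k all
# 3) Attach the GPU to the VM by its PCI address (find it with `lspci`):
qm set 100 --hostpci0 01:00.0,pcie=1
```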

      People want to bitch about Nvidia, but their entire ecosystem is better than AMD’s. The documentation is better and the tooling is better. On paper AMD is competitive, but in practice Nvidia has so much more going for it, especially if you are doing any sort of AI/ML.

      There are some benefits to AMD on Linux; it’s the reason I replaced my 3070 Ti with a 6900 XT. But that experience taught me two things: 1. AMD isn’t as good on Linux as people give it credit for. 2. Nvidia isn’t as bad on Linux as people blame it for. You trade different issues. E.g. you lose NVENC, and you can’t use AMF unless you use the AMDGPU-PRO driver rather than the open source one; and if you use the PRO driver you immediately lose half the benefits of the open source driver, which is probably why you switched to AMD on Linux to begin with. So if you game, you can’t stream with a decent encoder, and you have to play with settings and throw CPU horsepower at it.

      But hey, my DE doesn’t stutter and I don’t have to do kludgy workarounds to get some games to play.

  • mub@lemmy.ml · 1 year ago

    I’m on the cusp of jumping to Arch. Before I do, I’m replacing my RTX 3080 with an RX 6800 XT. They are close enough in performance and identically priced on eBay.

    I’ve done a bunch of testing and found great support for all my hardware except my Razer Ripsaw HDMI capture device, which I can replace with something supported. It is just the Nvidia bullshit holding me back.

    • /home/pineapplelover@lemm.ee · 1 year ago

      When I built my pc, I made sure to get AMD because of the nvidia outcry from the linux community. Thank goodness I got a 6800xt. I haven’t had any problems with it. It worked straight out of the box.

    • brakenium@lemm.ee · 1 year ago

      While I’m using AMD now, I had no issues with Nvidia on Arch using X before I switched earlier this year. You just install the nvidia or nvidia-dkms package. My main reasons for switching were that my 1060 6GB was getting old, AMD had a better price, and since I keep cards as long as my last one, I wanted to be certain Wayland support was good, even though I don’t use it right now.

      • Delta_44@lemmy.world · 1 year ago

        I’m using X, but X has been shitty for ages, and Wayland is gaining neat features that I personally don’t give a damn about (see: the tearing protocol).

  • planish@sh.itjust.works · 1 year ago

    They love to publish drivers that worked with like 1 release of X 5 years ago when the card came out and never update them.

    Except when they update them and it breaks X.

  • danielton@outpost.zeuslink.net · 1 year ago

    I call them “novideo” because the nvidia GPU in a PC someone gave me was the bane of my existence on Linux. I ended up buying a Radeon for it because I got so tired of having no video after security updates. Nvidia seems to hate everybody except Windows for some reason. Even Apple ditched them long before they ditched Intel.

    And yet, it seems like the majority of Linux users have Nvidia anyway.

    • Rassilonian Legate@mstdn.social · 1 year ago

      @danielton
      @Mr_Esoteric
      >But yet, it seems like the majority of Linux users have nvidia anyway.

      Probably because it’s more popular among Windows users, so when most people switch to Linux from Windows, they use the hardware they already had, which more often than not includes an Nvidia GPU.

    • 1984@lemmy.today · 1 year ago

      Nvidia seems to hate everybody except Windows for some reason.

      It’s called money. Microsoft and all these big tech companies have lots of agreements with each other to support certain choices and ignore others. This is also why Lenovo has a very limited choice of AMD processors, and when they do offer one, it’s in a model with other serious flaws.

      • danielton@outpost.zeuslink.net · 1 year ago

        But it’s still stupid, especially if it’s about money, since Nvidia wants to sell a lot of chips to the Android market. And with Linux users being dumb enough to keep buying Nvidia products and using their mediocre proprietary drivers, nothing will ever change.