So I’ve been trying to install the proprietary Nvidia drivers on my homelab so I can get my fine ass art generated using Automatic1111 & Stable Diffusion. I installed the Nvidia 510 server drivers, everything seemed fine, then when I rebooted: nothing. WTF Nvidia, why you gotta break X? Why is X even needed on a server driver? What’s your problem, Nvidia!
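For anyone hitting the same wall: on Ubuntu-family distros the driver also ships in a headless variant that leaves the X/display components out entirely. A minimal sketch, assuming Ubuntu and the 510 server branch — package names differ on other distros, so check yours first:

```shell
# Clear the partial install first, then pull only the headless bits.
# --no-install-recommends keeps apt from dragging in desktop components.
sudo apt purge 'nvidia-*'
sudo apt install --no-install-recommends \
    nvidia-headless-510-server nvidia-utils-510-server
sudo reboot
nvidia-smi   # should list the GPU once the kernel module loads
```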

  • Sparking@lemm.ee · 1 year ago

    What I don’t get is how Nvidia stock is exploding when using their hardware for AI is a nightmare on Linux. How are companies doing this? Are they just offering enterprise support to insiders or something?

    • Crayphish@sh.itjust.works · 1 year ago

      For what it’s worth, Nvidia’s failings on Linux tend to be mostly in the desktop experience. As a compute device driven by CUDA and not responsible for the display buffer, their cards work plenty well. Enterprise won’t be running GUIs or DEs on the machines that do the AI work, if at all.
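That split is easy to verify: the CUDA driver API needs no display server at all. A minimal probe via ctypes — assuming only that the driver library `libcuda.so.1` may or may not be installed (it returns None when no Nvidia driver is present, so it’s safe to run anywhere):

```python
import ctypes

def cuda_device_count():
    """Probe libcuda directly -- no X, no display server, no frameworks.
    Returns the device count, or None if the driver library is missing
    or fails to initialize."""
    try:
        libcuda = ctypes.CDLL("libcuda.so.1")
    except OSError:
        return None  # no Nvidia driver installed on this machine
    if libcuda.cuInit(0) != 0:          # CUDA_SUCCESS == 0
        return None
    count = ctypes.c_int(0)
    if libcuda.cuDeviceGetCount(ctypes.byref(count)) != 0:
        return None
    return count.value

print(cuda_device_count())
```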

      • Aasikki@lemmy.ml · 1 year ago

        Even the old 1060 in my TrueNAS SCALE server has worked absolutely flawlessly with my Jellyfin server.

      • Diplomjodler@feddit.de · 1 year ago

        They don’t give a fuck about consumers these days, and with Linux being just a tiny fraction of the userbase, they give even less of a fuck.

      • Boo@lemmy.blahaj.zone · 1 year ago

        I had a bunch of issues with my GTX 1080 before I switched to an AMD RX 5700 XT. I love it, but I recently put the 1080 back in use for a headless game streaming server for my brother. It’s been working really well, handling both rendering and encoding at 1080p without issue, so I guess I’ve arrived at the same conclusion. They don’t really care about desktop usage, but once you’re not directly interacting with a display server on an Nvidia GPU, it’s fine.

    • cybersandwich@lemmy.world · 1 year ago

      Nvidia is a breeze on Linux vs AMD. CUDA is the only compute stack meaningfully supported across both Windows and Linux. I fought with my 6900 XT for so long trying to get ROCm working that I eventually bought a used 1080 Ti just to do the AI/ML stuff I wanted to do. I threw that into a server and had everything up and running in literally 10 minutes (and 5 of those were spent making Proxmox pass the GPU through to the VM).
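The Proxmox side of that passthrough really is only a couple of commands. A hedged sketch, assuming IOMMU is already enabled in firmware and on the kernel command line — the VM ID (100) and PCI address are placeholders:

```shell
# Find the GPU's PCI address on the Proxmox host.
lspci -nn | grep -i nvidia          # e.g. 01:00.0 VGA ... NVIDIA ...

# Attach the whole device to VM 100 as a PCIe passthrough device.
qm set 100 -hostpci0 0000:01:00,pcie=1

# Then install the Nvidia driver inside the VM as usual and verify
# with nvidia-smi once the guest boots.
```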

      People want to bitch about Nvidia, but their entire ecosystem is better than AMD’s. The documentation is better and the tooling is better. On paper AMD is competitive, but in practice Nvidia has so much more going for it, especially if you’re doing any sort of AI/ML.

      There are some benefits to AMD on Linux; it’s the reason I replaced my 3070 Ti with a 6900 XT. But that experience taught me: 1. AMD isn’t as good on Linux as people give it credit for, and 2. Nvidia isn’t as bad on Linux as people blame it for. You trade different issues. E.g. you lose NVENC and can’t use AMF unless you run the AMDGPU-PRO driver instead of the open-source one, and if you run the PRO driver you immediately lose half the benefits that probably made you switch to AMD on Linux to begin with. So if you game, you can’t stream with a decent encoder; you have to play with settings and throw CPU horsepower at it.
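On the open-source AMD stack the usual escape hatch is VA-API rather than AMF. A sketch, assuming Mesa’s VA-API driver is installed and that `/dev/dri/renderD128` is the AMD card (the render node path varies per machine):

```shell
# Hardware H.264 encode through VA-API on the open-source amdgpu/Mesa stack.
# format=nv12,hwupload converts the decoded frames and uploads them to the GPU.
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mkv \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi -b:v 6M output.mkv
```

h264_vaapi exposes fewer rate-control knobs than NVENC, which is where the settings-fiddling comes in.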

      But hey, my DE doesn’t stutter and I don’t have to do kludgy workarounds to get some games to play.