• Scrubbles@poptalk.scrubbles.tech
    15 days ago

    I know it’s a small set, but for gaming AMD is honestly king. Unless you want the absolute “I’m willing to pay double the cost for 5% more performance” top of the line, AMD is just great.

    For AI and compute… they’re far behind. CUDA just wins. I hope a joint standard comes along soon, but until then Nvidia wins.

    • aard@kyu.de
      15 days ago

      For AI and compute… they’re far behind. CUDA just wins. I hope a joint standard comes along soon, but until then Nvidia wins.

      I got a W6800 recently. I know an Nvidia card of the same generation would be faster for AI - but that thing is fast enough to run Stable Diffusion variants on high-resolution images locally without getting too annoyed.

      • crispyflagstones@sh.itjust.works
        15 days ago

        The completely different software stack is the killer. It’s not that you can’t find versions of a model to run, but almost everything that hits the GPU for compute is going to be targeting CUDA, not ROCm. From a compatibility standpoint alone, this killed AMD for me. I just do not want to spend my time fighting the stack to get these models running.

        • aard@kyu.de
          15 days ago

          Admittedly I’m just toying around for entertainment purposes - but I didn’t really have any problems getting anything I wanted to try running with ROCm support. The bigger annoyance was different projects targeting specific distributions or specific software versions (mostly ancient Python), but as I’m doing everything in containers anyway, that was also manageable.
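          For anyone curious, the container approach described above can look roughly like this — a hedged sketch, assuming a working Docker install and AMD’s `rocm/pytorch` image; the image tag and device paths are assumptions to check against the ROCm install docs for your card:

          ```shell
          # Sketch: run a ROCm-enabled PyTorch container so the project's
          # Python version doesn't have to match the host distro.
          # /dev/kfd and /dev/dri expose the AMD GPU to the container.
          docker run -it --rm \
            --device=/dev/kfd --device=/dev/dri \
            --group-add video \
            rocm/pytorch:latest \
            python3 -c "import torch; print(torch.cuda.is_available())"
          ```

          Under ROCm, PyTorch reuses the `torch.cuda` API via the HIP backend, so the check above reports whether the container actually sees the GPU.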

    • DarkThoughts@fedia.io
      14 days ago

      Or the rise of dedicated NPUs, but that will likely take even more time (speaking of regular consumers here).

    • MonkderDritte@feddit.de
      15 days ago

      I know it’s a small set, but for gaming AMD is honestly king.

      I feel like the use cases for GPUs in industry go beyond AI.

    • vividspecter@lemm.ee
      15 days ago

      That, and there just haven’t been many performance gains in recent years, so it makes sense not to upgrade for a while. And a lot of people upgraded all at once during the pandemic, so there are fewer people in the market for a new GPU.

      • priapus@sh.itjust.works
        15 days ago

        GPUs aren’t in a shortage like they were. The majority of used GPUs on the market are just regular people selling them. I wouldn’t personally call it scalping if it’s below MSRP.

  • Wahots@pawb.social
    15 days ago

    Hopefully they remain competitive; I wanna try them next time I need a GPU. Would love a Sapphire card.

  • frezik@midwest.social
    15 days ago

    If you have something from the Nvidia RTX 20-series generation or newer, I’m not sure how much advantage there is to upgrading at all.