• bruhduh@lemmy.world · 2 months ago

    Search Nvidia P40 24GB on eBay, about $200 each and surprisingly good for self-hosted LLMs. If you plan to build an array of GPUs, look for the P100 16GB instead: same price, but unlike the P40 it supports NVLink, and its 16GB is HBM2 on a 4096-bit bus, so it's still competitive for LLM work. The P40's strong point is the amount of memory for the money, but its 24GB is GDDR5, so it's slower than the P100 and it doesn't support NVLink.
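
    A minimal sketch of how you might confirm what each card exposes before loading a model, assuming a CUDA build of PyTorch is installed (device order depends on your system):

        # Print name, VRAM, and compute capability of each visible GPU.
        import torch

        for i in range(torch.cuda.device_count()):
            props = torch.cuda.get_device_properties(i)
            print(f"GPU {i}: {props.name}")
            print(f"  VRAM: {props.total_memory / 1024**3:.1f} GiB")
            print(f"  Compute capability: {props.major}.{props.minor}")
            # A P40 should report roughly 24 GiB (CC 6.1), a P100 roughly 16 GiB HBM2 (CC 6.0).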

    • RegalPotoo@lemmy.world · 2 months ago

      Thanks for the tips! I'm looking for something multi-purpose: LLM/Stable Diffusion messing about plus a transcoder for Jellyfin. I'm guessing there isn't really a sweet spot for all three. I don't really have the room or power budget for two cards, so I guess a P40 is probably the best bet?

      • bruhduh@lemmy.world · 2 months ago

        Try a Ryzen 8700G's integrated GPU for transcoding, since it supports AV1, and those P-series GPUs for LLM/Stable Diffusion; that would be a good mix, I think. Or, if you don't have the budget for a new build, buy an Intel A380 GPU for transcoding; you can attach it like a mining GPU through a PCIe riser. Linus Tech Tips tested that GPU for transcoding, as I remember.
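
        Something like this is a rough way to check that the Intel GPU can actually do an AV1 hardware encode outside of Jellyfin, assuming an ffmpeg build with Quick Sync (QSV) support; the file names are just placeholders:

            # Hardware decode + AV1 hardware encode on an Intel iGPU/Arc card.
            import subprocess

            cmd = [
                "ffmpeg",
                "-hwaccel", "qsv",       # hardware-accelerated decode
                "-i", "input.mkv",       # placeholder source file
                "-c:v", "av1_qsv",       # AV1 encode on the Intel media engine
                "-b:v", "4M",
                "-c:a", "copy",
                "output.mkv",
            ]
            subprocess.run(cmd, check=True)

        Jellyfin has its own hardware-acceleration settings, so this is only to sanity-check the card and drivers before pointing Jellyfin at it.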

        • RegalPotoo@lemmy.world · 2 months ago

          8700g

          Hah, I’ve pretty recently picked up an Epyc 7452, so not really looking for a new platform right now.

          The Arc cards are interesting, will keep those in mind.

      • Justin@lemmy.jlh.name · 2 months ago

        The Intel A310 is the best $/performance transcoding card, but if the P40 supports NVENC, it might work for both transcoding and Stable Diffusion.
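
        A quick way to see whether the ffmpeg build and driver on the box actually expose NVENC encoders for whichever card ends up in it (just a sketch; Jellyfin's own hardware-acceleration settings still apply):

            # List the NVENC encoders the local ffmpeg build advertises.
            import subprocess

            out = subprocess.run(
                ["ffmpeg", "-hide_banner", "-encoders"],
                capture_output=True, text=True, check=True,
            ).stdout

            nvenc = [line.strip() for line in out.splitlines() if "nvenc" in line]
            print("\n".join(nvenc) if nvenc else "no NVENC encoders found")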

    • Scipitie@lemmy.dbzer0.com · 2 months ago

      Lowest price on eBay for me is 290 euros :/ The P100s are 200 each, though.

      Do you happen to know if I could mix a 3700 with a P100?

      And thanks for the tips!

      • utopiah@lemmy.world · 2 months ago

        Interesting. I did try a bit of remote rendering in Blender (just to learn how to use it via the CLI), so that makes me wonder who is indeed scraping the bottom of the barrel of "old" hardware and what they are using it for. Maybe somebody is renting old GPUs for render farms, maybe for other tasks; any pointer to such a trend?
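
        For reference, this is roughly the kind of headless job you would push to a rented box with an older GPU: a sketch wrapping Blender's CLI, assuming Blender is on PATH, the machine has a CUDA-capable card, and "scene.blend" is a placeholder file:

            # Render frame 1 of a .blend file without the UI, on the GPU via Cycles.
            import subprocess

            cmd = [
                "blender", "-b", "scene.blend",   # -b runs Blender headless
                "-E", "CYCLES",                   # render engine
                "-o", "//render_####",            # output pattern next to the .blend
                "-f", "1",                        # render frame 1
                "--", "--cycles-device", "CUDA",  # ask Cycles to use the GPU
            ]
            subprocess.run(cmd, check=True)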