My server currently has a 9th-gen Intel i7 CPU with integrated Intel graphics.

I don’t use or need AI or LLM stuff, but we use Jellyfin extensively in the family.

So far Jellyfin has always worked perfectly fine, but I could add (for free) an NVIDIA 2060 or a 1060. Would it be worth it?

And as for power consumption, will the increase be noticeable? Should I do it or pass?
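
For reference, here’s a rough sketch of how to check which hardware encoders the server’s ffmpeg actually exposes before deciding (just an illustration; the jellyfin-ffmpeg path is only an example, adjust for your install):

```python
# List which hardware encoders the local ffmpeg build exposes.
# Assumption: ffmpeg is on PATH. Jellyfin ships its own build (jellyfin-ffmpeg),
# so point FFMPEG at that binary if you want to match what Jellyfin will use.
import subprocess

FFMPEG = "ffmpeg"  # e.g. "/usr/lib/jellyfin-ffmpeg/ffmpeg" on Debian-based installs

out = subprocess.run(
    [FFMPEG, "-hide_banner", "-encoders"],
    capture_output=True, text=True, check=True,
).stdout

qsv = [line.strip() for line in out.splitlines() if "qsv" in line]      # Intel Quick Sync (iGPU)
nvenc = [line.strip() for line in out.splitlines() if "nvenc" in line]  # Nvidia NVENC (1060/2060)

print("QSV encoders:\n  " + "\n  ".join(qsv or ["none"]))
print("NVENC encoders:\n  " + "\n  ".join(nvenc or ["none"]))
```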

  • Eskuero@lemmy.fromshado.ws · 2 months ago

    For an old Nvidia it might be too much of an energy drain.

    I was also using the integrated Intel for video re-encodes, and I got an Arc A310 for 80 bucks, which is about the cheapest you’ll get a new card with AV1 support.
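
    As a sketch of what such a re-encode looks like on the Arc (file names and the quality value are made up, and -global_quality may need swapping for -b:v depending on your ffmpeg build):

    ```python
    # Re-encode a file to AV1 on an Intel Arc card via Quick Sync (av1_qsv).
    # Sketch only: paths and the quality value are placeholder assumptions.
    import subprocess

    src = "input.mkv"
    dst = "output-av1.mkv"

    subprocess.run([
        "ffmpeg",
        "-hwaccel", "qsv",          # hardware decode on the Arc where supported
        "-i", src,
        "-c:v", "av1_qsv",          # the Arc A310's AV1 hardware encoder
        "-global_quality", "25",    # ICQ-style quality target, lower = better
        "-c:a", "copy",             # keep the audio as-is
        dst,
    ], check=True)
    ```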

      • Eskuero@lemmy.fromshado.ws · 2 months ago

        Is x266 actually taking off? With all the companies that control graphics hardware (AMD, Intel, Nvidia) being members of AOMedia, it feels like MPEG will need to gain a big partner to stay relevant.

        • InverseParallax@lemmy.world · 2 months ago

          Google is pushing AV1 because of patents, but H.266 is just plain better tech, even if it’s harder to encode.

          The same shit happened with H.265 and VP9, and before that with Vorbis/Opus and AAC.

          They’ll come back, because it’s a standard and has higher quality.

          Maybe this is the one time AV1 somehow wins out on patents, but I’ve been encoding AV1 and I’m really not impressed. It’s basically just dressed-up HEVC, maybe a 10% improvement at most.

          I’ve seen VVC and it’s really flexible: it shifts gears on a dime between high motion and fine detail, which is basically what your brain notices most. AV1 actually seems kind of worse than HEVC at that to me; it’s sluggish at those shifts, even if it is better overall.
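
          Back-of-the-envelope on what that 10% actually buys you (the library size and the saving figures are made-up illustrations, not measurements):

          ```python
          # Rough arithmetic: storage left after re-encoding a library at a given bitrate saving.
          # The 10 TB library and the percentages are illustrative assumptions only.
          library_tb = 10.0
          for saving in (0.10, 0.30, 0.50):  # 10% (what I'm seeing from AV1) vs. larger savings
              print(f"{saving:.0%} saving: {library_tb * (1 - saving):.1f} TB after re-encode")
          ```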