Is HEVC (8-bit)/AAC a good, modern CODEC combination for rebuilding & reducing my library size without compromising quality? Helpful feedback would be appreciated.

  • Nine@lemmy.world · 1 year ago

    Yes, it’s good, but with AV1 hanging about you’re WAAAY better off using that over x265.

    I re-encode all my stuff with AV1. It will take a 40GB x264 rip down to 3-4GB, whereas with x265 it will be around 10-15GB.

    It’s a significant difference in storage size and (as far as I can tell) no obvious difference in quality.
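    For reference, here’s a minimal sketch of the kind of ffmpeg invocation that does an x264 → AV1 re-encode with SVT-AV1. The filenames, CRF, and preset values are assumptions for illustration, not anyone’s actual settings; building the argument list in Python just makes it easy to tweak:

    ```python
    # Sketch: build an ffmpeg command for re-encoding an H.264 rip to AV1
    # with the SVT-AV1 encoder. Filenames, CRF, and preset are hypothetical.

    def av1_reencode_cmd(src, dst, crf=30, preset=6):
        """Return an ffmpeg argument list for an SVT-AV1 re-encode."""
        return [
            "ffmpeg", "-i", src,
            "-c:v", "libsvtav1",     # SVT-AV1 video encoder
            "-crf", str(crf),        # quality target (lower = better/larger)
            "-preset", str(preset),  # speed vs. efficiency trade-off (0-13)
            "-c:a", "copy",          # keep the existing audio track as-is
            dst,
        ]

    cmd = av1_reencode_cmd("rip-x264.mkv", "rip-av1.mkv")
    print(" ".join(cmd))
    ```

    Lower CRF or a slower preset buys quality at the cost of size and encode time, which is where the 3-4GB vs. 16-18GB results below come from.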

    • LienNoir@lemmy.world · 1 year ago

      If you’re getting 3 to 4GB out of a 40GB x264 rip, you’re definitely losing a lot of data… With proper settings, it should be around 16-18GB. AV1 can’t do miracles.
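      To put rough numbers on that, here’s a quick sanity check of what those file sizes imply in average video bitrate, assuming a hypothetical 2-hour movie (the runtime is an assumption):

      ```python
      # Rough bitrate math for a hypothetical 2-hour movie.
      RUNTIME_S = 2 * 60 * 60  # 7200 seconds (assumed runtime)

      def avg_mbps(size_gb, seconds=RUNTIME_S):
          """Average bitrate in Mbit/s for a file of size_gb gigabytes."""
          return size_gb * 8 * 1000 / seconds  # GB -> Gbit -> Mbit, per second

      for label, gb in [("40 GB x264 source", 40),
                        ("3.5 GB AV1", 3.5),
                        ("17 GB AV1", 17)]:
          print(f"{label}: ~{avg_mbps(gb):.1f} Mbit/s")
      ```

      A ~4 Mbit/s average is streaming-service territory, while ~19 Mbit/s leaves a lot more headroom for grain and dark scenes, which matches both sides of this exchange.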

      • Nine@lemmy.world · 1 year ago

        For sure!

        What I’m talking about is perception of quality. If you know what you’re looking for then you’ll notice some of the artifacts. Especially in the darkest areas and when going from HDR to SDR.

        It still looks better than streaming the same thing off of Netflix, Hulu, etc. So that’s all I need/want.

        I fully realize there’s a compromise there, and if I want to view it in all its original glory I can bust out the Blu-ray.

    • mordack550@lemmy.world · 1 year ago

      Sadly some clients (Nvidia Shield TV) do not support AV1 :( right now I’m encoding some AV1 content I have back to HEVC just because of that.

      • noim@feddit.de · 1 year ago

        And most older GPU models also don’t support AV1, so transcoding for those clients happens on the CPU. This is why I will continue to use x265 for now.

        • Nine@lemmy.world · 1 year ago

          Yes, you’re completely correct. There’s something to consider though.

          CPU encoding gives the best results possible in terms of quality and size. And decoding, unless you have a very weak CPU, isn’t necessarily the bottleneck in most transcoding applications, e.g. Plex, Jellyfin, etc.

          So you can do things to make the media as streamable as possible, for instance encoding your media in AV1 using the mp4 container rather than mkv. If you make it web optimized, i.e. put the moov atom up front, it makes playing the file much easier and less resource intensive. Now when a client that can’t use AV1 requests it, your transcode can do SW decode and HW encode. Not as efficient as pure HW, but IMHO it’s a worthwhile trade-off for the storage space you get in return.

          You can make things more efficient by disabling subtitles and/or burn-in on the media server side. If you have people like myself who need subs on everything, then you can burn them in while you’re encoding the media to AV1, or stick to plain text formats (UTF-8) so you can pass them through, since m4v/mp4 doesn’t support subs the way mkv does.
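          For the burn-in case, here’s a sketch of the corresponding ffmpeg invocation using the `subtitles` video filter; the filenames and quality settings are assumptions for illustration:

          ```python
          # Sketch: burn a subtitle file into the video while encoding to AV1.
          # Filenames, CRF, and subtitle file are hypothetical examples.

          def burnin_cmd(src, subs, dst, crf=30):
              return [
                  "ffmpeg", "-i", src,
                  "-vf", f"subtitles={subs}",  # rasterize subs into the picture
                  "-c:v", "libsvtav1",         # SVT-AV1 video encoder
                  "-crf", str(crf),
                  "-c:a", "copy",              # keep the audio track as-is
                  dst,
              ]

          print(" ".join(burnin_cmd("movie.mkv", "movie.en.srt", "movie-subbed.mp4")))
          ```

          The trade-off is that burned-in subs can’t be toggled off later, but the server never has to transcode just to render them.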

          That’s essentially what the optimized versions do on Plex. Only it sticks with x264 rather than AV1.

          If your media is only 720p then none of this would really make a difference for you. If you’re using 1080p+ rips then this will make a SIGNIFICANT difference. It’s made such a difference that I’ve started redoing my rips in 4K.

          Unless, that is, you’ve got a SAN in your closet and free electricity…

        • vildis@lemmy.dbzer0.com · 1 year ago

          Only NVIDIA 3000-series cards and up support hardware AV1 decoding, and only 4000-series cards support both hardware AV1 encoding and decoding, so only cards from roughly the last 3 years! source

          All Intel Arc cards (released October 2022) support both decode and encode. source

          AMD 6000-series cards (November 2020) support AV1 hardware decoding, and only 7000-series cards (December 2022) support both hardware AV1 encoding and decoding. source