I use a 1080p monitor, and what I’ve noticed is that once creators start uploading 4K content, the 1080p version that I watch in fullscreen has more artifacting than when they only uploaded in 1080p.

Did you notice that as well?

Watching in 1440p on a 1080p monitor results in a much better image, at the cost of a theoretically less sharp picture and much higher CPU usage.

  • @kevincox@lemmy.ml · 11 points · 6 days ago

    I’m pretty sure that YouTube has been compressing videos harder in general. This loosely correlates with their release of the “1080p Enhanced Bitrate” option. But even 4k videos seem to have gotten worse to my eyes.

    Watching at a higher resolution is definitely a valid strategy. Optimal video compression is very complicated, and while compressing at the native resolution is more efficient, you can only go so far with fewer bits. Since the higher-resolution versions have higher bitrates, they fundamentally have more data available and will give a better overall picture. If you are worried about possible fuzziness, try 4K rather than 1440p: it is a clean doubling of 1080p, so you won’t lose any crisp edges.
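    To illustrate the “clean doubling” point with a toy sketch (this is just block averaging, not what YouTube or your video player actually runs): with an integer factor, every output pixel is the average of a whole block of source pixels, so no sample ever straddles a pixel boundary:

    ```python
    import numpy as np

    def integer_downscale(frame: np.ndarray, factor: int) -> np.ndarray:
        """Average non-overlapping factor x factor blocks (a simple box filter)."""
        h, w = frame.shape[:2]
        assert h % factor == 0 and w % factor == 0, "only clean integer ratios"
        return frame.reshape(h // factor, factor, w // factor, factor, -1).mean(axis=(1, 3))

    # 3840x2160 (4K) is exactly 2x 1920x1080, so the mapping is clean.
    frame_4k = np.zeros((2160, 3840, 3))
    print(integer_downscale(frame_4k, 2).shape)  # (1080, 1920, 3)
    ```

    1440p -> 1080p, by contrast, is a 1.33x ratio, so every output pixel has to blend unevenly weighted neighbours, which is where any slight softness would come from.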

  • @chunkystyles@sopuli.xyz · 11 points · 6 days ago

    YouTube compresses the shit out of 1080p content. Any video that has a lot of movement will look like trash at 1080p. Even if you’re on a lower resolution monitor, the higher bit rate of higher resolution videos will look better. It’s all very stupid on our end, but I assume it saves them a ton on bandwidth.

  • Ace! _SL/S · 8 points · edited · 6 days ago

    That’s because YouTube, for example, uses a bitrate of 4-7 Mbps for 1080p. 1440p gets around 13 Mbps and 4K something like 46 Mbps, IIRC.

    Other media providers are similarly bad with their bitrates.
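    Quick back-of-the-envelope math with those figures (30 fps assumed, and the bitrates above are rough averages): the higher tiers don’t just get more bits overall, they get more bits per pixel:

    ```python
    # Rough bits-per-pixel from the bitrates quoted above (illustrative only).
    def bits_per_pixel(mbps: float, width: int, height: int, fps: int = 30) -> float:
        return mbps * 1_000_000 / (width * height * fps)

    for name, mbps, (w, h) in [("1080p", 5, (1920, 1080)),
                               ("1440p", 13, (2560, 1440)),
                               ("4K", 46, (3840, 2160))]:
        print(f"{name}: {bits_per_pixel(mbps, w, h):.3f} bits/pixel")
    # 1080p: 0.080 bits/pixel
    # 1440p: 0.118 bits/pixel
    # 4K: 0.185 bits/pixel
    ```

    So even after your monitor throws the extra resolution away, the 1440p stream started from roughly 50% more data per pixel than the 1080p one.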

    • @Peter1986C · 2 points · 6 days ago

      For AV1 that could still be okay, lol. It would be kind of meh for e.g. H.264, but YT doesn’t even use that anymore AFAIK.

      • Ace! _SL/S · 3 points · 6 days ago

        YouTube uses VP9 for all resolutions most of the time. 1080p and below also offer AVC (H.264) as a fallback encoding.

  • @DdCno1@beehaw.org · 7 points · 6 days ago

    There’s something else that hasn’t been mentioned yet: video games in particular have become so detailed since the eighth console generation (XB1/PS4) that YouTube’s 1080p, with its significant compression artifacts, swallows too many fine moving details, like foliage, sharp textures, lots of moving elements (such as particles) and full-screen effects that modify nearly every pixel of every frame.

    And no, you will not get a less sharp image by downsampling 1440p or even 4K to 1080p; quite the contrary. I would recommend taking a few comparison screenshots and seeing for yourself. I have a 1440p monitor and prefer 4K content: it definitely looks sharper, even down to fine-grain detail. I did the same when I had a 1200p screen, preferring 1440p content (at least as soon as it was available; the early years were rough).

    If you are noticing high CPU usage at higher video resolutions, it’s possible that your GPU is too old to decode the latest codecs, or that your operating system (since you’re on Linux, based on your comment history) doesn’t have the right drivers to take advantage of the GPU’s decoding ability and/or is struggling with certain codecs. Under normal circumstances, there should be no significant increase in CPU usage at higher video resolutions.

    • @kevincox@lemmy.ml · 4 points · 6 days ago

      It may be worth right-clicking the video and choosing “Stats for Nerds”; this will show you the video codec being used. For me, 1080p is typically VP9, while 4K is usually AV1. Since AV1 is a newer codec, it is quite likely that you don’t have hardware decoding support for it.

  • @stealth_cookies@lemmy.ca · 5 points · 6 days ago

    The one I’ve noticed is that for videos with the 1080p “Enhanced Bitrate” option, the free 1080p video looks like a blurry mess compared to normal 1080p content.

    • @kevincox@lemmy.ml · 2 points · 6 days ago

      From my experience it doesn’t matter whether there is an “Enhanced Bitrate” option or not. My assumption is that around the time they added this option, they dropped the regular 1080p bitrate for all videos, but likely didn’t eagerly re-encode old ones. So old videos still look OK at “1080p”, while newer videos look like trash whether or not the “1080p Enhanced Bitrate” option is available.

  • @Maxy@lemmy.blahaj.zone · 5 points · 6 days ago

    About the “much higher CPU usage”: I’d recommend checking that hardware decoding is working correctly on your device, as that should ensure that even 4K content barely hits your CPU.

    About the “less sharp image”: this depends on your downscaler, but a proper downscaler shouldn’t make higher-resolution content any blurrier than the lower-resolution version. I do believe integer scaling (e.g. 4K -> 1080p) is a lot less dependent on having a proper downscaler, so consider bumping the resolution up even further if the video, your internet connection, and your client allow it.

    • @Peter1986C · 2 points · 6 days ago

      YouTube pushes the AV1 codec heavily these days, which is hard to decode with hardware acceleration, given that a lot of devices out there still don’t support it.

      • @kevincox@lemmy.ml · 3 points · 6 days ago

        > which is hard to decode using hardware acceleration

        This is a little misleading. There is nothing fundamental about AV1 that makes it hard to decode, support is just not widespread yet (mostly because it is a relatively new codec).

        • @Peter1986C · 1 point · 6 days ago

          I mean, given that many devices do not support accelerating it, it is in practice “hard to accelerate” unless you add a new graphics card or buy a new device.

          I may not have worded it optimally (2L speaker), but I am sure it was fairly clear what I meant. 🙂

          • @kevincox@lemmy.ml · 3 points · 6 days ago

            I wouldn’t call a nail hard to use because I don’t have a hammer. Yes, you need the right hardware, but there is no difference in the difficulty. But I understand what you are trying to say, just wanted to clarify that it wasn’t hard, just not widespread yet.

      • @Maxy@lemmy.blahaj.zone · 1 point · 6 days ago

        Good point, though I believe you have to explicitly enable AV1 in Firefox for it to advertise AV1 support. YouTube on Firefox should fall back to VP9 by default (which is supported by a lot more accelerators), so not being able to decode AV1 shouldn’t be a problem for most Firefox-users (and by extension most lemmy users, I assume).

        • @Peter1986C · 1 point · 4 days ago

          I am running mostly Firefox or Librewolf on Linux these days, but I do not remember having to enable it. Not all of my systems support accelerating AV1 in their hardware, but they do play at 1080p (but with framedrops once above 30fps on the unaccelerated computer). But yeah, I do hope YT keeps VP9 around because of the acceleration.

  • I haven’t noticed anything. Would you do me a disservice and explain what I’m missing in my blissful ignorance. Make me see something that can never be unseen.

    • @sexy_peach@feddit.org (OP) · 2 points · 5 days ago

      I sit quite close to a large 1080p monitor. That’s why I notice when the bitrate is low and the video lacks true 1920×1080 detail. Basically, it’s compressed so much that the image is noticeably worse than what my monitor could display. That’s why, when I use a higher-resolution stream like 1440p, the compression problems don’t show as badly on a screen that will only display 1080p anyway. That’s what I am talking about. On a phone or a laptop screen it will probably be less noticeable. I guess that’s why YouTube does it: it probably saves them a huge amount of bandwidth, and people who want really good quality might already have 4K displays, which then get a much higher-bitrate video feed anyway.

      I guess 1080p monitors are starting to become a niche. More and more viewers are on smartphones, so it really makes sense for YouTube to use a very low bitrate.

      • Turns out, I have an old dumb FullHD TV that should be ideal for this experiment. So, if I watch a YT video on 1080p, I should be able to see compression artefacts that are invisible when using a higher resolution. How is that supposed to work anyway, given that the browser knows the output resolution? Will it just download a higher resolution video, drop every other pixel, and display the rest?

        • @sexy_peach@feddit.org (OP) · 2 points · 4 days ago

          > Will it just download a higher resolution video, drop every other pixel, and display the rest?

          Yes, just like it can show a 1080p video not in fullscreen :)
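          To be precise, the scaler filters (blends neighbouring pixels) rather than literally dropping every other one; literal dropping would cause aliasing. A toy sketch of the difference (illustrative only, not any real player’s code):

          ```python
          import numpy as np

          # A 1-pixel checkerboard: the worst case for scaling down.
          checker = np.indices((8, 8)).sum(axis=0) % 2

          # "Drop every other pixel": only one phase of the pattern survives (aliasing).
          dropped = checker[::2, ::2]                               # all 0s
          # Average 2x2 blocks, as a real scaler approximates: blends to uniform grey.
          averaged = checker.reshape(4, 2, 4, 2).mean(axis=(1, 3))  # all 0.5
          ```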

    • @Peter1986C · 3 points · 6 days ago

      I can only imagine that they (OP) set quality settings on [auto]. That way they might have YT constantly lowering bitrates/resolution. I do not have any issues either, but I use fixed quality settings.

      • @DdCno1@beehaw.org · 3 points · 6 days ago

        No, that’s not what they are talking about. Even if you set the video to 1080p and make sure that YouTube isn’t lowering it to a lower resolution, it still won’t look very good.

        Whether you notice or not depends on how perceptive you are, the quality of your eyesight and also the size and quality of your display. It’s hard to notice on a low-grade laptop screen (or smaller), as well as a cheap TN panel monitor, but go beyond around 20" and use a decent enough IPS panel and those blocky compression artifacts are hard to miss.