cross-posted from: https://lemmy.world/post/11840660

TAA is a crucial tool for developers - but is the impact on image quality too great?

For good or bad, temporal anti-aliasing - or TAA - has become a defining element of image quality in today’s games, but is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay, so what is it, why do so many games use it and what’s with all the blur? At one point, TAA did not exist at all, so what methods of anti-aliasing were used and why aren’t they used any more?

  • falsem · 9 months ago

    We used AA on our CRTs back in the day. Of course, we were all running at something like 1024x768, so it was a lot more necessary. The higher your resolution, the less you need it.

    • @RightHandOfIkaros@lemmy.world · 9 months ago

      Yes, that's true. AA was helpful at what I call "medium resolutions", the range between roughly 480 and 768 vertical pixels. But CRTs still had a softer image simply as a byproduct of how the technology worked, and they worked better at lower resolutions like 240p (AFAIK, any signal with fewer than 480 lines of vertical resolution was automatically progressive scan). Game developers of the time exploited this, famously using dithering to fake transparency on platforms that didn't fully support it, such as the SEGA Saturn (it supported transparent 2D sprites, but not transparent textured polygons like the PSX did). The soft image smoothed out the dithered patterns, giving the appearance of a larger color palette and extra special effects. Flickering sprites every other field was also a common technique, relying on the CRT's image persistence. This is why games like Streets of Rage look awful on modern displays but display correctly on CRTs.

      But regardless, AA will probably be phased out eventually; it's just a tool to mitigate the growing pains of new display technology.
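The checkerboard-dither trick mentioned in the comment above can be sketched in a few lines of Python. This is a hypothetical illustration, not code from any actual game: even `(x + y)` pixels take the sprite color, odd ones take the background, and the CRT's natural blur averages the two into what reads as a 50% blend. The function name `dither_blend` and the toy 4x4 frames are made up for the example.

```python
# Faking 50% transparency with a checkerboard dither, Saturn-style.
# On a sharp modern display the pattern is plainly visible; on a CRT
# the soft image blends adjacent pixels into an apparent alpha blend.

def dither_blend(background, sprite):
    """Overlay `sprite` on `background` using a checkerboard:
    pixels where (x + y) is even show the sprite, the rest show
    the background, approximating 50% alpha once the display
    blurs neighboring pixels together."""
    height = len(background)
    width = len(background[0])
    out = []
    for y in range(height):
        row = []
        for x in range(width):
            if (x + y) % 2 == 0:
                row.append(sprite[y][x])      # sprite pixel
            else:
                row.append(background[y][x])  # background shows through
        out.append(row)
    return out

# Toy example: a solid red "glass" sprite over a solid blue backdrop.
bg = [[(0, 0, 255)] * 4 for _ in range(4)]
fg = [[(255, 0, 0)] * 4 for _ in range(4)]
frame = dither_blend(bg, fg)
# Each 2x2 tile now holds two red and two blue pixels; averaged
# together (as a CRT effectively does), that approximates the
# purple a true 50% red-over-blue blend would produce.
```

The same idea is why capture devices and emulators often offer "blur" or composite-video filters: they reintroduce the averaging that the original artists were counting on.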