Had this reflection that 144 Hz screens were the only type of screen I knew of that was not a multiple of 60: 60 Hz - 120 Hz - 240 Hz - 360 Hz.

And in the middle, 144 Hz. Is there a reason why all the others follow this rule of 60, and if so, why is 144 Hz here?

  • JustEnoughDucks@feddit.nl

    72 Hz was used as a refresh rate for CRT monitors back in the day, specifically because it was about the average threshold at which users stopped reporting discomfort from CRT flicker. And 144 Hz is simply 72 × 2.

    It is likely a holdover from that era. From there, I think it also matters that it is a multiple of 24 Hz, so movie content scaled smoothly without tearing before vsync? That last part is a guess (see the quick arithmetic below).
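
    A quick sanity check of that arithmetic, as a minimal Python sketch (the rates are just the ones mentioned in this thread):

        # Check which common refresh rates divide evenly by the 24 fps film rate
        # and by the old 72 Hz CRT "flicker-free" rate.
        for hz in (60, 72, 120, 144, 240):
            print(f"{hz:>3} Hz: {hz / 24:.2f}x of 24 fps, {hz / 72:.2f}x of 72 Hz")

        #  60 Hz: 2.50x of 24 fps, 0.83x of 72 Hz
        #  72 Hz: 3.00x of 24 fps, 1.00x of 72 Hz
        # 120 Hz: 5.00x of 24 fps, 1.67x of 72 Hz
        # 144 Hz: 6.00x of 24 fps, 2.00x of 72 Hz
        # 240 Hz: 10.00x of 24 fps, 3.33x of 72 Hz

    Only 72 Hz and 144 Hz land on whole multiples of both numbers, which is consistent with the 72 × 2 guess above.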

    • ZephrC@lemm.ee

      Old reel projectors actually flashed their light at 72 Hz. They had to turn off the light while advancing the film to the next frame so you couldn’t see the picture moving up off the screen. Human eyes are better at spotting quickly flashing lights than they are at spotting microstuttery motion, so flashing the bulb once per frame at 24 Hz in a dark room was headache-inducing. The solution they came up with was just to flash the bulb 3 times per frame, which is 72 Hz.

    • astraeus@programming.dev

      144 Hz is not a holdover in the case of computer monitors. It’s roughly the maximum refresh rate you can push through dual-link DVI-D at 1080p, and that was the only common standard that could support the refresh rate when manufacturers began producing LCD monitors built to run at 144 Hz.
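
      A rough back-of-the-envelope check, as a Python sketch. The ~330 MHz figure is dual-link DVI’s 2 × 165 MHz TMDS pixel-clock ceiling; the blanking totals are approximate reduced-blanking numbers assumed for illustration, not any particular monitor’s timings:

          # Dual-link DVI-D: two TMDS links at up to 165 MHz each, ~330 MHz pixel clock total.
          DUAL_LINK_DVI_MHZ = 2 * 165

          # Approximate 1080p reduced-blanking totals (1920x1080 active plus blanking).
          # These exact totals are an assumption for illustration.
          H_TOTAL = 2000   # ~1920 active + ~80 blanking
          V_TOTAL = 1111   # ~1080 active + ~31 blanking

          for hz in (60, 120, 144, 165):
              clock_mhz = H_TOTAL * V_TOTAL * hz / 1e6
              verdict = "fits" if clock_mhz <= DUAL_LINK_DVI_MHZ else "exceeds"
              print(f"1080p @ {hz} Hz needs ~{clock_mhz:.0f} MHz pixel clock -> {verdict} {DUAL_LINK_DVI_MHZ} MHz")

          # 1080p @ 60 Hz needs ~133 MHz pixel clock -> fits 330 MHz
          # 1080p @ 120 Hz needs ~267 MHz pixel clock -> fits 330 MHz
          # 1080p @ 144 Hz needs ~320 MHz pixel clock -> fits 330 MHz
          # 1080p @ 165 Hz needs ~367 MHz pixel clock -> exceeds 330 MHz

      With those rough timings, 1080p at 144 Hz lands just under the dual-link ceiling and the next common step up does not, which matches the point above.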