The first thing we need to establish is “What exactly is refresh rate?” Fortunately, it isn’t very complex. Refresh rate is simply the number of times a display refreshes the image it shows per second. You can understand this by comparing it to frame rate in films or games. If a film is shot at 24 frames per second (the cinema standard), then the source content only shows 24 different images per second. Similarly, a display with a refresh rate of 60Hz refreshes the image 60 times per second. These refreshes aren’t really frames, because the display will refresh 60 times each second even if not a single pixel changes; the display only shows the source fed to it. Still, the analogy is an easy way to understand the core concept behind refresh rate. A higher refresh rate means the display can handle a higher frame rate. Just remember that the display only shows the source fed to it, so a higher refresh rate won’t improve your experience if it is already higher than the frame rate of your source.
When you connect your monitor to a GPU (Graphics Processing Unit, or graphics card), the monitor displays whatever the GPU sends to it, at whatever frame rate the GPU delivers, up to the monitor’s maximum refresh rate. Higher frame rates allow motion to be rendered on screen more smoothly (Fig 1), with reduced motion blur. This is especially important for fast-paced video and gaming.
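To make that relationship concrete, here is a minimal sketch in Python (purely illustrative; the function name and sample values are our own, not from any display API): the rate of distinct images you can actually see is capped by whichever is lower, the source’s frame rate or the display’s refresh rate.

```python
def effective_rate(source_fps: float, refresh_hz: float) -> float:
    """The number of distinct images shown per second is limited by
    whichever is lower: the source frame rate or the refresh rate."""
    return min(source_fps, refresh_hz)

# A 24fps film on a 60Hz display: still only 24 distinct images per second.
print(effective_rate(24, 60))    # 24.0 -> the extra refreshes repeat frames
# A game rendering at 144fps on a 60Hz monitor: capped at 60.
print(effective_rate(144, 60))   # 60.0 -> frames beyond 60 are never shown
```

This is why upgrading to a high-refresh-rate monitor only pays off when your source (GPU, console, or video) can actually deliver frames faster than your current display can show them.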
Post time: Dec-16-2021