
Frame Rate (FPS) vs Refresh Rate (Hz)


When comparing the performance of two gaming computers, we often look at the frame rates each computer can produce in a given game at the same resolution and graphics quality. Frame rates are measured in FPS, or frames per second. Most people know that higher FPS is better, but let’s clear up some common misconceptions about FPS and refresh rates.

First, what is a frame and what determines the frame rate? A frame is a single still image, which is then combined in a rapid slideshow with other still images, each one slightly different, to achieve the illusion of natural motion. The frame rate is how many of these images are displayed in one second. To produce, or render, a new frame, your CPU and GPU work together to determine the actions of the AI, the physics, and the positions and textures of the objects in the scene, and turn all of that into an image. Your GPU then divides this image into pixels at the resolution you set and sends that information to the display. The more powerful your CPU and GPU, the more frames they can generate per second.

Your monitor or display is where refresh rates come in. Refresh rate is a frequency measured in hertz (Hz): the number of times per second your monitor can redraw the screen. A refresh rate of 85Hz means that your display can redraw the entire screen 85 times in one second.
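If it helps to think in terms of time instead of rates, here is a tiny, purely illustrative Python sketch that converts both numbers into the interval each frame or refresh cycle occupies (the example rates are arbitrary):

```python
# Illustrative only: both FPS and Hz are "events per second", so the time
# each frame or refresh takes is simply the reciprocal of the rate.

def interval_ms(rate_per_second: float) -> float:
    """Time in milliseconds for one frame (at a given FPS) or one refresh (at a given Hz)."""
    return 1000.0 / rate_per_second

print(f"60 FPS -> {interval_ms(60):.2f} ms per frame")     # ~16.67 ms
print(f"85 Hz  -> {interval_ms(85):.2f} ms per refresh")   # ~11.76 ms
print(f"120 Hz -> {interval_ms(120):.2f} ms per refresh")  # ~8.33 ms
```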

Does that mean that your frame rate is limited by your screen’s refresh rate? No; they are two separate things. Remember that FPS is how many frames your gaming computer is producing or drawing, while the refresh rate is how many times the monitor is refreshing the image on the screen. The refresh rate (Hz) of your monitor does not affect the frame rate (FPS) your GPU will be outputting. However, if your FPS is higher than your refresh rate, your display will not be able to display all of the frames your computer is producing, so although the refresh rate doesn’t technically limit the frame rate, it does effectively set a cap.
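As a rough sketch of that effective cap, the snippet below (the function name and numbers are made up for the example) shows that the number of distinct images you actually see per second can never exceed the lower of the two figures:

```python
# A rough sketch of the "effective cap": the GPU may render more frames than
# the monitor can show, but only one image can appear per refresh cycle.

def frames_actually_shown(render_fps: float, refresh_hz: float) -> float:
    """The monitor can't show more distinct images per second than its refresh rate."""
    return min(render_fps, refresh_hz)

print(frames_actually_shown(120, 60))   # 60  -> the extra rendered frames are never seen
print(frames_actually_shown(45, 144))   # 45  -> here the GPU, not the monitor, is the limit
```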

It’s also important to remember that even if your gaming PC is capable of generating 90 FPS in your favorite game at your preferred settings, and even if your monitor supports 90Hz, 120Hz or more, you could still be capped by the lower refresh rate capabilities of the ports on your graphics card and display. Read our blog post on DisplayPort vs HDMI vs DVI vs VGA to learn about the pros, cons and limitations of the different types of connections. For example, some gaming monitors feature 120Hz refresh rates, but have HDMI 1.4 and DisplayPort 1.4. This means that you can only take advantage of the 120Hz refresh rate if you use DisplayPort; you’ll be stuck at 60Hz if you use HDMI.
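To get a feel for why the connection matters, here is a back-of-the-envelope sketch. The link figures are approximate nominal usable data rates, the calculation ignores blanking intervals and protocol overhead, and the display mode is just an example, so treat it as a rough estimate rather than a spec check:

```python
# Rough bandwidth check: does a given display mode fit over a given link?
# Ignores blanking intervals and overhead; link capacities are approximate.

def pixel_data_rate_gbps(width: int, height: int, refresh_hz: int, bits_per_pixel: int = 24) -> float:
    """Raw pixel data rate in gigabits per second for the active picture only."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

needed = pixel_data_rate_gbps(2560, 1440, 120)  # ~10.6 Gbit/s
print(f"2560x1440 @ 120Hz needs roughly {needed:.1f} Gbit/s")
print(f"Fits HDMI 1.4 (~8.2 Gbit/s usable)?    {needed <= 8.2}")    # False -> stuck at 60Hz
print(f"Fits DisplayPort 1.4 (~25.9 Gbit/s)?   {needed <= 25.9}")   # True  -> 120Hz is fine
```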

Frame rate is typically used as a gaming benchmark for measuring the performance of hardware, drivers, games and APIs like Vulkan and DirectX. In this case the monitor’s refresh rate doesn’t matter, because the frame rate is simply a number used to measure gaming performance, and a higher frame rate is better. However, when you’re actually playing a game, the display’s refresh rate does effectively limit the frame rate – if you have an 80Hz display and your computer is capable of outputting 120 FPS, your screen will still only be able to show 80 different images per second.
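As a loose illustration of frame rate as a pure benchmark number, the toy loop below simply counts how many "frames" of stand-in work the machine finishes in a fixed window. The render_frame() function is a hypothetical placeholder for real game and draw-call work, and no display or refresh rate is involved at all:

```python
import time

# Toy benchmark: count how many frames of placeholder work finish per second.
# render_frame() stands in for real rendering work; it is purely illustrative.

def render_frame():
    sum(i * i for i in range(10_000))  # placeholder CPU workload

def measure_fps(seconds: float = 2.0) -> float:
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < seconds:
        render_frame()
        frames += 1
    return frames / seconds

print(f"{measure_fps():.1f} FPS")  # higher is better; the monitor plays no part here
```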

If the frame rate your computer is producing is different (either higher or lower) than the refresh rate of your monitor, you may experience a glitch known as screen tearing, where information from two or more frames is shown in a single screen draw. It is important to note that screen tearing does no damage to a display or graphics card.

[Image: what screen tearing looks like – tearing can be caused by a frame rate out of sync with the refresh rate]

To prevent screen tearing, you can enable a feature called Vertical Synchronization, or VSync. This tells your GPU to synchronize its actions with the display, forcing it to render and send a new frame only when the monitor is ready to redraw the screen. This caps your frame rate at exactly the refresh rate. For example, if your refresh rate is 60Hz, VSync will cap your frame rate at 60 FPS. If your GPU is capable of producing higher frame rates than the VSync cap, you can take advantage of its leftover capacity to increase the resolution, draw distance, or other graphics quality settings. If your graphics card can’t outpace the refresh rate of your display, then enabling VSync won’t help much. You may be able to lock your GPU to a lower frame rate, like 30 FPS, that would match up with your monitor. Common display refresh rates include 120Hz, 60Hz and 30Hz, and these are all divisible by 30, so you won’t get screen tearing, but you may get stutter, as each frame will stay on the screen for more than one refresh cycle.
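Here is a simplified, hypothetical sketch of the idea behind VSync-style pacing – not how a real driver implements it. After each frame, the loop sleeps until the next refresh interval, so at most one new frame is presented per refresh:

```python
import time

# Conceptual sketch of VSync-style frame pacing: present at most one new
# frame per refresh interval. Not a real driver or graphics API.

REFRESH_HZ = 60
REFRESH_INTERVAL = 1.0 / REFRESH_HZ  # ~16.7 ms per refresh on a 60Hz display

def render_frame():
    pass  # hypothetical placeholder for the actual rendering work

def run_capped(frame_count: int = 120):
    next_deadline = time.perf_counter()
    for _ in range(frame_count):
        render_frame()
        next_deadline += REFRESH_INTERVAL
        # Sleep off whatever time is left in this refresh cycle, so the loop
        # never presents faster than the display refreshes (~60 FPS here).
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)

run_capped()
```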

If your monitor and graphics card both support NVIDIA G-SYNC, you’re in luck. With this technology, a special chip in the display communicates with the graphics card. This lets the monitor vary its refresh rate to match the frame rate of the NVIDIA GTX graphics card, up to the maximum refresh rate of the display. This means that frames are displayed as soon as they are rendered by the GPU, eliminating screen tearing and reducing stutter whether the frame rate is higher or lower than the display’s refresh rate. This makes it ideal for situations where the frame rate varies, which happens a lot in gaming. Today you can even find G-SYNC technology in gaming laptops!

AMD has a similar solution called FreeSync. However, it doesn’t require a proprietary chip in the monitor. Instead, FreeSync relies on DisplayPort’s Adaptive-Sync specification, which is a royalty-free industry standard. The difference between them is that with G-SYNC, the proprietary module in the monitor handles the communication between the devices, while with FreeSync, the AMD Radeon driver and the display’s firmware handle it. AMD has demonstrated that FreeSync can work over HDMI, but it requires custom drivers from AMD and the monitor’s manufacturer. Currently G-SYNC only works over DisplayPort, but that may change. Generally, FreeSync monitors are less expensive than their G-SYNC counterparts, but many gamers prefer G-SYNC because FreeSync may cause ghosting, where old images leave behind visible artifacts. However, this may change as both technologies are still relatively new.
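To see the adaptive-sync idea in the abstract, the sketch below (with a made-up refresh range) clamps the GPU’s frame time into the panel’s supported window; within that window, the display simply refreshes whenever a new frame arrives instead of on a fixed clock:

```python
# Conceptual sketch of variable refresh (G-SYNC / FreeSync): the panel refreshes
# when a new frame arrives, as long as the interval stays inside its supported
# range. The range and frame rates below are made-up example values.

MIN_HZ, MAX_HZ = 30, 144            # hypothetical panel's variable refresh range
MIN_INTERVAL = 1.0 / MAX_HZ         # fastest the panel can refresh
MAX_INTERVAL = 1.0 / MIN_HZ         # slowest before it must refresh anyway

def effective_refresh_interval(frame_time_s: float) -> float:
    """Clamp the GPU's frame time into the panel's supported refresh window."""
    return min(max(frame_time_s, MIN_INTERVAL), MAX_INTERVAL)

for fps in (50, 90, 200):
    interval = effective_refresh_interval(1.0 / fps)
    print(f"{fps} FPS -> panel refreshes every {interval * 1000:.2f} ms")
```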

