What is G-Sync? It is a technology from Nvidia that synchronizes your monitor's refresh rate with your graphics card's frame output. And what do G-Sync Ultimate and G-Sync Compatible mean? You have certainly heard of Nvidia's G-Sync technology, but opinions on its benefits are often mixed.
Whether it's Nvidia's G-Sync or AMD's FreeSync, the goal of these technologies is to synchronize the refresh rate of a monitor or television with the frame rate of the graphics card.
This reduces or eliminates stuttering, judder, and tearing of the image. And since Nvidia's announcement at CES 2019, G-Sync can also be enabled on FreeSync displays.
This article should give you an idea of the monitors currently available on the market. It is also an opportunity to compare prices between AMD's technology and Nvidia's.
What is Nvidia G-Sync?
All standard monitors operate at a fixed refresh rate, meaning they refresh at the same frequency all the time. For example, many monitors run at 60 Hz, that is, 60 screen refreshes per second.
Graphics cards (GPUs), on the other hand, do not render at a fixed rate. 3D scenes vary in complexity, so the number of frames computed per second varies.
This is why the frame rate, measured in frames per second (FPS), fluctuates at the output of your graphics card.
A graphics processor can therefore produce frames faster or slower than the monitor can display them. This mismatch between the frames your GPU renders and the frames your monitor displays is what creates visible artifacts.
As shown in the image below, the misalignment of the buildings is "screen tearing." When your GPU renders frames at a rate lower than your monitor's refresh rate, the monitor can show a frame before it has been fully rendered.
When your GPU's frame rate exceeds the monitor's refresh rate, the monitor starts displaying the next frame before it has finished drawing the previous one.
In both cases you, the user, see what is known as screen tearing, because the image appears split.
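To make the timing mismatch concrete, here is a toy Python model (entirely our own sketch, not anything from Nvidia's drivers): the monitor scans the screen top-to-bottom once per refresh, and any buffer swap that lands mid-scan produces a tear at the corresponding height of the screen.

```python
def tear_positions(gpu_fps: float, refresh_hz: float, refreshes: int = 3):
    """Toy model of unsynchronized output: the GPU swaps buffers every
    1/gpu_fps seconds while the monitor scans out top-to-bottom every
    1/refresh_hz seconds. A swap landing mid-scan shows up as a tear at
    the fraction of the screen already drawn at that instant."""
    frame_time = 1.0 / gpu_fps
    refresh_time = 1.0 / refresh_hz
    tears = []
    t = frame_time
    while t < refreshes * refresh_time:
        fraction = (t % refresh_time) / refresh_time  # scan-out progress, 0..1
        if 0.01 < fraction < 0.99:  # swaps right at a refresh boundary don't tear
            tears.append(round(fraction, 2))
        t += frame_time
    return tears

# A GPU at 90 FPS against a 60 Hz scan-out: swaps keep landing mid-frame,
# so tears appear two-thirds and one-third of the way down the screen.
print(tear_positions(90, 60))  # → [0.67, 0.33, 0.67]
```

The exact tear heights depend on the ratio between the two rates, which is why an uncapped GPU produces tears that wander up and down the screen.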
VSync is a feature designed to fix this splitting. It forces the graphics card to output frames at the same rate as the monitor by limiting the number of frames held in the buffer.
This effectively eliminates tearing. However, it creates other well-known problems, such as stuttering and latency (input lag).
With VSync, the graphics card never exceeds the monitor's refresh rate, but graphics cards do not output frames at a fixed pace, and they can still fall below the monitor's fixed frequency.
Stuttering happens when your graphics card fails to keep up with the monitor's refresh rate: the monitor must then wait a full refresh cycle for the next frame, which produces a perceptible "stutter," a jerk, on the user's side.
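The stutter arithmetic is easy to sketch. With classic double-buffered VSync, a frame that misses a refresh deadline must wait for the next full refresh, so the displayed rate drops to a whole divisor of the refresh rate. The function below is a simplified model of that behavior (our own illustration, not actual driver code):

```python
import math

def vsync_display_rate(gpu_fps: float, refresh_hz: float = 60.0) -> float:
    """Effective display rate under double-buffered VSync: each frame is
    shown only on a refresh boundary, so a frame that takes longer than
    one refresh interval occupies ceil(refresh_hz / gpu_fps) refreshes."""
    refreshes_per_frame = math.ceil(refresh_hz / gpu_fps)
    return refresh_hz / refreshes_per_frame

print(vsync_display_rate(60))  # → 60.0: the GPU keeps up, no stutter
print(vsync_display_rate(59))  # → 30.0: barely missing a deadline halves the rate
print(vsync_display_rate(25))  # → 20.0: each frame now waits three refreshes
```

This is exactly the jerkiness described above: a GPU averaging 59 FPS does not display at 59 FPS under VSync, it drops all the way to 30.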
G-Sync solves screen tearing, stuttering, and latency (input lag). As you may have already guessed, rather than limiting the graphics card's frame rate so that it never exceeds the monitor's refresh rate, Nvidia's G-Sync technology lets the monitor operate at a variable frequency that follows the number of frames output by the graphics card.
So, with a G-Sync monitor, if your graphics card produces 75 frames per second (FPS), your monitor will run at a refresh rate of 75 Hz.
If you hit a more demanding scene in a game that lowers your graphics card's frame rate, G-Sync lowers your screen's refresh rate to match the GPU's new output frequency. This eliminates screen tearing and stuttering, and it gives a feeling of fluidity and sharpness in games as well as in graphic design and video editing work.
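In a simplified model, a G-Sync monitor's refresh rate simply tracks the GPU's frame rate within the panel's supported VRR window. A minimal sketch (the 30 to 144 Hz window here is a hypothetical example; real panels vary):

```python
def gsync_refresh(gpu_fps: float, vrr_min: float = 30.0, vrr_max: float = 144.0) -> float:
    """With G-Sync, the monitor refreshes when a frame arrives, so the
    effective refresh rate follows the GPU's frame rate, clamped to the
    panel's VRR window (a hypothetical 30-144 Hz range here)."""
    return max(vrr_min, min(gpu_fps, vrr_max))

print(gsync_refresh(75.0))   # → 75.0: the monitor runs at 75 Hz, matching the GPU
print(gsync_refresh(200.0))  # → 144.0: capped at the panel's maximum refresh rate
print(gsync_refresh(20.0))   # → 30.0: below the window, the panel stays at its floor
```

The key difference from VSync is visible in the first line: instead of forcing 75 FPS down to a divisor of 60, the monitor simply follows the GPU.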
Disadvantages of Nvidia G-Sync
Although Nvidia's G-Sync technology works well and, overall, serves its purpose, it is not without drawbacks. Users currently face two main issues with G-Sync: cost and compatibility.
Unlike other variable refresh rate technologies (such as AMD's FreeSync), Nvidia's G-Sync was initially a hardware-only solution.
This means that rather than using software to make the monitor run at a variable refresh rate, G-Sync monitors contain a dedicated Nvidia module that lets them vary their refresh rate to follow the GPU.
Having display manufacturers build G-Sync modules into their monitors increases their cost. Compare two monitors with identical features, one with G-Sync and the other with FreeSync: the G-Sync monitor costs a lot more.
Nvidia has since recognized this problem. To be more competitive and bring its technology to more displays, the company introduced the G-Sync Compatible label.
This lets you enable G-Sync on an Nvidia graphics card with a FreeSync monitor. There are a few limitations, however, which we will cover later in this article.
G-Sync Model Types
As announced by Nvidia at CES 2019, a new G-Sync Compatible tier was added, which makes three versions of Nvidia's technology:
- G-Sync Ultimate (formerly G-Sync HDR)
- G-Sync: This means the monitor contains the Nvidia chip
- G-Sync Compatible: the monitor does not contain the Nvidia chip, but G-Sync is enabled by default. This label means the monitor uses FreeSync or Adaptive-Sync technology and that the tests carried out by Nvidia were conclusive.
In fact, we can even add a fourth case, since G-Sync can be enabled manually on FreeSync monitors that have failed Nvidia's tests or have not yet been tested.
For a range of great budget options, check out our guide to the best vertical monitors for coding and gaming.
What is G-Sync Compatible?
This label indicates that a FreeSync or Adaptive-Sync monitor has successfully passed Nvidia's tests; G-Sync is then enabled automatically by the graphics card.
It means the monitor shows no blanking, flickering, ghosting, or other artifacts with VRR (Variable Refresh Rate) enabled, and that it supports a wide VRR range.
Specifically, the VRR range must span at least 2.4:1 (for example, 60 Hz to 144 Hz). Finally, the last criterion is that the monitor must offer the player a seamless experience by enabling VRR by default.
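The 2.4:1 range criterion is a simple ratio check, which we can sketch as follows (the function name is our own; the 2.4 figure is the criterion described above):

```python
def meets_gsync_compatible_range(min_hz: float, max_hz: float) -> bool:
    """Check whether a monitor's VRR range spans at least 2.4:1, one of
    the G-Sync Compatible criteria described above."""
    return max_hz / min_hz >= 2.4

print(meets_gsync_compatible_range(60, 144))  # → True: 144 / 60 = 2.4 exactly
print(meets_gsync_compatible_range(48, 75))   # → False: 75 / 48 is only about 1.56
```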
G-Sync with an AMD GPU?
If you already have an AMD graphics card or are planning to get one, you won't be able to use Nvidia's G-Sync technology: it only works with Nvidia graphics cards.
FreeSync, for its part, is a royalty-free solution based on the VESA Adaptive-Sync standard. FreeSync-compatible monitors are therefore much cheaper than monitors with Nvidia's module.
What is Nvidia G-Sync Ultimate?
It is the continuation of the G-Sync model, a move upmarket, formerly known as G-Sync HDR. On one hand, AMD released FreeSync 2, which greatly improves on the original FreeSync with several features, including HDR support.
The lack of rigor of the original FreeSync model is partly corrected in this new version: one of FreeSync's drawbacks is that you have to pay close attention to a monitor's supported FreeSync frequency range when choosing it.
On the other hand, Nvidia also released a new version of G-Sync, and, of course, it adds HDR support.
Is Nvidia G-Sync worth it?
The answer depends on the user. You obviously need a substantial budget to benefit fully from G-Sync, and you must also have an Nvidia GPU.
With the G-Sync Compatible tier, however, you can cut costs and buy a FreeSync monitor. Still, let's put this into perspective. If you're building a new computer, getting an Nvidia graphics processor won't be a problem.
If you already have a computer with an AMD graphics card, though, you have two options: either switch to an Nvidia graphics processor (which will increase your expenses even more), or choose a FreeSync monitor.
G-Sync monitors are ideal for people who already have high-end systems with Nvidia graphics cards, or for those with a large enough budget to build a new high-end gaming computer around an Nvidia GPU.
Unfortunately, many gamers will not be able to use G-Sync because of the high cost. However, if you have the budget for a monitor with Nvidia's module and the hardware to drive it, it's well worth it.
Compatible Monitors and Prices
You won't find a cheap G-Sync monitor; unfortunately, that's the price you pay for this technology. But as we said before, to keep costs down, you can choose a FreeSync monitor listed as G-Sync Compatible.
If you already have an Nvidia GPU or are a fan of Nvidia, this technology will bring you a lot in visual quality and fluidity. To give you an idea, here is a list of several G-Sync monitors with their prices.
If you want to change GPUs, our guide to choosing between an Nvidia and an AMD graphics card may be useful to you.