Variable refresh rates and adaptive sync have been around for some time now. Both AMD and Nvidia have their own offerings that help reduce annoying visual artifacts like screen tearing.
Screen tearing occurs when your monitor’s refresh rate and your GPU’s frame rate output aren’t synchronized. The GPU ends up sending frames to your monitor faster than it can display them, resulting in parts of two different frames being shown on screen at the same time.
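To make the mismatch concrete, here is a toy model (an illustration, not real driver or panel behavior) of an unsynchronized 90 fps GPU feeding a 60 Hz monitor; the rates and the one-second window are assumptions for the example. Any frame that arrives while the monitor is mid-scan splits that refresh between two frames, which is exactly a tear:

```python
# Toy model (not real driver behavior): a monitor scanning out at a fixed
# 60 Hz while the GPU delivers frames at 90 fps, with no synchronization.
# A frame boundary landing inside a scanout window means that refresh
# shows parts of two frames -- a "tear".

REFRESH_HZ = 60          # assumed monitor refresh rate
FPS = 90                 # assumed unsynchronized GPU output

scan_period = 1.0 / REFRESH_HZ
frame_period = 1.0 / FPS

torn = 0
for n in range(REFRESH_HZ):                  # one second of refreshes
    scan_start = n * scan_period
    scan_end = scan_start + scan_period
    # Index of the first frame finishing after this scanout begins.
    k = int(scan_start // frame_period) + 1
    if k * frame_period < scan_end:          # a frame lands mid-scan
        torn += 1

print(f"{torn} of {REFRESH_HZ} refreshes show parts of two frames")
```

Because a new frame arrives every 11.1 ms while each scanout takes 16.7 ms, essentially every refresh in this scenario is torn.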
In today’s article, we’ll be focusing on Nvidia’s G-Sync, answering some of the big questions that surround it. What is G-Sync? How does it affect gaming performance? And what are the pros and cons that come with G-Sync? We’ll be answering all the above and more in our comprehensive guide to G-Sync technology, so let’s waste no further time and dive straight into it!
V-Sync was the first synchronization technology to hit the monitor marketplace, combating screen tearing by capping your GPU’s frame rate to match the monitor’s refresh rate. While this was great for gamers whose hardware could produce frame rates eclipsing the monitor’s refresh rate, it had the opposite effect when frame rates dropped well under that figure – resulting in stuttering and huge input lag times.
G-Sync was originally designed to be used alongside V-Sync, though Nvidia soon allowed people to turn that option off. The G-Sync module allows for a dynamic refresh rate that matches the output from your GPU, much like V-Sync but without the stuttering dips and high input lag times.
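The V-Sync stutter described above can be sketched with another toy model (again an illustration, not actual driver code, and the 60 Hz / 50 fps figures are assumptions): frames may only be shown on a vertical blank, so a frame that narrowly misses one vblank waits for the next, halving the effective frame rate.

```python
import math

# Toy V-Sync model: frames can only be presented at multiples of the
# refresh interval (vblanks). A GPU rendering at 50 fps on a 60 Hz panel
# misses every vblank by a little, so each frame waits for the next one
# and the effective displayed rate collapses to 30 fps -- stutter.

REFRESH_HZ = 60
RENDER_FPS = 50                      # assumed GPU render rate, below refresh

vblank = 1.0 / REFRESH_HZ
render = 1.0 / RENDER_FPS

t = 0.0
present_times = []
for _ in range(30):
    t += render                              # frame finishes rendering
    shown = math.ceil(t / vblank) * vblank   # wait for the next vblank
    present_times.append(shown)
    t = shown                                # GPU blocked until the flip

gaps = [b - a for a, b in zip(present_times, present_times[1:])]
print(f"effective fps under V-Sync: {1.0 / (sum(gaps) / len(gaps)):.0f}")
```

Every 20 ms frame slips past a 16.7 ms vblank, so frames land on every second vblank: 30 fps displayed from a GPU capable of 50.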
G-Sync updates the screen exactly when the GPU finishes a frame and is ready to output it, with the monitor’s maximum refresh rate acting as the ceiling for the G-Sync module – and within that range you shouldn’t get any lag or noticeable tearing. It achieves the same goal as V-Sync, but by synchronizing the monitor’s refresh rate to your GPU’s frame rate output rather than the other way around, resulting in a tear-free gaming experience that is much more immersive.
So, the bottom line is, without synchronization, if your GPU is creating frames at a lower rate than your monitor’s refresh rate you can experience stuttering, and if it’s running faster, the monitor can be handed the next frame mid-refresh, resulting in a tear.
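A variable refresh rate of this kind can be sketched in a few lines (a simplification, not Nvidia’s implementation; the 30–144 Hz panel range is an assumed example): the panel refreshes the moment a frame is ready, clamped to the range the hardware supports, so the refresh rate simply tracks the frame rate.

```python
# Toy variable-refresh model: refresh the panel as soon as the GPU
# finishes a frame, clamped to the panel's supported range. The refresh
# rate therefore follows the frame rate up to the panel's maximum.

VRR_MIN_HZ = 30          # assumed panel VRR range (example values)
VRR_MAX_HZ = 144

def refresh_interval(frame_time):
    """Refresh when the frame is ready, within the panel's limits."""
    shortest = 1.0 / VRR_MAX_HZ      # can't refresh faster than max Hz
    longest = 1.0 / VRR_MIN_HZ       # must refresh before the image fades
    return min(max(frame_time, shortest), longest)

for fps in (40, 100, 200):
    hz = 1.0 / refresh_interval(1.0 / fps)
    print(f"GPU at {fps} fps -> panel refreshes at {hz:.0f} Hz")
```

At 40 or 100 fps the panel simply matches the GPU; at 200 fps the refresh is clamped to the 144 Hz ceiling, which is why frame rates above the monitor’s maximum are where tearing or capping comes back into play.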
Like all variable refresh rate technology, G-Sync comes with its own set of unique pros and cons. Below are what we consider the most noteworthy of each:
Unlike V-Sync, a technology that caps the frame rate to match the monitor’s refresh rate, G-Sync allows the monitor to run at a variable refresh rate that matches the GPU’s output, ultimately eliminating the chance of tearing and lag by accounting for dips and peaks in performance.
Let’s put this into a real-life scenario. If you are playing a demanding game with G-Sync enabled and achieving 100 fps, your monitor’s refresh rate matches that frame rate in real time. If you then reach a part of the game that is even more demanding on your GPU and your frame rate drops, it shouldn’t be an issue, as the module matches the lower frame rate just the same.
Being a proprietary technology, the G-Sync module can be considered an expensive luxury, as it replaces the standard scaler in a monitor. Other sync technologies – such as Freesync – are hardware-software solutions too, and are normally a cheaper option because their scalers are manufactured by multiple different companies.
The addition of G-Sync can sometimes add hundreds of dollars to your bill but, as of last year, Nvidia began releasing drivers that enable its GPUs to work with certain adaptive sync and Freesync monitors. This makes adaptive sync a far more affordable option for Nvidia users – a brilliant move from Nvidia, even if it took them a little while to get here.
Another downside to G-Sync is that it will not work with AMD graphics cards, so if you have an AMD card or plan on going down that route, do not buy a G-Sync monitor. G-Sync only works with Nvidia cards; the monitor itself would still function on an AMD setup, but you would have paid a premium for features you can’t use.
AMD graphics cards use AMD’s own adaptive sync technology, Freesync. Freesync, unlike G-Sync, is royalty-free, so monitors don’t require a proprietary module, which creates a competitive market and reduces cost. Freesync is often used in more budget-oriented monitors, driving the price down and passing the savings on to the consumer (you).
It’s worth noting that while G-Sync locks the frame rate to the monitor’s upper limit, Freesync can let the GPU run beyond it and deliver higher frame rates. This reintroduces some tearing, since your frame rate and refresh rate no longer match, but it keeps input lag very low. Some Freesync monitors are now compatible with Nvidia cards, giving gamers yet another option.
A quick fix for that tearing could be to turn on V-Sync in your settings. V-Sync, otherwise known as vertical sync, caps your graphics card’s frame rate at the refresh rate of the monitor. This deals with tearing much as G-Sync and Freesync do but, unlike them, V-Sync doesn’t handle dips in performance, so you can still get a ‘stuttering’ effect.
What is G-Sync and is it worth it? Whether or not G-Sync is worth your money depends on your preferences and budget. If you have money to spare and want the most ‘future-proof’ setup you can get, then buying into G-Sync is a smart investment – with the technology becoming more widely available over time.
On the other hand, if you don’t want to put up with the input lag V-Sync can cause and would prefer a cheaper alternative to G-Sync, then Freesync would be your best bet. If going down the Freesync route, make sure to have an AMD graphics card, or a Freesync monitor that is compatible with your Nvidia card.
This sort of technology is great for those who hate tearing and want to run demanding games smoothly, but if you’re just into hardcore competitive FPS titles or low-spec games, then this increase in your costs is probably not needed.