Learn the Basics of Nvidia's 'G-Sync' and AMD's 'FreeSync' Monitor Technologies

Nvidia, the dominant purveyor of PC graphics cards and all things related to making games look good, recently announced plans to open up its proprietary variable refresh rate technology, “G-Sync,” for a handful of monitors that support the competing variable refresh rate technology, AMD’s FreeSync.

It’s OK if none of that makes any sense. The nuances of display technologies aren’t a thing that most gamers and geeks likely memorize to impress friends and loved ones. However, if you’re looking to take your computer gaming to the next level, you’d be unwise to ignore G-Sync and FreeSync entirely. The gains—if you can get them—can really enrich your gaming experience by giving you smoother gameplay and reduced input lag.

If you’re still scratching your head, don’t worry. To save you from going down a rabbit hole of research, here’s a quick look at the basics behind adaptive display technologies so you better understand G-Sync and FreeSync—and why you might want them when you’re shopping for your next big gaming monitor.

What can G-Sync and FreeSync do for my games?

G-Sync and FreeSync are their common names, but the technology underlying Nvidia and AMD’s implementations is typically known as “adaptive synchronization,” or “adaptive sync” for short. When enabled, G-Sync and FreeSync match your display’s refresh rate to your graphics card’s output, to ensure that every single frame the latter generates is presented on your monitor—no more, no less. This minimizes lag and prevents annoying image glitching that can occur when your graphics card sends more (or fewer) frames per second than your monitor’s native refresh rate.

Let’s unpack this a bit more. While running a video game, your PC calculates and redraws the state of everything on screen—the player, enemies, the environment, and every other potential moving part—dozens of times per second. Just like how cartoons are made of many similar drawings with small changes, your PC sends each of these “frames” to your monitor, which creates the movement and animation you perceive. The number of times per second a game sends that information over is called its “framerate.”

On the other hand, every monitor has a limit as to how many frames of animation it can show—its refresh rate, or the number of times it can update the picture each second. When your computer sends more frames of animation than your monitor can handle, this leads to “screen tearing,” where two frames of animation are showing at the same time. Conversely, a PC that doesn’t output enough frames of animation creates a delay between your inputs and what’s displayed on your screen—annoying lag.

This video does a good job of demonstrating screen tearing and how FreeSync helps prevent it.

Most games combat this issue by offering a feature known as vertical sync, or “V-Sync,” which is a software solution that prevents your PC from sending more frames than your monitor can handle. This solves the screen tearing issue in some cases, but it’s not an ideal solution.

Games generally don’t run at a consistent framerate: Depending on the game and your computer’s prowess, your gameplay can jump by tens of frames-per-second at any moment. And with V-Sync enabled, if your PC can’t output frames fast enough to match your monitor’s refresh rate, the frame rate typically drops all the way to an even fraction of it (from 60fps down to 30fps on a 60Hz monitor, for example), along with some added input lag. That’s just how the technology works.

Adaptive sync prevents these problems in all but the most extreme cases by letting your monitor’s refresh rate follow your PC’s output: the frame rate is capped so it never exceeds what your monitor can display, but when your frames-per-second falls below your monitor’s maximum refresh rate, the monitor simply refreshes in step with whatever frame rate your PC can manage.
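To make the difference concrete, here’s a toy calculation of how long a single frame stays on screen under plain V-Sync versus adaptive sync. This is not real driver code; the 60Hz monitor and the function names are assumptions made up purely for illustration:

```python
# A toy sketch (NOT real driver code) of how long one frame stays on screen
# with plain V-Sync versus adaptive sync. The 60Hz monitor and the function
# names here are illustrative assumptions.
import math

REFRESH_HZ = 60
REFRESH_INTERVAL_MS = 1000 / REFRESH_HZ  # ~16.7 ms between monitor refreshes

def vsync_display_interval(render_ms):
    # With V-Sync, a finished frame waits for the next fixed refresh tick,
    # so its display time rounds UP to whole refresh intervals. A frame
    # that just misses a tick ends up occupying two of them.
    ticks = max(1, math.ceil(render_ms / REFRESH_INTERVAL_MS))
    return ticks * REFRESH_INTERVAL_MS

def adaptive_display_interval(render_ms):
    # With adaptive sync, the monitor refreshes as soon as the frame is
    # ready, down to its minimum interval (i.e. its maximum refresh rate).
    return max(render_ms, REFRESH_INTERVAL_MS)

# A frame that takes 20 ms to render (a 50fps pace, just under 60Hz):
print(vsync_display_interval(20))     # two refresh intervals (~33.3 ms): a 30fps cadence
print(adaptive_display_interval(20))  # 20 ms: the full 50fps pace is preserved
```

The point of the sketch is the rounding: V-Sync can only show frames on a fixed tick, so narrowly missing one tick costs you a whole extra interval, while adaptive sync lets the display wait exactly as long as the frame actually took.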

What’s the difference between G-Sync and FreeSync?

The simplest answer is that G-Sync is a proprietary dynamic scaler exclusive to Nvidia and its monitor-making partners. FreeSync is an open, royalty-free standard created by AMD that any monitor manufacturer can support.

There is more to it than that, though. Until CES 2019, G-Sync was strictly a hardware standard: Monitors that support G-Sync have a processor chip in them that communicates directly with an Nvidia graphics card to adjust the monitor’s refresh rate on the fly. Unless you have an Nvidia graphics card, no G-Sync for you.

FreeSync monitors—which don’t require any special monitor scalers to work—should also theoretically work with any graphics card, but until now, you’ve only been able to take advantage of FreeSync when using a compatible AMD graphics card.

As we mentioned earlier, Nvidia announced this week that a software-only version of G-Sync, coming January 15th, will enable the adaptive synchronization technology for a small number of pre-approved “G-Sync Compatible” monitors that do not have built-in Nvidia scalers. Those owning other FreeSync monitors (and Nvidia graphics cards) should eventually be able to turn on adaptive synchronization as well, but details on that are still unclear. Additionally, we can’t say at this point how the new software-based G-Sync will stack up against FreeSync or the original, chip-based version.

How do I know if a monitor supports G-Sync or FreeSync?

If you’re buying a new monitor, most companies make it very obvious that the display supports G-Sync, FreeSync, or (now) both. It will be a bullet point in the item description at whatever online store you shop at, and there should be a logo on the box. (Most of the time it’s in the damn name.)

If you want to check on a monitor you already own or simply want to be completely sure, here are links to every G-Sync monitor and FreeSync monitor. Nvidia also made a little crib sheet with the monitors that will become G-Sync Compatible next week.

How do I use them?

G-Sync and FreeSync are switched on by default if you’re using a compatible GPU and monitor. You can check to make sure the feature is enabled by going to the Nvidia Control Panel app for G-Sync or the AMD Radeon Settings app for FreeSync.

While most people will be fine to leave G-Sync or FreeSync on and forget about it, some gamers might find that certain games run better without G-Sync or FreeSync enabled—those looking for the least input lag possible in competitive first-person shooters, for example. For a longer, more detailed rundown on how to optimize G-Sync, including how to turn it off for individual games, check out this guide. AMD has a similar guide for how to do this on a FreeSync monitor.

Which adaptive synchronization technology is best?

Unless you’re buying a new graphics card and monitor at the same time, choosing between G-Sync and FreeSync probably comes down to picking the best gear to match what you already have. There’s no point in buying a G-Sync monitor if you have an AMD graphics card, for example, nor would I recommend you go out and buy a FreeSync monitor to pair with your Nvidia GPU—at least, not until we see how well the Nvidia-blessed FreeSync displays perform with G-Sync.

There are plenty of other considerations you’ll want to think about when buying a new monitor: its panel type (TN? IPS?); its maximum refresh rate, as well as the refresh rates where G-Sync and FreeSync are supported; its resolution, and whether your graphics card can output high quality games at whatever that is; and how adjustable the display is, to name a few.

Whether a display supports G-Sync or FreeSync is one of the very first questions you should ask yourself when shopping for a new gaming monitor. And even then, you might want to hold off on buying anything new for a while. Newer “FreeSync 2” and “G-Sync Ultimate” displays are still in their infancy, but they have the potential to look even better than today’s best monitors—for a hefty price now, though one that should come down over time.
