
What is G-Sync?

When shopping for a gaming monitor, you’ll undoubtedly come across a few displays advertising Nvidia’s G-Sync technology. In addition to a hefty price hike, these monitors usually come with gaming-focused features like a fast response time and high refresh rate. To help you know where your money is going, we put together a guide to answer the question: What is G-Sync?

In short, G-Sync is a hardware-based adaptive refresh technology that helps prevent screen tearing and stuttering. With a G-Sync monitor, you’ll notice smoother motion while gaming, even at high refresh rates.


What is G-Sync?


G-Sync is Nvidia’s hardware-based monitor syncing technology. It primarily solves screen tearing by synchronizing your monitor’s refresh rate with the frames your GPU pushes out each second.


Your GPU renders a number of frames each second, and put together, those frames give the impression of smooth motion. Similarly, your monitor refreshes a certain number of times each second, clearing the previous image for the new frames your GPU is rendering. To keep things moving smoothly, your GPU stores upcoming frames in a buffer (something you can tweak to increase FPS on your PC). The problem is that the buffer and your monitor’s refresh rate can fall out of sync, leaving a jarring seam on screen where two different frames are stitched together.
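To make that tearing scenario concrete, here’s a toy Python sketch. The timings are illustrative round numbers, not measured hardware values: a 60Hz panel scans out every ~16.7ms, and a buffer swap that lands mid-scanout produces a tear.

```python
# Toy model of screen tearing: a 60Hz monitor scans out one frame
# every ~16.7ms. If the GPU swaps its buffer while a scanout is in
# progress, the display shows parts of two frames at once (a tear).
# All timings here are illustrative, not measured hardware behavior.

REFRESH_MS = 1000 / 60  # one scanout every ~16.67ms

def causes_tear(swap_time_ms: float) -> bool:
    """A buffer swap tears unless it lands on a refresh boundary."""
    offset = swap_time_ms % REFRESH_MS
    return offset > 1e-9  # mid-scanout swap -> two stitched frames

print(causes_tear(2 * REFRESH_MS))  # swap on a refresh boundary
print(causes_tear(20.0))            # swap mid-scanout -> tear
```

Real displays scan out line by line, so a real tear appears at whatever row the swap interrupted; this sketch only captures the timing mismatch.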

V-Sync emerged as a solution. This software-based feature essentially forces your GPU to hold frames in its buffer until your monitor is ready to refresh. That solves the screen tearing problem, but it introduces another: input lag. Because V-Sync forces your GPU to hold frames it has already rendered, there’s a slight delay between what’s happening in the game and what you see on screen.
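The held-frame delay can be sketched in a few lines. This is a simplified model with assumed numbers, not a measurement of any real driver: with V-Sync on, a finished frame waits until the next refresh boundary before it appears.

```python
# Toy illustration of V-Sync input lag (illustrative numbers only):
# a finished frame is held in the buffer until the next refresh tick,
# so what you see lags slightly behind what the GPU rendered.
import math

REFRESH_MS = 1000 / 60  # 60Hz refresh boundary every ~16.67ms

def vsync_display_time(render_done_ms: float) -> float:
    """The frame waits until the next refresh tick to be shown."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

done = 20.0                        # frame finished rendering at 20ms
shown = vsync_display_time(done)   # shown at the ~33.3ms refresh tick
print(f"added lag: {shown - done:.1f}ms")
```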

Nvidia’s first alternative to V-Sync was Adaptive VSync. Like the older technology, Nvidia’s driver-based solution locked the frame rate to the display’s refresh rate to prevent screen tearing. However, when the GPU struggled, Adaptive VSync unlocked the frame rate until the GPU’s performance improved. Once stable, Adaptive VSync locked the frame rate until the GPU’s performance dropped again.
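Adaptive VSync’s decision rule, as described above, boils down to a single comparison. This is a deliberate simplification of Nvidia’s driver logic, not the real implementation:

```python
# Sketch of Adaptive VSync's rule as described above (a simplification
# of Nvidia's driver-based logic, not the actual implementation):
# lock to the refresh rate while the GPU keeps up, unlock when it can't.

def adaptive_vsync_enabled(fps: float, refresh_hz: float) -> bool:
    """V-Sync stays on only while the GPU can match the display."""
    return fps >= refresh_hz

print(adaptive_vsync_enabled(75, 60))  # GPU keeping up -> locked (True)
print(adaptive_vsync_enabled(45, 60))  # GPU struggling -> unlocked (False)
```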

Nvidia introduced a hardware-based solution in 2013 called G-Sync. It predates VESA’s similar Adaptive-Sync standard, which enables variable refresh rates on the display side. Instead of forcing your GPU to hold frames, G-Sync forces your monitor to adapt its refresh rate to match the frames your GPU is rendering. That deals with both input lag and screen tearing.

However, Nvidia uses a proprietary board that replaces the typical scaler board, which handles everything within the display, from decoding the image input to controlling the backlight. A G-Sync board contains 768MB of DDR3 memory to store the previous frame so that it can be compared to the next incoming frame, which helps decrease input lag.

On the PC end, Nvidia’s driver can fully control the display’s proprietary board. It manipulates the vertical blanking interval, or VBI, which represents the interval between the time when a monitor finishes drawing the current frame and the beginning of the next frame.

With G-Sync active, the monitor is driven entirely by your PC. As the GPU rotates the rendered frame into the primary buffer, the display clears the old image and gets ready to receive the next frame. As the frame rate speeds up and slows down, the display redraws each frame as instructed by your PC. Since the G-Sync board supports variable refresh rates, images are often redrawn at widely varying intervals.
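The variable-refresh behavior described above can be modeled as the display following the GPU’s frame time, clamped to the panel’s supported window. The 30Hz-144Hz range here is an assumed example, not a spec for any particular monitor:

```python
# Toy model of variable refresh: instead of a fixed tick, the display
# redraws whenever a frame arrives, clamped to the panel's supported
# range. The 30-144Hz window below is an assumed example range.

MIN_INTERVAL_MS = 1000 / 144  # fastest the panel can redraw
MAX_INTERVAL_MS = 1000 / 30   # slowest before it must refresh anyway

def refresh_interval(frame_time_ms: float) -> float:
    """The display tracks the GPU's frame time within its VRR window."""
    return min(max(frame_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for ft in (5.0, 12.5, 40.0):  # 200fps, 80fps, and 25fps frame times
    print(f"{ft}ms frame -> redraw every {refresh_interval(ft):.2f}ms")
```

Note how the middle case passes through unchanged: within the window, the monitor simply waits for the next frame, which is why motion stays smooth as the frame rate fluctuates.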

G-Sync system requirements


For years, there was one big caveat with G-Sync monitors: You needed an Nvidia graphics card. You still need an Nvidia GPU to take full advantage of G-Sync, like the recent RTX 3080, but more recent G-Sync displays support HDMI variable refresh rate under the “G-Sync Compatible” banner (more on that in the next section). That means you can use variable refresh rate with an AMD card, though not Nvidia’s full G-Sync module. Beyond a display carrying one of the G-Sync badges, here’s what you need:

Desktops

  • GPU – GeForce GTX 650 Ti BOOST or newer
  • Driver – R340.52 or higher

Laptops connected to G-Sync monitors

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M GPU or newer
  • Driver – R340.52 or higher

Laptops with built-in G-Sync displays

  • GPU – GeForce GTX 980M, GTX 970M, or GTX 965M or newer
  • Driver – R352.06 or higher
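The driver requirements above can be encoded as a small checker. The setup names and the version parsing here are my own illustrative choices, not anything Nvidia ships:

```python
# Hedged helper that encodes the requirement tables above as data.
# Driver strings are compared numerically (e.g. "R340.52" -> 340.52).
# This is a convenience sketch, not an official compatibility tool.

MIN_DRIVER = {
    "desktop": 340.52,
    "laptop_external": 340.52,  # laptop driving a G-Sync monitor
    "laptop_builtin": 352.06,   # laptop with a built-in G-Sync panel
}

def driver_ok(setup: str, driver: str) -> bool:
    """Check an 'R<major>.<minor>' driver string against the tables."""
    version = float(driver.lstrip("Rr"))
    return version >= MIN_DRIVER[setup]

print(driver_ok("desktop", "R340.52"))         # meets the minimum
print(driver_ok("laptop_builtin", "R340.52"))  # too old for built-in panels
```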

G-Sync vs. G-Sync Compatible vs. G-Sync Ultimate

Because G-Sync is a hardware solution, certified G-Sync monitors must include Nvidia’s proprietary board. Fortunately, most major monitor manufacturers like Asus, Philips, BenQ, AOC, Samsung, and LG offer G-Sync displays.

Nvidia currently lists three monitor classes: G-Sync Ultimate, G-Sync, and G-Sync Compatible. Here’s a breakdown of each:

G-Sync Compatible

  • 24 to 88 inches
  • Validated no artifacts

G-Sync

  • 24 to 38 inches
  • Validated no artifacts
  • Certified with over 300 tests

G-Sync Ultimate

  • 27 to 65 inches
  • Validated no artifacts
  • Certified with over 300 tests
  • Best quality HDR
  • 1000 nits brightness

For G-Sync Ultimate displays, you’ll need one of the best graphics cards to handle HDR visuals at 4K. They’re certainly not cheap, but they provide the best experience.

As for G-Sync Compatible, it’s a newer category. These displays do not include Nvidia’s proprietary G-Sync board, but they do support variable refresh rates. These panels typically fall under AMD’s FreeSync umbrella, which is a competing technology for Radeon-branded GPUs that doesn’t rely on a proprietary scaler board. Nvidia tests these displays to guarantee “no artifacts” when connected to GeForce-branded GPUs. Consider these displays as affordable alternatives to G-Sync and G-Sync Ultimate displays.

Overall, resolutions range from Full HD to 4K, while maximum refresh rates range from 60Hz to 240Hz. Nvidia provides a full list of compatible monitors on its website. Prices range from about $100 to well over $1,000, with Asus’ ROG Swift PG279Q 27-inch monitor selling for $698.

G-Sync TVs


Since G-Sync launched in 2013, it has always been specifically for monitors. However, Nvidia is expanding. Last year, Nvidia partnered with LG to certify recent LG OLED TVs as G-Sync Compatible. You’ll need some drivers and firmware to get started, which Nvidia outlines on its site. Here are the currently available TVs that support G-Sync:

  • LG BX 2020 (55-, 65-, and 77-inch)
  • LG CX 2020 (55-, 65-, and 77-inch)
  • LG GX 2020 (55-, 65-, and 77-inch)
  • LG B9 2019 (55- and 65-inch)
  • LG C9 2019 (55-, 65-, and 77-inch)
  • LG E9 2019 (55- and 65-inch)

FreeSync: the G-Sync alternative


As we pointed out earlier, AMD’s FreeSync derives from VESA’s Adaptive-Sync technology. One of the main differences is that it doesn’t use proprietary hardware. Rather, FreeSync-certified displays use off-the-shelf scaler boards, which lowers the cost. The only AMD hardware you need for FreeSync is a Radeon-branded GPU. AMD introduced FreeSync support in 2015.

FreeSync supports a wider range of monitors, and you don’t need extra hardware, making it a budget-friendly alternative to G-Sync hardware. Asus’ MG279Q, for example, is around $100 less than the aforementioned ROG Swift monitor.

No matter which you choose, both technologies smooth over the graphical glitches caused by monitor and GPU synchronization issues, and there are numerous graphics cards and monitors available to up your gaming experience.

Some downsides

One downside is the price. Whether you’re looking at a laptop or desktop, G-Sync requires both a capable monitor and graphics card. While there are many G-Sync compatible graphics cards, giving you plenty of budgetary options, G-Sync monitors are almost always more expensive than their AMD Freesync counterparts. Compatible laptops may be even more expensive.

In addition, users point to a lack of compatibility with Nvidia’s Optimus technology. Optimus, implemented in many laptops, adjusts graphics performance on the fly to provide the necessary power to graphics-intensive programs while optimizing battery life. Because the technology relies on an integrated graphics system, frames move to the screen at a set interval, not as they are created, as with G-Sync. Historically, you could buy an Optimus-capable device or a G-Sync-capable device, but not a laptop that does both.

Want to know more? It’s worth exploring the differences between FreeSync and G-Sync.

Kevin Parrish
Former Digital Trends Contributor
Kevin started taking PCs apart in the 90s when Quake was on the way and his PC lacked the required components. Since then…