
Nvidia says it’s better than AMD for low-lag gaming, and has the data to prove it

Nvidia RTX 2080 Super impressions
Riley Young/Digital Trends

Nvidia has been showing off some new technological innovations at this year's Gamescom show, most of which require one of its RTX graphics cards to be experienced at full potential. One that doesn't is the new ultra-low latency option included with its latest Game Ready Driver (436.02), which debuted during the show. Like AMD's anti-lag, it helps reduce the latency between a user's input and the corresponding action on screen. According to Nvidia's own benchmarks, its solution is better.

After two graphics card generations from both Nvidia and AMD that haven't pushed the performance envelope significantly, features are a greater selling point than they've been in the past. For Nvidia, the two biggest benefits of its RTX generation have been ray tracing, powered by its RT cores, and deep learning super sampling, powered by its Tensor cores. For AMD, it's been image sharpening and input lag reduction. While Nvidia has previously claimed to have offered its own anti-lag system for years, it has now released a new, improved version that's more in line with what AMD offers. Only better, according to its own benchmarks.


Although the time saved is measured in milliseconds, high-speed gamers should notice a slight improvement over having no anti-lag feature enabled. Nvidia claims, via TechRadar, that it can reduce input lag in Apex Legends from 30ms down to just 19ms. The Division 2's input lag could be reduced from 49ms down to just 22ms with ultra-low latency enabled.


AMD has made similarly impressive claims about its anti-lag feature's capabilities, suggesting that input lag could be cut in half. While that would likely compete favorably with Nvidia's low-latency mode, Nvidia's own testing puts its solution ahead. First-party benchmarks always need to be taken with a healthy dose of skepticism, but this is still good news for consumers. AMD's anti-lag feature was enough of a concern for Nvidia that it followed suit and created a feature for its own gamers that is as good, if not better in some cases. Either way, more gamers end up with lower input lag in their games, and that's a good thing.
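For a rough sense of how those claims stack up, the quick sketch below works out the percentage reduction implied by each figure quoted above. It only uses the numbers cited in this article and isn't tied to either vendor's actual implementation.

# Percentage reduction implied by the input-lag figures quoted above.
nvidia_claims = {
    "Apex Legends": (30, 19),    # before/after in milliseconds
    "The Division 2": (49, 22),
}

for game, (before_ms, after_ms) in nvidia_claims.items():
    cut = (before_ms - after_ms) / before_ms * 100
    print(f"{game}: {before_ms}ms -> {after_ms}ms, roughly a {cut:.0f}% reduction")

# AMD's "cut in half" claim corresponds to roughly a 50% reduction, which lands
# between the two Nvidia figures above (about 37% and 55%).

In other words, if the quoted numbers hold, Nvidia's claimed advantage is clearest in The Division 2, while the Apex Legends figure sits closer to AMD's own claim.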

AMD introduced image sharpening in a recent driver release, and now Nvidia has followed suit. The new driver also adds a Freestyle Sharpening filter that will purportedly work better than the previous "detail" filter, providing better-quality images without exacting as much of a performance hit.

Elsewhere in this driver release, Nvidia also introduced a more expansive list of G-Sync Compatible monitors and beta support for GPU integer scaling, which should make older games and pixel-art titles look far better on higher-resolution monitors, where they can otherwise look a little blurry.
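To illustrate the idea behind integer scaling (this is a minimal sketch, not Nvidia's actual driver code), nearest-neighbor scaling by a whole-number factor maps each source pixel to an exact block of output pixels, so pixel art keeps its hard edges instead of being blended into a blur by interpolation.

# Minimal nearest-neighbor integer scaling: every source pixel becomes an
# exact factor x factor block of output pixels, so edges stay crisp.
def integer_scale(image, factor):
    scaled = []
    for row in image:
        wide_row = [pixel for pixel in row for _ in range(factor)]
        scaled.extend(list(wide_row) for _ in range(factor))
    return scaled

# A tiny 2x2 "pixel art" sprite scaled 3x becomes a sharp 6x6 image.
sprite = [[0, 1],
          [1, 0]]
for row in integer_scale(sprite, 3):
    print(row)

A conventional bilinear upscale would instead average neighboring pixels, which is exactly the softness this feature is meant to avoid when a game's resolution doesn't match the monitor's.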

You can download Nvidia’s latest drivers from the official website.

Interested in reducing your input lag even further? A high refresh rate monitor can help.

Jon Martindale
Jon Martindale is a freelance evergreen writer and occasional section coordinator, covering how-to guides, best-of lists, and…
Nvidia CEO in 1997: ‘We need to kill Intel’
NVIDIA CEO Jensen Huang at GTC

The headline above includes strong words from the maker of the best graphics cards you can buy, and they carry extra significance considering where Nvidia sits today in relation to Intel. But in 1997, things were a bit different. The quote comes from the upcoming book The Nvidia Way, written by columnist Tae Kim, and was shared as part of an excerpt ahead of the book's release next month.

The words from Nvidia CEO Jensen Huang came as part of an all-hands meeting at the company in 1997 following the launch of the RIVA 128. This was prior to the release of the GeForce 256, when Nvidia finally coined the term "GPU," and it was a precarious time for the new company. Shortly following the release of the RIVA 128, Intel launched its own i740, which came with an 8MB frame buffer. The RIVA 128 came with only a 4MB frame buffer.

Read more
Nvidia’s next-gen GPU plans could be good news for Intel and AMD
Two RTX 4070 Ti Super graphics cards sitting next to each other.

According to a new leak from Benchlife, Nvidia may launch the vast majority of the RTX 50-series in the first quarter of 2025 -- but one GPU is notably missing from the early lineup. That could be very good news for AMD and Intel. While Nvidia will rule the high-end market, the other two brands may get to swoop in with some of the best graphics cards for gamers on a budget and get some breathing room before Nvidia strikes back.

Benchlife reveals that we'll see many of the RTX 50-series staples arrive in the first quarter of the year. The flagship RTX 5090 and the RTX 5080 arriving in January feel like a sure thing at this point, but many leakers also suggest that we'll see other GPUs make their debut during CES 2025.

Read more
Nvidia may keep producing one RTX 40 GPU, and it’s not the one we want
The Alienware m16 R2 on a white desk.

The last few weeks brought us a slew of rumors about Nvidia potentially sunsetting most of the RTX 40-series graphics cards. However, a new update reveals that one GPU might remain in production long after the rest of the lineup has been discontinued. Unfortunately, it's a GPU that would struggle to rank among Nvidia's best graphics cards. I'm talking about the RTX 4050 -- a card that only appears in laptops.

The scoop comes from a leaker on Weibo and was first spotted by Wccftech. The leaker states that the RTX 4050 is "the only 40-series laptop GPU that Nvidia will continue to supply" after the highly anticipated launch of the RTX 50-series. Unsurprisingly, the tipster also suggests that having both the RTX 4050 and the RTX 5050 readily available at the same time will impact the pricing of the next-gen card.

Read more