Stop worrying so much about benchmarks when buying a new GPU

This story is part of Jacob Roach's ReSpec series, covering the world of PC gaming and hardware.

GPU prices are finally normal, and you might have found yourself in recent weeks browsing graphics card reviews to see which ones top the charts. After all, the best graphics cards live and die by their performance in gaming benchmarks, right?

But those benchmarks are far from a definitive answer, and in most cases, they skew the conversation away from the games you actually play and the experiences they offer.


I’m not saying we need to throw the baby out with the bathwater. GPU benchmarks offer a lot of value, and I don’t think anything needs to change about how we (or others) conduct GPU reviews. But now that it’s actually possible to upgrade your graphics card, it’s important to take all of the performance numbers in context.


Games, not benchmarks

The most popular Steam game of 2022 so far? Lost Ark, which only calls for a GTX 1050.

DT’s computing evergreen coordinator Jon Martindale made a joke about GPU prices the other day: “I need a new GPU so I can get 9,000 frames in Vampire Survivors.” Silly, but there’s a salient point there. When looking at performance, it’s worth recognizing that there are around four times as many people playing Terraria or Stardew Valley as there are playing Forza Horizon 5 or Cyberpunk 2077 at any given time.

The best games for benchmarking a PC are not the games most people actually play. Of the top 25 most popular Steam games, only two appear regularly in benchmarks: Grand Theft Auto V and Rainbow Six Siege. Virtually no “live” games make it into benchmark suites due to network variation, despite the fact that those games largely top the player-count charts, while recent, GPU-limited games are usually overrepresented.

The games that we and others have chosen as benchmarks aren’t the problem — they offer a way to push a GPU to its extreme in order to compare it to the competition and previous generations. The problem is that benchmark suites frame performance around the clearest margins. And those margins can imply performance that doesn’t hold up outside of a graphics card review.

Benchmarks are often misleading


Especially when it comes to the most recent graphics cards, benchmarks can be downright misleading. Every benchmark needs at least an average frame rate, which is a problematic number in and of itself: an average smooths over brief dips in frame rate. That’s why reviewers also report 1% lows and 0.1% lows, which average the slowest 1% and 0.1% of frames, respectively. But even those numbers don’t say much about how often frame rate dips occur — only how severe they are.

A frame time chart can show how often frame rate dips happen, but even that only represents the section of the game the benchmark focused on. I hope you see the trend here: The buck has to stop somewhere, even as more data points try to paint a picture of real-world performance. Benchmarks show relative performance, but they don’t say much about the experience of playing a game.
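To make the distinction concrete, here’s a minimal sketch of how those metrics are derived from per-frame render times. The function name and sample data are illustrative, not taken from any review toolkit, and it uses the article’s definition of “lows” (the average of the slowest slice of frames):

```python
# Minimal sketch: deriving average fps and 1% / 0.1% lows from a list
# of per-frame render times in milliseconds. Illustrative only.

def benchmark_metrics(frame_times_ms):
    """Return (average fps, 1% low fps, 0.1% low fps)."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)

    # Sort slowest frames first; the "lows" average the worst slices.
    slowest = sorted(frame_times_ms, reverse=True)
    worst_1pct = slowest[: max(1, n // 100)]
    worst_01pct = slowest[: max(1, n // 1000)]

    low_1 = 1000.0 * len(worst_1pct) / sum(worst_1pct)
    low_01 = 1000.0 * len(worst_01pct) / sum(worst_01pct)
    return avg_fps, low_1, low_01

# A mostly smooth run with a handful of stutters: the average barely
# moves, but the 1% low plunges. The metrics capture how severe the
# stutters are -- not how often they happen or where in the run.
smooth_with_stutters = [12.0] * 990 + [50.0] * 10
avg, low_1, low_01 = benchmark_metrics(smooth_with_stutters)
```

Ten 50ms stutters among a thousand 12ms frames barely dent the average (around 81 fps), yet the 1% low collapses to 20 fps — and a run with one hundred smaller stutters could produce the exact same numbers.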

The RTX 3090 Ti is 8.5% faster than the RTX 3090 in Red Dead Redemption 2, for example. That’s true, and it’s important to keep in mind. But the difference between the cards when playing is all of seven frames. I’d be hard-pressed to tell a difference in gameplay between 77 fps and 84 fps without a frame rate counter, so while the RTX 3090 Ti is technically faster, it doesn’t impact the experience of playing Red Dead Redemption 2 in any meaningful way.

Performance benchmarks for the RTX 3090 and RTX 3090 Ti in Red Dead Redemption 2.

The recent F1 22 is another example. The game shows huge disparities in performance between resolutions with all of the settings cranked up (as you’d usually find them in a GPU review). But bump down a few GPU-intensive graphics options, and the game is so CPU-limited that it offers almost identical performance at 1080p and 4K. No need for a GPU upgrade there.

No one is lying or intentionally misleading with benchmarks, but the strict GPU hierarchy they establish is an abstraction — a step removed from actually playing the games you bought the card for. Benchmarks are important for showing differences, but they don’t tell you whether those differences matter.

How to make an informed GPU upgrade


You should absolutely look at benchmarks before upgrading your GPU — as many as you can find. But don’t put your money down until you answer these questions:

  • What games do I want to play?
  • What resolution do I want to play at?
  • Are there other components that I need to upgrade?
  • What’s my budget?

Relative performance is extremely important for understanding what you’re getting for your money, but better on paper isn’t always better in practice in the world of PC components. Depending on the games you’re playing, the resolution you’re playing at, and potential bottlenecks in your system, you could buy a more expensive GPU and get the exact same performance as a cheaper one.

That doesn’t mean you shouldn’t splurge. There’s a lot to be said about buying something nice just because it’s nice, even if it doesn’t offer a huge advantage. If you have the means, there’s novelty in owning something super powerful like an RTX 3090 — even if you just use it to play Vampire Survivors. Just don’t expect to notice a difference when you’re actually playing.

This article is part of ReSpec – an ongoing biweekly column that includes discussions, advice, and in-depth reporting on the tech behind PC gaming.
