The surprising reason your powerful PC still can’t handle the latest games

We’re off to a rocky start with PC releases in 2023. Hogwarts Legacy, Resident Evil 4 Remake, Forspoken, and most recently and notably The Last of Us Part One have all launched in dire states, with crashes, hitching, and performance well below what their modest visual upgrades would justify. A big reason why is that the graphics cards of the last few years aren’t equipped to handle the demands of today’s games.

The GPUs themselves are powerful enough, and games haven’t suddenly gotten more demanding for no reason. The problem is video memory, or VRAM. Many of the most powerful GPUs from the previous generation weren’t set up to handle the VRAM demands of modern games, which may explain why your relatively powerful PC can’t handle the latest and most exciting new releases.

What does your VRAM do anyway?

Think of your graphics card like a self-contained computer. In your PC, your processor and RAM work together to do the brunt of the processing work. Your processor does the actual calculations, while your RAM keeps the data it needs close at hand. If your CPU had to go out to your hard drive every time it wanted to do a calculation, your computer would be too slow to be useful.

Your graphics card works the same way. The GPU handles the actual processing, while the VRAM holds the data necessary for that processing. This most notably comes up with texture resolution, as higher-resolution textures take up far more space than lower-resolution ones. But other data flows in and out of VRAM, too: shadow maps, geometry, and critically, shaders.
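
To put rough numbers on how quickly textures eat memory, here’s a back-of-the-envelope sketch in Python. It assumes uncompressed RGBA8 textures (4 bytes per pixel); real games use block compression, which shrinks everything considerably, but the scaling is the same: doubling the resolution quadruples the footprint.

```python
def texture_vram_mb(width: int, height: int, bytes_per_pixel: int = 4,
                    mipmaps: bool = True) -> float:
    """Approximate VRAM footprint of a single texture, in megabytes."""
    base = width * height * bytes_per_pixel
    # A full mipmap chain adds roughly one third on top of the base level.
    total = base * 4 / 3 if mipmaps else base
    return total / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_vram_mb(size, size):.0f} MB")
# 1024x1024: ~5 MB
# 2048x2048: ~21 MB
# 4096x4096: ~85 MB
```

A game streaming hundreds of 4K-class textures at once can plow through several gigabytes on texture data alone.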

Shaders, especially in titles with ray tracing, are complex and require a lot of space in VRAM. Combined with rising texture resolutions, the demands of modern AAA games often blow past the 8GB of VRAM that has been the standard for years, especially if you’re playing at higher resolutions. Unfortunately, this isn’t a problem that a lot of last-gen GPUs accounted for.
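
To see how the pieces add up, here’s a purely illustrative 4K memory budget. Every number below is an assumption for the sake of the example, not a measurement from any particular game, but it shows how easily the total crosses the 8GB line.

```python
# Hypothetical 4K VRAM budget, in megabytes. Illustrative assumptions only.
budget_mb = {
    "render targets (G-buffer, HDR, post-processing)": 400,
    "streamed texture pool": 4500,
    "geometry (vertex/index buffers)": 1200,
    "shadow maps": 600,
    "compiled shaders and pipeline state": 700,
    "ray tracing acceleration structures": 1000,
}
total = sum(budget_mb.values())
print(f"Total: ~{total / 1024:.1f} GB")  # ~8.2 GB -- already past an 8GB card
```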

The RTX 3070 Ti problem

When ExtremeTech published a round-up of RTX 3070 Ti reviews, it didn’t mince words. The card had a “long-term problem” with its low VRAM, and we’re starting to see that problem take shape.

Resident Evil 4 Remake can hog up to 8GB of VRAM on textures alone, though you have the option to go much lower. The Last of Us Part One can consume nearly 7GB at its lowest graphics preset and upwards of 14GB at its highest. And Hogwarts Legacy sucked up nearly 13GB of VRAM with ray tracing on, and close to 8GB with it off.

The effects of this are already clear. In preliminary testing of The Last of Us Part One, Hardware Unboxed found massive stuttering with 8GB of VRAM compared to 16GB, even between two graphics cards that should perform at around the same level. Keep in mind that the game’s recommended system requirements call for only 4GB of VRAM.

Even powerful graphics cards from the last couple of years are running out of VRAM. Stuttering is one symptom, but exhausting VRAM can also cause crashes and force you to turn down settings that your GPU is otherwise capable of handling.

I’m calling this the RTX 3070 Ti problem, but it’s not exclusive to the RTX 3070 Ti. The card just serves as a good touchstone for a wide swath of GPUs that are stuck at or under 8GB of VRAM despite sporting excellent GPU power otherwise. Even the 10GB RTX 3080 isn’t immune, especially at the highest graphics settings at 4K.

Focused in one direction

It’s upsetting that graphics cards that should be more than powerful enough to run modern games are simply running out of VRAM, causing stuttering and crashes that shouldn’t be happening. Most of this problem is focused in one direction, though: Nvidia.

Nvidia makes the best graphics cards you can buy today, but AMD and Intel have put more effort into boosting VRAM, even on lower-end models. For example, Intel’s Arc A770 includes 16GB of VRAM for under $350, while even Nvidia’s $900 RTX 4070 Ti includes only 12GB. Similarly, AMD opted for 12GB of memory on its midrange RX 6700 XT, while Nvidia stuck with 8GB on its competing cards. That can make a difference in games like Hogwarts Legacy, where Intel’s GPU performs much better than its price would suggest.

Some of that is being rectified with newer cards. Rumors suggest Nvidia’s RTX 4070 could carry 12GB of VRAM, but it still stings that high-end GPUs capable of running the most demanding games are coming up against issues simply due to VRAM limitations. Unfortunately, there’s not a lot you can do if you’re running out of video memory outside of upgrading your graphics card.

You can reduce some stuttering issues, though. If you’re limited by VRAM, turning down your texture resolution can help a lot. You can also reset your shader cache through AMD Software, or try increasing your shader cache size in the Nvidia Control Panel. The ultimate fix, however, is more VRAM on graphics cards, especially lower-end models, which is going to come as a major letdown for those who recently upgraded.
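
If you want to confirm that VRAM is actually your bottleneck, you can watch usage while a game runs. Here’s a minimal sketch for Nvidia cards, assuming the nvidia-smi utility that ships with the GeForce driver is on your PATH (AMD and Intel owners can check the performance overlays in their respective driver software instead):

```python
# Poll GPU memory usage every two seconds using nvidia-smi.
import subprocess
import time

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    print(result.stdout.strip())  # e.g. "7853 MiB, 8192 MiB"
    time.sleep(2)
```

If the used figure sits pinned at or near your card’s total whenever the stuttering hits, VRAM is the likely culprit.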

Jacob Roach
Lead Reporter, PC Hardware