
How much does the RTX 4090 cost? RTX 40-series buying guide

Nvidia has finally announced the RTX 40 series, and three new RTX 40 cards will be available later this year: the flagship RTX 4090, the high-end RTX 4080 16GB, and the RTX 4080 12GB. Along with apparently massive performance improvements over last-generation RTX 30 series cards, these new GPUs come with high price tags.

How much does the RTX 4090 cost?


The flagship RTX 4090 is launching with an MSRP of $1,599, which is $100 higher than the $1,499 MSRP of the RTX 3090 and $400 lower than the $1,999 MSRP of the RTX 3090 Ti. $100 extra for Nvidia’s new flagship isn’t that much when the RTX 3090 was already so expensive, so not much has changed here.

| | RTX 4090 | RTX 3090 |
|---|---|---|
| Process | TSMC 5nm | Samsung 8nm |
| Architecture | Ada Lovelace | Ampere |
| CUDA cores | 16,384 | 10,496 |
| Memory | 24GB GDDR6X | 24GB GDDR6X |
| Boost clock speed | 2,520MHz | 1,695MHz |
| Bus width | 384-bit | 384-bit |
| Power | 450W | 350W |
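As a quick sanity check on the generational gap, here is a small Python sketch that computes the relative differences from the table figures and MSRPs quoted in this article (the percentages are derived values, not Nvidia's own claims):

```python
# Spec figures from the comparison table above; MSRPs from the article.
specs = {
    "RTX 4090": {"cuda_cores": 16_384, "boost_mhz": 2520, "msrp": 1599},
    "RTX 3090": {"cuda_cores": 10_496, "boost_mhz": 1695, "msrp": 1499},
}

new, old = specs["RTX 4090"], specs["RTX 3090"]
core_gain = new["cuda_cores"] / old["cuda_cores"] - 1   # ~0.56, i.e. ~56% more cores
clock_gain = new["boost_mhz"] / old["boost_mhz"] - 1    # ~0.49, i.e. ~49% higher boost clock
price_delta = new["msrp"] - old["msrp"]                 # $100

print(f"CUDA cores: +{core_gain:.0%}, boost clock: +{clock_gain:.0%}, MSRP: +${price_delta}")
```

Roughly 56% more cores and a 49% higher boost clock for a 7% higher MSRP is why the pricing reads as conservative here.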

It’s actually surprising that the RTX 4090 doesn’t cost more, because it has far more CUDA cores than either the RTX 3090 or the RTX 3090 Ti. Memory capacity and bandwidth are more or less unchanged, but that shouldn’t be a cause for concern; Nvidia presumably knows how much VRAM its GPUs need.

How much does the RTX 4080 cost?


Things are a bit more complicated with the RTX 4080, which has two different models: the 4080 16GB at $1,199 and the 4080 12GB at $899. That’s much more expensive than the RTX 3080 10GB, which launched at $699, but it’s cheaper than the RTX 3080 12GB, which launched at $1,249. That being said, the 3080 12GB has seldom been in good supply, and the price has been falling ever since the end of the GPU shortage. Compared to the standard RTX 3080 10GB, both RTX 4080 models are much more expensive.

| | RTX 4080 16GB | RTX 4080 12GB | RTX 3080 (12GB / 10GB) |
|---|---|---|---|
| Process | TSMC 5nm | TSMC 5nm | Samsung 8nm |
| Architecture | Ada Lovelace | Ada Lovelace | Ampere |
| CUDA cores | 9,728 | 7,680 | 8,960 / 8,704 |
| Memory | 16GB GDDR6X | 12GB GDDR6X | 12GB / 10GB GDDR6X |
| Boost clock speed | 2,505MHz | 2,610MHz | 1,710MHz |
| Bus width | 256-bit | 192-bit | 384-bit / 320-bit |
| Power | 320W | 285W | 350W / 320W |

At first glance, offering the same card in two memory capacities might sound like the split between the RTX 3080 10GB and the RTX 3080 12GB, which perform very similarly despite a large gap in price. However, the two RTX 4080s differ greatly not just in memory size and price but in their other specifications as well.

The RTX 4080 16GB has 9,728 CUDA cores, while the RTX 4080 12GB has just 7,680. Memory bandwidth on the 12GB model is also much lower, since it uses a 192-bit bus versus the 256-bit bus on the 16GB version. The 12GB card does have a slightly higher boost clock, but that’s more than offset by its fewer cores and lower memory bandwidth. The 16GB and 12GB models are effectively different GPUs, not merely different versions of the same card, hence the $300 price difference.
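To put numbers on that gap, a short sketch using the table above. Memory bandwidth is approximated from bus width alone here, assuming both cards run their GDDR6X at the same effective data rate, which is a simplification:

```python
# Figures from the RTX 4080 comparison table; bandwidth deficit is
# estimated from bus width only (assumes equal GDDR6X data rates).
cards = {
    "RTX 4080 16GB": {"cuda_cores": 9_728, "bus_bits": 256, "msrp": 1199},
    "RTX 4080 12GB": {"cuda_cores": 7_680, "bus_bits": 192, "msrp": 899},
}

big, small = cards["RTX 4080 16GB"], cards["RTX 4080 12GB"]
core_deficit = 1 - small["cuda_cores"] / big["cuda_cores"]  # ~0.21, i.e. ~21% fewer cores
bus_deficit = 1 - small["bus_bits"] / big["bus_bits"]       # 0.25, i.e. 25% narrower bus

print(f"RTX 4080 12GB: {core_deficit:.0%} fewer cores, {bus_deficit:.0%} narrower bus, "
      f"${big['msrp'] - small['msrp']} cheaper")
```

A card with roughly a fifth fewer cores and a quarter less bus width than its namesake is, in practice, a different product tier.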

Which RTX 40-series GPU should you buy?


Until the reviews are in, it’s hard to recommend any of the RTX 40 series cards Nvidia has revealed so far. These are some of the most expensive GPUs ever released (which hasn’t been received well by most users), and even if RTX 40 is as fast as Nvidia says it is, these high price tags are definitely going to negatively impact the value proposition of these cards.

That being said, Nvidia’s new GPUs do seem sensibly priced relative to one another. The RTX 4080 16GB offers over 2,000 more CUDA cores, 4GB more VRAM, and more memory bandwidth than the RTX 4080 12GB for $300 more. For another $400, you could get the RTX 4090, which adds more than 6,500 CUDA cores, 8GB more VRAM, and even more memory bandwidth on top of that. The RTX 4090 is genuinely in a class above the RTX 4080 16GB, unlike the RTX 3090, which was essentially an RTX 3080 with a few more cores and a higher TDP.
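The price ladder described above can be sketched in a few lines, walking the lineup from cheapest to most expensive and showing what each step up buys (MSRPs and core counts from this article):

```python
# The announced RTX 40-series lineup: (name, MSRP in USD, CUDA cores).
lineup = [
    ("RTX 4080 12GB", 899, 7_680),
    ("RTX 4080 16GB", 1199, 9_728),
    ("RTX 4090", 1599, 16_384),
]

# Walk adjacent pairs to see what each price step buys.
for (name_a, price_a, cores_a), (name_b, price_b, cores_b) in zip(lineup, lineup[1:]):
    print(f"{name_a} -> {name_b}: +${price_b - price_a} "
          f"buys {cores_b - cores_a:,} more CUDA cores")
```

The $300 step buys 2,048 extra cores, while the $400 step buys 6,656, which is why the RTX 4090 looks like the better jump on paper.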

If you’re going to spend hundreds of dollars on a cutting-edge GPU, it might be worth going all out and getting the RTX 4090. At least then you won’t be left wanting more, even if it is one of the most expensive gaming GPUs ever made. On the other hand, you still get faster ray tracing and DLSS 3 with the much cheaper RTX 4080 16GB and 12GB.

Matthew Connatser
Former Digital Trends Contributor
Matthew Connatser is a freelance writer who works on writing and updating PC guides at Digital Trends. He first got into PCs…