AMD could swipe some of the best features of Nvidia GPUs

[Image: AMD logo on the RX 7800 XT graphics card. Photo: Jacob Roach / Digital Trends]

Nvidia overwhelmingly dominates the list of the best graphics cards, and that largely comes down to the feature set it has built around DLSS. AMD isn't sitting idly by, however. The company is researching new ways to leverage neural networks to enable real-time path tracing on AMD graphics cards, something that, up to this point, has only really been possible on Nvidia GPUs.

AMD addressed the research in a blog post on GPUOpen, saying that the goal is “moving towards real-time path tracing on RDNA GPUs.” Nvidia already uses AI accelerators on RTX graphics cards to upscale an image via DLSS, but AMD is focused on a slightly different angle of performance gains — denoising.

[Image: Neural denoising in a complex virtual scene. Credit: AMD]

When you enable path tracing in a game like Alan Wake 2 or Cyberpunk 2077, the renderer casts only a handful of samples per pixel into the scene. Those rays bounce around, but few of them ever make it back to a light source. The result is a noisy image, visible in the top left of the image above, that has to be cleaned up with denoising. AMD is applying a neural network to that denoising step.
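
To get a feel for why so few samples leave so much noise, here's a toy Monte Carlo estimate of a single pixel's brightness. The scene and integrand below are made up purely for illustration; the point is that the estimator's error shrinks roughly with the square root of the sample count, so the one or two samples per pixel a real-time budget allows leave noise on the order of the signal itself.

```python
import math
import random

def shade_sample():
    """One random path's contribution to a pixel (a stand-in integrand).

    A real path tracer would trace a ray through the scene; here we just
    draw from a skewed distribution so the estimate has visible variance.
    The true mean of this distribution is 1/5 = 0.2.
    """
    return random.random() ** 4

def estimate_pixel(samples_per_pixel):
    """Average a few random contributions, as a path tracer does per pixel."""
    return sum(shade_sample() for _ in range(samples_per_pixel)) / samples_per_pixel

random.seed(0)
for spp in (1, 2, 16, 256):
    # Render the "same pixel" many times to measure how noisy the estimate is.
    estimates = [estimate_pixel(spp) for _ in range(10_000)]
    mean = sum(estimates) / len(estimates)
    std = math.sqrt(sum((e - mean) ** 2 for e in estimates) / len(estimates))
    print(f"{spp:>3} spp: mean={mean:.3f}, std={std:.3f}")
```

At one sample per pixel the standard deviation is comparable to the mean itself, which is exactly the regime a denoiser has to rescue.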

Nvidia already covers this technique with Ray Reconstruction, a sorely underrated DLSS feature. It makes a massive difference in image quality, preserving path-traced detail that would otherwise take minutes or hours per frame to render offline. AMD is looking at something similar: taking a small number of samples per pixel and reconstructing the fine details of path tracing with a neural network.

The technique AMD is researching, however, combines upscaling and denoising into a single neural network. “We research a Neural Supersampling and Denoising technique which generates high-quality denoised and supersampled images at higher display resolution than render resolution for real-time path tracing with a single neural network,” the blog post reads. “Our technique can replace multiple denoisers used for different lighting effects in rendering engine by denoising all noise in a single pass, as well as at low resolution.”
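
As a rough mental model of what a single network that denoises and supersamples in one pass could look like, here is a minimal PyTorch sketch. Everything in it is an assumption for illustration: the layer sizes, the 2x upscale factor, and the choice of inputs (noisy color plus the albedo and normal buffers denoisers commonly consume). AMD's post does not describe its architecture at this level of detail.

```python
import torch
import torch.nn as nn

class JointDenoiseUpscale(nn.Module):
    """Hypothetical single-pass denoiser + 2x supersampler (illustrative only).

    Input:  noisy path-traced color at render resolution, stacked with
            auxiliary G-buffers (albedo, normals) along the channel axis.
    Output: denoised color at 2x the render resolution.
    """

    def __init__(self, in_channels=9):  # 3 color + 3 albedo + 3 normal
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # PixelShuffle rearranges channels into a 2x larger image, so
        # denoising and upscaling happen in one forward pass.
        self.upscale = nn.Sequential(
            nn.Conv2d(32, 3 * 4, kernel_size=3, padding=1),
            nn.PixelShuffle(2),
        )

    def forward(self, x):
        return self.upscale(self.features(x))

# One 960x540 render in, one 1920x1080 display frame out.
net = JointDenoiseUpscale()
noisy = torch.rand(1, 9, 540, 960)  # batch, channels, height, width
clean = net(noisy)
print(clean.shape)  # torch.Size([1, 3, 1080, 1920])
```

The appeal of the single-pass design, per the quote above, is consolidation: one network run replaces the separate denoisers an engine would otherwise invoke for each lighting effect, and it does its work at low resolution, which is cheaper.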

This looks like foundational research for the next version of AMD's FSR, which could finally match Nvidia on performance and image quality. The lingering question is whether these techniques require any bespoke hardware. Nvidia claims the dedicated accelerators on its RTX graphics cards are necessary for AI-assisted upscaling and denoising with DLSS, so AMD may need dedicated hardware on its GPUs, too.

However, there is a world where AMD could open up FSR 4, or whatever the next version is called, to all graphics cards while still leveraging a neural network. RTX GPUs already have the hardware, and Intel's XeSS has shown it's possible to run AI models on GPUs without dedicated matrix units by falling back to general-purpose instructions like DP4a, though usually with a hit to image quality and performance.
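
For context on that fallback path: DP4a is a GPU instruction that dot-products four packed 8-bit integers and accumulates the result into a 32-bit value, which lets quantized neural-network inference run on ordinary shader cores without dedicated matrix hardware. A plain-Python sketch of the operation it performs:

```python
def dp4a(a: list[int], b: list[int], acc: int = 0) -> int:
    """Emulate the DP4a GPU instruction: dot four signed 8-bit ints, accumulate."""
    assert len(a) == len(b) == 4
    assert all(-128 <= v <= 127 for v in a + b)
    return acc + sum(x * y for x, y in zip(a, b))

# A quantized neural-net layer reduces to a large number of these dot products.
print(dp4a([1, -2, 3, 4], [5, 6, -7, 8]))  # 5 - 12 - 21 + 32 = 4
```

Dedicated tensor hardware performs whole matrix tiles of this work at once, which is the performance gap Nvidia points to when it argues DLSS needs its accelerators.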

Jacob Roach
Lead Reporter, PC Hardware