
With exclusive PC partnerships, everyone loses

A pilot inside of a ship in Starfield.
Image used with permission by copyright holder
Jacob Roach in a promotional image for ReSpec
This story is part of Jacob Roach's ReSpec series, covering the world of PC gaming and hardware.

I’m worried. This past week, AMD announced it was the “exclusive PC partner” for the upcoming Starfield, working directly with Bethesda Game Studios to optimize the game for AMD hardware and getting AMD’s FidelityFX Super Resolution (FSR) up and running in the game (the second, much better version, to be exact).

That’s not a problem. It’s a good thing. Nvidia and AMD routinely enter co-marketing arrangements with upcoming games. The company promotes the game or bundles a code with a new hardware purchase, and it usually dedicates some engineers to assist with getting features working in the game. Nvidia, for example, recently offered a copy of Diablo IV with the purchase of select RTX 40-series GPUs, and the game supports Nvidia’s latest version of Deep Learning Super Sampling (DLSS).


What’s the problem? Well, it’s the word “exclusive” that AMD used, and its lack of clarity about what that means. This isn’t throwaway phrasing. On the announcement page, AMD uses “exclusive” three times in four paragraphs of text. And the accompanying YouTube video, which has twice as many dislikes as likes (visible through a browser extension that restores the dislike count), uses “exclusive” in the title.

AMD is Starfield’s Exclusive PC Partner

I’m not trying to nitpick, but clearly, AMD is putting weight behind being the “exclusive” PC partner of Starfield, so it shouldn’t come as a surprise that gamers are putting weight behind the word, too. And the results are clear: Speculation since the announcement has run rampant, with many assuming the game won’t support Nvidia’s DLSS or Intel’s XeSS.


We don’t know if the game will exclusively use FSR 2. I’ve repeatedly reached out to AMD, but I haven’t gotten clarity on what exactly “exclusive” means. The most I’ve heard is the following: “I do not have a statement to share at this time.” It should go without saying, though, that having multiple PC options is important for a game as monumental as Starfield, the first single-player RPG Bethesda Game Studios has released in over eight years. It’s a big deal — and for all of us, it creates a problem.

More than a word

Cal picking up a stim upgrade with BD.

Shortly before AMD’s announcement, Wccftech pointed out that several major AAA releases have implemented FSR but not DLSS. These games, including Star Wars Jedi: Survivor, The Callisto Protocol, and Dead Island 2, were all featured as part of AMD co-marketing agreements. In the same time frame, games with Nvidia co-marketing agreements have implemented FSR at or shortly after launch.

Take your tin foil hats off for just a moment, though. There are several other reasons why some games only feature FSR (or DLSS, for that matter). Resident Evil games, for example, have only supported FSR since Resident Evil 7, so there’s definitely a world where the developers have found something that worked and ran with it. Development time is very limited in many cases, so a game lacking DLSS support doesn’t inherently mean that AMD blocked it.

And for Nvidia’s part, there are games that only support DLSS, as well. A Plague Tale: Requiem, the Crysis trilogy, and Control are prime examples. AMD’s argument seems to be that FSR works on just about anything, while DLSS is exclusive to Nvidia RTX graphics cards. “… we believe an open approach that is broadly supported on multiple hardware platforms is the best approach that benefits developers and gamers,” the company said in a statement to Wccftech.

We have a relatively trivial wrapper around DLSS, FSR2, and XeSS. All three APIs are so similar nowadays, there's really no excuse.

— Nico (@mempodev) June 27, 2023

That brings us back to the word “exclusive,” though. As some developers have already pointed out, the backbone of DLSS, FSR, and XeSS is largely the same, saying there’s “really no excuse” for not using all three. And some developers, such as PureDark, have begun building careers out of developing mods that implement the missing features (they’ve already promised a mod for Starfield).
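To illustrate what developers mean by a “trivial wrapper,” here is a minimal sketch in C++. Every type and function name below is hypothetical, not any vendor’s real API; actual integrations call each SDK directly (FidelityFX for FSR 2, Nvidia’s NGX for DLSS, Intel’s XeSS SDK). The point is that all three fit behind one interface because they consume the same core inputs.

```cpp
#include <string>

// Hypothetical shared inputs: DLSS, FSR 2, and XeSS all consume the same
// core data (a low-resolution color buffer, depth, and motion vectors)
// and upscale from a render resolution to an output resolution.
struct UpscaleParams {
    int renderWidth = 0, renderHeight = 0;  // internal render resolution
    int outputWidth = 0, outputHeight = 0;  // presented resolution
    // ...in a real engine: color/depth/motion-vector textures, jitter, etc.
};

// Illustrative common interface; each backend would wrap the real SDK
// calls behind the same evaluate() entry point.
class IUpscaler {
public:
    virtual ~IUpscaler() = default;
    virtual std::string name() const = 0;
    virtual void evaluate(const UpscaleParams& params) = 0;
};

class Fsr2Backend : public IUpscaler {
public:
    std::string name() const override { return "FSR 2"; }
    void evaluate(const UpscaleParams&) override { /* dispatch FSR 2 pass */ }
};

class DlssBackend : public IUpscaler {
public:
    std::string name() const override { return "DLSS"; }
    void evaluate(const UpscaleParams&) override { /* dispatch DLSS pass */ }
};

class XessBackend : public IUpscaler {
public:
    std::string name() const override { return "XeSS"; }
    void evaluate(const UpscaleParams&) override { /* dispatch XeSS pass */ }
};
```

With a shape like this, adding a second or third upscaler is largely a matter of writing another backend, which is why developers argue there’s little technical justification for shipping only one.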

AMD’s commitment to providing an open technology has no future in a world of exclusive partnerships. At the moment, we don’t know if Starfield will only support FSR, and any claims that AMD is entering agreements to block DLSS and XeSS are just speculation. But when you stop and ask yourself what else “exclusive partnership” could mean — it certainly doesn’t sound good. It’s a troubling sign of the times, and if feature exclusivity becomes the norm, it’ll be yet another example of fragmentation that only hurts PC gaming as a whole.

A swift counter

Nvidia GeForce RTX 4090 GPU.
Jacob Roach / Digital Trends

Let’s look at the counterargument. The internet is, unsurprisingly, outraged by the thought of Starfield exclusively using FSR, but isn’t that what Nvidia does? After all, you can’t use DLSS on any graphics card. The two most popular GPUs on Steam, Nvidia’s own GTX 1060 and 1650, don’t support DLSS. But they support FSR. It can seem like a double standard when Nvidia has its own exclusive technology.

DLSS only works on RTX GPUs. That’s because DLSS requires dedicated hardware found only on RTX cards, but that doesn’t matter here. Nvidia developed a feature to sell new graphics cards, plain and simple. Comparing a hardware-bound feature to a game shipping one image reconstruction technique to the exclusion of another isn’t an apples-to-apples comparison. They aren’t the same thing.

AMD could, for example, decide to make its upcoming FSR 3 exclusive to its GPUs. In that world, we have two GPU brands competing with each other on features, and it’s up to buyers and reviewers to decide which is best. In the case of Starfield, if it will indeed exclusively use FSR, there’s no competition. It’s just one company putting a roadblock in front of the other.

An announcement slide showing FSR 3.
AMD

It’s important to reiterate once again that we don’t know, with certainty, that Starfield will exclusively use FSR. The problem is AMD’s silence on the situation, allowing speculation to run wild. Nvidia, for its part, hasn’t minced words.

“Nvidia does not and will not block, restrict, discourage, or hinder developers from implementing competitor technologies in any way,” said Nvidia’s Keita Iida in a statement to Digital Trends and other outlets.

No excuse for exclusivity

RX 7900 XTX and RX 7900 XT on a pink background.
Jacob Roach / Digital Trends

In the early days of DLSS and FSR, we saw a lot of games that supported only one of the two features. Since then, however, we (and developers) have learned just how similar the two are behind the scenes, and most major AAA releases come with at least FSR and DLSS support, and oftentimes XeSS support, as well.

Dying Light 2 and Call of Duty: Modern Warfare II support all three. Atomic Heart, which Nvidia has been promoting for nearly six years, supports DLSS and FSR (as does Dead Space). Diablo IV supports all three as well, with a unique implementation that restricts your options based on your hardware (DLSS for RTX, XeSS for Arc, and FSR for everything else). For a game as large as Starfield, there’s simply no excuse for supporting only one of the three.
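Diablo IV’s hardware-gated approach boils down to a simple dispatch on the detected GPU. This toy sketch captures the idea; the enum and function are illustrative, not Blizzard’s actual code.

```cpp
#include <string>

// Rough categories matching the behavior described above.
enum class GpuVendor { NvidiaRtx, IntelArc, Other };

// Hypothetical selection logic: DLSS on RTX cards, XeSS on Arc,
// and FSR as the fallback for everything else.
std::string pickUpscaler(GpuVendor vendor) {
    switch (vendor) {
        case GpuVendor::NvidiaRtx: return "DLSS";
        case GpuVendor::IntelArc:  return "XeSS";
        default:                   return "FSR";
    }
}
```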

A character in Starfield.

This, on its own, isn’t a big deal. Modders will get DLSS up and running in the game, regardless of whether it’s officially supported, and I don’t think Nvidia users will have a subpar experience playing it (they can use FSR, too, after all). But it sets a bad precedent.

If AMD or Nvidia starts blocking competitor tech, everyone loses. We’re no longer on a level playing field where competing technologies prove which is best; instead, it becomes a contest of who can dedicate the most time (and money) to locking out the other. It’s not dissimilar from the shenanigans we’ve seen in the recent FTC v. Microsoft trial, where console exclusivity deals have blocked games simply based on hardware preference and who has a larger exclusivity budget to throw around.

I hope PC isn’t headed to a similar future. I understand the need for Nvidia and AMD to enter co-marketing agreements to promote games and their products, but in a world where vendors are locked out of implementing features and providing optimized drivers for the biggest game releases, it’s a race to the bottom. And ultimately, gamers have to pay the price for that.

That, thankfully, isn’t the world we live in today. I just hope it’s not where PC gaming is headed.


Jacob Roach
Lead Reporter, PC Hardware