
Intel may already be conceding its fight against Nvidia

Two Intel Arc graphics cards on a pink background.
Jacob Roach / Digital Trends

Nvidia continues to own the top-of-the-line GPU space, and the competition just hasn’t been able to, well, compete. The announcement of the impressive-sounding RTX 40 Super cards cements the lead even further.

As a result, AMD is said to be giving up on the high-end graphics card market with its next-gen GPUs. And now, a new rumor tells us that Intel might be doing the same with Arc Battlemage, its anticipated next-generation graphics cards, which are supposed to launch later this year. While this is bad news, it’s not surprising at all.


Arc Battlemage leaks

First, let’s talk about what’s new. Intel kept quiet about Arc Battlemage during CES 2024, but Tom Petersen, an Intel fellow, later revealed in an interview that it’s alive and well. The cards might even come out this year, although given Intel’s track record for missing GPU deadlines, 2025 seems like a safer bet. But what kind of performance can we expect out of these new graphics cards? This is where YouTuber RedGamingTech weighs in.


RedGamingTech posted a big update on Intel Arc Battlemage specs in his latest video, and it doesn’t sound particularly good for high-end gaming enthusiasts. According to the YouTuber, the flagship chip’s specifications may differ significantly from his previous predictions. What’s worse, it might never even be released.

Initially, RedGamingTech suggested that the top Battlemage GPU would feature 56 Xe cores and a frequency of up to 3GHz. That’s still the case, but rumor has it that there’s been a big shake-up in memory bus and cache configuration. Instead of the 256-bit bus and the 116MB of L2 cache, the YouTuber now says that we can expect a 192-bit bus, 8MB of L2 cache, and a whopping 512MB of Adamantine cache.

Adamantine cache is still pretty unknown to us at this stage, although an Intel patent that PCGamer shared details on tells us more about it. It’s essentially Level 4 cache that’s comparable to AMD’s Infinity Cache and appears to work in a similar way.

That sounds pretty good, right? With 56 Xe cores, the card would be a huge upgrade over the Arc A770 and its 32 cores. However, despite the massive L4 cache, those specs already hint at a less-than-high-end flagship for Intel. With a 192-bit bus, Intel would probably stop at around 12GB of VRAM, unless it gets adventurous with doubled-up memory the way AMD did with the RX 7600 XT and Nvidia did with the 16GB RTX 4060 Ti. (Let’s hope it doesn’t.)
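The 12GB figure falls out of simple arithmetic: each GDDR6 chip sits on a 32-bit channel, so a 192-bit bus fits six chips, and at the common 2GB-per-chip density that makes 12GB. Here’s a quick sketch of that math (the function and its defaults are illustrative assumptions, not Intel’s actual board design):

```python
def vram_gb(bus_width_bits, gb_per_chip=2, clamshell=False):
    """Estimate VRAM capacity from memory bus width.

    Assumes standard GDDR6: one 32-bit channel per chip, 2GB per chip.
    Clamshell mode puts two chips on each channel, doubling capacity.
    """
    chips = bus_width_bits // 32  # each GDDR6 chip occupies a 32-bit channel
    if clamshell:
        chips *= 2                # two chips share one channel
    return chips * gb_per_chip

print(vram_gb(192))                  # 12 -- the rumored Battlemage config
print(vram_gb(256))                  # 16 -- a typical high-end layout
print(vram_gb(128, clamshell=True))  # 16 -- RX 7600 XT / RTX 4060 Ti 16GB style
```

The clamshell case is exactly the “adventurous” route mentioned above: AMD and Nvidia kept their 128-bit buses but doubled the chip count to hit 16GB.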

Regardless of whether this GPU is even real, RedGamingTech suspects that Intel may choose not to release it at all due to unsatisfactory profit margins. Instead, Intel might focus on a GPU with 40 Xe cores, a 192-bit memory bus, 18MB of L2 cache, and zero “Adamantine” cache.

Is it time for Nvidia to celebrate?

Nvidia GeForce RTX 4090 GPU.
Jacob Roach / Digital Trends

AMD is reportedly bowing out of the high-end GPU race in this next generation. Now, Intel is said to be doing the same. Where does that leave Nvidia? Right at the very top, with complete control of the enthusiast GPU market and nothing to worry about in that regard.

It’s a dream for Nvidia, but it’s not so great for us, the end users. Letting Nvidia drive up prices as much as it wishes brought us the RTX 40-series, where price and performance often just don’t add up. With zero competition at the high end, the RTX 5090 might turn out to be a terrifying monstrosity with an eye-watering price tag. After all, why wouldn’t it be? It’s not like AMD or Intel are doing anything to keep Nvidia in check.

On the other hand, even if Intel chooses to focus on the mainstream segment, things won’t change much. AMD is Nvidia’s main competitor, and even now, with a couple of horses in this race, it still can’t match Nvidia’s flagship RTX 4090, or even the surprisingly impressive new RTX 40 Super cards. Intel, currently one generation behind (and soon to be two), wouldn’t be able to beat Nvidia’s future flagship either.

For the mainstream market, which accounts for the vast majority of GPUs sold, it’s good news if AMD and Intel stick around and give Nvidia some heat; those prices might end up less inflated as a result. High-end gaming, meanwhile, will be pricier than ever, but Intel wouldn’t have been able to stop Nvidia there anyway, regardless of the card it might never release.

Monica J. White
Monica is a computing writer at Digital Trends, focusing on PC hardware. Since joining the team in 2021, Monica has written…