The surprising reason your powerful PC still can’t handle the latest games

We’re off to a rocky start with PC releases in 2023. Hogwarts Legacy, Resident Evil 4 Remake, Forspoken, and most recently and notably The Last of Us Part One have all launched in dire states, with crashes, hitches, and lower performance despite a minor increase in visual quality. And a big reason why is that the graphics cards of the last few years aren’t equipped to handle the demands of games today.

The GPUs themselves are powerful enough, and games haven't suddenly gotten more demanding for no reason. The problem is video memory, or VRAM. Many of the most powerful GPUs from the previous generation weren't set up to handle the VRAM demands of modern games, which may explain why your relatively powerful PC can't handle the latest and most exciting new games.

What does your VRAM do anyway?

Nvidia GPU core.
Niels Broekhuijsen/Digital Trends

Think of your graphics card like a self-contained computer. In your PC, your processor and RAM work together to do the brunt of the processing work. Your processor does the actual calculations, while your RAM keeps the data it needs for those calculations close at hand. If your CPU had to go out to your hard drive every time it wanted to do a calculation, your computer would be too slow to be useful.

Your graphics card works the same way. The GPU handles the actual processing, while the VRAM holds the data necessary for that processing. This most notably comes up with texture resolution, as higher-resolution textures take up far more space than lower-resolution ones. But other data flows in and out of VRAM, too: shadow maps, geometry, and critically, shaders.
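To get a feel for why texture resolution matters so much, here's a rough back-of-the-envelope estimate for an uncompressed RGBA texture. This is a sketch for intuition only, not how any particular engine actually allocates memory (real games use compressed formats, and the function name is made up for illustration):

```python
def texture_vram_bytes(width, height, bytes_per_pixel=4, mipmaps=True):
    """Rough VRAM footprint of one uncompressed texture.

    A full mipmap chain (progressively smaller copies of the texture)
    adds roughly one third on top of the base image, because
    1 + 1/4 + 1/16 + ... approaches 4/3.
    """
    base = width * height * bytes_per_pixel
    return base * 4 // 3 if mipmaps else base

# Doubling resolution quadruples memory use, so a 4K texture
# is sixteen times the size of a 1K texture:
one_k = texture_vram_bytes(1024, 1024)   # roughly 5.6MB with mipmaps
four_k = texture_vram_bytes(4096, 4096)  # roughly 89MB with mipmaps
```

At those sizes, a scene with a few hundred 4K textures in flight can plausibly chew through 8GB on textures alone, which is why texture quality is usually the first setting to lower when VRAM runs short.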

Shaders, especially in titles with ray tracing, are complex and require a lot of space in VRAM. Along with rising texture resolutions, the demands of modern AAA games often push past the 8GB of VRAM that has been the standard in the past, especially if you're playing at higher resolutions. Unfortunately, this isn't a problem that a lot of last-gen GPUs accounted for.

The RTX 3070 Ti problem

Nvidia's RTX 3070 Ti graphics card.

When ExtremeTech published a round-up of RTX 3070 Ti reviews, it didn’t mince words. The card had a “long-term problem” with its low VRAM, and we’re starting to see that problem take shape.

Resident Evil 4 Remake can hog up to 8GB of VRAM on textures alone, though you have the option to go much lower. The Last of Us Part One can consume nearly 7GB at its lowest graphics preset and upwards of 14GB at its highest. And Hogwarts Legacy sucked up nearly 13GB of VRAM with ray tracing on, and close to 8GB with it off.

The effects of this are already clear. In preliminary testing of The Last of Us Part One, Hardware Unboxed found massive stuttering with 8GB of VRAM compared to 16GB, even with two graphics cards that should perform around the same level. Keep in mind that the game's recommended system requirements call for only 4GB of VRAM.

Even powerful graphics cards from the last couple of years are running out of VRAM. Stuttering is one issue, but running out of VRAM can also cause crashes and force you to turn down settings that your GPU is otherwise capable of handling.

I’m calling this the RTX 3070 Ti problem, but it’s not exclusive to the RTX 3070 Ti. The card just serves as a good touchstone for a wide swath of GPUs that are stuck at or under 8GB of VRAM despite sporting excellent GPU power otherwise. Even the 10GB RTX 3080 isn’t immune, especially at the highest graphics settings at 4K.

Focused in one direction

Two intel Arc graphics cards on a pink background.
Jacob Roach / Digital Trends

It’s upsetting that graphics cards that should be plenty powerful to run modern games are simply running out of VRAM, causing stuttering and crashes that shouldn’t be happening. Most of this problem is focused in one direction, though: Nvidia.

Nvidia makes the best graphics cards you can buy today, but AMD and Intel have focused some effort on boosting VRAM, even on lower-end models. For example, Intel’s Arc A770 includes 16GB of VRAM for under $350, while even the $900 RTX 4070 Ti only includes 12GB. Similarly, AMD opted for 12GB of memory for its midrange RX 6700 XT, while Nvidia stuck with 8GB. That can make a difference in games like Hogwarts Legacy, where Intel’s GPU performs much better than its price would suggest.

Some of that is being rectified with newer cards. Rumors suggest Nvidia’s RTX 4070 could carry 12GB of VRAM, but it still stings that high-end GPUs capable of running the most demanding games are coming up against issues simply due to VRAM limitations. Unfortunately, there’s not a lot you can do if you’re running out of video memory outside of upgrading your graphics card.

You can reduce some stuttering issues, though. If you’re limited by VRAM, turning down your texture resolution can help a lot. You can also reset your shader cache through AMD Software and try increasing your shader cache size in the Nvidia Control Panel. The ultimate fix, though, is more VRAM on graphics cards, especially in lower-end models, which is going to come as a major letdown for those who recently upgraded.
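If you want to confirm that VRAM is actually your bottleneck before fiddling with settings, Nvidia's nvidia-smi command-line tool can report used versus total memory. Here's a small hypothetical helper along those lines (the function names are mine, and this assumes an Nvidia GPU with nvidia-smi on your PATH):

```python
import subprocess


def parse_vram(csv_line):
    # With the flags below, nvidia-smi prints "used, total" in MiB,
    # e.g. "6144, 8192".
    used, total = (int(x) for x in csv_line.strip().split(", "))
    return used, total


def vram_usage_mib():
    # Query the first GPU's used and total VRAM (Nvidia GPUs only).
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()[0]
    return parse_vram(out)
```

If used memory sits at or near the total while a game stutters, you're most likely hitting the VRAM wall described above, and lowering texture quality is the setting most likely to help.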

Jacob Roach
Senior Staff Writer, Computing