Starfield is one of the most demanding games on PC that we’ve seen recently, with even the RTX 4090 paired with AMD’s latest Ryzen 7 7800X3D only just hitting 60fps on average at 4K with all the settings maxed out. As reviewers and testers scramble to figure out why Starfield is so heavy, the experts over at Digital Foundry have discovered some stark differences between AMD and Intel / Nvidia systems.
“If you’re on Intel and Nvidia you’re getting a bizarrely worse experience here in comparison to AMD GPUs in a way that’s completely out of the norm,” explains Alexander Battaglia in a detailed 32-minute tech analysis of Starfield on PC.
AMD is Starfield’s “exclusive PC partner,” with Bethesda and AMD engineers working to optimize the game’s multithreaded code in both the Xbox and PC versions, with a focus on Ryzen 7000 processors and Radeon 7000 series graphics cards. As a result, Starfield appears to be better optimized for AMD CPUs and GPUs than for Intel CPUs and Nvidia GPUs.
Digital Foundry found that AMD’s previous-generation Radeon RX 6800 XT paired with Intel’s Core i9-12900K is around 46 percent faster than Nvidia’s previous-generation RTX 3080 on the same system. In my testing, I’ve found the RX 6800 XT can beat the RTX 3080 in a variety of games, but 46 percent is a far bigger margin than normal.
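For anyone curious how a figure like that is derived, here’s a minimal sketch of how a “percent faster” margin falls out of two average frame rates. The fps values below are hypothetical placeholders, not Digital Foundry’s measured numbers.

```python
# Minimal sketch: deriving a "percent faster" margin from two average frame rates.
# The fps figures are hypothetical placeholders, not Digital Foundry's measurements.
def percent_faster(fps_a: float, fps_b: float) -> float:
    """How much faster card A is than card B, as a percentage."""
    return (fps_a / fps_b - 1.0) * 100.0

rx_6800_xt_fps = 73.0  # hypothetical average fps
rtx_3080_fps = 50.0    # hypothetical average fps

print(f"RX 6800 XT is {percent_faster(rx_6800_xt_fps, rtx_3080_fps):.0f}% faster")
# -> RX 6800 XT is 46% faster
```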
While average frame rates are lower with the RTX 3080 on this particular system, frame times — the time it takes to render each frame — also take a big hit, with regular spikes. “Frame times in this game are poorer on ultra settings on Nvidia GPUs, and it gets worse the slower the GPU is,” says Battaglia. Ultra shadow quality might be the culprit here, so if you’re on an older Nvidia GPU, try lowering that setting in Starfield to see if it improves performance for you.
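If the frame-time terminology is new to you, the short sketch below shows how frame rate maps to frame time and how a spike stands out in a trace. The numbers are invented for illustration; real data would come from a capture tool such as PresentMon or CapFrameX.

```python
from statistics import median

# Minimal sketch of the fps/frame-time relationship and of spotting frame-time
# spikes. The trace is invented for illustration, not captured from Starfield.
def frame_time_ms(fps: float) -> float:
    """Average frame time in milliseconds at a given frame rate."""
    return 1000.0 / fps

print(f"{frame_time_ms(60):.1f} ms per frame at 60fps")  # -> 16.7 ms

# Hypothetical frame-time trace (ms): mostly smooth, with two spikes of the
# kind that register as stutter even when the average fps looks acceptable.
trace = [16.5, 16.8, 17.0, 16.6, 41.2, 16.7, 16.9, 38.5, 16.4]
typical = median(trace)
spikes = [t for t in trace if t > 2 * typical]
print(f"typical {typical:.1f} ms, spikes: {spikes}")  # -> spikes: [41.2, 38.5]
```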
But in general, Digital Foundry found that “AMD GPUs are really destroying Nvidia ones in this game in a way that’s not seen normally in rasterized titles, really far out of the norm.” It’s clear Nvidia and Intel didn’t have the same level of access as AMD, particularly because AMD paid for its PC partnership here, which saw engineers from AMD and Bethesda working directly together.
Over on the CPU side, there are some strange things happening with Intel performance in this game, too. Digital Foundry found that enabling hyperthreading on Intel CPUs results in worse average frame rates than if it’s turned off. Turning off SMT, AMD’s equivalent, doesn’t have the same impact on frame rates, but it does cause frame times to be spikier.
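One common way to quantify “spikier” frame times is to compare a high-percentile frame time between captures, since the worst one percent of frames is where stutter lives. The sketch below uses two hypothetical traces standing in for SMT-off and SMT-on runs; it isn’t Digital Foundry’s methodology or data.

```python
from statistics import quantiles

# Hedged sketch: comparing 99th-percentile frame times between two captures.
# The traces are invented placeholders, not measured Starfield data.
def p99(frame_times_ms: list[float]) -> float:
    """99th-percentile frame time in milliseconds (roughly the 1% worst frames)."""
    return quantiles(frame_times_ms, n=100)[-1]

smt_on = [16.7, 16.9, 17.1, 16.8, 17.0, 16.6, 16.9, 17.2, 16.8, 17.0]
smt_off = [16.6, 16.8, 35.4, 16.7, 16.9, 16.8, 33.9, 16.7, 17.0, 16.8]

print(f"SMT on : p99 = {p99(smt_on):.1f} ms")
print(f"SMT off: p99 = {p99(smt_off):.1f} ms  (higher p99 = spikier frame delivery)")
```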
Image: Digital Foundry
If Starfield were fully optimized for Intel’s hyperthreading, we’d expect performance to scale with additional cores and threads rather than regress when hyperthreading is enabled. This is something Bethesda may address in subsequent updates to the game.
Overall, Digital Foundry concludes that Starfield “seems optimized for AMD systems, but not so much so for Intel and Nvidia ones.” “I would say Bethesda needs to do some work in optimizing better for those platforms, and Intel and Nvidia also need to put out some new drivers over time,” says Battaglia.
Starfield director Todd Howard was asked why Bethesda hadn’t optimized the game for PCs during a Bloomberg interview last week. “We did, it’s running great,” responded Howard. “It is a next-gen PC game, we really do push the technologies. So you may need to upgrade your PC for this game.”
That answer hasn’t satisfied the many players wondering why Starfield doesn’t run as well on their Nvidia and Intel systems, which account for the vast majority of PC gamers in Steam’s hardware survey. A few patches and some updated drivers might help soon, though.