Bethesda games are genuinely badly optimized. Most gamers will say that without knowing why, and I know it's frustrating, but every graphics study of a Bethesda game will make you shriek when you spot the number of draw calls the average scene pumps out.
Big scenes like Corvega in F4 pumped out 14k, and on average the game was working with 6k draw calls. To this day, even though newer APIs are overwhelmingly better at handling high draw counts, 6k draw calls is still a significant amount.
Going back a little less than two decades: while Bethesda at the time couldn't be truly blamed, as they were really pushing the envelope, Oblivion being such a disaster when it came to performance was one of the deciding factors for AAA studios to follow Crytek in implementing deferred rendering if they were working on open worlds.
In typical Bethesda fashion, they quickly jerry-rigged deferred rendering into their engine without actually taking the time to do it properly and, unsurprisingly, it worked like shit.
For example, their method for soft shadows nearly doubles the number of draw calls, because they re-draw every single object that's in the frustum. It goes against the very purpose of using deferred rendering!
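To make the math concrete, here's a throwaway C++ toy (made-up object counts, obviously not their actual renderer code) that just counts CPU-side draw submissions: geometry submitted once into the G-buffer plus a few screen-space lighting passes, versus re-submitting every frustum object a second time for the shadow pass.

```cpp
// Toy model of CPU-side draw call counts per frame. Not real engine code:
// the object count and pass structure are illustrative assumptions.
#include <cstdio>

int main() {
    const int visible_objects = 6000;   // objects surviving frustum culling (assumed)

    // Plain deferred rendering: geometry is submitted ONCE into the G-buffer,
    // then lighting is resolved in screen space with a handful of
    // full-screen passes that cost almost no extra draw calls.
    int deferred_calls = visible_objects + 10;  // +10 ~ screen-space passes

    // The "redraw for soft shadows" approach described above: every visible
    // object gets submitted a second time, so the CPU-side count roughly doubles.
    int redraw_calls = visible_objects * 2 + 10;

    std::printf("deferred only       : %d draw calls\n", deferred_calls);
    std::printf("with geometry redraw: %d draw calls\n", redraw_calls);
    return 0;
}
```

Whatever the exact numbers are in a real capture, the shape of the problem is the same: the CPU submission cost scales with objects submitted, and submitting them twice buys you nothing that deferred rendering wasn't supposed to give you for free.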
There's other dumb stuff, too, like static mesh instancing still being a broken mystery (sketch of what instancing should buy you below), AMD CPUs performing abnormally worse than Intel ones, underuse of HLOD/proxy meshes for no explicable reason, modular meshes with unique atlases all over the place...
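On the instancing point: the whole idea is that a clutter-heavy scene full of identical static meshes should collapse into one instanced draw per unique mesh, not one draw per object. Tiny sketch of that arithmetic (invented mesh names and counts, not actual engine data):

```cpp
// Toy illustration of why static mesh instancing matters for draw calls.
// Scene contents are made-up assumptions; the point is the grouping.
#include <cstdio>
#include <map>
#include <string>
#include <vector>

int main() {
    // A clutter-heavy scene: many copies of a few static meshes.
    std::vector<std::string> scene;
    for (int i = 0; i < 900;  ++i) scene.push_back("concrete_barrier");
    for (int i = 0; i < 1200; ++i) scene.push_back("tin_can");
    for (int i = 0; i < 400;  ++i) scene.push_back("pipe_segment");

    // No instancing: one draw call per object.
    std::printf("naive    : %zu draw calls\n", scene.size());

    // Instancing: group identical meshes and issue ONE instanced draw per
    // unique mesh, with per-instance transforms uploaded in a buffer.
    std::map<std::string, int> batches;
    for (const auto& mesh : scene) ++batches[mesh];
    std::printf("instanced: %zu draw calls\n", batches.size());
    for (const auto& [mesh, count] : batches)
        std::printf("  %s x%d -> 1 call\n", mesh.c_str(), count);
    return 0;
}
```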
Other things like physics and NPCs are actually extremely efficient. But they just can't nail their lighting without shitting the bed.
Also want to mention that GI itself has very little actual impact on CPU performance; the newer problem is custom engines reserving threads that just sit there waiting on the GPU while ray traces are computed.
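Rough sketch of what I mean by a reserved thread waiting on the GPU; the "GPU" here is just a sleeping thread and the 12 ms is an invented number, but it shows how a dedicated thread can spend basically its whole frame budget blocked on a fence instead of doing CPU work:

```cpp
// Toy sketch of the "reserved thread waits on the GPU" pattern.
// The GPU is simulated with a sleeping thread; all timings are invented.
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>
#include <utility>

using namespace std::chrono;

// Pretend GPU job: ray-traced GI taking ~12 ms of GPU time (assumed number).
static void fake_gpu_gi_job(std::promise<void> fence) {
    std::this_thread::sleep_for(milliseconds(12));
    fence.set_value();   // the "fence" signals when the GPU work is done
}

int main() {
    std::promise<void> fence;
    std::future<void> done = fence.get_future();
    std::thread gpu(fake_gpu_gi_job, std::move(fence));

    auto t0 = steady_clock::now();
    // A dedicated engine thread doing nothing but blocking on the fence:
    // zero CPU work gets done here, yet the hardware thread stays occupied
    // for the entire wait.
    done.wait();
    auto waited = duration_cast<milliseconds>(steady_clock::now() - t0);

    std::printf("GI thread spent %lld ms blocked on the GPU fence\n",
                static_cast<long long>(waited.count()));
    gpu.join();
    return 0;
}
```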
So far, there's really no reason to believe Starfield is doing something so special that it can't hit reasonable CPU performance, and yet even the trailer is a stuttery mess.
Odds are, Bethesda is just doing more of the same, badly, again.