
Digital Foundry: Starfield Tech Breakdown - 30FPS, Visuals, Rendering Tech + Game Impressions

PaintTinJr

Member
DF sets the record straight. And people still trying to criticize 30fps here lol.
Just to be clear in advance: this comment isn't meant to be down on the game or on DF, but they didn't actually set the record straight. They hypothesized that the game is CPU bound, while showing Star Citizen at 17fps and claiming it was CPU bound based on their monitoring numbers, making no allowance for the equally likely possibility that both games are bound on moving CPU-side data into GPU VRAM, whether by memory bandwidth or even by PCIe bandwidth.

Their hypothesis is a little flaky given that similar game-data mechanics worked in Bethesda's 360 and PS3 games, and those consoles had much smaller memory bandwidths, 32x less unified RAM, and much weaker CPUs in IPC, with lower general-purpose core and thread counts, lack of SMT, etc.

Going by the information that components of The Forge engine have been added to the game's engine, it is more likely that the underlying problem is that they have chosen to continue with a deferred rendering solution to provide the GI lighting they mentioned, which typically results in 30fps games: although deferred rendering makes it easier to utilise more of the GPU's rendering potential, it adds latency for every state change of each deferred buffer being rendered, and further latency from gathering and compositing those buffers before a front-buffer frame is actually presented to the user. Triple buffering is typically considered with deferred rendering, also to help with frame-pacing, but that in turn adds input latency.
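For anyone unfamiliar with the pass ordering I'm describing, here's a rough toy sketch (Python, purely illustrative with made-up pass costs, not Bethesda's renderer) of how a deferred frame stacks up latency before anything reaches the display:

# Purely illustrative sketch of deferred-frame latency stacking (made-up pass costs,
# not Bethesda's renderer). Each pass must finish before the next can read its output.

FRAME_PASSES = [
    ("geometry -> G-buffer (albedo/normals/depth)", 5.0),  # fill the deferred buffers
    ("GI / lighting accumulation",                  6.0),  # read G-buffer, write light buffer
    ("gather + composite + post-process",           3.0),  # combine buffers into the final image
    ("present (swap to front buffer)",              0.5),
]

def frame_time_ms(passes):
    # The passes run back to back, so their costs add up before anything is displayed.
    return sum(cost for _, cost in passes)

if __name__ == "__main__":
    for name, cost in FRAME_PASSES:
        print(f"{name:<46} {cost:4.1f} ms")
    print(f"{'total before the frame reaches the display':<46} {frame_time_ms(FRAME_PASSES):4.1f} ms")
    # Triple buffering would add a further one to two frames of delay on top of this,
    # which is the extra input latency mentioned above.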
 
Last edited:

clarky

Gold Member
Yes but it's not doing every single thing at once. When you're in space the game isn't running hundreds of NPC routines. The moment to moment gameplay they've shown so far could run at 60fps on a Series X. No joke, they should ask for deeper help from id Software beyond post-processing effects. Those guys managed to put Doom Eternal on the Switch.
I think you need to be hired by Bethesda; your ability to come to conclusions on what frame rates can be achieved based on a 45-minute video is second to none.

Personally I think you and a few others are just arguing in bad faith. I doubt you'll get anywhere near a constant 60fps on an equivalent PC during heavy scenes. Guess we'll find out soon enough.
 
Last edited:

PaintTinJr

Member
Looking forward to those shader and traversal stutters.
I was reading about The Forge's shader tooling, and they have their own superset of HLSL (FSL) that can precompile all the shaders in advance. Whether Bethesda has taken that "lego block" from The Forge isn't clear, but if they have, shader-compilation stutter shouldn't be an issue AFAIK.
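Very roughly, the difference is whether the compile cost is paid on a loading screen or mid-frame at draw time. A toy sketch of that idea (Python, hypothetical names, nothing to do with FSL's actual toolchain):

import time

# Toy model of shader permutations a game might need. The names are made up.
PERMUTATIONS = [("lit_opaque", v) for v in range(4)] + [("skinned", v) for v in range(4)]

def compile_permutation(name, variant):
    time.sleep(0.01)            # stand-in for a driver compile that takes real time
    return f"{name}:{variant}"  # stand-in for the compiled pipeline object

def precompile_all(cache):
    # Done during loading: the cost is paid up front, off the gameplay path.
    for name, variant in PERMUTATIONS:
        cache[(name, variant)] = compile_permutation(name, variant)

def draw(cache, name, variant):
    key = (name, variant)
    if key not in cache:
        # Without precompilation, this compile lands mid-frame -> a visible hitch.
        cache[key] = compile_permutation(name, variant)
    return cache[key]

if __name__ == "__main__":
    cache = {}
    precompile_all(cache)
    start = time.perf_counter()
    draw(cache, "lit_opaque", 2)   # already cached, so no hitch during gameplay
    print(f"draw-time cost: {(time.perf_counter() - start) * 1000:.2f} ms")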
 

avin

Member
Just to be clear in advance: this comment isn't meant to be down on the game or on DF, but they didn't actually set the record straight. They hypothesized that the game is CPU bound, while showing Star Citizen at 17fps and claiming it was CPU bound based on their monitoring numbers, making no allowance for the equally likely possibility that both games are bound on moving CPU-side data into GPU VRAM, whether by memory bandwidth or even by PCIe bandwidth.

Can you clarify? "CPU Data to GPU VRAM bound", limited by memory bandwidth? I'd be grateful if you could explain what this means.

avin
 

Roxkis_ii

Member
Just look at the game lmao...you can't see it?

It's literally a full space flight sim where you can instantly board ships, physics out the ass, look at how the bodies fly around, etc. Different atmospheres, full global illumination lighting from planets. Moons you can jump to and explore... I don't know how you can't see this?
No Man's Sky running at 60fps:
omni-man-invincible.gif
 
Last edited:

FireFly

Member
Why? Does that happen on other titles with this engine? I thought it was predominantly a UE thing.
It seems to be especially bad with UE4 titles but also happens on other DirectX 12 titles such as Elden Ring and the Dead Space remake.

I believe Starfield will be Bethesda's first DirectX 12 game.
 

PaintTinJr

Member
Can you clarify? "CPU Data to GPU VRAM bound", limited by memory bandwidth? I'd be grateful if you could explain what this means.

avin
In Star Citizen they showed some train/monorail, IIRC, with the frame rate dropping to 17fps on their high-end PC - assuming it was Alex's LGA2066-socketed system - and because the game is making the data in VRAM, and possibly RAM, redundant very quickly, I was suggesting it could be that the GPU is stuck awaiting data transfers into VRAM before it can finish rendering, yielding low GPU utilisation from all the wait time.

The exact reason for the waiting could be that the CPU can't unpack data into RAM fast enough, resulting in delays issuing the streamed buffers to VRAM. Or you might have the situation where the RAM data is all ready to go, but the rate at which the player's position is changing makes the streamed data and rendering commands redundant in less than a second, so the PCIe bus gets thrashed by transfers that are already stale by the time they complete, further eating bandwidth, delaying the valid transfers, and leaving the GPU waiting for work.

You could also have the same heavy data transfers, but with the VRAM bandwidth being thrashed by data/textures getting unpacked by a shader into VRAM, with a second or less before that data gets evicted and the next set is unpacked, making the whole bottleneck a VRAM latency/write-bandwidth issue while half of the shader units sit idle.
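To put rough numbers on why that starves the GPU, here is a back-of-envelope sketch (Python; every figure is an assumption for illustration, not a measurement of either game):

# Back-of-envelope look at the streaming scenario described above.
# Every figure here is an assumption for illustration, not a measurement.

PCIE3_X16_GBPS = 16.0        # rough practical one-way bandwidth of PCIe 3.0 x16, in GB/s
FRAME_BUDGET_MS = 1000 / 60  # target frame time at 60fps

def transfer_time_ms(megabytes, bus_gbps=PCIE3_X16_GBPS):
    # Time the bus is occupied moving this much data into VRAM.
    return (megabytes / 1024) / bus_gbps * 1000

if __name__ == "__main__":
    streamed_mb = 300        # assumed burst of textures/geometry during fast traversal
    stale_fraction = 0.5     # assumed share that goes redundant before the GPU uses it
    t = transfer_time_ms(streamed_mb)
    print(f"transfer time: {t:.1f} ms vs {FRAME_BUDGET_MS:.1f} ms frame budget at 60fps")
    print(f"of which ~{t * stale_fraction:.1f} ms moved data that was stale on arrival")
    # If the transfer alone blows the frame budget, the GPU idles even though neither
    # the CPU cores nor the shader units are the limiting factor.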
 
Last edited:

clarky

Gold Member
It seems to be especially bad with UE4 titles but also happens on other DirectX 12 titles such as Elden Ring and the Dead Space remake.

I believe Starfield will be Bethesda's first DirectX 12 game.
Well, fingers crossed then, although I fully expect it to run like shit at launch. I'm holding off on upgrading until I can see if it's worth it.
 

Fredrik

Member
Oh no. An NPC disappeared. Anyway. Is storing every little thing worth the 30fps tradeoff? Maybe to some. If that’s actually the reason (which it’s probably not)

But 99% of the players won’t care that you can stack sandwiches
Whether the 30fps trade-off is worth it will be different for different people. Some thought it was worth it in Driveclub even though it's a racing game.
It is what it is.

As for their reason: apparently the engine keeps track of where every NPC is, across the whole world, at all times, and gives everyone a plan, a sort of AI, and every object you pick up is a real 3D model you can place anywhere in the world, with physics and a world position. They do balance it by reducing the update rate the further things are from the player, but I don't doubt for a second that it uses up system resources.
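The distance-based throttling mentioned there is basically LOD for simulation ticks. A toy illustration of the idea (Python, invented thresholds, not Creation Engine code):

# Toy illustration of reducing AI update rate with distance from the player.
# Not Creation Engine code; just the general scheduling idea being described.
from dataclasses import dataclass

@dataclass
class NPC:
    name: str
    distance_m: float
    frames_since_update: int = 0

def update_interval(distance_m):
    # Nearby NPCs tick every frame; distant ones tick progressively less often.
    if distance_m < 50:    return 1    # every frame
    if distance_m < 500:   return 10   # a few times per second
    if distance_m < 5000:  return 300  # every few seconds
    return 3600                        # roughly once a minute: just advance their "plan"

def tick(npcs):
    updated = []
    for npc in npcs:
        npc.frames_since_update += 1
        if npc.frames_since_update >= update_interval(npc.distance_m):
            npc.frames_since_update = 0
            updated.append(npc.name)   # the full AI/physics update would run here
    return updated

if __name__ == "__main__":
    world = [NPC("guard", 10), NPC("merchant", 200), NPC("miner", 3000), NPC("colonist", 40000)]
    print(tick(world))   # only the nearby guard gets a full update on this frame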




There is lots of cool info there about their way of making open-world games. Boiling it down to stacking sandwiches is absurd tbh. I don't know any game that does all the things Skyrim does, and I fully expect Starfield to do as much or more. As Todd says, their way of doing things makes it more like a simulation, where the AI, NPC, quest, object, physics, weather and world systems are always running. In most other games those systems are shut down outside of a small, masterfully directed bubble. Different approaches.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Just to be clear in advance: this comment isn't meant to be down on the game or on DF, but they didn't actually set the record straight. They hypothesized that the game is CPU bound, while showing Star Citizen at 17fps and claiming it was CPU bound based on their monitoring numbers, making no allowance for the equally likely possibility that both games are bound on moving CPU-side data into GPU VRAM, whether by memory bandwidth or even by PCIe bandwidth.

Their hypothesis is a little flaky given that similar game-data mechanics worked in Bethesda's 360 and PS3 games, and those consoles had much smaller memory bandwidths, 32x less unified RAM, and much weaker CPUs in IPC, with lower general-purpose core and thread counts, lack of SMT, etc.

Going by the information that components of The Forge engine have been added to the game's engine, it is more likely that the underlying problem is that they have chosen to continue with a deferred rendering solution to provide the GI lighting they mentioned, which typically results in 30fps games: although deferred rendering makes it easier to utilise more of the GPU's rendering potential, it adds latency for every state change of each deferred buffer being rendered, and further latency from gathering and compositing those buffers before a front-buffer frame is actually presented to the user. Triple buffering is typically considered with deferred rendering, also to help with frame-pacing, but that in turn adds input latency.
Yeah, it's odd that they went on and on about the game being CPU bound while their own counts suggested the resolution drops to 1296p from native 4K. Clearly it's GPU bound too.

A better explanation would be that they have finally hit the big leagues and have an amazing rendering engine that makes the game look stunning and on par with the best games out there.
 

OCASM

Banned
@OCASM If walking down the heart of the large cities, with all the NPCs doing their thing and with overhead for combat etc., is enough to cause the cap, that's that. It doesn't matter what is happening anywhere else.
If that's the case and not a poorly architected engine, as has been the case with previous Bethesda games.

I think you need to be hired by Bethesda; your ability to come to conclusions on what frame rates can be achieved based on a 45-minute video is second to none.

Personally I think you and a few others are just arguing in bad faith. I doubt you'll get anywhere near a constant 60fps on an equivalent PC during heavy scenes. Guess we'll find out soon enough.
Oh yes, drawing conclusions from what the developers have shown and their long track record is arguing in bad faith...
 

clarky

Gold Member
If that's the case and not a poorly architected engine, as has been the case with previous Bethesda games.


Oh yes, drawing conclusions from what the developers have shown and their long track record is arguing in bad faith...
We've seen a small amount of gameplay and an overview of some systems and locations, so not much. Yet you're calling the developers (of some of the GOATs, no less) incompetent. Like I said, you clearly know better than them; get yourself into game dev, I'm sure it's dead simple.
 

adamsapple

Or is it just one of Phil's balls in my throat?
if modders manage to get the game to 60fps, then Bethesda fucked up. and until the game is out all of what DF said is pure speculation

Todd's comments on this are pretty clear, I think. He said the game was above 30 FPS most of the time and even hit 60 FPS at times.

But if that's the case, a wildly fluctuating frame rate between 30 and 60 would be an absolutely terrible way to play, and a 30fps lock is the vastly better option.

At best, if Starfield has the same kind of mod support Fallout 4 and Skyrim had on consoles, we'll get a V-Sync unlock mod like those and we'll be able to see for ourselves how inconsistent it is.
 
Last edited:

clarky

Gold Member
Todd's comments on this are pretty clear, I think. He said the game was above 30 FPS most of the time and even hit 60 FPS at times.

But if that's the case, a wildly fluctuating frame rate between 30 and 60 would be an absolutely terrible way to play, and a 30fps lock is the vastly better option.

At best, if Starfield has the same kind of mod support Fallout 4 and Skyrim had on consoles, we'll get a V-Sync unlock mod like those and we'll be able to see for ourselves how inconsistent it is.
But I'm sure Xbox owners with VRR TVs would welcome the option to unlock the frame rate (myself included). They will probably add it at some point after launch.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
But I'm sure Xbox owners with VRR TVs would welcome the option to unlock the frame rate (myself included). They will probably add it at some point after launch.

Well, it would still need to reach a certain level of performance for VRR to be viable.
 

MarkMe2525

Member
Yes but it's not doing every single thing at once. When you're in space the game isn't running hundreds of NPC routines. The moment to moment gameplay they've shown so far could run at 60fps on a Series X. No joke, they should ask for deeper help from id Software beyond post-processing effects. Those guys managed to put Doom Eternal on the Switch.
I think the point is you are going to find yourself in situations where you have a bunch of dynamic effects or interactions happening at once. If the game's frame rate tanks to 40 frames per second every time that happens, it's not a good experience.

Like some have already said, I don't think the cap is there because they are pushing these incredible visuals (because they are not). It seems like it must be there for CPU headroom. The global illumination looks heavy as well (I could be wrong here as I'm no expert).

For as awesome as it is, TotK suffers from situations like this in many places. Of course it gets a pass from most (including me) because it's running on a 2017 mobile chip and, let's be honest, because it's Nintendo. I believe the gaming public would not be so lenient towards Microsoft, and they know that as well.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I believe in 120Hz mode it's 20Hz to 120Hz. So like I said, there's no reason not to include a performance option. It will come, I'm sure.


That's true, I forgot Xbox has system-level LFC support, and if you have a 120Hz display it works universally for every game.
 

clarky

Gold Member
That's true, I forgot Xbox has system-level LFC support, and if you have a 120Hz display it works universally for every game.
I'd hazard a guess that they don't want people's first impression of the game to be a stuttery mess if they don't have a VRR-capable display. Which, let's face it, is most of the non-nerds.
 

01011001

Banned
But if that's the case, a wildly fluctuating frame rate between 30 and 60 would be an absolutely terrible way to play, and a 30fps lock is the vastly better option.

that might have been the case on PS4 and older systems, but it's simply not the case on current gen with VRR

and even then, options are always better than no options.
all the BioShock games on 360 offered a setting to unlock the framerate, so people who wanted to could lower the input lag dramatically if they could deal with the unstable performance and tearing. and of course when they became backwards compatible, that opened the door for a built-in 60fps mode.

not offering an unlocked mode either means they are just full of themselves or that their engine is still pure dogshit and it would fuck with the game logic.
 
Last edited:

01011001

Banned
I'd hazard a guess that they don't want people's first impression of the game to be a stuttery mess if they don't have a VRR-capable display. Which, let's face it, is most of the non-nerds.

you can make it so the game only shows the performance mode option if you're set to 120hz, so this makes no sense
 
I love NMS, and I realize you weren't addressing me specifically, but NMS's on-foot combat is crap, the enemy variety is crap, the mission structure and objectives are crap, copy-and-paste assets galore. I put about 150 hours into the game just because I really enjoyed the discovery aspect of it, but as a whole it left me wishing there was more to it.
No, I wasn't addressing you specifically, but I enjoy polite conversation =)
I agree with all of your points; it doesn't change the fact that at a base level they are doing the same things. I'm still in for day one, but praying it exceeds my expectations.
 

OCASM

Banned
We've seen a small amount of gameplay and an overview of some systems and locations, so not much. Yet you're calling the developers (of some of the GOATs, no less) incompetent. Like I said, you clearly know better than them; get yourself into game dev, I'm sure it's dead simple.
They're incompetent at getting the most out of consoles. They've shown this with every game they've released for at least the past decade. It's a well known fact.

Fair point. I'll go with the engine being dogshit then.
See?

I think the point is you are going to find yourself in situations where you have a bunch of dynamic effects or interactions happening at once. If the game's frame rate tanks to 40 frames per second every time that happens, it's not a good experience.

Like some have already said, I don't think the cap is there because they are pushing these incredible visuals (because they are not). It seems like it must be there for CPU headroom. The global illumination looks heavy as well (I could be wrong here as I'm no expert).

For as awesome as it is, TotK suffers from situations like this in many places. Of course it gets a pass from most (including me) because it's running on a 2017 mobile chip and, let's be honest, because it's Nintendo. I believe the gaming public would not be so lenient towards Microsoft, and they know that as well.
I'm sure performance will vary in-game. That happens with most systemic games. My point is that Bethesda isn't the best at optimization and other devs would do a better job at getting it to run at a better framerate. Is this really a controversial opinion?
 

Apocryphon

Member
Everybody should have expected this. There's zero evidence to suggest it was going to be anything other than the usual unoptimized, wonky Bethesda jank.

The game looks good enough graphically and excellent artistically. 30fps lock is disappointing but hopefully it feels responsive enough. The game itself will undoubtedly be fun.

Some of the facial animations are giving me Andromeda flashbacks though 😂
 

Bernardougf

Gold Member
I know it's only been 21 years and 4 console generations, but you may not have noticed that Bethesda Game Studios have been the only developers that can even create the games they do.

So does From Software with their Souls games... and they too are a shit optimization studio and a brilliant gameplay/idea/art one. Just look at Bluepoint's Demon's Souls - 60fps, a stellar presentation for its time - versus what From has released in terms of performance since forever.

So saying that another studio could take Starfield and make it run at 60fps is not some absurd, out-of-this-world thing...

People need to stop being so defensive... it's a game... it's not your mama
 
Last edited:
So does From Software with their Souls games... and they too are a shit optimization studio and a brilliant gameplay/idea/art one. Just look at Bluepoint's Demon's Souls - 60fps, a stellar presentation for its time - versus what From has released in terms of performance since forever.

So saying that another studio could take Starfield and make it run at 60fps is not some absurd, out-of-this-world thing...

Did you just bring up Bluepoint remaking a PS3 game to explain why another studio could've created Starfield and run it at 60fps? Yes, you did.

And with that, I'm out of this shit thread.
 

Bojji

Member
That's true, I forgot Xbox has system-level LFC support, and if you have a 120Hz display it works universally for every game.

A game running between 30 and 60fps would look quite good on Series X with 120Hz output. The frame tripling and doubling of LFC would do good work here.

They should give players OPTIONS! Same goes for 40FPS mode.
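For anyone wondering what the doubling and tripling actually work out to, a quick sketch (Python; the VRR window values are generic examples, not Xbox-specific numbers):

# Quick look at LFC-style frame repetition on a 120Hz output.
# The VRR window values below are generic examples, not Xbox-specific numbers.

def repeats_for(fps, vrr_min=48, vrr_max=120):
    # Smallest whole-number repeat that lands the scanout rate inside the VRR window.
    n = 1
    while fps * n < vrr_min:
        n += 1
    return n if fps * n <= vrr_max else None

if __name__ == "__main__":
    for fps in (20, 30, 40, 45, 55, 60):
        n = repeats_for(fps)
        print(f"{fps:>3} fps -> show each frame {n}x -> panel refreshes at {fps * n} Hz")
    # On a fixed 120Hz output (no VRR), 40fps also divides evenly - each frame held for
    # exactly 3 refreshes - which is why 40fps modes pair so nicely with 120Hz displays.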
 

Bernardougf

Gold Member
Did you just bring up Bluepoint remaking a PS3 game to explain why another studio could've created Starfield and run it at 60fps? Yes, you did.

And with that, I'm out of this shit thread.

Yes I did... and what you said did nothing to counter my point. Bluepoint REMADE a Souls game from scratch, for next-gen only, and did a better job in performance, with great graphics, than all the games From has created (performance/graphics speaking only)... so good riddance to you, my friend, with your shit non-counter-argument
 
Last edited:
One of the things that Starfield will probably bring is that people will be more likely to accept delays, or even seeing reveals in a not-completely-ready state, because they now know how much a game can change within a year. Actually, the reveal last year was essentially the traditional Bethesda launch state, to be honest - at least it felt that way. But after one year, the game looks much more polished than previous Bethesda launches.
 

Fredrik

Member
Is there any chance that they’ll show it on PC before launch? I don’t like the Xbox focus tbh. Makes me nervous.
 
Yes I did... and what you said did nothing to counter my point. Bluepoint REMADE a Souls game from scratch, for next-gen only, and did a better job in performance, with great graphics, than all the games From has created (performance/graphics speaking only)... so good riddance to you, my friend, with your shit non-counter-argument

His point was that no one has made the type of games BGS makes. Your counter was that a developer was able to remake a game another developer had already made. In other words, it's nonsense and doesn't address what he actually said.

Not to mention they remade a PS3 game on the PS5. I'd hope it would look and run better than it did on PS3. But either way, kudos for unknowingly proving his point.
 

Buggy Loop

Member
Seriously? Are you that fucking dishonest? Initially, I could literally turn around in Cyberpunk and NPCs would disappear and appear out of thin air (I think their range was increased in a patch) and you're telling me this doesn't impact immersion? Yes, I would take 30fps over NPCs vanishing almost in front of my face. Assuming complex gameplay systems and everything else holds back 60fps, then I'd 100% take that over it. Thankfully, I game on PC so I don't have to worry about that but when we tell you something as egregious as NPCs disappearing happens and you just go "so what?", it tells us exactly where you stand.


Certain people with a clear history of being fans of a certain platform become denser than a fucking black hole when it comes to Starfield, now that it's been pulled from said platform...
 

OCASM

Banned
His point was that no one has made the type of games BGS makes. Your counter was that a developer was able to remake a game another developer had already made. In other words, it's nonsense and doesn't address what he actually said.

Not to mention they remade a PS3 game on the PS5. I'd hope it would look and run better than it did on PS3. But either way, kudos for unknowingly proving his point.
And how does that refute the fact that their engines have sub-par performance compared to those of other studios?
 

DaGwaphics

Member
It’s common sense. They always demo these games on a PC.

Three years into a gen at an Xbox showcase I'd assume it was running on XSX. Especially with the aggressive upscale.

If they were going to fake it on a PC they would use a 4080/4090 and would not need the upscale at 30fps. They apparently also gave hands-on access to journalists on the XSX, an odd thing to do if you were going to bull-shot the presentation, as that would be noticed.
 

Bernardougf

Gold Member
And how does that refute the fact that their engines have sub-par performance compared to those of other studios?

People are being deliberately obtuse or just plain stupid... they really think my point was that Bluepoint remade a PS3 game better... the stupidity clouds their brains to the fact that I'm saying Bluepoint's next-gen engine produced a better-performing and better-looking Souls game than ALL the Souls games From has ever created... and that From's engine is ass, as proven in Elden Ring. Therefore one studio can get better performance than the original studio that creates the same type of game... and worse, they are defending Bethesda's historically badly performing engine and doubling down that NO ONE could make Starfield perform better... no studio in the world could take the same work from Bethesda and remake the same game with better performance... Bethesda is that good... we just didn't know it because they've been hiding all their optimization skills until now

Just to reinforce my point, because we have to keep up with some of the slower-brained fanboys:

If given the chance TODAY, Bluepoint could make Elden Ring in their engine and it would have much better performance and graphical assets than ER. They've already proven they can do it, and other studios could also take ER and make it a better-performing game. From excels in gameplay, concepts and other things... they SUCK at optimization.

So! Other studios more capable at engines/optimization could PROBABLY also take Starfield and all its content and remake it in a better engine, giving better all-around performance... given the poor optimization history of Bethesda's engine.
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
I'm still running a 1070. Most consumers except the hard-core pc enthusiasts don't upgrade their pc that often.
Sure, but 2070S-2080S is a performance tier from 5 years ago. A lot more than 10% of PC gamers have rigs equal to or faster than the consoles.
 