
Devs forgot how to do proper 30fps.

Not at all. The game director's intended creative vision is the point, so adding in a soap-opera effect with more frames can't automatically be argued as better for all gamers, can it?

Do you prefer the original six SW films' 24fps look - as Lucas intended - or Disney's 60fps TV soap opera Mandalorian look? As it is SW, I prefer what Lucas intended and find all the space scenes in The Mandalorian to look like garbage, despite the higher fps.
Exactly.
 

Pimpbaa

Member
It matters to me too on a game by game basis. Spiderman, Ratchet, and Forbidden West look A LOT BETTER at 30 fps, and are all extremely responsive in fidelity mode. I spend the majority of time playing those at 30.

It's nice to have the choice on console. The brain does adapt to 30 despite 60 being smoother. Forza Horizon 5 at 30 fps looks like a true next gen game while at 60 it just looks average. It's nice to at least be able to play it looking the way it did in the initial trailers even though I have to sacrifice some responsiveness. Not everyone can afford a 3080 rtx

Those games (PS5 titles) look more detailed in motion at 60fps. They only look better at 30fps if you are standing still; once you start moving, the perceived detail goes way down and you lose all that you would gain from 4K. That is, unless you are using black frame insertion or a CRT TV (they preserve motion detail better than higher framerates do).
 

Javthusiast

Banned
Still have zero problems with 30fps games if it is stable. If a powerhouse like ND decides to go for bonkers graphics and animation at the cost of it being 30 fps, so be it. I will never whine about it. I grew up with most games being that.
 

TonyK

Member
Wait, so forbidden west 40fps mode is not exactly the same as fidelity30 graphics wise?
What are they doing
No, the 40fps mode is exactly the 60fps version but at a higher resolution. I was waiting for the 40fps mode because I thought it would be the same as Fidelity mode but at 40fps. I don't know exactly what Fidelity mode adds; the most evident thing is more advanced AO, but it looks better, especially during conversations. In Performance and Balanced modes it looks like a game, you understand me, like PS4, but in Fidelity it looks like CG: characters are better integrated into the scenery, character AO is clearly improved, etc.
 

rofif

Can’t Git Gud
No, the 40fps mode is exactly the 60fps version but at a higher resolution. I was waiting for the 40fps mode because I thought it would be the same as Fidelity mode but at 40fps. I don't know exactly what Fidelity mode adds; the most evident thing is more advanced AO, but it looks better, especially during conversations. In Performance and Balanced modes it looks like a game, you understand me, like PS4, but in Fidelity it looks like CG: characters are better integrated into the scenery, character AO is clearly improved, etc.
What a mess. They should label their stuff more clearly
 

Fafalada

Fafracer forever
BB wins since it sacrificed clean frame pacing in exchange for very fast input response
I might agree if BB was x-platform like the other games. But for a PS4* exclusive, not dealing with this is just poor form.

*this is one of those examples where leveraging console specific hw access(not all consoles anymore, sadly) actually allows superior results. If you're prepared to put in the work of course.
But instead they recycled their cross platform approach.
 

Pimpbaa

Member
No, the 40fps mode is exactly the 60fps version but at a higher resolution. I was waiting for the 40fps mode because I thought it would be the same as Fidelity mode but at 40fps. I don't know exactly what Fidelity mode adds; the most evident thing is more advanced AO, but it looks better, especially during conversations. In Performance and Balanced modes it looks like a game, you understand me, like PS4, but in Fidelity it looks like CG: characters are better integrated into the scenery, character AO is clearly improved, etc.

The difference you are suggesting is not nearly that big between Performance mode and Fidelity. Better AO and resolution don't magically change the game into something more CG-looking, nor does the resolution loss in Performance mode make it look anywhere near a PS4 game. I played half the game in Fidelity mode (due to Performance mode foliage looking like ass until recently) and had to stop due to 30fps being so fucking terrible after playing nearly every other PS5 game I've got at 60fps. Once they fixed Performance mode, I had vastly renewed interest in the game, not only because it plays better but because the visual loss is so minimal, particularly during actual gameplay. I mean, yeah, I did notice a few effects here and there that looked a bit worse, but it's better than all the detail in the game going to shit as soon as you move due to the blur you get at 30fps on any modern sample-and-hold display.
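For context on the sample-and-hold point: the smear scales with frame time, since each frame is held on screen for the full interval. A back-of-envelope sketch in Python - the panning speed is an assumed figure for illustration, not a measurement:

```python
# Back-of-envelope: motion smear on a sample-and-hold display.
# Each frame persists for the whole frame interval, so a moving object
# is smeared across the distance it travels during that time.

def smear_pixels(pan_speed_px_per_s: float, fps: float) -> float:
    """Pixels of perceived smear for a given pan speed and frame rate."""
    return pan_speed_px_per_s / fps

# Assumed example: a camera pan sweeping 1800 px per second across a 4K screen.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> ~{smear_pixels(1800, fps):.0f} px of smear per frame")

# ~60 px of smear at 30 fps vs ~30 px at 60 fps: this is why the extra
# resolution of a 30 fps mode can vanish as soon as the camera starts moving.
```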
 

skneogaf

Member
I am constantly surprised I can play bloodborne whereas I can't manage any other games at 30fps, at least not any that come to mind.
 
These new consoles should have locked 60 regardless of genre. 30fps is headache-inducing coming from PC gaming at north of 100fps.

1440P / 60FPS across the board. No ifs, buts, or maybes.

JUST FECKING DO IT
 
Last edited:

PaintTinJr

Member
It's because consoles don't have the power to run games with max settings and decent frame rates.
What is the excuse for PC not being able to do deferred rendering for VR then, and having to fall back to FR (forward rendering) or the newer Forward+?

Performance of PCs and consoles is measured by the audience size you can sell to when a game launches, and is finite in that moment. Targeting 4K60 on the UE5 demos wasn't even possible on the niche highest PC consumer hardware when it was first available in public beta, if you remember, so there is always some arbitrarily chosen line in the sand where pushing visuals rules out a 30fps displayed at 60fps option.

Unlocked frame-rate on most PC games is clearly a business choice for forward compatibility to maximise the audience they can sell to over the long term without having to patch, not a desired artistic choice. And IIRC wasn't there an issue with one of the Dark Souls PC ports being artistically locked to 30fps originally on PC?
 
Last edited:

winjer

Gold Member
What is the excuse for PC not being able to do deferred rendering for VR then, and having to fall back to FR (forward rendering) or the newer Forward+?

Performance of PCs and consoles is measured by the audience size you can sell to when a game launches, and is finite in that moment. Targeting 4K60 on the UE5 demos wasn't even possible on the niche highest PC consumer hardware when it was first available in public beta, if you remember, so there is always some arbitrarily chosen line in the sand where pushing visuals rules out a 30fps displayed at 60fps option.

Unlocked frame-rate on most PC games is clearly a business choice for forward compatibility to maximise the audience they can sell to over the long term without having to patch, not a desired artistic choice. And IIRC wasn't there an issue with one of the Dark Souls PC ports being artistically locked to 30fps originally on PC?

MSAA is the biggest reason. But you can also tune individual materials to use less complex reflection environments, and get some gains there.
Temporal AA also relies on a velocity buffer. With MSAA you don't need all the draw calls related to that, which saves on some CPU in areas that are currently bottlenecked to a single thread in older graphics APIs.
In 2D the velocity buffer is nice to have for motion blur so it isn't a big burden, but in VR motion blur doesn't really work without some kind of super low latency eye tracking, so no one really uses it other than for effects when teleport dashing (can be done well with depth alone and doesn't need velocity buffer).
Many 2D games are moving to forward too. GPUs used to be really bad at branching but now are pretty efficient at it. By doing everything at once instead of in two passes you save on memory bandwidth and have more cache coherency. With an early z pass you don't get extra overdraw from it.

On PC, games have unlocked frame rates because the hardware has the power to use them. And PC gamers do appreciate the extra smoothness and responsiveness of higher frame rates.
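To put rough numbers on the forward-vs-deferred bandwidth point above, here is a minimal sketch. The resolution, buffer formats and pass layout are assumptions chosen for illustration; real engines pack their G-buffers very differently:

```python
# Rough per-frame render-target traffic: forward rendering with an early-Z
# prepass vs. a simple deferred G-buffer layout. All formats are assumptions
# for illustration only; real engines pack these targets very differently.

WIDTH, HEIGHT = 2560, 1440
PIXELS = WIDTH * HEIGHT

def mb(num_bytes: int) -> float:
    return num_bytes / (1024 * 1024)

# Forward (+ early-Z): one depth/stencil target and one HDR colour target.
forward_bytes = PIXELS * (4 + 8)          # D24S8 + RGBA16F

# Deferred: write five 32-bit G-buffer targets (depth, albedo, normals,
# material, velocity), then a lighting pass reads them all back and writes HDR.
gbuffer_write  = PIXELS * (4 * 5)
lighting_pass  = PIXELS * (4 * 5 + 8)     # G-buffer reads + HDR colour write
deferred_bytes = gbuffer_write + lighting_pass

print(f"forward : ~{mb(forward_bytes):.0f} MB of target traffic per frame")
print(f"deferred: ~{mb(deferred_bytes):.0f} MB of target traffic per frame")

# Roughly 4x the raw target traffic for deferred, before MSAA (which would
# multiply the G-buffer cost) is even considered - hence the bandwidth and
# cache-coherency argument for forward in VR.
```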
 

PaintTinJr

Member
MSAA is the biggest reason. But you can also tune individual materials to use less complex reflection environments, and get some gains there.
Temporal AA also relies on a velocity buffer. With MSAA you don't need all the draw calls related to that, which saves on some CPU in areas that are currently bottlenecked to a single thread in older graphics APIs.
In 2D the velocity buffer is nice to have for motion blur so it isn't a big burden, but in VR motion blur doesn't really work without some kind of super low latency eye tracking, so no one really uses it other than for effects when teleport dashing (can be done well with depth alone and doesn't need velocity buffer).
Many 2D games are moving to forward too. GPUs used to be really bad at branching but now are pretty efficient at it. By doing everything at once instead of in two passes you save on memory bandwidth and have more cache coherency. With an early z pass you don't get extra overdraw from it.

On PC, games have unlocked frame rates because the hardware has the power to use them. And PC gamers do appreciate the extra smoothness and responsiveness of higher frame rates.
That is wrong. They do it because caches aren't fast enough to complete a deferred workload at 75+ fps, irrespective of the hardware used with modern-day rendering, and that's with people like Carmack working on the problem.

Forward rendering is less complex and so is faster, but far less impressive; it is a trade-off. And the caches on the highest-end PC are barely any faster than the new consoles', just bigger, so both can do 240fps, etc., just with different cutbacks.
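A quick illustration of the shrinking time-slice argument - the pass count is an assumption picked purely to make the arithmetic concrete:

```python
# The shrinking time-slice argument in numbers: the per-frame budget, and with
# it the room to overlap memory latency with other work, falls quickly as the
# frame-rate target rises. The pass count is an assumption for illustration.

PASSES = 8  # assumed number of dependent full-screen passes per frame

def frame_budget_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 75, 120, 240):
    budget = frame_budget_ms(fps)
    print(f"{fps:>3} fps: {budget:5.1f} ms per frame, ~{budget / PASSES:4.2f} ms per pass")

# ~4.2 ms per pass at 30 fps shrinks to ~1.7 ms at 75 fps and ~0.5 ms at
# 240 fps - far less slack in which to hide cache and memory stalls.
```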
 
Last edited:

PaintTinJr

Member
I might agree if BB was x-platform like the other games. But for a PS4* exclusive, not dealing with this is just poor form.

*this is one of those examples where leveraging console specific hw access(not all consoles anymore, sadly) actually allows superior results. If you're prepared to put in the work of course.
But instead they recycled their cross platform approach.
Even though it is exclusive, what would be the best strategy for them in this situation in your opinion?

I personally think the generic triple-buffer option - which adds a frame of latency but makes it easier to maintain GPU rendering efficiency - puts the fewest restrictions on artists while they are focused on making a game, compared to, say, how strict arcade cabinet development would have been, where everything is designed from an engineering angle, accounting for every polygon, texture lookup or fragment shader write before the artist gets to work. But in an open-world game - or a wide linear one - to hit those frame timings with minimal latency you are going to need to err heavily on the side of caution and scale back, or end up with a BB situation IMO.
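As a rough illustration of what that extra queued frame costs, here is a deliberately simplified input-to-display model (it ignores display processing and scan-out, so real numbers are higher):

```python
# Deliberately simplified input-to-display model: sample input, spend one
# frame building the image, then wait behind however many frames the swap
# chain has queued. Ignores display processing and scan-out.

def latency_ms(fps: float, queued_frames: int) -> float:
    frame = 1000.0 / fps
    return frame * (1 + queued_frames)  # 1 frame to build + queued frames ahead of it

for fps in (30, 60):
    for label, queued in (("double buffered", 1), ("triple buffered", 2)):
        print(f"{fps} fps, {label}: ~{latency_ms(fps, queued):.0f} ms input-to-display")

# At 30 fps the generic triple-buffered path lands around ~100 ms vs ~67 ms
# double buffered: each extra queued frame costs a full 33 ms at that rate.
```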
 

winjer

Gold Member
That is wrong. They do it because caches aren't fast enough to complete a deferred workload at 75+ fps, irrespective of the hardware used with modern-day rendering, and that's with people like Carmack working on the problem.

Forward rendering is less complex and so is faster, but far less impressive; it is a trade-off. And the caches on the highest-end PC are barely any faster than the new consoles', just bigger, so both can do 240fps, etc., just with different cutbacks.

You do realize that there are GPUs on the PC that have L3 caches as big as 128MB. Much more than consoles.
And all modern GPUs on the PC are tile based renderers.
 
Last edited:

PaintTinJr

Member
You do realize that there are GPUs on the PC that have L3 caches as big as 128MB. Much more than consoles.
And all modern GPUs on the PC are tile based renderers.
Size isn't the limiting issue. Recursive access - without enough time for latency hiding - is. There's an infographic somewhere nicely showing how improvements in memory performance compare to improvements in processing performance over the decades, and the memory graph is a shallow, near-linear line, showing very little gain.

You can still brute force older games, but you can't do contemporary spec deferred rendering in VR because even high end PC lacks the performance to provide low latency high frame-rate.
 

winjer

Gold Member
Size isn't the limiting issue. Recursive access - without enough time for latency hiding - is. There's an infographic somewhere nicely showing how improvements in memory performance compare to improvements in processing performance over the decades, and the memory graph is a shallow, near-linear line, showing very little gain.

You can still brute force older games, but you can't do contemporary spec deferred rendering in VR because even high end PC lacks the performance to provide low latency high frame-rate.

But what you want is a tile based deferred rendering GPU architecture. Similar to what Apple and PowerVR have.

But if PCs can't do it, then consoles also can't. So what is the point of this conversation?
 

yamaci17

Member
But what you want is a tile based deferred rendering GPU architecture. Similar to what Apple and PowerVR have.

But if PCs can't do it, then consoles also can't. So what is the point of this conversation?
as far as I know, devs choose forward rendering for VR in general to render everything at full, maximum resolution. deferred rendering is there to save performance, rendering stuff at sub-resolutions and mashing it back together with TAA

naturally you want a clear, clean and stable image on a VR headset. TAA would most likely induce headaches for most people (it already does for me without a VR set. TAA is a plague upon AAA gaming and deferred rendering is one of the biggest reasons why TAA became so prevalent)

just wanted to add my own thoughts
 

Shut0wen

Member
Interesting video that I think many people missed last month from DF.
It shows what I felt since getting a ps5. Some 30fps games are TERRIBLY LAGGY. More than 30fps really should be and it gives even worse impression about 30fps.
I remember comparing Demons Souls 30fps mode to Bloodborne and bloodborne felt so much much more responsive in comparison.
And I was right. DeS 30fps mode is 75ms slower than bloodborne. DF talks about this near the end of the video.



Seems that devs got lazy because they can now include 17 different modes in their games, so they forgot all about a good 30fps lock... and a good 60fps lock for that matter. It is more laggy than it should be, too.
They are most likely falling back on a slow system-level vsync cap or something similar.

Sure, Bloodborne was a little bit stuttery because of all this frame pacing, but that never bothered me and I now think it was a very fair trade-off for very fast input response.
Not every game from last gen does 30fps well though. Final Fantasy XV's fidelity mode has such extreme frame pacing issues that it is unplayable... but the 1080p30 and 1080p60 modes are perfect, so I'm not sure why 1800p30 is broken like hell.

And no. I will not "just play at performance mode man". Those who say that are ignorant fools.
All the modes are just BS. Make a game and make it properly. And only after that include downgraded performance mode or only fps cap off for vrr users.
When you play a game with good 30fps like uncharted 4 remaster, the trade off is not that bad.
I like best possible graphics and I play on 4k oled on my desk. 1440p can look like crap

Great post and glad it isn't just me. If anyone has an Xbox, play Mass Effect 1 and then play the Legendary Edition at 30fps - the Legendary Edition just drops like crazy. Tbh though, OP using FromSoftware is pointless; Ninja Blade and Dark Souls 2 Scholar edition have literally been the only 2 games where they have given a shit about frames. The maker of Demon's Souls and Elden Ring is a total hack in that area - not surprised, since he plays Dynasty Warriors. The only developer atm who locks 30fps is Capcom.
 

Shut0wen

Member
Seems like a FromSoftware-specific thread. They have always been bad at framerates.
Not FromSoftware games but Miyazaki games; the guy's a hack and doesn't give a fuck about frames. 2 games recently made by FromSoftware, Ninja Blade and Dark Souls 2, both have stable 30fps, and both are games he was never involved in. He also killed Armored Core.
 

rofif

Can’t Git Gud
Not FromSoftware games but Miyazaki games; the guy's a hack and doesn't give a fuck about frames. 2 games recently made by FromSoftware, Ninja Blade and Dark Souls 2, both have stable 30fps, and both are games he was never involved in. He also killed Armored Core.
Watch the video... Souls games are an example of good 30fps imo. Very low input lag.
 

PaintTinJr

Member
But what you want is a tile based deferred rendering GPU architecture. Similar to what Apple and PowerVR have.

...
And how exactly does that fix the cache latency bottleneck of deferred rendering for VR on PC at 75+ fps? Do you have a deferred VR example at 75+ fps on a Mac? Or is this moving the goalposts, because you can also see that PCs and consoles both lack the performance at times?
 

Roxkis_ii

Member
Yeah, I knew all things 30fps weren't the same when I compared the 30fps mode in HFW, which is really smooth, to Cyberpunk 2077, which is basically unplayable at the same framerate.
 
Last edited:

winjer

Gold Member
And how exactly does that fix the cache latency bottleneck of deferred rendering for VR on PC at 75+ fps? Do you have a deferred VR example at 75+ fps on a Mac? Or is this moving the goalposts, because you can also see that PCs and consoles both lack the performance at times?

Honestly, I don't care about VR. So I never researched anything about it. And know almost nothing about it.
What I know is about current gen GPU architectures and none does what you want, considering that all you want is lower cache latencies.
But the thing is that cache latencies are bound to increase, as cache sizes increase. So you might never get that ultra deferred thing you want, in any system.

No this is not moving goal posts. Considering that no system on the planet does what you want.
Despite that, the PC still has bigger and faster caches than consoles. So even for that VR thing you want, the PC space will provide better performance than any console.
 
Last edited:
n64 showing how to do 20fps right

 

fart town usa

Gold Member
Agree. 30fps modes on PS5 feel downright awful. Like things are in slow motion.

It's weird, doesn't feel that way on older consoles, to me at least.
 

PaintTinJr

Member
Honestly, I don't care about VR. So I never researched anything about it. And know almost nothing about it.
What I know is about current gen GPU architectures and none does what you want, considering that all you want is lower cache latencies.
But the thing is that cache latencies are bound to increase, as cache sizes increase. So you might never get that ultra deferred thing you want, in any system.
It seems you've lost the thread of what each of us has said.

I never advocated for VR to use it, but merely used it as an example to expose the obvious flaw in you saying:

"It's because consoles don't have the power to run games with max settings and decent frame rates."

Which is a 'consoles are weak but PC can have it all' stance, when in reality that is complete rubbish.

Better graphics typically need deferred passes (divide and conquer) - whether that be deferred rendering or just pre-z deferred render targets in forward rendering.

Rendering quality and complexity increase together; complexity leads to more deferred targets needed per frame, and every render target adds memory latency to hide. Rendering at a higher fps shrinks the per-frame time slice available to hide that latency, so clearly my point was to show that even your fictional highest-end PC has to fall back to lesser fidelity via less complex rendering, or target a lower frame-rate at a design level.

The highest-end PCs have the best of everything, other than unified memory - which they have to work around with PCIe bandwidth. The memory pyramid in a weak gaming PC isn't massively (e.g. 8x) inferior in latency to the memory pyramid in a high-end PC, and the high-end PC compared to a console with unified memory - with the ability to dereference between CPU and GPU access - might actually be slightly less when factoring in fixed-target hardware. So the so-called weakness of consoles being unable to do 60fps, low-latency input and complex deferred rendering is untrue, as the bottlenecks more than match PC, and it comes down completely to the quality of the software.

No this is not moving goal posts. Considering that no system on the planet does what you want.
Despite that, the PC still has bigger and faster caches than consoles. So even for that VR thing you want, the PC space will provide better performance than any console.
Well as I already quoted you:
"It's because consoles don't have the power to run games with max settings and decent frame rates."

There is no mention of Macs in your comment (just consoles against the implicit PC), and my response was:
"What is the excuse for PC not being able to do deferred rendering for VR then, and having to fallback to FR(Forward Renders) or the newer FR plus?"

Which, as I said, I'm not advocating for its use, just demonstrating that "weak" consoles and high-end PCs are equally deficient at the desired task - and obviously AA techniques aren't part of the discussion for me; they aren't a fundamental requirement of deferred rendering.
 
Last edited:

winjer

Gold Member
It seems you've lost the thread of what each of us has said.

I never advocated for VR to use it, but merely used it as an example to expose the obvious flaw in you saying:

"It's because consoles don't have the power to run games with max settings and decent frame rates."

Which is a 'consoles are weak but PC can have it all' stance, when in reality that is complete rubbish.

Better graphics typically need deferred passes (divide and conquer) - whether that be deferred rendering or just pre-z deferred render targets in forward rendering.

Rendering quality and complexity increase together; complexity leads to more deferred targets needed per frame, and every render target adds memory latency to hide. Rendering at a higher fps shrinks the per-frame time slice available to hide that latency, so clearly my point was to show that even your fictional highest-end PC has to fall back to lesser fidelity via less complex rendering, or target a lower frame-rate at a design level.

The highest-end PCs have the best of everything, other than unified memory - which they have to work around with PCIe bandwidth. The memory pyramid in a weak gaming PC isn't massively (e.g. 8x) inferior in latency to the memory pyramid in a high-end PC, and the high-end PC compared to a console with unified memory - with the ability to dereference between CPU and GPU access - might actually be slightly less when factoring in fixed-target hardware. So the so-called weakness of consoles being unable to do 60fps, low-latency input and complex deferred rendering is untrue, as the bottlenecks more than match PC, and it comes down completely to the quality of the software.


Well as I already quoted you:
"It's because consoles don't have the power to run games with max settings and decent frame rates."

There is no mention of Macs in your comment (just consoles against the implicit PC), and my response was:
"What is the excuse for PC not being able to do deferred rendering for VR then, and having to fallback to FR(Forward Renders) or the newer FR plus?"

Which, as I said, I'm not advocating for its use, just demonstrating that "weak" consoles and high-end PCs are equally deficient at the desired task - and obviously AA techniques aren't part of the discussion for me; they aren't a fundamental requirement of deferred rendering.

We all know that there are weak and powerful PCs.
But today there are a lot of GPUs in the PC space that do much more than what a console can do. And that includes any deferred engine.
If you want a strong GPU for deferred rendering, the most powerful are Apple's. But then you lose out in geometry throughput.

Consoles are significantly weaker than most modern GPUs on the PC space. Especially with modern effects like ray-tracing.
Only when you go as low as an RTX 3050 or AMD GPUs under the 6600, can you say that PC GPUs are weaker in RT.

Having a unified pool of memory has advantages and disadvantages.
Let's not forget that a DDR controller has much lower latencies than GDDR.
A Zen2 CPU on PC has a memory latency of 60-70 ns. But the Zen2 CPU on the PS5 has a latency of over 140ns.
Every time there is a cache miss, and there are a good amount because the PS5 has a very small L3 cache, it has to waste a lot of time going to system memory.
Then there is the matter of sharing memory between the CPU and GPU. And the issue of memory contention that further limits the overall memory bandwidth of the system.
Then we also have to remember that NVidia has the most advanced data compression pipeline in any GPU. So the effective memory bandwidth of a modern GTX/RTX card is even greater than what the theoretical numbers suggest.

PC does have to go through the PCIe bus to send data between the CPU and GPU.
But in tests with modern high end GPUs, the difference between Gen3 and Gen4 is negligible. In most games it is nothing.
Even in a game like the Spider-Man remaster, which is probably the game that stresses the bus the most, the difference is under 10%.
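Putting the quoted latency figures into a worst-case back-of-envelope: the 65 ns and 140 ns numbers come from the post above, while the miss count per frame is purely an assumed illustration:

```python
# Worst-case cost of the DDR-vs-GDDR latency gap described above. The 65 ns
# and 140 ns figures are taken from the post; the miss count per frame is an
# assumed number purely for illustration, and out-of-order execution plus
# prefetching would hide much of this in practice.

DDR_LATENCY_NS   = 65       # desktop Zen 2 + DDR4 (figure quoted above)
GDDR_LATENCY_NS  = 140      # console Zen 2 + shared GDDR6 (figure quoted above)
MISSES_PER_FRAME = 200_000  # assumed fully-exposed last-level misses per frame

def exposed_stall_ms(misses: int, latency_ns: float) -> float:
    return misses * latency_ns / 1e6

print(f"DDR : ~{exposed_stall_ms(MISSES_PER_FRAME, DDR_LATENCY_NS):.0f} ms of stalls per frame")
print(f"GDDR: ~{exposed_stall_ms(MISSES_PER_FRAME, GDDR_LATENCY_NS):.0f} ms of stalls per frame")

# ~13 ms vs ~28 ms worst case: the gap alone is close to a whole 60 fps frame
# budget, which is why cache hit rates matter so much more with GDDR latencies.
```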
 
Last edited:

rofif

Can’t Git Gud
by the way.
Anyone somehow checked lag on 30fps games with vrr on vs off ?
Like gta 5 or demons souls 30 fps modes. Both are locked 30 and report as 60hz vrr on the tv with vrr forced to on in ps5 settings.
I know there is no lfc running, so on vs off should really be identical
 
Last edited:

01011001

Banned
by the way.
Anyone somehow checked lag on 30fps games with vrr on vs off ?
Like gta 5 or demons souls 30 fps modes. Both are locked 30 and report as 60hz vrr on the tv with vrr forced to on in ps5 settings.
I know there is no lfc running, so on vs off should really be identical

as long as vsync is on, nothing changes about the latency.
it's the vsync that's the issue.

so these 30fps games would need special modes without vsync to get lower latency with VRR
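A tiny sketch of why the vsync wait, not VRR itself, is the latency culprit for a locked 30fps game on a 60Hz output - a simplified model, ignoring scan-out and TV processing:

```python
# With vsync on, a finished frame still waits for the next fixed refresh slot
# before it is shown; with vsync off and VRR driving the panel, that wait goes
# away. Simplified model on a 60 Hz output, ignoring scan-out and TV processing.

REFRESH_HZ = 60
SLOT_MS = 1000.0 / REFRESH_HZ  # ~16.7 ms between vsync opportunities

def vsync_wait_ms(finish_offset_ms: float) -> float:
    """Extra wait if the frame finishes finish_offset_ms after a refresh tick."""
    return (SLOT_MS - finish_offset_ms % SLOT_MS) % SLOT_MS

for offset in (1.0, 8.0, 15.0):
    print(f"frame ready {offset:>4.1f} ms after a tick -> waits {vsync_wait_ms(offset):.1f} ms more")

# Those extra waits (up to a full refresh interval) are what a no-vsync VRR
# mode removes - VRR with vsync still on changes nothing, as the post says.
```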
 

Alexios

Cores, shaders and BIOS oh my!
Toukiden 1 on PC is locked 30 yet somehow feels very nice and responsive, and I don't even like the game as it's a poor man's Monster Hunter, just saying, it feels solid. Of course it'd still be better in 60fps, that's the minimum devs should be striving for (real, not Nvidia's new frame generation crap).
 
Last edited: