
Why Ratchet & Clank: Rift Apart’s 40fps Fidelity Mode Is A Big Deal For Consoles

Kagey K

Banned
Proper use of VRR is much better than a locked 40; I wish more devs would take advantage of it. The console should just auto-unlock the frame rate if it detects a VRR TV.
 


SlimySnake

Flashless at the Golden Globes
Here is something I don't understand. They are able to do a locked 40fps at native 4K with some drops to 1800p. That's roughly 8.3 million pixels, with the low end being about 5.8 million pixels.

But then for the 60fps mode, they not only drop the resolution to 1440p (about 3.7 million pixels), with the low end being 1080p or 2.1 million pixels, but they also have to downgrade the visuals: reducing the crowds, the skybox detail, the lighting, and other VFX. Why? Typically, halving the resolution frees up enough of the GPU to double the framerate at the same visual quality. For some reason, cutting the resolution by half or even two thirds is not enough here to maintain the same visual quality.

The GPU can handle it at 40fps and native 4K, so clearly that's not the bottleneck. Is it the memory bandwidth? Or the CPU? I can attribute the downgrade in crowds and flying ships to the CPU, but the other lighting downgrades? Insomniac just needs to release a 1440p fidelity mode so we can see where the bottleneck is.
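The pixel counts in the post above are easy to check. A quick sketch (pure resolution arithmetic, assuming 16:9 at every step; the per-second totals are my own extrapolation, not measurements from the game):

```python
# Resolution arithmetic for the pixel counts quoted above.
# All figures are derived from standard 16:9 resolutions, not captured data.
def pixels(w, h):
    return w * h

fidelity_hi = pixels(3840, 2160)   # native 4K
fidelity_lo = pixels(3200, 1800)   # 1800p dips
perf_hi     = pixels(2560, 1440)   # 1440p
perf_lo     = pixels(1920, 1080)   # 1080p

print(f"4K:    {fidelity_hi / 1e6:.1f} MP")   # ~8.3 MP
print(f"1800p: {fidelity_lo / 1e6:.1f} MP")   # ~5.8 MP
print(f"1440p: {perf_hi / 1e6:.1f} MP")       # ~3.7 MP
print(f"1080p: {perf_lo / 1e6:.1f} MP")       # ~2.1 MP

# Naive throughput comparison: pixels per second each mode asks of the GPU.
print(f"Fidelity @ 40fps: {fidelity_hi * 40 / 1e6:.0f} MP/s")  # ~332 MP/s
print(f"Perf     @ 60fps: {perf_hi * 60 / 1e6:.0f} MP/s")      # ~221 MP/s
```

Notably, on this naive math the 60fps mode pushes *fewer* pixels per second than the 40fps mode, which is exactly why the extra visual downgrades seem puzzling.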
 

clintar

Member
So based on those frametime graphs wouldn't this even be able to hit 40-45fps on a VRR display if Sony ever implements that on PS5?
 
Awesome, the three people who have a 120Hz display are going to use it.

If you really want 120Hz, PC is better. The point of high refresh rates is faster response, so KB&M is a natural fit for it.
 
Thinking the same thing. 24fps would mean ~42ms frame times, but would allow for more graphical bells and whistles. It would be interesting to see a dev make a game at that framerate and see how the input lag affects the experience.
Here is something I don't understand. They are able to do a locked 40fps at native 4K with some drops to 1800p. That's roughly 8.3 million pixels, with the low end being about 5.8 million pixels.

But then for the 60fps mode, they not only drop the resolution to 1440p (about 3.7 million pixels), with the low end being 1080p or 2.1 million pixels, but they also have to downgrade the visuals: reducing the crowds, the skybox detail, the lighting, and other VFX. Why? Typically, halving the resolution frees up enough of the GPU to double the framerate at the same visual quality. For some reason, cutting the resolution by half or even two thirds is not enough here to maintain the same visual quality.

The GPU can handle it at 40fps and native 4K, so clearly that's not the bottleneck. Is it the memory bandwidth? Or the CPU? I can attribute the downgrade in crowds and flying ships to the CPU, but the other lighting downgrades? Insomniac just needs to release a 1440p fidelity mode so we can see where the bottleneck is.
It's a number of things. First of all, the PS5's bottleneck is its bandwidth, so it could well be a case where the system has more breathing room at 30fps. Second, Insomniac's engine has been focused on 30fps for a long time, to the point where the 60fps mode in Ratchet wasn't even on the disc unless you forced it in the PS5's menu. Ratchet on PS5 was a 30fps game first.

We take locked framerates in multiple modes for granted these days, but it's a ton of work. Think back to the framerates we got in the 5th gen.
 

SlimySnake

Flashless at the Golden Globes
It's a number of things. First of all, the PS5's bottleneck is its bandwidth, so it could well be a case where the system has more breathing room at 30fps. Second, Insomniac's engine has been focused on 30fps for a long time, to the point where the 60fps mode in Ratchet wasn't even on the disc unless you forced it in the PS5's menu. Ratchet on PS5 was a 30fps game first.

We take locked framerates in multiple modes for granted these days, but it's a ton of work. Think back to the framerates we got in the 5th gen.
You would think 448 GB/s would be enough, though. RTX 2080 cards are easily able to do 1440p 60fps in games like Metro that run at native 4K 30fps. No need to change settings to medium or low; everything just scales down.

Maybe the CPU is using up a lot of the memory bandwidth?

Either way, I think devs need to start thinking ahead and leave unlocked modes in their games. That way, when the PS5 Pro does come out, they won't need to go back and patch in 60fps fidelity modes.
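The bandwidth question lends itself to a quick back-of-envelope sketch (my own arithmetic; it ignores CPU and I/O contention on the shared bus, which is exactly the unknown being debated):

```python
# Upper bound on memory traffic available per frame on a 448 GB/s bus.
# Ignores CPU/IO contention, so real per-frame GPU budgets are lower.
BUS_GBPS = 448.0

def gb_per_frame(fps):
    return BUS_GBPS / fps

for fps in (30, 40, 60):
    print(f"{fps} fps: {gb_per_frame(fps):.1f} GB of bandwidth per frame")
```

At 60fps the per-frame bandwidth budget is half what it is at 30fps, so any fixed per-frame traffic (CPU work, streaming, RT structures) eats a proportionally bigger share, which would explain why halving resolution alone doesn't close the gap.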
 

SlimySnake

Flashless at the Golden Globes
Awesome, the three people who have a 120Hz display are going to use it.

If you really want 120Hz, PC is better. The point of high refresh rates is faster response, so KB&M is a natural fit for it.
Not just any 120Hz TVs. Most 120Hz TVs only have 1080p 120Hz modes. Only the new HDMI 2.1 compatible TVs like the LG CX and Sony's own X900H have 120Hz at 4K. I believe a couple of the new LG and Sony OLEDs have HDMI 2.1 capabilities, but they cost thousands of dollars. Still, as an LG CX owner, I am happy they added it.

As for the point of high refresh rates: if you watch the video, the point is actually the input lag. The input lag for the 60fps mode on a 60Hz display is 75ms in this game. The input lag for the same 60fps mode on a 120Hz TV is only 60ms.

The same applies to this new 40fps mode. The input lag is only 80ms, compared to the 117ms input lag of the 30fps mode. It's pretty much on par with the 75ms input lag of the 60fps mode, which is crazy.
 
You would think 448 GB/s would be enough, though. RTX 2080 cards are easily able to do 1440p 60fps in games like Metro that run at native 4K 30fps. No need to change settings to medium or low; everything just scales down.

Maybe the CPU is using up a lot of the memory bandwidth?

Either way, I think devs need to start thinking ahead and leave unlocked modes in their games. That way, when the PS5 Pro does come out, they won't need to go back and patch in 60fps fidelity modes.
It's not enough to let the GPU perform at 100% at all times in all scenarios. Ditto the PS4, but the limitation is even bigger on PS5, though not as bad as the PS4 Pro's bottleneck.
 

Tschumi

Member
"Welcome to a very different video" I love their "different" videos lol, retro etc. The dry stuff is hella dry, drier than a witch's teat, I believe the saying goes.
 

cireza

Banned
The midway point between 30 and 60fps is actually 45fps, not 40.

And yes, I know that 45 is not a multiple of 120; that still doesn't mean you can blatantly lie.
 
The midway point between 30 and 60fps is actually 45fps, not 40.

And yes, I know that 45 is not a multiple of 120; that still doesn't mean you can blatantly lie.
They said it's the midpoint in terms of frametime, which it is: 25ms sits exactly halfway between 16.7ms and 33.3ms. And 40fps is the closest we can get to a displayable midpoint, since it divides evenly into a 120Hz refresh while 45fps does not.
 

deriks

4-Time GIF/Meme God
No, you can play at 40fps on a 1080p display at 120Hz, but you need HDMI 2.1 with 4K 120Hz to get the full-fat native 4K 40fps.

You need 120Hz to play at 40fps. You don't need it to be 4K 120Hz.
Tomatoes, tomatoes.
Watch the video so you can see why that is. The reason is not that they couldn't do it; it's that without a TV with that support, it would look and feel wrong because of the uneven frame delivery on a normal 60Hz TV.
It's not a matter of "why is it like this"; it's just a matter of me not really caring for this fidelity mode. 60fps is fine.
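The point about uneven frames comes down to one division. A sketch (the helper name is my own, not anything from the video):

```python
# Why 40fps wants a 120Hz display: each frame must be held for a whole
# number of refresh cycles, or frame pacing becomes uneven (judder).
def refreshes_per_frame(fps, hz):
    return hz / fps

for hz in (60, 120):
    r = refreshes_per_frame(40, hz)
    pacing = "even" if r.is_integer() else "uneven"
    print(f"40fps on a {hz}Hz display: {r} refreshes per frame -> {pacing}")
```

On a 60Hz panel, 40fps means alternating one- and two-refresh holds; on 120Hz, every frame is held for exactly three refreshes, so pacing stays perfectly even.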
 

Kydd BlaZe

Member
The game is absolutely STUNNING in fidelity mode. I normally try to stick to performance mode in a lot of games, but the visuals have me sticking with the 40 fps mode. It looks amazing and feels great. Wish more games had this option, because I’d sacrifice 60 fps if it meant 40 fps but with all the bells and whistles and better response time than 30 fps.
 

Someone explain like I'm a retard. How is 40 the midway point of 30 and 60?
I mean?

When a game is running at 60fps, a frame is delivered every 16.6ms.
When a game is running at 30fps, a frame is delivered every 33.3ms.

So at 30fps, a frame is delivered 16.6ms later than it is at 60fps. Split that difference in half and add it to the 60fps frametime: 16.6 + (16.6 / 2) = 25ms.
1000 / 25 = 40fps!
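The same arithmetic as a snippet, just restating the post's numbers:

```python
# Frametime midpoint between 30fps and 60fps.
ft60 = 1000 / 60               # ~16.7 ms per frame at 60fps
ft30 = 1000 / 30               # ~33.3 ms per frame at 30fps
midpoint = (ft60 + ft30) / 2   # 25 ms

print(f"midpoint frametime: {midpoint:.1f} ms -> {1000 / midpoint:.0f} fps")
```

So 40fps is the exact halfway point once you measure in milliseconds per frame instead of frames per second.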
 

Riky

$MSFT
VRR makes this 40fps mode completely redundant; I would think we'll only see a handful of games, if that, use this option.
I tried it but I'd still rather play the Performance RT mode myself.
 