
Modded GeForce RTX 3070 with 16GB memory gets major 1% low FPS boost

lukilladog

Member
Considering Nvidia had an RTX 3080 20GB in the wings and an RTX 3070 Ti 16GB all but ready, I'm not surprised more people are doing this mod now.

Two years ago we saw it was possible:


And Nvidia didn't allow AIBs to sell these prototypes to the public... so they sold them to miners.
[Image: RTX 3070 Ti 16GB prototype card]

[Image: RTX 3080 20GB prototype card]


To selected miners, even. Even the miners were pissed off at Nvidia's scumbaggery.
 

Wonko_C

Member
1% lows are almost more important than the average in terms of how the game is going to feel while you're playing it. It's normal for there to be a drop from the average to the 1% lows, but it shouldn't be that large; if you're falling from 60 or 70 fps down to 8 or 9 fps, that's a garbage-tier experience for the player.

It's actually the most important number. As someone said before, it's the worst percentile of the framerate: how many fps do you have there? If you have a 600 fps average but a 50 fps 1% low, that means the fps goes up to 600 and then something happens that makes it plummet. Big fps differences are noticeable. While you can say 144 fps is very smooth, a game that runs at 400 fps with 1% lows of 150 fps has higher fps all the time, yet you still feel stutter because of the differences in frametime. So you have to look at both average fps and 1% low fps; the narrower the gap between those numbers, the smoother the experience.

It stutters when it runs out of VRAM. Play TLOU on PC and see just how often it stutters randomly.

1% low is the lowest framerate: on a bell curve of performance, the 1% low is what the game hits when things are at their worst. It's a pretty important metric, because it is basically a measure of framerate slowdowns. It's much harder to notice a 60 to 70 fps change than a 60 to 15 fps change, for example when you walk into a big city in an open-world game or some asset- and effect-heavy area.
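For anyone curious how tools actually arrive at that number, the 1% low can be derived from captured frametimes roughly like this (a minimal sketch of the common "average of the slowest 1% of frames" approach; the function name is my own):

```python
def fps_stats(frame_times_ms):
    """Compute average FPS and 1% low FPS from a list of frametimes in ms."""
    # Average FPS: total frames divided by total capture time in seconds.
    total_s = sum(frame_times_ms) / 1000.0
    avg_fps = len(frame_times_ms) / total_s

    # 1% low: take the slowest (longest) 1% of frames and average their FPS.
    worst = sorted(frame_times_ms, reverse=True)
    n = max(1, len(worst) // 100)
    low_1pct_fps = sum(1000.0 / t for t in worst[:n]) / n
    return avg_fps, low_1pct_fps
```

For example, 99 frames at 10 ms plus one 50 ms hitch averages about 96 fps, but the 1% low is 20 fps, which is exactly the kind of gap the posts above are describing.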
Thanks for the clarification.
So basically it means how much a game can stutter during a certain period of time? I thought stutters were all due to shader compilation or unoptimized loading/streaming.
 

DaGwaphics

Member
So basically it means how much a game can stutter during a certain period of time? I thought stutters were all due to shader compilation or unoptimized loading/streaming.

If the 1% low is really low, it's more like the game is completely freezing up. Shader compilation stutters won't drop the average that significantly in most cases.
 

YCoCg

Member
8GB is probably good until PS6, maaaybe even PS7 if you're really good at avoiding bad ports.
What a ridiculous statement. Developers have been asking Nvidia and AMD for more VRAM over the past couple of years, and as they get to grips with the consoles and squeeze out more space, most of that extra space in the RAM pool is going to graphics. If you want things like ray tracing and all the ultra graphics settings, then you're going to need more VRAM sooner rather than later.
 

SlimySnake

Flashless at the Golden Globes
Thanks for the clarification.
So basically it means how much a game can stutter during a certain period of time? I thought stutters were all due to shader compilation or unoptimized loading/streaming.
Shader compilation stutters are actually a very recent thing. I've been having stutters for over a decade now. It has to do with offloading/streaming textures into VRAM: if the game isn't optimized well, then yes, you will have those stutters as soon as it fails to load textures into VRAM in time.

Some games, like Forspoken, don't stutter; they just refuse to load textures, so instead of a stutter you get basic objects with no textures until the VRAM is filled, i.e., pop-in, and if they go beyond the VRAM limit, the textures never load at all.
 