
Devs forgot how to do proper 30fps.

yep. You clearly have not experienced it.
It matters to some like me.
You should try to experience 60fps. It matters a lot more to pretty much everyone except for you.

Maybe you should get a TV that handles images not rendered at native 4K better. You clearly have not experienced it. :messenger_tears_of_joy:
 

NeonGhost

uses 'M$' - What year is it? Not 2002.
It's just an effect of longer cross gen.
You act like 60fps was only just discovered. I've played HFR for almost 30 years. I still have a 180Hz CRT, man... Framerate is not the only thing that matters, but these new PC gamers who act like they just discovered HFR are just annoying.
Where in my statement do I act like it was just discovered? But with more TVs offering 120fps, these developers know they can't get away with 30fps-only modes. Starting to think you made this topic just so you can argue with people, knowing damn well that 30fps should be left in the past.
 

rofif

Member
Absolute nonsense.
30 fps has much more lag than 60 fps.
In 2020, there is no excuse to sacrifice gameplay for raw pixel count.
It's 33 vs 16 ms.
And if you watched the video, you would know that bad 30fps adds 75 ms of lag...
The same lag is most likely added to bad 60fps.

The input lag difference between 30 and 60fps can be as little as 16-17 ms... it does not have to be 75-85 ms...
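The numbers in this post can be sanity-checked with a few lines of arithmetic. This is only an illustration of the figures quoted in the thread; the 2.5-frame buffering factor is an assumption chosen to land in the 75-85 ms range, not a measured value:

```python
def frame_time_ms(fps: float) -> float:
    """Time one frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# Baseline frame times quoted above: ~33 ms at 30 fps, ~16 ms at 60 fps.
t30 = frame_time_ms(30)      # ~33.3 ms
t60 = frame_time_ms(60)      # ~16.7 ms

# The unavoidable latency gap between the two rates is one 60 fps frame:
inherent_gap = t30 - t60     # ~16.7 ms

# A badly buffered 30 fps mode that queues extra frames adds far more,
# e.g. ~2.5 extra frames of buffering at 30 fps:
extra_buffering = 2.5 * t30  # ~83 ms, in the 75-85 ms range quoted
```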
 

rofif

Member
You should try to experience 60fps. It matters a lot more to pretty much everyone except for you.
Are you talking to me?
I've been a PC gamer for at least 25 years now.
I had the first G-Sync monitors, 240Hz, 144Hz, everything.
I play at 4K 120fps on an OLED with an RTX 3080 daily.
I play exclusively at 120fps on PC... what is your case again?

The thread is not about me.
It's about how badly devs represent 30fps now. It can be 75ms faster ffs
 
Are you talking to me?
I've been a PC gamer for at least 25 years now.
I had the first G-Sync monitors, 240Hz, 144Hz, everything.
I play at 4K 120fps on an OLED with an RTX 3080 daily.
I play exclusively at 120fps on PC... what is your case again?
You still playing at 4K on PC in 2022? I moved to 8K years ago.

8K, 15fps. It's the sweet spot if you appreciate image quality. You just turn on motion smoothing and it's way better than 60fps; it really helps with all the stutter as well.
 

winjer

Member
It's 33 vs 16 ms.
And if you watched the video, you would know that bad 30fps adds 75 ms of lag...
The same lag is most likely added to bad 60fps.

The input lag difference between 30 and 60fps can be as little as 16-17 ms... it does not have to be 75-85 ms...

That's double the input lag.
60 fps is much better for gameplay.
The only place for 30 fps is in the history books for old consoles.
 

NeonGhost

uses 'M$' - What year is it? Not 2002.
Are you talking to me?
I've been a PC gamer for at least 25 years now.
I had the first G-Sync monitors, 240Hz, 144Hz, everything.
I play at 4K 120fps on an OLED with an RTX 3080 daily.
I play exclusively at 120fps on PC... what is your case again?

The thread is not about me.
It's about how badly devs represent 30fps now. It can be 75ms faster ffs
This is about how devs shouldn't do 30fps at all anymore. Have you made any good topics here? Every time I see one of your posts or topics, my eyes roll so hard I'm afraid they're gonna fly out of my skull.
 

rofif

Member
That's double the input lag.
60 fps is much better for gameplay.
The only place for 30 fps is in the history books for old consoles.
30 vs 60 is double the input lag, but it's still only 16 ms.
Meanwhile, bad system-level vsync adds more like 75 ms, as in the DF video. A much bigger difference.
 

rofif

Member
This about devs shouldn’t represent 30fps anymore. Have you made any good topics here ? Every time I see a post or topic my eyes roll so hard I’m afraid they’re gonna fly out my skull
That's your problem.
What's wrong with this topic?
30fps is done badly and can be done better. I suspected that, and DF confirms it.
There is absolutely nothing wrong here.

You think only about yourself. Just because you don't care for 30fps with better graphics on console does not mean everyone shares the same opinion.
And I'm saying that because 30fps is misrepresented recently; it's not as bad as people are led to believe by some big games this gen.
 

winjer

Member
30 vs 60 is double the input lag, but it's still only 16 ms.
Meanwhile, bad system-level vsync adds more like 75 ms, as in the DF video. A much bigger difference.

Have you ever played a 60fps game? It is much smoother than 30fps.
I play at 144Hz. I can't even conceive the notion of playing at 30fps in 2022.
Sacrificing gameplay for better graphics or raw pixel count is complete nonsense.
 

Topher

Gold Member
That's double the input lag.
60 fps is much better for gameplay.
The only place for 30 fps is in the history books for old consoles.

I tend to agree. 30fps is great for stopping a game and zooming in and staring at the extra details, but why even bother doing that? 60fps looks superior in motion in every way. At least that is the case for my eyes.
 

NeonGhost

uses 'M$' - What year is it? Not 2002.
That's your problem.
What's wrong with this topic?
30fps is done badly and can be done better. I suspected that, and DF confirms it.
There is absolutely nothing wrong here.

You think only about yourself. Just because you don't care for 30fps with better graphics on console does not mean everyone shares the same opinion.
And I'm saying that because 30fps is misrepresented recently; it's not as bad as people are led to believe by some big games this gen.
Make a poll and see what the majority wants.
 

rofif

Member
Have you ever played a 60fps game? It is much smoother than 30fps.
I play at 144Hz. I can't even conceive the notion of playing at 30fps in 2022.
Sacrificing gameplay for better graphics or raw pixel count is complete nonsense.
Read up a bit. I play at 120fps every day. And I had a 240Hz monitor for some time too.
I'm able to switch back and forth between 30 and 120. It just takes a few minutes to adapt (assuming the 30fps isn't broken af).
 

rofif

Member
Make a poll and see what the majority wants.
I don't care what the majority wants.
I just posted a video with DF's analysis to show that 30fps can be much better, and that it differs from game to game.
People tend to generalize and jump on the popular bandwagon of hate.

My goal with this topic was not to make people play at 30fps. I don't care. I just want 30fps to be done better, because I would like to choose fidelity modes more often.
 

winjer

Member
Read up a bit. I play at 120fps every day. And I had a 240Hz monitor for some time too.
I'm able to switch back and forth between 30 and 120. It just takes a few minutes to adapt (assuming the 30fps isn't broken af).

Do you know about Scott Wasson? He is the analyst who started the whole frame-time analysis movement.
I have been reading about it since before DF found out what a nanosecond was.
 
You both clearly do not own a 4K monitor or a 4K TV.
4K can be really nice and immersive. I mean... look. This is how 4K looks on an OLED up close:
The pixelation isn't visible in reality; it's just a moiré effect from the camera.
So 4K modes are just for advertising, but when it comes to actually playing, we're expected to use the low-res mode?!



I moved the iPhone really close to the screen. Game paused, with photo mode still on the same frame as above.
Sure, 4K at 30fps is nice... for static cutscenes. For gameplay it's shit (high input lag plus a blurry mess in motion).
 

yamaci17

Member
I don't care what majority want.
I just posted a video with DF analysis to show that 30fps can be much better and it differs from game to game.
People tend to generalize and go with popular bandwagon of hate.

My goal with this topic was not to make people play at 30fps. I don't care. I just want 30fps to be done better because I would like to choose fidelity modes more often
Making a proper 30fps lock without enabling fully buffered vsync requires the developer to move the tearing line above or below the visible screen. This costs extra processing power and is a hassle to set up. They also need to adjust frame buffer parameters so that vsync buffer lag doesn't pile up behind the latest frames.

Enabling VRR globally and disabling vsync globally should eliminate all that hassle on consoles, but I don't see them doing it. This is why on PC you can get minimal latency by relying on VRR alone and completely eliminating vsync.

Getting minimal input lag at 30fps with vsync is just a challenge. On PC you can force a frame limiter just a tad below 30 (29.97) if you're doing half-rate vsync. This method creates intermittent stutters (which is what happens with Bloodborne). The other method is moving the tearing line.

No one wants intermittent stutters just for minimal input lag. At 60fps you don't need to rely on weird hacks to minimize input lag: you just enable vsync and you're ready to go.
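The "cap a tad below 30" trick described above can be sketched as a toy frame limiter. This is a simplified illustration of the pacing idea only; the function names and the pure-sleep pacing are my own, and a real limiter would busy-wait the last fraction of a millisecond for tighter pacing:

```python
import time

def run_frame_limited(target_fps: float, render_frame, num_frames: int) -> None:
    """Pace render_frame() calls to at most target_fps per second.

    Capping a hair below the vsync rate (e.g. 29.97 instead of 30.00 when
    doing half-rate vsync on a 60 Hz display) keeps the render loop from
    racing ahead of the swap deadline and stacking up buffered frames,
    which is where the extra input lag comes from.
    """
    frame_budget = 1.0 / target_fps
    next_deadline = time.perf_counter()
    for _ in range(num_frames):
        render_frame()
        next_deadline += frame_budget
        # Sleep off whatever is left of this frame's time budget.
        remaining = next_deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
```

For example, `run_frame_limited(29.97, draw, 600)` (with `draw` being some hypothetical render callback) would pace 600 frames at just under half of a 60 Hz refresh.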
 

rofif

Member
Guys, guys. RESET. CUT. STOP.

I don't want to be an ass here. Let's just discuss the topic, please.

The topic is "30fps can be better than it currently is in some games".
And that's it. Play the 60fps modes. But I just wanted people to know that the laggy 30fps mode in DeS is not normal.
And yes, I played that game in its 60fps mode, of course. I am not that crazy.
 

rofif

Member
Making a proper 30fps lock without enabling fully buffered vsync requires the developer to move the tearing line above or below the visible screen. This costs extra processing power and is a hassle to set up. They also need to adjust frame buffer parameters so that vsync buffer lag doesn't pile up behind the latest frames.

Enabling VRR globally and disabling vsync globally should eliminate all that hassle on consoles, but I don't see them doing it. This is why on PC you can get minimal latency by relying on VRR alone and completely eliminating vsync.

Getting minimal input lag at 30fps with vsync is just a challenge. On PC you can force a frame limiter just a tad below 30 (29.97) if you're doing half-rate vsync. This method creates intermittent stutters (which is what happens with Bloodborne). The other method is moving the tearing line.

No one wants intermittent stutters just for minimal input lag. At 60fps you don't need to rely on weird hacks to minimize input lag: you just enable vsync and you're ready to go.
Yeah, it's a bad problem to have.
I wonder how VRR vsync works on console.
On Nvidia cards, you need vsync enabled globally along with G-Sync; otherwise there can still be some tearing.
And I think they finally fixed the vsync + G-Sync ceiling lag, so now you don't have to cap fps at 117 to avoid hitting the laggy vsync ceiling.
At least on my TV, I can see fps stopping at 118-119. It never hits 120.
 

TonyK

Member
Playing Horizon 2 at 4K 30fps right now, and it plays perfectly. After waiting for the 40fps patch, I still prefer 30fps, because the 40fps mode has the same image quality as the 60fps mode, just at a higher resolution. The only mode that looks remotely next-gen is 4K 30fps, but as I said, at least it plays smoothly at 30fps.
 

rofif

Member
If there's a 60 option, I don't care how good or bad the 30 mode is. I'm not going to be seeing it anyways.
Fair enough!
Even if the fidelity mode is much better looking?
Like, Horizon Forbidden West's performance mode was very blurry on release day. Now it's perfect.

That is another thing: Ratchet and Horizon FW are both games that got patched with VRR, 40fps, and all these other great modes after I'd finished them.
Wtf, devs! Prioritize this shit. I would for sure have played the fidelity 40fps or VRR mode at release.
 

rofif

Member
Playing Horizon 2 at 4k 30fps just right now and it plays perfectly. After waiting for the 40fps patch I prefer 30fps because 40fps mode is same image quality than 60 but at higher resolution. Only mode that it looks barely next gen is 4k30fps, but as said, at least it plays smoothly at 30fps.
Wait, so Forbidden West's 40fps mode is not exactly the same as the 30fps fidelity mode graphics-wise?
What are they doing?
 

The_Mike

Member
Are you talking to me?
I've been a PC gamer for at least 25 years now.
I had the first G-Sync monitors, 240Hz, 144Hz, everything.
I play at 4K 120fps on an OLED with an RTX 3080 daily.
I play exclusively at 120fps on PC... what is your case again?

The thread is not about me.
It's about how badly devs represent 30fps now. It can be 75ms faster ffs
Lying Season 4 GIF by Curb Your Enthusiasm
 

yamaci17

Member
Yeah, it's a bad problem to have.
I wonder how VRR vsync works on console.
On Nvidia cards, you need vsync enabled globally along with G-Sync; otherwise there can still be some tearing.
And I think they finally fixed the vsync + G-Sync ceiling lag, so now you don't have to cap fps at 117 to avoid hitting the laggy vsync ceiling.
At least on my TV, I can see fps stopping at 118-119. It never hits 120.
I never use vsync in conjunction with VRR. I just make sure my framerate stays within my refresh rate; the whole issue is overblown. Since I have a 144Hz screen, I just cap my games to 120fps with no vsync. That way I never, ever get tearing; it is impossible. Even at 30fps the screen refreshes at 90Hz, which is still below the 144Hz maximum.

On 120Hz modes this can be problematic: at 30fps the screen could indeed refresh at 120Hz. However, Nvidia's driver usually decides to refresh the screen at 90Hz instead of 120Hz. At 60fps it refreshes at an actual 60Hz, so it's not even an issue if the global refresh rate cap is still 120Hz.

The vsync + G-Sync combo has always been overhyped to me. I don't care about it. If you still get tearing, it means you're close to the maximum refresh rate ceiling. For a 120Hz screen on PC, I'd just cap at 108fps, and that way you will never get tearing.

Naturally, I have no idea how that stuff works on consoles.

You still have to cap, by the way; nothing is fixed or anything. If the framerate gets arbitrarily above the maximum refresh rate, you will get tearing. Your stats may not even pick up that your framerate exceeded it. That is why I suggest aggressive caps: not just 3 below the maximum, but 12 below.

Think of it like this: if you constantly get tearing with a 117 cap without vsync, and suddenly those tear lines go extinct with vsync, that means you're incurring vsync-bound input lag every time those tears would have occurred. This is why I'd prefer getting rid of tearing without vsync, using a more aggressive frame limiter instead.

Running 30fps games by tripling them to 90Hz is the perfect solution for avoiding vsync entirely on a 120Hz display. This is what Nvidia's driver does, which I actually admire, and I wonder whether it's intentional.
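The frame-multiplication behaviour described above (30fps shown as 90Hz) is essentially what low-framerate compensation does on VRR displays. The sketch below is a guess at that policy, not documented driver behaviour; the 48Hz VRR floor and the pick-nearest-to-midpoint heuristic are assumptions:

```python
def lfc_refresh_hz(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> float:
    """Pick a physical refresh rate for a VRR panel at a given frame rate.

    Inside the VRR window each frame is shown exactly once. Below the
    window's floor, each frame is repeated n times, with n chosen so the
    resulting refresh rate sits comfortably inside the window.
    """
    if vrr_min_hz <= fps <= vrr_max_hz:
        return fps  # 1:1 presentation, no repetition needed
    window_mid = (vrr_min_hz + vrr_max_hz) / 2.0
    candidates = [fps * n for n in range(2, 32)
                  if vrr_min_hz <= fps * n <= vrr_max_hz]
    if not candidates:
        raise ValueError("no integer multiple of fps fits the VRR window")
    # Prefer the multiple nearest the middle of the window, away from
    # the laggy vsync ceiling at the top of the range.
    return min(candidates, key=lambda hz: abs(hz - window_mid))
```

With an assumed 48-144Hz window this reproduces the numbers in the post: 30fps maps to 90Hz (each frame shown three times) and 60fps stays at a 1:1 60Hz.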
 

rofif

Member
I never use vsync in conjunction with VRR. I just make sure my framerate stays within my refresh rate; the whole issue is overblown. Since I have a 144Hz screen, I just cap my games to 120fps with no vsync. That way I never, ever get tearing; it is impossible. Even at 30fps the screen refreshes at 90Hz, which is still below the 144Hz maximum.

On 120Hz modes this can be problematic: at 30fps the screen could indeed refresh at 120Hz. However, Nvidia's driver usually decides to refresh the screen at 90Hz instead of 120Hz. At 60fps it refreshes at an actual 60Hz, so it's not even an issue if the global refresh rate cap is still 120Hz.

The vsync + G-Sync combo has always been overhyped to me. I don't care about it. If you still get tearing, it means you're close to the maximum refresh rate ceiling. For a 120Hz screen on PC, I'd just cap at 108fps, and that way you will never get tearing.

Naturally, I have no idea how that stuff works on consoles.

You still have to cap, by the way; nothing is fixed or anything. If the framerate gets arbitrarily above the maximum refresh rate, you will get tearing. Your stats may not even pick up that your framerate exceeded it. That is why I suggest aggressive caps: not just 3 below the maximum, but 12 below.

Think of it like this: if you constantly get tearing with a 117 cap without vsync, and suddenly those tear lines go extinct with vsync, that means you're incurring vsync-bound input lag every time those tears would have occurred. This is why I'd prefer getting rid of tearing without vsync, using a more aggressive frame limiter instead.

Running 30fps games by tripling them to 90Hz is the perfect solution for avoiding vsync entirely on a 120Hz display. This is what Nvidia's driver does, which I actually admire, and I wonder whether it's intentional.
So it must just be how my LG reports Hz, then. Makes sense.
I've been capping at 120 recently. It's still faster than not capping at all. I capped 3 below for years, but after going 4K 120 I kind of changed my mind. Maybe because most games hover around 100fps anyway.
About the tearing with G-Sync: I noticed more of it when I had a 240Hz monitor and played older games that can reach 240fps.
So it could be a matter of hitting the high Hz limit.
 

The_Mike

Member
Last time I had to prove I have a gaming pc, donjuan got banned.
You want to go through that again?
You don't have to prove anything.

And apparently you are good friends with the mods, since you can get away with calling people brain-dead and get people banned for pointing your lies out. Even your threat makes it plausible, but I am sure that's not the case. Hopefully, at least.

I think Don got so frustrated by you that he fell for your bait and got banned.
 

rofif

Member
You don't have to prove anything.

And apparently you are good friends with the mods, since you can get away with calling people brain-dead and get people banned for pointing your lies out. Even your threat makes it plausible, but I am sure that's not the case. Hopefully, at least.

I think Don got so frustrated by you that he fell for your bait and got banned.
Lies?
Are you still saying I am lying about owning a pc now and in the past?
Are we really doing this again?
I've not lied about a single thing. Don't go off-topic. Watch the video and discuss that.
 

Belthazar

Member
You both clearly do not own a 4K monitor or a 4K TV.
4K can be really nice and immersive. I mean... look. This is how 4K looks on an OLED up close:
The pixelation isn't visible in reality; it's just a moiré effect from the camera.
So 4K modes are just for advertising, but when it comes to actually playing, we're expected to use the low-res mode?!



I moved the iPhone really close to the screen. Game paused, with photo mode still on the same frame as above.

I do own a 4K display, but IMO the tradeoff just isn't worth it (especially with the upscaling solutions offered today). I'd rather have a good 1080p 60fps option available with ray tracing, better shadows, better draw distance, etc.
 
Normally, the higher the fps the better.

But considering hardware limitations, cost, and game type, it's not a black-and-white scenario.

For PC gaming, 1440p with a high refresh rate is normally the most economical way to experience games with current technology. At this resolution we not only enjoy a higher frame rate, but also get the benefit of ray tracing (with DLSS or FSR).

But there are certain types of games where I'd prefer 30fps even when I have the option to set it higher; Detroit: Become Human is the perfect example. The game was designed specifically for a 30fps cinematic experience (it defaults to 30fps, and on PC you have to do extra tweaking to unlock the fps limit). When I unlocked the fps, the game looked like a soap opera; I felt much better with the default setting. I guess there is a case to be argued that 30fps is more of an artistic choice for cinematic games, just like how pixel art is now an art form instead of a hardware limitation (if you time-traveled back to the 90s and claimed pixel art was an art form, people would laugh at you and call it an excuse for outdated hardware). So if a game is set at 30fps, why not use the extra power to play it at a better resolution, say 4K?

That said, there are also games that require a higher frame rate, for example competitive multiplayer games. For these I would rather play at 1080p but at 240Hz if possible; visual quality and effects are not important in this case, and input lag is the game changer.

And there are also games in the middle ground, for example the Souls games. Although they are not multiplayer games, their design is challenging even as a single-player experience, so for these I would strike a balance between frame rate and visual quality. Take Demon's Souls on PS5 as an example: even the most die-hard 30fps supporter would prefer to play this game at 60fps, so I would say the 60fps mode is mandatory.

My point: the best resolution and frame rate combination depends on the game type.
It would be silly to play Detroit: Become Human at 1080p 240Hz on Low, or COD at 4K 30Hz on Ultra High.
 

Filben

Member
When I learned Elden Ring sold 16 million in such a short amount of time, I doubted they would have any incentive to improve the technological side of things, such as visual quality (again, not art direction), frame pacing, etc.
From a very practical business point of view this is true. Calculate what it costs you and what it gains you, and the math makes the decision. Maybe the individual devs would fix it, but deadlines, job queues, and budgets won't let them, because management calculates exactly that: focusing on what makes your product good, and trying to make the rest acceptable.
 
Read the thread or watch the video.
FromSoft DOESN'T have this issue. If anything, Bluepoint does.
Erm, isn't the video about the notoriously stuttery 30fps modes in FromSoft games (DS, Sekiro, BB), and how hacked PS4s can help fix said issue? I.e. those games are NOT examples of good 30fps locks? Maybe comparatively speaking one could argue the DeS remake has a worse 30fps mode, but on the other hand its performance mode is really good, and I tend to believe that's the mode the developers prioritised.
 

rofif

Member
Erm, isn't the video about the notoriously stuttery 30fps modes in FromSoft games (DS, Sekiro, BB), and how hacked PS4s can help fix said issue? I.e. those games are NOT examples of good 30fps locks? Maybe comparatively speaking one could argue the DeS remake has a worse 30fps mode, but on the other hand its performance mode is really good, and I tend to believe that's the mode the developers prioritised.
I'll take some mild stutter over super laggy 30fps. BB was not that bad when it comes to frame pacing.
That's why I also posted about FF15, which is an example of terrible, constant frame pacing.
In Bloodborne the spikes are mild and don't happen all the time.
I played BB and DeS at 4K30 on PS5 one after another to compare, and BB plays so much better, 30 vs 30 (60fps modes aside).
 

PaintTinJr

Member
30 fps needs to die.
60 fps is the bare minimum.
I'm not going to tell you that you are wrong, but I would recommend you stop reading about that person you quoted, who you claim discovered frame timing, and instead read any edition of Computer Graphics: Principles and Practice, then get any copy of the OpenGL Red Book or a Direct3D tutorial and work through a few examples, and then go play Mario Sunshine on the GameCube. It is 30fps. See if you still think the same.

Some genres, like racing simulators, very much do need 60fps or higher, but most games designed around consoles really just need a good double-buffered vsync implementation of a locked 30fps, IMHO.
 

Lasha

Member
yep. You clearly have not experienced it.
It matters to some like me.


Nobody is disagreeing with you. It's just odd to care about native 4K while also accepting only 30fps. Enthusiasts usually prioritize framerate first, for gameplay reasons. 1440p at 60fps or more is a better experience than 4K at 30fps.
 

winjer

Member
I'm not going to tell you that you are wrong, but I would recommend you stop reading about that person you quoted, who you claim discovered frame timing, and instead read any edition of Computer Graphics: Principles and Practice, then get any copy of the OpenGL Red Book or a Direct3D tutorial and work through a few examples, and then go play Mario Sunshine on the GameCube. It is 30fps. See if you still think the same.

Some genres, like racing simulators, very much do need 60fps or higher, but most games designed around consoles really just need a good double-buffered vsync implementation of a locked 30fps, IMHO.

Scott Wasson revolutionized how reviewers look at hardware. It's because of his first articles that today almost all reviewers use frame times.
His work was so important that AMD and Nvidia pushed a greater focus on frame pacing.
And today, he is working at RTG.

Probably you were not into PC tech analysis a decade ago, so this revolution passed you by. So here is a quote from Wikipedia so you can start to understand:

On September 8, 2011, Scott Wasson posted an article titled "Inside the second: A new look at game benchmarking". This showed gamers that frames per second (FPS) is not the only thing that matters for "smooth" gameplay; frame latency plays a big part.[19] This innovative benchmarking method was later mentioned and acknowledged by other publications such as AnandTech, which described it as "a revolution in the 3D game benchmarking scene",[4][20] and Overclockers.[21]
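The point of the quoted article can be shown in a few lines: two runs with identical average FPS can have very different frame-time distributions. This is only an illustrative sketch with made-up frame times, not the article's actual methodology:

```python
def avg_fps(frame_times_ms):
    """Average FPS over a run, computed from per-frame times."""
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def percentile_frame_time_ms(frame_times_ms, pct=99):
    """The frame time that pct% of frames come in under."""
    ordered = sorted(frame_times_ms)
    idx = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[idx]

smooth = [16.0] * 60          # steady 16 ms frames
stuttery = [8.0, 24.0] * 30   # identical average, terrible pacing

# An FPS counter reports the same average for both runs, but the
# 99th-percentile frame time exposes the stutter in the second run.
```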
 
I'll take some mild stutter over super laggy 30fps. BB was not that bad when it comes to frame pacing.
That's why I also posted about FF15, which is an example of terrible, constant frame pacing.
In Bloodborne the spikes are mild and don't happen all the time.
I played BB and DeS at 4K30 on PS5 one after another to compare, and BB plays so much better, 30 vs 30 (60fps modes aside).
That's just the thing: it's extremely personal. I myself can't stand stutter, and it would throw me off faster than higher input lag would. So I'm sure there are loads of people who prefer the non-stuttery but more laggy experience of the DeS remake to the stuttery but more responsive one in the older titles.

Ultimately, though, it's pretty clear the DeS remake is designed to be played in its performance mode, and the fidelity mode is more of an afterthought. And if developers design games around 30fps locks in the future (I hope not, lol), I'm sure they will put in a much better effort to minimise input lag.
 

StreetsofBeige

Gold Member
Can anyone here explain why From games have frame pacing issues? That whole repeating of 1-2 frames per second?

So From purposely repeats a few frames each second to save tiny amounts of processing power?
 

Kataploom

Member
So you're not going to play ND's new game if it's only 30fps? You'll miss out on the best of the best because of an artificial limitation?
I personally remember how good a game looked after I finished it. Not how it ran, really.
I'm sorry, but I actually remember how games run and feel... I became wary of 30fps because of some 30fps games I played after years at 60fps... Some games give me no choice; then I'll decide whether to wait for a proper version or whether they're worth it anyway... The framerate has a feel; it also looks visually pleasant, but what I feel while interacting with the game is what I remember the most... I can't imagine Metroid Dread feeling equally good at 30fps with higher-quality effects.
 

PaintTinJr

Member
Scott Wasson revolutionized how reviewers look at hardware. It's because of his first articles that today almost all reviewers use frame times.
His work was so important that AMD and Nvidia pushed a greater focus on frame pacing.
And today, he is working at RTG.

Probably you were not into PC tech analysis a decade ago, so this revolution passed you by. So here is a quote from Wikipedia so you can start to understand.
I was responding to your "30 fps needs to die" comment, which has nothing to do with reviewers looking at frame timing.

Frame timing is a valid topic for you to read about (IMO), but not a priority while you hold that view of 30fps without an understanding of the founding principles of computer graphics, laid out in a book whose first edition dates from 1982 (commissioned by IBM), before ATI or Nvidia were even formed, although I didn't see a copy until about 1999.

Also, even if we were talking about frame timing, why would you place reviewers at the same level of importance as creatives?
 

ProLogY

Member
Perhaps some of it is psychological? We have seen the light, and going back to 30 feels horrible in comparison?

I always thought 30fps was miserable and unplayable. As someone who prefers console gaming, this drove me to jump to PC just to experience playable frame rates. I'm thrilled to see a generation that finally treats 60fps as a standard.
 