
People seem to think I’m ruining gaming because I like 30fps…

KAL2006

Banned
I'm not a frame rate obsessive, never have been, have always been like "if I'm ok playing it I don't give a fuck what frame rate it is". Until recently, when after having (unconsciously) played a load of 60fps games I tried to replay Bloodborne, I barely made it into the introductory area before I stopped because it just felt shit, my movement didn't feel smooth or responsive. I also had a bit of this when playing Control and choosing between the quality mode and the higher frame rate mode, once I'd tried the higher frame rate playing the lower option became unthinkable to me and even trying gave me the beginnings of motion sickness. Once you've snorted that sweet high framerate chisel it's difficult to go back imo, better to never try it at all if all you can get is 30FPS pub grub.

This is pretty much me. I played God of War, Ratchet and Clank, and Spider-Man Remastered in 60fps back to back. Then I put on Control and heard about how good RT is in that game. Played in Quality Mode and felt like I could see each individual frame, and the game gave me motion sickness. So I switched to performance mode and finally the game felt normal. I kept switching between the 2 modes and came to the realisation that the game looks great in quality mode when you are standing still and not moving. But 99 percent of the time you are moving, so that makes quality mode obsolete. Another thing that doesn't help with 30fps is that I'm playing on a 65-inch LED screen sitting 6 feet away. Back in the day we all played Ocarina of Time on N64, but that was on a tiny CRT TV that had way better motion clarity compared to modern TVs.
 

Hoppa

Member
I'm just stating the truth. When a game is going for that animated movie style, I want 30fps and every time there was an article about the game, I'd ask if there would be a 30fps mode.
Interesting. I've never thought about preferring 30fps because it looks cinematic. Fair enough. I might be overexaggerating but when I play games in 30fps it feels choppy. I even felt that in Ratchet and Miles in fidelity mode.
 
Interesting. I've never thought about preferring 30fps because it looks cinematic. Fair enough. I might be overexaggerating but when I play games in 30fps it feels choppy. I even felt that in Ratchet and Miles in fidelity mode.
That’s fine. Thanks for understanding and not being a jerk about it.
 
When I have the option, I choose 60fps nowadays - just feels so much smoother. It's not a rule though, there have been games where either the 60fps wasn't that stable or the graphical compromises were too noticeable to make the jump worth it. Cyberpunk 2077 comes to mind, where they straight-up disabled floor reflections and made assets worse, and I honestly could not take that option. Similarly, half of Watch_Dogs: Legion's greatness comes from the stunning futuristic London they created, with raytracing even on console, beautiful particles and whatnot. The 60fps trade takes too much away, so I stuck to 30. But those are the exceptions for me.

On the other hand, I'd never shame people for playing at 30fps when given the option of 60fps. I know someone in real life who prefers the "slower" 30fps and I get why that happens. It's a fair take, and people should be allowed to play as they please.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Last gen people loved having no settings because they didn't want to fiddle with graphic settings, and now people want it? 🤷‍♂️

When it comes to gaming, 60FPS is always better, but it's super hyperbolic BS when people say 30FPS is "unplayable". I'm so glad I don't have oversensitive eyes like them.
You also forgot oversensitive reaction time
 
Last gen people loved having no settings because they didn't want to fiddle with graphic settings, and now people want it? 🤷‍♂️


You also forgot oversensitive reaction time
1 - people having 2 or 3 preset options is good, and it's not like opening up all the graphics options like on PC.

2 - that's not the same thing as input lag; a game can still react to your input even with a 500ms delay
 

DavidGzz

Member
I was fine with 15 fps Blighttown back in the day. I got a PS5 and an XSX. Now I can't replay Bloodborne until we get a proper remaster. 30 fps sucks.
 

BreakOut

Member
I think the best approach is to target 60, and Sony needs to add VRR. Let it sort itself out.
Actually, Insomniac's approach is still better: 40fps at 120Hz is a good idea.
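Rough numbers on why 40fps on a 120Hz panel paces so well (a back-of-the-envelope sketch in Python, illustrative only, not from any official spec):

```python
# Back-of-the-envelope frame pacing on a 120Hz panel (illustrative numbers only)
REFRESH_HZ = 120

for fps in (30, 40, 60):
    frame_time_ms = 1000 / fps              # how long each game frame stays on screen
    refreshes_per_frame = REFRESH_HZ / fps  # a whole number means even, judder-free pacing
    print(f"{fps}fps: {frame_time_ms:.1f}ms per frame, {refreshes_per_frame:g} refreshes per frame")

# 30fps -> 33.3ms (4 refreshes), 40fps -> 25.0ms (3 refreshes), 60fps -> 16.7ms (2 refreshes)
# 40fps sits exactly halfway between 30 and 60 in frame time, yet still divides 120Hz evenly.
```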
 

McCheese

Member
Higher framerates are always better, I made the mistake of getting a 144hz monitor and after playing some 120/144fps games, now 60fps games feel like shit :(

Stay in your 30fps ignorant bliss for as long as you can
 

6502

Member
I don't see what the issue with the OP's stance is. I am sure he doesn't mind people enjoying PS4 graphics on their new £500 consoles at 60fps...
 

hussar16

Member
30fps is the better framerate. We don't need 60fps; it's just that the screens we have now introduce motion blur, and we try to fix that by adding more frames instead of fixing the problem outright, which is more hertz and CRT-like motion.
 

lmimmfn

Member
@OP, yes, folk who are happy with 30FPS generally just ruin it for those who like higher framerates.
However, consoles have been 30FPS in most games since forever; PC is where it's at for 60 or 120FPS
 

Dr. Claus

Vincit qui se vincit
Sure, Jan. You also must love the almost 500ms of input lag of RDR2.
Never noticed or cared about it. It does not affect the playability of the game or the fun factor. And given the millions of people who bought it and loved it, most didn't give a flying shit either.

They might not know it, but there's a reason Call of Duty has been so popular on console for a long time (it was always 60 fps vs others going for 30 in the PS3/X360 era).

Yes, because they are easy to digest and continuously pushed the MP forward during its heyday. Being 60FPS had nothing to do with it.
 

Scotty W

Member
We can only guess. No good saying 1% when you have no idea how many. They're not making 60fps games for nothing.
I agree, actually. Fps takes a long time to become perceptible to most people. It wasn't until several generations later that I realized how bad the fps on 8- and 16-bit games was.

Fps is not like graphical quality, where everyone recognizes an improvement as soon as they see it. It's more that it has to be badly broken to reach the threshold of consciousness for most people.

But that 'badly broken' is a standard that's rarely reached, and it takes a long time to establish such a standard in the mind of the general public.
 

ArtHands

Thinks buying more servers can fix a bad patch
Is it really that difficult to make every game have 3 options:
4K 30FPS Quality mode
Dynamic 4K 60FPS using VRS Balanced mode
Dynamic 1440p 120FPS VRS Performance mode

We don't want console players to start spending hours fiddling with settings, do we?
 

01011001

Banned
30fps is the better framerate. We don't need 60fps; it's just that the screens we have now introduce motion blur, and we try to fix that by adding more frames instead of fixing the problem outright, which is more hertz and CRT-like motion.

lol, we found the dumbest opinion yet... there are TVs and PC monitors that can basically eliminate pixel response blur, and 30fps still sucks on those...
Also, low framerates will still produce blur on high refresh monitors.
If you have a high refresh (hertz) monitor, maybe look up the UFO test on the Blur Busters website. The lower the framerate, the less information the panel gets and the blurrier your image will be.

And of course, lower framerate = worse input latency, and there is nothing worse for gaming than high input lag. No display tech can fix that.
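To put rough numbers on the latency point, here's a simplified sketch (my own approximation of the usual "about 1.5 frames" rule of thumb, not measured figures; real games and displays add more on top):

```python
# Simplified model of the latency floor that frame time alone imposes (illustrative only).
# Real pipelines add game-side buffering, render queues and display processing on top.
for fps in (30, 60, 120):
    frame_time_ms = 1000 / fps
    # A button press lands at a random point within a frame, so on average it waits
    # about half a frame before being sampled, then at least one full frame passes
    # before the result reaches the screen: roughly 1.5 frames as a best-case average.
    avg_floor_ms = 1.5 * frame_time_ms
    print(f"{fps}fps: {frame_time_ms:.1f}ms frame time, ~{avg_floor_ms:.0f}ms average latency floor")

# 30fps -> ~50ms, 60fps -> ~25ms, 120fps -> ~12ms, before any engine or display overhead.
```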
 

Dr. Suchong

Member
Fun fact, 99% of all gamers don’t know, care or notice fps.
I'd say this isn't far off the mark.
A lot of performance and graphical terms would be met with blank stares if mentioned to a hefty proportion of gamers.
I'd imagine they're too busy thinking about frivolous things like *Gasp!* having fun.
Not to say that I personally don't recognise that performance, fps included, is conducive to the enjoyment.
I'm somewhere in the middle. As long as it isn't a technical shit storm I'm ok.
Hell, I still think rotoscoping in Prince of Persia is cool lol
 

ahtlas7

Member
You like 30 FPS.

 
It's not that I specifically NEED 30fps like it's an addiction or something. A lot of games that "target" 60fps have drops, which make for a lot of stutter and inconsistency, but drop it down to 30 and you get a constant, smooth stream of frames.

Also, I appreciate quality over quantity. I can definitely see the picture is blurrier with 60fps mode activated, and I don't seem to be bothered by the lower frame rate.
 

Genx3

Member
It's not that I specifically NEED 30fps like it's an addiction or something. A lot of games that "target" 60fps have drops, which make for a lot of stutter and inconsistency, but drop it down to 30 and you get a constant, smooth stream of frames.

Also, I appreciate quality over quantity. I can definitely see the picture is blurrier with 60fps mode activated, and I don't seem to be bothered by the lower frame rate.
That's why VRR exists.
With VRR anything above 30 fps is better than 30 fps.
 

Rickyiez

Member
They might not know it, but there's a reason Call of Duty has been so popular on console for a long time (it was always 60 fps vs others going for 30 in the PS3/X360 era).
True.

Same reason why DMC games felt so good to play, as they were always 60FPS, and why there was backlash at DmC's initial launch because it was 30FPS.
Or Bayonetta Xbox 360 60FPS vs PS3 30FPS
Or TLOU2 PS5 60FPS vs TLOU2 PS4 30FPS
Or GOW PS5 60FPS vs GOW PS4 30FPS
Or HZD PS5 60FPS vs HZD PS4 30FPS

You don't eyeball the difference, you play them to feel it. If you couldn't tell that 60FPS plays or feels better and that 30FPS is more sluggish and feels like a slog, then I really have no words.

There's nothing worse than buying a game that you thought would be fun, but there's loads of stuttering or screen tearing
Fun fact, stuttering and screen tearing have nothing to do with 30fps vs 60fps. On PC, 60fps locked with borderless and triple buffering eliminates 99% of screen tearing and stuttering without even needing any adaptive sync tech.
 

Genx3

Member
It's still fiddling with settings. Console players just want to hit play and jump into the game.

They don't have to select the mode; if you press start, the game automatically starts in Quality mode.

If you prefer performance or balanced mode, the game starts in the last mode you used.

Problem solved.
 
I don't prefer 30fps, but if a game is good enough, I'll respect it. The gamers in these forums are fairly entitled and jump on trends quickly; don't mind them.
 

01011001

Banned
Fun fact, 99% of all gamers don’t know, care or notice fps.

Explain how almost all of the top games in terms of popularity run (or try to run) at 60fps, even on console?
Fortnite, Call of Duty, Battlefield, Apex Legends, FIFA, Madden, [(games that basically run at 60+ on every toaster) LoL, Valorant, CS:GO], Mario Kart 8, Super Smash Bros, Minecraft...
What a weird coinkidink, isn't it?
 
Fun fact, stuttering and screen tearing have nothing to do with 30fps vs 60fps. On PC, 60fps locked with borderless and triple buffering eliminates 99% of screen tearing and stuttering without even needing any adaptive sync tech.
Yes I know 30fps has nothing to do with screen tearing and stuttering. I guess what I meant was a solid 30fps with vsync is better than almost 60fps with drops to the 40s and 50s without VRR.

I had a PC with FreeSync and it was amazing, but unfortunately I don't think Sony is going to support FreeSync VRR; they'll most likely require HDMI 2.1 VRR, so that's going to require me to buy a new monitor, which may be slightly difficult because there aren't a lot of HDMI 2.1 monitors to choose from yet. I want something with 4K, HDR, 120Hz, VRR and local dimming, but in monitor size. Guess I'll be waiting a while.
 
So, this next gen game I bought for my PS5 was supposed to be 60fps, but it randomly drops to 30fps during races. They still haven't fixed it.

[embedded video clips]

SO NEXT GEN!

If you don't see any issues with these videos then you can't tell the difference between 30fps and 60fps. If you're sensitive, you will see the problem pretty easily.

To get around this issue, I just decided to play and platinum the PS4 version on my PS5 in 4K/30.
 

Dunky

Member
I agree, actually. Fps takes a long time to become perceptible to most people. It wasn't until several generations later that I realized how bad the fps on 8- and 16-bit games was.

Fps is not like graphical quality, where everyone recognizes an improvement as soon as they see it. It's more that it has to be badly broken to reach the threshold of consciousness for most people.

But that 'badly broken' is a standard that's rarely reached, and it takes a long time to establish such a standard in the mind of the general public.
What do you mean, 8-bit and 16-bit games had bad FPS? Most games in those generations ran at 60 frames per second.

It was the PlayStation and Sega Saturn generation, with 3D graphics, that introduced low frame rates. The machines could barely power 3D graphics.

Casual players don't really notice things like ray tracing or higher-resolution shadow maps. But I can guarantee that if I showed people the difference between 60 and 30, it would be instantly noticeable. I'm not saying this for the fun of it; the difference is massive.
 

StreetsofBeige

Gold Member
What do you mean, 8-bit and 16-bit games had bad FPS? Most games in those generations ran at 60 frames per second.

It was the PlayStation and Sega Saturn generation, with 3D graphics, that introduced low frame rates. The machines could barely power 3D graphics.

Casual players don't really notice things like ray tracing or higher-resolution shadow maps. But I can guarantee that if I showed people the difference between 60 and 30, it would be instantly noticeable. I'm not saying this for the fun of it; the difference is massive.
The vast majority of people can tell the difference between 30 and 60 if you showed them.

The problem is some people out there think the average person can't tell the difference because they don't know the terminology. So if they don't know what it's called, it means they can't tell the difference. You get some people like that.

If I showed my parents (who are in their 80s) a 30 and a 60 frame video, they would be able to tell the difference. They'd have no clue it's called frame rate, though.
 