
4k, 60fps, or Quality w/ ray tracing. Console players, what is your choice?

4k, 60fps, or Quality mode on consoles?

  • 60FPS (often sacrificing resolution and details)

    Votes: 188 66.7%
  • Resolution (native 4k often not having all effects on)

    Votes: 5 1.8%
  • Quality mode (all effects on but at a lesser resolution than native 4k)

    Votes: 67 23.8%
  • 1080p here!

    Votes: 22 7.8%

  • Total voters
    282

rofif

Can’t Git Gud
I really have to look hard to see the differences in resolution over 1080p and most times I just can't. But I know 30fps when I see it right away.



These modes are just presets. Most games are on PC and we all know of the multitude of options that are there. I don't see how creating two or three static modes for the console version of the same game makes anything "half-assed".
You should be able to do way more on console than just locking in PC-like presets.
You have hardware with direct access to its features. They should be able to do way more.

Back to the resolution... I was downscaling from 4k 10 years ago because I always hated bad aliasing. Even 1440p is not good enough really. At least not raw 1440p.
Just look at pixel counts.
" 1080p display (1920 by 1080) has 2,073,600 pixels, a 1440p display (2560 by 1440) has 3,686,400 pixels and a 4K display (3840 by 2160) has 8,294,400 pixels"
1440p is not middle option. It's not even 2x more than 1080p. But 4k is more than 2 times more than 1440p.
To be honest, even raw 4k is jaggy. YOu always need some good aa solution. DLSS is so good. shame we dont have it on consoles. WE wouldn't have this discussion if consoles supported DLSS
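To put rough numbers on that claim, here's a minimal sketch (just the arithmetic from the pixel counts quoted above; the resolution names and values are the standard ones, nothing game-specific assumed):

```python
# Rough pixel-count comparison for the resolutions discussed above.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K": (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}

for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# The ratios behind the point above: 1440p is only ~1.78x 1080p,
# while 4K is 2.25x 1440p and 4x 1080p.
print(f"1440p / 1080p = {pixels['1440p'] / pixels['1080p']:.2f}x")
print(f"4K / 1440p    = {pixels['4K'] / pixels['1440p']:.2f}x")
print(f"4K / 1080p    = {pixels['4K'] / pixels['1080p']:.2f}x")
```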
 

Cryio

Member
Given consoles have so much flexibility nowadays (and arguably since the Xbox One X launched), I'll tell you my perspective from a PC player.

I'm still at 1080p and will probably continue to be at 1080p for a long time. Maybe 1440p in the future, but I'm not interested in 4K.

My preference is visuals / framerate / resolution.

6th and 7th gen games I play maxed out, at 4K (downsampled to 1080p), 120 fps. For example: FEAR, Half-Life 2, Bioshock, CoD.

If I can't reach high resolutions, I'll still prefer high framerate. So 1080p120, such as Metro Redux or Tomb Raider titles.

If a game is intensive, I'm fine at max visuals and 1080p60 (Red Dead Redemption 2).

If theoretically I can't do 1080p60 maxed out and a game allows tweaking settings where the difference is small, I'll drop to 1080p60 High instead of Ultra.

Nowadays with TAA, FSR, Magpie/Lossless Scaling (external apps that allow FSR injection), I'd be fine with lower than 1080p too. Not a perfect example based on everything I said so far, but I'm playing Halo Infinite at Medium and dynamic 1080p just to reach a locked 120 fps. The resolution adjustment doesn't affect me at all. Also recently finished Hellblade maxed out at 1080p120 with FSR Ultra Quality (which is basically just slightly under 900p)

When I had a 2011-era rig, I played Witcher 2 at High (instead of Max) visuals at 900p 45 fps and I was fine. Finished Witcher 3 on the same rig at Medium, 900p and 35 fps.

If I had a console, I'd play at max visuals at all times OR 60 fps if the downgrade wasn't that big of a deal. Ratchet & Clank: Rift Apart, 4K30 with RT or 1080p60 with RT that's of slightly lower quality than the RT at 4K? Meh.
 
Last edited:

ethomaz

Banned
Yeah it baffles me. It's either they just don't try it or they literally experience motion differently in some way, perhaps it's a perceptual thing. Cos there's just no way anyone's seeing what I'm seeing and going 'Oh yeah, I'll take the slideshow effect please, looks fucking GREAT! If only real life could look this stuttery!'
From my experience with different framerate games, it takes a few minutes for the eye to adapt.
After that you won't even be able to tell there is a difference.

BTW, that eye adaptation is mostly for when you shift between games... if you for example play a 60fps game now and only after some hours play a 30fps one, you won't have that adaptation issue.

But that doesn't happen only with 30 vs 60 fps games alone... a solid 30fps vs a fluctuating 30fps generates the same eye adaptation... I played Bloodborne and Destiny at the same time (with people asking me to fireteam up for a Raid while I was playing Bloodborne), and when you shifted from Bloodborne to Destiny, the experience in Destiny looked like fast forward for a few minutes until it became normal... the same for Destiny to Bloodborne: the experience in Bloodborne looked like slow motion for a few minutes.

After that eye adaptation time both looked and played fine... well, you could see the framerate fluctuation in Bloodborne from time to time, but that is another issue (it should have been locked 30fps to begin with).

Destiny at 30fps plays better than CoD, Apex, Titanfall, etc. at 60fps for me... Could Destiny be even better at 60fps? Of course, but it being 30fps doesn't harm the experience and enjoyment of the game in any way.

The trade-off for 60fps is way more concerning than playing 30fps games... most games look awful in their 60fps modes.
 
Last edited:

rofif

Can’t Git Gud
Given consoles have so much flexibility nowadays (and arguably since the Xbox One X launched), I'll tell you my perspective from a PC player.

I'm still at 1080p and will probably continue to be at 1080p for a long time. Maybe 1440p in the future, but I'm not interested in 4K.

My preference is visuals / framerate / resolution.

6th and 7th gen games I play maxed out, at 4K (downsampled to 1080p), 120 fps. For example: FEAR, Half-Life 2, Bioshock, CoD.

If I can't reach high resolutions, I'll still prefer high framerate. So 1080p120, such as Metro Redux or Tomb Raider titles.

If a game is intensive, I'm fine at max visuals and 1080p60 (Red Dead Redemption 2).

If theoretically I can't do 1080p60 maxed out and a game allows tweaking settings where the difference is small, I'll drop to 1080p60 High instead of Ultra.

Nowadays with TAA, FSR, Magpie/Lossless Scaling (external apps that allow FSR injection), I'd be fine with lower than 1080p too. Not a perfect example based on everything I said so far, but I'm playing Halo Infinite at Medium and dynamic 1080p just to reach a locked 120 fps. The resolution adjustment doesn't affect me at all. Also recently finished Hellblade maxed out at 1080p120 with FSR Ultra Quality (which is basically just slightly under 900p)

When I had a 2011-era rig, I played Witcher 2 at High (instead of Max) visuals at 900p 45 fps and I was fine. Finished Witcher 3 on the same rig at Medium, 900p and 35 fps.

If I had a console, I'd play at max visuals at all times OR 60 fps if the downgrade wasn't that big of a deal. Ratchet & Clank: Rift Apart, 4K30 with RT or 1080p60 with RT that's of slightly lower quality than the RT at 4K? Meh.
You are not aware of what you lose by still hanging on at 1080p.
I was thinking the same way, but once you play something at 4K, 1080p really does look like SDR.
 

Topher

Gold Member
That is where the big mistake from devs starts.

Console graphics modes should not be just presets from PC... they should be modes made specifically optimized for the capabilities of the consoles.

Those settings are common to both and they are there in either case. Consoles and PC are not that different.

From my experience with different framerate games, it takes a few minutes for the eye to adapt.
After that you won't even be able to tell there is a difference.

But that's you. For me, I don't adapt at all and 30fps is just annoying. This is why options are good.
 

anthony2690

Banned
Yeah it baffles me. It's either they just don't try it or they literally experience motion differently in some way, perhaps it's a perceptual thing. Cos there's just no way anyone's seeing what I'm seeing and going 'Oh yeah, I'll take the slideshow effect please, looks fucking GREAT! If only real life could look this stuttery!'
My missus played through Watch Dogs Legion + ray tracing at launch on the Series X, loved the game, thought it looked incredible.

Played some other 60fps games afterwards that she enjoyed/thought looked great.

Went back to Legion when it got its expansion and said it hurt her eyes/looked bad whenever she moved/turned the camera.

My very casual gamer partner could tell the difference straight away. :)
 

Hunnybun

Member
And the "60fps" crowd is just the worst one.
They are so stubborn and loud about their "always performance mode". It is so annoying and obnoxious... get the fucking pc already and shut up.
I can and will play 30fps if I want. You can too if you wanted.
I like good graphics. I play games to relax and look at them, and I like good motion blur, which is essential at both 30 and 60.

Why should it be the 60fps crowd that's forced to spend thousands on a PC rather than the 4k crowd? A PC solves for both situations, so I'm not sure what the difference is?

Maybe it's just because that's how it's always been? But then maybe it's our turn to get to enjoy consoles? I mean the "graphics" crowd have only had it their way for what, the last FOUR console generations!
 

ethomaz

Banned
But that's you. For me, I don't adapt at all and 30fps is just annoying. This is why options are good.
Just play a bit more... our eyes are really good at doing that.

I can see the case of gamers that keep shifting between games in a single session... I'm mostly a gamer that plays one game to the end before moving to another... I can only tell the experience of changing framerates from the few times I played some multiplayer games in parallel with a single-player game, when people requested MP and I jumped in.
 
Last edited:

Topher

Gold Member
Yeah it baffles me. It's either they just don't try it or they literally experience motion differently in some way, perhaps it's a perceptual thing. Cos there's just no way anyone's seeing what I'm seeing and going 'Oh yeah, I'll take the slideshow effect please, looks fucking GREAT! If only real life could look this stuttery!'

I definitely think there are different perceptions here from one person to the next.

Just play a bit more... our eyes are really good at doing that.

dont tell me what to do comedy central GIF by Workaholics
 

Tarnpanzer

Member
60fps, if it is at least 1440p. If not, then I won't buy it, like GotG. (maybe Gamepass)

40fps like in R&C is fine too. (although 60fps is noticeably smoother there)
 
Last edited:

Spaceman292

Banned
I don't want to see what I will lose.
I want to play the game thinking this is the best there is (outside of PC).
The modes make these consoles look weaker.

And yes, the 60fps crowd is a nightmare. I am not saying there shouldn't be 60fps modes. But if there is one, make it like Returnal. They focused and made a fantastic 60fps game.
You sound like a crazy person
 

Fbh

Member
To me 60fps >> Visuals (textures, shadows, draw distance, etc) >>> Resolutions above 1080p >>>>>>>>>>>>>>>>>>>>>>>> Ray tracing.
I'd rather play a game at 1080p and 60fps with a "high" graphics preset than the same game at 4K 60fps with a "low" preset.

The sweet spot for me, without spending a ton on hardware, is 60fps, 1440p, high graphics.
 
Last edited:

yamaci17

Member
Native 4K + 60fps is my first go-to. I'd gladly sacrifice unnecessary "high-ultra" bait settings and ray tracing to hit this target.
If the sacrifice is not enough, I'd settle for 4K 30fps. In fact I played Gears 5 with low/medium settings to hit 4K 60fps. I couldn't notice any downgrades going from high to medium/low, but I could clearly see how much more pristine the game looked going from 1440p to 4K. 1440p is simply not enough.

Practically, any game below 1800p looks horrible, blurry and muddy with modern TAA implementations. Any graphical improvement, setting, ray tracing or whatnot is wasted on anything below 1800p, including 1440p. Neither 1440p nor 1200-1300p-ish resolutions are enough to make the game look clear/pristine. I value pristine image quality over minor improvements that can only be observed with 400% zooms.
 
Last edited:

ethomaz

Banned
Native 4K + 60fps is my first go-to. I'd gladly sacrifice unnecessary "high-ultra" bait settings and ray tracing to hit this target.
If the sacrifice is not enough, I'd settle for 4K 30fps.

Practically, any game below 1800p looks horrible, blurry and muddy with modern TAA implementations. Any graphical improvement, setting, ray tracing or whatnot is wasted on anything below 1800p, including 1440p. Neither 1440p nor 1200-1300p-ish resolutions are enough to make the game look clear/pristine. I value pristine image quality over minor improvements that can only be observed with 400% zooms.
I won't say below 1800p, because some good 1440p or 1600p implementations look good.
But 1080p on a 4K TV looks awful no matter which TAA implementation is used... not even AI scaling can save that case.

I'd rather play a 1080p game on a 1080p TV than on a 4K TV.

Returnal looks better on my 1080p plasma than my 4K OLED CX… on the OLED you can see the pixels due to the lack of resolution… it is awful.

4K TVs don't work well with lower resolution content… watching the YouTube GT7 stream in 1080p yesterday was bad too… watching the 4K version released today on the same TV is another world.

And SD or sub-1080p on a 4K TV… man…
 
Last edited:

Hunnybun

Member
From my experience with different framerate games, it takes a few minutes for the eye to adapt.
After that you won't even be able to tell there is a difference.

BTW, that eye adaptation is mostly for when you shift between games... if you for example play a 60fps game now and only after some hours play a 30fps one, you won't have that adaptation issue.

But that doesn't happen only with 30 vs 60 fps games alone... a solid 30fps vs a fluctuating 30fps generates the same eye adaptation... I played Bloodborne and Destiny at the same time (with people asking me to fireteam up for a Raid while I was playing Bloodborne), and when you shifted from Bloodborne to Destiny, the experience in Destiny looked like fast forward for a few minutes until it became normal... the same for Destiny to Bloodborne: the experience in Bloodborne looked like slow motion for a few minutes.

After that eye adaptation time both looked and played fine... well, you could see the framerate fluctuation in Bloodborne from time to time, but that is another issue (it should have been locked 30fps to begin with).

Destiny at 30fps plays better than CoD, Apex, Titanfall, etc. at 60fps for me... Could Destiny be even better at 60fps? Of course, but it being 30fps doesn't harm the experience and enjoyment of the game in any way.

The trade-off for 60fps is way more concerning than playing 30fps games... most games look awful in their 60fps modes.

It's true that you can get used to 30fps and enjoy those games perfectly fine. What I don't agree with is that it's a total adjustment and you don't even notice the deficiencies.

For ME - might be different for you, but for me - I was ALWAYS aware of the problems of 30fps, even before I really knew that it was a frame rate issue. As in, I was always aware that games looked much worse in motion than standing still, and that in some strange way it was kind of hard to really 'see' what was going on when the action started. I just didn't know why and I didn't really know any better tbh.

I think the first I really understood the difference was when the Uncharted games came out on the PS4. It was a real "ohhhhh I get it now' moment. Everything looked clearer and somehow more real, more solid and three dimensional. I realised what a big difference it really is.

People get used to anything, and take the coolest things for granted. But that same argument can be applied to 1080p or lower settings as much as it can to 30fps. In my opinion, of course.
 

ethomaz

Banned
It's true that you can get used to 30fps and enjoy those games perfectly fine. What I don't agree with is that it's a total adjustment and you don't even notice the deficiencies.

For ME - might be different for you, but for me - I was ALWAYS aware of the problems of 30fps, even before I really knew that it was a frame rate issue. As in, I was always aware that games looked much worse in motion than standing still, and that in some strange way it was kind of hard to really 'see' what was going on when the action started. I just didn't know why and I didn't really know any better tbh.

I think the first I really understood the difference was when the Uncharted games came out on the PS4. It was a real "ohhhhh I get it now' moment. Everything looked clearer and somehow more real, more solid and three dimensional. I realised what a big difference it really is.

People get used to anything, and take the coolest things for granted. But that same argument can be applied to 1080p or lower settings as much as it can to 30fps. In my opinion, of course.
Do you have motion issues with PSVR? I heard some friends talking about that, but I don't have them either.

I don't know if it is related, but in high school my friends could not read their books on the bus going to school because it gave them nausea (it was a 30km trip on a highway with a 100km/h speed limit)… I could read everything without any issue.

My wife says she can't read much on her cellphone when we are traveling by car (with me driving); she says she feels like she will vomit. Meanwhile, when she is driving, I can read for hours without any issue… when she was pregnant she was the main driver, because she felt like vomiting whenever she was not driving, but that is a special case 😇
 
Last edited:

Topher

Gold Member
It's true that you can get used to 30fps and enjoy those games perfectly fine. What I don't agree with is that it's a total adjustment and you don't even notice the deficiencies.

For ME - might be different for you, but for me - I was ALWAYS aware of the problems of 30fps, even before I really knew that it was a frame rate issue. As in, I was always aware that games looked much worse in motion than standing still, and that in some strange way it was kind of hard to really 'see' what was going on when the action started. I just didn't know why and I didn't really know any better tbh.

I think the first I really understood the difference was when the Uncharted games came out on the PS4. It was a real "ohhhhh I get it now' moment. Everything looked clearer and somehow more real, more solid and three dimensional. I realised what a big difference it really is.

People get used to anything, and take the coolest things for granted. But that same argument can be applied to 1080p or lower settings as much as it can to 30fps. In my opinion, of course.

That's a good point. "Just get used to it" works both ways then, doesn't it? If people are telling me I can get used to 30fps, then there's no reason why they can't "get used to" lower resolution.

I've been waiting on the 60fps patch for AC: Origins, but now I'm curious to see if I truly can get used to 30fps. I need a game to play before HFW comes out anyway so I'm downloading Origins and will try to "get used to" 30fps. We will see.
 

ethomaz

Banned
That's a good point. "Just get used to it" works both ways then, doesn't it? If people are telling me I can get used to 30fps, then there's no reason why they can't "get used to" lower resolution.

I've been waiting on the 60fps patch for AC: Origins, but now I'm curious to see if I truly can get used to 30fps. I need a game to play before HFW comes out anyway so I'm downloading Origins and will try to "get used to" 30fps. We will see.
The issue with lower resolution is that you can see the pixels… you look at the TV and see an image formed by big squares.

I know a pixelated look is a game design choice too, but I don't like it, and when I see it in a game that is supposed to have a clear image it bothers me.

Somebody said I have to play farther away from the TV, but I don't have more space and my TV is not that big… it is 55”… the same game on a 1080p set at the same distance looks fine.

I wonder if migrating to 4K was a good choice after all… maybe I should have stayed at 1080p until next generation.
 
Last edited:

Hunnybun

Member
Do you have motion issues with PSVR? I heard some friends talking about that, but I don't have them either.

I don't know if it is related, but in high school my friends could not read their books on the bus going to school because it gave them nausea (it was a 30km trip on a highway with a 100km/h speed limit)… I could read everything without any issue.

Lol no that's a reasonable assumption but funnily enough I'm the complete opposite, like you. Never get travel sick, no problems with VR of any kind.

I don't know what it is. But yeah I'm pretty sure different people are perceiving this stuff very differently. It's the only explanation.

I feel the same as you about other people's complaints. I don't know wtf this 120fps thing is about, for example. I literally cannot tell the difference. I just have to take their word for it. And, if I'm honest, I can't REALLY tell inconsistent frame rates, either. Dropped frames here or there? Ok, if you say so. No idea, myself.

I was perfectly happy playing God of War in that weird 40 to 50fps mode on the Pro, for example. Looked fine to me lol.
 

Hunnybun

Member
The issue with lower resolution is that you can see the pixels… you look at the TV and see an image formed by big squares.

I know a pixelated look is a game design choice too, but I don't like it, and when I see it in a game that is supposed to have a clear image it bothers me.

Somebody said I have to play farther away from the TV, but I don't have more space and my TV is not that big… it is 55”… the same game on a 1080p set at the same distance looks fine.

I wonder if migrating to 4K was a good choice after all… maybe I should have stayed at 1080p until next generation.

Yeah I'm not sure what's going on there but you really shouldn't be able to see individual pixels, that sounds weird.

I stopped being able to see actual pixels at 720p! I remember first getting an HD TV, going right up to the screen to test the resolution, and I was like, nope, still looks good.
 

Topher

Gold Member
So I downloaded AC:Origins on PS5. Beyond the visual aspect of frame rate is the simple fact that 30fps has a negative impact on gameplay. Maybe this game isn't a good example.
 
These are the type of statements that irritate the fuck out of me. Like what?

Every game looks better at 30. That’s the point.

Miles literally looks significantly better at 30 fps. The ray tracing effects and extra detail make a world of difference, especially indoors and at night.
Exactly. He's confusing fluidity with graphics. It's why it sucks having to choose, because these fidelity modes actually look like next-gen games, but if we want fluidity we have to play the scaled-down version. He's also wrong about 30fps feeling less fluid in Miles Morales than Spider-Man Remastered. Both feel identical at 30, and they're among the smoothest 30fps games you can play.
 
I don't like upscaling artifacts so I'm sticking with my old 1080p plasma for now. It gives me the best of both worlds on PS5 and means that my Series S games don't look too bad.
You realize that your game is still being upscaled on a 1080p display too, right?

For example: a game can be 1440p checkerboarded to 4K and then downsampled to fit your 1080p screen.

Just because your screen is 1080p doesn’t mean the game is now running in native 1080p.
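As a rough illustration of that chain, here's a minimal sketch (plain arithmetic; the 1440p-checkerboarded-to-4K figures are just the example given above, not measurements from any particular game):

```python
# Hypothetical scaling chain from the example above:
# a 1440p internal render, checkerboarded to 4K, then downsampled to a 1080p panel.
internal = (2560, 1440)       # what the GPU renders per checkerboard pass
reconstructed = (3840, 2160)  # the "4K" signal the console outputs
panel = (1920, 1080)          # the 1080p display that finally shows it

def pixels(res):
    width, height = res
    return width * height

print(f"Internal render:  {pixels(internal):,} px")
print(f"Reconstructed 4K: {pixels(reconstructed):,} px")
print(f"1080p panel:      {pixels(panel):,} px")

# The signal is scaled twice (up to 4K, then down to the panel),
# so the panel never receives a native 1080p render of the game.
print(f"Downsample factor, 4K signal -> 1080p panel: {pixels(reconstructed) / pixels(panel):.0f}x")
```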
 
If IQ doesn't take a heavy hit 60 fps is nice, but if the difference is too much, I'd take the bells and whistles. I hate having to choose now because I'm always thinking the other option could be better. Fucking people.
 

Hunnybun

Member
Exactly. He's confusing fluidity with graphics. It's why it sucks having to choose, because these fidelity modes actually look like next-gen games, but if we want fluidity we have to play the scaled-down version. He's also wrong about 30fps feeling less fluid in Miles Morales than Spider-Man Remastered. Both feel identical at 30, and they're among the smoothest 30fps games you can play.

I'd say you're arbitrarily separating fluidity and graphics from the more important global value of how something actually looks. Fluidity is an important component of what a game looks like.
For example, no game can "look" good running at 15fps. I'd say it looks shit. You'd say it has fantastic graphics but terrible fluidity. It's a meaningless distinction in a reality where fidelity and fluidity are both unavoidable elements of a game's visual presentation.

Also, I didn't actually say that about the Spider-Man games. I said the Remastered looks worse in the Performance RT mode than MM. I haven't compared the 2 games in fidelity mode because they both look so choppy and awful that it's a non-starter for me.
 
You should be able to do way more on console than just locking in PC-like presets.
You have hardware with direct access to its features. They should be able to do way more.

Back to the resolution... I was downscaling from 4k 10 years ago because I always hated bad aliasing. Even 1440p is not good enough really. At least not raw 1440p.
Just look at pixel counts.
"A 1080p display (1920 by 1080) has 2,073,600 pixels, a 1440p display (2560 by 1440) has 3,686,400 pixels, and a 4K display (3840 by 2160) has 8,294,400 pixels."
1440p is not the middle option. It's not even 2x the pixels of 1080p, but 4K is more than 2x the pixels of 1440p.
To be honest, even raw 4K is jaggy. You always need some good AA solution. DLSS is so good; shame we don't have it on consoles. We wouldn't be having this discussion if consoles supported DLSS.
Sony and MS fucked up by not going with Nvidia or having their own form of DLSS. These consoles can't quite hack it in their current state; the mid-gen refreshes are already needed. We can blame these 1080p/60 games on lazy devs, and they are lazy if they can't manage 1440p/60 on a cross-gen game, but if the new systems had more power to brute force with, then we wouldn't ever have to play at shitty 1080p.

Halo Infinite is even 1440p/60 on the Xbox One X, but Dying Light 2 on next gen, with an infinitely better CPU, can't do it. There's not enough power (and no DLSS) to make these lazy devs' jobs easier, so we're forever stuck with highly anticipated games at garbage 1080p.
 

NewYork214

Member
I never understood the love of higher framerate until I played uncharted collection at 120fps. Tried switching to 4k 30fps during a cut scene and instantly had to switch back.
 

wvnative

Member
The games I play typically are 4k30 in the quality mode. That's what I normally go for, but there are some exceptions. Some games I feel have a wonky feeling 30fps mode in which case it's performance for me.
 

ethomaz

Banned
Yeah I'm not sure what's going on there but you really shouldn't be able to see individual pixels, that sounds weird.

I stopped being able to see actual pixels at 720p! I remember first getting an HD TV, going right up to the screen to test the resolution, and I was like, nope, still looks good.
With 1080p content on a 4K TV you can see the pixels... I will take a picture so you can understand, if I have time (yep, it shows in the pics).
4K content on a 4K TV looks fine... well, even lower resolution content, it just can't be too low, down near 1080p.

I guess when you show 1080p content on a 4K TV, each pixel of the content becomes 4 pixels on the TV, and that is why you can see these 4 equal pixels.

But like I said... maybe it is the distance... in any case I don't like it.
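A minimal sketch of the arithmetic behind that "4 equal pixels" observation, assuming simple nearest-neighbour integer scaling (real TV scalers usually interpolate instead, so this is only the idealized case):

```python
# 1080p content shown on a 4K (2160p) panel with naive integer (nearest-neighbour) scaling.
content = (1920, 1080)
panel = (3840, 2160)

scale_x = panel[0] // content[0]  # 3840 // 1920 = 2
scale_y = panel[1] // content[1]  # 2160 // 1080 = 2

# Each source pixel maps onto a scale_x by scale_y block of identical panel pixels.
block = scale_x * scale_y
print(f"Each 1080p pixel covers a {scale_x}x{scale_y} block of {block} identical panel pixels")
```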
 
Last edited:

rofif

Can’t Git Gud
Why should it be the 60fps crowd that's forced to spend thousands on a PC rather than the 4k crowd? A PC solves for both situations, so I'm not sure what the difference is?

Maybe it's just because that's how it's always been? But then maybe it's our turn to get to enjoy consoles? I mean the "graphics" crowd have only had it their way for what, the last FOUR console generations!
Consoles only recently caught up in fidelity. The 360 gen was always low fps and low resolution. The PS4 gen started to somewhat get in line with PC, meaning we get about the same res on console and PC.

Your argument is fair. I just enjoy graphics more, and new graphics are something exciting, while yet another 60fps game is just that.
 

Hunnybun

Member
Isn't it reasonable to want a good hassle-free experience? You know… like it always was on consoles?
Saying I am crazy because you like 60fps so much makes you sound crazy.

Lol I'm gonna have to check this out for myself.
 

rofif

Can’t Git Gud
I can make a similar statement about lower frame rate.
Of course, and that's true.
I just found that I get used to good 30fps with good motion blur more easily than to bad image quality.
Hours into a game, I will still notice jaggies and pixelation, but I will most likely have gotten used to the 30fps.

That said, it is getting harder as more and more games are 60fps and the 30fps mode is not always worth it.
In the case of Demon's Souls, the 30fps mode is sharper, but just barely, so 60 is a no-brainer even for me.
 