It really depends on the resolution difference. If we're talking 4K30 vs 1080p60, I would choose 4K30, but if it's dynamic 4K60 with a lower bound of 1440p vs native 4K30, I would choose the 60fps mode.
You should be able to do way more on console than just lock PC-like presets.

I really have to look hard to see the differences in resolution over 1080p, and most of the time I just can't. But I know 30fps when I see it right away.
These modes are just presets. Most games are on PC, and we all know of the multitude of options there. I don't see how creating two or three static modes for the console version of the same game makes anything "half-assed".
From my experience with games at different framerates, it takes the eye a few minutes to adapt.

Yeah, it baffles me. Either they just don't try it or they literally experience motion differently in some way; perhaps it's a perceptual thing. Cos there's just no way anyone's seeing what I'm seeing and going 'Oh yeah, I'll take the slideshow effect please, looks fucking GREAT! If only real life could look this stuttery!'
You are not aware of what you lose by still hanging at 1080p.

Given that consoles have so much flexibility nowadays (and arguably have since the Xbox One X launched), I'll tell you my perspective as a PC player.
I'm still at 1080p and will probably continue to be at 1080p for a long time. Maybe 1440p in the future, but I'm not interested in 4K.
My preference is visuals / framerate / resolution.
6th and 7th gen games I play maxed out, at 4K (downsampled to 1080p), 120 fps. For example: FEAR, Half-Life 2, Bioshock, CoD.
If I can't reach high resolutions, I'll still prefer high framerate. So 1080p120, such as Metro Redux or Tomb Raider titles.
If a game is intensive, I'm fine at max visuals and 1080p60 (Red Dead Redemption 2).
If, theoretically, I can't do 1080p60 maxed out and a game allows tweaking settings where the difference is small, I'll drop to 1080p60 High instead of Ultra.
Nowadays with TAA, FSR, and Magpie/Lossless Scaling (external apps that allow FSR injection), I'd be fine with lower than 1080p too. It's not a perfect example given everything I've said so far, but I'm playing Halo Infinite at Medium and dynamic 1080p just to reach a locked 120 fps. The resolution adjustment doesn't affect me at all. I also recently finished Hellblade maxed out at 1080p120 with FSR Ultra Quality (which renders internally at under 900p).
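For context on that Ultra Quality figure: FSR 1.0's quality modes use fixed per-axis scale factors (1.3x for Ultra Quality, per AMD's published values), so the internal render resolution follows directly from the output resolution. A minimal Python sketch — the function name `internal_resolution` is just for illustration:

```python
# Published FSR 1.0 per-axis upscaling factors per quality mode.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def internal_resolution(out_w: int, out_h: int, mode: str) -> tuple:
    """Internal render resolution FSR 1.0 upscales from."""
    s = FSR_SCALE[mode]
    return round(out_w / s), round(out_h / s)

# 1080p output with Ultra Quality renders internally at roughly 1477x831,
# i.e. closer to 830p than to 900p.
print(internal_resolution(1920, 1080, "Ultra Quality"))
```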
When I had a 2011-era rig, I played The Witcher 2 at High (instead of max) visuals at 900p and 45 fps, and I was fine. I finished The Witcher 3 on the same rig at Medium, 900p and 35 fps.
If I had a console, I'd play at max visuals at all times, OR at 60 fps if the downgrade wasn't that big of a deal. Ratchet & Clank: Rift Apart, 4K30 with RT or 1080p60 with RT of slightly lower quality than the RT at 4K? Meh.
That is where the big mistake from devs starts.
Console graphics modes should not just be presets from PC... they should be modes made and optimized specifically for the capabilities of the consoles.
From my experience with games at different framerates, it takes the eye a few minutes to adapt.
After that you won't even be able to tell there is a difference.
My missus played through Watch Dogs Legion with ray tracing at launch on the Series X; she loved the game and thought it looked incredible.
And the "60fps" crowd is just the worst one.
They are so stubborn and loud about their "always performance mode". It is so annoying and obnoxious... get the fucking PC already and shut up.
I can and will play 30fps if I want. You can too if you wanted.
I like good graphics; I play games to relax and look at them, and I like good motion blur, which is essential at 30fps and at 60fps alike.
Just play a bit more... our eyes are really good at doing that.

But that's you. For me, I don't adapt at all, and 30fps is just annoying. This is why options are good.
Yeah it baffles me. It's either they just don't try it or they literally experience motion differently in some way, perhaps it's a perceptual thing. Cos there's just no way anyone's seeing what I'm seeing and going 'Oh yeah, I'll take the slideshow effect please, looks fucking GREAT! If only real life could look this stuttery!'
Just play a bit more... our eyes are really good at doing that.
You sound like a crazy person.

I don't want to see what I will lose.
I want to play the game thinking this is the best there is (outside of PC).
The modes make these consoles look weaker.
And yes, the 60fps crowd is a nightmare. I am not saying there shouldn't be 60fps modes, but if there is one, make it like Returnal: they focused and made a fantastic 60fps game.
Okay, go cry about it or something.

Constantly amazed by the bad takes on this site.
I won't say below 1800p, because some good 1440p or 1600p implementations look good.

Native 4K + 60 fps is my first go-to. I'd gladly sacrifice unnecessary "high/ultra" bait settings and ray tracing to hit this target.
If that sacrifice is not enough, I'd settle for 4K 30 fps.
Practically, any game below 1800p looks horrible, blurry and muddy with modern TAA implementations. Any graphical improvement, setting, ray tracing or whatnot is wasted on anything below 1800p, including 1440p. Neither 1440p nor 1200-1300p-ish resolutions are enough to make the game look clear/pristine. I value pristine image quality over minor improvements that can only be observed with 400% zooms.
From my experience with games at different framerates, it takes the eye a few minutes to adapt.
After that you won't even be able to tell there is a difference.
BTW, that eye adaptation matters most when you shift between games... if you, for example, play a 60fps game now and after some hours play a 30fps one, you won't have adapted yet.
But that doesn't happen only with 30 vs 60 fps... a solid 30fps vs a fluctuating 30fps triggers the same eye adaptation. I played Bloodborne and Destiny at the same time (with people asking me to join a fireteam for a Raid while I was playing Bloodborne), and when I shifted from Bloodborne to Destiny, Destiny looked like fast forward for a few minutes until it became normal... likewise, going from Destiny to Bloodborne, Bloodborne looked like slow motion for a few minutes.
After that adaptation time, both looked and played fine... well, you could see the framerate fluctuation in Bloodborne from time to time, but that is another issue (it should have been locked 30fps to begin with).
Destiny at 30fps plays better for me than CoD, Apex, Titanfall, etc. at 60fps... could Destiny be even better at 60fps? Of course, but being 30fps doesn't harm the experience and enjoyment of the game in any way.
The trade-off for 60fps is way more concerning than playing 30fps games... most games look awful in their 60fps modes.
Do you have motion issues with PSVR? I heard some friends talking about that, but I don't have them either.

It's true that you can get used to 30fps and enjoy those games perfectly fine. What I don't agree with is that it's a total adjustment and you don't even notice the deficiencies.
For ME - might be different for you, but for me - I was ALWAYS aware of the problems of 30fps, even before I knew it was a frame rate issue. As in, I was always aware that games looked much worse in motion than standing still, and that in some strange way it was kind of hard to really 'see' what was going on when the action started. I just didn't know why, and I didn't really know any better, tbh.
I think the first time I really understood the difference was when the Uncharted games came out on the PS4. It was a real 'ohhhhh, I get it now' moment. Everything looked clearer and somehow more real, more solid and three-dimensional. I realised what a big difference it really is.
People get used to anything, and take the coolest things for granted. But that same argument can be applied to 1080p or lower settings as much as it can to 30fps. In my opinion, of course.
The issue with lower resolution is that you can see the pixels… you look at the TV and see an image formed by big squares.

That's a good point. "Just get used to it" works both ways then, doesn't it? If people are telling me I can get used to 30fps, then there's no reason why they can't "get used to" lower resolution.
I've been waiting on the 60fps patch for AC: Origins, but now I'm curious to see if I truly can get used to 30fps. I need a game to play before HFW comes out anyway so I'm downloading Origins and will try to "get used to" 30fps. We will see.
Do you have motion issues with PSVR? I heard some friends talking about that, but I don't have them either.
I don't know if it is related, but in high school my friends could not read books on the bus to school because it gave them nausea (it was a 30 km trip on a highway with a 100 km/h speed limit)… I could read everything without any issue.
The issue with lower resolution is that you can see the pixels… you look at the TV and see an image formed by big squares.
I know pixelated graphics are a game design choice too, but I don't like them, and when you see pixels in a game that is supposed to have a clear image, it bothers me.
Somebody said I have to sit farther from the TV, but I don't have more space and my TV is not that big… it is 55"… the same game on a 1080p set at the same distance looks fine.
I wonder if migrating to 4K was a good choice after all… maybe I should have stayed at 1080p until the next generation.
Exactly. He's confusing fluidity with graphics. It's why it sucks having to choose, because these fidelity modes actually look like next-gen games, but if we want fluidity we have to play the scaled-down version. He's also wrong about 30 fps feeling less fluid in Miles Morales than in Spider-Man Remastered. Both feel identical at 30, and they're among the smoother 30 fps games you can play.

These are the type of statements that irritate the fuck out of me. Like what?
Every game looks better at 30; that's the point.
Miles literally looks significantly better at 30 fps. The ray tracing effects and extra detail make a world of difference, especially indoors and at night.
You should absolutely keep it. Dying Light 2 will look better on that than on a 4K TV, that's for sure.

I haven't jumped into next gen yet, but I am so reluctant to trade up my 1080p Panasonic plasma, lol... I might just keep it and play games in 1080p/60fps modes!
You realize that your game is still being upscaled on a 1080p display too, right?

I don't like upscaling artifacts, so I'm sticking with my old 1080p plasma for now. It gives me the best of both worlds on PS5 and means that my Series S games don't look too bad.
Exactly. He's confusing fluidity with graphics. It's why it sucks having to choose, because these fidelity modes actually look like next-gen games, but if we want fluidity we have to play the scaled-down version. He's also wrong about 30 fps feeling less fluid in Miles Morales than in Spider-Man Remastered. Both feel identical at 30, and they're among the smoother 30 fps games you can play.
Sony and MS fucked up by not going with Nvidia or having their own form of DLSS. These consoles can't quite hack it in their current state. The mid-gen refreshes are already needed. We can blame these 1080p/60 games on lazy devs, and they are lazy if they can't manage 1440p/60 on a cross-gen game, but if the new systems had more to brute-force with, then we wouldn't ever have to play at shitty 1080p.

You should be able to do way more on console than just lock PC-like presets.
You have hardware with direct access to its features. You should be able to do way more.
Back to resolution... I was downsampling from 4K 10 years ago because I always hated bad aliasing. Even 1440p is not really good enough. At least not raw 1440p.
Just look at the pixel counts.
A 1080p display (1920 by 1080) has 2,073,600 pixels, a 1440p display (2560 by 1440) has 3,686,400 pixels, and a 4K display (3840 by 2160) has 8,294,400 pixels.
So 1440p is not the middle option. It's not even 2x the pixels of 1080p, but 4K is more than 2x the pixels of 1440p.
To be honest, even raw 4K is jaggy. You always need a good AA solution. DLSS is so good; shame we don't have it on consoles. We wouldn't be having this discussion if consoles supported DLSS.
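The pixel arithmetic above is easy to check; a quick sketch in Python (names like `pixels` are just for illustration):

```python
# Pixel counts for common display resolutions, to check the
# "1440p is not the middle option" claim.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

def pixels(name: str) -> int:
    """Total pixel count for a named resolution."""
    w, h = RESOLUTIONS[name]
    return w * h

for name in RESOLUTIONS:
    print(f"{name}: {pixels(name):,} pixels")

# 1440p is only ~1.78x the pixels of 1080p,
# while 4K is 2.25x the pixels of 1440p (and 4x 1080p).
print(f"1440p / 1080p = {pixels('1440p') / pixels('1080p'):.2f}")
print(f"4K / 1440p    = {pixels('4K') / pixels('1440p'):.2f}")
```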
No, he doesn't. Actually, YOU do if you can't see the point he's making.

You sound like a crazy person.
The image is the same whatever the framerate.

I can make a similar statement about lower frame rates.
Opinions can't be factually false. This is called personal preference, dummy.

That is factually false.
30fps is fine on OLED.
Isn't it reasonable to want a good, hassle-free experience? You know… like it always was on consoles?

You sound like a crazy person.
With 1080p content on a 4K TV you can see the pixels... I will take a picture so you understand, if I have time (yep, it shows in the pics).

Yeah, I'm not sure what's going on there, but you really shouldn't be able to see individual pixels; that sounds weird.
I stopped being able to see actual pixels at 720p! I remember when I first got an HD tv and going right up to the screen to test the resolution, and I was like, nope, still looks good.
Consoles only recently caught up in fidelity. The 360 gen was always low fps and low resolution. The PS4 gen started to somewhat get in line with PC, meaning we get about the same resolution on console and PC.

Why should it be the 60fps crowd that's forced to spend thousands on a PC rather than the 4K crowd? A PC solves both situations, so I'm not sure what the difference is?
Maybe it's just because that's how it's always been? But then maybe it's our turn to get to enjoy consoles? I mean, the "graphics" crowd have only had it their way for, what, the last FOUR console generations!
Isn't it reasonable to want a good, hassle-free experience? You know… like it always was on consoles?
Saying I am crazy because you like 60fps so much makes you sound crazy.
Of course, and that's true.

I can make a similar statement about lower frame rates.