Arcadialane
Member
Agreed, the 60 FPS focus is robbing us of the next-gen experience. How much graphical fidelity has been lost just so a game can run at 60?
For Call of Duty, sports, racing and fast, twitchy games like Returnal, yeah, 60fps is desirable. But if you're playing Uncharted, Tomb Raider, The Last of Us, The Order: 1886, Ratchet and Clank, etc., the studio is actually going for that filmic look. Why do you think they put chromatic aberration, motion blur, FILM GRAIN, water splashing onto the lens and other camera effects into their games? It's because they don't want it to LOOK like a video game, and I shouldn't feel like I'm at the arcade.
Agreed, the 60 FPS focus is robbing us of the next-gen experience. How much graphical fidelity has been lost just so a game can run at 60?
Yes, the fps autism and insecurity are kicking in. They can't avoid it. It's stronger than them.
Ultimately, you do you. I prefer 60fps whenever possible, but 30 is perfectly playable and enjoyable. It’s fun seeing how many people are personally offended by your post.
> So yeah, to summarise: I think Spidey on PS5 just feels better at 30fps: greater sense of momentum, greater emotional response, a more aesthetically pleasing image, and it's more artistically in keeping with the content. And I personally prefer to trade off some responsiveness and visual comfort in favour of that.

You could say the same for Control UE, which I really wanted to play with RT. Because I have a PS5, you want to use its features, right?
It doesn't matter.
All the pussies were throwing up when they watched Spider-Man: Into the Spider-Verse. They prolly also throw up and cry playing VR games.
> Lost a couple brain cells there

How many did you start with?
> Motion Blur = Bad

I disagree with this; it depends on the implementation. High-quality per-object motion blur is excellent even in 60fps games. I just think most people still have nightmares from the PS3/360 era, where everything was a blurry smear at 20fps.
> 30fps = Bad

It's not necessarily bad (depending on the game and how it's handled). It's just worse than 60.
> No matter what information is provided to people like you, no matter how much we try to convince you with data on why your post is the most retarded shit on this board today, you still believe you made a great post.

Preferring visuals over framerate is wrongthink. Got it.
> It's not necessarily bad (depending on the game and how it's handled). It's just worse than 60.

How is it worse than 60, though? Usually there's a reason why it's 30fps, right? Usually there are extra effects, higher resolution, better lighting, shadows, etc., so there's typically a reason why someone chooses 30fps.
> I also played Spider-Man and MM at 30fps so I could have all the bells and whistles. It doesn't have the same choppiness that other games have at 30fps, like DS or TLOU2. And as long as that's the case, I will always go with quality mode unless I'm dependent on quick reactions, like in an MP game or a soulslike.

TLOU2 was choppy? Huh? TLOU2 has some of the best motion blur ever, and it's a rock-solid 30fps without pacing issues.
> When I'm playing The Witcher 3, my GPU is able to display between 40 and 60 fps, depending on the complexity and detail on screen. I cannot stand big frame-rate leaps, so I capped it at 45 fps. (I have an adaptive-sync monitor, which can therefore handle that.) The result is utterly satisfying, like the perfect mix between fluid animation and cinematic feel. And I don't notice the occasional drops, even though I know there are some.

Are you sure you have VRR set correctly? A 40-60 VRR range is good for that purpose. I used a 4K 40-60 VRR screen for years, and VRR saved that thing.
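For anyone curious why a 45fps cap inside a 40-60 VRR window feels smooth, the frame-time arithmetic can be sketched like this (illustrative numbers only; `frame_time_ms` is just a hypothetical helper, not anything from a game or driver):

```python
# Frame-time arithmetic behind capping a 40-60 fps game at 45 fps (illustrative sketch).

def frame_time_ms(fps: float) -> float:
    """Time each frame stays on screen, in milliseconds."""
    return 1000.0 / fps

# Uncapped, the game swings between these per-frame times:
t40 = frame_time_ms(40)        # 25.0 ms
t60 = frame_time_ms(60)        # ~16.7 ms
swing_uncapped = t40 - t60     # ~8.3 ms of frame-time variation

# Capped at 45 fps, only dips below the cap change the frame time at all,
# and the worst-case swing (down to the 40 fps VRR floor) is much smaller:
t45 = frame_time_ms(45)        # ~22.2 ms
swing_capped = t40 - t45       # ~2.8 ms worst case

print(f"uncapped swing: {swing_uncapped:.1f} ms, capped swing: {swing_capped:.1f} ms")
```

So the cap trades a little average smoothness for far more consistent frame delivery, which is exactly the "no big leaps" effect described above.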
> 30 fps never looks better than 60 fps. People need to stop trying to delude themselves into believing otherwise.

Or maybe he truly just prefers 30fps.
> Are you sure you have VRR set correctly? A 40-60 VRR range is good for that purpose. I used a 4K 40-60 VRR screen for years, and VRR saved that thing.

Yes, it works fine, but big frame-rate leaps are still noticeable, which I can't say about small variations.