
PC/Console gamers are focusing on the wrong things - 1440p/HFR trumps everything (right now).

Poppyseed

Member
I know a lot of you out there understand this already, but for those who don't: this 4K obsession is something that needs to go away, at least for the time being.

You see, I just built a new PC around an i7-10700K and an RTX 3080. I have a 4K/60 display and a 1440p/144Hz display. Which would I rather game on? The 1440p display, by a mile. In fact, everyone I've shown these two displays to running Doom, Forza etc. prefers the 1440p display, and in motion they're convinced the 1440p display is the 4K display. Why? Because a 1440p display running at over 100fps is so much sharper and clearer than a 4K display running at 60fps. You can spot people more easily as you're panning back and forth in Doom/PUBG etc., the corners of the track in Dirt Rally 2.0 are easier to pick out thanks to the reduction in blur, and the reduced input latency at 144fps is unbelievable - frame time drops to roughly 6.9ms, whereas at 60fps it's 16.7ms. The lag reduction is real, and you can absolutely feel it. Would I rather have 4K at 144fps? Absolutely! But we're not there on the latest games quite yet, and at 4K, even on a 3080, some titles see frame rates drop way too much.
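To put rough numbers on that: frame time is just 1000ms divided by the frame rate. A quick sketch (this is per-frame display time only, not the full input-to-photon latency chain):

```python
# Frame time (ms) at common refresh rates - display time per frame,
# not the whole input-to-photon latency chain.
for fps in (30, 60, 120, 144, 240):
    print(f"{fps:>3} fps -> {1000 / fps:6.2f} ms per frame")
# 60 fps -> 16.67 ms; 144 fps -> 6.94 ms
```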

Much has been made of the 3080 being the first "true" 4K/60 card out there. Why is that a major achievement? The best thing about the 3080 is that it's the first true HFR graphics card for 1440p with max graphics options turned on, and you just have to experience it to understand it.

In addition, it seems this upcoming console generation is going to be focusing on 4K over higher frame rates, but I really hope every game has the option of 60+ FPS at 1440p, because there's no doubt in my mind this is the sweet spot. 1440p/60 trumps 4K/30 every day of the week.

Thanks for reading.
 

PhoenixTank

Member
I tend to go for pretty high fps with my 1080Ti @1440p. 4K is better able to stress a 3080 than 1440p, even without CPU limits. More for your money in a sense, which probably goes a fair way towards the "true 4K60" reputation.
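For a rough sense of why 4K stresses the GPU so much more, here's the raw pixel math (just a back-of-the-envelope sketch):

```python
# Pixels the GPU has to shade every frame at each resolution.
pixels_1440p = 2560 * 1440   # 3,686,400
pixels_4k = 3840 * 2160      # 8,294,400
print(pixels_4k / pixels_1440p)  # 2.25 - 4K is 2.25x the per-frame work
```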
 

Neo_game

Member
The best graphics this gen are probably going to be around 1440p/30fps or 1080p/60fps. Personally, I don't see any problem with that. 4K definitely has advantages, but IMO it shouldn't be the priority, as the compromise isn't worth it. I think most games are going to ship with performance and quality modes so people can choose.
 

Armorian

Banned
Yeah, this is great and all, but when you have new games that drop to ~45 FPS on a 9900K (so essentially the same as a 10700K), you can only dream about it.



With Ryzens as the base for the next-gen consoles, getting anything above 60 FPS will be difficult in games built around 30 FPS on console.
 
As you say, you are preaching to the choir. We've already had polls here which show that members overwhelmingly prefer 1440p60 or even 1080p60 over 4K30. 4K is just a stupid buzzword for the casuals. Most developers know that it's a waste, and it's already showing, with Assassin's Creed and Watch Dogs doing "4K" upscaled from 1440p.

But expect next gen open world games to run at around 1440p30.
 

Tschumi

Member
I've been saying this for weeks: 4K60 was highlighted as a key piece of evidence of next-gen gaming, but really it's beyond the tech right now. You can see that in the games coming out - they look worse than last-gen 1440p games. Halo Infinite is actually the best-looking 4K60 game we've seen thus far, imo, and it is by no means a good-looking game. Heck, it's a technical accomplishment lol when you consider what they're aiming for. 1440p is the way to go. Fake 4K is undoubtedly going to look better than true 4K for the next few years.
 
I've been gaming on 144hz freesync monitors since I started PC gaming.

I have one that's 1080p, and my current one is 1440p. I would go back to my old 1080p 144hz monitor before I go 4K60.
 
Yes, that's a perfectly valid opinion. Personally - and I play both PC and console - I have no interest whatsoever in playing above 60 fps.

I always play on my TV with a controller regardless, and maybe I don't play games that benefit that much from HFR, I don't know.

I do think native 4k is a waste of resources compared to a lower resolution with more eye candy and good reconstruction though. And while 30 fps looks and feels like shit on PC, it can often be a great experience on console for me - at least every Sony exclusive IMO.

If anything, I think that animations can look super weird at 60 fps, almost like you're using that shitty interpolation mode on TVs. Maybe it has to do with the motion capture, I don't know. Or looking like that fucking atrocity The Hobbit at 48 fps, which is the most visually unappealing thing I've seen.

Also, particle effects and such can sometimes stick out like a sore thumb and be jarring above 30 fps, I think. Don't know how to explain it...

But I really do love the cinematic look (yeah, laugh it up boys) rather than having a super crisp, sharp image without optical effects. I absolutely love shallow Depth of Field (DOF) or film grain if it's well made.

The more it resembles a movie, the better. Even the feel of 60 fps is awkward to me in some console games.

For example, The Last of Us remastered on PS4 looks and feels "cheap" at 60 fps but is amazing at 30 fps for me.

Then again, Dark Souls 3 is kind of awful on PS4 and a great experience at 60 fps on PC. So it's almost on a game-to-game basis for me. But there's just something about 30 fps on PC that never feels as smooth, responsive or pleasant as it does in the PS4 Sony exclusives.

So for me, it's really not black or white, and 30 fps experiences definitely have a place for me on console. The best thing here is choice - if you'd rather have a higher framerate at lower visual fidelity, be my guest, but I would almost never make the same choice.

And going above 60 fps, no way. For me personally there's zero benefit, and it's a total waste of resources that could make the game look much better. 🙂
 

Zenaku

Member
I loved my 28" 1440p 144hz monitor. But I love my 55" 4k 60fps LG C7 more.

When I gave up multiplayer and stopped giving a damn about my (player) performance, I realised that the extra resolution, clarity and beauty of HDR over a 20-60 hour game far outweighed the few instances where I felt 'damn, could've done better there with a higher refresh rate'. By a mile.

The beauty of PCs is the opportunity for the user to customize to their own preferences. There's no such thing as a 'must-have feature/config', as it'll always vary from person to person. And that's fine.
 

THE DUCK

voted poster of the decade by bots
This may in fact be true, though as we push the consoles harder, sometimes raw pixel output and graphics fidelity trump frame rate.

That said, there are many lists showing which TVs this year do 4K 120Hz, but are there any showing which do 1440p 120Hz? (I assume it's more models.)
 

baphomet

Member
Not the latest games for the last 6 years, let’s be honest here...

Even my 1080Ti couldn’t do that consistently.

Yes, the latest games for the past 6 years at 1440p/90+.

You think they were releasing 1440/gsync monitors just for fun back then?
 

BluRayHiDef

Banned
4K is requisite for large displays. 1440p is inadequate for displays that are 40" or larger because pixels are discernible on displays that are this large.

I game on a 55" display, so using 1440p natively is out of the question, as it looks noticeably inferior to 4K at this screen size.

Luckily, thanks to the brilliant software engineering of Nvidia, there's this technology that's called Deep Learning Super Sampling (DLSS) that uses artificial intelligence to predict how a frame is supposed to look at a targeted resolution, renders that frame at a lower resolution, and finally upscales it to the target resolution, thereby enabling frame rates that are consistent with the lower resolution.

Hence, a graphically demanding game such as Control can be rendered at 1080p and then upscaled to 4K via DLSS, which results in framerates that are consistent with the former and image quality that is consistent with the latter.
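Roughly, the idea looks like the sketch below. This is purely conceptual, not Nvidia's actual implementation - the renderer and the "upscaler" here are stand-in stubs (the real thing is a trained network fed motion vectors and frame history):

```python
import numpy as np

RENDER_RES = (1080, 1920)   # internal render resolution (rows, cols)
TARGET_RES = (2160, 3840)   # output resolution shown on screen

def render_frame(rows, cols):
    # Stand-in for the game's renderer: the cost scales with this resolution.
    return np.random.rand(rows, cols, 3).astype(np.float32)

def reconstruct(low_res, target_res):
    # Stand-in for the trained upscaler: here it just repeats pixels 2x2.
    scale = target_res[0] // low_res.shape[0]
    return np.repeat(np.repeat(low_res, scale, axis=0), scale, axis=1)

low = render_frame(*RENDER_RES)      # pay roughly 1080p rendering cost
out = reconstruct(low, TARGET_RES)   # present a 4K-sized image
print(low.shape, "->", out.shape)    # (1080, 1920, 3) -> (2160, 3840, 3)
```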

That powerful PC hardware is wasted on traditionally sized, small displays while relatively weak consoles are paired with large displays is nonsense, and DLSS is the key to ending this nonsense.
 

Poppyseed

Member
Yes, the latest games for the past 6 years at 1440p/90+.

You think they were releasing 1440/gsync monitors just for fun back then?

90fps does not equal 144fps. And sure - you could lower the resolution/detail etc. to hit higher frames. But we're now at a point where we can play *most* games at 144fps at 1440p with pretty much everything turned on (except the joke/waste of resources known as RT).
 

Poppyseed

Member
4K is requisite for large displays. 1440p is inadequate for displays that are 40" or larger because pixels are discernible on displays that are this large.

I game on a 55" display, so using 1440p natively is out of the question, as it looks noticeably inferior to 4K at this screen size.

Luckily, thanks to the brilliant software engineering of Nvidia, there's this technology that's called Deep Learning Super Sampling (DLSS) that uses artificial intelligence to predict how a frame is supposed to look at a targeted resolution, renders that frame at a lower resolution, and finally upscales it to the target resolution, thereby enabling frame rates that are consistent with the lower resolution.

Hence, a graphically demanding game such as Control can be rendered at 1080p and then upscaled to 4K via DLSS, which results in framerates that are consistent with the former and image quality that is consistent with the latter.

That powerful PC hardware is wasted on traditionally sized, small displays while relatively weak consoles are paired with large displays is nonsense, and DLSS is the key to ending this nonsense.

DLSS isn't ready for the mainstream yet, clearly. And if you're a console gamer you're likely playing games checkerboard-rendered to 4K. I'm betting 99% of people couldn't tell the difference between a 1440p image and a 4K image on a 55" screen at normal viewing distance.
 

ReLaxative

Neo Member
I use 1080p 144Hz, and if I need a sharper image, plenty of games support super-sampling.
With that said, I think DLSS is the future, so it doesn't matter much what resolution you're running at.
Refresh rate is what I'd personally pay attention to these days.
 

yurqqa

Member
You are just playing the genres that benefit from 120 fps. It's the way to go in FPS games (especially competitive ones), fighting games and maybe racing games.

But I agree that 4K is just a waste of resources. So some games will benefit from 120 Hz and some from better graphics - that would be great.

But don't talk about 120 Hz as if it's a must for every kind of game. It's not.
 
4K is requisite for large displays. 1440p is inadequate for displays that are 40" or larger because pixels are discernible on displays that are this large.

I game on a 55" display, so using 1440p natively is out of the question, as it looks noticeably inferior to 4K at this screen size.

Luckily, thanks to the brilliant software engineering of Nvidia, there's this technology that's called Deep Learning Super Sampling (DLSS) that uses artificial intelligence to predict how a frame is supposed to look at a targeted resolution, renders that frame at a lower resolution, and finally upscales it to the target resolution, thereby enabling frame rates that are consistent with the lower resolution.

Hence, a graphically demanding game such as Control can be rendered at 1080p and then upscaled to 4K via DLSS, which results in framerates that are consistent with the former and image quality that is consistent with the latter.

That powerful PC hardware is wasted on traditionally sized, small displays while relatively weak consoles are paired with large displays is nonsense, and DLSS is the key to ending this nonsense.

And if I don't wanna play Control?
 

Geki-D

Banned
Considering that already this gen, the way to really see a difference between native 4K & upscaled was to do this:
[image: extreme zoomed-in screenshot comparison]


I'd say we should be pretty safe with upscaled games running at high FPS. I'm not even that bothered by FPS but the whole "MUST BE NATIVE 4K!!" thing is dumb when upscalers are so effective. Devs would do best to use that extra power in other areas.
 

rofif

Banned
I don't agree. I still prefer solid 60 and the best graphics for single-player games. I forget 240Hz pretty quickly when I use it, but when I play at a solid 60, or even 30 in some cases, I remember the graphics more. Dark Souls 3 would not be better at 144Hz, but 4K helps with its AA and therefore its atmosphere.
Death Stranding would not be any better at 144Hz, but it would have to look worse... Not that it all can't happen. I have a 3080, so DS runs at 4K60 no problem, and 4K60 is now my favourite combo for some games. Especially locked at 58fps for low-lag FreeSync.
 
That's what all the 30 fps warriors don't seem to get: games literally look better at higher frame rates.

The best example is Ratchet & Clank for the PS4. The game looks like absolute dogshit once you start panning the camera. 60 FPS (or even higher) would do wonders for that game.
 
That's what all the 30 fps warriors don't seem to get: games literally look better at higher frame rates.

The best example is Ratchet & Clank for the PS4. The game looks like absolute dogshit once you start panning the camera. 60 FPS (or even higher) would do wonders for that game.

I agree, but I don't think 120 makes a big difference unless you have the proper monitor. Like, I absolutely agree 60 FPS improves the visuals of a game for me, but above 60 you have less and less room to fit graphical effects into each frame - it's why so many games that can easily run at framerates like 120 look so simplistic.
 

Lanrutcon

Member
Legit: a stupidly high framerate or a stupidly high resolution by itself is fairly pointless. If they aren't kept in balance, one ruins the other.

I wouldn't play 4k at 30fps, just like I wouldn't play 480p at 120fps.
 

CrysisFreak

Banned
I hope that if next-gen consoles get successors, or at least spiritual successors, to games like Trackmania Turbo or Resogun, those will at least roll with 120.
 

THE DUCK

voted poster of the decade by bots

The limits of human eyesight want to have a chat with you.

https://carltonbale.com/does-4k-resolution-matter/

What the chart shows is that, for an 84-inch screen, 4K resolution isn't fully apparent until you are 5.5 feet or closer to the screen. For a "tiny" 55-inch screen, you'll need to be 3.5 feet or closer. Needless to say, most consumers aren't going to sit close enough to see any of the extra resolution 4K offers, much less 8K.

It’s important to note that research by Bernard Lechner (former VP of RCA Laboratories) found the average viewing distance of American TV viewers is 9 feet. This is substantially farther than the 5.5 foot distance required to fully resolve normal-sized 4k screens. I don’t imagine people rearranging their living rooms to take advantage of the otherwise unnoticeable UHD resolution benefits.
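For what it's worth, those distances can be reproduced with basic trigonometry, assuming the common rule of thumb of roughly one arcminute of visual acuity per pixel (an approximation, not a hard limit of the eye):

```python
import math

# Distance at which individual 4K pixels stop being resolvable,
# assuming ~1 arcminute of acuity per pixel (rule of thumb, not exact).
def max_useful_distance_feet(diagonal_inches, horizontal_pixels=3840, aspect=(16, 9)):
    w, h = aspect
    width_in = diagonal_inches * w / math.hypot(w, h)  # screen width in inches
    pixel_pitch = width_in / horizontal_pixels         # size of one pixel
    one_arcminute = math.radians(1 / 60)
    return pixel_pitch / math.tan(one_arcminute) / 12  # inches -> feet

print(f'55": ~{max_useful_distance_feet(55):.1f} ft')  # ~3.6 ft
print(f'84": ~{max_useful_distance_feet(84):.1f} ft')  # ~5.5 ft
```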
 

OutRun88

Member
Legit: a stupidly high framerate or a stupidly high resolution by itself is fairly pointless. If they aren't kept in balance, one ruins the other.

I wouldn't play 4k at 30fps, just like I wouldn't play 480p at 120fps.
Exactly.

As to the OP, it depends on what game is being played.

In some instances I prefer 144+ FPS at even 1080p, other times I prefer 4K60.
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
Still shooting for that 4K/120 average. 3080 Ti maybe..?
 

Tchu-Espresso

likes mayo on everthing and can't dance
The limits of human eyesight want to have a chat with you.

https://carltonbale.com/does-4k-resolution-matter/

What the chart shows is that, for an 84-inch screen, 4K resolution isn't fully apparent until you are 5.5 feet or closer to the screen. For a "tiny" 55-inch screen, you'll need to be 3.5 feet or closer. Needless to say, most consumers aren't going to sit close enough to see any of the extra resolution 4K offers, much less 8K.

It’s important to note that research by Bernard Lechner (former VP of RCA Laboratories) found the average viewing distance of American TV viewers is 9 feet. This is substantially farther than the 5.5 foot distance required to fully resolve normal-sized 4k screens. I don’t imagine people rearranging their living rooms to take advantage of the otherwise unnoticeable UHD resolution benefits.
I'm about 2.5m (roughly 8 feet) from my 65-inch TV (sitting much closer would just be awkward) and can definitely see that at this distance I'm not getting the full benefit of 4K. I'm more than happy with developers targeting lower resolutions to prioritise frame rate.
 

rofif

Banned
That's what all the 30 fps warriors don't seem to get: games literally look better at higher frame rates.

The best example is Ratchet & Clank for the PS4. The game looks like absolute dogshit once you start panning the camera. 60 FPS (or even higher) would do wonders for that game.
That's BS. Games don't look better at HFR - it's just motion handling.
I had a 240Hz monitor. Doom 2016 at 60Hz with motion blur looked exactly the same as at 240Hz without motion blur. I had the two monitors side by side.
That's why motion blur is there: done right, it fills in the missing data your brain wants.
240Hz was so fast that the image looked like it had motion blur on anyway, because... wave your damn hand in front of your face... it's not sharp. 30 or 60 without motion blur is just wrong. Motion is not a series of fully static images. Pause a movie mid-motion and the frame will be blurred, because the camera captures over a span of time.
So all the idiots who turn off motion blur in every game as the first thing they do are making their games look worse.
144Hz wasn't fast enough for me to sacrifice motion blur, but at 240Hz I always disabled it.
 

Lethal01

Member
Nope, people are able to test it themselves; they did so, and many preferred higher resolution. This is the point of having options.
 