
1440p is overrated by people who can't go up to 4k

FUBARx89

Member
Native 1080p or 1440p looks like ass on a 4K screen.

I still remember how amazing Diablo 3 looked on my native 1440p Dell UltraSharp near launch.

And how good MotorStorm looked on my native 720p flatscreen TV at the PS3 launch. Or how good Casino Royale looks on a native 1080p Panasonic plasma.

4K is overrated as fuck.
The only good things imo are HDR and VRR.

HDR is good... if the dev team isn't useless at implementing it.
 

nemiroff

Gold Member
The OP reads like a joke or a troll to me, so I'm not going to put a lot of effort into my reply.

I just want to add that for many, 4K is a total waste of processing power, especially for people who prefer smaller monitors, for example. Balance is needed, so image detail/quality and framerate should be at the top of the priority list for most people.
 
Last edited:

b0uncyfr0

Member
'Going from 1080p to 1440p is a 78% increase'

Too bad it doesn't look 78% better...

'more accessible than ever with the powerful GPUs available in the markets and technologies like DLSS to not compromise 60fps'

You need some serious hardware/dosh to get a consistent 4K at 60fps. 1440p (with upscaling tech) feels better.

Also think about this: if 4K was viable from the jump, we wouldn't have upscaling tech. Why didn't upscaling kick in at 1080p 10 years ago, or at 1440p 5 years ago?

We realised native 4K is a very tough sell for the hardware we have, and there are better ways to get there.
 

yamaci17

Member
A 1200p internal resolution upscaled to 4K, with 4K hints+LODs+assets, looks better than native 1440p while performing similarly.

All it needs is more VRAM, and that's it.
 
Last edited:

StueyDuck

Member
Image reconstruction methods like FSR and DLSS will remain king. The most performance I can get from my components while still outputting a higher res will always be the best option.
 

Allandor

Member
Oh no, not the blind guys telling me you can't tell the difference with a 400% resolution increase while they settled for a 78% increase.

Logic out the window.
There is always a point where the difference is really, really small. At current screen diagonals (especially monitors) it makes almost no difference. Going from 1440p to 4K you have to fill and calculate about double the pixels, and your hardware can only do so much. So if you calculate more pixels you might get more sharpness, but you might also have to sacrifice detail or performance to hit your target.
There are games where 30fps might be enough, but especially at TV sizes, 30fps and fast movement can get really ugly.

Guess why DLSS, FSR, etc. exist. Getting things running at 4K by brute force is just too expensive in most cases. Yes, you can do 6K or 8K, but then your game looks like, e.g., The Touryst. It's still a good-looking game, but only from a style perspective.


People should really stop bragging about things that people should want ;)
 
Didn't know you were an idiot, OP, and everyone else who agrees. Google pixel density. People here with tiny-ass monitors running at 4K, lmao. What a waste. If you game on a huge-ass TV/monitor 4+ meters away from it, sure, that's fine, but most gamers have 24 to 27 inch monitors and sit a few inches away from them.

Also, here are the GPUs most PC gamers have: https://store.steampowered.com/hwsurvey/videocard/ I'm sure the five people who play games at 4K will be thrilled to have you defending them, OP.
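The pixel-density point boils down to a one-line formula: PPI = √(w² + h²) / diagonal inches. A quick sketch in Python (the 27" monitor size is just the example from the thread):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch of a display given its resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# A 27" monitor at 1440p vs 4K:
print(round(ppi(2560, 1440, 27)))  # ~109 PPI
print(round(ppi(3840, 2160, 27)))  # ~163 PPI
```

Whether that extra density is visible depends on viewing distance, which is exactly the argument above.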
 
Last edited:

Crayon

Member
Talking about 4K with that confidence when in reality you play like 2.5K native. Praise DLSS and FSR for your fake 4K.

Yeah that part got me. Going all high horse on people who "can't go up to 4k" and then turning around and saying to use dlss.
 

Mr.Phoenix

Member
Nope. Native 4K is a waste of resources, and that's just fact.

1080p = 2.07M pixels
1440p = 3.69M pixels (1.78x as many as 1080p)
4K = 8.29M pixels (4x as many as 1080p, 2.25x as many as 1440p)

Can you sincerely see a 2.25x visual improvement when looking at 4K vs 1440p? This is made even harder to justify by reconstruction techniques on their Quality preset, which render internally at 1440p but output at 4K.
 
Sure, it looks better. However, it doesn't look better enough to justify not playing at 120fps, which I can do all day long in most titles at 1440p with a 3080.

I don't think anyone is going to argue that it doesn't look better, but I play my games, not just look at them. So yes, 1440p is the sweet spot for me.

Edit: You also shot yourself in the foot by talking about DLSS, as it doesn't render at native 4K. It drops its internal res and then upscales. Doesn't it usually render internally at less than 1440p too? Interesting if so.
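The pixel counts above are straightforward to verify (nothing here beyond the resolutions already quoted in the thread):

```python
# Pixel counts behind the 1080p / 1440p / 4K comparison.
RES = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

pixels = {name: w * h for name, (w, h) in RES.items()}
for name, count in pixels.items():
    print(f"{name}: {count / 1e6:.2f}M pixels "
          f"({count / pixels['1080p']:.2f}x 1080p)")
# 1080p: 2.07M (1.00x), 1440p: 3.69M (1.78x), 4K: 8.29M (4.00x)
```

4K works out to exactly 2.25x the pixels of 1440p, which is where the "2.25x visual improvement" framing comes from.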
 
Last edited:

Klik

Member
I have a 1440p/165Hz monitor, and my second monitor is 4K/60Hz.

I don't see a big difference in resolution between 1440p and 4K on a 27" monitor. But I see a HUGE difference in my fps, lol. I get about 50% more fps at 1440p, not to mention it's much easier to stay above 100fps.

I guess with a 32"-40" monitor/TV there would be a bigger difference, but on 27" it's still not worth it.


I will keep my 1440p until the RTX 6xxx release, probably around 2026.
 
Last edited:

Drew1440

Member
It's nice to play last-gen games that can be scaled to 4K and see the little details you missed, but for current-gen games 1440p is a good compromise for performance.
 

Dunnas

Member
Not sure if anybody mentioned it in posts 51-100 (since I don't care enough to read them), but your figures in the OP are wrong. You can't compare 78% and 400%. It's either 78% and 300% more pixels than 1080p, or 178% and 400% as many pixels as 1080p.
 

AV

We ain't outta here in ten minutes, we won't need no rocket to fly through space
1440p is a great tradeoff for frames, but I just didn't enjoy it. Everything looked noticeably less clear to me, so I went back to 4K; I just waited for a panel (a TV...) that also did 120Hz.
 
Nah OP, it's 8K.


 

Mr.Phoenix

Member
What's funny here is that if you put two TVs side by side, one running native 4K with TAA and the other running 4K DLSS/FSR2 Quality, a majority of people will pick the reconstructed image as the native res, because it resolves finer details better while maintaining the fidelity of native 4K.

There is a reason why image reconstruction is the single biggest advancement GPUs have made in the last 5 years. Even more relevant than RT.

I remember this exact thread but 720p/1080p
What about 720p with DLSS/FSR2 to 1080p?

I doubt that when that 720p/1080p conversation was being had, image reconstruction tech was anywhere near where it's at now.
 
Last edited:

8BiTw0LF

Banned
What a stupid hot take OP.

I have a 4K 27" monitor and it's good for productivity (moving very close to it to see fine details - approx. 5cm/2" away), but for gaming, 27" at 1440p, with its nearly perfect pixel ratio, is way better than playing 4K on a 55".
 

ZehDon

Gold Member
Imagine making a boast thread about being stuck in 16:9 resolutions from over a decade ago.

4K is for little children using their daddy's Dell work laptop to play Overwatch. I game in 32:9 (5120×1440 at 240Hz) and I could probably stand to go a little wider. 4K DLSS? Might as well play on a console like a fucking peasant.

Obviously sarcasm, but I do actually game at 32:9
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
3440x1440p144

When 5120x2160 144Hz panels are more affordable, I'll jump on that.
 

OZ9000

Banned
I like 4K as much as the next guy, however the barrier to 4K gaming is extremely high.

Do I really want to spend £1,000 just to play 4K games at 60FPS? I don't particularly care to waste that much money on a pointless activity such as video games.

As a compromise I would settle for DLSS Performance 4K60.
 

yamaci17

Member
I like 4K as much as the next guy, however the barrier to 4K gaming is extremely high.

Do I really want to spend £1,000 just to play 4K games at 60FPS? I don't particularly care to waste that much money on a pointless activity such as video games.

As a compromise I would settle for DLSS Performance 4K60.
That's the problem: "native 1440p" users are unable to understand that even 4K DLSS Performance, despite having an internal 1080p rendering resolution, stomps native 1440p image quality.

It achieves this with comparable performance, so there really is no point in getting a native 1440p screen and outputting at 1440p. The only thing upscaled 4K demands is more memory.

I can understand fickle 3080 10GB / 4070 Ti 12GB owners defending their precious native 1440p, however, considering their VRAM budget buckles even with 4K DLSS Performance + ray tracing.

Upscaled 4K, even if the internal res is around 1080p, still demands high amounts of VRAM. Rightfully so.
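For context on the internal resolutions being argued about: DLSS 2's commonly cited per-axis scale factors are roughly 0.667 (Quality), 0.58 (Balanced), and 0.5 (Performance). A quick sketch of what those mean at a 4K output (the exact factors may vary per game or DLSS version):

```python
# Approximate DLSS 2 per-axis scale factors for a 4K (2160p) output.
SCALES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}

OUTPUT_H = 2160  # 4K output height
for mode, s in SCALES.items():
    print(f"{mode}: internal ~{round(OUTPUT_H * s)}p")
# Quality: ~1440p, Balanced: ~1253p, Performance: ~1080p
```

This is where the thread's "4K DLSS Performance renders at 1080p" and "Balanced is ~1250p" figures come from.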
 
Last edited:

OZ9000

Banned
That's the problem: "native 1440p" users are unable to understand that even 4K DLSS Performance, despite having an internal 1080p rendering resolution, stomps native 1440p image quality.

It achieves this with comparable performance, so there really is no point in getting a native 1440p screen and outputting at 1440p. The only thing upscaled 4K demands is more memory.

I can understand fickle 3080 10GB / 4070 Ti 12GB owners defending their precious native 1440p, however, considering their VRAM budget buckles even with 4K DLSS Performance + ray tracing.

Upscaled 4K, even if the internal res is around 1080p, still demands high amounts of VRAM. Rightfully so.
To get 4K60 you need an RTX 4080 at minimum. The lowest price for this card is £1,200 in the UK.

Spending that much on a single component for video games alone is a hard sell for me.

I would be happy to spend up to £600 on such a GPU, but not a penny more than that.
 
Last edited:

RyRy93

Member
Native 4K is in the realm of massively diminishing returns: complete overkill when 1440p still looks great while allowing vastly improved performance.
 

Mattyp

Gold Member
Truth. Native 4K144 is supreme if your system can handle it. I've never been able to go back to anything below 4K, the clarity is too shit.
 

yamaci17

Member
Sure, it looks better. However, it doesn't look better enough to justify not playing at 120fps, which I can do all day long in most titles at 1440p with a 3080.

I don't think anyone is going to argue that it doesn't look better, but I play my games, not just look at them. So yes, 1440p is the sweet spot for me.

Edit: You also shot yourself in the foot by talking about DLSS, as it doesn't render at native 4K. It drops its internal res and then upscales. Doesn't it usually render internally at less than 1440p too? Interesting if so.
4K DLSS Balanced (internal ~1250p) DESTROYS and decimates so-called native "1440p". I'm serious, it simply destroys it. Anyone who says otherwise either doesn't have a 4K screen to observe/experience it, or just can't come to terms with it. They don't understand that DLSS rendering at 1250p and upscaling to 4K produces much better image quality than what they're getting on their native 1440p screens. Just because DLSS renders internally at 1250/1440p, they think going 4K is a farce and that they get similar image quality (the coping part).



This video sums it up. It's not even a comparison. 1440p LODs+hints+assets look ugly, horrible, and blurry compared to 4K LODs+hints+assets, despite the internal resolution being "muh 1250p", which the DLSS/4K haters keep blabbering about.



The 1440p one simply does not get the same 4K textures and LODs. It never will, unless you enforce DSR 2.25x and force the game to run at 4K, at which point, just go for the 4K panel.
 

dext3rr

Member
I own a 65" LG CX OLED. Sitting about 3.5m from the TV, there's not much difference between 4K and 1440p at that distance. I always prefer 1440p 60fps over 4K 30fps.
 
4K DLSS Balanced (internal ~1250p) DESTROYS and decimates so-called native "1440p". I'm serious, it simply destroys it. Anyone who says otherwise either doesn't have a 4K screen to observe/experience it, or just can't come to terms with it. They don't understand that DLSS rendering at 1250p and upscaling to 4K produces much better image quality than what they're getting on their native 1440p screens. Just because DLSS renders internally at 1250/1440p, they think going 4K is a farce and that they get similar image quality (the coping part).



This video sums it up. It's not even a comparison. 1440p LODs+hints+assets look ugly, horrible, and blurry compared to 4K LODs+hints+assets, despite the internal resolution being "muh 1250p", which the DLSS/4K haters keep blabbering about.



The 1440p one simply does not get the same 4K textures and LODs. It never will, unless you enforce DSR 2.25x and force the game to run at 4K, at which point, just go for the 4K panel.


Dumb comparison, as the native one does not use sharpening filters like DLSS does internally. They don't look better, at least not as hyperbolically as you make it out to be.
 

yamaci17

Member
Dumb comparison, as the native one does not use sharpening filters like DLSS does internally. They don't look better, at least not as hyperbolically as you make it out to be.
keep coping with the inferior 1440p output

you will never understand or experience what 4K resolve/hints/LODs mean for visuals with your ignorance
 
Last edited:

yamaci17

Member
If you seriously think the much better resolved detail that can be observed in the video is related to any kind of sharpening, you're the one being an idiot here. Sharpening is just a band-aid; it cannot help a renderer resolve finer details. 4K DLSS Balanced clearly and observably (unless you're an idiot) resolves finer details than 1440p can ever hope to achieve.
 