
Screw 4k?

Croatoan

They/Them A-10 Warthog
I have a 4k tv with HDR (nice Samsung), and a 144hz 1440p gsync monitor (no HDR). I have tested both extensively with multiple games and I have come to the conclusion that native 4k is pointless. 1440p with DLSS 2.0 looks incredible in Control and other titles with raytracing turned on, and the games run 60+ fps. Even if I just toggle between 1440p and 4k on my tv, the difference is relatively minor and can be fixed with some decent AA. I will say, my 4k tv running a game at 1440p/HDR looks better than my 1440p monitor, which makes sense considering the massive price difference, but honestly, hitting 100+ fps on the monitor with Gsync makes the game "feel" better. It's hard to describe with words, I guess.

IDK guys, I think stuff like RT and high framerates make more sense than 4k 4k 4k. The performance drop from rendering games at native 4k just isn't worth it for the slightly sharper image quality (which I can only barely tell is even there). IMO 1080p<<<<<<<<<1440p<4k.

Lastly, I really hope you console-only people get to witness the absolute glory of HARDWARE Gsync/freesync in some of these new tvs. I am playing PC Days Gone on my 1440p monitor right now instead of the TV because of GSYNC. Playing on a regular tv and dealing with Vsync and screen tearing is just not an option for me anymore, which is a pity cuz my TV has much better colors and black levels.
 

Croatoan

They/Them A-10 Warthog
We say this every time a new standard becomes a standard.... Every. Single. Time.
and I have been right there with it, honestly. Don't get me wrong, I am a graphics whore, but to my eyes I just cannot see much of a difference between 1440p and 4k. Is there a difference? Yeah, but is that difference worth a 30+ fps hit to performance? No

I can max out Control with ray tracing at 1440p and get 60-90fps with DLSS 2.0 (which renders at an even lower resolution than 1440p internally, I think). If I try to do that same thing at 4k I am stuck at around 40fps, which is an unacceptably low framerate (60fps is the bare minimum for me).

I am using a 2080ti btw. When the 4080ti comes out in a few years and PCs can brute-force every game maxed out with RT at 4k/120fps, I won't have a problem with 4k. But right now we are in a transition period and I will take 1440p at 60+ over 4k at sub-60.
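For anyone wondering about the "renders even lower than 1440p" part: the commonly published DLSS 2.0 per-axis scale factors put Quality mode at roughly two-thirds of the output resolution, so it does indeed render below 1440p internally. A rough back-of-the-envelope sketch (the factors are the usual published values and can vary per title, so treat the numbers as estimates):

```python
# Rough estimate of DLSS 2.0 internal render resolutions and raw pixel counts.
# Per-axis scale factors are the commonly cited ones (Quality ~2/3,
# Balanced ~0.58, Performance 0.5); exact values may differ per game.
DLSS_MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
OUTPUTS = {"1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in OUTPUTS.items():
    print(f"{name} native: {w}x{h} ({w * h / 1e6:.1f} MP)")
    for mode, s in DLSS_MODES.items():
        iw, ih = round(w * s), round(h * s)
        print(f"  DLSS {mode}: renders {iw}x{ih} ({iw * ih / 1e6:.1f} MP)")
```

Native 4k also pushes about 2.25x the raw pixels of 1440p (8.3 MP vs 3.7 MP per frame), which is roughly where that 30+ fps hit comes from; and if those factors hold, 4k with DLSS Quality is internally a 1440p render anyway.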
 
Last edited:

jshackles

Gentlemen, we can rebuild it. We have the capability to make the world's first enhanced store. Steam will be that store. Better than it was before.
Given the choice between 1440p at 60fps and 2160p at 30fps, I'll pick 1440p all day every day.

3440x1440 is the real deal though
 

Krappadizzle

Gold Member
and I have been right there with it honestly. Don't get me wrong, I am a graphics whore, but to my eyes I just cannot see much of a difference between 1440p and 4k. Is there a difference? Yeah, but is that difference worth 30+ fps? No
Hey, fair enough, you are entitled to feel that way. But this is just the same thing, different day.

Memory Cards? Put a battery in the system to save games.

HDD? We have memory cards.

720p? 480p is fine and has been the standard forever and is in most homes.

1080p? 720p is fine and we don't need that much resolution anyways.

Et cetera et cetera.

In a few years time it won't be an issue. It's just growing pains.

I don't fundamentally disagree with you, but you playing on PC (I'm primarily a PC gamer too) means you'll always have the choice anyways, so there's no need to raise concern to begin with. You won't be forced to sacrifice framerate (which is king) for resolution. Or at a minimum you'll always have a choice. Consoles almost all have the choice too, so I don't really see the point you're driving at.
 
Last edited:

rofif

Can’t Git Gud
Nah. I was using a 4k monitor for the past 2 years and it was harder for me to go back to 1080p than it was previously to go from 240/144hz to a 4k60 monitor.
Now that I have 48" oled, anything other than 4k or good dlss looks kinda bad.

To be honest, I see how 8k would be great. If I run Dark Souls 3 in 8k it is a slideshow, but it looks like a damn postcard, on top of 4k already looking amazing.
That said - I would be fine if 4k was the final resolution. There is not much more data to be taken from 35mm film either; it resolves at about 4k. 8k would be useful for 70mm film, but that's not that great an advantage.
 
Last edited:

Croatoan

They/Them A-10 Warthog
Hey, fair enough, you are entitled to feel that way. But this is just the same thing, different day.

Memory Cards? Put a battery in the system to save games.

HDD? We have memory cards.

720p? 480p is fine and has been the standard forever and is in most homes.

1080p? 720p is fine and we don't need that much resolution anyways.

Et cetera et cetera.

In a few years time it won't be an issue. It's just growing pains.

I don't fundamentally disagree with you, but you playing on PC (I'm primarily a PC gamer too) means you'll always have the choice anyways, so there's no need to raise concern to begin with. You won't be forced to sacrifice framerate (which is king) for resolution. Or at a minimum you'll always have a choice. Consoles almost all have the choice too, so I don't really see the point you're driving at.
I didn't have much of a point other than to highlight that we are indeed in that transition, and I think 1440p is superior at this point in time. Devs seem to want to push 4k 4k 4k for their games; thankfully many are doing as you say and allowing people the option to choose what they want on console. I hope they continue to do so. I am also not just a PC gamer, I will grab a ps5 this year when they become more available (if they become more available).

Nah. I was using a 4k monitor for the past 2 years and it was harder for me to go back to 1080p than it was previously to go from 240/144hz to a 4k60 monitor.
Now that I have 48" oled, anything other than 4k or good dlss looks kinda bad.

To be honest, I see how 8k would be great. If I run Dark Souls 3 in 8k it is a slideshow, but it looks like a damn postcard, on top of 4k already looking amazing.
That said - I would be fine if 4k was the final resolution. There is not much more data to be taken from 35mm film either; it resolves at about 4k. 8k would be useful for 70mm film, but that's not that great an advantage.
I am not talking about 4k vs 1080p, I am talking 4k vs 1440p at this point in time, with the hardware we have. 1080p is indeed outdated.
 
Last edited:
I mean, it depends on the game. In FFVII Remake there is a huge difference between quality and performance modes. And if anything VRR will reduce the framerate deficit of 4K, because it is likely that most locked-30fps 4K games could actually push 40+ consistently. Of course, that will only matter when the PS5 begins to support it, developers actually offer unlocked quality modes, and I get a TV that supports it lol
 

Cravis

Member
[image: 99UcPnu.jpg]
 

rofif

Can’t Git Gud
I didn't have much of a point other than to highlight that we are indeed in that transition, and I think 1440p is superior at this point in time. Devs seem to want to push 4k 4k 4k for their games; thankfully many are doing as you say and allowing people the option to choose what they want on console. I hope they continue to do so. I am also not just a PC gamer, I will grab a ps5 this year when they become more available (if they become more available).


I am not talking about 4k vs 1080p, I am talking 4k vs 1440p at this point in time, with the hardware we have. 1080p is indeed outdated.
1440p is better than 1080p of course, but I never found it to remove aliasing as much as I wanted (yes, I had a 1440p monitor too).
But at up to 27" it is fine.
 

Excess

Member
Two things: first, 4K vs. 1440p on a monitor less than 32" is going to be a negligible difference, and I'd argue that 4K monitor PC gaming is a waste of money and performance resources. DLSS is a wholly different discussion, so I wouldn't call 1440p a "better" resolution than 4K just because DLSS can look better than native 4K.

Secondly, you can get the best of both worlds with an LG OLED, which has G-Sync up to 120Hz.
 

Croatoan

They/Them A-10 Warthog
1440p is better than 1080p of course, but I never found it to remove aliasing as much as I wanted (yes, I had a 1440p monitor too).
But at up to 27" it is fine.
Ohh you are right, 1440p still needs a bit of AA. 4k IMO also still needs some AA (at least in my tests it did), but you are right that it is "Cleaner" looking. I am sorta shouting at clouds here though as it would be really hard to market 1440p TVs to people. 4k is a buzzword for a reason...it sounds bigger than 1080p.
 

rofif

Can’t Git Gud
Ohh you are right, 1440p still needs a bit of AA. 4k IMO also still needs some AA (at least in my tests it did), but you are right that it is "Cleaner" looking. I am sorta shouting at clouds here though as it would be really hard to market 1440p TVs to people. 4k is a buzzword for a reason...it sounds bigger than 1080p.
Absolutely, 4k still needs AA, but it's easier to manage. Now with DLSS there is no performance drawback to 4k. You can run whatever resolution internally and it will look better than that raw internal res.
 

Allforce

Member
4K is the first time I've been really unable to see a dramatic improvement in TV technology and I've been buying HDTVs since 2002.

Framerate is the biggest jump but strangely nobody sticks that on the box. There's PS4 games that feel like a generational leap forward on PS5 just because they are running at 60fps now.

But put a game running at native 4K and 1080p side by side and I can't even tell when I'm playing.
 
I think going from 1440p to 4k has proven difficult in the same way that going from 720p to Full HD 1080p did.

I think we would have transitioned to 4k 60fps gaming by now if it were not for raytracing. 1080p gaming didn't have to worry about implementing raytracing, HDR and other graphical features, which are kinda throwing a wrench in 'pixel pushing'.

With RDNA 3 and upcoming NVIDIA GPUs, along with newer Zen and Intel CPUs, we will make the full transition to 4k 60fps, HDR, raytracing, DLSS, Freesync/Gsync by 2024-2025. I guess it's the ironing-out-the-kinks phase now (!??🤷‍♂️)
 

ksdixon

Member
People have been saying that graphics don't mean shit for decades. Better framerate, better AI routines. There are times where I feel like we're all just playing HD PS2 games at their core.
 

Croatoan

They/Them A-10 Warthog
Two things: first, 4K vs. 1440p on a monitor less than 32" is going to be a negligible difference, and I'd argue that 4K monitor PC gaming is a waste of money and performance resources. DLSS is a wholly different discussion, so I wouldn't call 1440p a "better" resolution than 4K just because DLSS can look better than native 4K.

Secondly, you can get the best of both worlds with an LG OLED, which has G-Sync up to 120Hz.
I was testing 4k vs 1440p on a 55" 4k tv with and without DLSS 2.0. 4k looked better in every case, but the performance hit was too much for me to stomach given the minor picture upgrades. I settled on 1440p with DLSS 2.0, then started testing tv vs monitor. Gsync with the monitor won out there for me.

Does that LG have software or hardware Gsync?

4K is the first time I've been really unable to see a dramatic improvement in TV technology and I've been buying HDTVs since 2002.

Framerate is the biggest jump but strangely nobody sticks that on the box. There's PS4 games that feel like a generational leap forward on PS5 just because they are running at 60fps now.

But put a game running at native 4K and 1080p side by side and I can't even tell when I'm playing.
Playing Days Gone maxed at 90+ fps feels real good btw. Game is pretty on pc too.
 
Last edited:

Excess

Member
I was testing 4k vs 1440p on a 55" 4k tv with and without DLSS 2.0. 4k looked better in every case, but the performance hit was too much for me to stomach given the minor picture upgrades. I settled on 1440p with DLSS 2.0, then started testing tv vs monitor. Gsync with the monitor won out there for me.

Does that LG have software or hardware Gsync?


Playing Days Gone maxed at 90+ fps feels real good btw. Game is pretty on pc too.
I think that's a fair trade-off if you're pushing your hardware to its limits. Back when I had a 1070, I played AC Odyssey at 1080p on my 55" 4K TV simply for the frame rate. It was worth it to me and why I eventually left console gaming.

From my understanding, the LG OLEDs are G-Sync "compatible", which I think is a rough translation of software implementation.
 

Croatoan

They/Them A-10 Warthog
I think that's a fair trade-off if you're pushing your hardware to its limits. Back when I had a 1070, I played AC Odyssey at 1080p on my 55" 4K TV simply for the frame rate. It was worth it to me and why I eventually left console gaming.

From my understanding, the LG OLEDs are G-Sync "compatible", which I think is a rough translation of software implementation.
Gotcha, and you are correct, "compatible" means software gsync (basically freesync). Here's hoping one day we get hardware Gsync in an affordable TV. I think there is one TV that has hardware gsync, but it costs like 4k or something.
 

hoplie

Member
I don’t know the exact resolution, but in Ratchet and Clank the difference between 4k and 1440p (?) is very, very big. It really depends. But ultimately 4k isn’t enough, though it’s almost optimal.
 

e&e

Banned
Hey, fair enough, you are entitled to feel that way. But this is just the same thing, different day.

Memory Cards? Put a battery in the system to save games.

HDD? We have memory cards.

720p? 480p is fine and has been the standard forever and is in most homes.

1080p? 720p is fine and we don't need that much resolution anyways.

Et cetera et cetera.

In a few years time it won't be an issue. It's just growing pains.

I don't fundamentally disagree with you, but you playing on PC (I'm primarily a PC gamer too) means you'll always have the choice anyways, so there's no need to raise concern to begin with. You won't be forced to sacrifice framerate (which is king) for resolution. Or at a minimum you'll always have a choice. Consoles almost all have the choice too, so I don't really see the point you're driving at.
There are only so many pixels you need on a regular-size screen. Diminishing returns are here. So unless everyone is going to have 100 inch TVs in their rooms, it’s getting pretty pointless…
 
Last edited:

Holammer

Member
I'm on Team 1440p, but I'd love to get an 8k screen just for emulation. It should have the resolution to be able to render amazeball shadow masks for CRT shaders.
 

Excess

Member
Gotcha, and you are correct, "compatible" means software gsync (basically freesync). Here's hoping one day we get hardware Gsync in an affordable TV. I think there is one TV that has hardware gsync, but it costs like 4k or something.
I actually hope Nvidia gives up on G-Sync, considering VRR FreeSync is a royalty-free option. Because of this, I've noticed a trend towards "compatibility" for G-Sync, rather than hardware implementation. But like clockwork, I believe Nvidia has found another way to gouge manufacturers and consumers with G-Sync Ultimate.

I think the TV you're referring to is an ASUS gaming TV, but Linus had one and ended up replacing it once LG implemented G-Sync. That ASUS has G-Sync Ultimate.

 
Last edited:
and I have been right there with it, honestly. Don't get me wrong, I am a graphics whore, but to my eyes I just cannot see much of a difference between 1440p and 4k. Is there a difference? Yeah, but is that difference worth a 30+ fps hit to performance? No

I can max out Control with ray tracing at 1440p and get 60-90fps with DLSS 2.0 (which renders at an even lower resolution than 1440p internally, I think). If I try to do that same thing at 4k I am stuck at around 40fps, which is an unacceptably low framerate (60fps is the bare minimum for me).

I am using a 2080ti btw. When the 4080ti comes out in a few years and PCs can brute-force every game maxed out with RT at 4k/120fps, I won't have a problem with 4k. But right now we are in a transition period and I will take 1440p at 60+ over 4k at sub-60.
Then you, my friend, are not a graphics whore...
 

gypsygib

Member
Most mid- to high-end TVs scale very well, so 1440p already looks pretty close to 4K. On a 4k monitor, though, 1440p will look a tad blurry.
 

Hoddi

Member
I don't particularly mind upscaling on console (since I'll be sitting further away) but I find 1440p absolutely dreadful on 4k displays. It really needs to be something higher (like 1800p) to be acceptable, in my opinion.

1440p is otherwise fine on native monitors. But there's a big difference between running it natively vs. upscaled at 4k even when both displays are 27".
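Part of why 1440p looks rougher on a 4k panel than on a native 1440p monitor is probably the scale factor: 2160/1440 is 1.5, so the image has to be interpolated rather than mapped pixel-for-pixel, while 1080p divides into 2160 cleanly. A tiny sketch of that arithmetic (TV and GPU scalers are obviously smarter than this, so it's only part of the story):

```python
# Vertical scale factor from a few source resolutions to a 3840x2160 panel.
# Integer factors map each source pixel onto a whole block of display pixels;
# non-integer factors force interpolation, which tends to soften the image.
PANEL_HEIGHT = 2160

for src in (1080, 1440, 1800):
    factor = PANEL_HEIGHT / src
    kind = "integer scale" if factor.is_integer() else "non-integer scale (interpolated)"
    print(f"{src}p -> 2160p: x{factor:.2f}  ({kind})")
```

1800p still scales non-integer, but it starts from far more pixels, which is presumably why it holds up better.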
 

RoboFu

One of the green rats
4K is pretty useless to me right now. 1440 is the sweet spot. And no, it’s not like going from 720p to 1080p... the smaller the pixels get, the less obvious the difference.
 

amscanner

Member
If you compare native 4k vs DLSS 2.0 QHD, you can say 4k is pointless. DLSS is crazy.
But when you compare native 4k vs QHD upscaled any other way, it’s still a huge difference.
 

reinking

Gold Member
Monitor (27" Dell S2721DGF). I prefer 1440p. I tried a 4K 27" monitor and it didn't seem to make much difference. TV (65" Sony X900H). I prefer 4K. I can't explain what you are seeing on your Samsung but I notice a significant difference in 1440p vs 4K on my TV. It does depend on what type of game I am playing to decide if I prefer the quality vs performance mode on the TV but that does not mean I am dismissing 4K. Just the sacrifice I am willing to make in some games.
 

SirVoltus

Member
Monitor (27" Dell S2721DGF). I prefer 1440p. I tried a 4K 27" monitor and it didn't seem to make much difference. TV (65" Sony X900H). I prefer 4K. I can't explain what you are seeing on your Samsung but I notice a significant difference in 1440p vs 4K on my TV. It does depend on what type of game I am playing to decide if I prefer the quality vs performance mode on the TV but that does not mean I am dismissing 4K. Just the sacrifice I am willing to make in some games.
That tv has 4 times the contrast ratio of your monitor and has local dimming, hence a much better black level. So yeah, that alone produces better picture quality, especially in HDR.

I also just downgraded, lol, from a Samsung 55Q70R in the living room to a 27" MSI 1440p MAG274QRF. While losing that contrast ratio and black level, I still got better ppi, and it's easier to achieve better colour right in front of my desk now.

The MSI can also accept a 4k input signal from a console, so it will be good when my ps5 arrives later. In the meantime, I am happy with the combo of my old pc (i7 3770k + gtx 1070) and this monitor.
 
Last edited:
The resolution/image quality tradeoff has always been mandated by the hardware available at the time.... 4K will be expected at some point, but right now we are better off (in my opinion) with a slightly lower resolution, some raytracing and 60fps—which is what we see on the PS5 at the moment.
 
I have a 4k tv with HDR (nice Samsung), and a 144hz 1440p gsync monitor (no HDR). I have tested both extensively with multiple games and I have come to the conclusion that native 4k is pointless. 1440p with DLSS 2.0 looks incredible in Control and other titles with raytracing turned on, and the games run 60+ fps. Even if I just toggle between 1440p and 4k on my tv, the difference is relatively minor and can be fixed with some decent AA. I will say, my 4k tv running a game at 1440p/HDR looks better than my 1440p monitor, which makes sense considering the massive price difference, but honestly, hitting 100+ fps on the monitor with Gsync makes the game "feel" better. It's hard to describe with words, I guess.

IDK guys, I think stuff like RT and high framerates make more sense than 4k 4k 4k. The performance drop from rendering games at native 4k just isn't worth it for the slightly sharper image quality (which I can only barely tell is even there). IMO 1080p<<<<<<<<<1440p<4k.

Lastly, I really hope you console-only people get to witness the absolute glory of HARDWARE Gsync/freesync in some of these new tvs. I am playing PC Days Gone on my 1440p monitor right now instead of the TV because of GSYNC. Playing on a regular tv and dealing with Vsync and screen tearing is just not an option for me anymore, which is a pity cuz my TV has much better colors and black levels.
You can use the AA to fix the cutscenes.
 

BadBurger

Is 'That Pure Potato'
I prefer 4K, even checkerboarded, with HDR for most games - especially third-person open worlds. But it depends. I still won't give up 60+ FPS at 1440p with all settings cranked for an FPS.
 

cheststrongwell

my cake, fuck off
Last year, I purchased a 4k tv to use as my PC monitor. The UI in my old-man pc games was just way too small for me. Moved the 4k to the living room and crawled back to my sweet 1080p tv. Sorry, 1080p tv, it will never happen again.
 

radewagon

Member
Say what you will about performance vs. resolution vs. blah blah blah. If time has taught me anything, it's that the best resolution is ALWAYS whatever the native resolution of your screen happens to be.
 

LiquidMetal14

hide your water-based mammals
First and foremost, you really have to appreciate what the new consoles are doing as far as performance options. It's unprecedented, and it's hard to fully grasp because it has not been the norm. And we are back to getting a solid 60 frames per second. And that is speaking specifically of the PS5, which does not have variable refresh rate yet, but almost all of the games, at least the first-party ones, are so well optimized that you don't even need that yet. But that is still a major plus amongst the bevy of new features that the new hardware allows.

And don't get me started on what you can do on a modern computer, especially if you have an Nvidia graphics card. We are going in a really awesome direction when it comes to all these new features and rendering techniques, and I'm telling you right now, whether you are young or old, we are in quite the paradigm shift when it comes to these advancements, which are extending the life of hardware and allowing for more performance on the same platforms. This has been the most exciting time for me, as somebody who observes and follows technology and software, since the mid-90s when 3D polygon graphics started taking off with the PlayStation 1 specifically.
 
Fun fact: 48" 4K = 24" 1080p in pixel density [same pixel size].

27" 2560x1440 monitor looks so much cleaner than even the smallest 4K TV that is running at native 4K.
 

Hoddi

Member
I dunno.

It seems like 4K is an optional bonus for anyone who can afford a 4K/HDR tv. That's not everyone. I'm happy with my pretty 1080 until I get that TV, which is a looong way down a long list of priorities.

I wouldn't sweat it too much. I already have a 4k OLED in the living room but I still ended up moving my PS4 Pro back to my old 1080p plasma.

1440p downsampled to 1080p looks incredibly nice no matter what anyone says. I'd much rather have that than upscaling it to 4k.
 

StreetsofBeige

Gold Member
4k is big.

Even old-ass games like Skyrim and Fallout 4 using boosts and 4k texture mods look wildly better than vanilla BC boosts like 4k on the One X.

Although I don't have a PC that does DLSS. I'm just going on console improvements from 1080p --> 4k.
 

Tschumi

Member
I wouldn't sweat it too much. I already have a 4k OLED in the living room but I still ended up moving my PS4 Pro back to my old 1080p plasma.

1440p downsampled to 1080p looks incredibly nice no matter what anyone says. I'd much rather have that than upscaling it to 4k.
That's really interesting news, I've been so sure that one day a 4k hdr tv would make my games look extra amazing lolll
 

StreetsofBeige

Gold Member
Last year, I purchased a 4k tv to use as my PC monitor. The UI in my old-man pc games was just way too small for me. Moved the 4k to the living room and crawled back to my sweet 1080p tv. Sorry, 1080p tv, it will never happen again.
That's a problem with 4k gaming.

Play old school COD at 720/1080p and the UI and scoreboard are big.

Play recent COD or COD 4 Remastered at 4k and, even on a 65" TV, the UI/scoreboard are tiny as hell. I need to squint half the time. For some reason, the devs don't scale the interface and charts up to be similar in size to older games.

If anyone wants to know what it's like to play 4k tv games with a magnifying glass, play Nowhere Prophet. It's a deck-building game. I couldn't believe how small the cards and fonts were. And I'm on a 65" tv. I don't think the text would even be visible for anyone playing at under 50".
 
Last edited: