That's really interesting news, I've been so sure that one day a 4k hdr tv would make my games look extra amazing lolll
The funny thing about 4k is that it's being pushed at gamers to make it take off.

It had to be said: 4k was a total waste of time. Unless you can get 4k at over 120Hz (which you can, if you want to drop 1000+ on a screen), it's not worth it. And even if you do get one of the fancier monitors with 4k and high refresh at the same time, I'm not sure games or components are even optimized for it.
I tried 4k for about 5 years. Couldn't wait to throw it in the trash and get a 144hz 1440p screen. Now I can actually aim again in first-person shooters and react smoothly to animations in combat games.
Two for 4k.
It's been the exact opposite for me. I upgraded from 24" 1080p to 27" 1440p a few years ago and thought it made basically zero difference. Then I bought a 27" 4k display a few weeks ago and it's been a massive improvement.

On PC, 4K is pointless. I went from a 24” 1440p 60Hz monitor to a 32” 4K 60Hz monitor and there was a difference in visual quality, but a small one. Of all the differences that monitor brought, 4K was the smallest: the larger screen was nice, and G-Sync was by far the biggest improvement. After a couple of days I stopped caring about the small boost in clarity, and on top of that most of my games ran worse. I can't remember what card I had at the time, but I went from managing 60+ FPS in games to playing some at 40-50fps with lowered settings.
I returned that 4K monitor and went with a 144hz 1440p G-Sync monitor. I couldn't give up G-Sync, and I felt 1440p was the sweet spot. Now I can run games anywhere from 60 to 140fps, and even in titles where I can't manage 140fps, running at 70-80fps is still fine.
For consoles… yeah, get a 4K TV. I bought a 55” 4K 120Hz G-Sync TV. It's overkill for consoles, but I'll be able to get the most out of current-gen hardware, and I can even use the TV on my PC if I ever have the hardware to power 4K 120Hz.
I have a 4k tv with HDR (nice Samsung) and a 144hz 1440p G-Sync monitor (no HDR). I have tested both extensively with multiple games, and I have come to the conclusion that native 4k is pointless. 1440p with DLSS 2.0 looks incredible in Control and other titles with raytracing turned on, and the games run 60+ fps. Even if I just toggle between 1440p and 4k on my tv, the noticeable difference is relatively minor and can be fixed with some decent AA. I will say, my 4k tv running a game at 1440p/HDR looks better than my 1440p monitor, which makes sense considering the massive price difference, but honestly, hitting that 100+ fps on the monitor with G-Sync makes the game "feel" better. It's hard to describe in words, I guess.
IDK guys, I think stuff like RT and high framerates make more sense than 4k 4k 4k. The performance drop from rendering games at native 4k just isn't worth the slightly sharper image quality (which I can barely even tell is there). IMO 1080p<<<<<<<<<1440p<4k.
Lastly, I really hope you console-only people get to witness the absolute glory of hardware G-Sync/FreeSync in some of these new TVs. I am playing Days Gone on PC on my 1440p monitor right now instead of the TV because of G-Sync. Playing on a regular TV and dealing with Vsync and screen tearing just isn't an option for me anymore, which is a pity, cuz my TV has much better colors and black levels.
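Some rough numbers behind the resolution ranking above; a quick pixel-count sketch (the dimensions are the standard 16:9 figures, and "GPU load scales with pixel count" is only a rough assumption, not a benchmark):

```python
# Pixels rendered per frame at common gaming resolutions.
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} px ({count / pixels['1080p']:.2f}x 1080p)")

# 4K pushes 2.25x the pixels of 1440p; if GPU load scales roughly
# with pixel count, ~60 fps at 1440p drops toward the 40-50 fps
# range reported above at 4K, before any settings are lowered.
print(f"4K vs 1440p: {pixels['4K'] / pixels['1440p']:.2f}x the pixels")
```

Which also lines up with why DLSS rendering at 1440p and upscaling is so much cheaper than native 4k.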
Hey, fair enough, you are entitled to feel that way. But this is just the same thing, different day.
Memory Cards? Put a battery in the system to save games.
HDD? We have memory cards.
720p? 480p is fine and has been the standard forever and is in the most homes.
1080p? 720p is fine and we don't need that much resolution anyways.
Et cetera et cetera.
Until all the reconstruction techniques out there reach a certain, good level, 4K is the only guaranteed way to have a great picture quality on a 4K display.
Couldn't disagree more. Framerate is king. I can't play anything lower than 60fps. 4k vs 1440p is the same for me as 60fps vs 30fps: you notice the difference when switching between them, but not when you're actually playing.
So 4k 30fps is just as valid as 1440p 60fps when you are immersed in the game and not comparing modes.
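For what it's worth, the raw frame-time arithmetic behind that trade-off (just math, no benchmark data from either side):

```python
# Time each frame stays on screen at common framerate caps.
for fps in (30, 60, 120, 144):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# Going from 60 to 30 fps doubles the frame time (16.7 -> 33.3 ms),
# while 1440p -> 4K only sharpens the image; whether that doubled
# frame time matters mid-game is exactly what's being debated here.
```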
I don't think that's a really good argument; with resolution you just hit diminishing returns, to the point of not seeing a difference anymore once it gets high enough.

We say this every time a new standard becomes a standard... Every. Single. Time.
But it's an imaginary problem for the OP. Playing on PC you have nothing but choice; you don't have to sacrifice framerate for resolution. Consoles are almost all offering the same choice too. Having 4k as a baseline option for textures and resolution will just help games age better as time goes on. And especially because the OP mentions PC, it'll be even less of an issue going forward as new cards drop.
Going from 360p to 1080p is a massive leap in clarity, 1080p to UHD is pretty good too, but not as impactful as the former, to the point where picking 1080p120 over UHD30 is perfectly reasonable.
By the time we can do UHD120, without any visual setting downgrades, 8K will be the "new standard"; at that point it's going to be really hard to tell the difference, unless you've got a massive screen.
I've seen 8K footage on an 80" 8K tv and it was pretty much like looking through a window, but it'd be difficult to say how much better it looks than UHD/5K footage on a 50" UHD TV.
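Whether you could tell those two apart comes down to pixels per degree of visual angle. A rough sketch below; the screen sizes are from the post, but the 2.5 m viewing distance and the ~60 px/degree acuity rule of thumb are my assumptions:

```python
import math

def pixels_per_degree(diag_inches, horiz_pixels, distance_m):
    """Pixels per degree of visual angle for a 16:9 screen."""
    diag_m = diag_inches * 0.0254
    width_m = diag_m * 16 / math.hypot(16, 9)  # 16:9 width from diagonal
    fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return horiz_pixels / fov_deg

# Assumed couch distance of 2.5 m for both screens.
for label, diag, px in [("80-inch 8K", 80, 7680), ("50-inch UHD", 50, 3840)]:
    print(f"{label}: ~{pixels_per_degree(diag, px, 2.5):.0f} px/degree")

# A common rule of thumb puts 20/20 acuity near 60 px/degree, and
# both screens land far above that at this distance -- which would
# explain why the difference is so hard to pin down by eye.
```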
30 fps is fine for me. I prefer 60, of course, but not at the cost of downgrading graphics.