Native 4K 60fps would be idiotic, but most AAA games don't target that. They use tools like DRS and image reconstruction tech to output 4K without taxing the GPU with that full workload. I keep seeing people complain about devs targeting 4K, and they don't seem to realize that the vast majority are not. Dynamic 4K is just smart because you keep the load on the system low while achieving an image that is nearly indistinguishable from native 4K. Win-win.
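For anyone unfamiliar with what DRS actually does under the hood, here's a minimal sketch in Python. The names and thresholds are made up for illustration (this is not any engine's real code), but the control loop is the basic idea: render fewer pixels when the GPU goes over its frame budget, add them back when there's headroom, and let the reconstruction step fill in the 4K output.

```python
# Minimal sketch of a dynamic resolution scaling (DRS) loop.
# Hypothetical names and thresholds; real engines use more sophisticated
# controllers, but the idea is the same: scale the internal render
# resolution so GPU frame time stays under the frame budget.

TARGET_MS = 16.6                    # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0     # e.g. 1080p..2160p on a 4K output

def update_scale(scale, gpu_frame_ms):
    """Nudge the resolution scale toward whatever keeps us on budget."""
    if gpu_frame_ms > TARGET_MS:            # over budget: render fewer pixels
        scale -= 0.05
    elif gpu_frame_ms < TARGET_MS * 0.85:   # comfortably under: add pixels back
        scale += 0.05
    return min(MAX_SCALE, max(MIN_SCALE, scale))

# Simulated GPU frame times over a few frames; the scaled image is then
# reconstructed to the 4K output target, which is why the on-screen result
# stays close to native 4K.
scale = 1.0
for ms in [18.0, 18.0, 15.0, 12.0]:
    scale = update_scale(scale, ms)
print(round(scale, 2))
```

The point of the loop is that the player never sees a fixed internal resolution at all; it floats frame to frame, which is exactly why pixel counts from a single captured frame tell you so little.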
NO DLSS - again, nearly every game released with a 4K target is using some form of image reconstruction to achieve it. DLSS is just one method of reconstruction, and while it's not available on console, many devs have their own custom solution or just use Epic's solution if they're on UE4. FSR is also starting to make the rounds on console. Insomniac's reconstruction for Spider-Man and Ratchet rivals anything out there IMO, and the IQ in those games is stellar whatever the internal resolution really is.
I've said many times that the sheer obsession with pixel counts and frame counts is just sickening to me and completely misses the whole point of what playing a game is about. Ultimately, the quality of the experience, what you see and feel when playing, is what should matter. The way I think about it: how much of the complaining and discussion would go away if devs or media like Digital Foundry never specified a resolution for their games? Ratchet & Clank still looks stellar on a 4K screen in its performance mode. Without knowing a pixel count, how many people would have just enjoyed the game instead of refusing to play that mode because sub-1440p just sounds low in their mind? Returnal is another classic example. Before the analysis came out and said it runs at a 1080p internal resolution (i.e. DLSS performance mode, BTW), many people were saying, based on their eyes while playing, that it looked 4K to them. In fact, most were shocked to hear that it was only a 1080p input. That's everything wrong with gamers today. Who the f**k cares what number the system uses internally? With the sophistication of the upscaling, your eyes see a great-looking image, so what's the problem? Should the number of pixels really matter if you can't actually see it on screen? So be it... but while you're constantly pausing your game to run up to the screen and zoom in to count pixels, I'll just keep playing and enjoying the game.
They focus on 4K, and that's my point. I'm not talking about native only here.
Go look at Guardians of the Galaxy, for example (not that it's a meaningful comparison for a lot of reasons, but just as an example, since you see this everywhere):
"Guardians of the Galaxy PS5 in Quality Mode uses a dynamic resolution with the highest resolution found being
3840x2160 and the lowest resolution found being 2880x1620."
4K focus. You see this in every game, prioritized over additional visual settings or framerate increases. Why is that? Because of the 4K focus. CB is great, but it only works when the input resolution is already high enough or the distance from the screen is far enough, because it lowers the base resolution, and it shows the lower it gets.
If consoles had access to DLSS and kept building on it, or even beyond it, you'd be thinking about 720p internal resolution looking like 1440p, 1080p looking like 4K, and 540p looking like 1080p, especially on a TV.
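For what it's worth, the arithmetic behind those pairings: each one is a 2x upscale per axis, meaning the GPU shades 4x fewer pixels, which is the same ratio DLSS performance mode uses. A quick check (just resolutions, nothing engine-specific):

```python
# Rendered-pixel counts for the internal -> output pairs mentioned above.
# Each pairing is a 2x upscale per axis, i.e. the GPU shades 4x fewer pixels.
pairs = {
    "540p -> 1080p": ((960, 540), (1920, 1080)),
    "720p -> 1440p": ((1280, 720), (2560, 1440)),
    "1080p -> 4K":   ((1920, 1080), (3840, 2160)),
}
for name, ((iw, ih), (ow, oh)) in pairs.items():
    ratio = (ow * oh) / (iw * ih)
    print(f"{name}: renders {iw * ih:,} px, outputs {ow * oh:,} px "
          f"({ratio:.0f}x fewer shaded)")
```

That 4x saving in shaded pixels is the whole argument: it's GPU budget that can go to framerate, RT, or visuals instead.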
That's why even Nintendo is busy with DLSS, and Intel's next GPU will most likely have its DLSS 3.0 equivalent right out of the gate, as Nvidia DLSS engineers are working on it.
People just saying "DLSS is just another DSR" are not seeing the point here. CB works if the resolution detail is high enough, but you still sacrifice detail; DLSS will always work, and work better at it, which is why it's needed in today's games, and it's why nobody on PC would buy AMD GPUs anymore if it weren't for the shortages. Even then, they are practically irrelevant in any metric. Intel could slam AMD out of the market if they don't improve drastically (which they seem to be doing with their next GPUs).
Now imagine that Guardians of the Galaxy at 1080p 60fps, looking like full native 4K at 60fps. Or a performance mode at 720p, looking like 1440p at 120fps (if the CPU allows it). Or better, slam RT on top of it and just hold that 60fps, or go completely nuts with next-gen games by pushing visuals an actual generation forward.
This is why I stated no DLSS as a problem, especially in an age where RT absolutely kills framerate and we are moving towards it, which will affect consoles even more negatively. This is why they should never have gone with AMD; a simple Ampere core at whatever clocks would absolutely have put them in a stable position looking forward. Instead they got the absolute worst deal on the GPU market, one that will cripple them severely for years to come.
About the 60fps remark: while 30fps has been unplayable for me for what, a decade-plus now, if not longer, I don't think consoles should focus on 60fps to start with. It's too much stress for fixed hardware and it limits visual output greatly. Imagine an internal resolution of 720p at 30fps on a PS5 with tensor cores, and how games would look, versus what we got now: 1620p and 60fps. Huge, huge leap in performance.
I think people are in for a rude awakening when next-generation games come out and performance absolutely tanks on these consoles, because they are simply poorly designed for the future (another thing I mentioned on day one already). The focus should have been on the GPU, and everything else should have been secondary. Don't get me wrong, the SSD is needed, but the absolutely insane focus on SSD tech is just a mind-boggling waste of time and budget in my view.
People can count resolutions all they want; they always will. The same goes for framerate and other things. The problem isn't the output resolution. TV tech moves forward and they want to support it; it's the way of getting there where they are failing hard (focus). Focusing on 4K is simply useless, and anybody, including yourself, knows this. I had 1.5k to spend on a new PC screen and could have bought anything I wanted, as I have a 3080, which will drive any resolution really, but I basically ended up going for 3440x1440. Why? Anything above 1440p absolutely annihilates GPU performance for far too small gains, the same way ultra settings on PC are a crapshoot in some games.
And I totally agree with you on the 1080p part. I was a 1080p player with top-end hardware for a long time; the only reason I moved up from 1080p to 1440p was because I wanted a slightly bigger screen. If that weren't a requirement, I would still be sitting at 1080p, as the PPI is only a tad higher on my 3440x1440 setup.
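The "only a tad higher PPI" point checks out if we assume typical panel sizes; the post doesn't say which screens, so the 24" 1080p monitor and 34" ultrawide below are assumptions:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed common panel sizes; the actual screens aren't stated in the post.
print(round(ppi(1920, 1080, 24), 1))   # typical 24" 1080p monitor
print(round(ppi(3440, 1440, 34), 1))   # typical 34" 3440x1440 ultrawide
```

Under those assumptions the ultrawide lands around 20% higher pixel density, a modest bump considering it pushes roughly 2.4x the pixels of 1080p.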
And yes, most people won't be able to see the differences. I tested a 1080p ultrawide resolution on this screen and frankly it looks mighty fine. It just showcases how absolutely overkill all those resolutions are.