Dust-by-Monday
> Not on a 27 inch monitor…

Still looks worse than 4K on a 27-28 inch monitor.
> I see.

That's not quite what I meant. The Sony X1 Ultimate and XR 8K screens will attempt to enhance the source material to occupy more unique pixels even if it starts at 480p, but they don't need 1:1 dimming zones, as demonstrated by how well 600-800 clusters do at 4K (or even what 30+ zones on their own do); the dimming grid can be much lower order than the 4K or 8K pixel grid.
I suspect the reason 600 zones is still very good is that when you look at the per-channel histogram of a scene, there are rarely situations, even with a star field, where the channels don't share a lot of common base intensity across most of the image. That should let the local dimming zones focus on the delta intensities. I imagine it is far more complicated in practice: it is probably taking the current and two previous frames, minifying them to an image the size of the master backlight drive's zone grid to get a base and a per-zone delta intensity, then using AI to predict the next minified frame's zone intensities and biasing the current frame's zone intensities with that prediction, so zone bloom between successive frames is minimised.
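The minify-and-delta idea above can be sketched as a toy model (pure speculation to illustrate the idea, not Sony's actual algorithm; the zone grid size and blend weight are made-up assumptions):

```python
import numpy as np

def zone_intensities(frame, zones=(24, 32)):
    """Minify a grayscale frame (H x W, values 0-1) down to the backlight
    zone grid by block-averaging the pixels that fall inside each zone."""
    h, w = frame.shape
    zh, zw = zones
    frame = frame[:h - h % zh, :w - w % zw]  # crop to a multiple of the grid
    return frame.reshape(zh, -1, zw, frame.shape[1] // zw).mean(axis=(1, 3))

def smoothed_levels(frames, history_weight=0.3):
    """Split each frame's zone image into a common base intensity plus a
    per-zone delta, then blend with the previous frame's drive levels so
    zones don't flare between successive frames (weights are invented)."""
    prev, levels = None, []
    for f in frames:
        z = zone_intensities(f)
        base = z.min()             # intensity shared across the whole image
        level = base + (z - base)  # each zone adds only its delta on top
        if prev is not None:
            level = (1 - history_weight) * level + history_weight * prev
        prev = level
        levels.append(level)
    return levels
```

A real set would presumably predict the next frame rather than just lag-filter the current one, but the structure (minify, split out the base, temporally bias) is the same.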
My point about the Z9K needing at least 3200 zones/clusters to maintain the same level of controlled local dimming at 8K as the Z9D still sounds right to me. Unless the comparison in the video I linked is completely false, even counting with the coarser 3x3 unique clusters/zones, I still think the Z9K is going to have about 5k-10k zones, or it's going to be a huge disappointment for what it costs and how it has been marketed as a big step up from the Z9D.
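The 3200 figure is just pixel arithmetic: 8K has exactly four times the pixels of 4K, so holding the pixels-per-zone density constant quadruples the zone count (the ~800-zone figure for the Z9D class is an assumption taken from the cluster counts discussed above):

```python
# 8K has exactly 4x the pixels of 4K, so keeping the same
# pixels-per-dimming-zone density means 4x the zones.
pixels_4k = 3840 * 2160
pixels_8k = 7680 * 4320
scale = pixels_8k / pixels_4k            # 4.0

zones_z9d = 800                          # assumed Z9D-class zone count
zones_8k_equivalent = zones_z9d * scale  # 3200
```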
It will be. As soon and Microsoft and Sony finish converting everyone to upscaling to be honest. 1440p to 4k is the zone to upscale to achieve minimal quality loss. Anything beyond that is very noticeable. When the herd is properly used to upscaling you will see a quiet standard implemented to upscale games that are too demanding to naturally push 4k.I wish 1440 was standard for consoles now and all that extra horsepower went into the games.
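A rough way to quantify why 1440p to 4K is the gentle case: it asks the upscaler to invent a smaller share of the output pixels than a 1080p start does (pure pixel counting; real upscalers like DLSS/FSR reconstruct detail temporally, so this is only a first-order comparison):

```python
def upscale_share(src, dst):
    """Fraction of the output pixels the upscaler has to invent,
    given (width, height) of the source render and the display."""
    return 1 - (src[0] * src[1]) / (dst[0] * dst[1])

# 1440p -> 4K: about 56% of the 4K pixels are reconstructed
print(upscale_share((2560, 1440), (3840, 2160)))
# 1080p -> 4K: 75% of the output has to be invented
print(upscale_share((1920, 1080), (3840, 2160)))
```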
> Must be awesome to play Cyberpunk on a 32:9! Could you post a pic?

Only game I play that doesn't support it is Fall Guys, but I think that's more to keep things fair.
There are going to be issues with older games; some for sure don't like the aspect ratio. Stuff like Dragon Age: Inquisition looks horrible.
You're going to need hardware that can do 4K gaming at your needed fps. Something like Destiny runs at about 120ish fps on my 3080 Ti / Ryzen 7 5800X. Stuff like Cyberpunk runs at 60-70 maxed; you can drop some settings if you need it faster.
I'm not surprised at all by the cost-cutting zone counts since the release of the Z9D (for 4K HDR). Every 5-10 years or so, when there's a TV paradigm shift (Full HD -> 3D -> 4K -> 4K HDR -> 8K HDR -> 8K cognitive processing -> 8K mini-LED), Sony seemingly launches a new flagship TV that costs more to make than they want at the entry size, probably subsidised by a marketing budget, and then proceeds to add cheaper features on the follow-ups (processing, an HDMI spec bump, a refresh rate option, etc.) while subbing out the luxury build and the high-cost parts where possible. They usually try using more DSP, which is a free Moore's-law gain on a follow-up model, to offset a lot of the image quality losses from, say, dropping a quantum dot panel or a master backlight drive feature. I suspect Sony's focus is only on the top end of the LCD and OLED lines, and they would happily let someone else make their non-flagship screens, including the X95K, to their specs and have them pay for using the design and name.
Well, look at it this way: the Z9K doesn't cost any more than the Z9J did, and they're adding mini-LED and many more zones, so at the very least it's more performance for the money.
My guess is around 2000 zones or less, but we will see when the reviews FINALLY come out because wow it's taking a long time!
~2000 zones with Sony's superior algorithm should be amazing, and better than anything else for accurate blooming control. Sony likes to say it's not how many zones you have but how you use them, which is another reason I would be shocked if the Z9K had 5000 zones. They have a point (once you see how badly the 2000-zone LG mini-LED performs), but at the same time the number of zones they've been using in their 4K TVs has been really disappointing, the Z9D aside.
> framedrop?

dat F R A M E D R O P
What games can you play at native 4K without framerate issues? My PC is struggling even at 1440p. DLSS and G-Sync save the day, though.
I swear we're going through this every decade or so:
-1280x1024 is too much, 800x600 is fine;
-1920x1080 is a waste, 1680x1050 is just as good;
-4K is pointless, 1440p is the sweet-spot;
-8K...
> Must be awesome to play Cyberpunk on a 32:9! Could you post a pic?

Will do, I'll try to get it tonight after work.
I decided to postpone the super ultra wide purchase for now, a bit too pricey and it would’ve just ended with a new graphics card too, my PC is way weaker than what you have.
I went with a cheap (in comparison) 1440p 27” 16:9 screen instead, nothing to go wow over at all but it still feels like an upgrade due to the bigger screen size and modern screen tech. Been playing some Cyberpunk and it’s really pushing what my PC can handle. Native res is out of the question, hovers around 55-65fps in crowded areas with RT Ultra if I use DLSS Quality, without DLSS it’s like 30fps…
> At this point I'm convinced it's just 100% weirdos who sit way too close to their screen and get off on having a really clear view of the 10% of their screen they can actually see.

I don't think it's the same at all, though. There really are big diminishing returns: I can tell a bigger difference between 1080p and 1440p than I can between 1440p and 4K, even though 1440p to 4K is the greater pixel multiplier.
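The multipliers in question, for reference (simple pixel counts):

```python
p1080 = 1920 * 1080   # 2,073,600 pixels
p1440 = 2560 * 1440   # 3,686,400 pixels
p4k   = 3840 * 2160   # 8,294,400 pixels

print(p1440 / p1080)  # ~1.78x going 1080p -> 1440p
print(p4k / p1440)    # 2.25x going 1440p -> 4K
```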
> My 3090 goes through 4K like a knife through hot butter

dat F R A M E D R O P
> I agree if you are playing on a 27" monitor. Different on a 65" TV.

I am an older gamer at 30 years old, with an extra 25 years of experience and old eyes, and when sitting side by side, even at 27" native 4K shines.
> My 3090 goes through 4K like a knife through hot butter

Not at max settings with ray tracing it's not… I know because I have one too. Keep lying to yourself.
> Not at max settings with ray tracing it's not… I know because I have one too. Keep lying to yourself.

Max settings are for old/undemanding games. On current games it's just wasted performance, except for a few settings and textures.
> Not at max settings with ray tracing it's not… I know because I have one too. Keep lying to yourself.

I run COD Cold War at 4K max settings with ray tracing included just fine. Not sure what else to tell you.
> I run COD Cold War at 4K max settings with ray tracing included just fine. Not sure what else to tell you.

How's your PC run Metro Exodus Enhanced, out of curiosity? Do you target 60 or 120 in games typically?
> How's your PC run Metro Exodus Enhanced, out of curiosity? Do you target 60 or 120 in games typically?

I've never played Metro, but I only have a 60 Hz monitor (with FreeSync/G-Sync).
> I've never played Metro, but I only have a 60 Hz monitor (with FreeSync/G-Sync).

Oh man, yeah, I bet you are cutting through 4K like butter!
> I've never played Metro, but I only have a 60 Hz monitor (with FreeSync/G-Sync).

Upgrade time, you're missing out. 60 Hz is wacko.
> Upgrade time, you're missing out. 60 Hz is wacko.

To be honest with you, 30 fps doesn't even bother me lol. I always choose more eye candy over frame rate.
> Upgrade time, you're missing out. 60 Hz is wacko.

I do agree, Celcius, you should get a nice 120 Hz OLED haha.
> VRS is a thing nowadays.

Haven't tried that. Some kind of tunnel vision?
The parts of the scene that need to be 4K are rendered at full 4K quality, and when the hardware can't keep up, the parts of the scene the player isn't focused on are rendered at a lower quality. Best of both worlds.
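For what it's worth, the idea can be sketched in a few lines (a toy software model only; real variable rate shading, e.g. in Direct3D 12, sets a hardware shading rate per screen tile rather than looping like this, and the focus/radius/coarse parameters here are invented for illustration):

```python
import numpy as np

def shade_with_vrs(scene, focus, size=(64, 64), radius=8, coarse=4):
    """Shade per-pixel inside `radius` of the focus point; elsewhere shade
    once per `coarse` x `coarse` block and reuse the value. `scene(y, x)`
    stands in for an expensive shading function. Assumes the image size
    is a multiple of `coarse`."""
    h, w = size
    out = np.zeros((h, w))
    fy, fx = focus
    for y in range(0, h, coarse):
        for x in range(0, w, coarse):
            if abs(y - fy) <= radius and abs(x - fx) <= radius:
                # Full rate: every pixel in the block gets its own shade.
                for yy in range(y, y + coarse):
                    for xx in range(x, x + coarse):
                        out[yy, xx] = scene(yy, xx)
            else:
                # Coarse rate: one shade, smeared across the whole block.
                out[y:y + coarse, x:x + coarse] = scene(y, x)
    return out
```

The win is that the expensive `scene` calls drop by nearly the coarse factor squared outside the focus region, which is exactly the trade the post describes.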
> It's been a few weeks since I changed my previous PC monitor(s) to a 42-inch LG C2. When sitting this close to a screen this big, anything less than 4K doesn't look so great.

Yeah, now we're talking. THAT I can understand; at that point I would want 4K too, and it's been a long time since I had 20/20 vision.
> No idea what you did there or if it's Imgur, but that looks like 240p on this end lol.

Click the link, then delete everything after and including the #.
Don’t screen cap, just take a photo with your phone showing the beast sitting on the desk.
Used to play on three 24" screens; it's the bezel-free look that makes a 32:9 screen so appealing to me.