
[HDTVTest] Sony A95K QD OLED is here. Unboxing + Measurements

Kuranghi

Member
Although obviously tiny/small highlights are better on OLED to begin with, the 10% window figure is a bit disappointing; I hoped it could be at least 1200 nits to get closer to the best LCDs. Even if it did, you'd still have the crappy 25% and 50% window brightness values OLED panels seem bound by, even with this new Samsung Display panel.

Do we really need to wait for MicroLED to get the best of both (better than both, really)? Or can OLED panels be made that remove these ABL limitations? If it's power-draw related, will MicroLED be bound by that too, even if it could do brighter large windows?

I wonder if electroluminescent quantum dots (QDEL) will come before MicroLED and how they will compare. I've watched videos from Nanosys on this but I'm not sure how close it is to reality for general consumers.

I'm not meaning to be a Negative Narinder about this - I'm sure the 90% BT.2020 coverage is awesome in terms of how it could improve the image, but devs rarely use that extra colour volume anyway, so it won't be seen much outside of demos made for it. I want more depth/better, more accurate HDR reproduction really, as that's by far the biggest difference between SDR and HDR for me (when the HDR output is actually really good, which isn't that often, unfortunately). Coming from an SDR direct-lit TV to my current FALD was amazing, and now when I watch something with well-done HDR and have to switch back to SDR it's such a step down once you've seen how good it can look - the new Lego Star Wars game or Jedi: Fallen Order being good examples. The increase in the depth of the image is stunning: 3D without glasses when you're in the dark (and being stoned increases it further lol).

edit: He's saying here it was 5 years from "meaningful display applications" and that was 2017, so I guess at least a few more years from now for proper consumer products? :(




 

kyliethicc

Member
Although obviously tiny/small highlights are better on OLED to begin with, the 10% window figure is a bit disappointing; I hoped it could be at least 1200 nits to get closer to the best LCDs. Even if it did, you'd still have the crappy 25% and 50% window brightness values OLED panels seem bound by, even with this new Samsung Display panel.

Do we really need to wait for MicroLED to get the best of both (better than both, really)? Or can OLED panels be made that remove these ABL limitations? If it's power-draw related, will MicroLED be bound by that too, even if it could do brighter large windows?

I wonder if electroluminescent quantum dots (QDEL) will come before MicroLED and how they will compare. I've watched videos from Nanosys on this but I'm not sure how close it is to reality for general consumers.

I'm not meaning to be a Negative Narinder about this - I'm sure the 90% BT.2020 coverage is awesome in terms of how it could improve the image, but devs rarely use that extra colour volume anyway, so it won't be seen much outside of demos made for it. I want more depth/better, more accurate HDR reproduction really, as that's by far the biggest difference between SDR and HDR for me (when the HDR output is actually really good, which isn't that often, unfortunately). Coming from an SDR direct-lit TV to my current FALD was amazing, and now when I watch something with well-done HDR and have to switch back to SDR it's such a step down once you've seen how good it can look - the new Lego Star Wars game or Jedi: Fallen Order being good examples. The increase in the depth of the image is stunning: 3D without glasses when you're in the dark (and being stoned increases it further lol).

edit: He's saying here it was 5 years from "meaningful display applications" and that was 2017, so I guess at least a few more years from now for proper consumer products? :(






Most people would probably have issues watching a TV with over 1000 nits HDR in a dark room.

I don’t find many normal people saying their TV doesn’t get bright enough.

Anything around 1000 nits peak is more than good enough for 95% of people.
 

Kuranghi

Member
Guys, guys, guys.

Come round and see my ZD9/Z9D so you don't just think I'm mental going on about large screen brightness/HDR impact of my telly, okay?

 

Kuranghi

Member
Most people would probably have issues watching a TV with over 1000 nits HDR in a dark room.

I don’t find many normal people saying their TV doesn’t get bright enough.

Anything around 1000 nits peak is more than good enough for 95% of people.

Caaaamm annn, don't give me that silliness man, you know more about tellys/HDR than that, surely?

That first sentence makes no sense: my ZD9 doesn't constantly put out 1600 nits, or even 1000 nits, in a dark room. Small portions of the image could be that bright, but probably only for a short time or in the centre of very small highlights. When I'm talking about 25% and 50% windows being ~1000 nits, it would be up to the content producer to make sure that was used properly and/or for an amount of time that isn't unpleasant to watch.

I agree 1000 nits full field in a dark room would be eye-searingly painful.
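As a rough illustration of the power-budget idea behind ABL: if total light output (and hence drive power and heat) scales with luminance times lit area and the panel has a fixed budget, the achievable peak falls as the window grows, while tiny highlights hit a per-pixel drive limit instead. A minimal sketch with made-up placeholder numbers (FULL_FIELD_NITS and PER_PIXEL_CAP are assumptions, not measurements of any real panel):

```python
# Back-of-the-envelope sketch of why ABL bites on large bright windows.
# Assumption: total light output (and hence drive power / heat) scales roughly
# with luminance * lit area, and the panel has a fixed power budget.
# All numbers are made-up placeholders, not measurements of any real panel.

FULL_FIELD_NITS = 200.0   # assumed luminance the power budget allows at a 100% window
PER_PIXEL_CAP = 1000.0    # assumed per-pixel drive limit for small highlights

def window_ceiling(window_fraction: float) -> float:
    """Approximate peak nits the power budget allows for a given window size."""
    power_limited = FULL_FIELD_NITS / window_fraction
    return min(power_limited, PER_PIXEL_CAP)

for w in (0.02, 0.10, 0.25, 0.50, 1.00):
    print(f"{w:>4.0%} window -> ~{window_ceiling(w):.0f} nits ceiling")
```

Under those assumed numbers, small windows are capped by per-pixel drive while 25%, 50% and full-field windows fall off roughly with 1/area, which matches the general pattern measured on OLEDs even if the exact figures differ per panel.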
 

DeepEnigma

Gold Member
Same QD OLED panel as the S95B.

Bit lower peak brightness than the Samsung, ~960 vs ~1070 nits.

Has a heatsink tho, so it clears up image retention faster.

Excellent color gamut.

8 ms input lag for 120 Hz.

It accepted a 1080p 144 Hz signal but he’s not sure if it works correctly or is even intended to.
Bolded is a Sony TV staple.

Early tests are sounding good.
 

kyliethicc

Member
Caaaamm annn, don't give me that silliness man, you know more about tellys/HDR than that, surely?

That first sentence makes no sense: my ZD9 doesn't constantly put out 1600 nits, or even 1000 nits, in a dark room. Small portions of the image could be that bright, but probably only for a short time or in the centre of very small highlights. When I'm talking about 25% and 50% windows being ~1000 nits, it would be up to the content producer to make sure that was used properly and/or for an amount of time that isn't unpleasant to watch.

I agree 1000 nits full field in a dark room would be eye-searingly painful.
Talk to normies. They often say HDR is too bright in a dark room, and they're usually looking at like just 500 nits peak.
 

Kuranghi

Member
Talk to normies. They often say HDR is too bright in a dark room, and they're usually looking at like just 500 nits peak.

Ahh okay, got you. I'm thinking that's because they have their TV in submental mode most of the time though; if I set my ZD9 to Vivid then it's completely unwatchable in a dark room with SDR as well, but you shouldn't do that anyway.

If I see anyone here setting their TV to Vivid/Dynamic I'll give their botties a good tanning. Especially if they're beautiful females with nice big spankable arses. If they're men then I'll still do it but I won't enjoy it.

My experience is that people who own any panel type say dark scenes in HDR are unwatchable outside of a dark room, when set up for accuracy. I can't see HDR properly at all even with just one window uncurtained in my living room in sunny Scotland (this is sarcasm, it's very dull here a lot of the time).

So, when are you coming over to mine? I've got fancy beers/booze and the devil's lettuce ready to go; we can watch 3 straight hours of HDR demos :messenger_tears_of_joy:
 
Although obviously tiny/small highlights are better on OLED to begin with, the 10% window figure is a bit disappointing; I hoped it could be at least 1200 nits to get closer to the best LCDs. Even if it did, you'd still have the crappy 25% and 50% window brightness values OLED panels seem bound by, even with this new Samsung Display panel.

Do we really need to wait for MicroLED to get the best of both (better than both, really)? Or can OLED panels be made that remove these ABL limitations? If it's power-draw related, will MicroLED be bound by that too, even if it could do brighter large windows?

I wonder if electroluminescent quantum dots (QDEL) will come before MicroLED and how they will compare. I've watched videos from Nanosys on this but I'm not sure how close it is to reality for general consumers.

I'm not meaning to be a Negative Narinder about this - I'm sure the 90% BT.2020 coverage is awesome in terms of how it could improve the image, but devs rarely use that extra colour volume anyway, so it won't be seen much outside of demos made for it. I want more depth/better, more accurate HDR reproduction really, as that's by far the biggest difference between SDR and HDR for me (when the HDR output is actually really good, which isn't that often, unfortunately). Coming from an SDR direct-lit TV to my current FALD was amazing, and now when I watch something with well-done HDR and have to switch back to SDR it's such a step down once you've seen how good it can look - the new Lego Star Wars game or Jedi: Fallen Order being good examples. The increase in the depth of the image is stunning: 3D without glasses when you're in the dark (and being stoned increases it further lol).

edit: He's saying here it was 5 years from "meaningful display applications" and that was 2017, so I guess at least a few more years from now for proper consumer products? :(





These are 1st-gen QD-OLED panels that can only be manufactured in very specific OLED production factories, with very specific specialised manufacturing machines of which, at the moment, there are very few. Until about a month ago, QD-OLED panels were considered a rarity due to the abysmally low yield of less than 40%. It was after Samsung announced a milestone that allowed them to dramatically improve yields to above 75% that things took a turn for the better. 1st-gen OLED panels also had issues with brightness; it took a while to push the tech enough to improve the quality of the panels. Give them a year or two and things will certainly improve. Nanosys spoke about and presented slides on advanced variants of quantum dot displays, such as self-emissive QD displays, and gave an estimate for when they would arrive; that estimate hasn't been met yet.

Quantum dot display technology is very challenging and tricky to work with due to the nature of quantum dots, which is why certain quantum dot displays, such as self-emissive ones, are still in heavy R&D. QD-OLED will certainly get better, and Samsung already has a head start with the technology and is supplying some big names as well: MSI and Dell for gaming monitors, Sony (at least for now) for TVs. MicroLED won't be seen in retail consumer products until at least the mid-2020s because the process for mounting the LEDs, and the LEDs themselves, is very expensive, which is why they've only been deployed in industrial products (e.g. Sony's CLEDIS).
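For a sense of why that yield jump matters: the effective cost of each sellable panel scales roughly with 1/yield, so going from ~40% to ~75% nearly halves it. A quick illustrative calculation (the substrate cost is an assumed placeholder; only the ratio matters):

```python
# Illustrative effect of panel yield on the cost of each sellable panel.
# The substrate cost is a made-up placeholder; only the ratio matters.

SUBSTRATE_COST = 1000.0  # assumed cost to process one panel's worth of substrate

def cost_per_good_panel(yield_rate: float) -> float:
    """Effective cost of one sellable panel at a given yield."""
    return SUBSTRATE_COST / yield_rate

before = cost_per_good_panel(0.40)  # ~40% yield reported for early QD-OLED runs
after = cost_per_good_panel(0.75)   # ~75% yield after the reported milestone
print(f"Cost per good panel: {before:.0f} -> {after:.0f} ({after / before:.0%} of before)")
```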
 
I'm legitimately dumbstruck as to why no 2022 OLED offers 120 Hz black frame insertion. That automatically makes them all non-options for me.

The LG C1 will have the best OLED motion two years in a row.

For QD-OLED I can understand it, as Samsung AFAIK has not offered 120 Hz BFI on any display yet. But why is LG so incompetent this year? It's just bizarre.

If you're interested in QD-OLED, my advice is to wait for next year. And if you want an LG OLED, get the C1 while you can.
 

dotnotbot

Member
Interesting how much things have changed over time. You just can't evaluate an OLED TV properly without giving it some time to settle in.
 

sankt-Antonio

:^)--?-<
What do you turn down, the OLED Light setting? Brightness? This is for HDR or SDR... or both?
It's called screen brightness on Sony OLEDs I think; I only watched HDR on that set. Peak brightness is only relevant for daytime viewing; at night, 800 nits is more than you would ever need. Our eyes perceive brightness relative to the surrounding light levels.
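One hedged way to put rough numbers on that last point: under Stevens' power law for brightness (exponent of roughly one third), doubling luminance only reads as about 25-30% brighter, and a dark surround makes a modest peak look far more intense than the same peak in a bright room.

```latex
% Stevens' power law (brightness exponent roughly 1/3): perceived brightness B
% grows far more slowly than luminance L, which is why doubling the nits does
% not read as "twice as bright", especially against a dark surround.
\[
  B \propto L^{0.33}
  \qquad\Rightarrow\qquad
  \frac{B(1600\,\mathrm{nits})}{B(800\,\mathrm{nits})} \approx 2^{0.33} \approx 1.26
\]
```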
 

kyliethicc

Member
I'm legitimately dumbstruck as to why no 2022 OLED offers 120 Hz black frame insertion. That automatically makes them all non-options for me.
A niche feature used by very few.

Doesn't seem worth it over the S95B unless you really need that heatsink. Or love Sony products.
Issue with the Samsung is inaccurate picture, even in filmmaker mode. Lotta extra processing.
 

DeepEnigma

Gold Member
What do you turn down, the OLED Light setting? Brightness? This is for HDR or SDR... or both?
It's called 'Brightness'; you can have it as a quick toggle on the menu that pops up (it should be there by default) when you hit the settings button, and you can quickly change it in 10-point increments (10, 20, 30, etc.).

You can fine tune in-between those increments in the display settings main menu itself.

I personally keep modern Sony sets at 20-25 for SDR, and when HDR kicks in it maxes out, but the quick setting can be adjusted on the fly.
 

Reallink

Member
Still hard to believe Sony, who once designed specialized processors for everyone else's electronics, can't build their own video and HDMI processor after 3 or 4 years of eating shit. MediaTek has probably cost them tens or hundreds of thousands of TV sales. I would have bought at least 2 Sony OLEDs by this point if not for that. Can't imagine why anyone would pay their price premium for 1* HDMI port (*eARC eats the second), inferior VRR range (LOL 48 Hz), missing support (Dolby Vision 120), and double the input lag of LG/Samsung.
 

dotnotbot

Member
Still hard to believe Sony, who once designed specialized processors for everyone else's electronics, can't build their own video and HDMI processor.

This processor is fine though. Don't confuse it with the X90H's processor. This one has full 4K@120 Hz support with VRR and all the other features, like the very useful Smooth Gradation.
 

scraz

Member
Well, I'm just happy that after 2 years of patches my Sony X900H actually works perfectly fine at 4K 120 Hz.
 
Although obviously tiny/small highlights are better on OLED to begin with, the 10% window figure is a bit disappointing; I hoped it could be at least 1200 nits to get closer to the best LCDs. Even if it did, you'd still have the crappy 25% and 50% window brightness values OLED panels seem bound by, even with this new Samsung Display panel.

Do we really need to wait for MicroLED to get the best of both (better than both, really)? Or can OLED panels be made that remove these ABL limitations? If it's power-draw related, will MicroLED be bound by that too, even if it could do brighter large windows?

I wonder if electroluminescent quantum dots (QDEL) will come before MicroLED and how they will compare. I've watched videos from Nanosys on this but I'm not sure how close it is to reality for general consumers.

I'm not meaning to be a Negative Narinder about this - I'm sure the 90% BT.2020 coverage is awesome in terms of how it could improve the image, but devs rarely use that extra colour volume anyway, so it won't be seen much outside of demos made for it. I want more depth/better, more accurate HDR reproduction really, as that's by far the biggest difference between SDR and HDR for me (when the HDR output is actually really good, which isn't that often, unfortunately). Coming from an SDR direct-lit TV to my current FALD was amazing, and now when I watch something with well-done HDR and have to switch back to SDR it's such a step down once you've seen how good it can look - the new Lego Star Wars game or Jedi: Fallen Order being good examples. The increase in the depth of the image is stunning: 3D without glasses when you're in the dark (and being stoned increases it further lol).

edit: He's saying here it was 5 years from "meaningful display applications" and that was 2017, so I guess at least a few more years from now for proper consumer products? :(





I understand the sentiment, but peak color luminance is doubled or even tripled across the board. Nit values are only calculated for full white, which is pretty rare in any content aside from things like hockey.

As cool as MicroLED tech is, I don't see it being the next big thing, or else we would've seen a harder push toward consumer viability and not $50k panels years after the tech was announced.

Even though the screen is technically dimmer, I had the same reaction going from a TCL R615 to an A90J. I put the TCL in the bedroom and I refuse to watch anything meaningful on it since the A90J's image quality is miles better.

I was hoping the QD-OLED tech would be more impressive out of the gate, but I'm hoping there is a bigger jump by the time I upgrade in 3-4 years.
 

Reallink

Member
This processor is fine though. Don't confuse it with the X90H's processor. This one has full 4K@120 Hz support with VRR and all the other features, like the very useful Smooth Gradation.

The halving or quartering of resolution they had to do to support 120 Hz was criminal (literally, civil/consumer lawsuit territory), and while that was definitely the crown jewel of Mediashit's garbage SoC's problems, note I did not mention it in my list of grievances. Everything I pointed out is still present in Sony's entire 2022 range: only 1 usable HDMI 2.1 port, double the input lag of LG/Samsung, a VRR minimum of 48 Hz (as opposed to 40 Hz for Samsung/LG, a HUGE deal for everything but PS5), and no DV 120 support.

As cool as MicroLED tech is, I don't see it being the next big thing, or else we would've seen a harder push toward consumer viability and not $50k panels years after the tech was announced.

MicroLED is the end game; they just can't solve the manufacturing challenges. They sell every $100k+ wall they can make, they simply can't figure out how to produce them at scale. I think it's probable that direct-current emissive quantum dot displays will take over TVs while MicroLEDs will be relegated to the luxury home theater market (currently dominated by five-figure JVC and Sony front projectors).
 

dotnotbot

Member
The halving or quartering of resolution they had to do to support 120 Hz was criminal (literally, civil/consumer lawsuit territory), and while that was definitely the crown jewel of Mediashit's garbage SoC's problems, note I did not mention it in my list of grievances. Everything I pointed out is still present in Sony's entire 2022 range: only 1 usable HDMI 2.1 port, double the input lag of LG/Samsung, a VRR minimum of 48 Hz (as opposed to 40 Hz for Samsung/LG, a HUGE deal for everything but PS5), and no DV 120 support.

Well:
- Higher input lag, but also the best gradient handling on the market; Vincent mentions that in every Sony OLED review, and the difference is really noticeable. 8/16 ms (120/60 Hz) lag is still very good and low enough territory for me (only about one frame), so I don't care.
- DV 120 has horrible color banding on LG OLEDs
- It supports all the features you need for PS5, and that VRR range should be good enough for PC gaming as well

And there are all the other advantages of Sony's processing and out-of-box accuracy.
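For context on the "only one frame" point, the quoted 8/16 ms figures line up almost exactly with a single refresh at 120/60 Hz. A trivial check:

```python
# Quick check that 8/16 ms of input lag is roughly a single refresh.
for refresh_hz, lag_ms in ((120, 8), (60, 16)):
    frame_ms = 1000 / refresh_hz
    print(f"{refresh_hz} Hz: frame = {frame_ms:.1f} ms, "
          f"measured lag = {lag_ms} ms (~{lag_ms / frame_ms:.2f} frames)")
```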
 

kyliethicc

Member
Well:
- Higher input lag, but also the best gradient handling on the market; Vincent mentions that in every Sony OLED review, and the difference is really noticeable. 8/16 ms (120/60 Hz) lag is still very good and low enough territory for me (only about one frame), so I don't care.
- DV 120 has horrible color banding on LG OLEDs
- It supports all the features you need for PS5, and that VRR range should be good enough for PC gaming as well

And there are all the other advantages of Sony's processing and out-of-box accuracy.
Exactly. None of those issues matter for PS5 users and you get all the benefits of Sony’s TVs.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Sitting here with a frikken VA ultrawide monitor from 2021 that is under the HDR600 specification.
I ain't gonna buy another screen till ultrawide 4K QDUXPOLEDs cost ~500 dollars.

TV tech moves so fast and makes me feel so out of date constantly that I've just resigned myself to being happy with what I've got for now.


<--- Lies gonna get a [the best technology] big screen for the living room when I move to a bigger apartment


P.S. What would the standard ultrawide after ultrawide 1440p actually be called?

2560x1080 - WFHD (ultrawide of 1080p 16:9 FHD)
3440x1440 - WQHD (ultrawide of 1440p 16:9 QHD, "2K")
5120x2160 - WUHD (ultrawide of 2160p 16:9 UHD, "4K")

So shouldn't the 5120x2160 resolution be WUHD, aka ultrawide 4K?
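A quick sanity check of the naming logic: each ultrawide mode keeps the vertical resolution of its 16:9 counterpart and widens it to roughly 21:9, so 5120x2160 is the ultrawide sibling of UHD 3840x2160, hence WUHD.

```python
# Each ultrawide mode keeps the vertical resolution of its 16:9 sibling and
# widens it to roughly 21:9 (exactly 64:27 for WFHD/WUHD).
from math import gcd

modes = {
    "WFHD": ((2560, 1080), (1920, 1080)),  # ultrawide counterpart of FHD
    "WQHD": ((3440, 1440), (2560, 1440)),  # ultrawide counterpart of QHD
    "WUHD": ((5120, 2160), (3840, 2160)),  # ultrawide counterpart of UHD ("4K")
}

for name, ((uw_w, uw_h), (base_w, base_h)) in modes.items():
    g = gcd(uw_w, uw_h)
    print(f"{name}: {uw_w}x{uw_h} = {uw_w // g}:{uw_h // g} "
          f"(~{uw_w / uw_h:.2f}:1), same height as {base_w}x{base_h}")
```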
 

kyliethicc

Member
Sitting here with a frikken VA ultrawide monitor from 2021 that is under the HDR600 specification.
I ain't gonna buy another screen till ultrawide 4K QDUXPOLEDs cost ~500 dollars.

TV tech moves so fast and makes me feel so out of date constantly that I've just resigned myself to being happy with what I've got for now.


<--- Lies gonna get a [the best technology] big screen for the living room when I move to a bigger apartment


P.S. What would the standard ultrawide after ultrawide 1440p actually be called?

2560x1080 - WFHD (ultrawide of 1080p 16:9 FHD)
3440x1440 - WQHD (ultrawide of 1440p 16:9 QHD, "2K")
5120x2160 - WUHD (ultrawide of 2160p 16:9 UHD, "4K")

So shouldn't the 5120x2160 resolution be WUHD, aka ultrawide 4K?
This is a TV thread lol

keep the weirdo shit outta here
 

Neofire

Member
Close is relative. Right now Samsung's game mode is wildly inaccurate. I wouldn't even consider buying it until it's fixed.

[Chart: EOTF tracking in Samsung game mode]
In that case one would come out much cheaper, with similar performance, by going for the A90J. The A95K is just way too expensive, especially given the nuanced differences between it, the A90J, and still (IMO) the S95B, even with the S95B's inaccuracy while playing games.
 
I'm legitimately dumbstruck as to why no 2022 OLED offers 120 Hz black frame insertion. That automatically makes them all non-options for me.

The LG C1 will have the best OLED motion two years in a row.

For QD-OLED I can understand it, as Samsung AFAIK has not offered 120 Hz BFI on any display yet. But why is LG so incompetent this year? It's just bizarre.

If you're interested in QD-OLED, my advice is to wait for next year. And if you want an LG OLED, get the C1 while you can.
Exactly.
 

Salz01

Member
I really want to know if auto dimming occurs while playing games as it does on the other Sony OLEDs.
 

dotnotbot

Member
In that case one would come out much cheaper, with similar performance, by going for the A90J. The A95K is just way too expensive, especially given the nuanced differences between it, the A90J, and still (IMO) the S95B, even with the S95B's inaccuracy while playing games.

For enthusiasts, the worst quirks the A90J had were inherent "features" of LG's WRGB OLED panels:
- white subpixel overshoot
- "Venetian Blinds" effect
- uniformity issues like serious pink tint

So the A95K is for those who don't really care about the price but want that A90J-on-steroids experience with an improved panel. If you want Sony's goodness for a reasonable price, then the A80K/A75K/A80J should be a much better choice.
 

Kuranghi

Member
It's called 'Brightness'; you can have it as a quick toggle on the menu that pops up (it should be there by default) when you hit the settings button, and you can quickly change it in 10-point increments (10, 20, 30, etc.).

You can fine tune in-between those increments in the display settings main menu itself.

I personally keep modern Sony sets at 20-25 for SDR, and when HDR kicks in it maxes out, but the quick setting can be adjusted on the fly.

That's the digital black level though, not the "backlight" brightness (I know it has no backlight). On an LG OLED you change "OLED Light" if you want to reduce the overall brightness; if you lower/increase the "Brightness" value from its default of 50 you are crushing/blowing out shadow detail. Or do you mean "OLED Light"?

It's called screen brightness on Sony OLEDs I think; I only watched HDR on that set. Peak brightness is only relevant for daytime viewing; at night, 800 nits is more than you would ever need. Our eyes perceive brightness relative to the surrounding light levels.

Yes, on Sonys "Brightness" is the backlight level, "Contrast" is the digital white level and "Black Level" is the digital black level. Samsung and LG have confusing names for the digital black level.

The bolded part is not true in any way; if it were, why would content be mastered to 1000, 4000 and 10,000 nits? You need a higher and higher top end so you don't have to tonemap the image down to the capabilities of the display. I don't necessarily see more detail on my ZD9 with its 1500-nit peak brightness vs. an 800-nit OLED (because the tonemapping will/should work with the display's capabilities, as said above), but I do see a more faithful representation of the image in highlight detail (not in darker/shadowed areas near bright areas, LCD's big weakness). So when I'm playing Horizon Forbidden West and looking at a fire/the sun on my ZD9 vs. my OLED, it looks more striking and realistic, like an actual fire on the screen.

A lot of streaming movie and TV content is actually only mastered to/limited to 1000 nits, so that's a lot of my ZD9's brightness capability being left on the table, but games have way more freedom in this regard; in most games you can achieve peaks of up to 4,000 or even 10,000 nits, so outside of really dark (LCD-challenging) scenes I much prefer playing HDR games on my ZD9 over the OLED.
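To make the tone-mapping point concrete: when content is mastered above what a panel can show, the set has to roll highlights off toward its own peak, and a brighter panel keeps more separation between, say, a 4,000-nit and a 10,000-nit highlight. A minimal sketch of a generic soft roll-off (illustrative only, not the curve any specific TV or standard uses):

```python
# Illustrative highlight roll-off: map mastered luminance onto a display peak.
# This is a generic soft-knee curve, NOT the tone mapping any particular TV uses.

def roll_off(nits_in: float, display_peak: float, knee: float = 0.75) -> float:
    """Pass values below the knee straight through; compress the rest toward the peak."""
    knee_nits = knee * display_peak
    if nits_in <= knee_nits:
        return nits_in
    excess = nits_in - knee_nits
    headroom = display_peak - knee_nits
    # Asymptotically approach display_peak for very bright mastered values.
    return knee_nits + headroom * excess / (excess + headroom)

# e.g. an ~800-nit OLED vs. a brighter FALD LCD, fed 500/1000/4000/10000-nit highlights
for peak in (800, 1500):
    mapped = ", ".join(f"{roll_off(x, peak):.0f}" for x in (500, 1000, 4000, 10000))
    print(f"{peak}-nit display: {mapped}")
```

With these assumed numbers, the 800-nit display squeezes the 4,000- and 10,000-nit highlights into a few nits of each other, while the brighter display keeps visibly more separation, which is the "more faithful highlight detail" point above.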
 

DeepEnigma

Gold Member
That's the digital black level though, not the "backlight" brightness (I know it has no backlight). On an LG OLED you change "OLED Light" if you want to reduce the overall brightness; if you lower/increase the "Brightness" value from its default of 50 you are crushing/blowing out shadow detail. Or do you mean "OLED Light"?



Yes, on Sonys "Brightness" is the backlight level, "Contrast" is the digital white level and "Black Level" is the digital black level. Samsung and LG have confusing names for the digital black level.

The bolded part is not true in any way; if it were, why would content be mastered to 1000, 4000 and 10,000 nits? You need a higher and higher top end so you don't have to tonemap the image down to the capabilities of the display. I don't necessarily see more detail on my ZD9 with its 1500-nit peak brightness vs. an 800-nit OLED (because the tonemapping will/should work with the display's capabilities, as said above), but I do see a more faithful representation of the image in highlight detail (not in darker/shadowed areas near bright areas, LCD's big weakness). So when I'm playing Horizon Forbidden West and looking at a fire/the sun on my ZD9 vs. my OLED, it looks more striking and realistic, like an actual fire on the screen.

A lot of streaming movie and TV content is actually only mastered to/limited to 1000 nits, so that's a lot of my ZD9's brightness capability being left on the table, but games have way more freedom in this regard; in most games you can achieve peaks of up to 4,000 or even 10,000 nits, so outside of really dark (LCD-challenging) scenes I much prefer playing HDR games on my ZD9 over the OLED.
I meant OLED Light; I was going off memory of their LED sets' menus.
 

sankt-Antonio

:^)--?-<
I really want to know if auto dimming occurs while playing games as it does on the other Sony OLEDs.
That's the digital black level though, not the "backlight" brightness (I know it has no backlight). On an LG OLED you change "OLED Light" if you want to reduce the overall brightness; if you lower/increase the "Brightness" value from its default of 50 you are crushing/blowing out shadow detail. Or do you mean "OLED Light"?



Yes, on Sonys "Brightness" is the backlight level, "Contrast" is the digital white level and "Black Level" is the digital black level. Samsung and LG have confusing names for the digital black level.

The bolded part is not true in any way; if it were, why would content be mastered to 1000, 4000 and 10,000 nits? You need a higher and higher top end so you don't have to tonemap the image down to the capabilities of the display. I don't necessarily see more detail on my ZD9 with its 1500-nit peak brightness vs. an 800-nit OLED (because the tonemapping will/should work with the display's capabilities, as said above), but I do see a more faithful representation of the image in highlight detail (not in darker/shadowed areas near bright areas, LCD's big weakness). So when I'm playing Horizon Forbidden West and looking at a fire/the sun on my ZD9 vs. my OLED, it looks more striking and realistic, like an actual fire on the screen.

A lot of streaming movie and TV content is actually only mastered to/limited to 1000 nits, so that's a lot of my ZD9's brightness capability being left on the table, but games have way more freedom in this regard; in most games you can achieve peaks of up to 4,000 or even 10,000 nits, so outside of really dark (LCD-challenging) scenes I much prefer playing HDR games on my ZD9 over the OLED.
Do you stare into the sun in daylight? You don't need to be blinded by a TV at night for convincing HDR.
Anything mastered at 4,000 nits or above is basically "just give it everything the panel's got"; there is no detail in a 4,000-nit frame of the sun, it's literally blinding... not an argument for quality TV viewing.
I'm talking about watching at night, like in a cinema, which is what movies have been made for, for a hundred years now; no projector in a cinema is anywhere near 4,000 nits. It's something you think you want but don't actually want/need (for nighttime viewing). Sony's $80K VPL-GTZ380 projector maxes out at 800-1,000 nits, and it's enough for nighttime viewing.
 