
Mini LED to be Sony's 2024 flagship TV

Mr.Phoenix

Member
These are exciting times!
Not for me... The one thing in this world I want seems to be something no one is trying to tackle. And if you ask me, it has the potential to be the best display tech, period. At the very least, it's the only one that can hit over 100% of Rec.2020, something every other display tech will simply never be able to do.
 
I'm a technical director working on the Samsung booth at CES. They have their large MICRO LED TVs on display.

Most astonishing thing I have ever seen. For sure the OLED replacement one day. Hopefully they can get costs down fast.
As always, it comes down to what average consumers will pay. Sony has demonstrated with their $20,000 studio mastering monitors that you can make the most amazing, perfect displays on Earth if you just invest enough and pass the costs on to the buyers. That said, if you told me I was buying the best TV for the rest of my life and my investment was truly safe, yeah, I'd spend the $20k on a MicroLED TV today. Your move, Samsung.
 

King Dazzar

Member
Ahead of CES 2024 tomorrow, Samsung Display has teased that the 2024 QD-OLED TV panel will achieve a peak brightness of over 3000 nits.

Here is the relevant passage from the press release sent out in anticipation of CES 2024:

- "The TV panel has been upgraded to a brighter third-generation QD-OLED, which will be showcased at CES. This new version utilizes advanced panel drive technology and artificial intelligence (AI) to achieve a maximum brightness of over 3000 nits. This makes it the brightest OLED TV panel available, surpassing previous models. Each RGB color also sees a significant increase in brightness, approximately 50% more compared to last year’s version,"


Welp, the nits war is heating up. Enthusiasts thought OLED couldn't get past 2000 nits; looks like Samsung disagrees 😆 Even if the ABL is aggressive, this TV is still going to be extremely bright. This is completely unexpected; I didn't expect much improvement from OLED this year.
So, I still think it would be interesting to see what it can do beyond peak brightness, with larger window sizes. But when you consider that all that brightness is controllable by individual pixels, I get very excited. However, it's still not quite where it needs to be for me yet. Screen sizes are still limited to 77". And burn-in was a flagged issue with the first generation of QD-OLED, and it's still early in its life. So for me, it's great how it's progressing, but still not quite there yet. Couple of years, though...
 
Even a Chinese-brand mini LED blows your argument out of the water, with Nvidia LDAT to boot.

0.2 ms

So laggy.

Bring that WOLED to 144Hz and it will lose.

RSNPRqd.jpg




So what is? The whole point of modern panels and the technology race is nits.



In a cute ~10-15% window size?

Dune's Arrakis arrival scene, when the Atreides clan comes to the planet for the first time, is basically 95-100% full-screen HDR; on mini LED it's not even a fair match against OLED.



Characters squint? Oh, you'll squint too.



Black crush?

Shadow detail is more important. Look at that $30k reference monitor.

Sony-X95-L-Review-Chasing-OLED-with-Less-Zones-vs-Samsung-TCL-Mini-LED-TVs-14-41-screenshot.png

MiniLED LCDs will also crush details in dark scenes. The worst-case scenario for a MiniLED is stars: local dimming algorithms will either remove the stars or increase the brightness to the point where you see blooming.
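To make that tradeoff concrete, here's a toy sketch (nothing like any vendor's actual firmware; the zone count, backlight leakage and drive strategies are all made up for illustration). Each backlight zone has to pick a single level, so a zone containing one lone star is either driven by the zone average (the star nearly vanishes) or by the zone peak (the black floor around it rises and you get a halo):

```python
# Toy local-dimming sketch: one row of pixels, three tiny "stars" on black.
# Zone count, leakage and strategies are invented purely for illustration.
import numpy as np

pixels = np.zeros(1024)          # target luminance, 0..1
pixels[[100, 500, 900]] = 1.0    # three isolated stars

zones = 8
zone_len = pixels.size // zones

def render(zone_strategy):
    out = np.empty_like(pixels)
    for z in range(zones):
        seg = slice(z * zone_len, (z + 1) * zone_len)
        backlight = zone_strategy(pixels[seg])      # one level for the whole zone
        leakage = 0.001                             # LCD can't block all backlight
        out[seg] = np.clip(pixels[seg], leakage, 1.0) * backlight
    return out

crushed = render(lambda seg: seg.mean())  # average-driven: star zones get almost no backlight
bloomed = render(lambda seg: seg.max())   # peak-driven: star zones go full blast

print(crushed[500])   # ~0.008 -> the star is nearly gone (crushed)
print(bloomed[505])   # 0.001 vs ~8e-6 with averaging -> raised black floor (blooming)
```

Real TVs use thousands of zones and much smarter spatial/temporal filtering, but the basic dilemma only gets smaller, it never disappears.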

Personally, I absolutely hate blooming. Samsung QN90C:


P0DkPxV.jpg
 
Last edited:

Mister Wolf

Gold Member
MiniLED LCDs will also crush details in dark scenes. The worst-case scenario for a MiniLED is stars: local dimming algorithms will either remove the stars or increase the brightness to the point where you see blooming.

Personally, I absolutely hate blooming. Samsung QN90C:


P0DkPxV.jpg

The QN90C was a dud release from Samsung. Everyone who follows them knew the QN95C was the actual flagship LCD.
 

Bojji

Member
MiniLED LCDs will also crush details in dark scenes. The worst-case scenario for a MiniLED is stars: local dimming algorithms will either remove the stars or increase the brightness to the point where you see blooming.

Personally, I absolutely hate blooming. Samsung QN90C:


P0DkPxV.jpg

Yeah, I can't stand it either. That's the main reason I went OLED. I had a FALD Hisense and it was horrible to me; I returned it to the store.

There are TVs with thousands of dimming zones, so it's not that visible, but sometimes they're more expensive than OLEDs.
 
The QN90C was a dud release from Samsung. Everyone who follows them knew the QN95C was the actual flagship LCD.
Fair enough, but the QN90C still has a large number of dimming zones. The QN95C has even more dimming zones, but will it really make a big difference in a dimly lit room? MiniLED will probably never eliminate blooming completely, but I want to know if the most expensive model can at least give a good enough (viewable) experience.
 

Mister Wolf

Gold Member
Fair enough, but the QN90C still has a large number of dimming zones. The QN95C has even more dimming zones, but will it really make a big difference in a dimly lit room? MiniLED will probably never eliminate blooming completely, but I want to know if the most expensive model can at least give a good enough (viewable) experience.

Timestamped:



The QN90C is a shit, half-effort version from Samsung, right down to its IPS panel.
 
Last edited:

Mister Wolf

Gold Member
Thanks, I wasn't aware the QN90C was that bad compared to the QN95C.

Samsung pulled a fast one preying on the misinformed. The QN90C scores lower than my first generation QN90A in contrast and blooming according to Rtings. The QN95C is the true successor of the tech. That they decided to make two different models with two different panel types for the "C" generation was the tipoff.
 
Samsung pulled a fast one preying on the misinformed. The QN90C scores lower than my first generation QN90A in contrast and blooming according to Rtings. The QN95C is the true successor of the tech. That they decided to make two different models with two different panel types for the "C" generation was the tipoff.
Have you seen the QN95C in a dimly lit room, and can you give us your thoughts on its performance? According to Rtings, the QN95C has impressive blooming control overall, but they did notice some blooming in certain situations.

VjHYzNX.jpg
 
Last edited:
Have you seen the QN95C in a dimly lit room, and can you give us your thoughts on its performance? According to Rtings, the QN95C has impressive blooming control overall, but they did notice some blooming in certain situations.

VjHYzNX.jpg
Maybe it's also the case that different screen sizes get different panel types.
 

Mister Wolf

Gold Member
Have you seen the QN95C in a dimly lit room, and can you give us your thoughts on its performance? According to Rtings, the QN95C has impressive blooming control overall, but they did notice some blooming in certain situations.

VjHYzNX.jpg

You'll never completely escape blooming with LCD tech. Right now the Hisense UX is the best at handling it, with its 5,000 dimming zones. The QN90C is just an awful example to represent LCD local dimming tech.
 
Last edited:

S0ULZB0URNE

Member
MiniLED LCDs will also crush details in dark scenes. The worst-case scenario for a MiniLED is stars: local dimming algorithms will either remove the stars or increase the brightness to the point where you see blooming.

Personally, I absolutely hate blooming. Samsung QN90C:


P0DkPxV.jpg
I never saw crushed details on any of my TVs.
 
Fair enough, but the QN90C still has a large number of dimming zones. The QN95C has even more dimming zones, but will it really make a big difference in a dimly lit room? MiniLED will probably never eliminate blooming completely, but I want to know if the most expensive model can at least give a good enough (viewable) experience.

Don't listen to him. He's salty because I told him the QN90C is better than the QN90A (which he owns). He doesn't even own a QN90C, so he's just talking out of his ass. I owned a QN90A from day one until I replaced it this year with a QN90C.
 

Mister Wolf

Gold Member
Don't listen to him. He's salty because I told him the QN90C is better than the QN90A (which he owns). He doesn't even own a QN90C, so he's just talking out of his ass. I owned a QN90A from day one until I replaced it this year with a QN90C.

And I posted this comparison to which you didn't respond.


Why is the QN90A beating the QN90C in contrast, blooming, black uniformity, lighting zone transitions, color gamut, and HDR brightness in and out of game mode? Make it make sense.
 
Last edited:

S0ULZB0URNE

Member
Interesting comments from Vincent on Mini LED vs. OLED (timestamped):


That's his opinion.

To add to the LG OLED vs QD-OLED talk...

"That's a pity, because in a PC monitor context, if LG's WOLED panel tech has a weakness in general and also compared to Samusng's QD-OLED panels, it's full-screen brightness."

*cough*ABL

 
And I posted this comparison to which you didn't respond.


Why is the QN90A beating the QN90C in contrast, blooming, black uniformity, lighting zone transitions, color gamut, and HDR brightness in and out of game mode? Make it make sense.
Winning technical categories while losing on overall image-quality impressions is what doesn't make sense to you. Especially since the QN90C does have better blooming control in real content than the QN90A/B; there was a video about it, but I'm too lazy to find it right now.
 

Bojji

Member
That's his opinion.

To add to the LG OLED vs QD-OLED talk...

"That's a pity, because in a PC monitor context, if LG's WOLED panel tech has a weakness in general and also compared to Samusng's QD-OLED panels, it's full-screen brightness."

*cough*ABL


Full-screen brightness exceeding certain nit values is rarely needed; most of the time your screen needs to be able to display low and high nit values correctly in the same frame.

Here you have a lot of 0-nit blacks and ~1000-nit flashlights:

bB2yfwb.jpg


Here is a very dark scene with ~1300-nit lights behind the character:

ULlBwh5.jpg


This is an average scene from The Matrix with a ~400-nit highlight:

0EBuxot.jpg


There is one scene in this movie that is beyond OLED specs:

ghdwJPV.jpg


800 nits of full-screen brightness. No OLED can show it correctly, but this is something that is not AT ALL common in movies; a white room like this isn't natural to the real world (it's a computer construct in the movie), so I don't think not being able to display it at 800 nits is a big issue.

OLED won't display this scene in full "glory", just like mini LEDs will have problems with scenes containing a lot of dark and bright elements on screen.

Source
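To put rough numbers on the "low and high nits in the same frame" point, here's a quick sketch. The frame is synthetic (loosely mimicking the flashlight shot above) and the 500-nit highlight threshold is an arbitrary choice; the point is just how little of a typical HDR frame is actually bright:

```python
# Rough sketch: given per-pixel luminance in nits (however you obtain it,
# e.g. from an HDR analysis tool), report how much of the frame is actually bright.
import numpy as np

def frame_stats(nits: np.ndarray, highlight_thresh: float = 500.0) -> dict:
    return {
        "peak_nits": float(nits.max()),
        "average_nits": float(nits.mean()),                        # roughly the APL
        "pct_above_thresh": float((nits > highlight_thresh).mean() * 100),
        "pct_near_black": float((nits < 1.0).mean() * 100),
    }

# Synthetic 4K frame: black everywhere except a small ~1000-nit flashlight region.
frame = np.zeros((2160, 3840))
frame[1000:1100, 1800:2000] = 1000.0
print(frame_stats(frame))
# peak ~1000 nits, average ~2 nits, under 0.3% of pixels above 500 nits, >99% near black
```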
 

Mister Wolf

Gold Member
Winning technical categories while losing on overall image-quality impressions is what doesn't make sense to you. Especially since the QN90C does have better blooming control in real content than the QN90A/B; there was a video about it, but I'm too lazy to find it right now.

I bring up those technical categories because that's what the Neo QLED line was created to address: LCD's biggest deficiencies, contrast and blooming. It's what all the detractors bring up. Does it seem acceptable to you that the third generation of the Neo QLED line would be worse than the first generation at addressing blooming and contrast? Should more of a priority be placed on viewing angles and filmmaker modes than on contrast and blooming? As far as I'm concerned it's two steps forward, three steps back. Samsung even made it obvious it was a half effort by releasing the 95C model alongside the 90C.
 
Last edited:

S0ULZB0URNE

Member
Full-screen brightness exceeding certain nit values is rarely needed; most of the time your screen needs to be able to display low and high nit values correctly in the same frame.

Here you have a lot of 0-nit blacks and ~1000-nit flashlights:

bB2yfwb.jpg


Here is a very dark scene with ~1300-nit lights behind the character:

ULlBwh5.jpg


This is an average scene from The Matrix with a ~400-nit highlight:

0EBuxot.jpg


There is one scene in this movie that is beyond OLED specs:

ghdwJPV.jpg


800 nits of full-screen brightness. No OLED can show it correctly, but this is something that is not AT ALL common in movies; a white room like this isn't natural to the real world (it's a computer construct in the movie), so I don't think not being able to display it at 800 nits is a big issue.

OLED won't display this scene in full "glory", just like mini LEDs will have problems with scenes containing a lot of dark and bright elements on screen.

Source
It drops the brightness when watching both HDR and regular content.
It's almost as much of a deal-breaker as burn-in for me.

You posted Vincent? Well, I have some Vincent for you.

 
And I posted this comparison to which you didn't respond.


Why is the QN90A beating the QN90C in contrast, blooming, black uniformity, lighting zone transitions, color gamut, and HDR brightness in and out of game mode? Make it make sense.

All that stuff you mention where the 90A beats the 90C, it's barely beating it. You wanna use Rtings numbers? Fine, let's use their numbers: the 90A scores 8.4, the 90C scores 8.3. But you know what, that's just numbers based on their testing. In real-life use, like I said before, the 90C is better.

First of all, the 90A has the local dimming bug. It's annoying, but at least there's a workaround. What's way worse than the LD bug, though, is the way local dimming works in game mode: it overly dims and desaturates small bright elements (to reduce blooming). Games have a LOT of small bright elements on screen all the time. Some games, like TLOU2, are much harder to play because the TV desaturates the health bar so much that I can't even tell how much life I have. The only workaround for this is to run local dimming on Low, which cuts the brightness in half.

So just based on the way local dimming works, the 90C is better than the 90A. And when you add all the additional improvements the 90C has, it's not even close.

I didn't respond last time because I'm not trying to type out a novel. If you want more reasons, you can just watch this video, which explains why the 90C is better than the 90B.

 
Last edited:

Mister Wolf

Gold Member
All that stuff you mention where the 90A beats the 90C, it's barely beating it. You wanna use Rtings numbers? Fine, let's use their numbers: the 90A scores 8.4, the 90C scores 8.3. But you know what, that's just numbers based on their testing. In real-life use, like I said before, the 90C is better.

First of all, the 90A has the local dimming bug. It's annoying, but at least there's a workaround. What's way worse than the LD bug, though, is the way local dimming works in game mode: it overly dims and desaturates small bright elements (to reduce blooming). Games have a LOT of small bright elements on screen all the time. Some games, like TLOU2, are much harder to play because the TV desaturates the health bar so much that I can't even tell how much life I have. The only workaround for this is to run local dimming on Low, which cuts the brightness in half.

So just based on the way local dimming works, the 90C is better than the 90A. And when you add all the additional improvements the 90C has, it's not even close.

I didn't respond last time because I'm not trying to type out a novel. If you want more reasons, you can just watch this video, which explains why the 90C is better than the 90B.



I don't have a QN90B. The "B" has worse blooming and lighting zone transitions than the "A" as well; it was a regression. This so-called local dimming bug you keep mentioning, you have provided nothing to substantiate its existence. I've never heard Rtings or anyone else mention it. You probably had a defective TV.
 
Last edited:
I don't have a QN90B. The "B" has worse blooming and lighting zone transitions than the "A" as well; it was a regression. This so-called local dimming bug you keep mentioning, you have provided nothing to substantiate its existence. I've never heard Rtings or anyone else mention it. You probably had a defective TV.
All Samsung TVs have an algorithm which intentionally desaturates colors when the image is dimmer, to enhance the effect of darkness. On their LCD TVs the purpose of this is to add a "fake" improvement to local dimming. For some reason, this algorithm was carried over to the QD-OLED TVs unchanged, which makes no sense because OLED panels do not have backlights that locally dim.

The science behind this "fake" enhancement of darkness on Samsung TVs isn't particularly hard to understand. Anyone who's lived as a human knows that in near darkness, human vision is effectively black and white; the brighter the ambient light, the more colorful the visible world becomes. This is simply because human eyes contain two types of light receptors, called rods and cones. The rods are far more sensitive to light and dominate vision in dim conditions, but crucially, they do not differentiate color. That job falls to the cones, which cluster densely near the fovea (the visual center) and provide color vision across the visible spectrum, but which are ineffective in near darkness because their light sensitivity is lower.

Samsung studied this fact of human vision and tries to simulate the effect by desaturating colors the darker the image becomes. This occurs independently of local dimming, and it can be seen simply by setting local dimming to "Low", viewing a scene with a low light level, and comparing it to the same scene on another brand of TV. Some people hate this Samsung desaturation algorithm, but most people simply won't notice it unless they are told it exists. Well, now you know it exists. You'll never be able to unsee it. You're welcome.
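Purely to make that behaviour concrete (Samsung's real processing obviously isn't public, and the 30% knee below is invented), a toy version of "desaturate as the image gets darker" could look like this:

```python
# Toy sketch of luminance-dependent desaturation -- not Samsung's actual algorithm.
import colorsys

def desaturate_for_darkness(r, g, b):
    """r, g, b in 0..1. Scale saturation down as lightness drops:
    bright pixels keep their color, near-black pixels drift toward grey."""
    h, l, s = colorsys.rgb_to_hls(r, g, b)
    factor = min(1.0, l / 0.3)          # invented knee: full color above ~30% lightness
    return colorsys.hls_to_rgb(h, l, s * factor)

print(desaturate_for_darkness(0.9, 0.2, 0.2))    # bright red: essentially unchanged
print(desaturate_for_darkness(0.09, 0.02, 0.02)) # dark red: noticeably greyer
```

Run something like that over every pixel of a dim scene and deep-shadow colors end up washed out next to a TV that doesn't do it, which is exactly the side-by-side comparison described above.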
 
Last edited:

Bojji

Member
It drops the brightness when watching both HDR and regular content.
It's almost as much of a deal-breaker as burn-in for me.

You posted Vincent? Well, I have some Vincent for you.



It was (apparently) fixed in "3" models:



You can fix it on older TVs using the service menu. Something like this only kicks in in very specific scenarios; both the GoT and Dune scenes here are super dark, a few nits of brightness (in the case of GoT), so it has nothing to do with OLED's ability to display bright HDR.
 
Last edited:

S0ULZB0URNE

Member
It was (apparently) fixed in "3" models:



You can fix it on older TVs using the service menu. Something like this only kicks in in very specific scenarios; both the GoT and Dune scenes here are super dark, a few nits of brightness (in the case of GoT), so it has nothing to do with OLED's ability to display bright HDR.

ABL still happens
 
ABL still happens
"here is a year-old video showing a problem"
"the problem has been fixed"
"I don't care"

Honestly, I don't even know what you two are arguing about. I have three OLEDs, two LG and one Sony. None of them have ever, and I literally mean ever, noticeably dimmed the image in any TV show, movie, video game, or PC desktop usage.
 

S0ULZB0URNE

Member
"here is a year-old video showing a problem"
"the problem has been fixed"
"I don't care"

Honestly, I don't even know what you two are arguing about. I have three OLEDs, two LG and one Sony. None of them have ever, and I literally mean ever, noticeably dimmed the image in any TV show, movie, video game, or PC desktop usage.
Not that I need to point it out... but he tells you right in the video that APL-ABL still kicks in to protect the TV.

Oh look, we have an "it doesn't do it" / "I don't notice it" post!

Reminds me of the "burn-in doesn't happen anymore" posts (yet ABL is there to reduce the chances).
 

King Dazzar

Member
"here is a year-old video showing a problem"
"the problem has been fixed"
"I don't care"

Honestly, I don't even know what you two are arguing about. I have three OLEDs, two LG and one Sony. None of them have ever, and I literally mean ever, noticeably dimmed the image in any TV show, movie, video game, or PC desktop usage.
So there are the panel protection algorithms (ASBL), which can dim things very slowly, but there are also panel luminance restrictions (ABL), which can cause scene brightness to change a bit suddenly. The former has always been a thing on my OLEDs and irritated me, but the latter wasn't a problem for me personally. Some people are sensitive to it, though.
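For anyone unfamiliar, the general shape of an ABL curve is roughly the sketch below (numbers are purely illustrative; real power limits are proprietary and vary per panel): the higher the average picture level, the lower the peak luminance the panel allows, which is why a cut to a bright full-screen shot can dim noticeably. ASBL is the separate, slow dimming that kicks in when the image stays static.

```python
# Illustrative ABL curve only -- real panel power/heat limits are proprietary.
def allowed_peak_nits(apl_percent: float) -> float:
    """Peak luminance the panel permits at a given average picture level (APL)."""
    if apl_percent <= 10:                          # small highlight window: full punch
        return 1300.0
    # linear roll-off from 1300 nits at 10% APL down to ~450 nits at full screen
    return 1300.0 - (apl_percent - 10) * (850.0 / 90.0)

for apl in (2, 10, 25, 50, 100):
    print(f"APL {apl:3d}% -> peak ~{allowed_peak_nits(apl):.0f} nits")
```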
 

dotnotbot

Member
Interesting comments from Vincent on Mini LED vs. OLED (timestamped):



I'm hyped about what he's showing at 5:14-5:35: the new proto-G4 shows significantly less overshoot compared to the G3 on the left, which was horrible with this (for some reason, MLA Gen 1 amplified overshoot visibility compared to standard panels).

chrome_mnEME6nqkK.png
 
Last edited: