
Questions About Chroma Subsampling, Color Depth, HDR, And Refresh Rate

BluRayHiDef

Banned
I'm currently using a TCL 55R635 as a 4K monitor; it's a 55" 4K TV with a 120Hz panel. However, it accepts a 120Hz signal only at 1440p and below (with or without HDR enabled); at 4K, it's limited to 60Hz. According to its official specifications, the TV supports Variable Refresh Rate (VRR) and Auto Low Latency Mode (ALLM). I don't know how to verify when VRR is active, but I do know when ALLM is active, because when I start a game on my Xbox One X, the console displays a notification that it has activated the TV's game mode.

Anyhow, I'd like to know what I'm missing out on in terms of image quality due to the TV's limited combinations of color format, color depth, and refresh rate, which I will explain below.

When using Windows 10, the default resolution, refresh rate, and color settings in Nvidia Control Panel are as follows:

Resolution: 4k x 2k, 3840 x 2160 (native)
Refresh rate: 60Hz
Desktop Color Depth: 32-bit
Output Color Depth: 8 bits per color
Output Color Format: RGB
Output Dynamic Range: Full

If I want to set the output color depth to 10 bits per color or 12 bits per color, I have to set the output color format to either YCbCr 422 or YCbCr 420, which sets the dynamic range to Limited.

Setting the output color format to YCbCr 444 limits the color depth to 8 bits per color and the dynamic range to Limited.

However, if I lower the refresh rate to 30Hz or below, I can set the output color depth to 10 bits per color or 12 bits per color in combination with the YCbCr 444 color format or with the RGB color format (Limited or Full).

When I set the resolution to 2560 x 1440, the refresh rate is locked to 120Hz and the default color settings that I listed above are loaded.

Also, whenever I force Windows 10 to output HDR, my TV's HDR mode is activated but the default color settings listed above are maintained, which is puzzling because the default output color depth is 8 bits per color even though HDR is 10 bits per color (or 12 bits per color for Dolby Vision). What's the explanation for this?

I'm personally content with this display despite these limitations; however, I am curious. So, what am I missing out on? And what is the maximum bandwidth that my TV's HDMI ports can convey based on this information?
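
For what it's worth, here's the rough back-of-the-envelope check I tried in Python. I'm assuming the ports are HDMI 2.0 (about 14.4 Gbit/s of usable video data once encoding overhead is taken off), standard CTA-861 blanking for the 4K modes, and a simple samples-per-pixel model for chroma subsampling, so treat this as a sketch rather than the exact HDMI packing:

```python
# Rough HDMI 2.0 bandwidth check -- a sketch, not the exact HDMI signalling.
# Assumes ~14.4 Gbit/s of usable video data (18 Gbit/s TMDS minus 8b/10b
# overhead) and the standard CTA-861 4K timings including blanking.

HDMI20_DATA_GBPS = 14.4

# (mode, total pixels per frame incl. blanking, refresh Hz, samples per pixel, bits per sample)
modes = [
    ("4K60 RGB/444 8-bit",    4400 * 2250, 60, 3.0,  8),
    ("4K60 RGB/444 10-bit",   4400 * 2250, 60, 3.0, 10),
    ("4K60 YCbCr 422 12-bit", 4400 * 2250, 60, 2.0, 12),
    ("4K60 YCbCr 420 10-bit", 4400 * 2250, 60, 1.5, 10),
    ("4K30 RGB/444 12-bit",   4400 * 2250, 30, 3.0, 12),
]

for name, pixels, hz, samples, bits in modes:
    gbps = pixels * hz * samples * bits / 1e9
    verdict = "fits" if gbps <= HDMI20_DATA_GBPS else "too much"
    print(f"{name:<24} {gbps:5.1f} Gbit/s -> {verdict}")
```

If that model is right, it lines up with what I'm seeing: 4K60 RGB 8-bit comes in just under the limit, 10-bit RGB/444 doesn't fit at 60Hz, and 10/12-bit only works at 60Hz with 422/420, or at 30Hz and below.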
 

Kuranghi

Member

It seems to prioritise RGB/444 over true 10-bit output when you turn on HDR, probably because dropping to 422 or lower hurts image quality much more than going from 10-bit to 8-bit does. That's because it applies dithering to the 8-bit signal, which gets you close to what you would've had from a true 10-bit output on your 10-bit panel (I checked and it is a true 10-bit panel; a lot of lower-end HDR TVs have 8-bit panels and use something called Frame Rate Control, which is a kind of temporal dithering, to get most of the benefit of a 10-bit HDR signal).

Going from RGB/444 to 422 (or especially down to 420) is a much bigger quality loss when you're gaming; when you're watching a 4K Blu-ray it's fine, though, as those are encoded in 420 anyway. If you go to the page where you turn on HDR in Windows and click "Advanced display settings" at the bottom, you'll see that even though Nvidia Control Panel reports 8-bit output when HDR is on, it's actually "8-bit with dithering", so it can show most of the colour gradients of the 10-bit HDR signal it's getting from a game or application.
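
If you want to see why the dithering trick works, here's a quick toy example you can run (just my sketch of the idea in Python with NumPy, not whatever Nvidia actually does in the driver): it quantises a smooth 10-bit ramp to 8-bit with and without dithering and compares how well small neighbourhoods track the original.

```python
# Toy model of why "8-bit + dithering" can look close to a true 10-bit output.
# This is only a sketch of the idea, not the actual driver algorithm.
import numpy as np

rng = np.random.default_rng(0)

# A smooth 10-bit ramp (0..1023) across a 3840-pixel row.
ramp10 = np.linspace(0.0, 1023.0, 3840)

# Straight truncation to 8 bits (0..255): hard steps, i.e. visible banding.
plain8 = np.floor(ramp10 / 4.0)

# Dithered quantisation: add ~1 LSB of noise before rounding, so neighbouring
# pixels alternate between adjacent 8-bit codes.
dither8 = np.clip(np.floor(ramp10 / 4.0 + rng.uniform(0.0, 1.0, ramp10.size)), 0, 255)

# Average small neighbourhoods (roughly what your eye does at viewing distance)
# and compare against the original 10-bit ramp.
def local_error(img8):
    rebuilt = img8.reshape(-1, 16).mean(axis=1) * 4.0   # back to the 10-bit scale
    target = ramp10.reshape(-1, 16).mean(axis=1)
    return np.abs(rebuilt - target).mean()

print("plain 8-bit     avg local error:", local_error(plain8))   # clearly larger
print("8-bit + dither  avg local error:", local_error(dither8))  # close to zero
```

The dithered version trades a tiny bit of noise for getting rid of the hard steps, which is why it holds up so well against a true 10-bit output.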

So technically I think the best image quality when you want to watch a 4K HDR video/ISO in VLC/mpv/a proper HDR video player is to change your chroma to 422, then set it to 12-bit colour and full dynamic range (12-bit isn't really needed because the signal is 10-bit, but I don't think it lets you pick 10-bit for some reason).

I personally don't bother with the hassle of switching it over, because I did a few comparisons and didn't notice much difference between 8-bit + dithering and 12-bit, or between full and limited. I think the dynamic range of a 4K Blu-ray would be limited anyway, since it's meant for display on a TV, which in the past didn't offer the full 0-255 range. You still get the BT.2020 colour space regardless, and the dynamic range of the brightness isn't affected afaik (someone correct me if you know otherwise).
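
For anyone wondering what "Limited" actually means there: 8-bit video levels only use part of the code range, with black at 16 and white at 235 instead of 0 and 255. Roughly like this (my sketch of the luma mapping only; real converters also handle chroma, which uses 16-240, plus proper rounding and clipping):

```python
# What "Limited" range means for 8-bit video: black = code 16, white = code 235.
# Rough sketch of the luma mapping; chroma uses 16-240 and is not shown here.
def full_to_limited(v):                 # 0-255 PC levels -> 16-235 video levels
    return round(16 + v * 219 / 255)

def limited_to_full(v):                 # 16-235 video levels -> 0-255 PC levels
    return round((v - 16) * 255 / 219)

print(full_to_limited(0), full_to_limited(255))    # 16 235
print(limited_to_full(16), limited_to_full(235))   # 0 255
```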

I have also read that you can actually get more colour banding (bad gradients) with a 10-bit output than with 8-bit + dithering in some (all?) cases, but I'm not sure exactly why that is. Hope that helps.
 

BluRayHiDef

Banned

Very informative post that's very fun to read. Thanks a lot.

I guess that since it's a TV and is therefore meant primarily for watching movies (which are 24fps) and shows (which are 30fps), it was designed to process YCbCr 444 (Full and Limited) with 10 or 12 bits per color at 24Hz and 30Hz. So, in that regard, it doesn't make any compromises.

It's only when it has to process refresh rates higher than 30Hz that it compromises the color output.
 

Kuranghi

Member

No problem at all, I like writing that stuff if it helps someone out.

You are right on the money with the 24/30 output; that's why Sony hasn't prioritised HDMI 2.1 in its 2020 models even though they have an HDMI 2.1 console out the same year. I'll speak to Sony because it's what I know best.

They view their TVs as primarily for movie and broadcast watching plus streaming video, so they don't care about supporting modes that very few people will take advantage of. That's why they lag behind (no pun intended, lol) on input lag figures compared to Samsung and LG.

Samsung have to turn off a lot of image processing in game mode, and more recently gimp their aggressive local dimming there too, so the image doesn't look as good as a Sony does in game mode.

LG don't have to worry about dimming on their OLEDs, but they still sacrifice a bit of IQ to get those sub-10ms figures, so they don't look as good as a Sony in game mode either, even an LCD Sony (for OLED owners about to murder me for lying: I mean the image processing/gradient handling, I know an OLED will smash an LCD in many areas of picture quality).

The way Samsung and LG handle game mode isn't for me. I like low input lag as much as anyone, but gaining 10-15ms of reaction speed (basically imperceptible imo) while making the IQ worse to do it is a terrible trade-off. A lot of people won't notice the difference though; it's why they buy Samsung or LG over Sony/Panasonic in the first place.

For people looking at LG, perceived small or no differences in IQ versus saving hundreds of £/$/€ is too attractive. Samsung buyers are being completely conned into paying more for less quality in almost every feature you can compare between it and a Sony. There used to be a few areas where Samsung excelled, but they either gimped or removed those things in 2019/2020, and even then there are 2016/2017 Sonys that still smash the Q9FM/Q9FN/Q90R PQ-wise, although those were hard to come by new by 2018, so people still went with Samsung.

Actually, you know what, when you want to watch a 4K Blu-ray in Windows it's probably quicker to switch your refresh rate to 24Hz before turning on HDR. I think Windows will then default to outputting RGB/444 + 12-bit, because it can see it has the bandwidth on the HDMI 2.0 connection now that the framerate is (more than) halved. That's a bit quicker than messing about with the Nvidia chroma/colour bit depth settings, since you have to change those in the right order.
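
Quick sanity check on the bandwidth, using the same rough numbers as the check earlier in the thread (assuming HDMI 2.0's ~14.4 Gbit/s of usable video data and the standard 4K24 timing with blanking):

```python
# 4K24 with standard CTA-861 blanking is a ~297 MHz pixel clock, so even
# 12-bit RGB fits comfortably inside HDMI 2.0's ~14.4 Gbit/s of video data.
pixels_per_second = 5500 * 2250 * 24                # total pixels/s incl. blanking
print(pixels_per_second * 3 * 12 / 1e9, "Gbit/s")   # ~10.7 -> plenty of headroom
```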

Let me know if that works for you, if you bother with it. Personally I don't think you need to, as long as your TV handles the 24-to-60 pulldown properly; 24fps video would look terribly juddery if it didn't (i.e. randomly changing frametimes, not the general 24p stutter, which should be a consistent sort of skip during medium/fast camera pans), so you'd know straight away.
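
If you want a picture of what proper pulldown looks like versus broken pulldown, this is the idea (just a sketch of the cadence, assuming the TV repeats film frames in a fixed 3:2 pattern on a 60Hz refresh):

```python
# 24 fps content on a 60 Hz panel: each film frame is held for either 3 or 2
# refreshes (the classic 3:2 cadence). The *regular* unevenness is the normal
# 24p stutter; randomly varying hold times are what makes bad pulldown juddery.
from itertools import cycle

for film_frame, holds in zip(range(6), cycle([3, 2])):   # 3+2 refreshes = 2 frames per 1/12 s
    print(f"film frame {film_frame}: held for {holds} refreshes ({holds * 1000 / 60:.1f} ms)")
```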

Christ I've waffled about Sony again, sorry lol.
 