
Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

Kuranghi

Member
Disney doesn't include Dolby Vision on their discs (usually); they only put Dolby Vision on their digital copies or Disney+. When presented with a choice, the iTunes digital copy streams at a higher bitrate than D+ (not by much, D+ is very close), so this is the digital copy streaming on an Apple TV 4K set to Dolby Vision.

The X90J will automatically switch to Dolby Vision mode when it detects it.

Apple TV 4K does seem like the best way to view streaming content from what I've seen, but definitely try a 4K disc: HDR10 discs are going to be better overall quality than DV streaming, even without the dynamic tonemapping, because they have far fewer compression artifacts.
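
To put rough numbers on that compression gap, here's a quick back-of-the-envelope sketch. The bitrates are ballpark assumptions for a typical UHD Blu-ray and a typical premium DV stream, not measurements of any specific title:

```python
# Rough bits-per-pixel comparison for 4K24 video at assumed typical bitrates.
# UHD Blu-ray HEVC usually sits somewhere around 50-100 Mbps; premium
# streaming services tend to be in the 15-25 Mbps range.

WIDTH, HEIGHT, FPS = 3840, 2160, 24
pixels_per_second = WIDTH * HEIGHT * FPS

for label, mbps in [("UHD Blu-ray (HDR10)", 80), ("Dolby Vision stream", 20)]:
    bits_per_pixel = (mbps * 1e6) / pixels_per_second
    print(f"{label}: ~{mbps} Mbps -> {bits_per_pixel:.2f} bits/pixel")
```

At those assumed rates the disc has roughly four times as many bits to spend on every pixel, which is where the cleaner gradients and fewer artifacts come from.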

The DV picture mode is still a mode you can change settings on; only some of the settings are greyed out/fixed. The default settings will be the best it can look without calibration though, so it's probably for the best you didn't change it anyway. The exception is Smooth Gradation: turn it off if it's on, as it doesn't work properly with DV, streaming or otherwise, and it erases a lot of detail even on Low.
 

ManaByte

Member
Kuranghi said:
Apple TV 4K does seem like the best way to view streaming content from what I've seen, but definitely try a 4K disc: HDR10 discs are going to be better overall quality than DV streaming, even without the dynamic tonemapping, because they have far fewer compression artifacts.

I haven't tried a UHD disc on the PS5 yet, so I decided this was a good chance (I also have an XSX, and an old Sony X800 that won't do DV). So these are just in HDR10.

So here's a disc I literally just opened up, Braveheart:
[Image: AtaJcQ1.jpg]
 

Kuranghi

Member
ManaByte said:
I haven't tried a UHD disc on the PS5 yet, so I decided this was a good chance (I also have an XSX, and an old Sony X800 that won't do DV). So these are just in HDR10.

So here's a disc I literally just opened up, Braveheart:
[Image: AtaJcQ1.jpg]

Looks nice detail-wise, but his skin is a little blue/purple. It could be the camera doing that, but if it's like that IRL then I recommend switching colour temperature to at least Warm or Expert (not sure it's still called that on newer sets, tbf). Whites will seem yellow/too warm at first versus the cool temp you've been seeing, but after 5 minutes or so your eyes will adjust and whites should look nice and white again, just creamier than before.

Here I've chucked the image into a basic photo editor, warmed up the colour temperature a bit and reduced the saturation a tiny bit too. Super crude and just an example, but see how blue/purple he was before; you didn't really notice until you see a warmer version next to it (yours on the right):

[Images: XKQYHKP.jpg (warmed) | AtaJcQ1.jpg (original)]


That's without doing any advanced colour calibration specific to your panel, which would make the skin tones much more accurate, but even without those tweaks, setting colour temperature to Warm/Expert will give you a more accurate image overall and remove the purple/blue cast. This is a perfect example of how much difference it makes, because he has the blue face paint on: now that his face isn't as blue as the paint it looks way more 3D, which makes it pop more and gives a higher perceived contrast.
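
If anyone wants to try the same crude tweak themselves, here's a minimal Python/Pillow sketch of the idea. The channel gains, saturation factor, and file names are eyeballed assumptions for illustration, not calibration values:

```python
# Crude "warm up and desaturate" tweak, roughly like the photo-editor
# adjustment described above. Requires Pillow (pip install Pillow).
from PIL import Image, ImageEnhance

def warm_image(path, out_path, r_gain=1.06, b_gain=0.94, saturation=0.95):
    img = Image.open(path).convert("RGB")
    r, g, b = img.split()
    # Nudge red up and blue down to shift the white point warmer.
    r = r.point(lambda v: min(255, int(v * r_gain)))
    b = b.point(lambda v: int(v * b_gain))
    img = Image.merge("RGB", (r, g, b))
    # Pull saturation back a touch, as in the comparison above.
    img = ImageEnhance.Color(img).enhance(saturation)
    img.save(out_path)

# Hypothetical file names, just for the example.
warm_image("braveheart_cool.jpg", "braveheart_warm.jpg")
```

A proper calibration works in a colour-managed space rather than raw RGB gains, but for a quick before/after comparison this gets the point across.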
 

ManaByte

Member
Kuranghi said:
Looks nice detail-wise, but his skin is a little blue/purple. It could be the camera doing that, but if it's like that IRL then I recommend switching colour temperature to at least Warm or Expert...
It's my camera. Color temp is set to Expert.
 

Kuranghi

Member
X80J review released on RTINGS to the public: https://www.rtings.com/tv/reviews/sony/x80j

IPS panel. Don't think Sony is ever bringing back a VA panel to the X800 series. In game mode, input lag is pretty much the same across the board.

[Image: uEBpBcv.png]

You could sort of get a VA-panel 80-series in the past by buying specifically the 49" version of the 85-series (85E through 85H definitely, though RTINGS didn't review the 85H; the last one they reviewed was the 85G, which they measured at ~14ms average input lag). FlatpanelsHD haven't said yet whether the 85J continues that trend (IPS for all other sizes, VA for the 49"), so we'll need to wait and find out.

I always used to sell the 49" 85-series as a poor man's 90-series: even though it was still edge-lit vs FALD, you got the improved contrast and shadow detail of the VA panel from the 90-series. The worst was when I told them how much better the viewing angle was on the 85-series vs the 90-series and then they decided to go up to a 49", doh!
 

Kuranghi

Member
Holy crap, I wanted to know what that Braveheart shot looked like natively and found this website where they have thousands of screen grabs from different films and TV series. Don't know if that's useful to anyone, so thought I'd post it:


[Image: braveheart(1995)_4307.jpg]


Really shows you how much off-screen images/video aren't in any way representative of what you see in person. The peeling paint on his forehead looks awesome.

ManaByte Not sure if you've tried Ozark on Netflix, but definitely check that out for a really bright and dynamic Dolby Vision presentation to show off your new telly. It's great for showing how bright the large highlights get on LCD, one of the few scenarios where LCD beats out OLED for HDR pop.

Anyone else have suggestions to show off a new telly?

I highly recommend trying the HDR in Jedi: Fallen Order on your XSX, some of the best I've seen in a game. I played on PC, but I presume XSX/PS5 would be the same. It gives great depth to the cutscenes and lets you actually see things in the poorly lit caves, and when it's supposed to be black it's jet black as well. The red laser field door thingies look amazing.
 
It's what I did for the X90H. Got it last September for $840 and some change.

Best damned TV for the price, hands down. Wait for the price drops, you won't be disappointed.

This set is going to be a massive upgrade all around. Wait until you see the picture, color accuracy and those inky blacks for an LED.

I went from another Sony 4K 120Hz edge-lit set, and while the picture clarity was comparable, the 10-bit panel's colors and blacks were a major upgrade. The speed of the OS was a major upgrade as well.
You got a 55 or 49 inch 900H?
 

Kuranghi

Member
Yep, it was brand new off the Mexico assembly line literally a couple of weeks prior. Sony had a deal at the time where the price difference between the 55" and the 65" was only like $50, and combined with a military/first responder discount that was running, it was perfect timing.

I love those deals. There was a time when a pricing error in the system made the 65" Sony AF8 (OLED) the same price as the 55", and they had to honor it until the end of the day because of the way they advertised it. I think it was £1799 for both, so I called up all my customers who were waiting on price drops for 65" OLEDs (all 3 of them! lol) and told them to come and buy today or miss out on the best deal ever.

One guy kept coming back in for weeks after, looking at the 2.5k 65"; he was at work and didn't reply till the next day :messenger_grimmacing: I wanted to buy a bunch of them myself, but the store manager was pissed I was even calling my customers in to buy it.
 


Not using the evo panel after all, just a heatsink. Room for improvement next year then!

No peak brightness measurements yet, until the full review. I had already decided, but I'm more confident in my choice to stick with the A8H.
 




Does he talk about the G1 power consumption in his review of it? I'd be curious to know. His logic behind it not using the "evo" panel was Sony's statement as well as the power consumption. Well, the Sony statement is kind of murky, because LGD doesn't call it the evo panel; LGE calls it that. Sony wouldn't call it the evo panel.
 

ManaByte

Member
I actually gave in and bought the Skywalker Saga despite my desire to see the OT in theatrical format. It looks great.

The color timing on them is PERFECT. This is what they looked like in the '80s. Some people are too used to the DVDs/Blu-rays that messed up the colors. This feels like a warm blanket. They're breathtaking, and I'm still in awe that I'm seeing them like this again so many years later.

The 4K discs ALSO finally fixed the audio. The trumpets sound on the Death Star approach again.
 
Does he talk about the G1 power consumption in his review of it? I'd be curious to know...
Yeah, in his G1 review he noted reduced power consumption compared to the CX and C9 as well.
 
As Vinny says, they can't produce enough of the new panels yet, but eventually everything will transition to it, and he noted the possibility of even later-production C1 OLEDs having the new panel as well as the G1.

If that happens, I suppose it would be possible for later-produced Sony 2021 OLEDs to have it as well. But take all this with a grain of salt.

But I think all of next year's models could have the new evo panel.
 

DeaDPo0L84

Member
I have a 65" CX which I know isn't a Sony but the other OLED thread doesn't have much traffic so I'll ask here.

I only just got my TV yesterday and it's my first OLED. I only have a soundbar and a 4K movie player currently. Do I need an HDMI 2.1 cable for the 4K player, or is there no benefit?

Sorry in advance for the potentially dumb question.
 

Bo_Hazem

Banned
I have a 65" CX which I know isn't a Sony but the other OLED thread doesn't have much traffic so I'll ask here.

I only just got my TV yesterday and it's my first OLED. I only have a soundbar and a 4k movie player currently. Do I need a 2.1 hdmi cable for the 4k player or is there no benefit?

Sorry in advance for the potentiality dumb question.

It's pretty cheap so it won't hurt to get an HDMI 2.1 cable.
 
I have a 65" CX which I know isn't a Sony but the other OLED thread doesn't have much traffic so I'll ask here.

I only just got my TV yesterday and it's my first OLED. I only have a soundbar and a 4k movie player currently. Do I need a 2.1 hdmi cable for the 4k player or is there no benefit?

Sorry in advance for the potentiality dumb question.
No; 4K Blu-rays have full chroma over HDMI 2.0 because they're only 24fps.

2.1 is for games.
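
A rough bandwidth check illustrates the point. This is a back-of-the-envelope sketch that counts active pixels only (real signals add blanking overhead, so the numbers are optimistic), and the usable-rate figures are approximations:

```python
# Back-of-the-envelope HDMI bandwidth check. Counts active pixels only;
# real signals add ~10-20% blanking overhead, so these numbers are
# optimistic. Usable rates are approximate: HDMI 2.0 carries 18 Gbps
# TMDS (~14.4 Gbps of video after 8b/10b coding); HDMI 2.1 FRL is
# 48 Gbps (~42 Gbps usable).

def video_gbps(width, height, fps, bits_per_channel, channels=3):
    return width * height * fps * bits_per_channel * channels / 1e9

HDMI20_USABLE = 14.4
HDMI21_USABLE = 42.0

for label, rate in [
    ("4K24 4:4:4 12-bit (UHD Blu-ray)", video_gbps(3840, 2160, 24, 12)),
    ("4K120 4:4:4 10-bit (gaming)",     video_gbps(3840, 2160, 120, 10)),
]:
    verdict = "fits HDMI 2.0" if rate < HDMI20_USABLE else "needs HDMI 2.1"
    print(f"{label}: ~{rate:.1f} Gbps -> {verdict}")
```

Even at 12-bit 4:4:4, a 24fps movie only needs around 7 Gbps, comfortably inside HDMI 2.0, while 4K120 gaming blows well past it, which is why the 2.1 cable only matters for the consoles.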
 
No, Sony themselves say the X1 SoC and the XR SoCs are different.

The XR SoC is brand new for 2021.
Don't bother; dude has ignored my posts on input lag and processing settings. That's why he keeps repeating "it's for movies" lol. He thinks Smooth Gradation adds lag. Tempted to put him on ignore, to be honest.
 

dotnotbot

Member
A lot of people on AVS Forum have also reported grid and tint on the A90J.

Grid is the worst, because it seems like every panel made now has it to some extent, but how strong it looks is yet another variable in the panel lottery. I went through 5 OLEDs, different brands and models, and only 1 was acceptable in my book: still visible, but faint enough not to be distracting in games like Journey.
 

Kuranghi

Member
It's great to hear from VT that he thinks the XR is an upgrade over the X1U in terms of upscaling, smooth gradation/super bitmapping, and tonemapping, but I'm not a fan of the forced always-on motion settings. Even if he said he didn't notice artifacts in real content, I know he won't have tested many games, so I wonder how that fares with complex and extremely unpredictable game imagery.

I suppose it's par for the course with OLEDs seemingly becoming the standard for quite a while going forward (based on the miniLED panel price predictions): OLED motion is fantastically clear, but that's brought new problems for low-framerate content, and they want to strike a balance between clarity and perceived stutter.

I just don't like forced settings: if it does cause problems, you can't turn it off, and it's always better to have the choice imo. Unfortunately that goes against the majority opinion, who will take a minor cost to simplify things a lot, which I get too; there are many areas of my life where I just want things to work and don't care about the minutiae.

Now I want to see the XR processors across the range compared, to see if there's a difference between them. Maybe don't compare XR OLEDs to XR LCDs though, since I think the LCDs won't have the forced motion stuff, given LCDs don't suffer from perceived stutter nearly as much.
 