Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

Bojanglez

The Amiga Brotherhood
AFAIK it's only present on EU/UK firmware; Sony claims it's because of some EU law.

I have several TVs and none show this annoying message, even on the latest firmware. I call this bullshit.

In Vincent's recent interview with the TCL exec this was covered. He mentioned that TCL were given a G rating by the EU standards agency because they refused to implement these messages, whereas other companies may wish to have a 'better' rating and do so by implementing these messages. It's annoying as fuck. I'm no legal expert, but they should just have a global setting where you tick a box once to say you understand that altering values can increase power usage, and be done with it. Maybe, as it is Android TV, it's something that is actually accessible in the Developer Options menu?

Here is the video (timestamped) where they discuss it

 

Bojanglez

The Amiga Brotherhood
Looks like the 'fix' for Denon and Marantz receivers that had issues with HDMI 2.1 has been announced. It is a separate hardware box to process the HDMI 2.1 signal 🤮


I guess at least they have some kind of fix and hopefully 2021+ models will be fine (and have more '8K' ports)
 

Bojanglez

The Amiga Brotherhood
I have seen that the X90J has popped up on quite a few UK online retailers' stores in the last few days, ready for pre-order. It seems like it will be officially available on 15th May.

Martin Dawes - https://www.martindawes.com/catalogsearch/result/?q=+x90j
Box - https://www.box.co.uk/products/keywords/x90j/ex/true
Seven Oaks S&V - https://www.sevenoakssoundandvision.co.uk/cssearch.aspx?searchterm=x90j

55" start at £1,249 (some list at £1,399)
65" start at £1,499 (some list at £1,799)

We'll see what happens in the coming days when some of the bigger retailers get them listed.
 

S0ULZB0URNE

Member
Looks like the 'fix' for Denon and Marantz receivers that had issues with HDMI 2.1 has been announced. It is a separate hardware box to process the HDMI 2.1 signal 🤮


I guess at least they have some kind of fix and hopefully 2021+ models will be fine (and have more '8K' ports)
I stopped listening to suspect reviews and stopped buying Denon.

I'm sure Pioneer's offerings will handle this HDMI fiasco better.
 

dotnotbot

Member
In Vincent's recent interview with the TCL exec this was covered. He mentioned that TCL were given a G rating by the EU standards agency because they refused to implement these messages, whereas other companies may wish to have a 'better' rating and do so by implementing these messages. It's annoying as fuck. I'm no legal expert, but they should just have a global setting where you tick a box once to say you understand that altering values can increase power usage, and be done with it. Maybe, as it is Android TV, it's something that is actually accessible in the Developer Options menu?

Here is the video (timestamped) where they discuss it



Thanks, this sounds reasonable, though from what I see all the new Sonys available for purchase right now (X90J, A80J and A90J) still have a G rating.
These new EU ratings are so stupid that I haven't seen a single TV rated higher than G. The energy requirements are absurdly low.
 
Last edited:

Kuranghi

Member
Thanks, this sounds reasonable, though from what I see all the new Sonys available for purchase right now (X90J, A80J and A90J) still have a G rating.
These new EU ratings are so stupid that I haven't seen a single TV rated higher than G. The energy requirements are absurdly low.

I remember us all having a laugh at the 2018 Q900R energy label because the 85" version was like an E, while all other sets in the shop were C at worst. I can't find it online though.

It is strange you can't just have a checkbox somewhere to acknowledge, once and for all, that you don't care about the power consumption. Fucking EU laws like that are stupid bollocks.
 

dolabla

Member
Some numbers being posted over on a reddit thread.



It's got higher SDR/HDR peak brightness and a higher contrast ratio than last year's X900H, according to some guy on there.
 
Last edited:

Bo_Hazem

Banned
Some numbers being posted over on a reddit thread.



It's got higher SDR/HDR peak brightness and a higher contrast ratio than last year's X900H, according to some guy on there.


Yeah, seems like it's around last year's X950H, which was 1,200 nits. Any improvement would be a plus anyway; my panel is 500 nits and it usually melts my eyes, as I'm usually in a dim room.
 

dolabla

Member
Yeah, seems like it's around last year's X950H, which was 1,200 nits. Any improvement would be a plus anyway; my panel is 500 nits and it usually melts my eyes, as I'm usually in a dim room.
Yep, it should easily blow away my X800D. Maybe the regret of selling my X900E will finally go away now, lol. Pretty happy to hear these early numbers though. Sounds like input lag is a little bit higher than the X900H's. It looks like that happened with the X950H last year too.
 

Bo_Hazem

Banned
Yep, it should easily blow away my X800D. Maybe the regret of selling my X900E will finally go away now, lol. Pretty happy to hear these early numbers though. Sounds like input lag is a little bit higher than the X900H's. It looks like that happened with the X950H last year too.

At 4K@120Hz, even when playing at 60fps or 30fps, input lag should be around 6-7 ms. So if you're extremely into it, that should do. My X700D is 32 ms; yours is slightly higher (I know because I was going to buy the X800D, then went with the X700D, which was the fastest 4K TV along with the X750D).
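
For context, a quick back-of-the-envelope check (just a sketch; it assumes "input lag" here means under one refresh cycle in game mode, which is not an official Sony figure):

```python
# Rough sanity check of the refresh intervals being discussed (not measured data).
def frame_time_ms(refresh_hz: float) -> float:
    """Duration of a single refresh cycle in milliseconds."""
    return 1000.0 / refresh_hz

print(f"120 Hz frame time: {frame_time_ms(120):.2f} ms")  # ~8.33 ms
print(f"60 Hz frame time:  {frame_time_ms(60):.2f} ms")   # ~16.67 ms
# An input lag of 6-7 ms at 120 Hz is sub-frame (under one refresh cycle),
# which is why figures in that range are considered excellent for gaming.
```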
 
Last edited:

kyliethicc

Member
Looks like Sony recently cut the prices on most of their 2021 TVs. I've seen retailers selling them at lower prices, but it looks like Sony made those prices official now across the lineup.

 

dolabla

Member
Looks like Sony recently cut the prices on most of their 2021 TVs. I've seen retailers selling them at lower prices, but it looks like Sony made those prices official now across the lineup.

I just noticed yesterday they started selling from the Sony site again. Not just TVs either, but cameras, Blu-ray players, etc. That is something completely new. Guess they've re-opened the Sony store again? Got to wonder if they're going to be selling PS5s on there :pie_thinking:
 

kyliethicc

Member
I just noticed yesterday they started selling from the Sony site again. Not just TVs either, but cameras, Blu-ray players, etc. That is something completely new. Guess they've re-opened the Sony store again? Got to wonder if they're going to be selling PS5s on there :pie_thinking:
Seems like that site is new, ya.

And they've got their own Direct PS Shop just for hardware and games.

 

Bo_Hazem

Banned
Sony X90J or LG CX?

I'll most definitely upgrade my OLED this year to one which has HDMI 2.1.

All depends on how much you use your TV and what kind of content. To me it's always an LCD, because I use it for extremely long sessions as a PC monitor with a lot of static elements, and my gaming sessions can extend up to 24+ hours straight on weekends/vacations.
 

Bojanglez

The Amiga Brotherhood
An interview with some UK/EU Bravia execs from the AVForums podcast...



I have watched it all, and almost all of the questions asked could be answered by anyone that has read this thread. So not a lot new. The one little bit of interesting info is that they claim the wait for VRR is down to the HDMI 2.1 spec for implementing the feature not having been officially completed, and therefore it will only be released once that is done 🤷‍♂️
 

Bojanglez

The Amiga Brotherhood
All depends on how much you use your TV and what kind of content. To me it's always an LCD, because I use it for extremely long sessions as a PC monitor with a lot of static elements, and my gaming sessions can extend up to 24+ hours straight on weekends/vacations.
😲 24 hours straight! Last time I was afforded anywhere near that was when I was at Uni.
 

Bo_Hazem

Banned
😲 24 hours straight! Last time I was afforded anywhere near that was when I was at Uni.

My max was 51 hours straight playing MGS5 on release, skipping work for 2 days plus the weekend and getting a salary penalty after that, lol. Sipping one coffee after another! That made my base the #1 in the world for like 2 weeks, which was accidental, as I didn't even know what that crown above my username and the 1 meant. I thought it meant I was a noob and still level 1 because I hadn't invaded any base at that time.
 

Bo_Hazem

Banned
An interview with some UK/EU Bravia execs from the AVForums podcast...



I have watched it all, and almost all of the questions asked could be answered by anyone that has read this thread. So not a lot new. The one little bit of interesting info is that they claim the wait for VRR is down to the HDMI 2.1 spec for implementing the feature not having been officially completed, and therefore it will only be released once that is done 🤷‍♂️


I believe that as well; you can see even the cable bundled with the PS5 didn't have HDMI 2.1 branding. Actually, the very first certified HDMI 2.1 cable was only released a few months ago!
 

Bojanglez

The Amiga Brotherhood
My max was 51 hours straight playing MGS5 on release, skipping work for 2 days plus the weekend and getting a salary penalty after that, lol. Sipping one coffee after another! That made my base the #1 in the world for like 2 weeks, which was accidental, as I didn't even know what that crown above my username and the 1 meant. I thought it meant I was a noob and still level 1 because I hadn't invaded any base at that time.
A Herculean effort, Bo. The closest to that for me was (pre wife and kids) taking a week off work to play Killzone 2 multiplayer. There was a medal in the game that I needed to get my platinum trophy, but it required being ranked in the top 1% in the world for a week. So I spent a whole week hammering it; I played a lot anyway, but this was next level. It was so fun doing it and luckily resulted in me getting the plat (y)
 

Bo_Hazem

Banned
A Herculean effort, Bo. The closest to that for me was (pre wife and kids) taking a week off work to play Killzone 2 multiplayer. There was a medal in the game that I needed to get my platinum trophy, but it required being ranked in the top 1% in the world for a week. So I spent a whole week hammering it; I played a lot anyway, but this was next level. It was so fun doing it and luckily resulted in me getting the plat (y)

My Omani friend has literally had the most time played in Killzone SF for quite some years! I can't muster that dedication for online MP. But I did well with a one-and-done run (single attempt) on The Crew on day one of the competitions:

[The Crew screenshots]
 

dotnotbot

Member
Vincent just dropped his A90J review. It's mostly great, but the 4K 120 Hz mode in HDR has noticeable banding (~17:15):


So there are still some issues with bandwidth on this MediaTek chipset.
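
For a rough sense of why bandwidth gets so tight at 4K 120 Hz HDR, here's a back-of-the-envelope sketch (the CTA-861 timing is standard, but linking this arithmetic to the chipset is only the speculation above, not a confirmed MediaTek figure):

```python
# Back-of-the-envelope: uncompressed video bandwidth for 4K @ 120 Hz.
# Uses the standard CTA-861 4K120 timing (4400 x 2250 total, incl. blanking);
# HDMI 2.1 FRL link overhead is ignored, so these figures are approximate.

def video_bandwidth_gbps(h_total: int, v_total: int, refresh_hz: int,
                         bits_per_pixel: int) -> float:
    """Raw pixel data rate in Gbit/s for a given video timing."""
    return h_total * v_total * refresh_hz * bits_per_pixel / 1e9

print(video_bandwidth_gbps(4400, 2250, 120, 30))  # 10-bit RGB/4:4:4 -> ~35.6 Gbit/s
print(video_bandwidth_gbps(4400, 2250, 120, 24))  # 8-bit RGB/4:4:4  -> ~28.5 Gbit/s
# If the silicon can't sustain the full rate, it has to fall back to a lower
# bit depth or chroma subsampling, which is where visible banding can creep in.
```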
 
Last edited:

DeepEnigma

Gold Member
Yeah, seems like it's around last year's X950H, which was 1,200 nits. Any improvement would be a plus anyway; my panel is 500 nits and it usually melts my eyes, as I'm usually in a dim room.
I have my X90H set to 20 for brightness for SDR, and it took me 2 weeks for my eyes to adjust to the much brighter TV compared to the 4-year-older Sony 4K that I had. I could literally feel the sore strain in them the whole time; it lessened and went away.

Then I read reviewers who test everything with tools in retail-store-level lit rooms say, "It may not be bright enough for some," and I shake my head. What do people have their TVs set to? Retail Demo Mode all the time? Even at 20 it is bright enough (maybe for now?), and it hits the HDR nicely when it maxes everything.
 
Last edited:

Bo_Hazem

Banned
I have my X90H set to 20 for brightness for SDR, and it took me 2 weeks for my eyes to adjust to the much brighter TV compared to the 4-year-older Sony 4K that I had. I could literally feel the sore strain in them the whole time; it lessened and went away.

Then I read reviewers who test everything with tools in retail-store-level lit rooms say, "It may not be bright enough for some," and I shake my head. What do people have their TVs set to? Retail Demo Mode all the time? Even at 20 it is bright enough, and it hits the HDR nicely when it maxes everything.

Yeah, even my old 500-nit TVs literally melt my eyes. But there are some people who have extremely large windows with no curtains and have to compensate for that with higher brightness. My room is usually dim with 3 layers of curtains. :lollipop_tears_of_joy:
 
Last edited:

dotnotbot

Member
I don't know why anybody would want to buy a premium, expensive 120 Hz TV lacking VRR in 2021.

Well, it has some noticeable picture quality advantages over LG: better near-black handling, much better colour gradation, more impactful HDR (Vincent explains why; brightness numbers aren't everything), better upscaling and video processing in general. Depends what you prioritize.

Panasonic will probably use the exact same MediaTek chipset as Sony, and they also state that some things will be fixed with an update. VRR will be present day one on their OLEDs, but if you enable it in 4K 120 mode then vertical resolution will be halved. Panasonic seems confident that this is fixable at a later date. It seems like both companies are struggling with MediaTek issues and are either waiting for something to be fixed or working on a solution, but it just takes time.
 
Last edited:

DeepEnigma

Gold Member
Yeah, even my old 500-nit TVs literally melt my eyes. But there are some people who have extremely large windows with no curtains and have to compensate for that with higher brightness. My room is usually dim with 3 layers of curtains. :lollipop_tears_of_joy:
I can see that. I have very large/tall windows myself (over 10 ft ceilings), but also curtains that can block out the light. When I upgrade again next year or so, I will probably go through the eye strain thing for a short bit all over again, lol.

Rinse and repeat.
 

HeisenbergFX4

Gold Member
Yeah, even my old 500-nit TVs literally melt my eyes. But there are some people who have extremely large windows with no curtains and have to compensate for that with higher brightness. My room is usually dim with 3 layers of curtains. :lollipop_tears_of_joy:

That's my home office; it has a ton of natural light coming in, and through the day I do wish my 900H was overall brighter, but in the evening it's more than enough.

[photo of home office]
 
Vincent just dropped his A90J review. It's mostly great, but the 4K 120 Hz mode in HDR has noticeable banding (~17:15):


So there are still some issues with bandwidth on this MediaTek chipset.

They are not as smart as Philips, who use a customised MT9950, while Sony uses a shitty MT5895. Have fun waiting for the VRR and gaming feature updates.
 
I don't know why anybody would want to buy a premium, expensive 120 Hz TV lacking VRR in 2021.
I would if I didn't already have a Sony OLED. Since I use BFI while gaming, VRR is a no-go; ALLM is a more glaring omission for me. But the heatsink, the new processor and HDMI 2.1 are big upgrades for me.

Have an A8H currently, which has banding on the left side of the panel. Shit won't go away even though I've used panel refresh a bunch of times. If Sony doesn't replace it, I may have to cash in my warranty at Best Buy during BF for the A90J and pay the difference, whatever that is. Really hope Sony doesn't screw me though, as I'd like to wait longer to upgrade.
 

kyliethicc

Member
I have my X90H set to 20 for brightness for SDR, and it took me 2 weeks for my eyes to adjust to the much brighter TV compared to the 4-year-older Sony 4K that I had. I could literally feel the sore strain in them the whole time; it lessened and went away.

Then I read reviewers who test everything with tools in retail-store-level lit rooms say, "It may not be bright enough for some," and I shake my head. What do people have their TVs set to? Retail Demo Mode all the time? Even at 20 it is bright enough (maybe for now?), and it hits the HDR nicely when it maxes everything.
And the X90J gets even brighter than the X900H. For SDR in low-medium light, no one needs these new TVs above 50%.

But I can totally get why such high brightness is needed for HDR in rooms with medium-high sunlight.

Apple just put out a new Mini-LED iPad that can peak at 1600 nits. If that was a TV-sized panel it'd be blinding lol.
 

HeisenbergFX4

Gold Member
And the X90J gets even brighter than the X900H. For SDR in low-medium light, no one needs these new TVs above 50%.

But I can totally get why such high brightness is needed for HDR in rooms with medium-high sunlight.

Apple just put out a new Mini-LED iPad that can peak at 1600 nits. If that was a TV-sized panel it'd be blinding lol.

Would love those iPad specs in a somewhat affordable HDMI 2.1 monitor.
 

dolabla

Member
I have my X90H set to 20 for brightness for SDR, and it took me 2 weeks for my eyes to adjust to the much brighter TV compared to the 4-year-older Sony 4K that I had. I could literally feel the sore strain in them the whole time; it lessened and went away.

Then I read reviewers who test everything with tools in retail-store-level lit rooms say, "It may not be bright enough for some," and I shake my head. What do people have their TVs set to? Retail Demo Mode all the time? Even at 20 it is bright enough (maybe for now?), and it hits the HDR nicely when it maxes everything.

You aren't kidding :messenger_grinning_smiling:. I've got my brightness set to 20 too, and it's still bright as hell.

Enjoy. I'm still impressed by it. Been playing Spider-Man Remastered to finish up the Platinum and it's stunning on it.
It is a really impressive set. It absolutely smokes my X800D in picture quality. I finally got a chance to watch some Dolby Vision content on my (barely used) Panasonic UB820 player last night and it was amazing. I've been testing random SDR and HDR games on it and am in awe. Ghost of Tsushima looks as glorious as I had expected. This TV indeed gets BRIGHT.

The only issues I've had with it so far, I'm thinking, may be buggy software. I had the picture go black on me when I was switching to a different HDMI port. I could hear the sound, but the screen was black and I could never get a picture back; I had to unplug it from the wall to fix it. That's only happened once. The other is I've had some audio dropouts, maybe 2-3 times. I'm not sure if you've had anything like that happen yet?

Also, I have noticed some dirty screen effect, but it's not enough to ruin the experience. I know some people like to play the panel lottery, but you may end up with something even worse. I'll pass on that.

Overall, I am very happy with the picture quality. I can't wait to get my RetroTink 5X Pro (hopefully this week) to do some retro gaming. I did try the Framemeister (with my RGB NES) and those solid colors look so damn bright and the sprites look so crisp. I haven't tried the OSSC yet to see what all is compatible.
 

HTK

Banned
I took my Sony 65" 900H back and got a 55" LG CX. Sony has Android TV, which is 1000000x better than LG's own shit; however, the picture and gaming quality on the LG is amazing.

I was waiting for the C1, but after the reviews it made no sense to pay an extra $400 since it didn't offer anything special for the price premium.
 

Kuranghi

Member
And the X90J gets even brighter than the X900H. For SDR in low-medium light, no one needs these new TVs above 50%.

But I can totally get why such high brightness is needed for HDR in rooms with medium-high sunlight.

Apple just put out a new Mini-LED iPad that can peak at 1600 nits. If that was a TV-sized panel it'd be blinding lol.

The 65" ZD9 I use as my PC monitor is 1600 nits; I have to turn it to minimum brightness at night.

My calibrated daytime cinema preset is 41/50, so even with this crazy bright panel the backlight is nearly maxed out for daytime viewing; the night-time cinema preset is 23 and game mode is "only" 20, which is quite bright in the dark.
 

Kerotan

Member
A Herculean effort, Bo. The closest to that for me was (pre wife and kids) taking a week off work to play Killzone 2 multiplayer. There was a medal in the game that I needed to get my platinum trophy, but it required being ranked in the top 1% in the world for a week. So I spent a whole week hammering it; I played a lot anyway, but this was next level. It was so fun doing it and luckily resulted in me getting the plat (y)
Yeah, I played the shit out of Killzone 2 to get that trophy.

Back when MW2 launched I played it day and night. Managed to get into the top 300 for kills before the boosters and hackers took over the leaderboards.

My only claim to fame today is being top 10 in my country for Warzone wins, with over 700!
 

sankt-Antonio

:^)--?-<
I would if I didn't already have a Sony OLED. Since I use BFI while gaming, VRR is a no-go; ALLM is a more glaring omission for me. But the heatsink, the new processor and HDMI 2.1 are big upgrades for me.

Have an A8H currently, which has banding on the left side of the panel. Shit won't go away even though I've used panel refresh a bunch of times. If Sony doesn't replace it, I may have to cash in my warranty at Best Buy during BF for the A90J and pay the difference, whatever that is. Really hope Sony doesn't screw me though, as I'd like to wait longer to upgrade.
At least the A90J behaves like it has ALLM when a PS4/PS5 is connected.
Love my 65" A90J and hope that the VRR patch is coming sooner rather than later.
 
Last edited:

kyliethicc

Member
Basically the X90J is better than the X90H, but with a bit higher input lag and a bit worse panel uniformity.

Brighter panel, better contrast, better local dimming, all make for a better HDR experience.


 
Last edited:

Kuranghi

Member
Basically the X90J is better than the X90H, but with a bit higher input lag and a bit worse panel uniformity.

Brighter panel, better contrast, better local dimming, all make for a better HDR experience.



Sounds like a fantastic set in so many ways. Glad to see the native contrast and contrast with local dimming are back to 2018/2019 levels, and the upscaling, general processing and motion sound amazing. The 2.1/VRR caveats are pretty annoying if you are buying the TV primarily for gaming; hopefully they'll have it all worked out next year.

Outside of 2.1/VRR support I only have two major issues with it. 750 nits on a 10% window is quite disappointing compared to 2017-2019 models; 900+ nits for HDR really makes a difference. Now it has fewer of the general advantages LCD can have over OLED (the 10% window isn't any different from the latest OLEDs now), but with all the same (massive) disadvantages vs OLED.

Maybe you could say it matches OLED in that regard so it's not too bad, but the real issue is the 24 zones. Even if the dimming algorithm is improved, I still think that will cause a lot of clouding in challenging HDR content (I'll need to go see one in a shop soon, now that I can), and I'd say they've gone too far in trying to save money on the backlight.

We used to rip the piss out of LG for having 6-, 12- and 24-zone FALD backlights because HDR looked like crap on them. In fairness that was also due to the IPS panels, but it's still a big problem when you are trying to make things as bright as HDR demands. Maybe that's part of the reason for dropping the light output, to curb those problems, though I'm guessing it's more about having parity with OLED, winding down 4K LCD and saving money.

Of course this set is fantastic if you are coming from much older sets, edge-lit sets or recent entry-level sets, but if you see it next to something that outputs 1000, 1200 or even 1500 nits you'd know why dropping down to 700 from 1000+ (in many older 9-series models) is a very big deal for HDR.

At this point I say go OLED all the way if you have the dosh, because there are hardly any advantages to going LCD anymore. Although maybe the X95J will still have the 48 zones (or more) and higher brightness output of the previous years' 900/950 series. Do we know much about that model yet?
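
As a rough illustration of why so few zones worries me, here's a quick sketch (the even 6x4 grid is purely an assumption for the sake of the maths; Sony doesn't publish the actual zone layout):

```python
import math

# Rough illustration of dimming-zone size on a 65" 16:9 panel with 24 zones.
# The even 6 x 4 grid is an assumption for illustration only; the real zone
# layout isn't published.
DIAGONAL_IN = 65
width_in = DIAGONAL_IN * 16 / math.hypot(16, 9)   # ~56.7 inches
height_in = DIAGONAL_IN * 9 / math.hypot(16, 9)   # ~31.9 inches

zones_x, zones_y = 6, 4                           # 24 zones total (assumed grid)
zone_w = width_in / zones_x                       # ~9.4 inches wide
zone_h = height_in / zones_y                      # ~8.0 inches tall
print(f"Each zone covers roughly {zone_w:.1f} x {zone_h:.1f} inches")
# A small bright highlight forces a roughly 9 x 8 inch region of backlight up,
# which is why clouding/blooming is hard to avoid with only 24 zones.
```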
 