
Sony’s 2021 TV lineup runs Google TV and fully embraces HDMI 2.1

I bought the X900H just for PS5. I hate the fact it still tends to freeze every fkn night; I tried hard wiring too.

I think my next set will be an OLED, but I just have to wait till it goes below $3k.
 
I bought the X900H just for PS5. I hate the fact it still tends to freeze every fkn night; I tried hard wiring too.

I think my next set will be an OLED, but I just have to wait till it goes below $3k.
As nice as the A90J seems, asking $3K for a 55" is absurd. If gaming is your thang, you're much better off with the LG G1. Honestly, even the $2,200 they're asking for their 55" is a bit much. I think the LG C1 is the gaming sweet spot for 2021. The gaming focus that LG has this year is untouchable.
 

Kuranghi

Member
As nice as the A90J seems, asking $3K for a 55" is absurd. If gaming is your thang, you're much better off with the LG G1. Honestly, even the $2,200 they're asking for their 55" is a bit much. I think the LG C1 is the gaming sweet spot for 2021. The gaming focus that LG has this year is untouchable.

Yeah, it's the same launch price the first OLED, the A1, had in 2017. Is the 65" A90J 5 grand? That's what the 65A1 was. Totally mad prices for most people; it's one to watch for clearance/open box.
 
Yeah, it's the same launch price the first OLED, the A1, had in 2017. Is the 65" A90J 5 grand? That's what the 65A1 was. Totally mad prices for most people; it's one to watch for clearance/open box.
We don't even know if Sony is using LG's latest EVO panel. I would guess so but it's also possible they are using the same panel as the C1. What we do know is that they've incorporated a heatsink so that's how they've managed to hit such high brightness. As far as I know, the LG G1 does not use a heatsink, so the extra brightness is purely down to the new panel design. Both have a focus on increasing brightness on specular highlights.

I would also like to point out that the EVO panel brings 20% greater efficiency: slightly higher brightness at lower power consumption, and less chance of burn-in due to reduced stress. I guess Sony is doing something similar with their heatsink implementation, though that of course has nothing to do with power consumption. I want to see a power consumption comparison between the G1 and A90J.
 

Kuranghi

Member
4K 120hz blur has been fixed on the X90J. eARC @ 120hz is still broken on both sets causing audio dropouts.



Glad they fixed it, but I hoped he would show side-by-sides of the image presentation (outside of this issue) from both sets to see how different it is. I think it will be a massive jump just going by prior experience, but I really want to know whether XR is better in every way than X1U or not. Or even X1E for that matter, since they seem to be pretending that doesn't exist anymore.
 

Kuranghi

Member
Nope that guy is misleading.



I'm not sure it truly is fixed, but this guy's test is terrible regardless: he's comparing on console, where you can't control the game settings, and it's a comparison of a 4K image supersampled from 6K against an image that's "native" 4K with heavily reduced 2D layer detail to achieve the 120hz.

We need to see PC output of the same game, at the same settings, one test at 4K60 and the other at 4K120. I personally would guess they haven't fixed it, because if the image processing is more intensive with XR vs X1 (or the same), and it's still not using a separate chip for the OS/UI like before with some of the X1E and X1U 9-series LCDs, then I don't see how they'd have the extra grunt to render 4K120 properly now when they didn't last year. Unless they've upgraded the chipset to be much more powerful and it can do the job of the old separate chips AND more, to accommodate the new processing jazz.

If you want a Sony LCD, I still say buy an XF90/X900F/XG90/X900G (the 49" XG has an 8-bit+2-bit FRC panel rather than 10-bit, other sizes are 10-bit; the XG has metal legs though, which is nice) or XG95/X950G before they disappear entirely if you can (or an X950H if you can't find them in 2021), and forget about 4K120/2.1 support for now, because they don't see it as a big enough priority to make the sets more expensive for it right now. They have always been focused on TV/film/image quality over a better game experience.

I realise that even if you don't care about 4K120 support from 2.1, you DO get a benefit to 4K60 HDR output with 2.1, in the form of not having to reduce chroma to 4:2:2 to fit the signal within the HDMI 2.0b bandwidth, but I doubt most people would even notice the difference there, compared to the difference in IQ between the 4K60 output and the 4K120 output of the X90H/X900H for instance.
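The chroma point above can be sanity-checked with rough arithmetic. A minimal sketch, ignoring blanking intervals (real HDMI link budgets are tighter), showing why 4K60 10-bit 4:4:4 doesn't fit in HDMI 2.0b but 4:2:2 does:

```python
# Rough uncompressed video data rates vs. the HDMI 2.0b link budget.
# Simplification: active pixels only, no blanking; figures are approximate.

def data_rate_gbps(width, height, fps, bits_per_component, chroma_factor):
    """chroma_factor: 3.0 for 4:4:4, 2.0 for 4:2:2, 1.5 for 4:2:0."""
    bits_per_pixel = bits_per_component * chroma_factor
    return width * height * fps * bits_per_pixel / 1e9

HDMI_20_EFFECTIVE_GBPS = 14.4  # ~18 Gbps raw minus 8b/10b encoding overhead

full = data_rate_gbps(3840, 2160, 60, 10, 3.0)  # 4K60 HDR at 4:4:4
sub = data_rate_gbps(3840, 2160, 60, 10, 2.0)   # same signal at 4:2:2

print(f"4:4:4 needs {full:.1f} Gbps, fits: {full <= HDMI_20_EFFECTIVE_GBPS}")
print(f"4:2:2 needs {sub:.1f} Gbps, fits: {sub <= HDMI_20_EFFECTIVE_GBPS}")
```

Even before blanking overhead, 4:4:4 already exceeds the effective budget, which is why the set drops to 4:2:2.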

Other 2.1 features like ALLM are not needed for PS4/PS5 use, because Sony have had a proprietary system in place for years that does better than ALLM when you use a Sony TV with a PS4/PS5: it can even detect when you hit the PS button and go back to the right input, and the TV remote can control the PS/media menu via HDMI-CEC/Bravia Sync, so if you are watching movies you don't even need to pick up the controller.

VRR isn't here and won't be till they iron out all the kinks, or whatever is stopping them from adding it (probably just money, i.e. they don't think it's worth it for the majority of their customers), so you aren't really getting anywhere near proper 2.1 support, and it's not even going to increase the resale value of the TV or keep it relevant longer.

If you really need/want a new TV right now, you'd be much better served buying something older, without 2.1 but with better image quality (much better in some comparisons, like X950G vs X900H, and presumably also the X90J according to early reports), for cheaper, and then in ~3 years (pure guessing there really lol), when 2.1 and VRR are standard across all premium models, you can get a hot piece of kit with all the stuff and no bugs/caveats.

Meanwhile, everyone that bought the X900H/X90J is either still waiting for VRR support or got it with caveats. The TVs I mentioned above, especially the XG95/X950G, are definitely still available as open box or refurbished with full warranties for the same or less money than the X900H.

Fucking hell, long screed again. Being manic + talking about TVs + too much coffee = long rambling posts lol

Scottish TL;DR - don't buy the new wans, buy the old wans.
 

Kuranghi

Member
Found out the guy who posted that video is running the owners thread over on the AVSForum: https://www.avsforum.com/threads/2021-sony-bravia-xr-x90j-owners-thread-faq-no-price-talk.3190601/

He says he's got a 3080. He said he sent that broken 120hz video to Sony.

He also posted another video of real world performance between the X900H and X90J.



Well, I'm glad this guy thinks it's a jump up, but I want to see a comparison with X1U sets from 2020 and prior. I don't trust the picture in the thread comparing the X90J (XR) and X950H (X1U), because in the video he's saying the XR is a bit better than the downgraded X1 in the X900H, which isn't even as good as the X1E, let alone the X1U, so I don't know how the X90J could look that much better than the X950H.

Unless the X1U in the 2020 sets was also downgraded over the <2020 X1U, like the X1 was, but I don't think that's the case from what I read/watched last year. The Chinese image looks deceptive to me; happy to be proven wrong though!

Bring on the expert/technical comparisons!

edit - I had a look at VT's X950H review and he didn't say the image processing was downgraded over the <2020 X1U like he did with the 2020 X1, but he also didn't compare it directly with those old sets, so I think it's the same chip as the old ones and not a rebrand:




That makes me think the Chinese comparison image is disingenuous, more so than before.
 

dolabla

Member
Well, I'm glad this guy thinks it's a jump up, but I want to see a comparison with X1U sets from 2020 and prior. I don't trust the picture in the thread comparing the X90J (XR) and X950H (X1U), because in the video he's saying the XR is a bit better than the downgraded X1 in the X900H, which isn't even as good as the X1E, let alone the X1U, so I don't know how the X90J could look that much better than the X950H.

Unless the X1U in the 2020 sets was also downgraded over the <2020 X1U, like the X1 was, but I don't think that's the case from what I read/watched last year. The Chinese image looks deceptive to me; happy to be proven wrong though!

Bring on the expert/technical comparisons!
RTINGS bought their set. So we should know something soon!

That Chinese pic is from this video (timestamped it for you) below. He posted a video a few weeks ago (which I posted earlier in this thread) and then he posted this follow up video like a week ago.

 

Kuranghi

Member
RTINGS bought their set. So we should know something soon!

That Chinese pic is from this video (timestamped it for you) below. He posted a video a few weeks ago (which I posted earlier in this thread) and then he posted this follow up video like a week ago.



Oh cheers mate, appreciate it. It will be good to see what RTINGS say about colour and brightness and stuff, but going to them for information about image processing is pointless imo; they don't care about that because they believe it's too subjective.

I think there is actually more detail on the X950H side; if you look at the woman's furrowed brow, or skin in general, it looks more smoothed over on the X90J side. It could easily just be a camera effect though, and I know I shouldn't evaluate it through off-screen images, but by the same rationale the difference in colour saturation and contrast could be discounted. The left side is less pleasing to me even though it has more contrast; it just seems too "thick" to me, with less gradation in the image, but again, it's probably the camera.

My biggest fear (lol, TV fear!) is that they are going down the Samsung route and just pumping up the colour and contrast to compete with Samsung's hyper-reality. I don't want that, I want an accurate image.
 

dolabla

Member
Oh cheers mate, appreciate it. It will be good to see what RTINGS say about colour and brightness and stuff, but going to them for information about image processing is pointless imo; they don't care about that because they believe it's too subjective.

I think there is actually more detail on the X950H side; if you look at the woman's furrowed brow, or skin in general, it looks more smoothed over on the X90J side. It could easily just be a camera effect though, and I know I shouldn't evaluate it through off-screen images, but by the same rationale the difference in colour saturation and contrast could be discounted. The left side is less pleasing to me even though it has more contrast; it just seems too "thick" to me, with less gradation in the image, but again, it's probably the camera.

My biggest fear (lol, TV fear!) is that they are going down the Samsung route and just pumping up the colour and contrast to compete with Samsung's hyper-reality. I don't want that, I want an accurate image.
Yeah, I'm with you on that. I hope they don't abandon color accuracy as that was one of the main reasons I liked them over others. Samsung goes all out with the colors and it's a little too much.

I did read a comment on one of the X90J youtube vids that the guy Ken (the Chinese reviewer in that video above) said the X90J color accuracy was excellent (or something along those lines). Not sure if that was true or somebody was just lying/trolling in the comments. Luckily, it won't be too much longer since they're out in the wild now and reviewers will be getting theirs.

I'm personally waiting to see if my Best Buy gets any 50" in stock at the local store (they have the 55" in right now). Mainly because the LG CX I received last year ended up getting damaged in transit, so I sent it back and didn't bother to get another one (got to thinking about OLED burn-in as well, and just decided on LCD again and to wait until 2021). Kind of left a bad taste in my mouth. If they don't, I guess I'll have no choice but to get one shipped to me.
 

Kuranghi

Member
Uh oh, borked 120hz still on the new 900 would not be a good look.

Most VA panel TVs have a 100% response time above 10ms anyway, so 120Hz is best seen on an IPS monitor if you really care about motion clarity. The X900H is ~11.3ms, about a third longer than a 120Hz frame (8.33ms), so in dark scenes it will smear because the next frame arrives before the panel has finished transitioning.

It's not good though, aye; they should just not do it if they don't care to do it properly.
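As a quick back-of-envelope check of the numbers above (the ~11.3ms response figure is the one quoted for the X900H, not a measurement here):

```python
# Panel response time vs. the length of one 120Hz frame.
# The ~11.3ms 100% response time for the X900H is taken from the post above.

def frame_time_ms(hz):
    """Duration of one frame at the given refresh rate, in milliseconds."""
    return 1000.0 / hz

response_ms = 11.3            # quoted X900H figure (assumption)
ft_120 = frame_time_ms(120)   # ~8.33 ms
excess_pct = (response_ms / ft_120 - 1) * 100

# Pixels are still mid-transition when the next frame arrives, hence the smear.
print(f"120Hz frame: {ft_120:.2f} ms, response exceeds it by ~{excess_pct:.0f}%")
```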
 
Waiting on the TCL OD Zero televisions to come out before I make my final decision. So far I'm not impressed by mini-LED tech: still blooming and viewing-angle issues. The worst part is the performance loss in game mode. You pay lots of money for all that brightness and contrast in an LCD just to have a washed-out image in game mode. Image processing takes a huge hit since it prioritizes input lag. I don't think manufacturers will ever have a powerful enough processor to ensure deep contrast even in game mode. Chances are I'll end up getting a 55" LG C1.
 

Bojanglez

The Amiga Brotherhood
Waiting on the TCL OD Zero televisions to come out before I make my final decision. So far I'm not impressed by mini-LED tech: still blooming and viewing-angle issues. The worst part is the performance loss in game mode. You pay lots of money for all that brightness and contrast in an LCD just to have a washed-out image in game mode. Image processing takes a huge hit since it prioritizes input lag. I don't think manufacturers will ever have a powerful enough processor to ensure deep contrast even in game mode. Chances are I'll end up getting a 55" LG C1.
I'm going to wait until reviews and full comparisons come out, but as it stands there may be too many compromises with these Sony sets for gaming. I have some concerns about OLED (with kids leaving the TV on) but I'm leaning back towards the C1 just because it seems to have such good out-of-the-box support for gaming. If the X90J gets decent reviews I may take a punt; worst case it can become my bedroom TV in a couple of years.
 

usctrojan

Member
Picked up a 65-inch A90J two or so weeks ago and gave it an obsessive and detailed examination. The picture is generally phenomenal if you ignore the details, but I assumed that in 2021 my four complaints wouldn't have to exist:

1) Apple TV not logging in. I have a Google TV connected that I manually switch to for it at the moment.

2) Grey uniformity isn't perfect. My set has a visible pink tint on the left side of the screen.

3) Very, very light banding. Probably better than I've ever seen on an OLED, but still, those “omg it's flawless” video reviews are total nonsense.

4) Expected, but the warping reflections on the panel can be a bit distracting. The image itself isn't affected, but with a bright backdrop it's hard to ignore the funhouse-mirror effect.
 

Bo_Hazem

Banned

Kuranghi

Member
Picked up a 65-inch A90J two or so weeks ago and gave it an obsessive and detailed examination. The picture is generally phenomenal if you ignore the details, but I assumed that in 2021 my four complaints wouldn't have to exist:

1) Apple TV not logging in. I have a Google TV connected that I manually switch to for it at the moment.

2) Grey uniformity isn't perfect. My set has a visible pink tint on the left side of the screen.

3) Very, very light banding. Probably better than I've ever seen on an OLED, but still, those “omg it's flawless” video reviews are total nonsense.

4) Expected, but the warping reflections on the panel can be a bit distracting. The image itself isn't affected, but with a bright backdrop it's hard to ignore the funhouse-mirror effect.

Can I ask if you saw issues 2 and 3 on test patterns or in actual content? By banding do you mean bad gradients or highlight banding?
 

Kuranghi

Member
I'm going to wait until reviews and full comparisons come out, but as it stands there may be too many compromises with these Sony sets for gaming. I have some concerns about OLED (with kids leaving the TV on) but I'm leaning back towards the C1 just because it seems to have such good out-of-the-box support for gaming. If the X90J gets decent reviews I may take a punt; worst case it can become my bedroom TV in a couple of years.

By "ready for gaming" I assume you mean all HDMI 2.1 features are supported, like VRR and 4K120? In fairness to Sony, I don't know of any sets that support those features with no caveats yet: LG OLEDs support 4K120 but have issues with VRR (gamma), Sony has the 4K120 blurriness, and Samsung supports 4K120 but the game-mode local dimming has been rubbish for a while.

Is there a single model that has 4K120 with no issues, VRR, and a non-gimped game mode?
 

usctrojan

Member
Can I ask if you saw issues 2 and 3 on test patterns or in actual content? By banding do you mean bad gradients or highlight banding?
Pink tint is only visible in very light grey splash screens (like the YouTube TV loading screen) or in commercials where a logo is splashed on screen for a bit (think of an insurance commercial with a brief white screen showing the GEICO logo, etc.). Not really noticeable in regular content. Disclaimer: I have an eagle eye, so I see every minuscule variation in tone/color.

The banding is in the highlights and not really visible when watching actual content. I only notice it in the dimmest 2-3 settings on very light grey test patterns (or the YouTube TV loading screen, which is more or less the same color).

It's not at all bad, but going into the purchase having watched those two “pro calibrators” gush about perfection, it's easy to point out that it's not perfect at all. Unless Sony sent them perfectly binned units (but even the best OLED I have ever seen has some issues at dim backlight settings). They mention they opened two and the second was flawless?
 
Most VA panel TVs have a 100% response time above 10ms anyway, so 120Hz is best seen on an IPS monitor if you really care about motion clarity. The X900H is ~11.3ms, about a third longer than a 120Hz frame (8.33ms), so in dark scenes it will smear because the next frame arrives before the panel has finished transitioning.

It's not good though, aye; they should just not do it if they don't care to do it properly.
Sony's IPS panels are slower in response time than their VA panels.

So response time is the issue? I thought it was that the TV couldn't actually output real 4K120? I should look more into this; admittedly I never looked too hard, as I didn't care about the X900H at all.
 

Kuranghi

Member
Sony's IPS panels are slower in response time than their VA panels.

So response time is the issue? I thought it was that the TV couldn't actually output real 4K120? I should look more into this; admittedly I never looked too hard, as I didn't care about the X900H at all.

That's why I said IPS monitors; I don't mean Sony IPS TVs. I was just making the observation that 120Hz won't be nearly as good on TV-sized panels, because the total response time of the panel is longer than a 120Hz frame (8.33ms), so there will be certain situations where you won't see as clearly as you would on a faster panel; the frames will smear together. On OLED it's not a problem, as they have a 2-3ms total response time.

No, the issue with 4K120 on the X900H is that it can't present/process 3840x2160 at 120 frames a second, so I think what they do is reduce the resolution on one axis, like how Call of Duty games do dynamic resolution scaling. For example: it renders 1920x2160 but scales that internally to 3840x2160.

Vincent Teoh said LG make their own in-house components for the internals of the TV, so while Sony is limited to the 2x HDMI 2.1 ports available on the parts they can/are willing to buy, LG can just add however many they want; maybe it's the same for the chipset that processes the image.

It's a processing thing; I think the panel itself could display 4K120, because it can display 4K and 120Hz separately without issues. It's a bandwidth-of-the-chip sort of thing maybe? 4K120 working at input and output, but messed up somewhere in between.
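The half-resolution theory above is easy to check with pixel-throughput arithmetic. Note the 1920x2160 figure is the poster's guess at what the set renders internally, not something Sony has confirmed:

```python
# Pixels per second the processor must handle in each mode. Halving the
# horizontal resolution at 120Hz lands exactly on full 4K60 throughput,
# which would fit a chip sized for 4K60. (1920x2160 is a guess, see above.)

def pixels_per_second(width, height, fps):
    return width * height * fps

full_4k60 = pixels_per_second(3840, 2160, 60)
full_4k120 = pixels_per_second(3840, 2160, 120)
half_4k120 = pixels_per_second(1920, 2160, 120)

print(f"4K60:           {full_4k60:>13,} px/s")
print(f"4K120:          {full_4k120:>13,} px/s (double)")
print(f"1920x2160@120:  {half_4k120:>13,} px/s (same as 4K60)")
```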
 
That's why I said IPS monitors; I don't mean Sony IPS TVs. I was just making the observation that 120Hz won't be nearly as good on TV-sized panels, because the total response time of the panel is longer than a 120Hz frame (8.33ms), so there will be certain situations where you won't see as clearly as you would on a faster panel; the frames will smear together. On OLED it's not a problem, as they have a 2-3ms total response time.

No, the issue with 4K120 on the X900H is that it can't present/process 3840x2160 at 120 frames a second, so I think what they do is reduce the resolution on one axis, like how Call of Duty games do dynamic resolution scaling. For example: it renders 1920x2160 but scales that internally to 3840x2160.

Vincent Teoh said LG make their own in-house components for the internals of the TV, so while Sony is limited to the 2x HDMI 2.1 ports available on the parts they can/are willing to buy, LG can just add however many they want; maybe it's the same for the chipset that processes the image.

It's a processing thing; I think the panel itself could display 4K120, because it can display 4K and 120Hz separately without issues. It's a bandwidth-of-the-chip sort of thing maybe? 4K120 working at input and output, but messed up somewhere in between.
Makes sense, I thought it was most likely the processor. Though I'd never sacrifice colours, uniformity, black level and HDR just for smoother 120fps motion on an IPS monitor (not even factoring in BFI). Plus I don't trust the monitor response times that are advertised anyway, and I don't just mean the “1ms GtG” bs.

But I mean, if the X90J has the new processor, you'd think it wouldn't be an issue. But perhaps it's not the same. We will see, I suppose.
 
A90J have VRR yet?
Man, I so wish TVs had never gone “smart”. Firmware updates to fix or add something that should have been there day one are completely absurd, but here we are.

The worst thing I ever heard was Samsung nerfing the Q9FN's picture quality with an update to try and get users to upgrade. How they didn't get sued into oblivion is beyond me.

Seriously, why put power in these companies' hands for the entire time you own the thing?!

Never update unless it fixes something, then disconnect again.
 

Kuranghi

Member
Pink tint is only visible in very light grey splash screens (like the YouTube TV loading screen) or in commercials where a logo is splashed on screen for a bit (think of an insurance commercial with a brief white screen showing the GEICO logo, etc.). Not really noticeable in regular content. Disclaimer: I have an eagle eye, so I see every minuscule variation in tone/color.

The banding is in the highlights and not really visible when watching actual content. I only notice it in the dimmest 2-3 settings on very light grey test patterns (or the YouTube TV loading screen, which is more or less the same color).

It's not at all bad, but going into the purchase having watched those two “pro calibrators” gush about perfection, it's easy to point out that it's not perfect at all. Unless Sony sent them perfectly binned units (but even the best OLED I have ever seen has some issues at dim backlight settings). They mention they opened two and the second was flawless?

Yeah, I think perfect to them isn't what you or I would consider perfect; they mean compared to what's come before. Vincent Teoh told me my ZD9 was really good in terms of DSE, and it's pretty visible when I play games with very light-coloured solid backgrounds (the game equivalent of ice hockey, really), but to be honest when I focus on the game itself I don't see the DSE, only when I look for it.

I would never put 5% grey slides up or anything though; that's just a path to pain imo. I didn't have the luxury of returning it though: I got it in 2018 (18 months after launch) and it was one of the last in my country. I use mine as a desktop display for a PC, so I see solid greys on it all day ha; since it's a VA panel, off-axis desaturation and loss of contrast are more of a problem than DSE. The viewing angle is super narrow for actually getting the performance it's known for, and I sit too close to it (it's 65"), so even though I'm completely straight on I can still see the fade from the centre to the outer edges in solid colours; it's not noticeable in actual content though. Ideally I think I should go a few feet further back, but I don't really have the room to do that. It's a silly setup!
 

Bojanglez

The Amiga Brotherhood
By "ready for gaming" I assume you mean all HDMI 2.1 features are supported, like VRR and 4K120? In fairness to Sony, I don't know of any sets that support those features with no caveats yet: LG OLEDs support 4K120 but have issues with VRR (gamma), Sony has the 4K120 blurriness, and Samsung supports 4K120 but the game-mode local dimming has been rubbish for a while.

Is there a single model that has 4K120 with no issues, VRR, and a non-gimped game mode?
To be fair, I didn't actually say "ready for gaming"; I said the LG C1 had better out-of-the-box support for gaming. I think that is reasonable to say at this stage, as they have 4 HDMI ports that all support 2.1 and (as far as I can tell) all support the features that may be useful for gaming (4K 120Hz, VRR, G-Sync) out of the box, without having to wait for firmware updates.

You are right though that results may vary by implementation, and in reality I personally would not use G-Sync on my TV. So it mostly comes down to whether I can deal with only 2 HDMI 2.1 ports, knowing that one of those would be required for eARC and therefore I may need to manually change ports between PS5 and XSX depending on whether I want to play at 4K/120. Also, do I trust Sony to deliver VRR at all? For the PS5 this seems less of an issue as there is very little tearing, but the XSX would benefit a bit more from it.

I'm still going around in circles, so I'll be interested in the reviews. This morning I am thinking X90J at 65", but who knows what I'll think tomorrow 😆
 

dolabla

Member
Well, looks like horrible news from the X90J :messenger_neutral:


Have the X90J next to the C1 now. Not sure how well this will show on camera yet, but there is a noticeable judder when panning in games. I stood on a ledge in Destiny 2 and, aiming down sights, looked back and forth at various speeds. It makes it look as though it's dropping frames, but on the C1 it's completely smooth.


Just checked the X900H to make sure I'm not going crazy or something lol.
The X900H doesn't have the judder, just the X90J. Also, input lag on the X900H matches the C1 more or less (no perceivable difference).
The X90J is perceptibly, albeit very slightly, slower in input lag than the X900H and C1.

Guessing that maybe Sony really wants to push the x95j this year, or they really f'd up.


Updated the second post with this.

I cannot recommend this TV in its current state.
Gaming has noticeable judder and higher input lag than the X900H, and the Dolby Vision color science has a strong green push.
The only way around the judder is to not be in game mode and to have CineMotion and Motionflow settings on, which adds lots of input lag.

Quantum TV posted a video about it showing the juddery panning (I know he can be annoying, but it looks like he's right, since that guy is experiencing the same thing):

 

Kuranghi

Member
To be fair, I didn't actually say "ready for gaming"; I said the LG C1 had better out-of-the-box support for gaming. I think that is reasonable to say at this stage, as they have 4 HDMI ports that all support 2.1 and (as far as I can tell) all support the features that may be useful for gaming (4K 120Hz, VRR, G-Sync) out of the box, without having to wait for firmware updates.

You are right though that results may vary by implementation, and in reality I personally would not use G-Sync on my TV. So it mostly comes down to whether I can deal with only 2 HDMI 2.1 ports, knowing that one of those would be required for eARC and therefore I may need to manually change ports between PS5 and XSX depending on whether I want to play at 4K/120. Also, do I trust Sony to deliver VRR at all? For the PS5 this seems less of an issue as there is very little tearing, but the XSX would benefit a bit more from it.

I'm still going around in circles, so I'll be interested in the reviews. This morning I am thinking X90J at 65", but who knows what I'll think tomorrow 😆

Sorry about the misquote, I'm not doing too well recently.
 

Bojanglez

The Amiga Brotherhood
Well, looks like horrible news from the X90J :messenger_neutral:










Quantum TV posted a video about it showing the juddery panning (I know he can be annoying, but it looks like he's right, since that guy is experiencing the same thing):


Can this dude be trusted? Things that go against him for me:
  • He wears a jarg Deadpool mask
  • He spams YouTube with 50 videos about the same subject
I'm hesitant to even give him any clicks for these reasons.
 

dolabla

Member
Can this dude be trusted? Things that go against him for me:
  • He wears a jarg Deadpool mask
  • He spams YouTube with 50 videos about the same subject
I'm hesitant to even give him any clicks for these reasons.
He's pretty damn annoying, and no doubt I don't trust him considering he rips every TV to shreds. But it's being confirmed by the OP of the owners thread on the AVSForum, so it looks like he is right.

He was initially very high on this TV until he was asked to check whether what Quantum was showing was actually true. Looks to be true, and what a huge disappointment it is. This is something that should not be broken out of the box. Can't believe Sony thought it was okay to release it like this. Maybe they fix it in an update? Maybe, but their track record of fixing/adding things isn't exactly the best right now.

Shit sucks, as I was super excited for this set.
 

Bojanglez

The Amiga Brotherhood
He's pretty damn annoying, and no doubt I don't trust him considering he rips every TV to shreds. But it's being confirmed by the OP of the owners thread on the AVSForum, so it looks like he is right.

He was initially very high on this TV until he was asked to check whether what Quantum was showing was actually true. Looks to be true, and what a huge disappointment it is. This is something that should not be broken out of the box. Can't believe Sony thought it was okay to release it like this. Maybe they fix it in an update? Maybe, but their track record of fixing/adding things isn't exactly the best right now.

Shit sucks, as I was super excited for this set.
Yeah I've just gone back into that thread to read what is being said. I mean, best case is that a firmware update fixes it, but Sony's reputation for releasing timely fixes seems to be in the mud (at least on AVS). Super disappointing and sounds like it will be an issue on all models with the XR chip without an update. 😟
 

Kuranghi

Member
Given the way 4K LCD is going, my advice is to buy a ZD9, XE94, XE93, XG95 (except if buying 75" or 85", then drop down to the XF90) or XF90, in that order of preference, ASAP (the first 3 will be very hard to find non-used/refurbed now) if you want the best movie/TV/game "popping" HDR experience; otherwise just buy a 2.1 OLED and wait for microLED sets. If you mostly watch HDR content that's very dark and barely has any bright scenes, then OLED will be better overall, or if you don't mind light-controlling the room/not watching during the day, then OLED will be best in almost all regards. I still prefer 24/30Hz motion on LCDs though, especially the ZD9 and XE94. 24Hz content looks amazing on my ZD9, and even better on the XE94 I'd imagine, since its 100% PRT is almost exactly one 24th of a second, ~40ms. My ZD9 is ~30ms, hence why 30Hz (33.33ms) games are so smooth/playable on it.
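The 24/30Hz point above comes down to whether the panel finishes its transition within one frame period. A small sketch using the response figures quoted in the post (~40ms XE94, ~30ms ZD9; treat both as assumptions):

```python
# Frame period vs. quoted 100% panel response time: motion reads as clean
# when the transition completes within one frame. Figures are from the post.

def frame_period_ms(hz):
    return 1000.0 / hz

panels = {"XE94": 40.0, "ZD9": 30.0}  # quoted response times in ms (assumed)
for name, response in panels.items():
    for hz in (24, 30):
        period = frame_period_ms(hz)
        fits = response <= period
        print(f"{name}: {hz}Hz frame = {period:.1f} ms -> fits: {fits}")
```

Which matches the claim: the ZD9's ~30ms sits inside a 30Hz frame (33.3ms), while the XE94's ~40ms is almost exactly a 24Hz frame (41.7ms).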








No, none of these sets support HDMI 2.1 or VRR.
 

dolabla

Member
Given the way 4K LCD is going, my advice is to buy a ZD9, XE94, XE93, XG95 (except if buying 75" or 85", then drop down to the XF90) or XF90, in that order of preference, ASAP (the first 3 will be very hard to find non-used/refurbed now) if you want the best movie/TV/game "popping" HDR experience; otherwise just buy a 2.1 OLED and wait for microLED sets. If you mostly watch HDR content that's very dark and barely has any bright scenes, then OLED will be better overall, or if you don't mind light-controlling the room/not watching during the day, then OLED will be best in almost all regards. I still prefer 24/30Hz motion on LCDs though.








No, none of these sets support HDMI 2.1 or VRR.

Yeah, with this news about the X90J, I'm at the point of just saying screw it and getting another OLED. The C1 possibly, since it comes in a 48" size. There really just isn't a good market for the size I want (50" and under).

Burn-in is definitely a worry though, which is why I didn't bother getting a replacement when my CX arrived last year. I just got to thinking that it wasn't for me due to playing older consoles (via the OSSC and Framemeister) that have black bars on the sides (uneven aging of the screen possibly?), and those older games have lots of static elements. I do mix in a lot of TV/movies and of course newer consoles that fill the screen, so maybe that will help?
 

Bo_Hazem

Banned
Yeah, with this news about the X90J, I'm at the point of just saying screw it and getting another OLED. The C1 possibly, since it comes in a 48" size. There really just isn't a good market for the size I want (50" and under).

Burn-in is definitely a worry though, which is why I didn't bother getting a replacement when my CX arrived last year. I just got to thinking that it wasn't for me due to playing older consoles (via the OSSC and Framemeister) that have black bars on the sides (uneven aging of the screen possibly?), and those older games have lots of static elements. I do mix in a lot of TV/movies and of course newer consoles that fill the screen, so maybe that will help?

I would say either way we should wait until Black Friday; by then we'll see how it performs.
 

Kuranghi

Member
Yeah, with this news about the X90J, I'm at the point of just saying screw it and getting another OLED. The C1 possibly since it comes in a 48" size. There really just isn't a good market for the size I want (50" and under).

Burn is definitely a worry though, which is why I didn't bother getting a replacement when my CX arrived last year. Just got to thinking that it wasn't for me due to playing older consoles (via the OSSC and Framemeister) that have black bars on the side (uneven aging of the screen possibly?) and those older games have lots of static elements. I do mix in a lot of tv/movies and of course newer consoles that fill the screen so maybe that will help?

I think it's fine to use OLED for gaming/desktop use, but you have to take some precautions, like hiding static elements and having a slideshow of wallpapers that's randomly ordered and has a LOT of images to go through, like 100s at least. I can't speak to burning in black bars from devices like the OSSC and Framemeister though, unfortunately.
 