Since the X1 processor (with its dual-database lookup) that debuted in the KD-55/65/85 X9000A TVs at the PS4 launch, Sony has been using a form of ML upscaling in their signal processing. Semantically that's no different from the lag that Deep Learning Super Sampling (DLSS) adds, except that one is in-engine and one isn't, and the lag in the TV chip is a function of the fixed silicon/settings. So it will be interesting to see who likes DLSS when its lag is small but is anti-ML-upscaling in TVs when the lag is equally small.
I think OP is trolling us. Can't be real.
I really need to ask where I'm trolling. There are a good number of games with notably bad IQ because of BC, or with horrible resolution at 60 FPS. Reality Creation on the latest Bravias has a dedicated chip to help fix that, and the input lag is not far off a DLSS implementation, but I still keep reading posts like "lololol, I'd prefer horrible IQ because you idiot, you don't understand interpolation." Even in primary school I got less childish attitude.
I’m happy you made the thread. I didn’t think of using it and that shit was unplayable in 1080p without it. Now I’m 6 hours in and loving it.
So wait, you blame me for praising Reality Creation, but meanwhile you can't even be bothered to post a link to a test of that intolerable Reality Creation input lag? Wow.
Dude, you are free to ignore the thread instead of complaining about its stupidity like the many others here who are just here to mock me. So I can't even complain if someone trolls the whole time while adding nothing to the discussion, are you serious? You even get pissed off when I ask you to post a link about the extent of the input lag. Since even DLSS has this issue, I think it's normal to ask for more data; that's the point of the thread.
Google is FREE, pal.
Have you stopped to wonder why everyone keeps roasting you in these threads?
Do the goddamned research instead of all this yammering on endlessly and erroneously about established realities and for the love of bacon stop making these silly threads, ffs.
It doesn't seem as though English is your native tongue, otherwise I would question that tenuous grasp you seem to have on it.... because the result is most of what you are saying sounds like gibberish, to be frank.
Sorry guys... I'm guilty of this too...
I noticed the lag when many others started blaming this thread (honestly, I wouldn't even have noticed it without a direct comparison and paying attention), but I have to say it's very minimal, and in a game like Dying Light 2 it's preferable any day to the muddy IQ you get at 1080p. It's a nightmare to play on a 4K screen. Yes, there is the resolution mode, but there the input lag of 30 FPS is definitely more noticeable than Reality Creation's. That said, why does no one put the same energy into laughing at DLSS as they did at the input lag of Bravia TV upscaling? ML upscaling causes it in every implementation.
But it does improve the image quality, especially in games with softer output, and it's not the same as sharpening; there is no over-sharpening effect.
Although I wouldn't set it as high as the OP; Automatic should be enough.
I can't say I've noticed the lag with ms-sensitive games like VFV and Street Fighter V, so it can't be much, and I mostly play slower-paced games anyway.
Not saying there isn't lag, I'm sure there is, but if you can't notice it, who cares?
But I understand people's reaction.
It should be a big no no.
You asked me a question, and I answered. It's not my problem if you don't like the answer.
You can't even tolerate my English, lol. Get a life outside of the forum if you can't stand some discussion here; what the hell do you want?
Dude, if you can't stand the input lag on Bravia TVs, that's your problem, not mine. I don't see the reason to make a whole post about the stupidity of the thread when I politely asked you for concrete data on the extent of Reality Creation's input lag, just to compare it to Nvidia's DLSS, since it's not even that easily perceivable. I don't like your tone in the previous post, not your answer.
Trolling=/=pointing out the obvious. See above.
I added everything I needed to this "discussion" in my first post, you've just decided to try to argue without doing any research on the topic.
If you can't notice literal frames being off, I don't know what else to tell you... and perhaps pointing this out to someone with a proven track record of playing console games on a television that is not running in proper game mode, with a wireless controller, was my mistake to begin with. Regardless, I'm not telling you how to play your games; I just said the lag was unacceptable to me. Also worth pointing out that Bravia TVs by default have an A/V sync setting enabled that introduces a ton of lag as well.
If English isn't your 1st language, I apologize...but it's pretty hard to understand you. That has nothing to do with my life, which is pretty great, lol.
Tried it already with RDR2 and Far Cry 6: with the Reality Creation setting manually at max, it's almost transformative in games with very aggressive TAA and blurred IQ. Unfortunately, super resolution works very poorly with most CBR solutions (strangely, not with RDR2), but to play Dying Light 2 at 60 FPS I really suggest using it to eliminate the annoying hazy vegetation. I'm starting to really appreciate my Bravia after the first bad impression of its HDR tone compared to the LG, at least for stuff like this.
Unlike the interpolation thread, I agree with OP. Bravia upsampling is really good.
Agreed. There's a negligible hit to input lag, and it does an incredible job upscaling old content, especially on the newer TVs.
I was just clearing up the DLC trophies on the PS4 version of Control and used Reality Creation; it cleaned up the image a lot and actually made the textures look a lot better.
On my A90J, 4K native content with the AI/RC upscaling looks supersampled, without over-sharpening the picture.
So for the hell of it I went and tried it on Battlefield 4 which is notorious for its blurry low resolution and I was shocked by the results. NO perceptible lag added and the image was so much better.
Are there any credible sources where the latency is measured? Because I highly doubt this adds no noticeable lag.
The only number I can find online, from about a year ago, is 30ms, plus someone from 4 months ago saying it adds noticeable lag at any setting above 0... and if that is accurate, then it is not negligible at all.
Nvm, I was wrong, I thought X900h has no RC.
Yes.
You are wrong. Turning on Reality Creation in game mode (it's on by default in game mode, btw) does not add additional lag.
And they stack the more of these "enhancements" you use.
THIS WAS ALREADY COVERED IN YOUR PREVIOUS THREAD.
Hard to imagine, can you give us some examples?
I tried to capture some screenshots with my PS5, but it seems to capture the PS5 output without the Bravia's Reality Creation. If someone knows how I can capture a shot from the Bravia itself, I will try again, because otherwise it doesn't work on PS5. I assure you, in 1080p games or games with aggressive TAA, this TV setting is a blessing. Dying Light 2 is definitely more pleasant to look at; otherwise it's a pain.
Input latency in ms of the screen doesn't correspond exactly to the ms of controller input lag, from what I have understood.
You understand wrong, and should probably try to have a basic grasp of the shit you talk about.
Prove it.
Whilst you are in game mode, the only setting that adds additional input lag on modern Sony Bravia sets is black frame insertion, specifically on OLED TVs. The setting is under motion settings and is listed as "Clearness".
Source: myself, with many different Sony TVs and a Leo Bodnar lag tester tool.
I'm not in the habit of making a video to prove people talking out of their ass wrong. I'd be doing that all the time!
There are multiple reports of it adding up to 30ms, and anecdotally there is perceivable lag. Also, it's not on by default in Game mode on the X900F, as you claim... in fact, there isn't even an option for "on".
So, no then. OK.
I can guarantee you that you will not find a “source” with video documentation to support these 30ms claims.
If someone respectfully asked me out of genuine curiosity I would consider it.
I have owned X900E, Z9F, X950G, A8G, A8H, A9S, and A80J Sony Bravia TVs.
From memory, the 900F actually downgraded the processing chip and that option compared to the 900E. I think it was reinstated from the 900G onwards.
Also, nothing to say about not being able to select it in Game mode on X900F?
I have never used the X900F, but that doesn't sound right to me. I know others on this board have that unit, so maybe they can chime in.
Because none of the usual reviewers even consider testing for it... all I found was a single guy who tested his new TV about a year ago and told people what does and doesn't add latency to the image. When someone asked about Reality Creation, he said ~30ms of additional lag.
I have no reason to believe that guy is lying; maybe he didn't measure it and just gave an estimate, but 30ms is a lot, so if it was an estimate he clearly felt it.
I play COD on a 900E with Reality Creation. There's no added lag, on or off, and I'm pretty attuned to these things. E.g., I can't play Cold War with RT on because the latency is night and day to me with it on/off; I'm sure many don't even notice that.
30ms of additional lag is an absurd claim on its face. That is nearly 2 entire frames of lag… it wouldn’t be “yeah you can kind of notice it maybe” it would be “wow that is a huge amount of lag.”
I am able to notice a 5ms difference in display lag (in certain titles where there isn’t already significant game lag) so a 30ms difference would be like a slap in the face.
Anyone making that claim has obviously nothing whatsoever to back it up, and for good reason.
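To put these numbers in perspective, here is a quick back-of-the-envelope sketch (plain Python, nothing Bravia- or DLSS-specific; the function name is just for illustration) converting milliseconds of added lag into frames at a given refresh rate:

```python
def lag_in_frames(lag_ms: float, refresh_hz: float) -> float:
    """Convert added display lag in milliseconds to frames at a given refresh rate."""
    frame_time_ms = 1000.0 / refresh_hz  # one frame lasts 1000/Hz ms (~16.7ms at 60Hz)
    return lag_ms / frame_time_ms

# The disputed 30ms figure is nearly two whole frames at 60Hz...
print(round(lag_in_frames(30.0, 60.0), 2))  # 1.8
# ...while a 5ms difference is well under a third of a frame at 60Hz.
print(round(lag_in_frames(5.0, 60.0), 2))   # 0.3
```

This is why a real 30ms penalty would be hard to miss: at 60 FPS it is almost two full frames between input and response, not a subtle difference.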
Well, you have nothing to back up your claim either, so either way we have no trustworthy info on this.
And I have come to mistrust anyone who claims low input latency online, since we still have people who will tell you that Killzone 2 felt good to play.
I have nothing to back it up because i’m not paid to do so lol. Maybe one day.
It is a bit weird that no one who is paid to do it tests this; it seems review sites simply test Game Mode with everything off to get the optimal values and don't go deeper. Features like these would be interesting to test, so it's kind of disappointing no one does.
It's part of the reason I often contemplate making videos myself, but I'm going to be very busy until at least the summer. And I still don't know how to calibrate, which I'd like to learn as well.
It's greyed out, no option to toggle anything...
coffinbirth Actually, now that I think about it, you may not even understand how the Reality Creation toggle works; there is no option for "on" because it's either Off, Auto, or Manual, where you can adjust the value between 0 and 100.
Manual is “on”.
So, I guess these guys are wrong too, huh?