The point here is that gamers are usually only concerned with the part they can control - that is what typically gets discussed when a 60fps game like Super Mario 3D World has levels that are unplayable on a TV with 20+ms of video lag. The controller lag for the pack-in controller is pretty much a function of the system, and is already factored in, because that is how the game passed QA certification, so it isn't normally part of the typical discussion.
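As a rough sketch of that lag budget, here is the arithmetic with purely illustrative numbers (the controller, engine, and display figures below are assumptions for the example, not measurements of any real hardware):

```python
# Rough end-to-end input-lag budget for a 60fps game.
# All millisecond values below are illustrative assumptions.
FRAME_MS = 1000 / 60  # ~16.7 ms per frame at 60fps

controller_ms = 8             # pack-in controller: fixed by the system, baked into QA cert
engine_ms = 3 * FRAME_MS      # e.g. 3 frames from input sample to scanout
display_ms = 20               # the part the player actually controls: TV video lag

total_ms = controller_ms + engine_ms + display_ms
frames_late = total_ms / FRAME_MS
print(f"total: {total_ms:.1f} ms (~{frames_late:.1f} frames at 60fps)")
```

The point the numbers make: the display is the only term in the sum the player can change, which is why it dominates the discussion even though it may not be the largest contributor.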
I've owned a TV with Reality Creation - full name Digital Reality Creation (DRC) - since my old KD-32X200 CRT TV. Back in the old days the DRC menu was disabled by game mode (x,y set to 0,0) because it added too much lag for playing, but my more recent (still 5-year-old) KD-65ZD9 doesn't prohibit its use. I leave it on auto, so the TV disables it itself based on context, but in games made for 30fps, auto adds no lag that would stop someone beating a game, while still making a huge visual improvement - to the point that Death Stranding on my launch PS4 on my TV looked almost the same - other than FOV/max draw distance - as my friend's PS4 Pro playing Death Stranding on the same TV. So much so that my friend replaced his 65" LG 4K/HDR LED TV with one of their newer OLEDs, realising his TV was the bigger limiting factor.
People talk about picture processing adding visual lag, but even game engines use native picture processing - like Spider-Man: Miles Morales' ML inference of animations - which, because it doesn't impact the feedback loop and just improves the visuals, is a non-issue.
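The feedback-loop point can be sketched like this - a purely cosmetic pass that runs after the simulation has already responded to input changes what you see, not when the game reacts (stage names and structure are hypothetical, just to illustrate the separation):

```python
# Hypothetical single frame: the feedback loop (input -> simulation -> response)
# is separate from visual-only processing, which touches the picture, never the state.
def run_frame(state: dict, button_pressed: bool) -> tuple[dict, dict]:
    # 1. Feedback loop - this is the only place where added delay is felt as lag.
    if button_pressed:
        state["jumping"] = True
    frame = {"y": 10 if state["jumping"] else 0}

    # 2. Visual-only processing (ML animation cleanup, upscaling, DRC-style
    #    enhancement). It transforms the finished picture but never feeds back
    #    into the simulation, so the game's reaction time is unchanged.
    frame["enhanced"] = True
    return state, frame

state, frame = run_frame({"jumping": False}, button_pressed=True)
```

The jump registers on the same frame whether or not the enhancement pass runs; that is why processing outside the loop "just improves the visuals".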