I don't buy it. If I wanted a soft, blurry 4K image, I'd just get one of those $100 "4K" TVs and call it a day, but isn't the whole point of 4K a sharp, crystal-clear image? Why not settle for proper 1080p instead of chasing higher but effectively fake resolutions? DLSS was originally marketed as a supersampling replacement with practically zero performance cost, which was indeed a neat idea, but now it's just another upscaling technique, and upscaling will by definition never look as good as your display's native resolution; that's simply how displays work. DLSS wouldn't even be needed if NVIDIA had put double the number of RT cores in Turing GPUs instead of Tensor cores. It's like shipping a car with two spare wheels instead of four proper tires: why would anyone do that?