AMD's super resolution (FSR) seems ace at Ultra Quality @ 1440p. I can't tell the difference from native, and it edges the game up to 60fps on my aging rig with Ultra textures and High everything else (though I think I'll Ultra everything and lock to 30fps instead, still keeping FSR on since it really does look the same). Maybe it's a tiny bit softer in the distance, but without really losing detail; if anything it gives the image a slightly less aliased look (probably even more unnoticeable if you use film grain and such, which I don't), but maybe not even that. I took a bunch of screens to compare and couldn't tell them apart, so I need to redo them.
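For anyone wondering why it's only a tiny bit softer: FSR renders below native resolution and upscales. Assuming the game uses AMD's standard FSR 1.0 scale factors (which I haven't confirmed for this title), the internal resolutions at 1440p work out like this quick Python sketch shows:

    # Standard FSR 1.0 per-axis scale factors from AMD's docs;
    # assuming this game uses the stock presets (not confirmed).
    modes = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}
    for name, scale in modes.items():
        w, h = round(2560 / scale), round(1440 / scale)
        print(f"{name}: renders at {w}x{h}, upscaled to 2560x1440")

So Ultra Quality is only upscaling from roughly 1970x1108, which would explain why the sharpness hit is so hard to spot.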
Edit: I skimmed the video and, going by the timestamps, they don't even test AMD's FSR, only DLSS? Come on guys, not everyone jumped to the RTX series. I have a GTX 1080, and Nvidia locks me out of DLSS while AMD doesn't; again, FSR at Ultra Quality really does look the same as native to me. I suppose it's possible there's some artifacting in motion, and/or when effects like heat distortion and motion blur are layered on top of the image, but are you really going to notice that while actually moving and playing, rather than taking a snapshot of the action and inspecting it later?
Edit: got the screens properly this time. Can anyone really tell which is which in these stills, never mind while actually playing the game? Note that I have film grain off, but motion blur is at the default of 10 and the camera isn't 100% immobile even when you're standing still. (If you want to compare at the pixel level rather than eyeballing, there's a quick diff sketch after the shots.)
1440p Native with Ultra textures and High everything else:
1440p FSR Ultra Quality with Ultra textures and High everything else:
1440p Native with Ultra (and above, where available) settings:
1440p FSR Ultra Quality with Ultra (and above, where available) settings:
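If anyone wants to go beyond eyeballing, here's a quick Python sketch using Pillow that diffs two same-size stills pixel by pixel (the filenames are just placeholders for whatever you name your shots):

    # pip install Pillow
    from PIL import Image, ImageChops

    native = Image.open("native_1440p.png").convert("RGB")     # placeholder filename
    fsr = Image.open("fsr_ultra_1440p.png").convert("RGB")     # placeholder filename

    diff = ImageChops.difference(native, fsr)  # per-pixel absolute difference
    diff.save("diff.png")                      # mostly-black output = nearly identical
    print(diff.getbbox())                      # None means a pixel-perfect match

Keep in mind the camera drift and motion blur mentioned above mean two captures will never line up exactly, so expect some noise in the diff even between two native shots.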
Bonus shots showing some bad LOD regardless of settings: the lighting on that door goes flat just a few feet away, then pops back in as you get closer (or maybe it's the relief detail itself, whether polygonal, normal mapped, tessellated, or whatever, that goes flat, and the lighting naturally follows; I dunno).