> Looks like ground truth render at least in Death stranding

BS.
I remember Digital Foundry doing...
...and saying it gave better image quality than native 4K...
When Sony and Microsoft use sub-native 4K upscaling, GAF be like...
When Nvidia uses sub-native 4K upscaling, GAF be like...
"4K is a waste, 4K doesn't sell games, all the prettys do."
How does DLSS look better than native 4K?
lol @ 3070 vs 2080 Ti performance
> What's your experience with using DLSS? How many games have you tried out? What's your setup (GPU, screen size, seating distance etc.)?

I'm on a 27" 1440p display + 2070S, sitting close, around 50-80 cm.
> That's true. I believe 2.0 release was when DLSS started to look amazing. Metro Exodus for example is terrible, looks like a lube filter. And that's the beauty of DLSS, it's only getting better with time.

That's true. Again, it really depends on the game. Death Stranding doesn't have any ray tracing, so you don't see much of that noise, but try something like Minecraft RTX and the picture is not good.
> That's true. Again it really depends on game. Death Stranding do not have any raytracing, so you don't see much of that noise, but try something like Minecraft RTX and picture is not good.

I need to try Minecraft out. Control is a fully ray-traced game, and it looked amazing with DLSS. I think the biggest test for this technology will be the upcoming Cyberpunk. Can't wait for that one.
Dude, I have a 4K monitor. I compared 4K DLSS Quality to 4K native... DLSS looks much better, especially in motion. Grass doesn't dither, pixels don't crawl on signposts or buildings.
> dude I have 4k monitor.

Oh, we are into comparing monitors.
> 10:40

Me clicking paid shills' videos is not going to happen.
> Dlss looks much better

It is fine to think something is better than you could demonstrate it is.
> Oh, we are into comparing monitors.

This is straight-up trolling now. WTF is wrong with you?
I have this 4K monitor. I suspect it's likely better than yours for the task.
Oh, and in case it is relevant, this TV.
Oh, and before the butthurt ones come: I run it at 1080p when my company laptop is connected to it.
> Me clicking paid shills' videos is not going to happen.

> It is fine to think something is better than you could demonstrate it is.

Truth be told, "in motion it is different" is a valid argument.
Well, it is good, but inconsistent from game to game in terms of glitches and such. Great nonetheless. PS5's Demon's Souls performance mode is much better though: the devs confirmed that a 1440p source is used to make the 4K image, and it's indistinguishable from native 4K, without any flaws or glitches. It's a Sony-patented solution; not sure if AMD will borrow it.
I find this tech impressive, but I would find it even more impressive if it wasn't limited to only a handful of games.
> Your green bros are not "everyone else", but thanks for not sharing your personal assessment.

Dude, leave me alone. There are no Nvidia/AMD wars. Where are you getting this from?
This is exactly right. There was a rumor about a DLSS 3.0 that works essentially the same way but with ANY game that has TAA, which would expand the library of supported games by a massive amount, and that would be great. DLSS is incredible tech, but it's only as great as the games that support it. As it stands, there are probably only 4 or 5 games that support it that I'm actually interested in playing. I couldn't care less about it if it doesn't support the games I play.
Until AMD comes out with their version, then he'll be praising the sun gods for it, lol
Actually, given that RDNA2 cards don't have separate, dedicated AI cores, their upscaling solution will most likely be some sort of sophisticated yet general-use algorithm executed on the CUs. That means it might be available on pretty much anything, from the Windows desktop to 10-20-year-old titles, not just a few new releases that need special training before they ship, and if so, it would indeed be the better solution IMO. Personally, that's my biggest issue with DLSS: it's available in only a handful of titles, mostly ones I'm not interested in at all or haven't even heard of. So I'm curious to see which company will be the first to allow upscaling not just on a per-game basis, but all the way down at the driver level.
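For illustration, here's a minimal sketch of the kind of "general-use" upscaler the post speculates about: a fixed interpolation filter that needs nothing but the finished frame, so it could run on ordinary compute units and be applied to any game. This toy bilinear filter is purely illustrative (the function name and integer-factor API are my own); it is not AMD's actual algorithm.

```python
# Toy general-purpose spatial upscaler: no per-game training, no AI
# cores, just a fixed filter over the rendered frame. Purely a sketch,
# not any vendor's real implementation.

def bilinear_upscale(src, factor):
    """Upscale a 2D grid of luminance values by an integer factor."""
    h, w = len(src), len(src[0])
    out = []
    for y in range(h * factor):
        fy = y / factor                      # position in source space
        y0 = min(int(fy), h - 1)
        y1 = min(y0 + 1, h - 1)
        ty = fy - y0                         # vertical blend weight
        row = []
        for x in range(w * factor):
            fx = x / factor
            x0 = min(int(fx), w - 1)
            x1 = min(x0 + 1, w - 1)
            tx = fx - x0                     # horizontal blend weight
            top = src[y0][x0] * (1 - tx) + src[y0][x1] * tx
            bottom = src[y1][x0] * (1 - tx) + src[y1][x1] * tx
            row.append(top * (1 - ty) + bottom * ty)
        out.append(row)
    return out

# Double a tiny 2x2 "frame" to 4x4:
frame = [[0.0, 1.0],
         [2.0, 3.0]]
upscaled = bilinear_upscale(frame, 2)
```

Because a filter like this only consumes the final image, it could in principle be injected at the driver level for any title, which is exactly the appeal described above.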
I don't like the sharpness filter that creates the white and black ringing around edges in DLSS 2.0. I wish there were a user option to tone it down or turn it off. Adding 2D post-sharpening just gives the illusion of extra detail where there is none.
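The halo effect described above is easy to reproduce with a toy 1D unsharp mask, used here as a stand-in for whatever sharpening pass DLSS actually applies (which is not public); the function name and 3-tap blur kernel are illustrative choices:

```python
# Generic unsharp mask: sharpen by adding back the high-frequency
# residual (signal - blur). A stand-in for any post-process sharpener.

def unsharp_mask(signal, amount=1.0):
    blurred = []
    for i in range(len(signal)):
        left = signal[max(i - 1, 0)]
        right = signal[min(i + 1, len(signal) - 1)]
        blurred.append(0.25 * left + 0.5 * signal[i] + 0.25 * right)
    return [s + amount * (s - b) for s, b in zip(signal, blurred)]

# A hard edge between two gray levels:
edge = [0.2, 0.2, 0.2, 0.8, 0.8, 0.8]
sharpened = unsharp_mask(edge)
# The pixel on the dark side of the edge dips below 0.2 (a dark ring)
# and the pixel on the bright side overshoots above 0.8 (a bright
# ring). That under/overshoot is the halo: it looks like extra edge
# contrast but carries no real detail.
```

Turning the `amount` parameter down shrinks the halos, which is exactly the user control the post is asking for.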
Driver-level upscaling happens now.
The big technical question I have for you is this: what makes you think that DL can be simulated in software, in a pipeline that does nothing but image manipulation rather than statistical, goal-oriented SS? The two techniques are so completely different that their results will look vastly different from each other, not to mention the complex nature of the DL solution screwing up the graphics pipeline as a whole. I've worked briefly on both in my career, and I can't see how software DL can "fit" into the real-time graphics pipeline.