
Is DLSS Really "Better Than Native"? - 24 Game Comparison, DLSS 2 vs FSR 2 vs Native

Leonidas

Member


This was posted inside another thread but I feel that this deserves its own thread.

Conclusion:
DLSS2 (4K Quality) and 4K native have about an equal number of wins, with a few ties between DLSS2 and native.
DLSS2 (1440p Quality) and 1440p native have about an equal number of wins, with a few ties between DLSS2 and native.
DLSS2 can also be made to win in more titles if you swap out the DLL files for a later version.

FSR2 loses to native
FSR2 loses to DLSS2

I suspected this would be the outcome as I always turn on DLSS2 when given the option at 1440p to get good image quality along with a nice FPS boost.

Suck it, native res.
 

//DEVIL//

Member
Honestly, buying anything other than an Nvidia card is a mistake.
Buying an Nvidia card considering the outrageous prices is a mistake

I am part of the problem. 4090 owner here.

DLSS is heavenly. And I'm more excited about the Switch 2 than any other next-gen console, just because the idea of DLSS in the next-gen Switch is gonna be amazing.

AMD needs to step up, and fast, if they want any piece of the GPU market pie. They don't even have a mid-range GPU yet while Nvidia has already released 4 SKUs... what is AMD smoking?

FSR3 must use hardware-level upscaling if they wanna compete. Whatever crap they have now is not working.
 
Last edited:
The Nvidia tax is high but you can't deny their technology.

Seeing DLSS 3.0 with frame generation and path tracing in action on my RTX 4090 with games like CP2077 makes it all worthwhile.

It's an outrageously expensive card but I can't deny how impressive the tech behind it is.

I wish AMD would catch up and stop allowing Nvidia to run with the ball like this. It's why Nvidia feel they can charge whatever for their cards because they know no one else is coming close to touching them in tech implementation.
 
Last edited:
Obviously ML on top of temporal reconstruction will give better results than a temporal solution alone.
You have to remember that FSR will continue to improve its algorithms and increase the tech it is using. While it isn't as good, it's good enough, and it will help the consoles keep performance up.

I am really interested to see if anything comes from the XSX|S's lower-precision Int8 and Int4 capabilities and their DirectML extension.
It's pretty obvious that MS has been investing a lot of time and money in AI and ML, and maybe there could be some attempt to integrate FSR and DirectML.
MS has been quoted as saying their ML capabilities on the XSX give a 3-10x performance improvement.
Hopefully we actually get to see whether this is something tangible or just a nothing burger.
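For scale, here's a back-of-the-envelope sketch of where a figure in that range could come from. The FP32 number is the Series X's published shader throughput; the packed-math multipliers are assumptions on my part, not Microsoft's exact statement:

Code:
# Rough arithmetic only; the multipliers are assumed packed-math rates, not official specs.
FP32_TFLOPS = 12.15      # Series X FP32 shader throughput
INT8_RATE   = 4          # assumed Int8 ops per FP32 op with packed math
INT4_RATE   = 8          # assumed Int4 ops per FP32 op with packed math

print(f"Int8: ~{FP32_TFLOPS * INT8_RATE:.0f} TOPS")   # ~49 TOPS, ~4x FP32
print(f"Int4: ~{FP32_TFLOPS * INT4_RATE:.0f} TOPS")   # ~97 TOPS, ~8x FP32

Whether an ML upscaler actually sees a 3-10x end-to-end speedup from dropping to Int8/Int4 is a separate question from these peak-rate numbers.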
 

azertydu91

Hard to Kill
Obviously ML on top of temporal reconstruction will give better results than a temporal solution alone.
You have to remember that FSR will continue to improve its algorithms and increase the tech it is using. While it isn't as good, it's good enough, and it will help the consoles keep performance up.

I am really interested to see if anything comes from the XSX|S's lower-precision Int8 and Int4 capabilities and their DirectML extension.
It's pretty obvious that MS has been investing a lot of time and money in AI and ML, and maybe there could be some attempt to integrate FSR and DirectML.
MS has been quoted as saying their ML capabilities on the XSX give a 3-10x performance improvement.
Hopefully we actually get to see whether this is something tangible or just a nothing burger.
[Infinite Loop GIF]

You trying to connect every thread to MS PR...
 
The Nvidia tax is high but you can't deny their technology.

Seeing DLSS 3.0 with frame generation and path tracing in action on my RTX 4090 with games like CP2077 makes it all worthwhile.

It's an outrageously expensive card but I can't deny how impressive the tech behind it is.

I wish AMD would catch up and stop allowing Nvidia to run with the ball like this. It's why Nvidia feel they can charge whatever for their cards because they know no one else is coming close to touching them in tech implementation.
How do you enjoy the input lag in DLSS 3? I'd say it literally makes games worse to play.

And yes, I also have a 4090 and just got the 7800x3d.
 

Sentenza

Member
DLSS has kept my 3090 relevant even when faced with these recent awful console ports.
Oh, come on with this ridiculous "kept relevant".
Your 3090 will run circles around anything else in the console space for years to come, "awful ports" or not. Even without DLSS.

I have a 3080 Ti myself and I have yet to reach the point where I struggle with ANYTHING.
 
Last edited:

azertydu91

Hard to Kill
Oh, come on with this ridiculous "kept relevant".
Your 3090 will run circles around anything else in the console space for years to come, "awful ports" or not. Even without DLSS.

I have a 3080 Ti myself and I have yet to reach the point where I struggle with ANYTHING.
Believe me, you haven't seen some of the atrocities that have released recently; even the 40XX cards seem to struggle sometimes... even though everybody agrees they shouldn't. And that's not even counting the inglorious stutter that seems prevalent in recent games, and I honestly have no idea why.
 

consoul

Member
There's no one correct answer. It all comes down to the in-engine implementation in each game.

While it seems crazy to suggest any image reconstruction can be better than native, the result can be better depending on what the native version is doing temporally.
 

Zathalus

Member
How do you enjoy the input lag in DLSS 3? I'd say it literally makes games worse to play.

And yes, I also have a 4090 and just got the 7800x3d.
It's generally only a 10ms penalty or so. So roughly 50ms to 60ms, for example. I honestly can't notice it. You need to enable Reflex to use it, so latency is still miles better than in any game that doesn't have Reflex at all.
 

supernova8

Banned
The whole premise is kinda pointless. DLSS/FSR don't need to be better than native. They just need to be good enough (or as good as native) and then provide a big enough FPS bump (not including frame generation, get that input lag outta here) to compensate for any hit to image quality.
 
Last edited:
How do you enjoy the input lag in DLSS 3? I'd say it literally makes games worse to play.

And yes, I also have a 4090 and just got the 7800x3d.
I barely notice it at all. Maybe it’s ‘cos I haven’t used it on any fast paced games. Anyway I don’t really play twitchy shooters on PC so it’s probably something I’m not gonna be noticing anytime soon. Nvidia Reflex is probably helping a ton too to help me not notice it
 
Last edited:
From what I've seen, DLSS 3 still has less than or the same input lag as native resolution. If I remember correctly.
What????

No, it adds latency.

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4080-review/7

"We only collected latency data for Cyberpunk, but you can see the pattern, which should also hold in other games. Of course, latency largely depends on framerate, so higher FPS means lower latency gaming. But Reflex helps to eliminate extra frames of latency, dropping the 4080 from 70ms at native to 43ms with DLSS Quality mode, and 32ms with Performance mode — that's without Frame Generation.

With Frame Generation (DLSS 3), latency increases to 88ms at native, 58ms with DLSS Quality, and 45ms with DLSS Performance. Ultimately, you can choose between higher fps and lower latency with a larger upscaling factor, higher fps with more latency but with better upscaling quality, or a blend between the two."

In the example above it's adding about a 35% latency penalty; it would be a no-go for any competitive multiplayer game.

I'd say the ideal here would be to turn on Reflex if available and only use frame generation in a very latency-forgiving game like a flight sim or something similar.
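For what it's worth, the ~35% figure falls straight out of those quoted numbers; a quick sketch of the arithmetic (values copied from the Tom's Hardware table above):

Code:
# Latency figures quoted above (Cyberpunk 2077, RTX 4080); percentages are plain arithmetic.
pairs = {
    "Native":           (70, 88),   # (without FG, with FG), in milliseconds
    "DLSS Quality":     (43, 58),
    "DLSS Performance": (32, 45),
}

for mode, (base, fg) in pairs.items():
    print(f"{mode}: {base} -> {fg} ms (+{(fg - base) / base:.0%})")
# Native: 70 -> 88 ms (+26%)
# DLSS Quality: 43 -> 58 ms (+35%)
# DLSS Performance: 32 -> 45 ms (+41%)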
 

01011001

Banned
DLSS is almost always turned on in an instant for me.

it's really rare that it looks worse, and even when it does it's not always noticeable and it's worth it for the performance boost.

it really benefits from the fact that most TAA implementations in modern engines are fucking DOOOGSHIT, so DLSS just wins by default due to its superior Antialiasing
 

01011001

Banned
oh damn, I just saw they tested Death Stranding, that game is indeed crazy with how much better DLSS looks, the video doesn't even do it justice here.

when I tested that on my TV at 4K, which thankfully I could easily run on my PC given that it's a last-gen game... I thought something was broken... I thought my resolution was wrong or maybe I had accidentally set my video output wrong.

but nope, it's just the fucking awful AA of that game.
turning on DLSS in that was an instant night and day difference. DLSS actually looked like the game was working correctly, native looked like something was broken.

the powerlines when looking at the city really looked like a line of disconnected shimmering pixels, and even on DLSS ULTRA PERFORMANCE it looked better. which is wild.
 
Last edited:
The Nvidia tax is high but you can't deny their technology.

Seeing DLSS 3.0 with frame generation and path tracing in action on my RTX 4090 with games like CP2077 makes it all worthwhile.

It's an outrageously expensive card but I can't deny how impressive the tech behind it is.

I wish AMD would catch up and stop allowing Nvidia to run with the ball like this. It's why Nvidia feel they can charge whatever for their cards because they know no one else is coming close to touching them in tech implementation.
There isn't any way for AMD to catch up unless they become a software company like Nvidia. People don't understand that Nvidia is quietly one of the world's most advanced software companies: they employ more software engineers than hardware engineers, and they not only sell GPUs to others to build supercomputers but also operate a Top 10 supercomputer themselves, which they use for training the deep learning models behind DLSS in addition to running their self-driving car project. Nvidia is one of the world's foremost AI companies, not only making the hardware but also producing the software frameworks and tools that make it possible to take the Nvidia GPU inside your computer right now, install Nvidia's AI framework software, and start training your own models to create your own Stable Diffusion or ChatGPT.

DLSS fixes shimmering, it's always better. Shimmering was the main reason I used to push higher-than-native resolution + heavier AA solutions that tanked performance massively.

DLSS is black magic.
Don't worry, it's not witchcraft. It's just advanced technology. AI, to be precise: deep learning models trained on millions of video game images, which run on the Nvidia GPU's Tensor cores to perform image reconstruction.

The reason DLSS is so superior to FSR is that it uses this AI image reconstruction, which goes well beyond the simple mathematical interpolation used for TAA upscaling and sharpening. AMD doesn't have any capability in AI the way Nvidia does, and they don't have the resources to fund an entire AI ecosystem just to train an upscaler for playing video games. Nvidia does it because it's a happy side effect of all the other work they do in AI, which happens to be applicable to gaming.
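For anyone wondering what "ML on top of temporal reconstruction" actually means, here is a toy sketch of the temporal part that both approaches share. This is not the real DLSS or FSR 2 code; the function name, the nearest-neighbour upsample, and the fixed blend constant are all simplifications for illustration:

Code:
import numpy as np

def temporal_accumulate(curr_lr, history_hr, motion_px, blend=0.1):
    """One step of a toy temporal upscaler: reproject last frame's high-res
    history using per-pixel motion vectors, then blend in the new low-res
    sample. TAA-style upscalers hand-tune the blend and history-rejection
    heuristics; DLSS instead has a trained network, running on the Tensor
    cores, decide how to combine these same inputs."""
    h, w = history_hr.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Reproject: look up where each output pixel was last frame.
    src_y = np.clip(ys - motion_px[..., 0], 0, h - 1).astype(int)
    src_x = np.clip(xs - motion_px[..., 1], 0, w - 1).astype(int)
    reprojected = history_hr[src_y, src_x]
    # Upsample the current low-res frame to output size (nearest, for brevity).
    scale = h // curr_lr.shape[0]
    upsampled = curr_lr.repeat(scale, axis=0).repeat(scale, axis=1)
    # Exponential blend of history and new information.
    return (1.0 - blend) * reprojected + blend * upsampled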
 

01011001

Banned
Nvidia has been pushing ray-tracing hard, but the real killer feature for RTX cards is DLSS2.

well they work in tandem.
raytracing tanks performance, DLSS brings back some of that performance.

I think this was crucial to make RT viable on lower end RTX cards
 

winjer

Gold Member
well they work in tandem.
raytracing tanks performance, DLSS brings back some of that performance.

I think this was crucial to make RT viable on lower end RTX cards

I always disabled RT, as the performance hit rarely justified the image quality improvement.
But I always turned on DLSS when I could.
 

01011001

Banned
I always disabled RT, as the performance hit rarely justified the image quality improvement.
But I always turned on DLSS when I could.

and even fewer people would use RT if DLSS didn't exist.

for example in Doom Eternal, I can run it with RT at around 90 to 100fps, which is great already, but then I turn on DLSS, which improves image sharpness and stability, and I'm at 120 to 144 fps.

so without DLSS, I might not use RT in that game, but given that with DLSS the game reaches my monitor's refresh rate, I use RT, because why wouldn't I?
 
Last edited:

Teslerum

Member
Oh, come on with this ridiculous "kept relevant".
Your 3090 will run circles around anything else in the console space for years to come, "awful ports" or not. Even without DLSS.

I have a 3080 Ti myself and I have yet to reach the point where I struggle with ANYTHING.
Even if it does, you can turn presets down (preferably individual settings, with a little research) and still play at near-indistinguishable (especially during gameplay) graphics quality.

My previous PC lasted nearly a decade and the worst I got to was medium settings with some high settings mixed in. People worry too much when it comes to graphics cards.
 
Last edited:

winjer

Gold Member
and even fewer people would use RT if DLSS didn't exist.

for example in Doom Eternal, I can run it with RT at around 90 to 100fps, which is great already, but then I turn on DLSS, which improves image sharpness and stability, and I'm at 120 to 144 fps.

so without DLSS, I might not use RT in that game, but given that with DLSS the game reaches my monitor's refresh rate, I use RT, because why wouldn't I?

If there is enough performance, then RT is a nice extra to have.
But I would never sacrifice frame rate to use RT.
 

Xcell Miguel

Gold Member
What????

No, it adds latency.

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4080-review/7

"We only collected latency data for Cyberpunk, but you can see the pattern, which should also hold in other games. Of course, latency largely depends on framerate, so higher FPS means lower latency gaming. But Reflex helps to eliminate extra frames of latency, dropping the 4080 from 70ms at native to 43ms with DLSS Quality mode, and 32ms with Performance mode — that's without Frame Generation.

With Frame Generation (DLSS 3), latency increases to 88ms at native, 58ms with DLSS Quality, and 45ms with DLSS Performance. Ultimately, you can choose between higher fps and lower latency with a larger upscaling factor, higher fps with more latency but with better upscaling quality, or a blend between the two."

In the example above it's adding about a 35% latency penalty; it would be a no-go for any competitive multiplayer game.

I'd say the ideal here would be to turn on Reflex if available and only use frame generation in a very latency-forgiving game like a flight sim or something similar.
Latency depends on how many FPS you have: the more FPS (excluding FG), the lower the latency.
In this example they compare native resolution with and without FG, and of course FG will add a bit of latency. But if you enable DLSS 2 on top of that, you get more base FPS, so the latency of DLSS2 + DLSS3 FG can end up lower than native (without DLSS), as seen in other tests.
It really depends on what is compared. There's a chart that gets posted regularly showing that FG can have lower latency than native, because that test used DLSS2 + DLSS3 FG: the base FPS is higher than native, so the latency added by FG still leaves it below native.
That's why you sometimes read that DLSS3 FG can have lower latency than native (because DLSS2 is also enabled, giving more base FPS).

Also, DLSS3 FG got better with drivers, so some older latency tests may no longer be that relevant.
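Using the Cyberpunk figures already quoted earlier in this thread, the arithmetic behind that point looks like this (just a sketch; the numbers are from the Tom's Hardware table above):

Code:
# DLSS 2 lowers base latency first; FG then adds its penalty on top of that.
native_reflex   = 70   # ms: native res + Reflex, no FG
dlss_quality    = 43   # ms: DLSS 2 Quality, no FG
dlss_quality_fg = 58   # ms: DLSS 2 Quality + DLSS 3 FG

print(f"FG penalty on top of DLSS 2: +{dlss_quality_fg - dlss_quality} ms")
print(f"vs. native + Reflex: {dlss_quality_fg - native_reflex:+d} ms")
# FG penalty on top of DLSS 2: +15 ms
# vs. native + Reflex: -12 ms  -> still lower latency than native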
 

01011001

Banned
If there is enough performance, then RT is a nice extra to have.
But I would never sacrifice frame rate to use RT.

yeah but if you are at your max refresh thanks to DLSS, there's no real reason why you wouldn't use RT... unless you play a competitive game and you want to get every millisecond of latency back lol
 

GymWolf

Member
Did they update the DLSS version in older games before running this test?

If not, these results are not trustworthy.
 

winjer

Gold Member
Did they update the DLSS version in older games before running this test?

If not, these results are not trustworthy.

No. They used the default version for all games, be it DLSS or FSR. And no mods.
Basically, that's the way most people play these games, as the majority of gamers don't bother switching DLLs and modding games.
 

GymWolf

Member
No. They used the default version for all games, be it DLSS or FSR. And no mods.
Basically, that's the way most people play these games, as the majority of gamers don't bother switching DLLs and modding games.
I think every dude who cares about a DLSS comparison video is nerd enough to do the swap, since it's like a 30-second thing...
Modding games is a bit more complicated than that.

You know how noob I am, and I still update DLSS in every game I play.
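For anyone who hasn't tried the swap being described, it really is just dropping a newer nvngx_dlss.dll into the game folder. A rough sketch of the idea (the paths here are placeholders, and as noted further down some games block a replaced DLL):

Code:
# Illustrative only: back up the game's DLSS DLL, then copy a newer one over it.
import shutil
from pathlib import Path

GAME_DIR = Path(r"C:\Games\SomeGame")            # hypothetical install folder
NEW_DLL  = Path(r"C:\Downloads\nvngx_dlss.dll")  # newer DLSS DLL you downloaded

target = GAME_DIR / "nvngx_dlss.dll"
shutil.copy2(target, GAME_DIR / "nvngx_dlss.dll.bak")  # keep a backup
shutil.copy2(NEW_DLL, target)                          # drop in the new version
print(f"Replaced {target}")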
 

Mr.Phoenix

Member
Main takeaway here? Considering the performance gains and the marginal IQ loss vs native, this just shows how much of a waste of resources native resolutions are.

The only sad thing here is that it took a rendering paradigm so costly to make the PC master race/elite suddenly not just accept reconstruction techniques, but now even champion it as a defining feature in their hardware setup.
 

Ironbunny

Member
If you use DLSS, try also using the image enhancement called Sharpening+ from the Nvidia overlay. It kinda supercharges the image to look tack sharp, even on 1440p monitors.
 

winjer

Gold Member
I think every dude who cares about a DLSS comparison video is nerd enough to do the swap, since it's like a 30-second thing...
Modding games is a bit more complicated than that.

You know how noob I am, and I still update DLSS in every game I play.

You are right. But still, we are the minority.
And some games require going through some hoops to be able to switch the DLL. And some just block it.
 

HTK

Banned
Personally, in fast-paced games where looking around quickly is the norm, like Call of Duty for example, I find DLSS extremely noticeable and distracting. I hate the artifacts that come with it; it makes the image quality look shitty, so I play natively.

I think the tech is great for slower-paced games, but in fast-paced ones I notice it big time and it's distracting. So I normally keep DLSS disabled.
 

Ironbunny

Member
Personally, in fast-paced games where looking around quickly is the norm, like Call of Duty for example, I find DLSS extremely noticeable and distracting. I hate the artifacts that come with it; it makes the image quality look shitty, so I play natively.

I think the tech is great for slower-paced games, but in fast-paced ones I notice it big time and it's distracting. So I normally keep DLSS disabled.

I'd argue against this. I can't notice any artifacts anymore that were there in earlier versions of DLSS. I'm playing Warzone DMZ and MW2 multiplayer at a constant 175 fps with DLSS in Quality mode, everything turned to the highest setting, and it looks crisp as fuck. No artifacts whatsoever. I even tried pixel peeping to spot some, but no.
 

RoboFu

One of the green rats
In my experience DLSS 2 has issues with symmetrical repeating patterns at a distance. Which honestly is to be expected, as there just probably isn't enough information in a blurry-af texture. But you can see it a lot in Cyberpunk, which has a lot of symmetrically patterned textures.
 
Last edited:
Even if it does, you can turn presets down (preferably individual settings, with a little research) and still play at near-indistinguishable (especially during gameplay) graphics quality.

My previous PC lasted nearly a decade and the worst I got to was medium settings with some high settings mixed in. People worry too much when it comes to graphics cards.
This is because games are made for console generations and those last 10 years.
 

yamaci17

Member
Add sharpening filter to your 1440p native and then compare with DLSS :)
Nope, the common bait response. There's no sharpening involved; it's purely 4K LODs+assets+textures at work there. You cannot simply recreate what is happening there with a sharpening filter, and it is disabled at 0% regardless.

Good luck bringing out those face textures/details with a sharpener. I will wait for you to try.

[screenshot comparison images]
As I said, in all my comparisons sharpening is disabled, because I've dealt with such responses before.

I don't blame you though; for most "1440p" users, 4K DLSS Performance destroying+demolishing+annihilating native 1440p image quality with matched or better performance is a tough pill to swallow.
 
Last edited:

Honey Bunny

Member
Kind of insane that people are using an upscaler to improve image quality. Why don't they release another *acronymed feature* without the upscaler but with the extra image processing?
 

Ironbunny

Member
I don't blame you though; for most "1440p" users, 4K DLSS Performance destroying+demolishing+annihilating native 1440p image quality with matched or better performance is a tough pill to swallow.

Screen size is a factor in that debate too.
 

Ironbunny

Member
Kind of insane that people are using an upscaler to improve image quality. Why don't they release another *acronymed feature* without the upscaler but with the extra image processing?

What's insane about it? It's basically using high-res data sets to push details into a lower res that can't be put there by game engines rendering at native resolution.
 

yamaci17

Member
Screen size is a factor in that debate too.
It's not; regardless of screen size, 4K DLSS Performance produces better image quality, because it uses 4K LODs+assets+textures that "native" 1440p will never use. This is a different beast that people are unable to comprehend.

The game will legit load/utilize higher quality assets+textures+LODs with 4K upscaling, even at an internal 1080p rendering resolution.

[screenshot comparison images]

The detail is literally not there. You cannot bring in any new detail with sharpening; you can only make existing detail more pronounced.

Sharpening, lol... give me a break.

[screenshot comparison images]
I even had to show the overlay to prove to someone that the effect above and below was not affected by sharpening.


(I can repeat the same test in RDR2 at 1440p too; it will give similar results. 4K DLSS Perf will wreck 1440p there too.)

It stops having anything to do with screen size when the game will legit refuse to utilize the actual high quality assets+textures when you output at 1440p. Nothing can change that, unless you use DSR/a custom resolution to go to a 4K output and downsample back again.
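A rough sketch of the mechanism being described: many DLSS integrations key texture mip selection off the output resolution via a negative LOD bias, so a 4K output still samples 4K-grade mips even when rendering internally at 1080p. The function name here is made up; the formula follows the commonly cited log2(render/output) recommendation:

Code:
import math

def mip_bias_for(internal_height: int, output_height: int) -> float:
    # Negative bias pushes sampling toward sharper mips when upscaling.
    return math.log2(internal_height / output_height)

print(mip_bias_for(1080, 2160))  # 4K output, DLSS Performance: -1.0
print(mip_bias_for(1440, 1440))  # native 1440p: 0.0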
 
Last edited:

Buggy Loop

Member
What????

No, it adds latency.

https://www.tomshardware.com/reviews/nvidia-geforce-rtx-4080-review/7

"We only collected latency data for Cyberpunk, but you can see the pattern, which should also hold in other games. Of course, latency largely depends on framerate, so higher FPS means lower latency gaming. But Reflex helps to eliminate extra frames of latency, dropping the 4080 from 70ms at native to 43ms with DLSS Quality mode, and 32ms with Performance mode — that's without Frame Generation.

With Frame Generation (DLSS 3), latency increases to 88ms at native, 58ms with DLSS Quality, and 45ms with DLSS Performance. Ultimately, you can choose between higher fps and lower latency with a larger upscaling factor, higher fps with more latency but with better upscaling quality, or a blend between the two."

In the example above it's adding about a 35% latency penalty; it would be a no-go for any competitive multiplayer game.

I'd say the ideal here would be to turn on Reflex if available and only use frame generation in a very latency-forgiving game like a flight sim or something similar.

These sites write “native” but it’s native + reflex in reality.

[latency chart]


I guess it can depend on the game, but here: a whopping 10ms delay from what is native + Reflex.

[latency test chart: plague-latency-tests.png]


AMD latency?

[latency chart: fortnite-latency-4070-ti-perf.png]


"We experimented with Radeon Anti-Lag here as well, which did seem to reduces latency on the Radeons by about 10 - 20ms, but we couldn't get reliable / repeatable frame rates with Anti-Lag enabled."

"Normally, all other things being equal, higher framerates result in lower latency, but that is not the case here. The GeForce RTX 4070 Ti offers significantly better latency characteristics versus the Radeons, though it obviously trails the higher-end RTX 4080."

But really, for sure that 10ms makes it unplayable (a recurring comment in every frame gen discussion...). I guess all AMD flagship owners have everything unplayable, while consoles comparatively have more than double the latency compared to Reflex.

If you play competitive games and you're a pro gamer (lol, sure), the cards that have frame gen don't need to run frame gen in those games; they run on potatoes. You don't need to care about 10 ms in Cyberpunk 2077 Overdrive path tracing, but goddamn I wish I had frame gen right about now for that game.

People just remember « oh no! More latency! » from the frame gen articles. Dude, you're eating good with Reflex and the inherently lower latency on Nvidia cards.
 

Ironbunny

Member
It's not; regardless of screen size, 4K DLSS Performance produces better image quality, because it uses 4K LODs+assets+textures that "native" 1440p will never use. This is a different beast that people are unable to comprehend.

The game will legit load/utilize higher quality assets+textures+LODs with 4K upscaling, even at an internal 1080p rendering resolution.


The detail is literally not there. You cannot bring in any new detail with sharpening; you can only make existing detail more pronounced.

Sharpening, lol... give me a break.



I even had to show the overlay to prove to someone that the effect above and below was not affected by sharpening.


(I can repeat the same test in RDR2 at 1440p too; it will give similar results. 4K DLSS Perf will wreck 1440p there too.)

I'm talking about having a 27-inch 1440p and a 27-inch 1080p monitor next to each other, not about screencaps, as there is a physical pixel difference in reality. 1080p is softer, but there is some extra detail there.
 
Last edited: