
NVIDIA shares DLSS benchmarks for War Thunder, Ready or Not, Enlisted & COD: Black Ops Cold War

skneogaf

Member
I want to see DLSS work at 4K from 4K, basically using AI to fix anything that needs fixing, no matter how unnecessary it is.

I remember Digital Foundry doing a video on DLSS 2.0 and saying it gave better image quality than native 4K in things like mesh detail at a distance. If AI can smarten things up at 4K from 4K, then games that already do 4K@60fps should be extra nice.
 

littlecat

Neo Member
I tend to think of DLSS as a 'decompression' method. The frame information is captured and condensed by the DL network's training process. At run time, the lower-resolution material is then 'decompressed' by the Tensor cores using the DL model. So DLSS is not 'free', since it requires the Tensor core hardware and electricity to perform this task on the fly.
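To make that 'decompression' picture concrete, here is a minimal sketch of the idea, assuming nothing about NVIDIA's actual network: the trained weights carry the 'condensed' detail, and the forward pass is the run-time 'decompression'. Layer sizes are made up, and real DLSS also consumes motion vectors and frame history, which are omitted here.

```python
# Minimal sketch of a learned upscaler (hypothetical layer sizes, NOT
# NVIDIA's model): the weights hold the "condensed" information, and
# inference reconstructs a high-res frame from a low-res one.
import torch
import torch.nn as nn

class TinyUpscaler(nn.Module):
    def __init__(self, scale=2):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3 * scale * scale, 3, padding=1),
        )
        # PixelShuffle rearranges channels into a scale-x larger image
        self.shuffle = nn.PixelShuffle(scale)

    def forward(self, low_res):
        return self.shuffle(self.body(low_res))

net = TinyUpscaler()
frame_1080p = torch.rand(1, 3, 1080, 1920)  # low-res input frame
frame_4k = net(frame_1080p)                 # "decompressed" output
print(frame_4k.shape)                       # torch.Size([1, 3, 2160, 3840])
```

Every output frame costs a forward pass like this, which is why it isn't free and why the Tensor cores are there to accelerate it.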
 

ZywyPL

Banned
when sony and microsoft use sub native 4k upscaling, gaf be like... 😡😡😡😡

when nvidia use sub native 4k upscaling, gaf be like... 🙏🙏🙏🙏

4k is a waste, 4k doesn't sell games, all the prettys do.


It's all about the individual execution - DLSS before version 2.0 was bad, and so are the majority of upscaling techniques used on consoles, with Remedy's reconstruction in Quantum Break being the worst IMO, while Insomniac's temporal injection and the recent Demon's Souls remaster are top notch. If we were getting those best-case scenarios across the board, no one would ever care about 4K, at all.
 
How does DLSS look better than native 4k?

It doesn't - or rather, it depends on what is meant by that. Using the poster child of the 'better than native' narrative, Death Stranding gets you 95% of the way there with much better AA. It can look more pleasing than a native image because it eliminates aliasing pretty much completely, but it does come at a minor cost in small detail - timefall is when it's most noticeable. I only really use DLSS in Control & Metro Exodus myself, as a way of boosting RTX performance.

lol @ 3070 vs 2080 Ti performance


I think Nvidia confused themselves with the 'faster than' marketing, as some of their charts place the 3070 above the 2080 Ti despite it having less performance.
 
What's your experience with using DLSS? How many games have you tried out? What's your setup (GPU, screen size, seating distance etc.)?
I'm on 27" 1440p display + 2070S. Sitting close - around 50-80cm.

It depends on the game. I've tried almost every game with DLSS. Newer ones are better; in any game older than Control, DLSS is awful.
It's especially noticeable in ray-tracing games - you'll see a lot of RT noise and over-sharpened edges, and the overall picture becomes like a compressed JPEG.
All that said, DLSS is amazing, especially the newer versions. For example, in the Bright Memory benchmark DLSS is so good that I actually prefer "performance" mode for more frames over "quality", because the difference isn't worth it in that benchmark.

Overall DLSS looks more noisy and edges are a bit "ringy" (search "ringing artifacts"), but if you need frames it's better than upscaling from a lower res.
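For anyone who doesn't want to search: here's a tiny self-contained demo of where ringing comes from (my own illustration, nothing DLSS-specific). Sharpening an ideal edge with an unsharp mask overshoots on both sides, and those out-of-range values are the bright/dark halos:

```python
# Ringing demo on a 1-D step edge: unsharp-mask sharpening overshoots.
import numpy as np
from scipy.ndimage import gaussian_filter1d

edge = np.array([0.0] * 10 + [1.0] * 10)     # an ideal step edge
blurred = gaussian_filter1d(edge, sigma=1.5)
sharpened = edge + 1.5 * (edge - blurred)    # unsharp-mask sharpening
print(sharpened.min(), sharpened.max())      # undershoot below 0, overshoot above 1
```

In a game those out-of-range values show up as the "ringy" dark/bright outlines around high-contrast edges.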
 
Incredible results. I think this is the way to go in the future - using DLSS instead of high-end hardware.
I wonder, though, if it can be used to solve VR's performance problem, where roughly 2x the rendering power is required.
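Rough pixel-budget arithmetic for the VR idea, with made-up numbers (a hypothetical 2160x2160-per-eye headset and a 2x-per-axis upscale):

```python
# Two eyes at native resolution vs. rendering at half resolution per axis
# and upscaling (hypothetical headset and scale factor).
native_pixels = 2 * 2160 * 2160    # both eyes at native res
internal_pixels = 2 * 1080 * 1080  # both eyes at half res per axis
print(native_pixels / internal_pixels)  # 4.0 -> a quarter of the pixels shaded
```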
 
That's true. I believe the 2.0 release was when DLSS started to look amazing. Metro Exodus, for example, is terrible - it looks like a lube filter. And that's the beauty of DLSS: it's only getting better with time.
That's true. Again, it really depends on the game. Death Stranding doesn't have any ray tracing, so you don't see much of that noise, but try something like Minecraft RTX and the picture is not good.
 
I need to try Minecraft out. Control is a fully ray-traced game and it looked amazing with DLSS. I think the biggest test for this technology will be the upcoming Cyberpunk. Can't wait for this one.
 

llien

Member
dude I have 4k monitor.
Oh, we are into comparing monitors.
I have this 4k monitor. I suspect it's likely better than yours for the task.
Oh, and in case it is relevant, this TV.
Oh, and before the butthurt ones come: I run it at 1080p when my company laptop is connected to it.

Me clicking paid shills' videos is not going to happen.

Dlss looks much better
It is fine to think something is better even if you can't demonstrate that it is.

Truth be told, "in motion it is different" is a valid argument.
 

rofif

Can’t Git Gud
This is straight up trolling now. wtf is wrong with you?
I have nothing to demonstrate to you. I and everyone else see the results.
 

Krappadizzle

Gold Member
Well, it is good, but inconsistent from game to game in terms of glitches and such. Great nonetheless. PS5's Demon's Souls performance mode is much better though, as the devs confirmed that the source is 1440p upscaled to a 4K image, which is indistinguishable from native 4K, without any flaws or glitches. It's a Sony-patented solution; not sure if AMD will borrow it.



This is not a contest you want to participate in.

I find this tech impressive, but I would find it even more impressive if it wasn't limited to only a handful of games.

This is exactly right. There was a rumor of a DLSS 3.0 that works essentially the same way but with ANY game that has TAA, which would expand the library of supported games by a massive amount, and that would be great. DLSS is incredible tech, but it's only as great as the games that support it. As it stands, there are probably 4 or 5 games that support it that I'm actually interested in playing. I couldn't care less if it doesn't support the games I play.
 

rofif

Can’t Git Gud
Your green bros are not "everyone else", but thanks for not sharing your personal assessment.
Dude, leave me alone. There are no nvidia/amd wars. Where are you getting this from?
DLSS was only bad when first introduced with Metro. Now it's magic.
Dude, let me be. I don't want to talk with you.
If I can't prove the stability of the DLSS-reconstructed image to you with screenshots, videos, or my opinion, then you will not get it.
I have a 4K monitor and a 3080, so I have plenty of power to run native 4K, but I still choose 4K DLSS Quality since it's just plain better looking than TAA.
 
This is exactly right. There was a rumor of a DLSS 3.0 that works essentially the same way but with ANY game that has TAA, which would expand the library of supported games by a massive amount, and that would be great. DLSS is incredible tech, but it's only as great as the games that support it. As it stands, there are probably 4 or 5 games that support it that I'm actually interested in playing. I couldn't care less if it doesn't support the games I play.

This would be amaaaaaaazing if true.
 

ZywyPL

Banned
Until AMD comes out with their version, then he'll be praising the sun gods for it, lol

Actually, given that RDNA2 cards don't have separate, dedicated AI cores, their upscaling solution will most likely be some sort of sophisticated yet general-use algorithm executed on the CUs, which means it might be available on pretty much anything, from the Windows desktop to 10-20 year old titles, not just a few new releases that need special training before they ship. If so, it would indeed be the better solution IMO. Personally, that's my biggest issue with DLSS: it's available in only a handful of titles, mostly ones I'm not interested in at all or haven't even heard of. So I'm curious to see which company will be the first to allow upscaling not just on a per-game basis, but all the way down at the driver level.
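For illustration, here's what a training-free, game-agnostic upscaler could look like in spirit - a toy sketch of my own (bilinear resize plus an unsharp mask), not anything AMD has announced; the point is just that nothing in it needs per-title training:

```python
# Toy general-use upscaler: bilinear resize + unsharp mask. Parameters are
# hypothetical; real vendor solutions would be far more sophisticated.
import numpy as np
from scipy.ndimage import zoom, gaussian_filter

def generic_upscale(frame, scale=2.0, sharpen=0.5):
    up = zoom(frame, (scale, scale, 1), order=1)    # bilinear resize
    blurred = gaussian_filter(up, sigma=(1, 1, 0))  # blur spatial axes only
    return up + sharpen * (up - blurred)            # edge enhancement

frame_1080p = np.random.rand(1080, 1920, 3)
frame_4k = generic_upscale(frame_1080p)
print(frame_4k.shape)  # (2160, 3840, 3)
```

Because a pass like this only ever sees the finished frame, it's the kind of thing a driver could in principle apply to anything.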
 

VFXVeteran

Banned

Driver-level upscaling happens now.

The big technical question I have for you is this: what makes you think DL can be simulated in software, in a pipeline that does nothing but image manipulation, rather than statistically based, goal-oriented SS? The two techniques are so completely different that their results will look vastly different from each other, not to mention the complex nature of the DL solution disrupting the graphics pipeline as a whole. I've worked on both briefly in my career, and I can't see how software DL can "fit" into the real-time graphics pipeline.
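For what it's worth, the slot most upscalers target is fairly contained. Here's a simplified sketch of the usual integration point, with stand-in stubs rather than any vendor's actual pipeline: render the scene at low resolution, run the reconstruction pass, then composite the UI at native resolution:

```python
import numpy as np

def rasterize(scene, res):
    """Stand-in for the engine's low-res scene pass (hypothetical stub)."""
    h, w = res
    return np.zeros((h, w, 3)), np.zeros((h, w)), np.zeros((h, w, 2))

def upscale(color, output_res):
    """Stand-in reconstruction pass; nearest-neighbour for brevity."""
    h, w = output_res
    ys = np.arange(h) * color.shape[0] // h
    xs = np.arange(w) * color.shape[1] // w
    return color[np.ix_(ys, xs)]

def render_frame(scene, ui, render_res=(540, 960), output_res=(1080, 1920)):
    color, depth, motion = rasterize(scene, render_res)  # low-res scene pass
    out = upscale(color, output_res)                     # reconstruction pass
    out[: ui.shape[0], : ui.shape[1]] += ui              # UI composited at native res
    return out

frame = render_frame(scene=None, ui=np.ones((64, 256, 3)))
print(frame.shape)  # (1080, 1920, 3)
```

Whether a software DL model can fill that upscale slot fast enough without dedicated hardware is exactly the open question.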
 

JCK75

Member
I have no experience with DLSS in gaming, but isn't it the same basic tech behind the AI upscaling on the Nvidia Shield? People have no idea how amazing that is.
 

RedVIper

Banned
I don't like the sharpness filter that creates the white and black ringing around edges in DLSS 2.0. I wish there were a user option to tone it down or turn it off. Adding 2D post-sharpening just gives the illusion of extra detail where there is none.

The option is there; it's up to developers to make the slider available to the user. Not really Nvidia's fault there.
 

ZywyPL

Banned
Driver-level upscaling happens now.

The big technical question I have for you is this: what makes you think DL can be simulated in software, in a pipeline that does nothing but image manipulation, rather than statistically based, goal-oriented SS? The two techniques are so completely different that their results will look vastly different from each other, not to mention the complex nature of the DL solution disrupting the graphics pipeline as a whole. I've worked on both briefly in my career, and I can't see how software DL can "fit" into the real-time graphics pipeline.

By driver level I meant enabling the upscaling option from the NV Control Panel/Radeon Software, like it's already possible with, for example, DSR, FXAA, G-Sync and whatnot - something that's globally available, not just in certain games' settings menus. This, or vastly improved RT performance that doesn't need to rely on any upscaling at all, but I feel the former will appear first.
 