
NVIDIA shares DLSS benchmarks for War Thunder, Ready or Not, Enlisted & COD: Black Ops Cold War

KyoZz

Tag, you're it.



NVIDIA has announced that four additional games now support its DLSS tech: Call of Duty: Black Ops Cold War, War Thunder, Enlisted and Ready or Not.
In addition, the green team released benchmark figures comparing performance with and without DLSS, which you can find below.

In Call of Duty: Black Ops Cold War, NVIDIA DLSS boosts frame rates by up to 85% at 4K across a range of GeForce RTX graphics cards. Keep in mind that this figure is for Performance Mode, which can introduce blur and reduce image quality.
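For context on what those modes actually mean: DLSS renders internally at a lower resolution and reconstructs the final image. NVIDIA's post doesn't list the scale factors, so the numbers below are the commonly cited ones (roughly 50% per axis for Performance, 67% for Quality), and the little helper is just a sketch of what that works out to at 4K, not official figures:

# Rough sketch: internal render resolution per DLSS mode at a 4K output.
# The scale factors are the commonly cited ones (assumed, not from NVIDIA's post).
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50, "Ultra Performance": 1 / 3}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"{mode:>17}: renders at {w}x{h}, reconstructs to 3840x2160")
# Performance Mode renders at 1920x1080, i.e. a quarter of the output pixels,
# which is where both the big frame-rate gains and the potential blur come from.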

[Image: Call of Duty: Black Ops Cold War – DLSS benchmark, 3840x2160, Performance Mode]




War Thunder sees a performance increase of up to 30% at 4K in Performance Mode. NVIDIA also released a video showcasing the implementation, comparing native 4K with DLSS Quality Mode.
And, contrary to Call of Duty: Black Ops Cold War, War Thunder appears sharper when DLSS is activated.




[Image: War Thunder – DLSS benchmark, 3840x2160, Performance Mode]



Similarly, Enlisted appears to produce better, sharper images with DLSS Quality Mode, and NVIDIA DLSS can boost its frame rates by up to 55% at 4K (in Performance Mode).




[Image: Enlisted – DLSS benchmark, 3840x2160, Performance Mode]



Lastly, Ready or Not currently supports ray-traced reflections, ray-traced shadows, ray-traced ambient occlusion, and NVIDIA DLSS. According to the team, DLSS accelerates performance by up to 120% at 4K with the new ray-traced effects enabled, in Performance Mode.
And while this is a huge performance increase, we should again note that it is in Performance Mode; we don't know whether this DLSS implementation brings any major blurriness or image degradation to the game.

[Image: Ready or Not (alpha) – DLSS benchmark, 3840x2160, Performance Mode]


 
An improvement of 35 frames per second was (and still is) considered a generational leap in terms of hardware performance. If you swapped in a new card to replace your older one (and maybe that card wasn't actually old), and if (big if) the new card delivered 35 FPS more than the last generation, then that was an amazing piece of hardware, considering new cards usually delivered only a 5%-8% performance improvement, or 15% on the high end, and that was, to overclockers, a worthy upgrade.

Believe it or not, you'd see many new graphics cards released two years later that only got you a 10% to, at most, 15% improvement in performance if you were lucky. And that usually amounted to only an extra 17 frames per second in problem areas.

What is really amazing is that, due to software optimization that utilizes hardware better, we are now seeing hardware leapfrog its own performance and deliver generational leaps within the same set of hardware specifications, and that margin only becomes more evident as you move from the 2070 to the 3080. Both cards are literally doubling their performance margins (delivering 35-45 FPS more) with the advent of these software optimizations, so we are seeing generational leaps of performance improvement on old hardware (quick math at the end of this post).

Edit: And to be more specific - you would typically need to wait 7 years to get a new GPU that delivered an extra 35FPS.

Nvidia has, once and only once before this generation, delivered a full-fledged product that performed 35 FPS better within a short time frame, and that product took 4 years to get to consumers.
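To put rough numbers on that: the absolute FPS gain depends entirely on the baseline frame rate, so the same percentage uplift can mean anything from 25 to 50 extra frames. Illustrative math only, taking the article's "up to 85%" figure and a few assumed native-4K baselines:

# Illustrative only: converting a percentage uplift into absolute FPS,
# using the article's "up to 85%" figure and a few hypothetical baselines.
def fps_with_uplift(base_fps, uplift_pct):
    return base_fps * (1 + uplift_pct / 100)

for base in (30, 45, 60):  # assumed native-4K frame rates, not measured data
    boosted = fps_with_uplift(base, 85)
    print(f"{base} fps native -> {boosted:.0f} fps with DLSS (+{boosted - base:.0f} fps)")
# A 45 fps baseline lands around 83 fps, i.e. roughly the "35-45 FPS more" described above.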
 

sertopico

Member
IQ still needs to be improved when using DLSS, and I honestly don't know if they can push things further in this regard. The boost is considerable, but there are still issues.
 

BluRayHiDef

Banned
On amd release day. Fuck off nvidia and your shitty practices.

How is it shitty for Nvidia to make their products perform as best as they can in comparison to the competition?

Also, has there been any indication that these updates were provided to tech reviewers already, so that they'd be ready for the comparisons that said reviewers would make between the RX 6000 Series and the RTX 30 Series in their reviews of the former?
 

JeloSWE

Member
I don't like the sharpness filter that creates the white and black ringing around edges in DLSS 2.0. I wish there were a user option to tone it down or turn it off. Adding 2D post-sharpening just gives the illusion of extra detail when there is none.
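For anyone wondering where those halos come from: any unsharp-mask style post-sharpen overshoots at hard edges. NVIDIA hasn't published what filter DLSS 2.0 actually uses, so this is just a generic illustration of the effect, nothing more:

import numpy as np

# Generic unsharp-mask sharpening on a 1D edge, purely to show where the
# bright/dark halos come from (not NVIDIA's actual filter, which isn't public).
def unsharp_mask(signal, amount=1.5, radius=2):
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)   # simple box blur
    blurred = np.convolve(signal, kernel, mode="same")
    return signal + amount * (signal - blurred)            # add back the "detail"

edge = np.array([0.2] * 8 + [0.8] * 8)                     # a dark-to-bright edge
print(unsharp_mask(edge).round(2))
# The output dips below 0.2 just before the edge and overshoots 0.8 just after it
# (ignore the array ends, that's just zero padding) -- that undershoot/overshoot
# is exactly the dark/white ringing you see around edges after sharpening.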
 

psn

Member
I don't know, I was open to it. But I tried it and I can't stand it. While moving, it all goes blurry compared to native resolution. The low-bitrate YouTube videos don't tell the whole story.
 

Bo_Hazem

Banned
It's the best fake 4k by a country mile. Nothing is even remotely close.

I wonder what performance gains we can expect in Cyberpunk 2077. 🤔

Well, it is good, but inconsistent from game to game in terms of glitches and so on. Great nonetheless. PS5's Demon's Souls performance mode is much better though, as the devs confirmed that the source is 1440p used to make a 4K image, which is indistinguishable from native 4K, without any flaws or glitches. It's a Sony-patented solution; not sure if AMD will borrow it.
 
Well, it is good, but inconsistent from game to game in terms of glitches and so on. Great nonetheless. PS5's Demon's Souls performance mode is much better though, as the devs confirmed that the source is 1440p used to make a 4K image, which is indistinguishable from native 4K, without any flaws or glitches. It's a Sony-patented solution; not sure if AMD will borrow it.
Wait, you seriously think checkerboard > DLSS 2.0?!



 

Bo_Hazem

Banned
Wait, you seriously think checkerboard > DLSS 2.0?!





Who said Checkerboarding here? That's an old tech.

 
Who said Checkerboarding here? That's an old tech.

Are they using that patented tech in Demon Souls currently? I haven't heard anything about it, then again I haven't really been paying attention to it lately either.
 

Bo_Hazem

Banned
Are they using that patented tech in Demon Souls currently? I haven't heard anything about it, then again I haven't really been paying attention to it lately either.

They didn't mention it explicitly, but in the DF interview (a very long one, but very good to watch) with three devs from Bluepoint and, I think, Japan Studio, they confirmed that the 4K@60fps mode is actually using a 1440p source to reconstruct the final 4K image, versus the native 4K@30fps mode, and that the two look near identical. You need to watch it natively on the console; after John's zoom-ins, native 4K has a very slightly sharper image with extra tessellation.
 
Well, it is good, but inconsistent from game to game in terms of glitches and so on. Great nonetheless. PS5's Demon's Souls performance mode is much better though, as the devs confirmed that the source is 1440p used to make a 4K image, which is indistinguishable from native 4K, without any flaws or glitches. It's a Sony-patented solution; not sure if AMD will borrow it.
The only "glitch" I ever experienced with it was in Death Stranding: black trails behind the floating cryptobiote particles. It's so minor that I would never switch DLSS off because of it.

I've seen some direct footage taken from Gamersyde of Demon's Souls and it's indeed impressive. You can definitely tell which is 1440p upscaled and which is native 4k but it's impressive nonetheless. I think to achieve 60FPS they also had to lower some graphical settings (tessellation?).

The beautiful thing about DLSS 2.X is that in normal gameplay you can't really tell the difference between it being on or off. You don't have to compromise on any graphical settings. And you get ridiculous performance gains.

Like I said: nothing comes even remotely close to it at this time.
 

TonyK

Member
Who said Checkerboarding here? That's an old tech.

I know that you know it, but I ask regardless: you do know that filing a patent doesn't mean it's being used, right?
 

ratburger

Member
Who said Checkerboarding here? That's an old tech.

Not that nonsense again. At the bottom of that article is a link to the patent application. Open it up and scroll down to near the end. There's a bunch of drawings, one of which clearly shows a camera pointed at a person who seems to be gesturing. It's a patent for something completely unrelated to image upscaling, probably motion recognition for VR or the like.
 

Bo_Hazem

Banned
The only "glitch" I ever experienced with it was in Death Stranding: black trails behind the floating cryptobiote particles. It's so minor that I would never switch DLSS off because of it.

I've seen some direct footage taken from Gamersyde of Demon's Souls and it's indeed impressive. You can definitely tell which is 1440p upscaled and which is native 4k but it's impressive nonetheless. I think to achieve 60FPS they also had to lower some graphical settings (tessellation?).

The beautiful thing about DLSS 2.X is that in normal gameplay you can't really tell the difference between it being on or off. You don't have to compromise on any graphical settings. And you get ridiculous performance gains.

Like I said: nothing comes even remotely close to it at this time.

You should watch that video from Gamersyde. I think the glitches and blurriness in some of the DLSS 2.0 footage shown probably come from pushing it too hard, down to 1080p or lower, instead of using 1440p as the main source. It should help devs provide juicier graphics once both consoles can produce something close to, or better than, DLSS 2.0, since games are usually made for the lowest common denominator. AMD has promised a tech that should be leveraged by both consoles and AMD cards later.
 
You should watch that video from Gamersyde. I think the glitches and blurriness in some of the DLSS 2.0 footage shown probably come from pushing it too hard, down to 1080p or lower, instead of using 1440p as the main source. It should help devs provide juicier graphics once both consoles can produce something close to, or better than, DLSS 2.0, since games are usually made for the lowest common denominator. AMD has promised a tech that should be leveraged by both consoles and AMD cards later.
Man, I really hope so. We will need all the juice and more once these true next-gen games start to drop. A future where AI helps us achieve higher frame rates is a much brighter one than the future where it kills us off with nuclear fire. :messenger_grinning_sweat:
 
And to quickly put it another way, it's as if you were able to time travel ahead by 7 years, drop 750 dollars on a new card, then bring it back to play 4K games made today. Except you're getting all that performance and that generational leap without having to upgrade or spend money at all.

Remarkable.

Simply Remarkable.

A day will come when people use DLSS tech for 1080p image reconstruction by default, because the benefit will be far too large to ignore, and then of course 4K and so on. This tech isn't going to stop anytime soon; in fact, it is set to keep making disruptive performance gains ad infinitum. Amazing.
 

Bo_Hazem

Banned
Man, I really hope so. We will need all the juice and more once these true next-gen games start to drop. A future where AI helps us achieve higher frame rates is a much brighter one than the future where it kills us off with nuclear fire. :messenger_grinning_sweat:

2022 is when photorealistic gaming will start to pop up with UE5 games, along with PS Studios' own similar tech of per-frame polygon streaming. Lots of power is being lost in the traditional LOD system, not to mention bloated game sizes! Nvidia promised 14GB/s; AMD hasn't stated any number yet for their future DirectStorage implementation.

When PCIe 5.0 hits the market I'll build another PC. :messenger_sunglasses: 32GB/s SSDs and DDR5 RAM. :lollipop_anxious_sweat:
 

ZywyPL

Banned
4K60 with RT seems like a walk in the park for RTX GPUs thanks to DLSS, for real, let alone lower resolutions; I wonder how far NV can go with the tech. Cannot wait for CP2077 benchmarks, this game will be packed with RT effects, it will be the ultimate test for DLSS.


I wonder what llien thinks about this...? 🤷‍♂️ 🤷‍♀️

Probably as per usual - DLSS is a gimmick, it's just a sharpening filter, and nothing stops you from lowering the resolution on Radeon GPUs if you want to boost performance.
 

Bo_Hazem

Banned
4K60 with RT seems like a walk in the park for RTX GPUs thanks to DLSS, for real, let alone lower resolutions; I wonder how far NV can go with the tech. Cannot wait for CP2077 benchmarks, this game will be packed with RT effects, it will be the ultimate test for DLSS.




Probably as per usual - DLSS is a gimmick, it's just a sharpening filter, and nothing stops you from lowering the resolution on Radeon GPUs if you want to boost performance.

Not sure why some people call DLSS, 8K, or ray tracing a gimmick. Any computational trickery is welcome if the final result is great.

And yes, 8K is superior to 4K, but the jump isn't as massive as going from 1080p to 4K. RT is a game changer, and the tech should push hard until we reach path tracing and endless sound-source tracing. It'll benefit everyone, as lesser tech gets cheaper.
 

Dampf

Member
Cmon Nvidia, bring DLSS to more popular titles. Like RDR2.

Nobody cares for these games anymore...
 

sertopico

Member
Please, expand on the issues as you see them.
Texture shimmering and flickering. Reconstructing parts of the scene that have "high frequency" shaded surfaces like metal fences, grids and such is still problematic and in the end worsens the overall image quality; it's probably because of the sharpening applied.
 

llien

Member
Amazing boosts in performance.
2020, when people discover running games at lower resolution "boosts performance".


I wonder what llien thinks about this...? 🤷‍♂️ 🤷‍♀️
Non-DLSS looks blurry as hell, this looks like videos comparing two TAA derivatives.

Anyhow, after the Ampere fiasco, upscaling, RT and FUD about drivers are the way to go for team green.

How many games have you tried out?
One would have expected you'd shut the hell up on that, after you were quickly told which of the two screenshots you shared was DLSS 2 upscaled, but that might be too much to expect.

How does DLSS look better than native 4k?
You remember these videos are coming from serial liars, right?
Native is blurred, likely by non-DLSS TAA.
 
One would have expected you'd shut the hell up on that, after you were quickly told which of the two screenshots you shared was DLSS 2 upscaled, but that might be too much to expect.
Delicious salt. I'm lovin' it. :messenger_grinning_sweat: Do not stop. Do not ever quit. Keep going. :messenger_grinning_sweat:

The screenshot I included in that other thread had that tree in it on purpose. So you can zoom in on your 1080p screen by 500% and tell the difference. Good job, buddy! :messenger_grinning_sweat::messenger_grinning_sweat::messenger_grinning_sweat:
 

ZywyPL

Banned
2020, when people discover running games at lower resolution "boosts performance".



Non-DLSS looks blurry as hell, this looks like videos comparing two TAA derivatives.

Anyhow, after the Ampere fiasco, upscaling, RT and FUD about drivers are the way to go for team green.


One would have expected you'd shut the hell up on that, after you were quickly told which of the two screenshots you shared was DLSS 2 upscaled, but that might be too much to expect.


You remember these videos are coming from serial liars, right?
Native is blurred, likely by non-DLSS TAA.


 

Md Ray

Member
Are they using that patented tech in Demon Souls currently? I haven't heard anything about it, then again I haven't really been paying attention to it lately either.
In the Demon's Souls interview, they say they're using a temporal upsampling technique to get to 4K from a 1440p base in performance mode (60fps). This is different from checkerboarding. I imagine it's similar to UE5's temporal upsampling technique used in that Lumen in the Land of Nanite demo, which was also rendering at 1440p but with a dynamic scaler enabled. The IQ in that demo looked very pristine, 4K-like.
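Nothing has been published about Bluepoint's exact method, but the general idea behind temporal upsampling is simple enough: jitter the camera a tiny bit each frame, render at the lower resolution, and accumulate those samples into a full-resolution history buffer over time. A toy 1D sketch of that accumulation step (my own simplification, not their implementation):

import numpy as np

# Toy 1D sketch of temporal upsampling (my simplification, not Bluepoint's code):
# each frame renders at half resolution with an alternating sub-pixel jitter, and
# the samples are blended into a full-resolution history buffer over time.
HI = 16                                              # full-resolution pixel count
scene = lambda px: np.sin(6 * px / HI) * 0.5 + 0.5   # static "ground truth" signal

history = np.zeros(HI)                               # full-res accumulation buffer
alpha = 0.2                                          # blend weight for new samples

for frame in range(32):
    jitter = frame % 2                      # shift by one full-res pixel every other frame
    targets = np.arange(0, HI, 2) + jitter  # which full-res pixels this frame covers
    samples = scene(targets)                # "render" at half resolution
    history[targets] = (1 - alpha) * history[targets] + alpha * samples

reference = scene(np.arange(HI))
print("max error vs native full-res:", float(np.abs(history - reference).max()))
# For a static scene this converges toward the native image; the hard part in a real
# game is rejecting stale history when things move, which is where ghosting comes from.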
 

acm2000

Member
when Sony and Microsoft use sub-native 4K upscaling, gaf be like... :messenger_pouting::messenger_pouting::messenger_pouting::messenger_pouting:

when Nvidia uses sub-native 4K upscaling, gaf be like... 🙏🙏🙏🙏

4K is a waste, 4K doesn't sell games, all the prettys do.
 