
DF: Control vs DLSS 2.0: Can 540p Match 1080p Image Quality? Full Ray Tracing On RTX 2060?

pawel86ck

Banned
I'VE SEEN MOTHERFUCKING LIGHT

DLSS (60 FPS, VSync, ~30% GPU utilisation)

[screenshot]

[screenshot]

NATIVE (approx. 40 FPS)

[screenshot]


2080Ti

Just HOW?
The game is rendering at 960p and you also added DLSS on top of that? If that's the case, the internal resolution should be even lower than 960p (that would explain only 30% GPU usage).
 

pawel86ck

Banned
Monitor is 3440x1440; it renders at 2293x960 and DLSS upscales that to the monitor's resolution.
Are you sure? 2293x960 would push GPU usage much higher than 30%. With just 40 fps in your native-resolution screenshot (where we can assume ~99% GPU usage), 30% usage clearly suggests something even lower than 960p.
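A rough back-of-the-envelope check (assuming GPU load scales roughly with pixels shaded per second, which it doesn't exactly, and ignoring the cost of the DLSS pass itself):

```python
# Does ~30% GPU usage fit a 2293x960 internal resolution?
# Crude assumption: load scales with pixels shaded per second.
native_px = 3440 * 1440          # ~4.95 Mpx per frame
dlss_px   = 2293 * 960           # ~2.20 Mpx per frame

native_rate = native_px * 40     # ~198 Mpx/s at ~99% GPU usage
dlss_rate   = dlss_px * 60       # ~132 Mpx/s with the 60 fps vsync cap

expected_usage = 99 * dlss_rate / native_rate
print(f"Expected GPU usage at 2293x960 / 60 fps: ~{expected_usage:.0f}%")  # ~66%, not 30%
```

So even at 2293x960 you'd expect well over 30% load on this crude model, which is why the Game Bar number looks suspicious to me.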
 

M1chl

Currently Gif and Meme Champion
Are you sure? 2293x960 would push GPU usage much higher than 30%.
It's without AA, because DLSS takes care of that. You can see the setting in that screenshot.

The GPU load readout is from the Xbox Game Bar, so I'm honestly not sure what it actually shows.
 

Skyr

Member
Are you sure? 2293x960 would push GPU usage much higher than 30%. With just 40 fps in your native-resolution screenshot (where we can assume ~99% GPU usage), 30% usage clearly suggests something even lower than 960p.
I'd assume ray tracing was turned off?
 

Zannegan

Member
Nintendo not utilizing this technology could cost them next generation.
I'd actually flip that statement on its head and say: IF AMD doesn't already have something similar built into the PS5 and XSX, and IF Nintendo has the Tensor cores in their next custom Switch design, then a Switch 2 could get surprisingly close to the next-gen systems (NOTE: not "close," just surprisingly close given the form factor) and maybe even "match" the hypothetical Lockhart graphically. Of course, you still have to think about the gap in CPU power, which would be... generational.

Nintendo will never catch up to the other two, and this isn't a magic bullet, but it is exciting to think about their next hybrid using something like this to (potentially) punch significantly above its weight in first party games and maybe even get more 3rd party support. And, as Leadbetter said in the other video, it's also an interesting, low-cost path to a Switch Pro.

Caveat: I have no idea what I'm talking about. All my speculation is built on tech articles and Youtube videos, lol.
 
Not gonna lie, I didn't think I was going to be this excited to upgrade my GPU. I'm finally going to be replacing my 1080 but I honestly thought that I was still going to have to make significant sacrifices that would annoy me if I wanted to experience RTX in next gen games... This changes everything... I feel the same excitement now that I usually feel for a new console gen.

Buying my ViewSonic Elite XG270QG was a hard decision given some of the progress AMD have been making, since it kinda locks me into Nvidia if I want to get 100% out of its G-Sync module. I was kinda sweating a lil bit. But with this it looks like my gamble paid off. Man, this is going to be a game changer for sure. Especially if this can work for VR.

Now to figure out if I'm ok sticking with my 8086k paired with a 3080ti and dlss... Or if I need to go AMD for cpu... but then I would need to buy a new mobo as well... And probably ram to match the cpu? That's a big cost on top of a new gpu... Just don't know how much more performance I'd see over my 8086k.
 

TLZ

Banned
This is beautiful. I hope AMD can do this for PS and XB at the hardware level. It would give us 4K and 60 fps easily.

Also, Nvidia should help Nintendo take advantage of this. It'll benefit them the most since they have a lot of low-res games, and this'll make them look gorgeous.
 

darkinstinct

...lacks reading comprehension.
This is the real game changer. I always felt that native 4K rendering was a big waste of power, and DLSS fixes that. With this shit, even "8K" will be possible on consoles :pie_roffles:

The question is: will AMD, MS or Sony be able to replicate Nvidia's solution?
Why do you think the XSX has 8K written on its APU?
 

BeardGawd

Banned
I keep seeing people say MS isn't using something like this (because Sony hasn't announced the same thing) but Ninja Theory is already using DirectML in a title for Xbox Series X:


Gwertzman: You were talking about machine learning and content generation. I think that’s going to be interesting. One of the studios inside Microsoft has been experimenting with using ML models for asset generation. It’s working scarily well. To the point where we’re looking at shipping really low-res textures and having ML models uprez the textures in real time. You can’t tell the difference between the hand-authored high-res texture and the machine-scaled-up low-res texture, to the point that you may as well ship the low-res texture and let the machine do it.

Journalist: Can you do that on the hardware without install time?

Gwertzman:
Not even install time. Run time.

Journalist: To clarify, you’re talking about real time, moving around the 3D space, level of detail style?

Gwertzman:
Like literally not having to ship massive 2K by 2K textures. You can ship tiny textures.

Journalist: Are you saying they’re generated on the fly as you move around the scene, or they’re generated ahead of time?

Gwertzman:
The textures are being uprezzed in real time.

Journalist: So you can fit on one blu-ray.

Gwertzman:
The download is way smaller, but there’s no appreciable difference in game quality. Think of it more like a magical compression technology. That’s really magical. It takes a huge R&D budget. I look at things like that and say — either this is the next hard thing to compete on, hiring data scientists for a game studio, or it’s a product opportunity. We could be providing technologies like this to everyone to level the playing field again.

Should be really interesting because this will reduce the bandwidth needed for SSD transfers (smaller file sizes) and the space needed in RAM.
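Just to make the texture idea concrete, here's a toy sketch in PyTorch of what "uprezzing" a shipped low-res texture at run time could look like. This is not Microsoft's actual DirectML pipeline or the model Gwertzman describes, just placeholder layers to illustrate the shape of the technique:

```python
# Toy illustration only: upscaling a low-res texture with a small neural net at load/run time.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUpres(nn.Module):
    """Stand-in super-resolution model; a real one would be trained on hi-res/lo-res texture pairs."""
    def __init__(self, scale=4):
        super().__init__()
        self.scale = scale
        self.refine = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1),
        )

    def forward(self, x):
        # Cheap bicubic upscale, then a learned residual on top of it
        upscaled = F.interpolate(x, scale_factor=self.scale, mode="bicubic", align_corners=False)
        return (upscaled + self.refine(upscaled)).clamp(0, 1)

model = TinyUpres(scale=4).eval()
low_res = torch.rand(1, 3, 512, 512)      # stand-in for a shipped 512x512 texture
with torch.no_grad():
    high_res = model(low_res)             # 2048x2048 texture generated at run time
print(high_res.shape)                     # torch.Size([1, 3, 2048, 2048])
```

The real thing would obviously use a properly trained model running through DirectML on the console GPU rather than PyTorch, but that's the general idea: ship the small texture, generate the big one on the fly.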
 

CrustyBritches

Gold Member
same, I almost went 5700 XT but decided on 2070 Super last second
Not to hate on AMD, but they were wrong on this one. I go with whatever I see as the smart move. In the past I've skewed heavily towards AMD (7850 2GB with the ability to OC the core clock ~30%, 7870 Myst (XT), R9 390 8GB over the paltry 970 3.5GB, RX 480 8GB monster Ether miner), but in this case the 2060 Super was the correct move.

I almost went 2070 Super like you, as it's an amazing card, but I was able to get an open box/return deal from Amazon for $360 and still register it with Zotac for Manufacturer's Warranty. Zotac were very cool for allowing that in my case. I want to upgrade later this year, but I needed RTX and 8GB. You don't want to be stuck with 6GB on the 2060. In Wolfenstein Youngblood it totally tanks performance.
---
Youngblood 1440p comparison. *Note: I selected a 1280x720 portion of the screen and stacked the crops to avoid the supersampling effect of being resized to fit forum constraints.*

1. Native vs DLSS 'Performance' vs DLSS 'Quality'...
[screenshot]


'Performance' DLSS still has trouble with the wire mesh over the satellite in the center of the scene, but 'Quality' catches it correctly. 'Quality' gives a ~40% boost in frame rate over native, while 'Performance' gives a ~68% boost over native. In motion, I find 'Quality' to look superior to native: it helps solve aliasing and shimmer on edges without sacrificing sharpness.
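For reference, here are the internal resolutions behind those presets, assuming Youngblood uses the stock DLSS 2.0 scale factors (I haven't verified the exact pixel counts):

```python
# Standard DLSS 2.0 scale factors: Quality = 2/3 per axis, Performance = 1/2 per axis.
output = (2560, 1440)

def internal(res, per_axis_scale):
    return (round(res[0] * per_axis_scale), round(res[1] * per_axis_scale))

quality     = internal(output, 2 / 3)   # (1707, 960)  -> ~44% of the native pixel count
performance = internal(output, 1 / 2)   # (1280, 720)  ->  25% of the native pixel count

for name, res in [("Quality", quality), ("Performance", performance)]:
    share = res[0] * res[1] / (output[0] * output[1])
    print(f"{name}: {res[0]}x{res[1]} ({share:.0%} of native pixels)")
```

Which lines up with why 'Performance' gives the bigger frame-rate jump but struggles more with fine detail like that mesh.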

Full-size 1440p screenshots:
Native: [screenshot]

'Performance' DLSS: [screenshot]

'Quality' DLSS: [screenshot]
 

martino

Member
That Nvidia "fine wine". Imagine buying a $400 graphics card in 2019/2020 and having no DLSS/RT support. There's a reason I went with 2060 Super over 5700 XT.
Nvidia needs the competition, though. RDNA 2 has everything to prove, while Nvidia already does all of this and will land a revision that does it even better.
I really want to build a full AMD PC next time, but they need to deliver on the high end for that.
 

CrustyBritches

Gold Member
Nvidia needs the competition, though. RDNA 2 has everything to prove, while Nvidia already does all of this and will land a revision that does it even better.
I really want to build a full AMD PC next time, but they need to deliver on the high end for that.
For sure, and don't get me wrong, I've been an ardent AMD supporter since the K6-2 days. I fit into the demographic that found a lot of value in their FX CPUs, and I've had far more AMD than Nvidia cards (I didn't come close to listing them all above).

In this case, AMD basically said you can't provide performant RT at this level, but Nvidia has proved them wrong. That could be a real kick in the nuts if you buy a 5700 XT over a 2060 Super or 2070 Super in 2020 and find out you get no RT, while those cards are already enjoying the fruits of Nvidia's labor, as evidenced by Control and Wolfenstein Youngblood.

P.S. I'm totally excited about RDNA 2 and Ampere, and totally open to going RDNA 2. I want what's best for me over the next 2 years.
 

vkbest

Member
I keep seeing people say MS isn't using something like this (because Sony hasn't announced the same thing) but Ninja Theory is already using DirectML in a title for Xbox Series X:




Should be really interesting because this will reduce the bandwidth needed for SSD transfers (smaller file sizes) and the space needed in RAM.

You can do ML on shaders. The Xbox Series X supporting ML doesn't mean it has custom hardware for it.
 

martino

Member
You can do ML on shaders. The Xbox Series X supporting ML doesn't mean it has custom hardware for it.

Yeah, when there is doubt, one console is assumed to have no custom hardware and the other to be fully custom... even if those assumptions only come from a place that smells bad.
 

BeardGawd

Banned
You can do ML on shaders. The Xbox Series X supporting ML doesn't mean it has custom hardware for it.

MS did custom work with AMD to support AI in hardware:


"We knew that many inference algorithms need only 8-bit and 4-bit integer positions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."

Not only that. Hardware is only part of the equation. As mentioned in my link above, it takes a shitload of R&D to even get AI upscaling to work in real time at the software level, let alone in hardware. Which is why you haven't seen an equivalent from AMD or Sony at the moment.
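For what it's worth, Goossen's TOPS figures check out against the Series X's 12.15 TFLOPS FP32 spec once you account for packed integer math. Quick arithmetic (mine, not from the article):

```python
# Sanity-checking the quoted TOPS numbers against the Series X GPU spec.
# 52 CUs x 64 lanes x 2 ops/clock (FMA) x 1.825 GHz = ~12.15 TFLOPS FP32.
cus, lanes, ops_per_clock, clock_ghz = 52, 64, 2, 1.825
fp32_tflops = cus * lanes * ops_per_clock * clock_ghz / 1000

print(f"FP32: {fp32_tflops:.2f} TFLOPS")     # ~12.15
print(f"INT8: {fp32_tflops * 4:.0f} TOPS")   # ~49, assuming 4x rate via packed int8 math
print(f"INT4: {fp32_tflops * 8:.0f} TOPS")   # ~97, assuming 8x rate via packed int4 math
```

So the 49/97 TOPS numbers are the shader array running packed integer ops, not a separate tensor-core-style block.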
 

Leonidas

Member
Not only that. Hardware is only part of the equation. As mentioned in my link above, it takes a shitload of R&D to even get AI upscaling to work in real time at the software level, let alone in hardware. Which is why you haven't seen an equivalent from AMD or Sony at the moment.

Xbox is using DirectML, which is coming to PC and will be available to Nvidia, Intel and AMD.

XSX's theoretical ML capabilities are lower than those of Vega 64, and a far cry from what Nvidia has achieved with RTX and its Tensor cores.
 

BeardGawd

Banned
Xbox is using DirectML, which is coming to PC and will be available to Nvidia, Intel and AMD.

XSX's theoretical ML capabilities are lower than those of Vega 64, and a far cry from what Nvidia has achieved with RTX and its Tensor cores.

I'm not comparing the XSX to PC, just to the PS5. Anything DirectX-related will be available on PC.

It doesn't have to match Tensor-core capabilities to achieve performance/quality gains, like the AI upscaling of low-res textures I mentioned in a previous post. There will be many uses for AI going forward, and MS took these into consideration when designing the system.
 

01011001

Banned
Man, can this technique be added to the Switch, or is it only available on specific cards like RTX?

Well, it would use more hardware resources on the current Switch: DLSS runs on the Tensor cores in the RTX cards, so the Switch would need to spend way more GPU and/or CPU time to do this.

Maybe Nintendo and Nvidia will work on implementing a hardware solution in the next Switch?
 
This is the kind of tech Nvidia should have begun with. Not that RTX "It Just Works" space magic where, if you want acceptable frame rates on the mainstream card, you have to enable this fancy new technique called Vaseline 1.0, which was WORSE than downscaling and applying a simple image sharpener.

If they can make this available for all games on a driver level then they will be dancing on AMD's grave for the foreseeable future.

It will be interesting to see what Nvidia does with this. This tech will prolong the life of their cards, as fewer people will feel the need to upgrade if they can just run a game at 540p. And they don't want that; they want you upgrading every year.
 

PhoenixTank

Member
TAA is blurry.
Yeah, one of the concerns I had in the developer thread on this. We've got some footage in motion now, which is great, but it'd still be wonderful to see some comparisons without AA.
Just want to rule out science entirely before I declare this to be using some form of black magic.
Not gonna lie, I didn't think I was going to be this excited to upgrade my GPU. I'm finally going to be replacing my 1080 but I honestly thought that I was still going to have to make significant sacrifices that would annoy me if I wanted to experience RTX in next gen games... This changes everything... I feel the same excitement now that I usually feel for a new console gen.

Buying my ViewSonic Elite XG270QG was a hard decision given some of the progress AMD have been making, since it kinda locks me into Nvidia if I want to get 100% out of its G-Sync module. I was kinda sweating a lil bit. But with this it looks like my gamble paid off. Man, this is going to be a game changer for sure. Especially if this can work for VR.

Now to figure out if I'm ok sticking with my 8086k paired with a 3080ti and dlss... Or if I need to go AMD for cpu... but then I would need to buy a new mobo as well... And probably ram to match the cpu? That's a big cost on top of a new gpu... Just don't know how much more performance I'd see over my 8086k.
I wouldn't rush out to get a new CPU any time soon.
 