
Red Dead Redemption 2 DLSS support soon!

Armorian

Banned
Kinda on topic, but kinda not - as a newcomer to the PC Master Race, I have an ultrawide 3440x1440 monitor and I'm not sure if DLSS can actually be of any use to me. In Doom Eternal, when I turn it on, I see the trails on the little meteor bits in that opening scene (which indicates that DLSS is working), so what does that mean in terms of the resolution it's running at? Would it be rendering at less than 1440p and DLSS'ing up to 1440p? Or is it DLSS'ing 1440p up to 4K and then supersampling down to the 1440p monitor resolution?

With DLSS you are always below native res; for 1440p, Quality mode is probably around ~1080p base res. But YOU CAN supersample with DLSS: just use DSR with it and your base res will be higher than (or the same as) your native monitor resolution, and with DLSS magic it will look amazing. Just remember that only DSR 4x with zero smoothness looks good.
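
To put rough numbers on that combo (my own arithmetic, assuming DSR 4x doubles each axis and the commonly cited ~2/3 per-axis factor for DLSS Quality):

```c
#include <stdio.h>

/* Sketch of the DSR + DLSS combo described above. The 2/3 Quality
 * factor is an assumption, not a value read out of the game. */
int main(void)
{
    const int native_w = 3440, native_h = 1440;
    const int dsr_w = native_w * 2, dsr_h = native_h * 2; /* DSR 4x: 6880x2880 */
    const double quality = 2.0 / 3.0;                     /* assumed Quality factor */

    printf("DSR 4x output target:    %dx%d\n", dsr_w, dsr_h);
    printf("DLSS Quality render res: %.0fx%.0f\n",
           dsr_w * quality, dsr_h * quality);             /* ~4587x1920 */
    /* The actual render res stays above the 3440x1440 panel, so this is
     * effective supersampling even though DLSS is upscaling. */
    return 0;
}
```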

Pretty insane that DLSS-reconstructed 1080p looks better than native 1080p. How does that even make any sense haha.

DLSS doesn't have all the problems TAA has; even before 2.2, ghosting was less pronounced with DLSS compared to standard TAA.
 
Can they do that for PlayStation so it doesn't have the dog shit blur filter that Xbox doesn't have? Is that too much to ask?
Turning off HDR makes a world of difference; the "HDR" implementation they use ruins the sharpness on both consoles.
 

hlm666

Member
PS5 die size: 308 mm^2
RTX 2060 die size: 445 mm^2

PS5 has lots more power and a processor built in at 2/3 the die size. Even the smallest DLSS-capable cards have more silicon than the next-gen consoles.

When you are not using RTX or DLSS features, you are basically leaving about 1/3 of the silicon on your GPU unused. That's why graphics chips tend to be expensive.

It's almost like the 2060 was made on a larger fab node, you know, like 12nm vs 7nm.
edit: the RTX 3060 on 8nm is 276mm^2 and faster than the 2060, for instance.

It's also not 1/3 of the die space for tensor cores.
 

Kazza

Member
I realize this post is somewhat old and you've probably had a chance to experiment more, but for posterity's sake, here are some shots of DOOM Eternal running at native 1080p vs DLSS (Quality), RT enabled...

This is using DLSS 2.1, not the new 2.2 .dll. In gameplay I prefer the look of DLSS (Quality) over native, even disregarding any frame-rate target. It looks crisp and clean. Once you get into DRS with a high frame-rate target on the kind of high-refresh screen that usually ships with RTX 3xxx laptops, native res with DRS enabled (and no DLSS) takes a massive hit to image quality.

I'm still very much interested in this, so thanks a lot!

My laptop screen is a 144Hz one with very nice colours, which is one of the reasons I haven't tried using it with my 60Hz/4K living room screen yet. I thought the small 14 inch screen would bother me, but it actually feels great to play propped up on a table close to me while I sit up in bed.

Even a low powered laptop 3060 is probably more than enough to keep up with the PS5/XSX for the duration of this gen, even without using DLSS, but it's nice to have it as an extra boost. Some people even seem to prefer it to native res (as you did here). I'm still optimistic that we will get even better 1080p DLSS performance once the Switch 2/Pro finally comes out, as I think Nintendo will want to utilise the tech in order to boost lower res handheld mode (in addition to 4K for docked).
 
Okay, I'm not very knowledgeable about all this. I recently built a PC because I had some money, and here's my question:

There is no way the PS5 can do native 4K for €399, right?
 

Buggy Loop

Member
It's almost like the 2060 was made on a larger fab node, you know, like 12nm vs 7nm.
edit: the RTX 3060 on 8nm is 276mm^2 and faster than the 2060, for instance.

It's also not 1/3 of the die space for tensor cores.


I wanted to post something akin to that but I woke up late for work.

Would also need to add the density differences between nodes, because in reality the "x nm" label does not tell the whole picture. Turing's 12nm FinFET at TSMC is more or less a marketing gimmick, as it barely has higher density than the 16nm node: 33.8 MTr/mm^2 vs 28.2 MTr/mm^2.

While TSMC's N7P can give a whopping 96.5 MTr/mm^2.

(all theoretical of course)

So yeah… you can literally fit a CPU in there and still come out with a smaller silicon area.
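
Putting the quoted figures together, the theoretical budgets work out like this (same caveat as above: peak densities, not shipping chips):

```c
#include <stdio.h>

/* Theoretical transistor budgets from the densities quoted above
 * (MTr/mm^2 = millions of transistors per mm^2). */
int main(void)
{
    const double turing_12nm = 33.8;  /* MTr/mm^2, TSMC "12nm" FinFET */
    const double n7p         = 96.5;  /* MTr/mm^2, TSMC N7P           */
    const double rtx2060_mm2 = 445.0;
    const double ps5_mm2     = 308.0;

    printf("RTX 2060 (12nm, 445mm^2): ~%.1f BTr\n",
           rtx2060_mm2 * turing_12nm / 1000.0);   /* ~15.0 billion */
    printf("PS5      (7nm,  308mm^2): ~%.1f BTr\n",
           ps5_mm2 * n7p / 1000.0);               /* ~29.7 billion */
    /* The smaller 7nm die has roughly twice the theoretical budget,
     * CPU included. */
    return 0;
}
```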
 

Md Ray

Member
With the same settings as PS4 Pro it could easily run the game at native 4K (at 30FPS tho).
Xbox One X does that already. PS5 should get close to 60fps though.

I imagine a native 4K patch, unlocked frame-rate targeting 60fps w/ VRR support would provide a good experience.

Or

Fix and update the terrible checkerboard implementation of PS4 Pro for PS5 and uncap the frame-rate - this way a locked 60fps would be guaranteed.
 

evanft

Member
Kinda on topic, but kinda not - as a newcomer to the PC Master Race, I have an ultrawide 3440x1440 monitor and I'm not sure if DLSS can actually be of any use to me. In Doom Eternal, when I turn it on, I see the trails on the little meteor bits in that opening scene (which indicates that DLSS is working), so what does that mean in terms of the resolution it's running at? Would it be rendering at less than 1440p and DLSS'ing up to 1440p? Or is it DLSS'ing 1440p up to 4K and then supersampling down to the 1440p monitor resolution?

The internal resolution is lower than what you've set as your game resolution. The actual value depends on the DLSS mode you've chosen: Performance renders at about 50% of your output resolution per axis, Balanced at around 58%, and Quality at around 66%.
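
If you want the exact pixel counts for a 3440x1440 ultrawide, here's a minimal sketch using those per-axis factors (commonly cited approximations, not values read out of the game):

```c
#include <stdio.h>
#include <math.h>

/* Approximate per-axis render-scale factors for DLSS 2.x modes. */
static void internal_res(const char *mode, double scale, int w, int h)
{
    printf("%-12s %ldx%ld\n", mode, lround(w * scale), lround(h * scale));
}

int main(void)
{
    const int w = 3440, h = 1440;  /* ultrawide output resolution */
    internal_res("Quality",     2.0 / 3.0, w, h);  /* ~2293x960 */
    internal_res("Balanced",    0.58,      w, h);  /* ~1995x835 */
    internal_res("Performance", 0.50,      w, h);  /*  1720x720 */
    return 0;
}
```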

If you want to fix the trails, do a quick Google search on installing the DLSS 2.2 DLL. It works better than the version shipped with Doom Eternal and should mitigate some of the little issues seen with DLSS.
 

Kenpachii

Member
Doesn't the game run like ass on PC tho? I remember this thing hitting 40s at 1080p with a 2080 Ti, can only imagine what you need for 60fps at 4K.
 

Armorian

Banned
Doesn't the game run like ass on PC tho? I remember this thing hitting 40s at 1080p with a 2080 Ti, can only imagine what you need for 60fps at 4K.

Unlike 90% of console ports, RDR2's settings actually go much higher than what is set on consoles, so if you set everything to ultra it will run like shit...
 

Md Ray

Member
Doesn't the game run like ass on PC tho? I remember this thing hitting 40s at 1080p with a 2080 Ti, can only imagine what you need for 60fps at 4K.
As long as you don't crank up each and every setting (some of which aren't meant for today's PCs), you can even get up to 120fps and more with the right settings on a 3070 (a 2080 Ti equivalent with less VRAM).



I even made a thread a while ago on how to extract more perf out of Ampere GPUs (users of non-Ampere cards can try this too, btw):
 

yamaci17

Member
Doesn't the game run like ass on PC tho? I remember this thing hitting 40s at 1080p with a 2080 Ti, can only imagine what you need for 60fps at 4K.


by making precise optimizations to settings (most of them still on high) i am getting 1800p 60 fps on my 3070

there's no point pushing ultra settings whose effects will go mostly unnoticed by many users. i've done countless comparisons between high and ultra and concluded it wasn't worth the performance loss
 

TIGERCOOL

Member
As long as you don't crank up each and every setting (some of which aren't meant for today's PCs), you can even get up to 120fps and more with the right settings on a 3070 (a 2080 Ti equivalent with less VRAM).



I even made a thread a while ago on how to extract more perf out of Ampere GPUs (users of non-Ampere cards can try this too, btw):

Was hoping someone else in that thread on a non-Ampere card would have tested turning transfer queues on, but no... it had to turn into the typical RDR2 shit-slinging fest. People who don't like the game really wear it as a badge of honour and take every opportunity to let everyone know, no matter how many threads need to be derailed to do it.

Thanks for sharing your findings. I'll give it a try when I reinstall for the DLSS patch (RTX 2070).
 

Kenpachii

Member
As long as you don't crank up each and every setting (some of which aren't meant for today's PCs), you can even get up to 120fps and more with the right settings on a 3070 (a 2080 Ti equivalent with less VRAM).



I even made a thread a while ago on how to extract more perf out of Ampere GPUs (users of non-Ampere cards can try this too, btw):


Nice video, 100+ fps average is damn nice.
 

TIGERCOOL

Member
Confirmed in the game folder that it's DLSS 2.2. The image looks much crisper than TAA, which is expected. I seem to be getting about 10+ fps on Quality, 15+ on Balanced and 20 on Performance. Maybe I'm crazy, but the difference between the three modes is visually less pronounced than in other games.
 

Patrick S.

Banned
On one hand, I'm excited to try this. On the other hand, I play on an RTX 3080 at 1080p @ 60Hz, so I always had enough of them frames xD
 

Kenpachii

Member
On one hand, I'm excited to try this. On the other hand, I play on an RTX 3080 at 1080p @ 60Hz, so I always had enough of them frames xD

Buy a 144Hz screen right now. Do it.

 

SlimySnake

Flashless at the Golden Globes
The patch is out. 1.9GB.

Standing out in the woods:
Without DLSS: 85 fps
Quality DLSS: 96 fps
Balanced DLSS: 101 fps
Performance DLSS: 105 fps
Ultra Performance DLSS: 112 fps


Just a quick test I ran. I'll probably do the benchmark now.
That's a rather small upgrade for Quality. How is balanced? Anything below Quality is typically too blurry for me.
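
For reference, the relative gains from the fps numbers quoted above work out like this (just arithmetic on the posted figures):

```c
#include <stdio.h>

/* Percentage uplift over the 85 fps no-DLSS baseline quoted above. */
int main(void)
{
    const double base  = 85.0;
    const double fps[] = { 96.0, 101.0, 105.0, 112.0 };
    const char *mode[] = { "Quality", "Balanced",
                           "Performance", "Ultra Performance" };

    for (int i = 0; i < 4; i++)
        printf("%-18s +%.0f%%\n", mode[i], (fps[i] / base - 1.0) * 100.0);
    /* Quality +13%, Balanced +19%, Performance +24%, Ultra +32% */
    return 0;
}
```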
 

TIGERCOOL

Member
Did more testing in benchmark (Vulkan). Average fps -
dlss off: 76
dlss quality: 87
dlss balanced: 93
dlss performance: 98
dlss ultra performance: 106 (looks like a PS3 game)

tested at 1440p on a 2070, Ryzen 3600 with mixed settings.
Quality looks great. Balanced looks solid with some slight dithering on shadows and hair. Performance looks rough. Ultra performance falls off a cliff.

Edit: Quality dlss averaged 79 fps on the DX12 benchmark. Vulkan still has the odd stutter in populated areas.
 

ebevan91

Member
Did more testing in benchmark (Vulkan). Average fps -
dlss off: 76
dlss quality: 87
dlss balanced: 93
dlss performance: 98
dlss ultra performance: 106 (looks like a PS3 game)

tested at 1440p on a 2070, Ryzen 3600 with mixed settings.
Quality looks great. Balanced looks solid with some slight dithering on shadows and hair. Performance looks rough. Ultra performance falls off a cliff.

Curious to see DX12 benchmarks. Vulkan still has the odd stutter in populated areas.

I tested on DX12. I have a 2070, ran at 4K resolution, medium preset, put DLSS on Performance, and I'm getting 60fps (vsync on) near Saint Denis, which tanked my FPS in the past. It looks good too. Might mess around and try 1440p in a second.

OK, I tried 1440p and turned off vsync, and my frame rate went up to around 100 on Performance and around 80 on Quality, but the game was choppy as hell, so I put vsync back on and it stayed at 60fps and was real smooth.
 

TIGERCOOL

Member
hmm... weird dithering and ghosting showing up for me in quality mode now, especially on hair and foliage. hope it's just a bug that gets patched, because right now there are image quality tradeoffs between dlss quality and off. still prefer the overall look with it on but there are some annoyances.
 

Shai-Tan

Banned
Makes it playable with gfx options turned up on a 2080 Ti @ 4K.

I think DLSS looks better than TAA, even with the oversharpening, but it doesn't improve how muddy the game looks with its soft-looking distant textures/objects. I'm just not a fan of how they sacrificed clarity to make the scene more complex (compared to GTA V).
 

TIGERCOOL

Member
Ok... so I just tested DLSS 2.2.06 (the one used in R6 Siege), and my fps is averaging 96 in the benchmark... 10 fps higher than the 2.2.10 version the game updated with!
Anyone else want to run their own test on this? I actually raised my settings moderately before testing with 2.2.06, so others may get an even bigger bump.

Dithering and other artifacts seem comparable in my limited observations.
 

Md Ray

Member
Enable transfer queues if you're using Vulkan API (consider switching to VK from DX12 if it's faster).

Do let me know if you guys see any improved performance w/ Vulkan + transfer queues on.
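
For anyone wondering what that toggle refers to: in Vulkan, a dedicated transfer queue is a queue family that exposes transfer work but not graphics or compute, which lets asset uploads overlap rendering. Here's a conceptual sketch of how an engine might find one using the standard Vulkan API (illustrative only, not RDR2's actual code):

```c
#include <vulkan/vulkan.h>
#include <stdlib.h>

/* Find a queue family that only does transfers (a dedicated DMA/copy
 * engine). Assumes `phys` came from vkEnumeratePhysicalDevices(). */
static int find_dedicated_transfer_family(VkPhysicalDevice phys)
{
    uint32_t count = 0;
    vkGetPhysicalDeviceQueueFamilyProperties(phys, &count, NULL);

    VkQueueFamilyProperties *props =
        malloc(count * sizeof(VkQueueFamilyProperties));
    vkGetPhysicalDeviceQueueFamilyProperties(phys, &count, props);

    int family = -1;
    for (uint32_t i = 0; i < count; i++) {
        VkQueueFlags f = props[i].queueFlags;
        if ((f & VK_QUEUE_TRANSFER_BIT) &&
            !(f & (VK_QUEUE_GRAPHICS_BIT | VK_QUEUE_COMPUTE_BIT))) {
            family = (int)i;  /* streaming can run alongside rendering */
            break;
        }
    }
    free(props);
    return family;  /* -1 if the GPU exposes no dedicated transfer family */
}
```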
 