
Is DLSS really the way of the future?

Rikkori

Member
Yes!

[Image: Watch Dogs: Legion GeForce RTX screenshot resolution comparison (1K/4K/8K)]
 

GymWolf

Member
Out of the small handful of games it's available for, there are really only a couple of instances where the reconstructed image is better.

Native rendering is the way to go; the issue is that the engines these games are built on aren't targeting ultra-high-end, movie-like quality assets. That's why there are so many instances of games having a lot of rough edges when scaled to 4K.



Doom eternal says hi!
I haven't played that game yet, what about it?
 

Papacheeks

Banned
"doesn't need" that's like saying they don't "need 4k". It's not about what it needs it's about the fact that any game with it give you the option of a small decrease to image quality in exchange for higher frame rates and higher graphical setting.

No matter how much the hardware advances you will always be able to boost other things by lowering the resolution.

You totally missed the point.

Doom Eternal and Gears 5 have very well designed engines and created their games with 4K-scaled assets. Like, there are 4K ultra versions of textures that were created just for PC on Gears 5. Doom Eternal did a ton of API work with Vulkan and was able to make the game look and run great with minimal impact to image quality.

DLSS works for games that struggle to show a very sharp, uniform image when rendering at 4K. Some games were not designed with 4K in mind. So asset-wise and resolution-wise, you sometimes get 4K-rendered games with a higher pixel count, but not everything within the image scales correctly or renders without some kind of weird artifact.

That's where I think DLSS really shines: the samples it uses in the reconstruction can make the image look even better than native by filling in the inconsistencies in the engine.
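
To make that reconstruction idea concrete, here is a minimal, hypothetical sketch of the temporal super-sampling family that DLSS belongs to: jittered low-resolution frames each sample different output positions, and the accumulated history is what the reconstruction draws on. This is only an illustration of the general principle, not Nvidia's actual algorithm, which feeds the history, motion vectors and jitter into a trained network.

```python
import numpy as np

# Minimal, hypothetical illustration of temporal super-sampling, the family of
# techniques DLSS belongs to. Not Nvidia's implementation: the real pipeline
# feeds jittered samples, motion vectors and history into a trained network.

LOW, HIGH = 960, 3840   # low-res render width vs. output width (1D for brevity)
FRAMES = 4              # frames of history accumulated

covered = np.zeros(HIGH, dtype=bool)   # output positions that got a real sample

for frame in range(FRAMES):
    # Each frame the camera is jittered by a different sub-pixel offset,
    # so the same low-res grid lands on different output positions.
    jitter = frame / FRAMES
    positions = ((np.arange(LOW) + jitter) * HIGH / LOW).astype(int)
    covered[np.clip(positions, 0, HIGH - 1)] = True
    print(f"after frame {frame + 1}: {covered.mean():.0%} of output pixels "
          f"have a directly rendered sample")

# One 960-wide frame only covers 25% of a 3840-wide output, but a few jittered
# frames together cover most of it; that accumulated data is what the
# reconstruction (a network, in DLSS's case) uses to fill in the rest.
```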

Engine work being done right now won't require what Nvidia created, unless that engine work is not up to par. UE5, Crytek and Frostbite all push not only pixels but asset quality, and from the people I've talked to, they are not the only ones. The thing is, if Nvidia had made a deal back in the day and were supplying consoles with high-end APUs, it would be a different story for what engine development would be targeting.

Right now AMD is what they are targeting. If DLSS were such a giant deal, we would have seen some demonstration of it for UE5 on PC after the fact. But Sony has some kind of relationship with them. We got Fortnite gameplay with ray tracing, and that game can run on a toaster; UE4 is super optimized.


Yeah, so one is 4K and the other is 8K DLSS?

Should we not compare native 4K to DLSS 4K? Or is this 8K DLSS downsampled to 4K? Also, what game is this, and what settings is it at? If this is the new Watch Dogs then it proves my point about engines and assets being built for what they were targeting, which would be consoles. If it is Watch Dogs, then the so-called 4K render is using assets that were targeting consoles, upscaled to 4K, which is why it would look the way it does.

If Ubisoft had created raw 4K assets/textures, that native 4K image would look totally different.
 

Humdinger

Member
I thought I'd heard that some form of DLSS was coming to consoles. Is that wrong? Or is it still sort of unknown, because we don't know enough about the hardware?
 
I thought I'd heard that some form of DLSS was coming to consoles. Is that wrong? Or is it still sort of unknown, because we don't know enough about the hardware?

You heard wrong. DLSS requires Tensor Cores, which neither console has.

They will no doubt be working on new methods of checkerboard rendering. We'll see how those compare.
 

Rikkori

Member
Yeah, so one is 4K and the other is 8K DLSS?

Should we not compare native 4K to DLSS 4K? Or is this 8K DLSS downsampled to 4K? Also, what game is this, and what settings is it at? If this is the new Watch Dogs then it proves my point about engines and assets being built for what they were targeting, which would be consoles. If it is Watch Dogs, then the so-called 4K render is using assets that were targeting consoles, upscaled to 4K, which is why it would look the way it does.

If Ubisoft had created raw 4K assets/textures, that native 4K image would look totally different.

If 8K DLSS runs faster than 4K native, then why would we compare it to 4K DLSS? As for how the game could look totally different if we were living in another world where they did things your way - okay, sure? But no one's going to do that, and in the meantime DLSS is here to stay & greatly improve image quality and framerates. Embrace it. 👐

The image is from here:
 

Papacheeks

Banned
You heard wrong. DLSS requires Tensor Cores, which neither console has.

They will no doubt be working on new methods of checkerboard rendering. We'll see how those compare.

Correct, they don't have Tensor Cores. They are using the Radeon architecture, which, if Rogane is right, will have a lot more in it if we look at how he described Navi 21:

4 Shader Engines > 2 Shader Arrays per Shader Engine > 5 WGPs / 10 CUs per Shader Array > 4 RBs per Shader Engine

So you kind of have to do the math on the PS5/XSX, which are 36/52 CUs respectively.
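
For reference, that hierarchy multiplies out like this (a quick sketch; the Navi 21 figures are the rumoured ones quoted above, and the console numbers are the officially stated active CU counts):

```python
# Quick arithmetic on the rumoured Navi 21 layout quoted above, versus the
# publicly stated active CU counts of the new consoles.

shader_engines = 4
shader_arrays_per_engine = 2
cus_per_array = 10

navi21_cus = shader_engines * shader_arrays_per_engine * cus_per_array
print(f"Navi 21 (rumoured): {navi21_cus} CUs")           # 4 * 2 * 10 = 80

consoles = {"PS5": 36, "Xbox Series X": 52}               # active CUs
for name, cus in consoles.items():
    print(f"{name}: {cus} CUs (~{cus / navi21_cus:.0%} of the rumoured Navi 21)")
```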

If 8K DLSS runs faster than 4K native, then why would we compare it to 4K DLSS? As for how the game could look totally different if we were living in another world where they did things your way - okay, sure? But no one's going to do that, and in the meantime DLSS is here to stay & greatly improve image quality and framerates. Embrace it. 👐

The image is from here:


That's on a 3090, dude. A $1500 card that no one outside of CGI professionals and people with unlimited funds will buy.

That's not applicable to the majority of PC gamers. The 3060/3070/3080 are what we should be looking at when it comes to DLSS performance. 8K is still a ways off in terms of being attainable, price-wise, for the masses.
 

Lethal01

Member
You totally missed the point.

You're saying Doom doesn't need DLSS because it was designed around 4K.

I'm saying that if you used DLSS on Doom to bring it from 1440p to 4K, you would get better framerates/rendering at the cost of an extremely minor decrease in image quality in some cases (and increased image quality in others).

DLSS will continue to give huge benefits regardless of whether more engines are built around 4K.
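
The raw pixel arithmetic behind that trade-off is simple enough to sketch (the real framerate gain is smaller than the raw ratio, since not everything in a frame scales with resolution and the DLSS pass itself has a cost):

```python
# Rough pixel-count arithmetic behind "render at 1440p, reconstruct to 4K".
# The real speedup is smaller than this raw ratio: not all frame time scales
# with resolution, and the DLSS pass itself has a fixed per-frame cost.

native_4k = 3840 * 2160        # 8,294,400 pixels shaded per frame
internal_1440p = 2560 * 1440   # 3,686,400 pixels shaded per frame

ratio = native_4k / internal_1440p
print(f"native 4K shades {ratio:.2f}x as many pixels as a 1440p internal render")  # 2.25x
```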
 

INC

Member
You're saying Doom doesn't need DLSS because it was designed around 4K.

I'm saying that if you used DLSS on Doom to bring it from 1440p to 4K, you would get better framerates/rendering at the cost of an extremely minor decrease in image quality.

DLSS will continue to give huge benefits regardless of whether more engines are built around 4K.

You'd barely see a drop in quality unless you zoom in 400%.

Watch the DF Control DLSS 2.0 breakdown to see it in action.
 

Papacheeks

Banned
You're saying Doom doesn't need DLSS because it was designed around 4K.

I'm saying that if you used DLSS on Doom to bring it from 1440p to 4K, you would get better framerates/rendering at the cost of an extremely minor decrease in image quality.

DLSS will continue to give huge benefits regardless of whether more engines are built around 4K.

Yes, and if you looked at the Doom Eternal benchmarks for the 3080, you would see it's at 4K; nothing in the video suggests DLSS.



Just that it's native 4K with crazy frame rates. And even before the 3080 it looks insane.

And that game was just designed for high frame rates; it's using the same engine, optimized more for frames than image quality. But their character models are very detailed, and maybe they don't have crazy 4K textures and whatnot, but the lighting and character models are super great. The image looks very clean.

Also notice HDR is off and RS is off; not sure if this game supports HDR?

Could be the panel they were using did not have HDR.
 
You're saying Doom doesn't need DLSS because it was designed around 4K.

I'm saying that if you used DLSS on Doom to bring it from 1440p to 4K, you would get better framerates/rendering at the cost of an extremely minor decrease in image quality.

DLSS will continue to give huge benefits regardless of whether more engines are built around 4K.

He argues that you somehow don't need DLSS when a game is designed around 4K, and that DLSS is somehow worse when working with 4K assets. Neither statement makes any sense, tbh.
 

Lethal01

Member
Yes, and if you looked at the Doom Eternal benchmarks for the 3080, you would see it's at 4K; nothing in the video suggests DLSS.



Just that it's native 4K with crazy frame rates. And even before the 3080 it looks insane.

And that game was just designed for high frame rates; it's using the same engine, optimized more for frames than image quality. But their character models are very detailed, and maybe they don't have crazy 4K textures and whatnot, but the lighting and character models are super great. The image looks very clean.

Also notice HDR is off and RS is off; not sure if this game supports HDR?

Could be the panel they were using did not have HDR.


Okay, the point is they could have reached those same framerates if they went with DLSS 1440p while also gaining better graphics.
Or they could have the same graphics but get closer to 240Hz.

It seems like you're missing the point.
 

Papacheeks

Banned
Okay, the point is they could have reached those same framerates if they went with DLSS 1440p while also gaining better graphics.
Or they could have the same graphics but get closer to 240Hz.

It seems like you're missing the point.

You're missing it entirely. We now have GPUs that can handle 4K. And that will only increase, as will engine optimization targeting 4K and beyond in asset creation/quality.
Doom Eternal is proof that with the correct hardware (not the shitty RTX 2000 series) and engine optimization, you can achieve 4K at 100fps+.
 

Lethal01

Member
You're missing it entirely. We now have GPUs that can handle 4K. And that will only increase, as will engine optimization targeting 4K and beyond in asset creation/quality.
Doom Eternal is proof that with the correct hardware (not the shitty RTX 2000 series) and engine optimization, you can achieve 4K at 100fps+.

Yes, if you took those GPUs that can now handle 4K games, and those games running on engines optimized for native 4K, and instead ran them at 1440p upscaled to 4K via DLSS, the games would end up either running better or looking better.
 
DLSS 2 is a gaming miracle. You get higher resolution, nice anti-aliasing and ray tracing for a fraction of the compute cost, and image quality doesn't suffer. In fact, sometimes it is actually better than native. That is the magic of deep learning and Tensor Cores.
 

GlockSaint

Member
it does but you keep on keep on.
I understand your point, but not all games are as well optimized as Eternal. So while Eternal doesn't really need DLSS, many others like Control and RDR2 do (I understand RDR2 doesn't have it) to run at decent framerates without much loss in quality. Super-optimized ports of AAA titles are the exception, not the rule. That's why I asked in the thread if DLSS could be the solution to the problem in the future.
 

Papacheeks

Banned
Yes, if you took those GPUs that can now handle 4K games, and those games running on engines optimized for native 4K, and instead ran them at 1440p upscaled to 4K via DLSS, the games would end up either running better or looking better.

Wow. You truly believe that taking a 1440p image and reconstructing it at 4K will look better 100% of the time, on every game?

There are a couple of instances where that is the case, but they are literally a few titles. And they come with some large caveats when it comes to image quality.

My whole point is we don't need to rely on software to do this work; we have, and will in the next year or so have, a renaissance of innovation in GPU design and engine design. Which is why I keep bringing up the Unreal 5 demo. Wait until we have chiplet GPUs with embedded storage on the same PCB and possibly a co-processor.

You will have some of the clearest, cleanest image quality without needing software to downsample and reconstruct images.

I understand your point, but not all games are as well optimized as Eternal. So while Eternal doesn't really need DLSS, many others like Control and RDR2 do (I understand RDR2 doesn't have it) to run at decent framerates without much loss in quality. Super-optimized ports of AAA titles are the exception, not the rule. That's why I asked in the thread if DLSS could be the solution to the problem in the future.

DLSS is a short-term solution for games that really need it. I agree with that; I don't think it's the future of image rendering going forward. I think it is a viable option for games with poor optimization and poor engine scaling, but a lot of that will be mostly ironed out this gen. Most big publishers with good engines will now be targeting higher specs.

So asset quality will be much higher, and creating 4K/8K assets to be downsampled will now be possible without worrying about throughput.
 
What is your point against games having DLSS exactly, Papacheeks?

Every single game, even if it were miraculously the most optimized and best-looking game in the whole world at the same time, would still benefit from an optional feature like DLSS. Not every gamer has a 3080-series GPU, and even then DLSS is welcome. It is the future, and the future is now for Nvidia RTX 20 and 30 series card owners.
 

Lethal01

Member
Wow. You truly believe that taking a 1440p image and reconstructing it at 4K will look better 100% of the time, on every game?

There are a couple of instances where that is the case, but they are literally a few titles. And they come with some large caveats when it comes to image quality.

I'm not saying DLSS by itself will always be better than native. I'm saying that from what I've seen, when it's implemented properly it's at the very least as good as native 4K in games like Doom, and the power saved will always be better utilized on other things.

You talk about massive innovations in future GPUs; I'm saying that even after those innovations happen, native 4K will be a waste when you could use DLSS to get something that looks as good and use the extra power on something like better ray tracing or denser worlds.
 

Papacheeks

Banned
What is your point against games having DLSS exactly, Papacheeks?

Every single game, even if it were miraculously the most optimized and best-looking game in the whole world at the same time, would still benefit from an optional feature like DLSS. Not every gamer has a 3080-series GPU, and even then DLSS is welcome. It is the future, and the future is now for Nvidia RTX 20 and 30 series card owners.

You'll see soon. I'll add on to this tomorrow, I'm going to bed. But in short, I understand its implications and how it frees up processing on the GPU to be used for other things.

But that, to me, shows a larger issue: that the hardware can't do the job correctly at those resolutions with great detail and give 100% great images without artifacting, stuttering, among other things.

Nvidia uses DLSS to push RT.
With advancements in chiplet design, a huge fundamental change is coming in how GPUs are used and how they work with other components.

The need to supersample will go away; it's not giving you a 1:1 native render anyway, only a reconstruction that comes from different data sets depending on settings, game, etc.

Soon RT will become more standard, and GPUs won't be the only thing on the PCB anymore.

Soon your GPU won't need software to help claw performance back.

Enjoy DLSS. Glad you like it.
I'm just being objective and disagreeing with it being treated as some kind of gold standard, because really it's not.
It was created to sell 4K and RTX.
 
The right question should be "Is upscaling the future over native res?"

The answer is most likely yes, as consoles already use checkerboarding when they need constant fps with resolution scaling.

DLSS itself has some major drawbacks. It needs dedicated silicon, something consoles are always short of. It also needs per-game training and doesn't work out of the box.

Both of these things cost precious resources in terms of silicon and development time. I think consoles will move towards less effective but much cheaper solutions.
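
For comparison, the "cheaper solution" consoles already lean on is checkerboard rendering. A minimal, hypothetical sketch of the core idea (shade half the pixels each frame in an alternating checker pattern and fill the rest from the previous frame) looks something like this; real console implementations also reproject with motion vectors and use ID buffers, so treat it as an illustration only:

```python
import numpy as np

# Minimal, hypothetical sketch of checkerboard rendering: each frame shades
# only half the pixels (alternating checker pattern) and the other half is
# carried over from the previous frame. Real console implementations also
# reproject with motion vectors and use ID buffers to reject stale history.

H, W = 4, 8                              # tiny "framebuffer" for demonstration
yy, xx = np.mgrid[0:H, 0:W]
checker = (xx + yy) % 2                  # 0/1 checkerboard mask

def shade(frame_idx):
    """Pretend shading pass: returns a recognisable per-frame value."""
    return np.full((H, W), float(frame_idx + 1))

resolved = np.zeros((H, W))
for frame in range(2):
    mask = checker == (frame % 2)        # shade the *other* half each frame
    resolved[mask] = shade(frame)[mask]  # only half the shading cost per frame
    print(f"frame {frame}:\n{resolved}\n")

# After two frames every pixel has been shaded once, but each individual frame
# only paid for half of them: the same "spend the savings elsewhere"
# trade-off DLSS offers, just with a much simpler fill-in step.
```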
 

Alexios

Cores, shaders and BIOS oh my!
AI-enhanced image output seems like the future; lots of companies are working on it. Facebook is researching similar tech without tying it to specific hardware vendors (but presumably for use in future Oculus VR products like the Quest, which incidentally just got discontinued according to various retailers, just as we're waiting for the new Facebook - formerly Oculus - Connect conference slated for this month). Their method supposedly already outperforms any existing implementation. Perhaps we can see similar tech within game engines like UE5 eventually. It'd certainly help lessen the gap for portable devices like a Switch 2 or Oculus Quest 2 if the hardware can specialize for this purpose, and it would make dynamic resolution pretty much impossible to detect on more powerful machines, especially with a higher-quality input, unlike these extreme cases:

[Images: comparison figures from the research mentioned above: spaceship, dance studio, and village scene compositions]


If it can really achieve those results already it's kinda magic honestly. Like those old detective/cop shows where they just zoomed in on shitty camera footage and just went "enhance" to fix it, lol.
 

ZywyPL

Banned
Yes. If the undisputed market leader (NV) pushes something hard, you know that's the direction the industry will follow. It's a game-changing technology that provides better image quality and around double the framerate at just a slight transistor/die-size cost; it's a no-brainer. IMO devs should be bashed for not implementing the tech in their games going forward.
 

GymWolf

Member
When a 3080 without DLSS gets crushed at 4K60 ultra + RTX by Cyberpunk and all the future heavy/broken games in the next two years, I'm gonna fondly remember this narrative about "we already have GPUs that don't even sweat at 4K, we don't need DLSS"...

A fucking current-gen game like Avengers, which only looks good, puts a 2080 Ti on its knees even at 1440p, but we don't need DLSS.


I like how people live in a fantasy world where heavy/broken games on PC don't exist and stuff like Gears 5 and Doom are the norm and not the exception.
 

geordiemp

Member
Better than your constant dickriding of one tech demo for half a year now. We all saw it, we know what it was.

It's an example of temporal upscaling on PS5.

Other games on PS5 will use temporal upscaling too; all UE games going forward can use it.

And so far, in 4K60 mode:

Ratchet and Clank
Miles Morales

and every other PS5 Sony game that has mentioned it. Likely most games.

Just pointing out that temporal upscaling like we saw in that demo will be very common, and is better at 60 FPS as well. If you don't like that and think it's related to something sexual, that's your intellectual analysis and that's fine.

But it is a credible future upscaling technique, which is what the thread is about.

If you want a DLSS appraisal thread, then rename the thread.
 

Dampf

Member
When a 3080 without DLSS gets crushed at 4K60 ultra + RTX by Cyberpunk and all the future heavy/broken games in the next two years, I'm gonna fondly remember this narrative about "we already have GPUs that don't even sweat at 4K, we don't need DLSS"...

A fucking current-gen game like Avengers, which only looks good, puts a 2080 Ti on its knees even at 1440p, but we don't need DLSS.


I like how people live in a fantasy world where heavy/broken games on PC don't exist and stuff like Gears 5 and Doom are the norm and not the exception.
It's already "crushed" by Fortnite.

 

GymWolf

Member
It's already "crushed" by Fortnite.

Yeah, but to my knowledge they pump RTX to the max in these shitty-looking games for these presentations, so it's kinda a given that it doesn't run very well.
But yeah, you have a point. (Same point as mine :lollipop_grinning_sweat: )
 

GlockSaint

Member
When a 3080 without DLSS gets crushed at 4K60 ultra + RTX by Cyberpunk and all the future heavy/broken games in the next two years, I'm gonna fondly remember this narrative about "we already have GPUs that don't even sweat at 4K, we don't need DLSS"...

A fucking current-gen game like Avengers, which only looks good, puts a 2080 Ti on its knees even at 1440p, but we don't need DLSS.


I like how people live in a fantasy world where heavy/broken games on PC don't exist and stuff like Gears 5 and Doom are the norm and not the exception.
I swear man, Avengers looks pretty average even for current gen and runs like this; Horizon, a three-year-old game, can't run properly, etc. It feels like fantasy when people say poorly optimized games will be a thing of the past. Technically the 20 series should already be more than enough for next gen (Doom at 4K 90+fps), but that's almost never the case in 99% of games. So if DLSS is as good as the hype, it can help achieve that 4K without killing the GPU.
 

GymWolf

Member
I swear man, Avengers looks pretty average even for current gen and runs like this; Horizon, a three-year-old game, can't run properly, etc. It feels like fantasy when people say poorly optimized games will be a thing of the past. Technically the 20 series should already be more than enough for next gen (Doom at 4K 90+fps), but that's almost never the case in 99% of games. So if DLSS is as good as the hype, it can help achieve that 4K without killing the GPU.
I've spent enough time on PC to see the same thing happening with every generation of GPUs.

"OMG it's a beast, it's made for professionals and CG workers, it's wasted on just gaming"... then heavy or broken games come out (or are already out) that literally trash every GPU on the market at high settings, and people are like...
 

thelastword

Banned
When a 3080 without DLSS gets crushed at 4K60 ultra + RTX by Cyberpunk and all the future heavy/broken games in the next two years, I'm gonna fondly remember this narrative about "we already have GPUs that don't even sweat at 4K, we don't need DLSS"...

A fucking current-gen game like Avengers, which only looks good, puts a 2080 Ti on its knees even at 1440p, but we don't need DLSS.


I like how people live in a fantasy world where heavy/broken games on PC don't exist and stuff like Gears 5 and Doom are the norm and not the exception.
Surely you see the problem in that, don't you? Nvidia will keep on selling high-TF cards that should be pushing native 4K and 8K, but devs can just throw in a new tech feature like RT that tanks performance and piggyback on DLSS as optimization 2.0... Even without RT, as you said, devs will say "just ship it baby, DLSS will save us"... so they won't optimize their games as much, despite them not looking all that great or next gen...
 

GymWolf

Member
Surely you see the problem in that, don't you? Nvidia will keep on selling high-TF cards that should be pushing native 4K and 8K, but devs can just throw in a new tech feature like RT that tanks performance and piggyback on DLSS as optimization 2.0... Even without RT, as you said, devs will say "just ship it baby, DLSS will save us"... so they won't optimize their games as much, despite them not looking all that great or next gen...
Yes, and this is why DLSS is our saviour: devs are still gonna be lazy as fuck while porting stuff to PC, so having something to increase performance with barely any IQ loss is the best thing that could happen to PC gaming.

Is it ideal? No.

Is it better than when we didn't have DLSS? Hell yes.

This is the whole point of the topic.

DLSS on console would be a dream: no more worrying about not seeing jumps in graphics because devs have to chase native 4K.

If cross-gen stuff like Ratchet and Spidey looks great to you, imagine the same games with DLSS and all the resources for 4K redirected to other stuff like detail, RTX and framerate.
 

Jon Neu

Banned
Even without RT, as you said, devs will say "just ship it baby, DLSS will save us"... so they won't optimize their games as much, despite them not looking all that great or next gen...

Fucking DLSS, you ruined our games!

It's already been proven that some games using DLSS to 4K can have better IQ than native 4K. See Death Stranding.

I have no doubt that can and will be the case for 8K.

This is literally the best thing that happened to videogames in recent times.

When consoles finally develop their upscaling response to DLSS, even if it's probably nowhere near as good, it's still going to be a huge boost of performance for games. You can literally do a lot more with the same hardware.
 

martino

Member
Surely you see the problem in that, don't you? Nvidia will keep on selling high-TF cards that should be pushing native 4K and 8K, but devs can just throw in a new tech feature like RT that tanks performance and piggyback on DLSS as optimization 2.0... Even without RT, as you said, devs will say "just ship it baby, DLSS will save us"... so they won't optimize their games as much, despite them not looking all that great or next gen...

You're being hard on devs, and you don't see the real concern in lastword's point...
Some of those costly effects could be easy to spot and advertise.
 

Xyphie

Member
Not necessarily DLSS, but something like it will be standard fare when next-gen game engines are mature as it's a big step up in quality from checkerboarding and other reconstruction techniques.
 

octiny

Banned
DLSS 1.0 was absolute shit.

However..

I'd rather play any game with DLSS 2.0 on Quality (the highest DLSS setting) at 4K than play native 4K with AA, even if I could get the same frame rate on both. I don't expect anything less with 8K DLSS.

It's simply that good now.

Can't wait for more games to implement it, and looking forward to DLSS 2.1.
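
For context on what "Quality" means here: DLSS 2.x modes use fixed internal render scales, so at a 4K output each mode implies roughly the following internal resolutions (the scale factors below are the commonly documented per-axis values; exact numbers can vary slightly by title):

```python
# Commonly documented per-axis render scales for DLSS 2.x quality modes.
# Exact values can vary slightly by title/SDK version, so treat as approximate.

modes = {
    "Quality":           2 / 3,   # ~66.7% per axis
    "Balanced":          0.58,
    "Performance":       0.50,
    "Ultra Performance": 1 / 3,   # ~33.3% per axis
}

out_w, out_h = 3840, 2160         # 4K output
for name, scale in modes.items():
    w, h = round(out_w * scale), round(out_h * scale)
    print(f"{name:17s} renders internally at ~{w}x{h} "
          f"({scale * scale:.0%} of the output pixel count)")
```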
 

thelastword

Banned
Yes, and this is why DLSS is our saviour: devs are still gonna be lazy as fuck while porting stuff to PC, so having something to increase performance with barely any IQ loss is the best thing that could happen to PC gaming.

Is it ideal? No.

Is it better than when we didn't have DLSS? Hell yes.

This is the whole point of the topic.
Well, the downside to that is you won't see the necessary ambition and graphical boundary-pushing in games; the real graphical engineering from talented devs who actually make games look their best on the hardware given, with the best framerates they can muster. We will get lots of so-so looking games with bad optimization... It's a perfect scenario for Nvidia if you think about it: they can keep selling a high TF number but not necessarily the performance that should come with it, hence why they are already on the 2-CUDA-cores-per-SM train to sweeten people's eyes with a high TF number... yet focus on a low GPU footprint via DLSS to sell those cards. Surely you see the irony...


Now I'll tell you this: eventually you will see even worse-optimized games than Avengers, or perhaps games that are actually pushing graphical boundaries, and they will tank the 3080 at 4K even with DLSS. If anything, DLSS will stagnate graphical boundary-pushing, as people will rely on tech (such as RT) slapped in to save them optimization work... yet RT is expensive if just brute-forced, and here is your dilemma. That's why I said, and you will see it continue in the future too, that consoles will continue to lead in pushing the graphical boundaries of our games, because no matter how powerful PCs get, they are only relegated to pushing more frames over consoles...
 

GymWolf

Member
Well, the downside to that is you won't see the necessary ambition and graphical boundary-pushing in games; the real graphical engineering from talented devs who actually make games look their best on the hardware given, with the best framerates they can muster. We will get lots of so-so looking games with bad optimization... It's a perfect scenario for Nvidia if you think about it: they can keep selling a high TF number but not necessarily the performance that should come with it, hence why they are already on the 2-CUDA-cores-per-SM train to sweeten people's eyes with a high TF number... yet focus on a low GPU footprint via DLSS to sell those cards. Surely you see the irony...


Now I'll tell you this: eventually you will see even worse-optimized games than Avengers, or perhaps games that are actually pushing graphical boundaries, and they will tank the 3080 at 4K even with DLSS. If anything, DLSS will stagnate graphical boundary-pushing, as people will rely on tech (such as RT) slapped in to save them optimization work... yet RT is expensive if just brute-forced, and here is your dilemma. That's why I said, and you will see it continue in the future too, that consoles will continue to lead in pushing the graphical boundaries of our games, because no matter how powerful PCs get, they are only relegated to pushing more frames over consoles...
The fact is, if DLSS were not a thing, are you sure things would be any different?

I think that if a dev team is good and has passion, they would still put in their max effort, DLSS being a thing or not.

Same for lazy dev teams doing shitty work with or without DLSS; I mean, look at all the gens before DLSS, a lot of shitty stuff was still there.
 

The Cockatrice

Gold Member
Fucking DLSS, you ruined our games!

Disprove his point. It already happened with Control. Try running it without DLSS but with ray tracing: you get like 25-30 frames per second in a linear game. Remedy are hot garbage at optimizing their games. Remember Quantum Break? Yeah, go watch some benchmarks of that as well. DLSS is a nice feature, but it shouldn't be used as a crutch for lazy devs. It will be, though.
 

GymWolf

Member
Disprove his point. It already happened with Control. Try running it without DLSS with ray tracing. It has like 25-30 frames per second, a linear game. Remedy are hot garbage at optimizing their games. Remember Quantum Break? yeah go watch some benchmarks of that as well. DLSS is a nice feature but it shouldnt be used as a tool for lazy devs but it will be.
Quantum Break was the most broken game I've played on PC in the last 5+ years, what a shitshow.
 

nochance

Banned
Well, the downside to that is you won't see the necessary ambition and graphical boundary-pushing in games; the real graphical engineering from talented devs who actually make games look their best on the hardware given, with the best framerates they can muster. We will get lots of so-so looking games with bad optimization... It's a perfect scenario for Nvidia if you think about it: they can keep selling a high TF number but not necessarily the performance that should come with it, hence why they are already on the 2-CUDA-cores-per-SM train to sweeten people's eyes with a high TF number... yet focus on a low GPU footprint via DLSS to sell those cards. Surely you see the irony...


Now I'll tell you this: eventually you will see even worse-optimized games than Avengers, or perhaps games that are actually pushing graphical boundaries, and they will tank the 3080 at 4K even with DLSS. If anything, DLSS will stagnate graphical boundary-pushing, as people will rely on tech (such as RT) slapped in to save them optimization work... yet RT is expensive if just brute-forced, and here is your dilemma. That's why I said, and you will see it continue in the future too, that consoles will continue to lead in pushing the graphical boundaries of our games, because no matter how powerful PCs get, they are only relegated to pushing more frames over consoles...
This reasoning could be applied if we were in the PS2 days. Games rarely use proprietary engines anymore; they use standardised tools that push optimised pipelines.

Outside of a few bad ports, the only time you will see DLSS required is when the game actually pushes the graphical envelope past the point that the hardware can cope with.
 