
RTX 2080 Ti Can Barely hit 30fps at 1080p ultra running Watch Dogs Legion

Siri

Banned
Max settings are so overrated in a lot of cases. With RDR2 in particular there is no perceptible difference between high and max for a lot of the available settings, other than the fact that they tank your framerate. Tree tessellation, I'm looking at you. The same goes for other games like AC Odyssey.

The 2080 Ti is a perfectly good 4K 60fps card, especially as DLSS 2.0 becomes more prevalent.

The RTX 2080 Ti is the first real 4K card from Nvidia - and I agree, for the most part, about max settings being overrated.

Still, there are times when I have to lower the settings and am bothered by it. It’s a matter of personal taste.

Don’t get me wrong. I absolutely love my 2080 Ti. I’d be lying, however, if I said I’m not waiting in anticipation of the 3080 Ti. That card is going to be incredible - the price will likely be hard to justify, though. LOL.
 

GHG

Member
The RTX 2080 Ti is the first real 4K card from Nvidia - and I agree, for the most part, about max settings being overrated.

Still, there are times when I have to lower the settings and am bothered by it. It’s a matter of personal taste.

Don’t get me wrong. I absolutely love my 2080 Ti. I’d be lying, however, if I said I’m not waiting in anticipation of the 3080 Ti. That card is going to be incredible - the price will likely be hard to justify, though. LOL.

It's funny because I tried out Wolfenstein Youngblood today on my 2070 Super and I'm getting 60-70 fps everywhere, maxed out with RTX on at 4K, thanks to DLSS 2.0.

DLSS 2.0 needs to be in every game possible. The results are mind-boggling and it will change the entire game in terms of what qualifies as a 4K GPU and what doesn't.
 
Okay... so it's poorly optimized. It's an Ubisoft game... clearly it's not pushing fidelity boundaries or anything, so that should tip you off that it's Ubisoft. I'm not seeing what that has to do with the value of PC gaming lol.

Tired of people who don't play on both platforms, only play console... having an opinion as if it carries as much insight as the people who actively play both PC and console lol.

I can tell who isn't a PC gamer when the only reason they think PC gaming is valuable is because of performance.
 

Reindeer

Member
And it's nothing but assumption on your part to state otherwise. I would say UE5 definitely takes the performance crown; I can't recall anything that impressive being played in real time, so...

And Nanite renders in 4.25 ms, fine for 60 FPS; it's the Lumen GI that is being worked on to get it to 60.

If you prefer DLSS that's fine, I disagree. The UE5 demo was more impressive than any PC game with RT, and it will also be on PC, so everybody wins.

There are a couple of issues with your argument:

Your statement: "And it's nothing but assumption on your part to state otherwise."

Actually, we have proof of DLSS in action, and the results are very impressive compared to what was achieved in the Unreal demo. DLSS on a 2080 Ti, for example, is able to push 4K with ray tracing on, the same ray tracing that is far more computationally demanding than Epic's solution. The fact that the Unreal demo wasn't even able to maintain 4K (1440p most of the time, as per Epic) while utilising far less computationally expensive GI is rather damning.

Your statement: "The UE5 demo was more impressive than any PC game with RT"

This is a bizarre statement to make, since RT is far more expensive than Lumen and yet on PC we have seen RT at 4K60 through DLSS, which is far more than we can say for that Unreal demo. You're basically suggesting that a less computationally expensive demo achieving a worse result is actually more impressive...
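
For context on the frame-budget arithmetic behind these claims, here is a rough sketch in Python (only the ~4.25 ms Nanite figure comes from the discussion above; the Lumen cost is a placeholder assumption for illustration):

# Rough frame-budget arithmetic for the numbers discussed above.
# Only the ~4.25 ms Nanite cost is quoted from Epic; the Lumen cost
# below is a hypothetical placeholder, not a measured figure.
TARGET_FPS = 60
frame_budget_ms = 1000.0 / TARGET_FPS        # ~16.67 ms per frame at 60 fps

nanite_ms = 4.25                             # Epic's stated Nanite geometry cost
lumen_ms_assumed = 8.0                       # illustrative GI cost only

remaining_ms = frame_budget_ms - nanite_ms - lumen_ms_assumed
print(f"Frame budget at {TARGET_FPS} fps: {frame_budget_ms:.2f} ms")
print(f"Left for shading, post-processing, etc.: {remaining_ms:.2f} ms")

The point being: whether either technique "hits 60" comes down to whether everything together fits inside the ~16.7 ms budget, not the cost of any single feature in isolation.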
 
Last edited by a moderator:

A.Romero

Member
Okay... so it's poorly optimized. It's an Ubisoft game... clearly it's not pushing fidelity boundaries or anything, so that should tip you off that it's Ubisoft. I'm not seeing what that has to do with the value of PC gaming lol.

Tired of people who don't play on both platforms, only play console... having an opinion as if it carries as much insight as the people who actively play both PC and console lol.

I can tell who isn't a PC gamer when the only reason they think PC gaming is valuable is because of performance.

I'm a PC gamer and while I recognize it is not the only value, it's probably the main one for me.

Followed by great prices for games (consoles have been catching up a bit).

Third, the flexibility of playing with mods, the controller of my choosing, etc.
 

geordiemp

Member
There are a couple of issues with your argument:

Your statement: "And it's nothing but assumption on your part to state otherwise."

Actually, we have proof of DLSS in action, and the results are very impressive compared to what was achieved in the Unreal demo. DLSS on a 2080 Ti, for example, is able to push 4K with ray tracing on, the same ray tracing that is far more computationally demanding than Epic's solution. The fact that the Unreal demo wasn't even able to maintain 4K (1440p most of the time, as per Epic) while utilising far less computationally expensive GI is rather damning.

Your statement: "The UE5 demo was more impressive than any PC game with RT"

This is a bizarre statement to make, since RT is far more expensive than Lumen and yet on PC we have seen RT at 4K60 through DLSS, which is far more than we can say for that Unreal demo. You're basically suggesting that a less computationally expensive demo achieving a worse result is actually more impressive...

Just read what you typed, amusing: "4K through DLSS", not "1440p upscaled to 4K using DLSS".

You notice how Nvidia fans don't like to say the native resolution and just state what it was upscaled to?

Maybe it's an Nvidia PR thing; they say "4K enhanced with DLSS 2.0 supercomputer machine learning"...

...which sounds better than "1440p upscaled".

...and how is that better than Nanite? You will get a lot of UE5 games next gen, I bet, so...

Next gen, everything is upscaled and fans argue over which reconstruction technique is better - lol, who would have thought this a few years ago. Hilarious.

Also, the UE5 demo was the most impressive, as there were no flat LODs or textures... everything had rendering depth; it beats out any other renderer to date.
 
Last edited:

Reindeer

Member
Just read what you typed, amusing: "4K through DLSS", not "1440p upscaled to 4K using DLSS".

You notice how Nvidia fans don't like to say the native resolution and just state what it was upscaled to?

Maybe it's an Nvidia PR thing; they say "4K enhanced with DLSS 2.0 supercomputer machine learning"...

...which sounds better than "1440p upscaled".

...and how is that different to Nanite?

Next gen, everything is upscaled and fans argue over which reconstruction technique is better - lol, who would have thought this a few years ago. Hilarious.
Obviously it's 1440p upscaled to 4K; anyone with half a brain knows that, and it's not something I even thought needed to be mentioned. The fact that you somehow equated that to Nvidia fanboyism on my part, without any further proof, is pretty ridiculous if I may say. The very fact that we are talking about upscaling here obviously means it's 4K upscaled and not native.

Holding a 4K output upscaled from 1440p is far more impressive than upscaling from 1440p and not being able to hold that 4K output most of the time (again, as per Epic); this is the very reason I mentioned 1440p for the Unreal demo. It's quite clear the solution Epic is using is more akin to dynamic resolution than a flat upscale like DLSS, which shows DLSS as the superior of the two. Any solution that has dedicated hardware to back it up will be more performant than one without such hardware, which is why DLSS has a clear advantage.
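
For anyone who wants the raw numbers behind the "1440p upscaled to 4K" point, a quick sketch in Python (plain pixel arithmetic, nothing assumed beyond the standard resolutions):

# Pixel counts behind the 1440p-to-4K reconstruction argument.
native_4k = 3840 * 2160       # 8,294,400 pixels
base_1440p = 2560 * 1440      # 3,686,400 pixels

ratio = base_1440p / native_4k
print(f"1440p shades {ratio:.1%} of the pixels of native 4K")    # ~44.4%
print(f"Upscale factor per axis: {3840 / 2560:.2f}x")            # 1.50x

In other words, any reconstruction to 4K from a 1440p base only has to shade roughly 44% of the pixels each frame; the argument here is about how well DLSS and Epic's temporal upscaling hide the remaining 56%.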
 
Last edited by a moderator:

geordiemp

Member
Obviously it's 1440p upscaled to 4K; anyone with half a brain knows that, and it's not something I even thought needed to be mentioned. The fact that you somehow equated that to Nvidia fanboyism on my part, without any further proof, is pretty ridiculous if I may say. The very fact that we are talking about upscaling here obviously means it's 4K upscaled and not native.

Holding a 4K output upscaled from 1440p is far more impressive than upscaling from 1440p and not being able to hold that 4K output most of the time (again, as per Epic); this is the very reason I mentioned 1440p for the Unreal demo. It's quite clear the solution Epic is using is more akin to dynamic resolution than a flat upscale like DLSS, which shows DLSS as the superior of the two. Any solution that has dedicated hardware to back it up will be more performant than one without such hardware, which is why DLSS has a clear advantage.

Sorry, I did not mean to imply you are an Nvidia fanboy; it was just my general observation of how DLSS is introduced by Digital Foundry and everyone else: "4K enhanced by DLSS". You've got to admit it's strong Nvidia PR that's won out. How did they manage that? Nobody could tell the native resolution; it could not be measured or zoomed in on and seen.

Nope, the temporal UE5 upscaling is better, because DF could not tell. They can tell when DLSS is being used, there are enough videos. DF gave up on pixel counting the Nanite demo, do you want the link?

Also, Nanite has more depth, as it's made of triangles, not a flat texture stuck on top of a mesh - it's why it looked so amazing and like nothing else ever shown.
 
Last edited:

Alexios

Cores, shaders and BIOS oh my!
ffs, every single time...
Performance at "max" settings, without context and deep understanding what these settings entail, is completely irrelevant for judging the technical quality of a game, and it's highly damaging how often it seems to be used to evaluate the same. I've wanted to make a thread about this for a while, and seeing how there is right now another one on the front page with "max settings" in the title it seems as good a time as ever.

These days, many people seem to judge the "optimization" (a broadly misunderstood term if I ever saw one!) of games by how they run at "max" settings. What does this mean in practice? Let's say I'm porting a game to PC, and I'm trying to decide which options to include. I could easily add the option of rendering shadow depth buffers at 32-bit precision and up to 4096x4096, instead of the 16-bit, 1024² default. But what would this actually cause to happen? Basically, it will improve IQ and image stability, especially at very high resolution. However, let's assume for the sake of argument that it also halves the framerate of my port when enabled.
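
To put a rough number on that hypothetical option, here's a quick sketch in Python (the resolutions and bit depths are simply the ones from the example above; the memory math is straightforward):

# Memory cost of the hypothetical shadow-map option described above.
def shadow_map_bytes(resolution, bits_per_texel):
    return resolution * resolution * (bits_per_texel // 8)

default_mb = shadow_map_bytes(1024, 16) / 2**20    # 16-bit, 1024x1024
maxed_mb   = shadow_map_bytes(4096, 32) / 2**20    # 32-bit, 4096x4096

print(f"default: {default_mb:.0f} MB per shadow map")   # 2 MB
print(f"'max':   {maxed_mb:.0f} MB per shadow map")     # 64 MB: 16x the texels, 32x the memory

Sixteen times as many texels to rasterize and sample per shadow map is exactly the kind of cost that quietly halves a framerate while looking nearly identical in screenshots.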

In the prevailing simplistic mindset, I just went from a "great, optimized port" to a "piece of shit port showing how my company is disrespectful of PC gamers" merely by adding an option to my game.

I hope everyone can see how fucking insane this is. As a developer aware of this, I basically have 2 options:
  1. Only allow access to higher-end settings via some ini file or other method which is not easily accessible.
  2. Simply don't bother with higher-end settings at all.
The first point wouldn't be too bad, but it seems to be the much rarer choice. If the prevailing opinion of my game's technical quality actually goes down when I include high-end options, then why bother at all?

Of course, gamers are not to blame for this exclusively. Review sites got into the habit of benchmarking only "max" settings, especially during the latter part of the PS360 generation, simply because GPUs wouldn't be challenged at all in the vast majority of games otherwise.
 
Last edited:

Mister Wolf

Member
Sorry, I did not mean to imply you are an Nvidia fanboy; it was just my general observation of how DLSS is introduced by Digital Foundry and everyone else: "4K enhanced by DLSS". You've got to admit it's strong Nvidia PR that's won out. How did they manage that? Nobody could tell the native resolution; it could not be measured or zoomed in on and seen.

Nope, the temporal UE5 upscaling is better, because DF could not tell. They can tell when DLSS is being used, there are enough videos. DF gave up on pixel counting the Nanite demo, do you want the link?

Also, Nanite has more depth, as it's made of triangles, not a flat texture stuck on top of a mesh - it's why it looked so amazing and like nothing else ever shown.

They never said they couldn't tell it wasn't 4K; they said they couldn't tell what the base resolution was. That makes sense now, because the demo is using dynamic resolution. Obviously they never had to attempt pixel counting DLSS, because they knew what its base resolution was from the get-go.
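
Since dynamic resolution keeps coming up, here is a simplified illustration in Python of how such a system typically works (a generic heuristic, not UE5's or any engine's actual implementation):

# Simplified dynamic-resolution heuristic (illustration only, not any
# engine's actual implementation): nudge the render scale each frame
# based on how the previous frame compared to the target budget.
TARGET_MS = 16.67                 # 60 fps frame budget
MIN_SCALE, MAX_SCALE = 0.5, 1.0   # clamp between 50% and 100% of output res

def update_render_scale(scale, last_frame_ms):
    scale *= 0.97 if last_frame_ms > TARGET_MS else 1.01
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in [18.0, 19.5, 17.2, 15.8, 14.9]:   # made-up frame times
    scale = update_render_scale(scale, frame_ms)
    print(f"render at {int(3840 * scale)}x{int(2160 * scale)}")

Because the internal resolution drifts from frame to frame, there is no single base resolution to count, whereas DLSS renders from a fixed base (e.g. 1440p) every frame.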
 
Last edited:

Reindeer

Member
Sorry, I did not mean to imply you are an Nvidia fanboy; it was just my general observation of how DLSS is introduced by Digital Foundry and everyone else: "4K enhanced by DLSS". You've got to admit it's strong Nvidia PR that's won out. How did they manage that? Nobody could tell the native resolution; it could not be measured or zoomed in on and seen.

Nope, the temporal UE5 upscaling is better, because DF could not tell. They can tell when DLSS is being used, there are enough videos. DF gave up on pixel counting the Nanite demo, do you want the link?

Also, Nanite has more depth, as it's made of triangles, not a flat texture stuck on top of a mesh - it's why it looked so amazing and like nothing else ever shown.
Guess we'll agree to disagree and wait till comparisons can be made. As Mister Wolf pointed out, DF only had an issue identifying the base resolution of the Unreal demo, which can be explained by it not being constant, as Epic pointed out. There are many DF videos where they were not able to tell the base resolution this gen; this was especially the case in games that use reconstruction techniques, but it is in no way proof of the superiority of such techniques. Superiority can be measured in both image quality and the performance achieved, so I guess my argument was always on the side of performance.

For image quality, DLSS 2.0 is pretty good, sometimes even better than what native resolution achieves, but DLSS 3.0 could be even better and could outshine all other techniques in both visuals and performance. Time will tell. We'll also get a good idea of AMD's solution and how it compares to Nvidia's when RDNA2 GPUs arrive on PC.
 
Last edited by a moderator:

Bo_Hazem

Banned
Just read what you typed, amusing: "4K through DLSS", not "1440p upscaled to 4K using DLSS".

You notice how Nvidia fans don't like to say the native resolution and just state what it was upscaled to?

Maybe it's an Nvidia PR thing; they say "4K enhanced with DLSS 2.0 supercomputer machine learning"...

...which sounds better than "1440p upscaled".

...and how is that better than Nanite? You will get a lot of UE5 games next gen, I bet, so...

Next gen, everything is upscaled and fans argue over which reconstruction technique is better - lol, who would have thought this a few years ago. Hilarious.

Also, the UE5 demo was the most impressive, as there were no flat LODs or textures... everything had rendering depth; it beats out any other renderer to date.

DLSS is great, but I'm not sure it's comparable, as it tends to fake detail and can produce strange results sometimes. All these techniques are great and should mature over time.
 
Last edited: