
RTX 2080 Ti Can Barely Hit 30fps at 1080p Ultra Running Watch Dogs Legion

Siri

Banned
So I get it that at least half of you detest Ubisoft and want to see them go down - but the game was shown with heavy amounts of ray tracing.

The game wasn't running at a locked 30 FPS because it’s unoptimized - it was running that way because of ray tracing.

Are some of you trolling, or do you just have no experience with raytracing?
 
These Legion screenshots look ugly as hell. How is it possible that Watch Dogs 1, a PS3/PS4 cross-gen game, looks better than this?

Some of my screenshots from WD1:

[WD1 screenshots: f1IuBNZ.jpg, aeEwPdv.jpg, VIOas5q.jpg]
 
That benchmark is for an old build, with older drivers. Odyssey, at 4K, maxed out on an RTX 2080 Ti, absolutely does run like a dream. I’ve been playing it for two weeks now. The game is SO smooth.

I have an extremely low tolerance for poorly optimized games too; I can spot frame-time issues without needing on-screen metrics. Stutter of any kind drives me insane.

But I guess if you’re going to argue otherwise there’s not much point in discussing this. I can only tell you what I’m experiencing personally.

I have no reason to doubt you. I sadly only have a 1080, and I have no issues running 1440p on high at a smooth 60 fps. I don't doubt it's been much more optimized since 2018, when that article was originally posted. Don't jump the gun on being defensive; I wasn't attacking you, just holding a conversation. I appreciate what you added with your actual experience versus the 2018 mindset I had. Thanks.
 
I love how much this backfired. Of course an Ubisoft game using a fairly new technology is gonna run like shit, that's almost par for the course.

Got to love the blatant fanboyism and strong implications of "PC is shit". I especially love the "PC architecture needs to be overhauled"... apparently conveniently ignoring the fact that consoles have literally been using PC architecture since 2013.

Salt isn't a good ingredient for threadmaking :messenger_tears_of_joy:
 
So I get it that at least half of you detest Ubisoft and want to see them go down - but the game was shown with heavy amounts of ray tracing.

The game wasn't running at a locked 30 FPS because it’s unoptimized - it was running that way because of ray tracing.

Are some of you trolling, or do you just have no experience with raytracing?

Also, I saw some videos that showed they were NOT using DLSS. That's going to make a huge difference in performance as well. Obviously it's not native 4K at that point, but DLSS is still very impressive for the fidelity and performance gains.
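Quick napkin math on why DLSS being off matters so much here - just a sketch, assuming the typical DLSS 2.0 internal resolutions for 4K output (the actual build could use different ratios):

```python
# Shading cost scales roughly with the number of pixels rendered,
# so DLSS's lower internal resolution removes most of the per-frame work.

def pixels(w, h):
    return w * h

native_4k = pixels(3840, 2160)
dlss_quality = pixels(2560, 1440)      # typical "Quality" internal res for 4K output
dlss_performance = pixels(1920, 1080)  # typical "Performance" internal res for 4K output

print(f"4K native:        {native_4k:,} px")
print(f"DLSS Quality:     {dlss_quality:,} px ({native_4k / dlss_quality:.2f}x fewer)")
print(f"DLSS Performance: {dlss_performance:,} px ({native_4k / dlss_performance:.2f}x fewer)")
```

Scaling isn't perfectly linear (ray counts, fixed per-frame costs, and so on), but shading 2.25-4x fewer pixels is where the big DLSS gains come from.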
 
Also, I saw some videos that showed they were NOT using DLSS. That's going to make a huge difference in performance as well. Obviously it's not native 4K at that point, but DLSS is still very impressive for the fidelity and performance gains.
I'm a 4K purist, but after experiencing DLSS 2.0 I changed my mind when it comes to resolution. I'd like to see DLSS 2.0 implemented in every game, RTX or not. It's just that good.
 
I'm a 4K purist, but after experiencing DLSS 2.0 I changed my mind when it comes to resolution. I'd like to see DLSS 2.0 implemented in every game, RTX or not. It's just that good.

Yeah, from what I've seen in Digital Foundry videos and others, it looks like DLSS 2.0 truly is impressive.
 

Shifty1897

Member
Optimization is usually the last thing games focus on in development; they typically run like shit before then. Also, the game's running on Ultra, so it's possible the game is just future-proofing its graphics options, a feature most PC gamers welcome.

Sensationalizing this is trolling; future-proofed Ultra settings have been a staple of PC gaming for decades. Watch any Digital Foundry comparison video and you'll hear a lot of "the console version runs at a mixture of the low and medium settings on PC."
 
I stopped the money leak too. Ultimately you don't really get a better picture on PC than on consoles anyway (heck, you don't even have good HDR on PC, and the best artists and biggest budgets are on consoles anyway). I know because I played both platforms on the same monitor for years.

On PC you end up trying to stupidly brute force every game with your $2000 hardware because nobody cares about optimizing shit for you. Which means whatever power you bought gets completely underused...

No thanks, back to consoles, where every game is optimized for your platform.
Holy hell that's a load of bullshit lol.
 

SF Kosmo

Al Jazeera Special Reporter
Does this series sell well? I'm surprised they keep making these games.
The first one sold well but got weak reviews. The second sold poorly but got good reviews. Ubi sees these series as longer-term investments that don't always click right away - AC1, Far Cry 2, etc. didn't hit immediately either.
 
RTX 20 Series GPUs as well as next-gen consoles are gonna be real bad for ray tracing. The real ray tracing (60fps at high resolutions) will start with the RTX 30 Series, if the leaked ray tracing performance uplift is true. Nvidia GPUs will also have the advantage of DLSS, something that consoles have no equivalent of.
Xbox does. It's called machine learning, and it also has VRS.
 
So I get it that at least half of you detest Ubisoft and want to see them go down - but the game was shown with heavy amounts of ray tracing.

The game wasn't running at a locked 30 FPS because it’s unoptimized - it was running that way because of ray tracing.

Are some of you trolling, or do you just have no experience with raytracing?
I'm one of "those guys". I didn't even notice it was raytraced. It looks they deleted all the good graphics to finance the RT.

Of course it makes sense once you know that RT is eating the ressources but if your result looks like that.... maybe just stick to old school lighting; it looked good enough.
 

geordiemp

Member
RTX 20 Series GPUs as well as next-gen consoles are gonna be real bad for ray tracing. The real ray tracing (60fps at high resolutions) will start with the RTX 30 Series, if the leaked ray tracing performance uplift is true. Nvidia GPUs will also have the advantage of DLSS, something that consoles have no equivalent of.

The best upscaling seen so far was the UE5 demo; nobody could even tell what the native resolution was, it was that good.

DF said: hands up, we have no idea.

So Nvidia GPUs will have no advantage in upscaling techniques.

Sorry, you can't win 'em all.
 
The best upscaling seen so far was the UE5 demo; nobody could even tell what the native resolution was, it was that good.

DF said: hands up, we have no idea.

So Nvidia GPUs will have no advantage in upscaling techniques.

Sorry, you can't win 'em all.
It would probably be best to actually see what's up - and a game actually running it - before screaming victory.
 

geordiemp

Member
And that superiority claim holds true until we get games actually running whatever upscaling technique was used in that demo.

I have no idea what you're on about; that upscaling technique was real-time gameplay on PS5, running at 1440p, not that anyone could tell or measure it.

So that takes the upscaling crown for now. It was likely some form of temporal technique, but nobody can tell, see, or measure it. We have to be told.

In fact, there is no crown; once you can't tell, nobody cares anymore how it's done. Good that it does not need any vendor-specific hardware, so it will work on all platforms.
 
I have no idea what you're on about; that upscaling technique was real-time gameplay on PS5, running at 1440p, not that anyone could tell or measure it.

So that takes the upscaling crown for now. It was likely some form of temporal technique, but nobody can tell, see, or measure it. We have to be told.
You know perfectly well what I'm talking about. A tech demo isn't admissible because it's just that: a tech demo.
 

geordiemp

Member
You know perfectly well what I'm talking about. A tech demo isn't admissible because it's just that: a tech demo.

No I do not. A tech demo is a cinematic trailer - this was someone playing the game using a controller. It was a short game, but it was gameplay.

It was a gameplay slice and you know it was; you have watched it and know full well the difference.
 

JimboJones

Member
Unreal Engine has some of the best TAA out there. DLSS 2.0 is also very good and can seemingly introduce extra information into the upscaled image in some cases.

Both obviously have their drawbacks: one is exclusive to Unreal Engine, the other to Nvidia.
A more open DLSS-like technology would be ideal, so it could be implemented relatively easily in all games and be engine/graphics-vendor agnostic.
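For anyone curious what "temporal" means mechanically in both of these, here's a toy sketch of the exponential-history accumulation idea behind TAA-style upscalers. This is an illustration only - not DLSS's actual algorithm (which replaces the fixed blend below with a neural network), and not UE's implementation:

```python
import numpy as np

# Toy temporal accumulation: each frame is a cheap, noisy render
# (standing in for a jittered low-res sample), blended into a running
# history, so detail and stability build up across frames. Real TAA/DLSS
# adds motion-vector reprojection and history rejection for moving scenes.

ALPHA = 0.1  # how much of the newest frame replaces the history

def accumulate(history, new_frame, alpha=ALPHA):
    """Exponential moving average - the core of temporal AA/upscaling."""
    if history is None:
        return new_frame
    return (1.0 - alpha) * history + alpha * new_frame

rng = np.random.default_rng(0)
truth = rng.random((8, 8))          # the "real" image we want to resolve
history = None
for _ in range(60):                 # one second of frames at 60fps
    noisy_render = truth + rng.normal(0.0, 0.2, truth.shape)
    history = accumulate(history, noisy_render)

single_frame_err = 0.2 * (2 / np.pi) ** 0.5   # expected abs error of one raw frame
print(f"one raw frame:   ~{single_frame_err:.3f} mean abs error")
print(f"after 60 frames:  {np.abs(history - truth).mean():.3f} mean abs error")
```

The "extra information" comes from combining many slightly different frames over time, which is how these techniques can resolve detail beyond the per-frame render resolution.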
 

Siri

Banned
Also, I saw some videos that showed they were NOT using DLSS. That's going to make a huge difference in performance as well. Obviously it's not native 4K at that point, but DLSS is still very impressive for the fidelity and performance gains.

DLSS 2.0 (as opposed to 1.0) is phenomenal in Control. If that's the future, then the next-gen RTX cards are going to be insanely great.

Honestly, I’m not convinced that most gamers understand what a giant leap DLSS 2.0 was over 1.0. We’re talking almost twenty frames at 4K with virtually no IQ loss.

The first iteration of DLSS was a blurry mess and left me cold. The second iteration left me speechless.
 
No I do not. A tech demo is a cinematic trailer - this was someone playing the game using a controller. It was a short game, but it was gameplay.

It was a gameplay slice and you know it was; you have watched it and know full well the difference.
No, a tech demo is a demo - you know, as in a DEMONSTRATION, to DEMONSTRATE the capabilities of something. That's exactly what this was.
 

geordiemp

Member
No, a tech demo is a demo - you know, as in a DEMONSTRATION, to DEMONSTRATE the capabilities of something. That's exactly what this was.

It ran at 30 FPS in real time, was made for GDC (which got pulled), and was played on PS5 with a controller.

It does not matter if the game was 5 minutes, 15 minutes or 50 hours; the technique was better than Nvidia's, going by comments from Digital Foundry.

Nvidia fans don't like to hear that; they like to think everything the green team does is the best. PC will get it as well, so stop being so elitist.

Is it upsetting that everyone will get perfect upscaling?
 
It does not matter if the game was 5 minutes, 15 minutes or 50 hours; the technique was better than Nvidia's, going by comments from Digital Foundry.
Bullshit it doesn't. How many times have we seen gameplay snippets turn out to be nothing like the real thing because they were 10-minute bits as opposed to the full-blown thing?

Plus we don't even know the ins and outs of the technique. Is it something that can be easily implemented? Or is it some bullshit-tier trick that needs AI training like DLSS, which was promised in a bunch of games that ended up never having it? (Darksiders III, I'm looking at you.)

We saw DLSS in action prior to its release. We saw it implemented in games and it was a dog turd, and it took over a year for DLSS 2.0 (and it's available in like 5 games) to give something worthwhile. Yet we're supposed to just roll with that UE5 demo because a 5-minute demo showed it?

Gonna have to go with no.

Show me games that run it. I know you can't, so I won't even bother with you anymore.
 

geordiemp

Member
Bullshit it doesn't. How many times have we seen gameplay snippets turn out to be nothing like the real thing because they were 10-minute bits as opposed to the full-blown thing?

Plus we don't even know the ins and outs of the technique. Is it something that can be easily implemented? Or is it some bullshit-tier trick that needs AI training like DLSS, which was promised in a bunch of games that ended up never having it? (Darksiders III, I'm looking at you.)

We saw DLSS in action prior to its release. We saw it implemented in games and it was a dog turd, and it took over a year for DLSS 2.0 to give something worthwhile. Yet we're supposed to just roll with that UE5 demo because a 5-minute demo showed it?

Gonna have to go with no.

Show me games that run it. I know you can't, so I won't even bother with you anymore.



:messenger_beaming:
 

vkbest

Member
Then it's a very badly optimized game on PC. We've had this before with many other titles, like Batman: Arkham Knight.

So if the 2080 Ti runs Cyberpunk down the line on the same settings, what will we say then?

Wasn't Cyberpunk running under 1080p/60fps on a 2080 Ti with DLSS 2 activated (540p or 720p native res) in the last demo?
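For reference, those two internal resolutions line up exactly with the standard DLSS 2.0 scale factors for 1080p output - a quick check, assuming Cyberpunk uses the usual mode ratios (not confirmed for that demo):

```python
# 720p -> 1080p matches DLSS "Quality" (2.25x pixel upscale);
# 540p -> 1080p matches "Performance" (4x). Same ratios as the 4K modes.

output_w, output_h = 1920, 1080

for mode, (w, h) in {"Quality (720p)": (1280, 720),
                     "Performance (540p)": (960, 540)}.items():
    factor = (output_w * output_h) / (w * h)
    print(f"{mode}: {factor:.2f}x pixel upscale to 1080p")
```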
 

RedVIper

Banned
Turn off ray tracing: barely a visual difference, and suddenly the game runs 4 times better.

Metro had the exact same issue: ray tracing had a negligible visual impact but halved the performance (in the best-case scenario; in the worst case you got 20% of the performance).
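Putting those multipliers into frame-time terms makes the cost concrete - a quick sketch using the rough figures from the post above, not measured data:

```python
# Frame-rate hits are easier to reason about as frame times (ms per frame).

def fps_to_ms(fps):
    return 1000.0 / fps

base_fps = 60.0                 # hypothetical baseline without RT
best_case = base_fps / 2        # "halved the performance"
worst_case = base_fps * 0.20    # "20% of the performance"

for label, fps in [("no RT", base_fps),
                   ("RT best case", best_case),
                   ("RT worst case", worst_case)]:
    print(f"{label:14s} {fps:5.1f} fps = {fps_to_ms(fps):5.1f} ms/frame")
```

Read that way, "halved performance" means RT alone costs as many milliseconds as the entire rest of the frame combined.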
 

Dampf

Member
With ray tracing, which is very expensive even on the 2080 Ti. Thankfully next-gen consoles have their own special way of using ray tracing that doesn't give a massive hit - one a hell of a lot more so than the other, as it was a bigger focus in its design.
Sources? We don't know how consoles perform compared to RTX GPUs. If anything is an indicator, it's a little worse, given how Minecraft DXR runs on both.
 
All right PC master race, you just can't take it when DF said it was the best upscaling technique they had seen.

All you can do in riposte is insult.
It's also the best looking game ever, right?

I clearly addressed the flaw in your argument and you keep playing the troll-card.
 

nani17

are in a big trouble
Wasn't Cyberpunk running under 1080p/60fps on a 2080 Ti with DLSS 2 activated (540p or 720p native res) in the last demo?

Yes, another unfinished title. Do you know how many people own a 2080 Ti in the world? Think about it: that would mean more than 90% of PC gamers wouldn't be able to play the final game if that were the case.
 

JimboJones

Member
Sources? We don't know how consoles perform compared to RTX GPUs. If anything is an indicator, it's a little worse, given how Minecraft DXR runs on both.
He's just making stuff up.
It will most likely be either a selected PC setting or a console-specific quality setting somewhere between PC medium and very high.
 