
RTX 2080 Ti Can Barely hit 30fps at 1080p ultra running Watch Dogs Legion

ZywyPL

Banned
Uploaded 2 hours ago, quite interesting and seems to relate to what's happening here:



Good tip, but it was much better to simply not get current RTX cards at all to begin with ;) But I get it, someone (NV) had to do it, RT was a needed baby step to move things forward, so the future generations of hardware can shine, just like consoles launched with shitty Jaguar just to jump on the x86 architecture, so we can have now Zen2 CPUs, and in the exact same fashion, the upcoming RDNA2 and Ampere GPUs will show how it's done.
 

april6e

Member
As a PC newbie, all this jargon makes my head spin. I just want a card that can play any AAA game in 4K 60fps. :messenger_loudly_crying:
 
Last edited:

Siri

Banned
As a PC newbie, all this jargon makes my head spin. I just want a card that can play any AAA game in 4K 60fps. :messenger_loudly_crying:

At max settings?

Might be the RTX 3080 TI. Even the 2080 TI struggles with certain high-end games, such as the incredibly beautiful Red Dead Redemption 2.
 

GHG

Member
This guy needs to STFU.

tenor.gif


At max settings?

Might be the RTX 3080 TI. Even the 2080 TI struggles with certain high-end games, such as the incredibly beautiful Red Dead Redemption 2.

Max settings are so overrated in a lot of cases. With RDR2 in particular there is no perceptible difference between high and max for a lot of the available settings other than the fact that they tank your framerate. Tree tessellation I'm looking at you. The same goes for other games like AC Odyssey.

The 2080ti is a perfectly good 4k 60fps card, especially as DLSS 2.0 becomes more prevalent.
 
Last edited:

shoegaze

Member
Watch Dogs: Legion on GeForce RTX 2080 Ti: 1080p can barely hit 30FPS
Ubisoft locked the preview build of Watch Dogs: Legion to 1080p 30FPS -- RTX 2080 Ti could barely handle that.

In their hands-on time with Watch Dogs: Legion, Digital Foundry may have been using the most powerful graphics card you can buy in the GeForce RTX 2080 Ti -- yet it couldn't handle the game at 1080p and anything over 30FPS. Ubisoft had locked Watch Dogs: Legion's preview build to 30FPS, for some stupid reason.






484k09.jpg

I'm sure it's just Ubisoft being shit at coding their games rather than 2080ti being weak.
 

pawel86ck

Banned
I'm not surprised, because the RT cores on the 2080ti can't run every RTX effect at 1080p 60fps, so there's a huge bottleneck. I'm sure the game will run at 4K 60fps without RTX and something like high settings instead of ultra.

But the RT cores don't need to bottleneck the entire GPU as long as developers don't exceed their ray budget for 1080p 60fps, and later on they can upscale the RTX effects to 4K. That's what GT7 on PS5 is doing, so the game runs at 4K and still uses RT.

For example, in Metro Exodus the 2060S loses only 3-5fps with RTX at 1080p, so if developers balance their ray budget, the RT performance impact is minimal. Watch Dogs 3 will probably run at 4K 30fps with upscaled RT effects on both PS5 and XSX.
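The ray-budget idea above can be put in rough numbers. This is a back-of-the-envelope sketch with made-up figures (real games vary rays per pixel per effect), just to show why tracing at 1080p internally and upscaling keeps the RT cost fixed regardless of output resolution:

```python
def rays_per_frame(width: int, height: int, rays_per_pixel: float) -> float:
    """Total rays cast per frame for a given internal RT resolution."""
    return width * height * rays_per_pixel

# Hypothetical budget: 1 ray per pixel at 1080p.
budget = rays_per_frame(1920, 1080, 1.0)

# Tracing at native 4K with the same rays-per-pixel needs 4x the budget...
native_4k = rays_per_frame(3840, 2160, 1.0)
print(native_4k / budget)  # 4.0

# ...so to stay inside the budget at 4K output, a game can trace at 1080p
# internally (or cast ~0.25 rays per pixel) and upscale the RT buffers.
```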
 

Kumomeme

Member
From the gameplay trailer I'm not surprised if Ubisoft turns this into a GTA Online or battle royale kind of game later LOL
 
Last edited:

Hunnybun

Member
Welcome to ray tracing.

Battlefield V with Ray Tracing turned on pushes a 2080ti to its limits and it never hits a stable 60 FPS, at 1080p. Yes, 1080p... not 4K.

That's why this notion that the new consoles will knock out AAA games at native 4K/60fps plus Ray Tracing is a joke. Only in the most simple indie game would that be possible.

Subnative 4K resolutions (likely 1440p upscaled), 30fps, and ray tracing is the combo you are likely to see most next gen. Maybe ray tracing will get better on the software side as we go, but I wouldn't expect some sort of huge change with regards to target resolution and performance on next-gen consoles.
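For anyone wondering why 1440p upscaled is such a common target, the pixel math alone tells the story (plain arithmetic, nothing game-specific):

```python
def pixels(width: int, height: int) -> int:
    """Pixel count for a given resolution."""
    return width * height

p1440 = pixels(2560, 1440)  # 3,686,400 pixels
p2160 = pixels(3840, 2160)  # 8,294,400 pixels

# Native 4K shades 2.25x the pixels of 1440p -- roughly the headroom
# an upscaler buys back before the cost of the reconstruction pass.
print(p2160 / p1440)  # 2.25
```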

Hope they give you the option to switch Ray Tracing on/off in most games because implementation will vary. Look at videos that compare Control with Ray Tracing turned on and off. It's up to you if you think the difference is worth it. IMO it is not worth annihilating your performance over.

Isn't Gran Turismo 7 doing ray tracing at 4k 60 fps?

The developers of Observer System Redux have said the same about their game, too.
 

pawel86ck

Banned
Turn off ray tracing, barely a visual difference and suddenly the game runs 4 times better.

Metro had the exact same issue, raytracing had a negligible visual impact, but halved the performance (In the best case scenario, worst case scenario you got 20% of the performance)
The RTX performance penalty is not big in Metro Exodus at 1080p, and RT GI makes a clear difference in this game.

5G7dKsC.png
 
Last edited:

supernova8

Banned
Ubisoft games never look good to me. Even Assassins Creed Origins, which looks better than most, I still find underwhelming compared to the best PS4 games.

I'm not sure if it's just me.

Only impressive Ubisoft stuff recently has been the Rainbow Six Siege and Watch Dogs 1 reveal back in like.. years ago..? And those turned out to be bullshit in the end.
 

Hunnybun

Member
Only impressive Ubisoft stuff recently has been the Rainbow Six Siege and Watch Dogs 1 reveal back in like.. years ago..? And those turned out to be bullshit in the end.

Yeah, and The Division.

Probably the worst thing about them revealing bullshit trailers is they don't even make attractive games compared to what's actually possible.

So the downgrades are ludicrously drastic.

That Ghost Recon game was another one.
 
Good tip, but it was much better to simply not get current RTX cards at all to begin with ;) But I get it, someone (NV) had to do it, RT was a needed baby step to move things forward, so the future generations of hardware can shine, just like consoles launched with shitty Jaguar just to jump on the x86 architecture, so we can have now Zen2 CPUs, and in the exact same fashion, the upcoming RDNA2 and Ampere GPUs will show how it's done.
I disagree. 2080Ti was an amazing upgrade for those who could afford it. It provided ultimate performance for 2 years.
 
  • Like
Reactions: GHG

Bo_Hazem

Banned
Uploaded 2 hours ago, quite interesting and seems to relate to what's happening here:



kkkkkk.jpg



Hmm, sounds like the theory of a 36+36=72CU PS5 Pro by 2023-2024 is pretty much in line with AMD's roadmap of only using 40CU dies max (40+40=80CU for the premium model of Big Navi). Just like PS4 Pro's 18+18=36CU butterfly-like chiplet, but this time around it could be stacked.
 

ZywyPL

Banned
I disagree. 2080Ti was an amazing upgrade for those who could afford it. It provided ultimate performance for 2 years.

I don't know, I'm still rocking very hard with my 1080Ti @2.1GHz, I just don't have some RT effects in, what, 6-7 titles? Which I'm personally not even interested in to begin with? RT won't seriously take off until PS5/XBX launch and the vast majority of games are made with RT effects in mind from the get-go, and before that happens the Ampere and Big Navi GPUs should be available and wipe the floor with 20xx RTX cards when it comes to RT performance.
 
I don't know, I'm still rocking very hard with my 1080Ti @2.1GHz, I just don't have some RT effects in, what, 6-7 titles? Which I'm personally not even interested in to begin with? RT won't seriously take off until PS5/XBX launch and the vast majority of games are made with RT effects in mind from the get-go, and before that happens the Ampere and Big Navi GPUs should be available and wipe the floor with 20xx RTX cards when it comes to RT performance.
RT is just a cherry on top; I was talking more about general performance in 4K gaming. Again, the only downside to the 2080Ti is the price. Let's say money is a non-issue to you. Why would you not pick up a 2080Ti?
 

Arun1910

Member
Ubisoft games never look good to me. Even Assassins Creed Origins, which looks better than most, I still find underwhelming compared to the best PS4 games.

I'm not sure if it's just me.

Ubisoft worlds are some of the biggest you will come across; they have to sacrifice something.
 

ZywyPL

Banned
RT is just a cherry on top; I was talking more about general performance in 4K gaming. Again, the only downside to the 2080Ti is the price. Let's say money is a non-issue to you. Why would you not pick up a 2080Ti?

Nah, I'm already playing at 60-120FPS in 4K, so why bother with even more? When I'm searching for a new card I'll expect similar performance, except with RT turned on.
 

Skyr

Member
The 2080ti may not be the card for RT, but to say it was a bad buy is ridiculous, especially in the current market situation.
You can sell it right now with minimal to no loss.
 

Hunnybun

Member
I kinda feel like the original Unity and Watch Dogs concepts were just that. Great looking games, but then the consoles turned out to be shit tier hardware so they had to keep downgrading the games.

I think Unity is the one big Ubisoft game that looked really good (I presume the PC game looks pretty much like the reveal?).

All the others seem to be technically mediocre and artistically awful.
 
I think Unity is the one big Ubisoft game that looked really good (I presume the PC game looks pretty much like the reveal?).

All the others seem to be technically mediocre and artistically awful.
Watch Dogs 1 with mods looks amazing to this day. Unity is hands down one of the better-looking games this generation.
 
Last edited:

diffusionx

Gold Member
Ubisoft games never look good to me. Even Assassins Creed Origins, which looks better than most, I still find underwhelming compared to the best PS4 games.

I'm not sure if it's just me.

Yea, it might be you. One thing Ubi does right is production values. Games like Wildlands, FC5, AC Odyssey, and The Division are some of the best looking of the gen.
 
Last edited:

Hunnybun

Member
Yea, it might be you. One thing Ubi does right is production values. Games like Wildlands, FC5, AC Odyssey, and The Division are some of the best looking of the gen.

I actually looked up The Division 2 on max settings after it was recommended above, and it looked fucking ugly to me. Definitely below average.

Wildlands from what I remember is ugly too. The Division is just about average.

Far Cry 5 isn't bad looking, but nothing like one of the best of the generation.
 

Rikkori

Member
I actually looked up The Division 2 on max settings after it was recommended above, and it looked fucking ugly to me. Definitely below average.

Wildlands from what I remember is ugly too. The Division is just about average.

Far Cry 5 isn't bad looking, but nothing like one of the best of the generation.

below average
:goog_rolleyes::goog_rolleyes::goog_rolleyes::goog_rolleyes:

49626908328_668e9a7498_o.jpg


33644562448_380f486e93_o.png


47491335081_5f0723a4b9_o.png


49650895666_51a1b38aa1_k.jpg


49591758378_9bf4199725_o.jpg


49619744962_ba4e133904_o.jpg
 

Rikkori

Member
Dunno what to tell you. I watched footage on YouTube of the PC version with max settings.

The game sure looked nothing like those screens.

Maybe you could link to an alternative video?

You can't judge based on a YouTube video, because games like TD2 are foliage-heavy, which looks worse on YouTube due to the bitrate and how foliage looks in highly compressed video. Same way you shouldn't judge racing game quality on YouTube etc.

You can find high quality videos here, but you need to log-in to download: https://www.gamersyde.com/news_washington_d_c_in_hdr-21503_en.html
And ofc, they're HDR, so you'd need a display that can show it.
 

WorldHero

Member
The thing about ray tracing is that if a game isn't made with it in mind from the very beginning, it turns into a VERY expensive effect. Don't get me wrong, when implemented correctly it looks wonderful, but right now it's too costly on performance. At this rate, a 4K, ultra settings, high/ultra RT settings game at 60fps is highly unlikely on current cards.
 

Dontero

Banned
I played Div2. The screenshots show the game at its best, not what it looks like on average. The game does look average for this gen. The graphics simply don't stand out from the rest.

Here is a similar example, GT5. This is a screenshot I made years ago back at GT5's premiere on PS3:

unknown.png


Here is the same game from a different angle with a different car on a different track:

2579910-5350043984-Gran_.jpg
 
Last edited:

Reindeer

Member
The best upscaling seen so far was the UE5 demo; nobody could even tell what the native resolution was, it was that good.

DF said hands up, we have no idea.

So Nvidia GPUs will have no advantage in upscaling techniques.

Sorry, you can't win em all.

"So Nvidia GPUs will have no advantage in upscaling techniques."

This is nothing but assumption on your part as we have no way to test the two against each other right now. That Unreal Demo's upscaling also isn't looking all that great for performance as it's running at 1440p natively most of the time with little going on in the demo and it only runs at 30fps. I would say for now DLSS definitely takes the performance crown.
 
Last edited by a moderator:

magnumpy

Member
Watch Dogs: Legion on GeForce RTX 2080 Ti: 1080p can barely hit 30FPS
Ubisoft locked the preview build of Watch Dogs: Legion to 1080p 30FPS -- RTX 2080 Ti could barely handle that.

In their hands-on time with Watch Dogs: Legion, Digital Foundry may have been using the most powerful graphics card you can buy in the GeForce RTX 2080 Ti -- yet it couldn't handle the game at 1080p and anything over 30FPS. Ubisoft had locked Watch Dogs: Legion's preview build to 30FPS, for some stupid reason.






484k09.jpg


2080ti not enough for next gen, years of hard labor and years of debt is your sentence for choosing to participate in next generation :(
 

geordiemp

Member
"So Nvidia GPUs will have no advantage in upscaling techniques."

This is nothing but assumption on your part as we have no way to test the two against each other right now. That Unreal Demo's upscaling also isn't looking all that great for performance as it's running at 1440p natively most of the time with little going on in the demo and it only runs at 30fps. I would say for now DLSS definitely takes the performance crown.

And it's nothing but an assumption on your part to state otherwise. I would say UE5 definitely takes the performance crown; can't recall anything that impressive being played in real time, so...

And Nanite renders in 4.25 ms, fine for 60 FPS; it's the Lumen GI that is being worked on to get it to 60.
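To see why 4.25 ms is "fine for 60 FPS", the frame-time budget math works out like this (the 4.25 ms geometry figure is the one quoted above; the rest is plain arithmetic):

```python
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at a given framerate."""
    return 1000.0 / fps

budget_60 = frame_budget_ms(60)  # ~16.67 ms per frame at 60 FPS
geometry = 4.25                  # quoted Nanite geometry pass cost

# Everything else (Lumen GI, shading, post-processing) has to fit
# inside the remaining slice of the frame.
remaining = budget_60 - geometry
print(round(remaining, 2))  # 12.42
```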

If you prefer DLSS that's fine, I disagree. The UE5 demo was more impressive than any PC game with RT, and it will also be on PC, so everybody wins.

sDTvKY4.png


T5OSo23.png
 
Last edited: