
RTX 2080 Ti can barely hit 30fps at 1080p ultra running Watch Dogs Legion

nani17

are in a big trouble
Then it's a very badly optimized game on PC. We've seen this before with many other titles, like Batman: Arkham Knight.

So if the 2080 Ti runs Cyberpunk like this down the line on the same settings, what will we say then?
 

skneogaf

Member
My PC though...

[Photo of the PC: DSC-4133.jpg]


Radeon VII, Ryzen 7 2700X, 32GB RAM, 1TB 3.5GB/s Samsung 970 Pro, 2TB WD SATA3 SSD, etc. I could buy two 2080 Tis today if I wanted to; I went for the Radeon VII for the 16GB of VRAM and 1TB/s HBM2 bandwidth, for productivity, not gaming.

My 2nd PC is better than that, to be honest, but as you said it's not for gaming, so I'll let you off.

You obviously don't care about graphics at the highest possible settings and fps, or you would not have chosen AMD for PC and also be so obsessed with PlayStation; both are far behind Intel, Nvidia and Xbox for gaming in terms of top-end graphics quality, etc.
 

Stuart360

Member
I think it's worth remembering that Watch Dogs 2 was so badly CPU dependent that tech sites used to use it for benchmarking CPUs, no joke lol.
Watch Dogs 3 will run at 30fps on XSX and PS5, I'll bet anything on that.
 

Hairsplash

Member
Funny... but seriously, reduce the "useless" eye-candy "ultra" settings and it will run at a far higher FPS... RTX ray tracing is "amazing" in a "reality" way... but not worth the frame rate hit.
 

skneogaf

Member
Technically the weakest system would be Lockhart, and that system is supposed to handle next-gen games at 1080p.

True, but then you could argue that the weakest potato PC that will play this without crashing is the weakest system, so technically it's best to compare against the system that's the direct competition.
 

Krisprolls

Banned
Ubisoft can't code for shit, as always.

Valhalla is also locked at 30fps.

Not saying the 2080 Ti can't do more.

But as in the past, consoles can perform better than equivalent PC cards, and not just by a little.

This is why I'd rather settle for a $500 console and expect nothing than spend $1,000+ on a PC, find out it's not even being fully used, and still get almost identical results to the cheaper alternative.

PC gaming is a waste of money. I used to be a PC gamer for a few years prior to the PS3.
I enjoyed all the PC exclusives like Half-Life 2, Soldier of Fortune 2, Doom 3, Aliens vs. Predator 2, S.T.A.L.K.E.R. and many, many more until the PS3 showed up.
But I spent thousands on it. Back then I didn't care; it was the only thing I did.
But now I just can't justify the cost.

I stopped the money leak too. Ultimately you don't really get a better picture on PC than on consoles anyway (heck, you don't even have good HDR on PC, and the best artists and biggest budgets are on consoles anyway). I know because I played both platforms on the same monitor for years.

On PC you end up stupidly trying to brute-force every game with your $2,000 hardware because nobody cares about optimizing shit for you, which means whatever power you bought gets completely underused...

No thanks, back to consoles, where every game is optimized for your platform.
 

skneogaf

Member
To be fair, neither of the high-end systems will play this at native 4K 60fps with everything turned all the way up. PC will be the only option for that.

Yeah, it's definitely a 30fps game for consoles, unless VRR allows the frame rate to go nuts and hit 40fps!
 

Bo_Hazem

Banned
My 2nd PC is better than that, to be honest, but as you said it's not for gaming, so I'll let you off.

You obviously don't care about graphics at the highest possible settings and fps, or you would not have chosen AMD for PC and also be so obsessed with PlayStation; both are far behind Intel, Nvidia and Xbox for gaming in terms of top-end graphics quality, etc.

Nope, I love native 4K and would love 8K gaming in the future, but console gaming is much more convenient for me. I'd play Star Citizen on PC when it's ready. I was going for a workstation build, but the Radeon VII seemed like a good hybrid, and it's good to have it gaming-ready just in case. I might jump to Big Navi if it's worth the upgrade; the PC has already paid for itself.

Both machines should be more efficient, more stable and more consistent. PC gaming is not my cup of tea.
 

Mister Wolf

Gold Member
Using ray tracing for lighting is the best option for open-world/sandbox games (Metro Exodus, Control, Cyberpunk, Dying Light 2). Everyone knows this except Ubisoft, since they used it for reflections instead.
 

Stuart360

Member
Nope, I love native 4K and would love 8K gaming in the future, but console gaming is much more convenient for me. I'd play Star Citizen on PC when it's ready. I was going for a workstation build, but the Radeon VII seemed like a good hybrid, and it's good to have it gaming-ready just in case. I might jump to Big Navi if it's worth the upgrade; the PC has already paid for itself.

Both machines should be more efficient, more stable and more consistent. PC gaming is not my cup of tea.
Don't forget Digital Foundry's budget i3/GTX 750-powered machine pulling better results than the PS4 at launch, a system the PS4 should have been beating in games.
The efficiency thing is a bit of a myth with consoles vs PCs. It only becomes a reality in the last couple of years of a gen, because why spend a ton of time and money optimizing console ports for PC when the 1060, 2060 and 2070 are the most used GPUs?
 

GHG

Member
I wouldn't really say it runs like a 'dream' at 4K on a 2080 Ti. It gets an average framerate of 50 in the benchmark and can drop into the 30s in bigger cities.

And yet turning down one stupidly unoptimised setting (volumetric clouds) from ultra to high will net you 60fps at 4K pretty much everywhere on a 2080 Ti. Even then, the cloud setting is still much higher than the one used on consoles.
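
To put those numbers in frame-time terms, here's a quick back-of-the-envelope sketch (the 35fps figure is just a stand-in for "the 30s" mentioned above; the conversion is the standard 1000 ms divided by fps):

```python
# Frame-time budget at a given framerate: 1000 ms / fps.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

scenarios = [
    ("4K ultra, benchmark average", 50),  # figure quoted above
    ("4K ultra, big-city drops", 35),     # stand-in for "the 30s"
    ("4K, clouds ultra -> high", 60),     # figure quoted above
]
for label, fps in scenarios:
    print(f"{label}: {fps} fps = {frame_time_ms(fps):.1f} ms/frame")

# Going from 50 fps (20.0 ms) to 60 fps (16.7 ms) means that one cloud
# setting alone has to give back roughly 3.3 ms of GPU time per frame.
```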
 

GHG

Member
The final game will have DLSS 2.0 by the way.

If you're going to be enabling ray tracing, you'd be stupid not to enable DLSS too.
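
For anyone wondering what DLSS 2.0 actually buys you: it renders internally at a lower resolution and reconstructs up to the output resolution. A rough sketch of the internal resolutions for a 4K output, using the commonly cited per-axis scale factors (treat the exact percentages as approximate):

```python
# Commonly cited DLSS 2.0 per-axis scale factors (approximate).
MODES = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.50}

out_w, out_h = 3840, 2160  # 4K output
for mode, scale in MODES.items():
    w, h = round(out_w * scale), round(out_h * scale)
    frac = (w * h) / (out_w * out_h)
    print(f"{mode}: ~{w}x{h} internal (~{frac:.0%} of the 4K pixel count)")
```

So even in Quality mode the GPU is shading well under half the pixels of native 4K, which is why it pairs so naturally with ray tracing.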
 

skneogaf

Member
Makes more sense for developers to lock it at 30fps since not everyone will have VRR.

Yes, definitely, but I had hoped both the PS5 and Xbox Series X would be able to unlock the frame rate at the system level, as VRR would be somewhat useless otherwise, like it kind of is now on the Xbox One X. Only a few games utilise the feature on the One X, and that, I believe, is unintended.
 

skneogaf

Member
Nope, I love native 4K and would love 8K gaming in the future, but console gaming is much more convenient for me. I'd play Star Citizen on PC when it's ready. I was going for a workstation build, but the Radeon VII seemed like a good hybrid, and it's good to have it gaming-ready just in case. I might jump to Big Navi if it's worth the upgrade; the PC has already paid for itself.

Both machines should be more efficient, more stable and more consistent. PC gaming is not my cup of tea.

I can't argue with any of that, unfortunately.
 

GetemMa

Member

The ones that use Ray Tracing.

RT was added to BFV after launch, so initial benchmarks will not include RT figures. I don't think the ones you are using here do.

It seems EA has patched BFV since I last looked at these numbers, and it has improved framerates with ray tracing enabled. But they are still getting pretty low performance on what is a $1,200 GPU.

There are multiple levels of RT detail that can be implemented (DXR High/Med/Low). But RT at High settings means they are hovering around 1440p and 60fps in most scenes. Some scenes may use higher amounts of RT than others, so performance can vary.

This article has charts from both pre-patch and post-patch.


I am also hugely skeptical of the value of "Low" RT settings, as TechSpot pointed out:

"DXR off is still 56% faster than DXR Low and in our test area there is essentially no visual difference between the two modes. "

IMO if you are not running RT at high settings you are barely running it at all, while it still destroys your framerate.
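
To make that TechSpot quote concrete: "56% faster" means DXR off runs at 1.56x the framerate of DXR Low. A tiny sketch (the baseline fps here is hypothetical, purely for illustration; only the ratio comes from the quote):

```python
# TechSpot: "DXR off is still 56% faster than DXR Low".
dxr_low_fps = 60.0                # hypothetical baseline, not a measured number
dxr_off_fps = dxr_low_fps * 1.56  # "56% faster" = 1.56x the framerate
print(f"DXR Low: {dxr_low_fps:.0f} fps")
print(f"DXR off: {dxr_off_fps:.0f} fps (with essentially no visual difference)")
```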

I think it would be a mistake for next-gen consoles to obsess over ray tracing. At least give us the option to turn it on or off and increase performance. I would take native 4K/60fps with RT off over sub-native 4K/30fps with RT turned on at this point.
 
Both are more efficient than PCs with what they have, I believe. PC architecture needs to be overhauled.

I dunno if it's that, when both MS's and Sony's systems are using that same PC architecture. Same x86-based CPUs. Same GPUs based on AMD tech that's also in PCs (by and large). Same type of VRAM. Same northbridge/southbridge setup. Same storage technology, etc.

The consoles today are more like PCs than ever before. They just do a few things better, but it's got nothing to do with having non-PC architectures tbh.

Welcome to ray tracing.

Battlefield V with ray tracing turned on pushes a 2080 Ti to its limits, and it never hits a stable 60fps at 1080p. Yes, 1080p... not 4K.

That's why the notion that the new consoles will knock out AAA games at native 4K/60fps plus ray tracing is a joke. Only in the simplest indie games would that be possible.

Sub-native 4K resolutions (likely 1440p upscaled), 30fps, and ray tracing is the combo you are likely to see most next gen. Maybe ray tracing will get better on the software side as we go, but I wouldn't expect some huge change with regard to target resolution and performance on next-gen consoles.
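
The pixel arithmetic shows why 1440p-upscaled is the obvious compromise (plain resolution math, nothing game-specific):

```python
# Native 4K vs the 1440p internal resolution mentioned above.
pixels_4k = 3840 * 2160      # 8,294,400 pixels
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
print(f"4K renders {pixels_4k / pixels_1440p:.2f}x the pixels of 1440p")
# => 2.25x: dropping to 1440p and upscaling frees up more than half of
#    the per-frame shading work, which is where the RT budget comes from.
```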

I hope they give you the option to switch ray tracing on/off in most games, because implementation will vary. Look at videos that compare Control with ray tracing turned on and off. It's up to you whether you think the difference is worth it. IMO it is not worth annihilating your performance over.

I dunno man; Bright Memory Infinite's RT gameplay looked very smooth and polished and didn't seem to hold back performance. That was running on specs roughly equivalent to a Series X IIRC, during the presentation.

Maybe it's more so the combination of RT and the scope of an open-world game like WD2? Smaller games like BMI and Project Mara, I think, won't have any issues with heavy RT because they don't have the equivalent of open-world physics, simulation, logic etc. to run simultaneously. So if people were expecting that level of RT in a WD2, Valhalla or the next-gen GTA, they need to put those expectations on the shelf. It's just too much for those types of games.
 

Bo_Hazem

Banned
I dunno if it's that, when both MS's and Sony's systems are using that same PC architecture. Same x86-based CPUs. Same GPUs based on AMD tech that's also in PCs (by and large). Same type of VRAM. Same northbridge/southbridge setup. Same storage technology, etc.

The consoles today are more like PCs than ever before. They just do a few things better, but it's got nothing to do with having non-PC architectures tbh.

I don't agree with that, but even assuming so, I'm more than sure they'll get much better optimization than PCs, which have an insanely wide variety of setups.
 

Jtibh

Banned
Nope, I love native 4K and would love 8K gaming in the future, but console gaming is much more convenient for me. I'd play Star Citizen on PC when it's ready. I was going for a workstation build, but the Radeon VII seemed like a good hybrid, and it's good to have it gaming-ready just in case. I might jump to Big Navi if it's worth the upgrade; the PC has already paid for itself.

Both machines should be more efficient, more stable and more consistent. PC gaming is not my cup of tea.
8K flat chest abbey with a 6-inch clit.
Mmmmm, just imagine, Bo.
HDR... ray-traced lighting on those muscles, just waiting for you to stick it in.
 

ShirAhava

Plays with kids toys, in the adult gaming world
The 1080 Ti had a good run,

but man, the 2080 Ti is gonna age like shit in the sun.

If you own one, sell it... ASAP.
 

MiguelItUp

Member
Eh, it's Ubisoft and an early build. Who knows what the problem is. That being said, I wouldn't be surprised if it releases and is just as bad. It's not like Ubisoft is known for absolutely amazing performance and optimization in their engines.
 

Rathalos

Banned
This says more about the game than the hardware. Ubisoft games often run like shit without really looking good enough to warrant it. That being said, there are always a few settings you can turn down to high to massively improve your experience without any real loss in quality.
 

GetemMa

Member
I dunno if it's that, when both MS's and Sony's systems are using that same PC architecture. Same x86-based CPUs. Same GPUs based on AMD tech that's also in PCs (by and large). Same type of VRAM. Same northbridge/southbridge setup. Same storage technology, etc.

The consoles today are more like PCs than ever before. They just do a few things better, but it's got nothing to do with having non-PC architectures tbh.



I dunno man; Bright Memory Infinite's RT gameplay looked very smooth and polished and didn't seem to hold back performance. That was running on specs roughly equivalent to a Series X IIRC, during the presentation.

Maybe it's more so the combination of RT and the scope of an open-world game like WD2? Smaller games like BMI and Project Mara, I think, won't have any issues with heavy RT because they don't have the equivalent of open-world physics, simulation, logic etc. to run simultaneously. So if people were expecting that level of RT in a WD2, Valhalla or the next-gen GTA, they need to put those expectations on the shelf. It's just too much for those types of games.

Yeah, I agree. RT's impact on performance will certainly vary from game to game. AAA open-world games seem almost impossible to fit proper RT into while maintaining acceptable resolution and performance. I just hope developers don't obsess over RT at the expense of everything else, because IMO it wouldn't be worth it. Gimme native resolution and high FPS over accurate light reflections any day of the week.
 

GymWolf

Member
Odyssey runs like a dream at 4K on an RTX 2080 TI - and the amount of detail is staggering. You’d need over a thousand people working on a game to pull that off.

Very few studios have that clout.
Odyssey (and Origins) were a disaster at launch (and even later); the CPU usage was off the charts, and the game doesn't even look noticeably better than some exclusive open-world games on console.

Ubisoft is not exactly what I'd call a good developer on PC...
 

IntentionalPun

Ask me about my wife's perfect butthole
The 2080 Ti will be over 2 years old when these consoles release...

And this game likely isn't optimized yet... we have no clue what the final performance will be... (and honestly, it's Ubisoft, "final performance" will come 6 months after release lol)

Great thread Bo-bro.
 

Skyr

Member
Just took a look at the DF video. A good example of how RTX doesn't make a bad-looking game magically look good.
 

IntentionalPun

Ask me about my wife's perfect butthole
the 2080 Ti is still $1,300 in 2020 bro

Sure, but PC gamers aren't expecting to run next-gen at "4K120" with 2-year-old cards; it's about the upcoming generation of cards.

And that will include much more affordable options than the ridiculous 80/80 Ti series cards.

Most of us actually don't give half a shit about running games on "Ultra" either; it's about buying what you can afford and then running the games how you want to. Crank details if you care less about framerate, or tune to framerate if that's your preference.
 