
[DF] Guardians of the Galaxy: PS5 vs Xbox Series X/S - A Great Game But 60FPS Comes At A Cost

retsof624

Member
Been replaying Last Of Us 2 and a 1.8TF GPU renders more grass.

[The Last of Us Part II screenshot]
Every time I see a screenshot of this game it just makes me wanna go back again
 

CrustyBritches

Gold Member
KitGuru has one of the better benchmark analyses of GotG.
Before getting to the benchmark data for every single GPU, it’s worth noting that we didn’t use the game’s built-in benchmark for our testing today. As you can see above, the built-in bench delivers frame rates up to 38% higher than what we saw while actually playing the game. Instead, for this analysis we benchmarked a section from the game’s first chapter, delivering data that was much more representative of what I saw over the two hours or so that I played the game.
I think that's a good approach. As for the data...
This testing seems more representative of actual gameplay and better for comparison to consoles. Here the GTX 1070 has 55fps avg | 47fps 1% low. In the other graph floating around in this thread it has 73fps avg | 50fps low. The PS5/XSX slot in around the 5700 XT, maybe a little lower, where the RX 5700 would be. It's not unusual for consoles to land around that performance, as seen with AC Valhalla. This is simply an Nvidia-favored title.
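Quick sanity check on that built-in bench gap, for what it's worth: 73fps vs 55fps on the 1070 works out to roughly 33% higher, right in the ballpark of the "up to 38%" figure KitGuru quotes.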
 
Last edited:

Stuart360

Member
It's the one kind of trolling that gets a free pass, so the usual suspects are jumping on the opportunity.
The thing is, they are using this game as the 'norm' when it comes to XSS, when it's really not. This game seems to have by far the biggest differences between XSS and XSX/PS5 we have seen so far. Most games on XSS look just the same but at a lower resolution.

The game just seems badly optimized on console compared to PC (about time lol).
 

DenchDeckard

Moderated wildly
KitGuru has one of the better benchmark analysis of GotG.

I think that's a good approach. As for the data...

This testing seems more representative of actual gameplay and better for comparison to consoles. Here the GTX 1070 has 55fps avg | 47fps 1% low. In the other graph floating around in this thread they have 73fps avg | 50fps low. The PS5/XSX slot in around the 5700 XT, maybe a little lower where the RX 5700 would be. It's not unusual for consoles to land around that performance as seen with AC Valhalla. This is simply a Nvidia favored title.

This looks in line with the consoles.

The 3080 is a beast in this game. I'm running it on my LG CX at 4K with DLSS and everything on very high, and it's so smooth. Man, I love this TV. G-Sync is god.
 
This game is fucking impressive. It's a technical benchmark with impressive storytelling, great combat, and excellent exploration & character interactions, and I've only witnessed a very small piece of it so far. Now I understand why the game is 1080p in the 60fps mode.

It can likely be better optimized, but this is more than just a badly optimized console title. It's doing a whole lot of shit that is going unnoticed. It hardly seems like it's using any baked lighting at all. So many aspects seem dynamic, there's a lot of very nice geometry work going on, and lots of great interactivity with the world and other physics elements. This game is doing a ton.
 

Stuart360

Member
This game is fucking impressive. Technical benchmark and impressive storytelling, great combat, excellent exploration & character interactions, and I've only just witnessed a very small piece. Now I understand why the game is 1080p in the 60fps mode.

It can likely be better optimized, but this is more than just a badly optimized console title. It's doing a whole lot of shit that is going unnoticed. It hardly seems like it's using any baked lighting at all. So many aspects seem dynamic, and lots of very nice geometry work going on, lots of great interactivity with the world and other physics elements. This game is doing a ton.
The game runs at native 4K in the 30fps mode though. They shouldn't need to drop from 4K to 1080p AND use lower settings to get 60fps. It doesn't really make sense.

If I had to guess, I'd say it was a time issue. It will be interesting to see if the 60fps mode improves over the coming months.
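Rough napkin math on why it seems off (my numbers): 4K is about 8.3 million pixels and 1080p about 2.1 million, so 4K30 is pushing roughly 250 million pixels a second versus 125 million for 1080p60. Performance mode is doing half the pixel throughput of quality mode and still needs lowered settings on top, which looks less like a pure GPU limit and more like per-frame costs (CPU, geometry, etc.) that don't shrink with resolution.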
 

Topher

Gold Member
This game is fucking impressive. Technical benchmark and impressive storytelling, great combat, excellent exploration & character interactions, and I've only just witnessed a very small piece. Now I understand why the game is 1080p in the 60fps mode.

It can likely be better optimized, but this is more than just a badly optimized console title. It's doing a whole lot of shit that is going unnoticed. It hardly seems like it's using any baked lighting at all. So many aspects seem dynamic, and lots of very nice geometry work going on, lots of great interactivity with the world and other physics elements. This game is doing a ton.

How much difference can you tell between quality and performance mode as far as image quality?

The game runs at native 4k in the 30fps mode though, they shouldnt need to have to drop from 4k to 1080p AND use lower settings, to get 60fps. It doesnt really make sense.

If i had to guess it would be a time issue for me. It will be interesting to see if the 60fps mode improves over the coming months.

I think you are right. The lack of DRS is a bit puzzling and I'm betting they just did a native port from PC. DRS seems to be the "secret sauce" that makes these consoles perform so well. Frankly, I'm not really that surprised that a native version of a game like this doesn't perform well on console.
 
Last edited:

Neo_game

Member
Disappointing performance from the consoles. Hopefully they can improve it via a patch. Even tech demos like the Unreal 5 one ran at 1440p, and Minecraft RT was 1080p 60fps on SX, so I think graphically demanding games are expected to land around these settings. Personally I will take graphical detail over resolution, so IMO 4K is just a waste of resources.

I need a PS5 Pro or the Xbox equivalent. These consoles are struggling too much. Yeah, “lazy devs”, but when that is the norm, all you can hope for is better hardware.

I agree with you. But apparently some people are happy with the SS itself 🤷‍♂️

Sony is hoping the PS5 will last 10 years. I think 5 years is as far as it can go. They will need a mid-gen upgrade or a new-gen console by 2025.
 
Last edited:

Topher

Gold Member
Random thought. How many graphically intensive AAA games have there been this gen without DRS? I can't think of any. I don't think these consoles were designed for native resolutions.
 

Neo_game

Member
Don't you mean both consoles since they are similar in power?

Yes, I think it should be the same for both. But it depends on sales and consumer demand. Someone posted an estimate that 40% of Xbox sales are the SS. So we have to wait and see what their strategy is. I think a Pro version will release if they want to extend the lifecycle of this gen.
 
Yes I think it should be same for both. But it depends on sales and consumer demand. Some one posted an estimate that 40% sales of Xbox is SS. So we have to wait and see what their strategy is. I think Pro version will release if they want to extend the lifecycle of this gen.

Same for the XSS? That system is going to struggle by the time the Pro versions of the XSX and the PS5 come out.
 

CrustyBritches

Gold Member
This is on ultra though. On consoles, the game is nowhere near ultra in performance mode.
Consoles look to be somewhere in the High range.
KitGuru has a presets scaling graph.
Random thought. How many graphically intensive AAA games have there been this gen without DRS? I can't think of any. I don't think these consoles were designed for native resolutions.
The PC version has DRS.
---
I love the game and get decent enough performance, but there's weird shit going on with this game that I can't really explain. First off, the newer Nvidia "game-ready" driver has worse performance than the previous driver. Secondly, I'm getting no performance difference between Native, DLSS Quality, and DLSS Ultra Performance. I know it's kicking in from the difference in image quality, especially noticeable with DLSS Ultra Perf. Initially I thought I was CPU-bound, but resource monitor shows only 63% CPU usage and balanced multi-threading (no main game thread being hammered). I've never had a game where DLSS doesn't provide any performance increase.
DLSS Off: [screenshot]

DLSS Quality: [screenshot]
At 1080p, no performance difference between Native and Ultra Performance:

I've done a clean driver reinstall a couple of times. I'm going to try reinstalling the game. It's some weird bottleneck. I don't get it.
 
Same for the XSS? That system is going to struggle by the time the Pro versions of the XSX and the PS5 comes out.
Based on what evidence? This game is running 1080p/60 on the high-end consoles. You think those consoles will 'struggle' too? Are people not aware there is such a thing as graphics scaling? PC, which will get every game the Xbox gets, will have to account for specs lower than the XSS anyway. GPU scaling is far easier than CPU scaling, and news flash: the XSS has the same CPU as the other current-generation consoles.
 

Tchu-Espresso

likes mayo on everthing and can't dance
Based on what evidence? This game is running 1080p/60 on the high end consoles. You think those consoles will 'struggle' too? Are people not aware there is such a thing as graphics scaling? The PC which will have every game the Xbox will have will have to account for specs that are lower than the XSS. GPU scaling is far easier than CPU scaling and news flash the XSS has the same CPU as the other current generation consoles.
Sounds pretty grim to me.
 

Md Ray

Member
Console look to be somewhere in the High range.
KitGuru has a presets scaling graph


The PC version has DRS.

---
I love the game and get decent enough performance, but there's weird shit going on with this game that I can't really explain. First off, the newer Nvida "game-ready" driver has worse performance than the previous driver. Secondly, I'm getting no performance difference between Native, DLSS Quality, and DLSS Ultra Performance. I know it's kicking on from the difference in image quality, especially noticeable with DLSS Ultra Perf. Initially I though I was CPU-bound, but resource monitor has only 63% CPU usage, and balanced multi-threading(no main game thread being hammered). I've never had a game where DLSS doesn't provide any performance increase.

At 1080p, no performance difference between Native and Ultra Performance:


I've done clean driver reinstall a couple times. I'm going to try to reinstall the game. It's some weird bottleneck. I don't get it.
That's definitely weird. Have you looked at other people's 1600X benchmarks with DLSS on/off?
 
Yeah, no... As soon as Xbox One/PS4 are phased out expect Series S to become the lowest common denominator when building games with DX12U as base, not PC.
Bold claim to think all PC games will be built around the XSS. If that's true there should be no problems with future games on the platform and people should have nothing to complain about. There goes the narrative about the system struggling in the future 🤷🏾‍♂️

It bodes well to think Zen 2 CPUs will become the baseline, but I'm curious how many PCs actually meet the XSS baseline you claim.

I actually don't read your comments to be honest.
Yeah, I shouldn't have taken anything you said seriously anyway, my bad.
 

ethomaz

Banned
Metro Exodus.
Sometimes I believe people don’t read the original quote lol

I said that the RT GI on the S is not the same as on the X… there are artifacts and grain on the S that aren't on the X… and that's not even counting all the other compromises beyond resolution in the S version.

PS: Your idea of "heavy use of RT" is very lacking, btw… Metro Exodus only uses RT for GI.
 
Last edited:

Md Ray

Member
Bold claim to think all PC games will be built around the XSS. If that's true there should be no problems with future games on the platform and people should have nothing to complain about. There goes the narrative about the system struggling in the future 🤷🏾‍♂️

Bodes well to think Zen 2 CPUs will become the baseline but I'm curious what number of PCs meet the XSS baseline you claim.
Do you still see AAA games being developed for lower spec than the 2013 Xbox One in the PC space? Nope.

All games are currently built with Xbox One as the target spec (lowest common denominator). XSS will simply take that place soon, but this doesn't necessarily mean XSS won't have any problems or won't struggle in future titles, lol.

PC has more DX12U GPUs out there than XSX/PS5 now. And all of them are far more powerful than what's inside XSS.
 
Last edited:

ethomaz

Banned
Do you still see AAA games being developed for lower spec than the 2013 Xbox One in the PC space? Nope.

All games are currently built with Xbox One as the target spec (lowest common denominator). XSS will simply take that place soon, but this doesn't necessarily mean XSS won't have any problems or won't struggle in future titles, lol.

PC has more DX12U GPUs out there than XSX/PS5 now. And all of them are far more powerful than what's inside XSS.

Is that number, 18m, correct, or is it his guess? Because even my low-end estimate for XSX numbers is higher than those 4.7m.
 
Last edited:

CrustyBritches

Gold Member
I'm referring to the console version. DRS, at this point, has become the norm.
Are you thinking of them going up or down? If you have DRS enabled during the tough scenes, anything in the 40s and 50s would result in 900p res, or maybe less depending on the bottleneck. You might have some 1440p cutscenes, I suppose.
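For anyone curious, here's a rough sketch of the kind of frame-time heuristic I mean. The numbers and code are mine, not anything from the game, and real engines smooth this over several frames, but the idea is the same:

Code:
// Rough frame-time driven dynamic-res heuristic (illustrative numbers only).
// 60fps target = 16.67ms budget; scale the rendered area by how badly the
// last frame missed it, clamped so it never drops below half the pixels.
#include <algorithm>
#include <cmath>
#include <cstdio>

int main() {
    const double targetMs = 1000.0 / 60.0;
    const int baseW = 1920, baseH = 1080;
    const double measuredMs[] = { 16.5, 22.0, 24.0 }; // ~60fps, ~45fps, ~42fps frames

    for (double ms : measuredMs) {
        double areaScale = std::clamp(targetMs / ms, 0.5, 1.0); // fraction of pixels kept
        double axisScale = std::sqrt(areaScale);                // per-axis scale factor
        int w = (int)std::lround(baseW * axisScale);
        int h = (int)std::lround(baseH * axisScale);
        std::printf("%.1fms frame -> render at %dx%d\n", ms, w, h);
    }
}

A ~24ms frame (low-40s fps) spits out 1600x900 there, which is where my ~900p guess comes from.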
 
FIRE! Heh heh heh
Yes Beavis.

Do you still see AAA games being developed for lower spec than the 2013 Xbox One in the PC space? Nope.

All games are currently built with Xbox One as the target spec (lowest common denominator). XSS will simply take that place soon, but this doesn't necessarily mean XSS won't have any problems or won't struggle in future titles, lol.

PC has more DX12U GPUs out there than XSX/PS5 now. And all of them are far more powerful than what's inside XSS.

Nice, so developers will make games around a platform that will struggle to play those games, huh? I hope developers are smarter than you give them credit for. Also, 2013 is an eternity in technology. Pretty sure plenty of GPUs came out between then and now. I highly doubt developers will ignore GPUs that are older than the XSS.

And you are showing data that 80-85% of PCs have something worse than or equal to an RTX 2060, and you think devs will make games that leave those people out? That doesn't sound like good business sense. GPUs are the thing that is most easily scaled, so regardless, the XSS will be just fine. Its CPU matches the other consoles easily.
 
KitGuru has one of the better benchmark analysis of GotG.

I think that's a good approach. As for the data...

This testing seems more representative of actual gameplay and better for comparison to consoles. Here the GTX 1070 has 55fps avg | 47fps 1% low. In the other graph floating around in this thread they have 73fps avg | 50fps low. The PS5/XSX slot in around the 5700 XT, maybe a little lower where the RX 5700 would be. It's not unusual for consoles to land around that performance as seen with AC Valhalla. This is simply a Nvidia favored title.

[KitGuru 1080p benchmark chart]


Wow, what's with the 3080 demolishing the 6900 XT?
 

CrustyBritches

Gold Member
That's definitely weird. Have you looked at other people's 1600X benchmarks with DLSS on/off?
I reinstalled the game and I'm still getting the same results. I haven't found anything as specific as a Ryzen 1600 DLSS on/off comparison, but I did find an i7-10700 (@4.5GHz) + 3060 DLSS on/off comparison. A stock 3060 is very comparable to a 2060S OC'd.

At 1080p/Ultra settings/Ultra RT, the i7-10700 (@4.5GHz) + 3060 is getting 58fps avg, while my Ryzen 1600 + 2060S OC is getting 54fps avg. He's on the previous driver (496.13) while I'm on the newer "game-ready" driver (496.49). I had a 6% perf increase when I rolled back to that version, so on the same driver it would be the i7-10700 + 3060 at 58fps avg vs. the Ryzen 1600 + 2060S OC at 57fps avg. Pretty much no difference, maybe because Ultra + Ultra RT forces a GPU-bound scenario?

Now with DLSS (Ultra Performance) he's getting 112fps avg, while I'm getting 55fps avg. My CPU frame times are pretty much double his. This is despite the overall and per-thread CPU load only being 60-70%. It's definitely some sort of CPU bottleneck, though.

Lastly, DSOGaming had a system RAM speed scaling graph that might be relevant as well: DDR4-3800 offers a 32% increase in performance over DDR4-2666. I'm on 2x16GB DDR4-2400, so that gap could be even bigger for me. However, without first removing the CPU bottleneck, I can't accurately gauge the impact of that.
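Quick napkin math on why I keep calling it a CPU bottleneck (my numbers): 55fps is ~18.2ms a frame and 112fps is ~8.9ms, so my frames are taking roughly twice as long even after DLSS Ultra Performance has slashed the GPU cost. When cutting the internal resolution that hard doesn't move the frame time at all, the GPU isn't the limiter, and 60-70% CPU usage doesn't rule the CPU out, since a couple of stalled threads or waiting on RAM can cap the frame rate while the rest of the chip sits idle.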
 
Last edited:

Topher

Gold Member
Are you thinking of them going up or down? If you have DRS enable during the tough scenes anything in the 40s and 50s would result in 900p res, or maybe less depending on the bottleneck. You might have some 1440p cutscenes, I suppose.

Just speculating. If they employed some reconstruction techniques along with DRS I would think higher resolution in performance mode would be possible. It is strange that they went to the trouble of putting DRS on PC but not consoles. It is usually the other way around.

At the same time, I'm still having a hard time wrapping my head around the quality and performance mode differences. Half the frame rate in quality mode is very obvious. A quarter of the resolution in performance mode is not. I'm really curious to see what VG Tech comes up with here. This whole thing seems off.
 
I took a punt on this after reading impressions and decided on XSX.

I was really underwhelmed by the menu image quality of the characters, as it's in-engine and set to performance mode by default. The moment I switched to quality mode, even the menu image improved dramatically.
 

Md Ray

Member
Nice so developers will make games around a platform that will struggle to play those games huh.
??
Just like XB1 is the lowest common denominator for game development and isn't immune from having perf issues in some games, the Series S won't be immune either when it takes the XB1's place. No need to twist my words.
I highly doubt developers will ignore GPUs that are older than the XSS.
Why? Devs targeting the XSS GPU, the lowest spec on the console side, means that's what we will see as the min requirement on PC: a GPU with the same feature set as the XSS GPU.

When Forza Motorsport 6 came out, it required a GCN-based graphics card at minimum on PC because it was developed around XB1's GCN architecture with the DX12 API as a base. The game wouldn't even boot on GCN's predecessor, TeraScale-based GPUs, because those cards didn't support DX12 (feature level 12_0). Anyone who had an older GPU was forced to upgrade because of the new consoles, which led to new API & HW requirements; there was no way around it. Now DX12U brings feature level 12_2, which according to MS is a major jump from 12_1. Games using this API as a base will not work on older cards like the GTX 1060, as they only support up to feature level 12_1; there's no DX12U support on those cards.
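Not from any particular game, just a generic sketch of how a PC title can gate itself on feature level at startup using the standard D3D12 CheckFeatureSupport query, for anyone who wants to see what their own card reports (D3D_FEATURE_LEVEL_12_2 needs a recent Windows SDK):

Code:
// Generic sketch: query the highest D3D feature level the installed GPU/driver exposes.
#include <windows.h>
#include <d3d12.h>
#include <cstdio>
#pragma comment(lib, "d3d12.lib")

int main() {
    ID3D12Device* device = nullptr;
    // Create a device at the lowest D3D12 level, then ask what it can actually do.
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 __uuidof(ID3D12Device), (void**)&device)))
        return 1;

    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_12_2, // DX12 Ultimate class (Turing/Ampere, RDNA2, XSX/XSS)
        D3D_FEATURE_LEVEL_12_1, // GTX 10-series tops out here
        D3D_FEATURE_LEVEL_12_0, // the FM6-era minimum mentioned above
        D3D_FEATURE_LEVEL_11_0,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS levels = {};
    levels.NumFeatureLevels = sizeof(requested) / sizeof(requested[0]);
    levels.pFeatureLevelsRequested = requested;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &levels, sizeof(levels))))
        std::printf("Max supported feature level: 0x%x\n",
                    (unsigned)levels.MaxSupportedFeatureLevel);

    device->Release();
}

If that comes back below 12_2, a DX12U-as-base title can simply refuse to run, which is the scenario I'm describing for pre-Turing/pre-RDNA2 cards.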
 
Last edited:
How much difference can you tell between quality and performance mode as far as image quality?



I think you are right. The lack of DRS is a bit puzzling and I'm betting they just did a native port from PC. DRS seems to be the "secret sauce" that make these consoles perform so well. Frankly, I'm not really that surprised that native versions of a game like this doesn't perform well on console.

Image quality is a big step up in the 30fps mode, I would say, but once you realize just how much interactivity there is in this world, especially how crazy combat can get and how much of it appears to be heavily physics-based, you will quickly accept the step down from the 30fps mode. This game is a winner.

Eidos Montreal just turned into a golden goose for Square Enix. Also, I don't know if other people have mentioned this yet, but the sheer amount of dialogue and script in this game is, in a word, game changing. No game does this much to connect characters and to build relationships. Shit, I think this should become a norm, just appropriate to the kind of mood the game is going for.
 

Md Ray

Member
I reinstalled the game and still getting the same results. I haven't found anything as specific as a Ryzen 1600 DLSS On/Off comparison. I did find a i7-10700(@4.5GHz)+3060 DLSS On/Off comparison. A stock 3060 is very comparable to a 2060S oc'd.

At 1080p/Ultra Settings/Ultra RT, i7-10700(@4.5GHz)+3060 is getting 58fps avg, while my Ryzen 1600+2060S oc is getting 54fps avg. He's on the previous driver(49613) while I'm on the newer "game-ready" driver(49649). I had a 6% perf increase when I rolled back to that version, so if I was on that version it would be i7-10700+3060 at 58fps avg, Ryzen 1600+2060S oc at 57avg. Pretty much no difference, maybe because the Ultra+Ultra RT forces a GPU-bound scenario?


Now with DLSS(Ultra Performance) he's getting 112fps avg, while I'm getting 55fps avg. My CPU frame times are pretty much double. This is despite the overall and per-thread CPU load only being 60-70%. It's definitely some sort of CPU bottleneck, though.


Lastly, DSOGaming had a system RAM speed scaling graph that might be relevant, as well. DDR4-3800 offers a 32% increase in performance over DDR4-2666. I'm on 2x16GB DDR4-2400, so that gap could be even bigger. However, without first removing the CPU bottleneck, I couldn't accurately gauge the impact of that.
Oh, DDR4-2400 might explain why you're not seeing higher fps with DLSS enabled. You're most likely system RAM bandwidth bound, and Ryzen loves faster RAM, especially for high refresh rate gaming.
 

CrustyBritches

Gold Member
Oh, DDR4-2400 might explain why you're not seeing higher fps with DLSS enabled, you're most likely system RAM bandwidth bound and Ryzen loves faster RAM especially for high refresh rate gaming.
It's bizarre since I'm able to hit 120fps in DOOM Eternal and Back 4 Blood pretty easily.

I just tested my 3060 laptop again, which has 2x8GB DDR4-3200, and I'm getting worse performance with DLSS than without: 69fps avg with DLSS off, 67fps avg with DLSS Ultra Performance. DLSS Ultra Perf should be shooting my frames up over 100fps with ease.

I've settled on the idea that I'm CPU-limited on both my desktop and laptop. It is what it is. This game needs some CPU optimization. It plays acceptably with the rolled-back driver on my PC. I'm just going to enjoy what I got for now. I'll revisit it sometime next year after I upgrade my hardware. Amazing game, has me hyped and laughing non-stop.

 

Kenpachii

Member
According to that benchmark, a Ryzen 1600X can push 83 fps. So the consoles shouldn't be CPU limited when targeting 60 fps.

(The console CPUs are roughly 10-20% faster than a 1600X)

Do consoles have RT in this game? That could explain it.
 
Last edited: