
GhostRunner PS5/XSX/XSS DF Performance Analysis

Md Ray

Member
The 6600XT has infinity cache to make up the difference:

[Image: AMD slide on the RDNA 2 Infinity Cache feature]


But we see the same performance at 1080p, where the bandwidth difference between the two (especially with infinity cache) should have minimal impact:

[Chart: relative performance at 1920x1080]


A 7% performance difference in favour of the 6600XT. Overall TFLOP difference? 7% in favour of the 6600XT. Clock speed advantage? Over 600 MHz in favour of the 6600XT. Rasterisation and pixel fill-rate? A massive advantage for the 6600XT again.
The inclusion of Infinity Cache makes TFLOP comparisons between RDNA 1 & 2 all the more difficult and skews the results. Someone should do this with two GPUs of the same architecture.

RX 6700 XT (40 CU) at 2100 MHz and RX 6800 (60 CU) at 1400 MHz will have identical TFLOP numbers.
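To make the maths explicit, here's a quick sketch of that TFLOP equivalence (assuming the usual RDNA figure of 64 shaders per CU and 2 FLOPs per clock; the helper function is just for illustration):

```python
# Rough FP32 throughput estimate for RDNA-class GPUs:
# CUs x 64 shaders per CU x 2 FLOPs per clock (FMA) x clock speed.
def tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

print(f"RX 6700 XT-style (40 CU @ 2100 MHz): {tflops(40, 2100):.2f} TF")  # ~10.75 TF
print(f"RX 6800-style    (60 CU @ 1400 MHz): {tflops(60, 1400):.2f} TF")  # ~10.75 TF
```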
 
Last edited:
PS5 has a 22% advantage in geometry, color ROPs fill-rate & internal bandwidth, which matches up with the roughly 22% advantage it has in the 4K 60fps RT mode.

50fps + 22% = 61fps.
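A quick sanity check of that arithmetic (a rough sketch assuming frame rate scales linearly with the fill-rate/bandwidth advantage, which is a big simplification):

```python
# If the XSX RT mode sits around 50fps, a 22% scaling advantage lands near 61fps.
xsx_fps = 50
ps5_advantage = 0.22  # the ~22% fill-rate/bandwidth advantage mentioned above
print(f"{xsx_fps * (1 + ps5_advantage):.0f} fps")  # 61 fps
```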

PS5's depth ROPs advantage is about 2x that of the Xbox Series X, which could explain it having up to a 30% advantage at times in the 120fps mode in this game, or close to a 2x advantage in The Touryst.


It seems that when pushed to extremes like 8K, or 4K 60fps with RT, the Xbox Series X gets bottlenecked by ROPs or internal bandwidth.
I really don't see any data pointing to those extremes outside The Touryst, which is a launch game vs a game released a year later, and we don't know the headroom left over for either since they both lock the fps.
 
It's based on how ECC works: it constantly checks for memory errors, which takes additional time and thus costs performance.
Why is there even a comparison of a content creator GPU and a gaming GPU in the first place? That's a random one ... No one compares DDR4/5 to ECC, nor would they even be used in the same build or application. It's slower and just not meant for gaming. Apples to oranges.
 
Last edited:

onQ123

Member
I really don't see any data pointing to those extremes outside The Touryst, which is a launch game vs a game released a year later, and we don't know the headroom left over for either since they both lock the fps.
Because they are extremes & pretty much the only times we have seen the consoles tested in this way, unless you know of another native 4K 60fps RT game or another game rendered at 8K 60fps on these consoles.
 
Last edited:
Wait one minute!

Where is your point of reference other than the fact that it runs better on PS5?

The game runs native 4K 60fps on Xbox Series X; the problem comes in when the game is giving you bonus features like ray tracing or 120fps.

What other games run 4K 60fps with ray tracing on Xbox Series X?

None that I can think of at the moment, so why do you feel that it should be a locked 4K 60fps with ray tracing, other than the fact that PS5 did it?
Because both consoles have very similar specs. Look at a number of games that have come out on both platforms and parity is the order of the day in the vast majority of cases. Little Nightmares 2 had missing raytracing effects on Xbox. Do you think that that was because only the PS5 had the power to render the effect? This is a 3rd party cross gen game that apparently had different teams doing each version. The evidence points to a less than stellar port of this title on Xbox consoles. This will probably continue to happen because the PS5 has the larger install base and devs won't want to spend the resources.
 

onQ123

Member
Because both consoles have very similar specs. Look at a number of games that have come out on both platforms and parity is the order of the day in the vast majority of cases. Little Nightmares 2 had missing raytracing effects on Xbox. Do you think that that was because only the PS5 had the power to render the effect? This is a 3rd party cross gen game that apparently had different teams doing each version. The evidence points to a less than stellar port of this title on Xbox consoles. This will probably continue to happen because the PS5 has the larger install base and devs won't want to spend the resources.

The question is: do you have a point of reference for how a native 4K 60fps game with ray tracing should run on Xbox Series X?

The problem is that PS5 somehow did it while Xbox Series X came up short, so you say that it is running poorly, but you don't have any example of Xbox Series X reaching a stable 60fps at 4K with RT on.
 

Bogroll

Likes moldy games
The question is: do you have a point of reference for how a native 4K 60fps game with ray tracing should run on Xbox Series X?

The problem is that PS5 somehow did it while Xbox Series X came up short, so you say that it is running poorly, but you don't have any example of Xbox Series X reaching a stable 60fps at 4K with RT on.
I know it's not 4K, but Doom Eternal hits a higher resolution in RT mode.
 

onQ123

Member
I know it's not 4K, but Doom Eternal hits a higher resolution in RT mode.

That wouldn't tell you how Xbox Series X reacts to a game pushing RT at 4K 60fps.

If this same game pushed a more compute-heavy ray tracing engine that wasn't able to reach 4K 60fps, it wouldn't have these results, because PS5 would be held back by its lower compute output.
 

Bogroll

Likes moldy games
That wouldn't tell you how Xbox Series X reacts to a game pushing RT at 4K 60fps.

If this same game pushed a more compute-heavy ray tracing engine that wasn't able to reach 4K 60fps, it wouldn't have these results, because PS5 would be held back by its lower compute output.
Maybe Watch Dogs Legion? It uses DRS, but they both still hit 4K. Oh yes, at 30fps.

Isn't Little Nightmares 2 DRS, hitting 1684p on PS5 and 1890p on X?
 
Last edited:

Mr Moose

Member
I don't know if it has, but I was referring to post #361. I don't know if I'm reading it wrong, but isn't he saying LK2 is 4K RT on PS5?

Little Nightmares 2 had missing raytracing effects on Xbox
4K DRS.
 
Last edited:
The question is: do you have a point of reference for how a native 4K 60fps game with ray tracing should run on Xbox Series X?

The problem is that PS5 somehow did it while Xbox Series X came up short, so you say that it is running poorly, but you don't have any example of Xbox Series X reaching a stable 60fps at 4K with RT on.
Metro Exodus Definitive Edition targeted 4K and 60fps. Amazingly, it ran similarly on both the Xbox and PlayStation. Do you have any answer as to why Little Nightmares 2 had RT on PS5 and not XSX? Would you say it is the power of PlayStation? Do you honestly believe that the Xbox version of Ghostrunner was fully optimized but it just wasn't powerful enough to run this game as well as a similarly specced PS5? Highly dubious.
 
Last edited:

OmegaSupreme

advanced basic bitch
Metro Exodus Definitive Edition targeted 4K and 60fps. Amazingly, it ran similarly on both the Xbox and PlayStation. Do you have any answer as to why Little Nightmares 2 had RT on PS5 and not XSX? Would you say it is the power of PlayStation? Do you honestly believe that the Xbox version of Ghostrunner was fully optimized but it just wasn't powerful enough to run this game as well as a similarly specced PS5? Highly dubious.
Correct me if I'm wrong, but doesn't the PS5 CPU have the ability to clock higher than the X? I'm not reading through this whole thread of warring, but I brought this up earlier. That would easily account for the differences we see in some titles. This isn't a new thing. A number of games have performed better on PS5. It's just as likely, if not more so, than any 'optimization' claims.
 

Mr Moose

Member
Metro Exodus Definitive Edition targeted 4K and 60fps. Amazingly, it ran similarly on both the Xbox and PlayStation. Do you have any answer as to why Little Nightmares 2 had RT on PS5 and not XSX? Would you say it is the power of PlayStation? Do you honestly believe that the Xbox version of Ghostrunner was fully optimized but it just wasn't powerful enough to run this game as well as a similarly specced PS5? Highly dubious.
All we can do is go from the info we currently have.
If it gets patched then maybe it can be revisited.
There's no reason The Medium was missing RT at launch on PS5, they had plenty of time to port it, and it still loads almost twice as fast on the Xbox consoles.
Correct me if I'm wrong, but doesn't the PS5 CPU have the ability to clock higher than the X? I'm not reading through this whole thread of warring, but I brought this up earlier. That would easily account for the differences we see in some titles. This isn't a new thing. A number of games have performed better on PS5. It's just as likely, if not more so, than any 'optimization' claims.
100MHz slower.
 

onQ123

Member
Metro Exodus Definitive Edition targeted 4K and 60fps. Amazingly, it ran similarly on both the Xbox and PlayStation. Do you have any answer as to why Little Nightmares 2 had RT on PS5 and not XSX? Would you say it is the power of PlayStation? Do you honestly believe that the Xbox version of Ghostrunner was fully optimized but it just wasn't powerful enough to run this game as well as a similarly specced PS5? Highly dubious.
The dev said it was just a bug, so maybe they will add RT to the Xbox Series X soon.


As for Ghostrunner, it seems to be trying something that no other game on these consoles has tried: native 4K 60fps with RT. So the thing that's holding it back on Xbox Series X is more than likely the areas where PS5 has an advantage, like the ROPs.

Just like The Touryst was doing something no other game had tried on these consoles with 8K.
 

onQ123

Member
Correct me if I'm wrong, but doesn't the PS5 CPU have the ability to clock higher than the X? I'm not reading through this whole thread of warring, but I brought this up earlier. That would easily account for the differences we see in some titles. This isn't a new thing. A number of games have performed better on PS5. It's just as likely, if not more so, than any 'optimization' claims.
No, Xbox Series X has the higher-clocked CPU.
 

OmegaSupreme

advanced basic bitch
No, Xbox Series X has the higher-clocked CPU.
I see. Architecture differences then? I'm admittedly not familiar with the entire spec sheet of either console. Don't care that much. Surely not every single game that performs better on PS5 is simply due to lack of optimization? Why? Developers don't give a shit about xbox?
 
Last edited:

Md Ray

Member
Why is there even a comparison of a content creator GPU and a gaming GPU in the first place? That's a random one ... No one compares DDR4/5 to ECC, nor would they even be used in the same build or application. It's slower and just not meant for gaming. Apples to oranges.
Answer timestamped:

 
Last edited:

Lysandros

Member
I see. Architecture differences then? I'm admittedly not familiar with the entire spec sheet of either console. Don't care that much. Surely not every single game that performs better on PS5 is simply due to lack of optimization? Why? Developers don't give a shit about xbox?
Without going into details, it essentially comes down to PS5's GPU running at a 400 MHz (22%) higher frequency compared to the XSX GPU and, yes, a more efficient architecture on PS5's side. This counterbalances XSX's TF/compute advantage quite effectively, hence the results.
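Put into rough numbers with the public specs (same simple shader-count-times-clock estimate as earlier in the thread; real performance obviously depends on much more than this):

```python
def tflops(cus: int, clock_mhz: float) -> float:
    # CUs x 64 shaders per CU x 2 FLOPs per clock (FMA)
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

ps5_clock, xsx_clock = 2230, 1825  # MHz; PS5's clock is variable, up to ~2.23 GHz
print(f"PS5 clock advantage: {ps5_clock / xsx_clock - 1:.0%}")  # ~22%
print(f"PS5: {tflops(36, ps5_clock):.2f} TF vs XSX: {tflops(52, xsx_clock):.2f} TF")  # ~10.28 vs ~12.15
```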
 
Answer timestamped:


The highlighted comment answers that perfectly.

"Were at a point where people discuss wether or not $1200-+ workstation GPUS are an affordable altemative to mid-range gaming GPUS. Let that sink in"

So no, it's a horrible comparison lol. Workstation GPU vs gaming GPU.
 

Bogroll

Likes moldy games
????????? What is LK2 & what does it have to do with my post?
Yes, sorry, hence why I put "I don't know if I'm reading it wrong". I was getting LN2 mixed up with you saying it was 4K with RT.
I must stop doing 10 things at once while reading NeoGAF and replying on a mobile phone :)
 

Md Ray

Member
The highlighted comment answers that perfectly.

"Were at a point where people discuss wether or not $1200-+ workstation GPUS are an affordable altemative to mid-range gaming GPUS. Let that sink in"

So no, it's a horrible comparison lol. Workstation GPU vs gaming GPU.
Lol, ok.

The point still stands. A GPU with fewer CU/SMs and higher clock speed can outperform a wider GPU with lower clock speed while having similar TF, on both of these configs.
 
Last edited:
Lol, ok.

The point still stands. A GPU with fewer CU/SMs and higher clock speed can outperform a wider GPU with lower clock speed while having similar TF, on both of these configs.
Not debating that at all, just the methodology behind it was a little odd. Something like the previous example of the 5700xt vs 6600xt was more akin to the debate, compared to a workstation low profile GPU vs gaming GPU.
 

Md Ray

Member
Not debating that at all, just the methodology behind it was a little odd. Something like the previous example of the 5700xt vs 6600xt was more akin to the debate, compared to a workstation low profile GPU vs gaming GPU.
How? 5700 XT and 6600 XT aren't even the same architecture, with vastly different bandwidth throughput between the two. The A4000 and 30-series RTX have a lot more in common than AMD's pair here.
 
Last edited:

OmegaSupreme

advanced basic bitch
Without going into details, it essentially comes down to PS5's GPU running at a 400 MHz (22%) higher frequency compared to the XSX GPU and, yes, a more efficient architecture on PS5's side. This counterbalances XSX's TF/compute advantage quite effectively, hence the results.
I see. Thanks. I knew there was some aspect of the PS5 that was clocked higher. Besides the faster storage. Sadly this point will be lost on the 'optimization' folks.
 
All we can do is go from the info we currently have.
If it gets patched then maybe it can be revisited.
There's no reason The Medium was missing RT at launch on PS5, they had plenty of time to port it, and it still loads almost twice as fast on the Xbox consoles.
I just blame the lack of RT on PS5's version of Medium on lack of optimization. There is no technical reason the PS5 version couldn't have it. I'm glad you mentioned the slower load times on PS5. No one in their right mind would say the faster loading is due to Jason Ronald's engineers working Xbox magic. At minimum it was a design choice or a lack of utilizing the PS5's strengths. The same is true with the Xbox version of Ghostrunner.

Without going into details, it essentially comes down to PS5's GPU running at a 400 MHz (22%) higher frequency compared to the XSX GPU and, yes, a more efficient architecture on PS5's side. This counterbalances XSX's TF/compute advantage quite effectively, hence the results.
Did the less efficient Xbox architecture aid that platform in loading Medium faster? Still interesting to see a cross-gen 3rd party game be used as some sort of benchmark of a platform's performance.
 
Last edited:
How? 5700 XT and 6600 XT aren't even the same architecture, vastly different bandwidth throughput between the two. The A4000 and 30-series RTX have a lot more in common than that of AMD's here.
Bandwidth isn't an issue at all @1080p. That's why there is infinity cache. While the 30xx series is the same architecture as the workstation card, they are highly different in purpose. Sure, you can game on a Quadro GPU... But why would anyone do that? Or why would anyone compare the two? A completely different purpose, same with the Apple A1 not being for gaming.

Would you expect to compare a 30xx GPU to a workstation GPU in regards to productivity and editing? If not, the comparison doesn't make much sense at all?
 
Last edited:

//DEVIL//

Member
I didn't watch the whole video, but what kind of detail settings are on PS5? It would be kinda impressive if it hit full HD at 120 frames with everything on ultra, because my friend has a 3070 with that game and he can get 120 frames at 2K with everything on ultra. So if the PS5 has that, then impressive.
 

Md Ray

Member
Bandwidth isn't an issue at all @1080p.
I was looking at a GPU trace capture of a DOOM Eternal frame using Nvidia's graphics profiler the other day... The game was hitting VRAM throughput (bandwidth) pretty hard even at 1080p on a 3070 with 512 GB/s BW.
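To put that in perspective, the per-frame bandwidth budget is smaller than it sounds once you divide by the target frame rate (a back-of-the-envelope using the figure quoted above):

```python
bandwidth_gb_s = 512  # the BW figure quoted above
for fps in (60, 120):
    budget = bandwidth_gb_s / fps
    print(f"{fps} fps -> ~{budget:.1f} GB of VRAM traffic available per frame")
# 60 fps -> ~8.5 GB, 120 fps -> ~4.3 GB
```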
That's why there is infinity cache.
This is exactly the reason why I suggested this earlier:
Someone should do this with two GPUs of the same architecture.

RX 6700 XT (40 CU) at 2100 MHz and RX 6800 (60 CU) at 1400 MHz will have identical TFLOP numbers.

While the 30xx series is the same architecture as the workstation card, they are highly different in purpose. Sure, you can game on a Quadro GPU... But why would anyone do that?
Why not? It's basically a downclocked 16GB version of 3070 Ti that also happens to support GeForce game ready drivers.
Or why would anyone compare the two?
Curiosity, science? And the fact that HUB received pages and pages of requests to take a look at it by their viewers. I mean is there any harm in comparing the two?
Would you expect to compare a 30xx GPU to a workstation GPU in regards to productivity and editing?
Sure, 100% yes. More tests, more info out there for the public, the better, no?
 
Last edited:

Lysandros

Member
How? 5700 XT and 6600 XT aren't even the same architecture, vastly different bandwidth throughput between the two. The A4000 and 30-series RTX have a lot more in common than that of AMD's here.
Yes, crucially different RBE and rasterizer/prim unit hardware: 2 rasterizers and prim units per SE for the 5700 XT compared to 1 rasterizer/prim unit per SE for the 6600 XT, with the 5700 XT's RBEs also having more Z/stencil ROPs. L2 cache amounts also differ: 4 MB for the 5700 XT and 2 MB for the 6600 XT.
 
Last edited:

01011001

Banned
How? 5700 XT and 6600 XT aren't even the same architecture,

They are the same architecture in everything but name. RDNA2 is not much different from RDNA1. If you don't do any ray tracing related tasks they should perform basically the same, only RDNA2 is more power efficient.
They literally could have named it RDNA+ (like they did with Zen+) and no one would have questioned it, but it's better PR to add that 2.
 
Last edited:

onQ123

Member
I just blame the lack of RT on PS5's version of Medium on lack of optimization. There is no technical reason the PS5 version couldn't have it. I'm glad you mentioned the slower load times on PS5. No one in their right mind would say the faster loading is due to Jason Ronald's engineers working Xbox magic. At minimum it was a design choice or a lack of utilizing the PS5's strengths. The same is true with the Xbox version of Ghostrunner.


Did the less efficient Xbox architecture aid that platform in loading Medium faster? Still interesting to see a cross-gen 3rd party game be used as some sort of benchmark of a platform's performance.
You do know that The Medium was also missing RT when it was first released, so maybe that's just the way the devs do things.
Also, The Medium was made around Xbox Series X specs, so maybe the data was set up to take advantage of DirectStorage & just doesn't work well with PS5's loading solution.
 
I was looking at GPU trace capture of a DOOM Eternal frame using nvidia's graphics profiler the other day... The game was hitting VRAM throughput (bandwidth) pretty hard even at 1080p on a 3070 with 512 GB/s BW.

This is exactly the reason why I suggested this earlier:



Why not? It's basically a downclocked 16GB version of 3070 Ti that also happens to support GeForce game ready drivers.

Curiosity, science? And the fact that HUB received pages and pages of requests to take a look at it by their viewers. I mean is there any harm in comparing the two?

Sure, 100% yes. More tests, more info out there for the public, the better, no?
It's mostly power efficiencies at the end of the day, not much difference between the two architectures. Nothing like comparing a gaming GPU to a workstation GPU though.

Bandwidth issues for one GPU aren't the same for a completely different architecture or manufacturer. You can't compare a 3070 to an AMD GPU when it comes to bandwidth limitations, as they behave differently in different scenarios.
 

Md Ray

Member
It's mostly power efficiencies at the end of the day, not much difference between the two architectures. Nothing like comparing a gaming GPU to a workstation GPU though.

Bandwidth issues for one GPU aren't the same for a completely different architecture or manufacturer. You can't compare a 3070 to an AMD GPU when it comes to bandwidth limitations, as they behave differently in different scenarios.
If you want to compare the perf of narrow vs wide GPU configs w/ similar TF numbers, it's best to compare it on the same architecture with identical BW numbers like the A4000 vs 3060 Ti or something like this: 6800 @ 1.4GHz vs 6700 XT @ 2.1 GHz for e.g. on the AMD side. Scratch that. They don't have identical BW. 6900 XT @ 1.5 GHz vs 6800 @ 2.0 GHz is more appropriate.
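For what it's worth, that last pairing does line up on paper: both cards use the same 256-bit, 512 GB/s memory setup with 128 MB Infinity Cache, and the simple shader-count-times-clock estimate gives identical TF (a sketch, same caveats as before):

```python
def tflops(cus: int, clock_mhz: float) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12  # CUs x 64 shaders x 2 FLOPs/clock

print(f"RX 6900 XT (80 CU) @ 1500 MHz: {tflops(80, 1500):.2f} TF")  # 15.36 TF
print(f"RX 6800    (60 CU) @ 2000 MHz: {tflops(60, 2000):.2f} TF")  # 15.36 TF
```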
 
Last edited:
If you want to compare the perf of narrow vs wide GPU configs w/ similar TF numbers, it's best to compare it on the same architecture with identical BW numbers like the A4000 vs 3060 Ti or something like this: 6800 @ 1.4GHz vs 6700 XT @ 2.1 GHz for e.g.
The second example makes sense, just not a workstation vs gaming GPU. Horrible example. That was my whole point of even commenting. I didn't even know it was you who brought up this weird comparison. You can't compare a single-slot GPU made for workstations to a full-fledged gaming GPU. I wouldn't compare either one. I could compare AMD to Nvidia, but I wouldn't compare workstation GPUs, which were made for a completely different purpose, to gaming GPUs, even if you can apply a gaming profile. That doesn't mean it's a gaming GPU, meant for gaming.
 
Last edited:
You do know that The Medium was also missing RT when it was 1st released so maybe that's just the way the devs do things.
Also The Medium was made around Xbox Series X specs so maybe the data was setup to take advantage of direct storage & just don't work well with PS5 loading solution.
The point is there was no technical reason why the PS5 was missing RT, right? It wasn't Xbox magic; it was a developer choice. Ghostrunner running well on the PS5 isn't because the PS5 is a mystical piece of hardware; it's because the dev optimized that version of the game.
 