
PS5 compared to RX 5700 XT in games

Blizzje

Member
I've been looking for comparisons between PS5 performance in games and the RX 5700 XT, the card that seems to be closest to the PS5 GPU. Dying to see what kind of improvement the PS5 has over this card, as there should be at least some difference. There aren't many possible comparisons, and it's not easy to get a like-for-like comparison in, for example, the same level or the same part of a map with the exact same number of enemies.

The Digital Foundry analysis of DMC5 gives some insight and perhaps a viable comparison. In the normal (4K) mode, we see frame rates of around 90fps after a battle is won in mission 2.


I compared this to a video running on an RX 5700 XT and noticed lower frame rates of around 70fps during the same cutscene and the same running sequence.
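For rough context: 90fps / 70fps ≈ 1.29, which would put the PS5 roughly 25-30% ahead in that particular slice, if the two captures were truly comparable.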



Of course, they aren't like for like. We also don't know if the PS5 is running the same graphics settings as this PC (is the PS5 running ultra settings in DMC5?).


The Assassin's Creed Valhalla comparison Digital Foundry did had potential, but they only had an RX 5700 available for testing.


Wondering if we can get some more like-for-like comparisons with the RX 5700 XT, and whether you think the PS5 GPU is clearly more powerful or pretty close to the card.
 

regawdless

Banned
This will go well.

I think the performance fluctuates between certain cards depending on the game and the features, so I don't think we'll find a good equivalent. The PS5 GPU, at least so far, looks to be very good at rasterization while being a bit lacking at raytracing.

But we just started this gen, so it's not that easy to say.
 
Didn't DF say it's closer to a 2080?
Yeah, a 2080S.


 

rofif

Can’t Git Gud
No, that's the case under ACV in their video, but with games better optimized for Nvidia it will be more like a 2070, for example. In any case, who really cares? This amount of resources will be so nice when put to good use by first-party devs :)
That's still fine. A huge jump from the PS4, and I bet the games will use it well.
 
No, the fact is that the PS5 runs like a 2080 Super under ACV, not that ALL GAMES on PS5 will run like that against a 2080 Super. Try to think about that a little.
In most games it's doing between a 2080 and a 2080 Super, and much higher than a 2070. Borderlands 3 is another example of that, or Godfall. It's not close to a 2070.

With raytracing it's not doing so hot, as AMD's RT is worse than Nvidia's RTX line.
 

regawdless

Banned
Did PS5 versions of games have a problem with RT?

It's doing exceptionally well for its price; there is no "problem" with it being in the 2060S ballpark regarding raytracing performance. At least that's what DF's early raytracing analysis of Spider-Man and the Watch Dogs Legion RT performance suggest. In WD Legion, for example, both consoles use lower RT settings than the lowest possible PC settings.
By editing the config, DF was able to use the console settings on PC, and performance was comparable to a 2060S.

Again, that's all very early and there aren't many titles using any form of raytracing so far. Therefore it's just an early indication.
 
At least that's what DF's early raytracing analysis of Spider-Man and the Watch Dogs Legion RT performance suggest. In WD Legion, for example, both consoles use lower RT settings than the lowest possible PC settings.

That isn't the case for either Spider-Man game. The RT in both is really a step above what's in WDL.
 

regawdless

Banned
That isn't the case for either Spider-Man game. The RT in both is really a step above what's in WDL.

Sure. But Spider-Man has severe limitations regarding its raytraced reflections; it takes a lot of shortcuts to make it work at that scale. It was analyzed by DF, who also compared it to the limitations a 2060S would have.

Again, not saying the PS5 is bad or anything, the console is amazing. But regarding raytracing, we need to keep our expectations in check. It's extremely performance heavy.
 
The PS5 should be clearly faster than a 5700 XT. I know people like to use it as a comparison because the CU count is similar, but there's more to it than just that. The 5700 XT is RDNA1 for starters, doesn't clock as high (when I used one, the max I could push was 2GHz, and that doesn't really improve performance much on RDNA1), and lacks all the customisation work Sony has done on RDNA2 for the PS5's GPU. It'll be interesting to see where the 5700 XT replacement lands in comparison, though.
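For a rough sense of the paper gap, here's a minimal back-of-the-envelope sketch of peak FP32 throughput using the public spec numbers (real game performance depends on far more than TFLOPS, and the PS5's clock is variable, so treat this as illustrative only):

```cpp
#include <cstdio>

// Peak FP32 throughput for a GCN/RDNA-style GPU:
// TFLOPS = CUs * 64 shaders per CU * 2 FLOPs per clock (FMA) * clock in GHz / 1000
static double peak_tflops(int cus, double clock_ghz) {
    return cus * 64 * 2 * clock_ghz / 1000.0;
}

int main() {
    // RX 5700 XT: 40 CUs at an advertised ~1.905 GHz boost clock
    std::printf("RX 5700 XT: %.2f TFLOPS\n", peak_tflops(40, 1.905));
    // PS5: 36 CUs at up to 2.23 GHz (variable frequency)
    std::printf("PS5:        %.2f TFLOPS\n", peak_tflops(36, 2.23));
    return 0;
}
```

That works out to roughly 9.75 vs 10.28 TFLOPS on paper, before any of RDNA2's per-clock improvements or Sony's customisations are taken into account.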
 

spyshagg

Should not be allowed to breed
Stop with such a useless example. ACV is AMD-optimized; the RX 5700 XT, for example, clearly runs better against NV cards in ACV than is the case in most games.

I'm sorry, but there is no preordained order to how all GPUs should perform vs each other in different games. There isn't a law that a 5700 XT should never surpass a 2080. And if it does, it's not a cheat. It's their potential.

For ACV, those are the requirements.
 

Dampf

Member
I'm sorry, but there is no preordained order to how all GPUs should perform vs each other in different games. There isn't a law that a 5700 XT should never surpass a 2080. And if it does, it's not a cheat. It's their potential.

For ACV, those are the requirements.

ACV is just not optimized for Nvidia hardware; that is why the RDNA cards perform so well. Do you seriously think the 5700 XT can reach 2080 Ti performance through "potential"? Do you seriously believe a card that much more powerful can be matched just by your magic "potential" (and you don't even tell me what it is)? Spoiler alert: this card has none. This card actually has, let me make that loud and clear, zero potential because of the absence of DX12 Ultimate support, machine-learning inferencing acceleration and DirectStorage... unlike Turing, Ampere and RDNA2, which do have full DX12 Ultimate support and thus true potential to perform much better than they currently do in next-generation titles.
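For anyone wondering what "DX12 Ultimate support" actually covers: it's the bundle of DXR raytracing, mesh shaders, sampler feedback and variable rate shading. On PC you can query each of these through D3D12's CheckFeatureSupport. A minimal sketch (error handling trimmed, Windows only, link against d3d12.lib) that on a 5700 XT would report all four as unsupported:

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

int main() {
    // Create a device on the default adapter.
    ID3D12Device* device = nullptr;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_12_0,
                                 IID_PPV_ARGS(&device))))
        return 1;

    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {}; // raytracing (DXR) tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 opts6 = {}; // variable rate shading tier
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {}; // mesh shaders, sampler feedback
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6, &opts6, sizeof(opts6));
    device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7));

    // For each of these tier enums, 0 means "not supported".
    std::printf("Raytracing tier:       %d\n", opts5.RaytracingTier);
    std::printf("VRS tier:              %d\n", opts6.VariableShadingRateTier);
    std::printf("Mesh shader tier:      %d\n", opts7.MeshShaderTier);
    std::printf("Sampler feedback tier: %d\n", opts7.SamplerFeedbackTier);
    device->Release();
    return 0;
}
```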

I expect any DX12 Ultimate compatible card, even an RX 6300 or an RTX 3030, to outperform a 5700 XT in true next-generation titles in terms of visual quality and performance. That includes the Series S, which is also a DX12U compatible console.

And the PS5 is no different. In current and cross-generation titles without RT, the PS5 may be comparable to RDNA1, but once next gen kicks in and fully utilizes the advanced Geometry Engine (on par with Mesh Shaders), their own version of the Sampler Feedback technique and Sony's version of VRS to their full potential, any card from the RDNA1 generation is instantly dead in the water.

And even for cross-generation titles it's quite obvious. Let me assure you, a 5700 XT would never be able to run Spider-Man with RT at this performance, simply because RDNA1 lacks HW acceleration for raytracing.

The RDNA1 generation was a pure cash grab focusing on performance per price, but only in the short term, disregarding everything the industry is moving towards. I feel very sorry for everyone who fell for it.

AMD should just discontinue every RDNA1 card now and replace them with RDNA2 cards, because that is actually a good, future-proof architecture capable of rendering next-gen games.
 

v_iHuGi

Banned
Stop with such a useless example. ACV is AMD-optimized; the RX 5700 XT, for example, clearly runs better against NV cards in ACV than is the case in most games.

Games will be mostly optimized for PS5 starting next year. I know, what a SHOCK, right?

So the comparison is perfect.
 
The only true RT title is Spidey, and it looks phenomenal. Also, Cold War is basically a flat 4K60 with RT shadows maxed out.

He's claiming that based on AMD vs Nvidia RT comparisons, but we don't know whether or not the PS5 has custom RT hardware built into the system.

The raytracing in Spider-Man is very, very basic, especially in performance mode. Shadows also don't tax the hardware that much; that's why AMD performs really well in Dirt 5 (PC with RT shadows) and WoW (RT shadows).
 

assurdum

Banned
No, that's the case under ACV in their video, but with games better optimized for Nvidia it will be more like a 2070, for example. In any case, who really cares? This amount of resources will be so nice when put to good use by first-party devs :)
Uh, you can't know that. Or do you have some evidence for it? What you said is all presumption and could turn out differently.
 

assurdum

Banned
The raytracing in Spider-Man is very, very basic, especially in performance mode. Shadows also don't tax the hardware that much; that's why AMD performs really well in Dirt 5 (PC with RT shadows) and WoW (RT shadows).
The hell is that bullshit? WTF, it's very basic raytracing now? :messenger_tears_of_joy: Some developers have claimed the exact opposite anyway. And correct me if I'm wrong, but shadows in a mirror reflection via raytracing are quite taxing, from what I have understood.
 
Uh, you can't know that. Or do you have some evidence for it? What you said is all presumption and could turn out differently.

It's common sense. If Nvidia cards perform better than AMD cards in certain games, the consoles won't miraculously perform better for some reason.
 

assurdum

Banned
It's common sense. If Nvidia cards perform better than AMD cards in certain games, the consoles won't miraculously perform better for some reason.
The Series X is an AMD card. Why does it perform worse, then? :unsure: It's not common sense, it's just presumption around the TF/bandwidth numbers, nothing more. But I doubt the PS5 works like an Nvidia GPU.
 

Mr Moose

Member
The DF Valhalla vid was a bit pointless since both of the consoles use dynamic res; ~1440p is the lowest it goes (on PS5) and it reaches up to 4K. They were only testing a single cutscene.
 
The hell is that bullshit? WTF, it's very basic raytracing now? :messenger_tears_of_joy: Some developers have claimed the exact opposite anyway. And correct me if I'm wrong, but shadows in a mirror reflection via raytracing are quite taxing, from what I have understood.


It's not bullshit. The reflections in Spider-Man's 60fps mode are quarter resolution. They greatly reduce the number of objects being reflected, and they apply lower-resolution shadows in the reflection that are cut off at a close distance. The number of people in the reflection is reduced. I mean, it's not magic what they're doing. They have a pretty limited performance budget on consoles that they need to work with.
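To put "quarter resolution" in perspective: it usually means half the pixel count on each axis, so at 1920x1080 the reflections would be traced at about 960x540. That's roughly 518k reflection rays per frame instead of about 2.07 million, a 4x cut before any of the other reductions even kick in.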
 

assurdum

Banned
[screenshot: window reflection in Spider-Man]


Compared to what games like Control or Cyberpunk are doing, this is incredibly basic. It works for the game because usually you're not staring into windows but are slinging around the city at high speeds, but if you look closely it really doesn't look that great.
Don't push me to post a screenshot of WD Legion raytracing that looks as bad if not worse, please. Don't be that stupid and dishonest just because you have to downplay the PS5's performance.
 
Games will be mostly optimized for PS5 starting next year. I know, what a SHOCK, right?

So the comparison is perfect.

You really think 3rd-party games will be optimized to run better against PC games running on Nvidia GPUs? The PS4 and Xbox One were already using AMD APUs, and we clearly saw how that went against the Nvidia GPUs of the time, haha.
 

assurdum

Banned
It's not bullshit. The reflections in Spider-Man's 60fps mode are quarter resolution. They greatly reduce the number of objects being reflected, and they apply lower-resolution shadows in the reflection that are cut off at a close distance. The number of people in the reflection is reduced. I mean, it's not magic what they're doing. They have a pretty limited performance budget on consoles that they need to work with.
Many if not most raytracing solutions, even on PC, use quarter resolution.
 

Mr Moose

Member
[screenshot: window reflection in Spider-Man]


Compared to what games like Control or Cyberpunk are doing, this is incredibly basic. It works for the game because usually you're not staring into windows but are slinging around the city at high speeds, but if you look closely it really doesn't look that great.
*Finds the muddiest window* Yup, that looks shit enough to take a screenshot to prove a point. Isn't that also the 60fps RT mode?
Cyberpunk has RT on consoles?
 
*Finds the muddiest window* Yup, that looks shit enough to take a screenshot to prove a point. Isn't that also the 60fps RT mode?
Cyberpunk has RT on consoles?

It's muddy by design; it has to be, because the RT needs to be very basic, otherwise the game would run like shit. Fidelity mode looks better, but not by much. It's still at quarter resolution and missing a lot of detail. And even fidelity mode only has RT reflections and nothing else, afaik. It's basic.
 

v_iHuGi

Banned
You really think 3rd-party games will be optimized to run better against PC games running on Nvidia GPUs? The PS4 and Xbox One were already using AMD APUs, and we clearly saw how that went against the Nvidia GPUs of the time, haha.

No, but games will make better use of the PS5's capabilities, and therefore of next-gen AMD GPUs too.
 
The Series X is an AMD card. Why does it perform worse, then? :unsure: It's not common sense, it's just presumption around the TF/bandwidth numbers, nothing more. But I doubt the PS5 works like an Nvidia GPU.

The point is that you can't really compare the PS5 with an Nvidia GPU in ONE game and claim that PS5 = 2080 Super. The results will change depending on whether the game is more optimized for AMD or Nvidia, that's all.
And don't come saying that the consoles use AMD so games will be better optimized for AMD. That's false => the previous generation showed that was not the case.
 

v_iHuGi

Banned
The raytracing in Spider-Man is very, very basic, especially in performance mode. Shadows also don't tax the hardware that much; that's why AMD performs really well in Dirt 5 (PC with RT shadows) and WoW (RT shadows).

Yeah, so? Rasterization will still be widely used, and there AMD is already crushing Nvidia in many games, and the PS5 is smashing the 2070.

We all know AMD is a gen behind in RT tech, nothing new.
 

Mr Moose

Member
It's muddy by design; it has to be, because the RT needs to be very basic, otherwise the game would run like shit. Fidelity mode looks better, but not by much. It's still at quarter resolution and missing a lot of detail. And even fidelity mode only has RT reflections and nothing else, afaik. It's basic.
[screenshot: Marvel's Spider-Man: Miles Morales]

And the ~1440p/60fps mode:
[screenshot: Marvel's Spider-Man: Miles Morales, ~1440p/60fps]

Compared to:
[screenshot: Watch Dogs Legion]

I know which one I prefer, and it ain't Watch_Dogs.

The image you chose was muddy by design because that's a muddy window.
 
[screenshot: Marvel's Spider-Man: Miles Morales]

And the ~1440p/60fps mode:
[screenshot: Marvel's Spider-Man: Miles Morales, ~1440p/60fps]

Compared to:
[screenshot: Watch Dogs Legion]

I know which one I prefer, and it ain't Watch_Dogs.

The image you chose was muddy by design because that's a muddy window.

I think in this case you are comparing the art design more than the RT level. The RT in WDL uses higher-quality settings, but I agree on one point: it's put to better use by Insomniac in Spider-Man.
 

Mr Moose

Member
I think in this case you are comparing the art design more than the RT level. The RT in WDL uses higher-quality settings, but I agree on one point: it's put to better use by Insomniac in Spider-Man.
Looks like crap in Watch_Dogs on anything that isn't a puddle IMO. Especially on the cars.
 