
IGN/NXGamer: Spider-Man: Miles Morales PC vs PS5 vs Steam Deck Performance Review

ChiefDada

Gold Member


TL;DW

  • PC specs - GPU: RX 6800 (2.3GHz, 16GB); CPU: Ryzen 5 5600X 6c/12t (4.8GHz); 32GB DDR4; 6GB/s PCIe 4.0 SSD
  • PC CPU bottlenecks are primarily caused by BVH construction and data streaming/decompression
  • The PS5 and the RX 6800 trade the performance lead during the early Rhino chase sequence (CPU bound), with a performance profile similar to the PS5's Performance/non-RT mode
  • The RX 6800 is 33% faster than the PS5 in the 4K/VRR mode, even with the High RT Shadows preset enabled (a more GPU-dependent scenario)
  • Open-world traversal sections tend to be CPU and data-throughput bound; closed/on-rails sections tend to be GPU bound
  • Steam Deck performance is great, holding above 30fps with sensible cuts to visuals
  • RT shadows on PC have a ~10% performance impact, and in NX's opinion the visual uplift isn't worth the sacrifice
  • Insomniac's reconstruction tech is much worse on PC than on PS5 (possibly broken)
  • Even with 16GB of VRAM, the PC often fails to resolve higher texture quality than the PS5

Imo, this performance review of Spider-Man: Miles Morales uses much better methodology than DF's.
 



The game is not VRAM bottlenecked; however, it is CPU bound.

Even the RTX 3070 beats the RX 6800 in this game.

Which means the RTX 3070 is 45% faster than the PS5 in this game, and the RTX 3080 is 70% faster.
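Presumably those percentages come from chaining relative ratios against NXGamer's 33% figure. A minimal sketch of that arithmetic; the PC-side leads over the 6800 here are hypothetical placeholders, not measured data:

```cpp
// Hypothetical sketch: how "X% faster than PS5" claims compound from ratios.
// Only the 33% figure is from the video; the other two leads are placeholders.
#include <cstdio>

int main() {
    const double rx6800_vs_ps5   = 1.33; // NXGamer: RX 6800 ~33% faster than PS5 (4K/VRR)
    const double rtx3070_vs_6800 = 1.09; // placeholder: assumed 3070 lead over the 6800
    const double rtx3080_vs_6800 = 1.28; // placeholder: assumed 3080 lead over the 6800

    // Relative ratios multiply: 1.33 * 1.09 ≈ 1.45, i.e. "45% faster than PS5".
    std::printf("3070 vs PS5: +%.0f%%\n", (rx6800_vs_ps5 * rtx3070_vs_6800 - 1.0) * 100.0);
    std::printf("3080 vs PS5: +%.0f%%\n", (rx6800_vs_ps5 * rtx3080_vs_6800 - 1.0) * 100.0);
    return 0;
}
```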


 

ChiefDada

Gold Member
The game is not VRAM bottlenecked; however, it is CPU bound.

Lol, the game can certainly be VRAM bottlenecked if you want the highest quality textures.

Even the RTX 3070 beats the RX 6800 in this game.

Which means the RTX 3070 is 45% faster than the PS5 in this game, and the RTX 3080 is 70% faster.

Yeah, I'm hoping to avoid having this thread devolve into a shit show like the other one, so I'll just say you shouldn't draw conclusions if you can't compare apples-to-apples settings, and the benchmarks from that link don't aim to benchmark PS5 settings.
 

Fredrik

Member
Would’ve been fun to see it on a better PC and ultrawide etc, but some day I’ll get this and max it out myself anyway. It’s one of my favorites among Sony’s games; highly underrated imo.
 

MidGenRefresh

*Refreshes biennially
All I can say is that this game runs fine on 4090.

Also, the DLSS 3 implementation is super impressive. I don't know what they did, but it's a big step up from the OG Spider-Man.
 

Fredrik

Member
All I can say is that this game runs fine on 4090.

Also, the DLSS 3 implementation is super impressive. I don't know what they did, but it's a big step up from the OG Spider-Man.
Nice! It wasn’t too long ago that I played through it on PS5, but I’ll get to it eventually on PC. I got the PS5 version on PS+, so no hard feelings paying for it on PC. I see the mod community is already up and running.
 

MidGenRefresh

*Refreshes biennially
Nice! It wasn’t too long ago that I played through it on PS5, but I’ll get to it eventually on PC. I got the PS5 version on PS+, so no hard feelings paying for it on PC. I see the mod community is already up and running.

I can't force myself to play it for some reason. I platinumed the original game on PS4 but quickly dropped Miles Morales on the PS5. And now on PC, it's the same story. According to Steam I played for 3 hours. The game looks great and feels great to play (180+ frames or something ridiculous like that), but I think I'm burned out on the formula. I hope Spider-Man 2 shakes things up in a substantial way.
 

Fredrik

Member
I can't force myself to play it for some reason. I platinumed the original game on PS4 but quickly dropped Miles Morales on the PS5. And now on PC, it's the same story. According to Steam I played for 3 hours. The game looks great and feels great to play (180+ frames or something ridiculous like that), but I think I'm burned out on the formula. I hope Spider-Man 2 shakes things up in a substantial way.
I never played the original, but I thought Miles Morales was fantastic. I played it when the new PS+ launched this year, and it’s literally one of my favorites of the year, at this point even above GOWR, which I also love. It got a bit bloaty by the end, but that’s about it. Action sequences can get insanely complex with all the things you can do, and I really appreciate the freedom and how they let you experiment and do things your own way; if you just want to be a spider ninja, you can totally be that. I enjoyed it more than I ever thought I would.
But, no burnout problems for me of course.
 

Resenge

Member
All I can say is that this game runs fine on 4090.
 

01011001

Banned
People discovered that AMD cards run this game really badly for no apparent reason.

With or without RT, they are far behind Nvidia cards, way more than usual.
 

Gaiff

Member
The game is not VRAM bottlenecked; however, it is CPU bound.
Would you put your life on it?

Let's start at 1080p non-RT. Here is the 3070 8GB vs the 11GB 2080 Ti on pcgameshardware.de.

[chart: 1080p non-RT, 3070 vs 2080 Ti (pcgameshardware.de)]
Neck and neck, as they should be, in both averages and 1% lows.

Now look at the 1% lows on computerbase at 1080p non-RT.

[chart: 1080p non-RT, 1% lows (computerbase)]
Extremely similar to pcgameshardware.de, which had 1% lows of 81fps. The 3080 is 22% faster than the 3070. This is within the expected differential.

Now let's move to 1440p non-RT.
[chart: 1440p non-RT]
The gap widened from 22% to 38% in favor of the 3080.

Now let us take a look at 4K non-RT.
[chart: 4K non-RT, averages]
Remember, the gap at 1080p was 22%, which is normal. At 4K without RT, it has suddenly ballooned to 34% in favor of the 3080.

What about the 1% lows at 4K non-RT?
[chart: 4K non-RT, 1% lows]
Now it's 61% in favor of the 3080, up from 38% at 1440p.

And last but not least, 4K RT/max settings.
[chart: 4K max settings with RT]
Now the 2080 Ti is beating the 3070 by 14% (when it was a tad worse at 1080p) on average, with 25% higher 1% lows. Notice also how the 3080 10GB is on average 25% faster than the 2080 Ti, as it should be, but its 1% lows are only 15% higher. The 3080 is also a whopping 42% faster on average, with 44% higher 1% lows, than the 3070. I didn't use computerbase here because their 4K benchmark includes DLSS Quality.

tl;dr: the 3070's 8GB frame buffer starts exhibiting issues at resolutions as low as 1440p and crumbles at 4K. It's very likely a VRAM bottleneck, seeing that the otherwise equally-matched 2080 Ti exhibits no such problems.
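For anyone unfamiliar with how the averages and 1% lows above are derived: both come from frame-time captures. A minimal sketch with made-up frame times (not the charts' data), using one common definition of the 1% low as the fps equivalent of the 99th-percentile frame time. Note how a single stutter tanks the low while barely moving the average, which is exactly the VRAM-pressure signature:

```cpp
// Sketch: deriving average fps and "1% low" fps from a frame-time capture.
// Frame times here are hypothetical; a real capture has thousands of samples.
#include <algorithm>
#include <cstdio>
#include <vector>

int main() {
    std::vector<double> ft = {8.3, 8.4, 8.2, 9.0, 8.3, 25.0, 8.5, 8.4, 8.3, 8.6}; // ms

    double sum = 0.0;
    for (double t : ft) sum += t;
    double avg_fps = 1000.0 * ft.size() / sum;

    std::sort(ft.begin(), ft.end());
    // Approximate 99th-percentile frame time, converted to fps.
    double p99_ms = ft[static_cast<size_t>(ft.size() * 0.99)];
    double low1_fps = 1000.0 / p99_ms;

    // One 25ms hitch: average stays ~99fps, but the 1% low collapses to 40fps.
    std::printf("avg: %.1f fps, 1%% low: %.1f fps\n", avg_fps, low1_fps);
    return 0;
}
```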
 

01011001

Banned
Would you put your life on it? [...]

tl;dr: the 3070's 8GB frame buffer starts exhibiting issues at resolutions as low as 1440p and crumbles at 4K. It's very likely a VRAM bottleneck, seeing that the otherwise equally-matched 2080 Ti exhibits no such problems.

This has been shown time and time again, but people deny reality, so don't even bother trying to convince anyone.
 
People discovered that AMD cards run this game really badly for no apparent reason.

With or without RT, they are far behind Nvidia cards, way more than usual.
All Sony games run better on Nvidia compared to AMD.

God of War, Horizon Zero Dawn, and both Spider-Man games all run better on Nvidia compared to AMD.

Only Uncharted runs about the same on both. It doesn't favor either brand.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
Already beat Spider-Man, and Miles is one of the great open-world games. Definitely allow yourself to enjoy the scenery and not worry about side quests.
 

SmokedMeat

Gamer™
Spider-Man Remastered plays fantastic on my PC, and runs great on Steam Deck. I'm upgrading my CPU this Christmas, so I should be seeing a big jump in performance.

I’ll check out Miles Morales eventually when the price drops.
 

ChiefDada

Gold Member
Are the settings even the same between PS5 and PC? I doubt the PS5 runs everything maxed out.

Proper benchmarking tests would prioritize matched settings, though it will never be a perfect like-for-like. Playing on PC doesn't entitle you to max settings. Not even a 4090 can maintain 4K60 at max settings with this PS4 game.
 

ChiefDada

Gold Member
Based on what?

For starters, he matches the PS5's fidelity, performance, and performance RT modes much more closely. Notice that at the 1:50 mark he has matched the PS5 fidelity VRR preset as closely as possible. Look at the VRAM usage pushing beyond 12GB. That means the 3080 is incapable of running at PS5 settings due to a VRAM bottleneck. If you want to benchmark against the PS5, then let the PC card run as-is and acknowledge why performance suffers due to the console's data-management advantage. DF instead chooses to reduce textures and still claims its optimized settings are comparable to PS5. Making PC settings less favorable to PS5 strengths without full disclosure reeks of a lack of objectivity.

Secondly, he highlights the heterogeneous nature of the performance limitations across different sections of the game (CPU/data vs GPU limitations). This is very important context and something DF failed to point out, which left many confused about the reasoning behind the performance readings. Again, objectivity comes into play, this time in favor of the PC, where we see sequences leaning heavily on GPU power let the 6800 breeze past the PS5. That is why I view his methodology as better and more objective: he shows performance comparisons in scenarios that both favor and exploit the weaknesses of each platform.
 

Gaiff

Member
For starters, he matches the PS5's fidelity, performance, and performance RT modes much more closely. Notice that at the 1:50 mark he has matched the PS5 fidelity VRR preset as closely as possible. Look at the VRAM usage pushing beyond 12GB. That means the 3080 is incapable of running at PS5 settings due to a VRAM bottleneck.
Somebody doesn't know the difference between allocated VRAM and VRAM that's actually needed. The 3080 has no issue running PS5 settings at 4K. It doesn't run into a VRAM bottleneck. The 3070 might.
If you want to benchmark against the PS5, then let the PC card run as-is and acknowledge why performance suffers due to the console's data-management advantage. DF instead chooses to reduce textures and still claims its optimized settings are comparable to PS5. Making PC settings less favorable to PS5 strengths without full disclosure reeks of a lack of objectivity.
That's a two-way street. The console runs at 8x, and even a paltry 4x, AF, whereas even the lowest of low-end PC GPUs has no problem running 16x AF. The PS5 is a console with pre-configured settings. Unlike the PC, it's not possible to force it to run beyond what it's capable of handling. It's not exactly objective to complain about DF lowering textures, but then turn around and ignore that the PS5 can't even run 16x AF, or that the PC cannot drop some settings below the lowest available preset.
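On the allocated-vs-needed point: Windows itself only exposes allocation-style counters, so overlays report what a process has committed, not what a frame actually needs; the latter can only be inferred from performance. A minimal sketch of the DXGI query such tools typically build on (real API, error handling simplified):

```cpp
// Sketch: DXGI reports how much VRAM the process has committed (CurrentUsage)
// and how much the OS will let it use (Budget). Neither is "VRAM needed".
#include <dxgi1_4.h>
#include <wrl/client.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

using Microsoft::WRL::ComPtr;

int main() {
    ComPtr<IDXGIFactory4> factory;
    if (FAILED(CreateDXGIFactory1(IID_PPV_ARGS(&factory)))) return 1;

    ComPtr<IDXGIAdapter1> adapter;
    if (FAILED(factory->EnumAdapters1(0, &adapter))) return 1; // first GPU

    ComPtr<IDXGIAdapter3> adapter3;
    if (FAILED(adapter.As(&adapter3))) return 1;

    DXGI_QUERY_VIDEO_MEMORY_INFO info = {};
    adapter3->QueryVideoMemoryInfo(0, DXGI_MEMORY_SEGMENT_GROUP_LOCAL, &info);

    std::printf("budget: %.2f GB\n", info.Budget / 1e9);
    std::printf("in use: %.2f GB\n", info.CurrentUsage / 1e9);
    return 0;
}
```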
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Would’ve been fun to see it on a better PC and ultrawide etc, but some day I’ll get this and max it out myself anyway. It’s one of my favorites among Sony’s games; highly underrated imo.
Where?
 

Gaiff

Member
Lol, you can't make this shit up! Fucking amazing! Here's your reading for tonight; educate yourself:


[DF benchmark charts: 3080 at PS5 settings vs 6800 XT]


DF ran the game on a 3080 at PS5 settings vs a 6800 XT and the 3080 beat the 16GB card. It didn't run into any VRAM bottleneck.

They also state the following:
If you've got a 6GB GPU, stick to high textures, rising to 8GB for very high - but you might consider avoiding higher RT settings in this scenario. A GPU with 10GB or more should be good for anything the game throws at you, even with maxed out ray tracing features.
Before acting like a smug asshole, get your facts straight.
 

ChiefDada

Gold Member


DF ran the game on a 3080 at PS5 settings vs a 6800 XT and the 3080 beat the 16GB card. It didn't run into any VRAM bottleneck.

They also state the following:

Before acting like a smug asshole, get your facts straight.


Bruh...

You see a 6800 XT keeping up with a 3080 at NATIVE 4K w/ HIGH RT and that didn't trigger any sort of lightbulb in your brain!!!??? YES, the Nvidia GPU is VRAM bottlenecked and DF got it wrong! I know it's difficult for you to comprehend that your authority gods at DF initially got it wrong. Oh, the HORROR!!!
 

Gaiff

Member
Bruh...

You see a 6800 XT keeping up with a 3080 at NATIVE 4K w/ HIGH RT and that didn't trigger any sort of lightbulb in your brain!!!??? YES, the Nvidia GPU is VRAM bottlenecked and DF got it wrong! I know it's difficult for you to comprehend that your authority gods at DF initially got it wrong. Oh, the HORROR!!!
And this is getting even more pathetic. A VRAM bottleneck causes frame spikes, stutters, and other issues. It doesn't stay perfectly smooth like it does in the video. Look at the frame time graph during the comparison. Where is the VRAM bottleneck?

You literally made this shit up to save face.
[DF chart: 4K, Very High RT]
Oh look, you increase the RT workload and the 3080 utterly mops the floor with the 6800 despite the VRAM pressure being even higher with Very High RT. That's because, as Richard states, the geometry detail is very low on High RT and the load is relatively modest.

You're wrong and have been since your very first post. The fact that you didn't even know WTF allocated VRAM usage is or how it's calculated on PC was telling enough.

Yeah, the guys who tested the game with matching setups and tools to measure fps and frame times are wrong, not the random Joe Schmoe who doesn't even own a gaming PC and has been making ridiculous claims and getting shut down for over a week lol.
 

ChiefDada

Gold Member
Oh look, you increase the RT workload and the 3080 utterly mops the floor with the 6800 despite the VRAM pressure being even higher with Very High RT. That's because, as Richard states, the geometry detail is very low on High RT and the load is relatively modest.

Did you not hear him say this particular area represents the absolute worst-case scenario for AMD with RT, as in a fringe case? Or did you choose to ignore that? Do you play games standing still? Earth to Gaiff! Is any of this getting through to you? Keep fighting the good fight lol.
 

Gaiff

Member
Did you not hear him say this particular area represents the absolute worst-case scenario for AMD with RT, as in a fringe case? Or did you choose to ignore that? Do you play games standing still? Earth to Gaiff! Is any of this getting through to you? Keep fighting the good fight lol.
Changes fuck all. This just destroys your argument that it's bottlenecked by the VRAM. Now please, point to where DF states it's VRAM-bottlenecked, or show it on the frame time graph lol. You don't even know what a VRAM bottleneck looks like, and you literally avoided the argument to go on an irrelevant tangent.

You might wanna sit this one out. You're embarrassing yourself.
 

Fredrik

Member
From my perspective, everywhere. I don’t think it’s fair to rate it lower for being similar to the first one. A great game is a great game even if it’s similar to another great game, and Miles is 60fps and has improved and refined gameplay over the first one. You don’t rate GOWR lower than GOW 2018 either. Things have improved. The score should go up or at the very least stay the same.
 

SmokedMeat

Gamer™
Based on what?

Based on his bias towards PlayStation. It’s beyond obvious and the entire reason he created this thread.

He's literally a PlayStation fanboy and armchair PC performance expert who thinks he knows more than DF. He doesn’t even play on PC, which makes it even funnier.
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
From my perspective, everywhere. I don’t think it’s fair to rate it lower for being similar to the first one. A great game is a great game even if it’s similar to another great game, and Miles is 60fps and has improved and refined gameplay over the first one. You don’t rate GOWR lower than GOW 2018 either. Things have improved. The score should go up or at the very least stay the same.
I think you suffer from what the majority of people in here suffer from.

Giving a shit about MC.

It's not close to being underrated in here; people talked a lot about it. And I'm sure this is not the only place.

I know you can say a game is overrated or underrated thanks to a certain score, but is it really underrated when it sells well and people play it?
 

winjer

Gold Member
Bruh...

You see a 6800 XT keeping up with a 3080 at NATIVE 4K w/ HIGH RT and that didn't trigger any sort of lightbulb in your brain!!!??? YES, the Nvidia GPU is VRAM bottlenecked and DF got it wrong! I know it's difficult for you to comprehend that your authority gods at DF initially got it wrong. Oh, the HORROR!!!

The reason the 6800 XT is keeping up with the 3080 with RT on is that Spider-Man wasn't using the BVH acceleration units in Ampere. It was doing that work on the CPU.
RDNA2 does not have units for accelerating BVH traversal, so when a game doesn't use those units in Ampere, of course the 6800 XT will narrow the gap to the 3080.

Here is a graph from the release version showing memory usage.
As you can see, the game is only using 10.5GB of the 16GB that the 6900 XT has.
So no, this game does not have a performance problem on the 3080 because of it only having 10GB.

[chart: VRAM usage at release, 6900 XT]
 

winjer

Gold Member
And here is the VRAM usage for Spider-Man: Miles Morales, which uses a slightly improved engine.
Just look at the 4090, with 24GB of VRAM, only using 9.1GB with RT at 4K.
So once again, the RTX 3080 does not have its performance limited by VRAM.

[chart: Miles Morales VRAM usage, RT at 4K]
 

ToTTenTranz

Banned
The reason the 6800 XT is keeping up with the 3080 with RT on is that Spider-Man wasn't using the BVH acceleration units in Ampere. It was doing that work on the CPU.

I doubt this is true, as I'm not even sure it's possible.

The game engine calls the DXR libraries, which then automatically use whatever acceleration is available on the hardware. I don't think you can write raytracing code in a game engine that bypasses selected hardware-accelerated stages.
You can design a feature/scene in a way that taxes a certain accelerated stage more (or less), and there are RT stages that do run faster on AMD hardware, but what you're suggesting is a completely different thing.
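For reference, this is roughly the level DXR exposes to an application: the CPU describes the geometry and records the BVH ("acceleration structure") build onto a GPU command list, and the driver decides which hardware units execute it. A simplified sketch, omitting buffer allocation, scratch sizing, and barriers:

```cpp
// Sketch of the DXR path described above: the app cannot pick which RT stages
// run on dedicated units; it only prepares inputs and records the build.
#include <d3d12.h>

void BuildBlas(ID3D12GraphicsCommandList4* cmdList,
               const D3D12_RAYTRACING_GEOMETRY_DESC& geometry,
               D3D12_GPU_VIRTUAL_ADDRESS scratch,
               D3D12_GPU_VIRTUAL_ADDRESS result)
{
    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_INPUTS inputs = {};
    inputs.Type = D3D12_RAYTRACING_ACCELERATION_STRUCTURE_TYPE_BOTTOM_LEVEL;
    inputs.DescsLayout = D3D12_ELEMENTS_LAYOUT_ARRAY;
    inputs.NumDescs = 1;
    inputs.pGeometryDescs = &geometry;

    D3D12_BUILD_RAYTRACING_ACCELERATION_STRUCTURE_DESC desc = {};
    desc.Inputs = inputs;
    desc.ScratchAccelerationStructureData = scratch;
    desc.DestAccelerationStructureData = result;

    // The build itself executes on the GPU; which units do the work is up to
    // the driver/hardware, not the application.
    cmdList->BuildRaytracingAccelerationStructure(&desc, 0, nullptr);
}
```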
 

Thebonehead

Banned
I doubt this is true, as I'm not even sure it's possible.

The game engine calls the DXR libraries, which then automatically use whatever acceleration is available on the hardware. I don't think you can write raytracing code in a game engine that bypasses selected hardware-accelerated stages.
You can design a feature/scene in a way that taxes a certain accelerated stage more (or less), and there are RT stages that do run faster on AMD hardware, but what you're suggesting is a completely different thing.

I give you .....

[image]
 

winjer

Gold Member
I doubt this is true, as I'm not even sure it's possible.

The game engine calls the DXR libraries, which then automatically use whatever acceleration is available on the hardware. I don't think you can write raytracing code in a game engine that bypasses selected hardware-accelerated stages.
You can design a feature/scene in a way that taxes a certain accelerated stage more (or less), and there are RT stages that do run faster on AMD hardware, but what you're suggesting is a completely different thing.

You forget this is a PS5 port with a custom engine. Not everything is done by calling into DirectX.

 

ToTTenTranz

Banned
I give you .....

[image]
As opposed to Nvidia-featured games that are tailored to Nvidia hardware? Ever heard of Control and Cyberpunk?
What else is new?



You forget this is a PS5 port with a custom engine. Not everything is done by calling into DirectX.


The CPU is always involved in BVH management. The bounding volumes are attached to the geometry, which is sent to the GPU by the CPU. Even more so with RT on, because it needs to load the bounding volumes for objects that are off-screen. There's a spike in CPU activity in all RT games. Why would Miles Morales be any different?
The bigger difference with a game like Spider-Man is the sheer number of new objects popping onto the screen and the constantly changing scenery, so of course more CPU<->GPU bandwidth is needed.

None of this suggests that Nixxes somehow wrote specific code that stops RTX cards from using their full potential. They were also involved in the PC RT implementations of Shadow of the Tomb Raider and Avengers, both of which were Nvidia RTX-featured.


Miles Morales was made for RDNA2 first and foremost, and that's why its RT implementation drops less on AMD hardware (probably fewer ray bounces by design). Thinking it's somehow blocking Nvidia's hardware-accelerated stages is nonsense.
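A minimal sketch of that per-frame CPU work under DXR: as objects stream in and out, the CPU rewrites the top-level instance list (off-screen geometry included) before the GPU rebuilds the TLAS. The D3D12 types are real; the scene structure is hypothetical:

```cpp
// Sketch: per-frame CPU side of RT in a streaming open world. In a fast-moving
// city this list churns every frame, raising CPU and PCIe load.
#include <d3d12.h>
#include <vector>
#include <cstring>

struct SceneObject {
    float transform[3][4];          // world matrix, row-major 3x4 (hypothetical)
    D3D12_GPU_VIRTUAL_ADDRESS blas; // that object's bottom-level BVH
};

void FillInstanceDescs(const std::vector<SceneObject>& objects,
                       D3D12_RAYTRACING_INSTANCE_DESC* mappedUploadBuffer)
{
    for (size_t i = 0; i < objects.size(); ++i) {
        D3D12_RAYTRACING_INSTANCE_DESC d = {};
        std::memcpy(d.Transform, objects[i].transform, sizeof(d.Transform));
        d.InstanceID = static_cast<UINT>(i);
        d.InstanceMask = 0xFF;      // visible to all rays
        d.AccelerationStructure = objects[i].blas;
        mappedUploadBuffer[i] = d;  // CPU write into an upload heap
    }
    // A TLAS rebuild recorded afterwards consumes this array on the GPU.
}
```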
 

winjer

Gold Member
The CPU is always involved in BVH management. The bounding volumes are attached to the geometry, which is sent to the GPU by the CPU. Even more so with RT on, because it needs to load the bounding volumes for objects that are off-screen. There's a spike in CPU activity in all RT games. Why would Miles Morales be any different?
The bigger difference with a game like Spider-Man is the sheer number of new objects popping onto the screen and the constantly changing scenery, so of course more CPU<->GPU bandwidth is needed.

None of this suggests that Nixxes somehow wrote specific code that stops RTX cards from using their full potential. They were also involved in the PC RT implementations of Shadow of the Tomb Raider and Avengers, both of which were Nvidia RTX-featured.


Miles Morales was made for RDNA2 first and foremost, and that's why its RT implementation drops less on AMD hardware (probably fewer ray bounces by design). Thinking it's somehow blocking Nvidia's hardware-accelerated stages is nonsense.

We have a few clues. One is that the difference between RDNA2 and Ampere remains similar with and without RT in Spider-Man, while in other games using RT the difference is much bigger.
The other is that the amount of data transferred between CPU and GPU over the PCIe bus ramps up much more than in other games using RT.
And finally, because Nixxes said as much to Digital Foundry, it being one of the reasons this game is so heavy on the CPU. At minute 21:35.

 

Mister Wolf

Member
We have a few clues. One is that the difference between RDNA2 and Ampere remains similar with and without RT in Spider-Man, while in other games using RT the difference is much bigger.
The other is that the amount of data transferred between CPU and GPU over the PCIe bus ramps up much more than in other games using RT.
And finally, because Nixxes said as much to Digital Foundry, it being one of the reasons this game is so heavy on the CPU. At minute 21:35.



So Nvidia GPUs have to be handicapped because AMD is shit at doing raytracing.
 