Marvel's Spider-Man: Miles Morales for PC | Review Thread

Gaiff

Member
This video has got to be a joke. A 2070 Super running 1080p at 60-70fps with optimized settings that are overall lower than the PS5's (including lower RT object range) = console parity???
And that's coming from the person who was dismissively comparing PS5 settings vs PC with RT shadows, hand-waving the performance impact of RT shadows when, according to the metrics, going from no RT to Very High RT shadows can impact performance by 30%.
I'd wait for NX Gamer's vid
There's an edit feature. No need to post 10x in a row.
 
Last edited:

Stuart360

Member
It seems the extra time Nixxes had to work on this has done wonders. The game is basically half as CPU intensive as the first game on my 3700X. 25% CPU usage at a locked 60fps when swinging through the city, compared to 50% CPU usage and random drops into the 50s with the first game. The game still only uses 8 cores like the first game, but thank god they sorted out the CPU usage and streaming problems the first game had.

There is a trade-off though, and that's that the game is more GPU intensive now. With FSR I could run the first game at 4K/60 on my 1080 Ti, but I have to run this game at 1440p/60 with FSR. It still looks great and, to be honest, the trade-off is more than worth it, as a locked 60fps is better to me than slightly sharper visuals (plus FSR 2.1 looks super sharp anyway).

Good work Nixxes, now let's try and implement these CPU changes into the first game please.

Oh, and I said I wouldn't be buying this earlier in the thread, but I managed to find a key for £22 so thought I'd give it a shot.

EDIT. Just skimmed over the thread and saw a lot of warring by Sony fans and stuff. I'd just say to them, and to any PC users having CPU usage problems with this game: I'm on a 1080 Ti, so obviously I'm not using ray tracing. CPU usage for me without ray tracing has been halved from the first game, and I'm only on a midrange 3700X. So high CPU usage problems must be ray tracing related.
 
Last edited:

ChiefDada

Member
And that's coming from the person who was dismissively comparing PS5 settings vs PC with RT shadows, hand-waving the performance impact of RT shadows when, according to the metrics, going from no RT to Very High RT shadows can impact performance by 30%.

Lol, my point still stands. A last-gen game is maxing out the VRAM limits of high-end PC cards. RT shadows look great on PC, but that is not the main reason why performance is relatively low considering GPU compute specs. Even at the 5-minute mark of the video, where he is playing at 1440p with high RT, he is averaging in the 70s (occasionally dipping to 60 and below) with 8.5 GB of VRAM usage, whereas the PS5's 1440p Performance RT mode WITH the highest quality textures remains comfortably in the 80-90fps range.




67 to 61 is a 9% difference. A drop-off of ~5 frames with a PS5-equivalent CPU. The player in the video I referenced was using a 3080, not a 2070 Super. Now STFU for Christ's sake. You continue to retrieve answers from a higher authority without any logical input from yourself or other contextual contribution. I can accept the former, but the latter is vomit-inducing and extremely low effort.
 
It seems the extra time Nixxes had to work on this has done wonders. The game is basically half as CPU intensive as the first game on my 3700X. 25% CPU usage at a locked 60fps when swinging through the city, compared to 50% CPU usage and random drops into the 50s with the first game. The game still only uses 8 cores like the first game, but thank god they sorted out the CPU usage and streaming problems the first game had.

There is a trade-off though, and that's that the game is more GPU intensive now. With FSR I could run the first game at 4K/60 on my 1080 Ti, but I have to run this game at 1440p/60 with FSR. It still looks great and, to be honest, the trade-off is more than worth it, as a locked 60fps is better to me than slightly sharper visuals (plus FSR 2.1 looks super sharp anyway).

Good work Nixxes, now let's try and implement these CPU changes into the first game please.

Oh, and I said I wouldn't be buying this earlier in the thread, but I managed to find a key for £22 so thought I'd give it a shot.

EDIT. Just skimmed over the thread and saw a lot of warring by Sony fans and stuff. I'd just say to them, and to any PC users having CPU usage problems with this game: I'm on a 1080 Ti, so obviously I'm not using ray tracing. CPU usage for me without ray tracing has been halved from the first game, and I'm only on a midrange 3700X. So high CPU usage problems must be ray tracing related.
Yes, because the RTX 3080 is performing exactly as it should. IMO it is performing 70-80% faster than the PC at the same fidelity mode settings.

10 GB of VRAM is enough for 4K + ray tracing in this game.






 

yamaci17

Member
And that's coming from the person who was dismissively comparing PS5 settings vs PC with RT shadows, hand-waving the performance impact of RT shadows when, according to the metrics, going from no RT to Very High RT shadows can impact performance by 30%.

There's an edit feature. No need to post 10x in a row.
btw it's also not representative of actual 2070 Super performance. It is CPU limited there (by the Ryzen 3600) at around a 70fps average with 80% GPU utilization (so it would average 85-90fps at 1080p, and by extension it would actually still get a 60-70fps average at 1440p due to the GPU performance headroom being available)

that part is not even shown to showcase 2070S performance; it is a part to show the CPU-bound performance of the R5 3600 CPU

so the 2070S is not getting 70fps at 1080p, not at all; it will easily do 1440p/60 with that kind of headroom

(another golden logic nugget: comparing the CPU-bound impact of ray-traced shadows when the discussion is about their GPU-bound impact. Just great!)
 
Last edited:

ChiefDada

Member
that part is not even shown to showcase 2070S performance; it is a part to show the CPU-bound performance of the R5 3600 CPU

so the 2070S is not getting 70fps at 1080p, not at all; it will easily do 1440p/60 with that kind of headroom

WITH THE GPU UTILIZATION ALREADY AT 85%!!!???






Yeah, we're done here. Shame, I thought you were better than this.
 
btw it's also not representative of actual 2070 Super performance. It is CPU limited there (by the Ryzen 3600) at around a 70fps average with 80% GPU utilization (so it would average 85-90fps at 1080p, and by extension it would actually still get a 60-70fps average at 1440p due to the GPU performance headroom being available)

that part is not even shown to showcase 2070S performance; it is a part to show the CPU-bound performance of the R5 3600 CPU

so the 2070S is not getting 70fps at 1080p, not at all; it will easily do 1440p/60 with that kind of headroom

(another golden logic nugget: comparing the CPU-bound impact of ray-traced shadows when the discussion is about their GPU-bound impact. Just great!)
The PS5 is CPU limited as well, so next
 

yamaci17

Member
The PS5 is CPU limited as well, so next
You cannot really know that unless you see actual metrics from the hardware itself. Going by how the PS5 performs in its locked modes and the kinds of resolutions it drops to, it is QUITE clear and comprehensible that the VRR unlocked modes are GPU bound MOST of the time. And the fact that even the VRR unlocked modes still shift resolution means they are STILL GPU bound, in a state controlled by the developer.

next
 
Last edited:

yamaci17

Member
see people, this is why you don't do drugs


Based on the data here, the 2070 Super renders 104 frames at 1080p and 79 frames at 1440p in the very same engine.

At 1440p it retains roughly 75% of its 1080p performance (79/104 ≈ 0.76).

The RTX 2070 Super in DF's video and his "gotcha" picture is shown to hover around 85% GPU utilization at 1080p while getting 72 frames per second.
With a better CPU, the RTX 2070 Super would get around an 84 FPS average with that extra ~15% headroom (100/85 ≈ 1.17; 72 × 1.17 ≈ 84).

Since it retains about 75% of its 1080p performance at 1440p with ray tracing enabled, we would get... 84 × 0.75 = 63 FPS.

You're welcome.
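The chain of estimates above is just linear proportional scaling from the quoted benchmark numbers. Here it is spelled out as a quick sanity check (the linear-scaling assumption is rough; real frame rates rarely scale perfectly):

```python
# Benchmark figures quoted in the post (2070 Super, same engine, RT on)
fps_1080p_bench = 104
fps_1440p_bench = 79

# Fraction of 1080p performance retained at 1440p (~0.76)
res_scaling = fps_1440p_bench / fps_1080p_bench

# DF scene: 72 fps at 1080p, but only ~85% GPU utilization (CPU-limited),
# so the fully GPU-bound figure would be roughly 72 / 0.85 ≈ 85 fps
gpu_bound_1080p = 72 / 0.85

# Estimated GPU-bound 1440p RT performance for the 2070 Super
estimate_1440p = gpu_bound_1080p * res_scaling
print(round(estimate_1440p))  # ~64 fps, close to the post's 63 FPS figure
```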

No no, I THOUGHT you were better than this. You truly showed your true colors. But beyond that, you simply fail at making simple deductions from the given data. Bravo.

If this isn't enough, here are MATCHED PS5 settings (not optimized settings) on a 3070 at native 1440p:



Now... how fast is the 3070 compared to the 2070 Super? Around 35% faster.

So in this scene, with matched PS5 settings, the 2070 Super would be getting around 70 FPS.

You're welcome.

You're also invited to understand that Turing and Ampere scale better at higher resolutions, since their architectures are not as effective at low resolutions such as 1080p. But you probably didn't know that either.

Here another scene,





86 frames per second... so 64 FPS or so for the 2070 Super (86/1.35 ≈ 64).

YOU'RE very, very welcome.
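The cross-GPU estimate works the same way: divide the 3070's measured frame rate by its assumed speed advantage (the ~35% figure is the post's assumption, not a measured ratio):

```python
fps_3070 = 86        # measured: 3070, matched PS5 settings, native 1440p (from the post)
speed_ratio = 1.35   # assumption from the post: 3070 ~35% faster than a 2070 Super

# Estimated 2070 Super frame rate in the same scene
est_2070s = fps_3070 / speed_ratio
print(round(est_2070s))  # ~64 fps, matching the post's "64 FPS or so"
```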

These PS5-matched ray tracing settings do not even make any midrange RTX GPU sweat. The real deal starts with Very High ray-traced geometry, which, once again, most midrange and above RTX GPUs handle gracefully, the 2070 Super included, while having the best upscaler in tow in DLSS to push them even further.
 
Last edited:

sachos

Member

DF video.
Many, many redundant settings. Sure, you can change weather between low, medium, high, very high... but why? There is no performance impact.
Imo PC games should not do that. Do not offer lower graphics options if they do not bring a performance improvement.
Haven't watched the video yet; does Alex do a proper side-by-side vs the PS5 like he did with the recent Plague Tale video?
 

rofif

Member
Haven't watched the video yet; does Alex do a proper side-by-side vs the PS5 like he did with the recent Plague Tale video?
He only cut a few settings. He kinda avoids the console version, but that's also the goal of the video. He looks for similar settings.
 

Gaiff

Member


67 to 61 is a 9% difference. A drop-off of ~5 frames with a PS5-equivalent CPU. The player in the video I referenced was using a 3080, not a 2070 Super. Now STFU for Christ's sake. You continue to retrieve answers from a higher authority without any logical input from yourself or other contextual contribution. I can accept the former, but the latter is vomit-inducing and extremely low effort.
Look at the disingenuous twat. Alex in this scene is claiming that just turning RT shadows to Medium has a 9% cost on the CPU, which is why he DOES NOT recommend turning them on with mid-range hardware. This further cements that RT shadows are heavy on both the CPU and GPU. Alex says the exact OPPOSITE of what you claim.

Alex: I recommend them only for those with really high-end CPUs and GPUs.

You: RT shadows are not demanding. Vindicated.

Christ, you're insufferable. Your claims have been debunked and you haven't put forward an iota of proof for what you said. Quit wasting everyone's time with your warring nonsense.
 
Last edited:

phil_t98

Member
Loved the first one but felt like MM didn't have the same feel story-wise. Maybe a bit rushed, and it was too short. Looking forward to Spider-Man 2 though.
 

ChiefDada

Member
Look at the disingenuous twat. Alex in this scene is claiming that just turning RT shadows to Medium has a 9% cost on the CPU, which is why he DOES NOT recommend turning them on with mid-range hardware. This further cements that RT shadows are heavy on both the CPU and GPU. Alex says the exact OPPOSITE of what you claim.

Alex: I recommend them only for those with really high-end CPUs and GPUs.

The video I originally posted was testing 3080 benchmarks. Is a 3080 GPU mid-range to you?

You: RT shadows are not demanding. Vindicated.

Instead of putting words in my mouth, you can simply reference what I said. The only negative is that your mischaracterization gets exposed for everyone to see :messenger_hushed:

RT shadows, notorious for being relatively cheaper than other RT implementations.

Words matter. Again, in the context of the video I posted with the 3080, turning on RT shadows had a negligible impact on performance, meaning it is not the underlying cause of why the console is performing significantly better than the PC from a GPU/CPU/RAM spec comparison.


However, there is always a logical answer, and my position has always been the fundamental difference in how data is managed and moved throughout the entire system. A lot of you PC owners keep looking at individual components in a vacuum when that's not how it works.

I thought we could all silently agree to disagree after the back-and-forth yesterday, but if you wish to continue antagonizing, I have no problem defending my original comments from the slander.
 

Gaiff

Member
The video I originally posted was testing 3080 benchmarks. Is a 3080 GPU mid-range to you?
In a world with the 4090/4080 and the 7900 XT/XTX coming in a few weeks, yes. I'm obviously assuming that Alex still considers it high-end, but the point is that RT shadows are taxing; they're not negligible even on high-end hardware, as you seem to imply.
Instead of putting words in my mouth, you can simply reference what I said. The only negative is your mischaracterization gets exposed for everyone to see:messenger_hushed:
Which again was debunked. Even "relatively" speaking, it's false. RT shadows can be and often are more demanding than reflections. Reflections depend entirely on the number and roughness of surfaces. That shadows are relatively light is completely false no matter how you slice it.
I thought we could all silently agree to disagree after the back and forth yesterday, but if you wish to continue antagonizing, I have no problem defending my original comments from the slander.
There's no disagreement to be had. You made false claims that were debunked, then doubled down. We'll just go back to ignoring you and your nonsensical claims, which as of yet are still unsubstantiated.

Words matter. Again in the context of the video I posted with the 3080, turning on RT shadows had negligible impact on performance, meaning it is not the underlying cause of why console is performing significantly better than PC from a GPU/CPU/RAM spec comparison.

1. Shifting settings frequently without restarting the game can negatively affect performance.
2. The fact that the 3080 is at 49fps and then a second later at 20fps should tell you exactly what is happening. There are massive fps lurches and sudden spikes.

We noticed that 4K and maximum DXR settings looks like it really wants more than 10GB VRAM — the 3060 12GB card for example has minimum FPS that's much closer to the average.
From Tom's Hardware. But at this point, it's blatant that you have no idea what you're talking about and haven't got the faintest idea how PC hardware works.

Done wasting my time with you. Over and out.
 
Last edited:

TrebleShot

Member
Funny to see people warring in here.
In the two years it took for this to come to PC, I'm sure Insomniac (arguably the best dev in the world right now) would have been able to squeeze more out of the PS5 to add some bells and whistles. Even then it looks amazing on PS5; it was cross-gen, remember.
 

SmokedMeat

Gamer™
Because it’s not an equivalent gou bench at that point

What do you need an equivalent bench for?

What do you get out of holding a PC back to a two-year-old console's level? Is there some need to make it look like they're on the same level when they're not?

Do you also need a Switch level PC to get Monster Hunter Rise benchmarks, or else it’s not fair to make the console look worse?
 
Last edited:
Just finished the game on PC - first time playing it.

Man, that was weak. The venom powers were interesting at first but ultimately (pun intended) I prefer the Ratchet & Clank-ness of the combat in Pete's game. They're also kinda OP because I probably died a grand total of two times prior to the final boss on the hardest difficulty. It's also incredibly short - my playtime was 13 hours, and that's with most of the side stuff done. Just a handful of hideouts that I couldn't be bothered to mop up.

As for the story... I think the less said about it, the better.

Having said all that, it sure was a pretty game on PC. :) But I should have waited for a sale.
 
Last edited:

Gaiff

Member
90 minutes in and I'm already having more fun with Miles Morales than the OG game. Combat is significantly improved and the pace is tighter.
 