
[NxGamer] Spider-Man Remastered PC vs PS5 vs Steam Deck vs PS4 vs 750Ti - Technical Review & Comparison

Different locations / different regions. Just a few seconds later, the 3060 too shoots above 40+.

A 2080 Ti will easily destroy the PS5 in this game with matched settings at 4K; there is simply no competition.

Don't twist my words or the videos I've linked; I sent that as a proof of concept. In NX Gamer's video, his 2070 hovers around 25 whereas the PS5 gets 45, which naturally puts the PS5 at something like a 3070 according to him. In reality, a 3060 can push 35-40+ in the exact same location, meaning a 2070 would be capable too, if it had enough VRAM.
Are we sure NX Gamer didn't just misspeak on that part? I've literally never seen a framerate test where the PS5 fell to 35fps in any mode; the absolute worst I've seen is right at 40. And I do expect a 2080 Ti to beat it, considering it's more powerful hardware. The 3070 has limitations unlike the 2080 Ti, so I don't know why you brought that GPU up.
 
3060 is not outperforming the 2070
ps5 is not outperforming the 2070

2070 is underperforming compared to all its equivalent-power hardware
even 3080 underperforms compared to 2080ti at 4K.


Practically, VRAM-constrained cards cannot perform as they should. Clear as crystal.

vUDE31i.jpg


The reason the 3080 also buckles at 4K is that the game only uses 80% of the total available VRAM.
As a result (see the sketch just below):
8 GB cards have a 6.4 GB budget,
10 GB cards have an 8 GB budget (and 4K + ray tracing breaches the budget, hence it underperforms),
11 GB cards have an 8.8 GB budget (still not enough, but better than 8),
12 GB cards have a 9.6 GB budget (the only budget that can truly match the PS5's allocated memory).
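
A minimal sketch of that budget arithmetic in Python, assuming the 80% cap and the roughly 9.5 GB PS5 allocation claimed elsewhere in this thread (neither figure comes from any official documentation):

# Hypothetical 80% VRAM cap described above; all numbers are this thread's claims.
PS5_BUDGET_GB = 9.5  # approximate PS5 allocation asserted later in the thread

for capacity_gb in (8, 10, 11, 12):
    budget_gb = capacity_gb * 0.8  # the game reportedly uses only 80% of VRAM
    verdict = "can cover the PS5-level budget" if budget_gb >= PS5_BUDGET_GB else "falls short"
    print(f"{capacity_gb} GB card -> {budget_gb:.1f} GB usable ({verdict})")

Only the 12 GB tier clears the ~9.5 GB mark under this model, which is exactly the claim being made above.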


The VRAM constraint buckles the 3080 to the point where it performs almost like a 2080 Ti, whereas the 3070 is constrained so badly that it drops nearly 30% of its performance.

The 2070/2070S also suffers a ~30% performance drop, which is why it falls below the 30s and often hangs around 25 in his obnoxious video.

These problems 100% exist. Steam discussions are full of people saying they keep having performance issues after 15 minutes of playtime. The reason all reviewers seem to ignore this is that it would create a PR nightmare for NVIDIA. Even the 10 GB 3080 falters, to the point where it performs like a 2080 Ti.

All the issues the community experiences stem from the exact same VRAM conundrum.
Another question: if the fear is that there are VRAM limitations and that's what's holding back the GPUs, how do we know the same thing isn't happening to the PS5, and that if it had, say, 32GB of VRAM it would mostly be in the 50s or even 60s?
 

Three

Member
I found some screenshots I had taken in 2020 comparing the pedestrian density in Performance vs. Fidelity mode. The insane number of pedestrians in Fidelity mode was the first thing that jumped out at me. I was in a fight, throwing around a bunch of mailboxes and trash cans, and they all reacted to them very convincingly. I don't think I've seen 100+ pedestrians in a city street at once since AC Unity. It was extremely impressive even in Performance mode.

What Alex is doing here is bizarre. This is not equivalent to PS5's performance mode. The NPC and traffic density should both be hitting the CPU hard, and even the GPU, since they all cast shadows and, in the vehicles' case, all show up in RT reflections. How the fuck are these "optimized settings" anyway, when you are downgrading the image and the entire experience so severely? The whole point of optimized settings is to have the image looking virtually the same while making cutbacks you wouldn't even notice.

Fidelity:
EsoI5tdXAAISvOz

Performance:

EsoI0BFXUAU_ccO


Alex:
jOISPZt.png
In all fairness to Alex, Spider-Man is coded to load in textures and geometry when you stop swinging, but I'm not sure it's that severely cut back while you are swinging. I don't think it was.
 
Another question: if the fear is that there are VRAM limitations and that's what's holding back the GPUs, how do we know the same thing isn't happening to the PS5, and that if it had, say, 32GB of VRAM it would mostly be in the 50s or even 60s?
Lol, it's about having "enough" VRAM. If the game asks for 7 GB of VRAM during a certain scene, then as long as you have 7 GB available, it will perform as expected. But performance won't improve if you have even more VRAM available; performance doesn't scale that way. However, if you have less than the requirement, performance will tank.
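
A toy model of that threshold behaviour, under the simple step-function assumption described above (all numbers are made up purely for illustration):

# Toy model: performance is flat once the VRAM requirement is met,
# and tanks when the card falls short. All numbers are illustrative.
def expected_fps(vram_gb, required_gb=7.0, base_fps=60.0):
    if vram_gb >= required_gb:
        return base_fps    # extra VRAM beyond the requirement adds nothing
    return base_fps * 0.5  # spillover to system RAM tanks performance

for vram in (6, 7, 8, 12, 24):
    print(f"{vram} GB -> ~{expected_fps(vram):.0f} fps")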
 

yamaci17

Member
Another question: if the fear is that there are VRAM limitations and that's what's holding back the GPUs, how do we know the same thing isn't happening to the PS5, and that if it had, say, 32GB of VRAM it would mostly be in the 50s or even 60s?
It wouldn't happen, because PCIe transactions with PS5-equivalent settings stop at around 9.5-9.6 GB of VRAM usage. I always theorized that the PS5 is probably allocating nearly 10 GB worth of data. The game simply wants 10 GB of VRAM at those settings; those settings are literally tailored for the PS5. The PS5 is not going to be held back by VRAM constraints. The game and those settings are specifically tailored to a 10 GB budget. That's it. If there were a theoretical 24 GB RTX 3060, it would perform the same as the 12 GB RTX 3060, because the game saturates around 10 GB and stops going upward of it.

As for the 3060 + 2700X/3700X bullcrap: the video I've linked is entirely GPU-bound. It will still give similar results with the CPUs you mentioned. I've literally finished the game twice with a 2700X, never dropping below 55 FPS in CPU-bound scenarios. In the video, however, it is always a GPU-bound scenario where the 3060 hovers between 35-40. I even have video proof that the 2700X never drops below the mid-50s in a CPU-bound ray tracing scenario. You're looking at the wrong things here. Then again, I've explained to you many times that no sane person combines a 2700X with a 2070/2070 Super. Zen/Zen+ CPUs have horrific IPC and outdated architectures that put them in a bad situation. The Zen 2 3700X is a proper match, and the 3700X performs excellently in this game. It will easily lock to a perfect 60 with a capable GPU.

Even then, let me accept that I might be partially wrong in claiming the 3060 may exceed the PS5. It still matches and hovers close to the PS5, a hard fact you cannot deny, and it would still tail the PS5 even with the CPUs you've suggested. This is not a CPU limitation, especially in 4K GPU-bound scenarios with a midrange 3060.

You're just focusing on the wrong points; the 3060 is the literal equivalent of a 2070. NX Gamer uses a heavily overclocked 2070 which performs like a 2070 Super, according to him at least. His 2070S, due to heavy VRAM constraints, drops frames below 25 at times, performing like crap, whereas the 3060 never has similar issues; it always gets respectable framerates. We're talking about a $329 MSRP GPU. It simply has enough VRAM for the configuration. The exact same video shows the 3060 hovering well above 50 FPS with DLSS, but I'm not even bringing that up, because unlike you, I'm just trying to provide valuable input to the discussion instead of constantly hailing "hey yo, the PS5 is now a 3070".

Saying the PS5 performs like a 3070 is like saying the Xbox One performs like a GTX 770. It does not; the GTX 770 underperforms instead. Having only a weak 2 GB buffer made the GTX 770's performance suffer. The Xbox One could never theoretically match the GTX 770, and a 4 GB GTX 960 can easily destroy the Xbox One on any given day.

You cannot compare a console to a VRAM-constrained GPU just because their power levels are near each other. The 8 GB RTX 3060 Ti/3070/3070 Ti are badly configured GPUs, which I've always criticised, and I've always said they'd become problematic as time moves on. For their capabilities, they should've had 10-12 GB of VRAM instead. As you can see, the 2080 Ti performs as it should at 4K, having enough VRAM, whereas the 3070 falters. I bring that GPU up because the two are supposed to be equal in raw, native, theoretical power when neither is VRAM-constrained. There are thousands of games where the 3070 and 2080 Ti simply perform the same, and then there's Spider-Man and RE Village, where the 2080 Ti performs as it should whereas the 3070 lags behind. Simple as that.

You, however, are chasing the wrong things here, as I explained. And I'm really getting tired of explaining the same thing to you over and over again.

The one and only relevant factor in this discussion is VRAM. Nothing else.

Just take your time and watch 44:12. VRAM constraints cause the card to drop to the 40s, while with high textures it magically gets a locked 60 (with 20% GPU headroom). The VRAM constraint in this specific case cost the 3060 Ti 80% of its potential performance. That is simply too huge. This is at this point a case of a GPU badly misconfigured by NVIDIA, and it is not a proper GPU to compare PS5 performance against.
 
Last edited:
Lol, it's about having "enough" VRAM. If the game asks for 7 GB of VRAM during a certain scene, then as long as you have 7 GB available, it will perform as expected. But performance won't improve if you have even more VRAM available; performance doesn't scale that way. However, if you have less than the requirement, performance will tank.
I mean, can we be sure that the PS5 at its settings isn't requiring, say, 20GB of VRAM, and that's why it's not mostly in the 60s range?
 
It wouldn't happen, because PCIe transactions with PS5-equivalent settings stop at around 9.5-9.6 GB of VRAM usage. I always theorized that the PS5 is probably allocating nearly 10 GB worth of data. The game simply wants 10 GB of VRAM at those settings; those settings are literally tailored for the PS5. The PS5 is not going to be held back by VRAM constraints. The game and those settings are specifically tailored to a 10 GB budget. That's it. If there were a theoretical 24 GB RTX 3060, it would perform the same as the 12 GB RTX 3060, because the game saturates around 10 GB and stops going upward of it.

As for the 3060 + 2700X/3700X bullcrap: the video I've linked is entirely GPU-bound. It will still give similar results with the CPUs you mentioned. I've literally finished the game twice with a 2700X, never dropping below 55 FPS in CPU-bound scenarios. In the video, however, it is always a GPU-bound scenario where the 3060 hovers between 35-40. I even have video proof that the 2700X never drops below the mid-50s in a CPU-bound ray tracing scenario. You're looking at the wrong things here. Then again, I've explained to you many times that no sane person combines a 2700X with a 2070/2070 Super. Zen/Zen+ CPUs have horrific IPC and outdated architectures that put them in a bad situation. The Zen 2 3700X is a proper match, and the 3700X performs excellently in this game. It will easily lock to a perfect 60 with a capable GPU.

Even then, let me accept that I might be partially wrong in claiming the 3060 may exceed the PS5. It still matches and hovers close to the PS5, a hard fact you cannot deny, and it would still tail the PS5 even with the CPUs you've suggested. This is not a CPU limitation, especially in 4K GPU-bound scenarios with a midrange 3060.

You're just focusing on the wrong points; the 3060 is the literal equivalent of a 2070. NX Gamer uses a heavily overclocked 2070 which performs like a 2070 Super, according to him at least. His 2070S, due to heavy VRAM constraints, drops frames below 25 at times, performing like crap, whereas the 3060 never has similar issues; it always gets respectable framerates. We're talking about a $329 MSRP GPU. It simply has enough VRAM for the configuration. The exact same video shows the 3060 hovering well above 50 FPS with DLSS, but I'm not even bringing that up, because unlike you, I'm just trying to provide valuable input to the discussion instead of constantly hailing "hey yo, the PS5 is now a 3070".

Saying the PS5 performs like a 3070 is like saying the Xbox One performs like a GTX 770. It does not; the GTX 770 underperforms instead. Having only a weak 2 GB buffer made the GTX 770's performance suffer. The Xbox One could never theoretically match the GTX 770, and a 4 GB GTX 960 can easily destroy the Xbox One on any given day.

You cannot compare a console to a VRAM-constrained GPU just because their power levels are near each other. The 8 GB RTX 3060 Ti/3070/3070 Ti are badly configured GPUs, which I've always criticised, and I've always said they'd become problematic as time moves on. For their capabilities, they should've had 10-12 GB of VRAM instead. As you can see, the 2080 Ti performs as it should at 4K, having enough VRAM, whereas the 3070 falters. I bring that GPU up because the two are supposed to be equal in raw, native, theoretical power when neither is VRAM-constrained. There are thousands of games where the 3070 and 2080 Ti simply perform the same, and then there's Spider-Man and RE Village, where the 2080 Ti performs as it should whereas the 3070 lags behind. Simple as that.

You, however, are chasing the wrong things here, as I explained. And I'm really getting tired of explaining the same thing to you over and over again.

The one and only relevant factor in this discussion is VRAM. Nothing else.

Just take your time and watch 44:12. VRAM constraints cause the card to drop to the 40s, while with high textures it magically gets a locked 60 (with 20% GPU headroom). The VRAM constraint in this specific case cost the 3060 Ti 80% of its potential performance. That is simply too huge. This is at this point a case of a GPU badly misconfigured by NVIDIA, and it is not a proper GPU to compare PS5 performance against.

I thought I was bringing up the CPU in this game because it's arguably the most important triple-A release recently, besides Cyberpunk, with heavy CPU utilization, and in benchmarks that matters a lot. This isn't like Death Stranding, where there isn't a radical change as you scale up; you can easily lose 10+ frames in this game depending on your CPU, which is why pairing the GPU with an inequivalent CPU is off, to say the least. We wouldn't bench Spider-Man on a 3090 with a 12900K and then run the same benchmark on a 3060 with a 12600K; we would of course use the same CPU. I do agree it's fair to use a CPU slightly better than the one in the PS5, but we absolutely should not be using a CPU two or especially three times better than the PS5's; that makes things inequivalent and adds a needless variable. I've personally never really bought that the 3070 matches the 2080 Ti; that always felt like NVIDIA marketing propped up by other sources. It can only perform like a 2080 Ti in certain DLSS and RT benchmarks, basically never in rasterization, and usually falls between the Ti and the Super. I've always felt the 3070 Ti is a closer match, at least equaling and sometimes surpassing the 2080 Ti, which is why I disagree with you bringing that GPU up. (It's not a flex to say the 2080 Ti outperforms; that's the expected result, and it not doing so would only look bad for the GPU.)
 

Hoddi

Member
I mean, can we be sure that the PS5 at its settings isn't requiring, say, 20GB of VRAM, and that's why it's not mostly in the 60s range?
The game literally wouldn't run on PS5 if it needed 20GB of VRAM.

Anyway, here's how much VRAM the game will use without RT when run through a proper graphics profiler. RT adds a couple of gigabytes on top but RenderDoc doesn't support RT.

4k
Stats for Spider-Man_2022.08.13_16.28.12_frame14894.rdc.

File size: 2893.82MB (5064.99MB uncompressed, compression ratio 1.75:1)
Persistent Data (approx): 175.07MB, Frame-initial data (approx): 4871.66MB

*** Summary ***

Draw calls: 7158
Dispatch calls: 138
API calls: 69741
API: Draw/Dispatch call ratio: 9.5588

2386 Textures - 1799.89 MB (1799.59 MB over 32x32), 129 RTs - 1203.65 MB.
Avg. tex dimension: 690.992x665.378 (709.994x684.157 over 32x32)
4404 Buffers - 6326.40 MB total 898.22 MB IBs 805.28 MB VBs.
9329.94 MB - Grand total GPU buffer + texture load.
1080p
Stats for Spider-Man_2022.08.13_23.33.29_frame4695.rdc.

File size: 2419.59MB (4263.81MB uncompressed, compression ratio 1.76:1)
Persistent Data (approx): 327.00MB, Frame-initial data (approx): 3923.75MB

*** Summary ***

Draw calls: 6289
Dispatch calls: 106
API calls: 63350
API: Draw/Dispatch call ratio: 9.90618

1678 Textures - 1755.53 MB (1755.38 MB over 32x32), 96 RTs - 413.00 MB.
Avg. tex dimension: 804.176x762.843 (832.254x789.798 over 32x32)
3698 Buffers - 5397.80 MB total 898.22 MB IBs 805.28 MB VBs.
7566.33 MB - Grand total GPU buffer + texture load.
Only a complete beginner would look at the kind of PCIe load we saw in that video and think it was anything other than a VRAM capacity issue. It was blatantly obvious from the footage and I'm not willing to let that slide from someone masquerading as a 'tech expert'.
 
The game literally wouldn't run on PS5 if it needed 20GB of VRAM.

Anyway, here's how much VRAM the game will use without RT when run through a proper graphics profiler. RT adds a couple of gigabytes on top but RenderDoc doesn't support RT.

4k

1080p

Only a complete beginner would look at the kind of PCIe load we saw in that video and think it was anything other than a VRAM capacity issue. It was blatantly obvious from the footage and I'm not willing to let that slide from someone masquerading as a 'tech expert'.
The construction of the PS5 and PC versions is not the same, as we constantly see, so listing VRAM requirements based on the PC version does not tell us how much VRAM it uses on PS5. And it doesn't mean it wouldn't run; I asked how we know performance isn't tanking.
 

Corndog

Banned
The game is VRAM-bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory). The game then constantly begins to use normal RAM as a substitute for VRAM.

The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.

Not once in his video did he manage to comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.

In this specific shot, performance drops because the card runs out of VRAM, not because of "PS5 having better performance".

qvxpHqa.jpg


I actually proved it, in the exact same intro scene:

Even high textures cause a performance drop. With low textures, I manage to get 56 frames. So an 8 GB buffer cannot even fit high textures + ray tracing together at 4K without a performance drop.

Only at native 1440p are 8 GB + very high textures usable, and only in short bursts of playtime. (Not 4K + upscaling; 4K + upscaling uses 4K LODs and therefore stresses VRAM further.)

This is not to say NVIDIA and RTX cards are not to blame here. They are. Lack of VRAM causes these abnormalities, and opportunists like NX Gamer have carved themselves a nice slice out of this problem.

The entire reason the game constantly hammers the PCIe bus is not how the engine is built, but that the game artificially limits itself to a maximum of 6.4 GB of VRAM and uses normal RAM as a substitute. It is not doing anything fancy: no game on PC should ever rely on constantly transferring texture data on the fly from RAM to VRAM. PCIe 4 won't solve that either; RAM-to-VRAM will always be slow, will always stall the GPU into worse performance, and will always create more problems.
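
To put rough numbers on that RAM-to-VRAM point, here is a quick comparison using public spec-sheet bandwidth figures (these values are mine, not from the video, and the PCIe peaks are theoretical):

# Rough comparison: on-card GDDR6 bandwidth vs. the PCIe link used for spillover.
GDDR6_2070_GBPS = 448  # RTX 2070 memory bandwidth (spec sheet)
pcie_links = (("PCIe 3.0 x16", 16), ("PCIe 4.0 x16", 32))  # theoretical peaks

for name, gbps in pcie_links:
    print(f"{name}: {gbps} GB/s, ~{GDDR6_2070_GBPS / gbps:.0f}x slower than the 2070's own VRAM")

Even the PCIe 4.0 upgrade leaves spillover traffic an order of magnitude slower than local VRAM, which is the point above.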

In this specific case, it happens because the GPU does not have the VRAM the game demands at 4K/RT + very high textures. This config requires a solid, uninterrupted 10 GB buffer, which the RX 6800 has; it therefore does not run into similar performance issues.

If he actually had the courage to test and compare the RX 6800 to the PS5 with RT enabled in similar scenes, he would also see that the PS5 is not "outperforming" the other GPUs, as it would be decimated by the RX 6800. Notice how he cleverly avoided a direct comparison between the RX 6800 and the PS5; it would undermine everything he talked about in the very same video.

The PS5 only "seemingly" destroys the 2070 because the buffer is at its limits.

Just my points and observations; you can do whatever you want with this info. I don't like the dude, but sadly 8 gigs of VRAM simply cannot enable the features the PS5 can enable with its uncompromised 10 GB budget, which creates anomalies like this. And it will continue to happen, especially if developers like Nixxes decide to artificially limit VRAM utilization to 6.4 GB, supposedly to leave "available VRAM" for background operations, even if you have nothing running in the background.

The mere existence of the RTX 3060 destroys the entire point of this video. If he had tested a 3060:

- It would match and exceed PS5 performance (unlike the 8 GB 2070)
- It would not hammer the PCIe bus, since it has enough VRAM budget (unlike the 8 GB 2070)

Here's how the 3060 performs at NATIVE 4K (similar to PS5's Fidelity mode) with similar RT settings:

It gets 35-40 frames, just like the PS5 does. Look at his 2070 video, constantly showing irregular frametimes, whereas the 3060 gets a smooth ride all around.

As I said, it is indeed the 2070 that is the problematic part here.

I have no idea how 8 GB RTX GPUs will age; considering these events, I'd simply say stay away from them for your own good. Wait for the 12 GB 4070.

Yeah, I wish my 3070 had more than 8 GB. I guess the RAM plus the larger memory bus is too much money.
 
Yeah, I wish my 3070 had more than 8 GB. I guess the RAM plus the larger memory bus is too much money.
Also, the 3070 has to do more work than the PS5 in this game because of the construction of this port, which is why it's weird they consider these GPUs underutilized.
 

ChiefDada

Gold Member
If Sony's first-party studios tailor a game to the PS5, of course it's not going to run as well on PC, and Sony probably won't go the extra mile to optimise it further for PC. That does not mean the PS5 has some inherently superior hardware in it that can't be matched or surpassed by slightly more powerful PC hardware.

It's not about matching performance; it's about efficiency gaps that cause the console to "punch above its weight". Yes, I know it's a cliché, but that's exactly what's happening here, and likely at a level we had not seen in the prior generation. If I were a PC gamer, I wouldn't be worried about the console outperforming mine on an absolute basis; there's no reason to be (assuming money isn't a worry). But I would be annoyed at the thought of being constrained by memory management overhead that limits the peak potential of my PC build, especially when a proof of concept exists and is already being utilized by console gamers with great returns. That's all this is about.

The guy really just wants to be redeemed for saying the PS5 is something special when it was first revealed. The guy has a pretty big ego.

I mean, you act like it's just him, and that other neutral, highly qualified industry professionals haven't offered similar praise. I know you know this; there's no need to rewind to 2019-2020. Nvidia's DLSS hardware is fantastic, and I hope consoles implement something similar either mid-gen or next gen. Cool tech is cool tech, regardless of which company or platform introduces it first.
 

Hoddi

Member
The construction of the PS5 and PC versions is not the same, as we constantly see, so listing VRAM requirements based on the PC version does not tell us how much VRAM it uses on PS5. And it doesn't mean it wouldn't run; I asked how we know performance isn't tanking.
It doesn't matter. An uncompressed 4K (4096×4096) texture at 4 bytes per pixel will always be 64MB on both PS5 and PC. You're never gonna see a scenario where the PS5 uses less memory than a PC when showing the same graphics content on screen.

Computers straight up don't work like that.
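
As a quick sanity check of that figure, assuming the standard 4096x4096 reading of a "4K texture" at 4 bytes per pixel (RGBA8):

# Size of an uncompressed 4096x4096 RGBA8 texture (4 bytes per pixel).
width = height = 4096
bytes_per_pixel = 4
size_mib = width * height * bytes_per_pixel / (1024 ** 2)
print(f"{size_mib:.0f} MiB")  # -> 64 MiB, on PS5 and PC alike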
 
Last edited:

hlm666

Member
Alex isn't perfect, but if anyone has credibility here, it's probably him, seeing as he did a pretty extensive interview with members of Nixxes. I mean, you can believe whatever you want, but common sense would dictate that the person who has spoken with the developers might have a bit more insight.
 

64bitmodels

Reverse groomer.
my condolences... but I mean as long as you don't expect good RT performance you're gonna be ok

Comforting Big Hero 6 GIF by Sky
I mean, I love it. I don't care much about RT, and every game I throw at it runs well over 60 @ 1440p.
That being said, I probably should have waited until next gen for a card that had more VRAM. yamaci17's post has made me question whether getting 8GB in 2022 was really a good purchase.
 

01011001

Banned
I mean, I love it. I don't care much about RT, and every game I throw at it runs well over 60 @ 1440p.
That being said, I probably should have waited until next gen for a card that had more VRAM. yamaci17's post has made me question whether getting 8GB in 2022 was really a good purchase.

I feel ya. I'm running an RTX 3060 Ti, which has the same issue, although at 1440p I haven't had much trouble with VRAM yet, not even in demanding games like Cyberpunk.

I kinda hope the RTX 4060 will be decently priced; I might switch to that if it has more memory (leaks say it doesn't though... which would be weird). Also, I hope my PSU can handle it xD. I wouldn't want to buy a new PSU as well as a new card.
 
Last edited:

Sosokrates

Report me if I continue to console war
It's not about matching performance; it's about efficiency gaps that cause the console to "punch above its weight". Yes, I know it's a cliché, but that's exactly what's happening here, and likely at a level we had not seen in the prior generation. If I were a PC gamer, I wouldn't be worried about the console outperforming mine on an absolute basis; there's no reason to be (assuming money isn't a worry). But I would be annoyed at the thought of being constrained by memory management overhead that limits the peak potential of my PC build, especially when a proof of concept exists and is already being utilized by console gamers with great returns. That's all this is about.



I mean, you act like it's just him, and that other neutral, highly qualified industry professionals haven't offered similar praise. I know you know this; there's no need to rewind to 2019-2020. Nvidia's DLSS hardware is fantastic, and I hope consoles implement something similar either mid-gen or next gen. Cool tech is cool tech, regardless of which company or platform introduces it first.

Which professional has said the PS5 is built in a way that makes it perform noticeably better than comparable PC hardware?

Being memory-constrained on PC is not because of the hardware; it's because of the developer. There are plenty of PC games which are not VRAM-constrained the way Spider-Man is.

Anyway, NX has been proven incorrect on this occasion by yamaci17.
 
I mean, can we be sure that the PS5 at its settings isn't requiring, say, 20GB of VRAM, and that's why it's not mostly in the 60s range?
Only if you believe Insomniac devs are complete doofuses who would develop a PlayStation-only game with VRAM requirements exceeding the closed-system specs they were targeting from the start.
 
It doesn't matter. An uncompressed 4K (4096×4096) texture at 4 bytes per pixel will always be 64MB on both PS5 and PC. You're never gonna see a scenario where the PS5 uses less memory than a PC when showing the same graphics content on screen.

Computers straight up don't work like that.
I'm talking about specific ports. In this game specifically, we see PCs having to do more work because the load of compressing and processing things on the fly, due to overhead, may change the requirements and sizes of things in other areas, since there isn't a specific construction of those things on PC the way there is on PS5. So no, I don't buy it. These are the same people who said the CPU doesn't matter, and here we are. (Also funny that the CPU "doesn't matter", yet we need the absolute top-end CPU for our tests; I guess it matters enough.)
 
Alex isn't perfect, but if anyone has credibility here, it's probably him, seeing as he did a pretty extensive interview with members of Nixxes. I mean, you can believe whatever you want, but common sense would dictate that the person who has spoken with the developers might have a bit more insight.
Alex doesn't have credibility on benchmarks, beyond emotionally attacking people on ResetEra or Twitter who question his findings or tests (even when those people are proven right later). He never updates info and does way too much guesswork; I would trust Gamers Nexus benchmarks before I trust any "benchmark" from Alex, and that's saying a lot. Alex should stick to what he is good at, which is discussing the construction of games (especially on the PC side) and tech implementations, as well as engine limitations or deficiencies. His benchmarking, and especially his analysis of things like resolution, graphical settings, and differences, has always been not only faulty but far more "I think" or "I believe" than anything close to objective. And that's without even mentioning his awful testing methods, which add so many variables: not testing each setup with the same parts with only the GPU differing, or using different drivers, OSes, etc. I'm not saying NX Gamer is perfect or even objective, and he definitely shouldn't be the only source, but my god, at least he is consistent and doesn't dip his toes into things he has zero clue or proficiency in, nor does he attack people who disagree with him. At least he shows how he came to his conclusions or findings, unlike Alex's "I think".
 
Only if you believe Insomniac devs are complete doofuses who would develop a PlayStation-only game with VRAM requirements exceeding the closed-system specs they were targeting from the start.
I mean, maybe the VRAM does things like limit the ability to hit 60, which is already not the intended target for this mode.
 

ChiefDada

Gold Member
Being memory-constrained on PC is not because of the hardware; it's because of the developer.

No, it is because of current PC architecture. Blaming VRAM size is like saying a bandage is the solution for all wounds. And if you truly think Nixxes phoned it in in any way, shape, or form, then shame on you lol.

There are plenty of PC games which are not VRAM-constrained the way Spider-Man is.

This dovetails into what I was saying above. You are talking about games from the prior generation, when you need to be thinking about memory requirements for console ports over the next five years. Again, Spider-Man is the first native PS5 port that requires an SSD to reach PS5 levels of performance. And still, it's a PS4 game at its core; "true" next-gen games will be far more memory-intensive. That is why NXGamer says the efficiency gap will widen in favor of console architecture.

Which professional has said the PS5 is built in a way that makes it perform noticeably better than comparable PC hardware?

Tom Hardy Bait GIF
 

Sosokrates

Report me if I continue to console war
No, it is because of current PC architecture. Blaming VRAM size is like saying a bandage is the solution for all wounds. And if you truly think Nixxes phoned it in in any way, shape, or form, then shame on you lol.



This dovetails into what I was saying above. You are talking about games from the prior generation, when you need to be thinking about memory requirements for console ports over the next five years. Again, Spider-Man is the first native PS5 port that requires an SSD to reach PS5 levels of performance. And still, it's a PS4 game at its core; "true" next-gen games will be far more memory-intensive. That is why NXGamer says the efficiency gap will widen in favor of console architecture.



Tom Hardy Bait GIF

I suggest you read these posts again; it's all there.

The game is VRAM-bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory). The game then constantly begins to use normal RAM as a substitute for VRAM.

The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.

Not once in his video did he manage to comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.

In this specific shot, performance drops because the card runs out of VRAM, not because of "PS5 having better performance".

qvxpHqa.jpg


I actually proved it, in the exact same intro scene:

Even high textures cause a performance drop. With low textures, I manage to get 56 frames. So an 8 GB buffer cannot even fit high textures + ray tracing together at 4K without a performance drop.

Only at native 1440p are 8 GB + very high textures usable, and only in short bursts of playtime. (Not 4K + upscaling; 4K + upscaling uses 4K LODs and therefore stresses VRAM further.)

This is not to say NVIDIA and RTX cards are not to blame here. They are. Lack of VRAM causes these abnormalities, and opportunists like NX Gamer have carved themselves a nice slice out of this problem.

The entire reason the game constantly hammers the PCIe bus is not how the engine is built, but that the game artificially limits itself to a maximum of 6.4 GB of VRAM and uses normal RAM as a substitute. It is not doing anything fancy: no game on PC should ever rely on constantly transferring texture data on the fly from RAM to VRAM. PCIe 4 won't solve that either; RAM-to-VRAM will always be slow, will always stall the GPU into worse performance, and will always create more problems.

In this specific case, it happens because the GPU does not have the VRAM the game demands at 4K/RT + very high textures. This config requires a solid, uninterrupted 10 GB buffer, which the RX 6800 has; it therefore does not run into similar performance issues.

If he actually had the courage to test and compare the RX 6800 to the PS5 with RT enabled in similar scenes, he would also see that the PS5 is not "outperforming" the other GPUs, as it would be decimated by the RX 6800. Notice how he cleverly avoided a direct comparison between the RX 6800 and the PS5; it would undermine everything he talked about in the very same video.

The PS5 only "seemingly" destroys the 2070 because the buffer is at its limits.

Just my points and observations; you can do whatever you want with this info. I don't like the dude, but sadly 8 gigs of VRAM simply cannot enable the features the PS5 can enable with its uncompromised 10 GB budget, which creates anomalies like this. And it will continue to happen, especially if developers like Nixxes decide to artificially limit VRAM utilization to 6.4 GB, supposedly to leave "available VRAM" for background operations, even if you have nothing running in the background.

The mere existence of the RTX 3060 destroys the entire point of this video. If he had tested a 3060:

- It would match and exceed PS5 performance (unlike the 8 GB 2070)
- It would not hammer the PCIe bus, since it has enough VRAM budget (unlike the 8 GB 2070)

Here's how the 3060 performs at NATIVE 4K (similar to PS5's Fidelity mode) with similar RT settings:

It gets 35-40 frames, just like the PS5 does. Look at his 2070 video, constantly showing irregular frametimes, whereas the 3060 gets a smooth ride all around.

As I said, it is indeed the 2070 that is the problematic part here.

I have no idea how 8 GB RTX GPUs will age; considering these events, I'd simply say stay away from them for your own good. Wait for the 12 GB 4070.


It wouldn't happen, because PCIe transactions with PS5-equivalent settings stop at around 9.5-9.6 GB of VRAM usage. I always theorized that the PS5 is probably allocating nearly 10 GB worth of data. The game simply wants 10 GB of VRAM at those settings; those settings are literally tailored for the PS5. The PS5 is not going to be held back by VRAM constraints. The game and those settings are specifically tailored to a 10 GB budget. That's it. If there were a theoretical 24 GB RTX 3060, it would perform the same as the 12 GB RTX 3060, because the game saturates around 10 GB and stops going upward of it.

As for the 3060 + 2700X/3700X bullcrap: the video I've linked is entirely GPU-bound. It will still give similar results with the CPUs you mentioned. I've literally finished the game twice with a 2700X, never dropping below 55 FPS in CPU-bound scenarios. In the video, however, it is always a GPU-bound scenario where the 3060 hovers between 35-40. I even have video proof that the 2700X never drops below the mid-50s in a CPU-bound ray tracing scenario. You're looking at the wrong things here. Then again, I've explained to you many times that no sane person combines a 2700X with a 2070/2070 Super. Zen/Zen+ CPUs have horrific IPC and outdated architectures that put them in a bad situation. The Zen 2 3700X is a proper match, and the 3700X performs excellently in this game. It will easily lock to a perfect 60 with a capable GPU.

Even then, let me accept that I might be partially wrong in claiming the 3060 may exceed the PS5. It still matches and hovers close to the PS5, a hard fact you cannot deny, and it would still tail the PS5 even with the CPUs you've suggested. This is not a CPU limitation, especially in 4K GPU-bound scenarios with a midrange 3060.

You're just focusing on the wrong points; the 3060 is the literal equivalent of a 2070. NX Gamer uses a heavily overclocked 2070 which performs like a 2070 Super, according to him at least. His 2070S, due to heavy VRAM constraints, drops frames below 25 at times, performing like crap, whereas the 3060 never has similar issues; it always gets respectable framerates. We're talking about a $329 MSRP GPU. It simply has enough VRAM for the configuration. The exact same video shows the 3060 hovering well above 50 FPS with DLSS, but I'm not even bringing that up, because unlike you, I'm just trying to provide valuable input to the discussion instead of constantly hailing "hey yo, the PS5 is now a 3070".

Saying the PS5 performs like a 3070 is like saying the Xbox One performs like a GTX 770. It does not; the GTX 770 underperforms instead. Having only a weak 2 GB buffer made the GTX 770's performance suffer. The Xbox One could never theoretically match the GTX 770, and a 4 GB GTX 960 can easily destroy the Xbox One on any given day.

You cannot compare a console to a VRAM-constrained GPU just because their power levels are near each other. The 8 GB RTX 3060 Ti/3070/3070 Ti are badly configured GPUs, which I've always criticised, and I've always said they'd become problematic as time moves on. For their capabilities, they should've had 10-12 GB of VRAM instead. As you can see, the 2080 Ti performs as it should at 4K, having enough VRAM, whereas the 3070 falters. I bring that GPU up because the two are supposed to be equal in raw, native, theoretical power when neither is VRAM-constrained. There are thousands of games where the 3070 and 2080 Ti simply perform the same, and then there's Spider-Man and RE Village, where the 2080 Ti performs as it should whereas the 3070 lags behind. Simple as that.

You, however, are chasing the wrong things here, as I explained. And I'm really getting tired of explaining the same thing to you over and over again.

The one and only relevant factor in this discussion is VRAM. Nothing else.

Just take your time and watch 44:12. VRAM constraints cause the card to drop to the 40s, while with high textures it magically gets a locked 60 (with 20% GPU headroom). The VRAM constraint in this specific case cost the 3060 Ti 80% of its potential performance. That is simply too huge. This is at this point a case of a GPU badly misconfigured by NVIDIA, and it is not a proper GPU to compare PS5 performance against.
 

DenchDeckard

Moderated wildly
The game is VRAM-bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory). The game then constantly begins to use normal RAM as a substitute for VRAM.

The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.

Not once in his video did he manage to comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.

In this specific shot, performance drops because the card runs out of VRAM, not because of "PS5 having better performance".

qvxpHqa.jpg


I actually proved it, in the exact same intro scene:

Even high textures cause a performance drop. With low textures, I manage to get 56 frames. So an 8 GB buffer cannot even fit high textures + ray tracing together at 4K without a performance drop.

Only at native 1440p are 8 GB + very high textures usable, and only in short bursts of playtime. (Not 4K + upscaling; 4K + upscaling uses 4K LODs and therefore stresses VRAM further.)

This is not to say NVIDIA and RTX cards are not to blame here. They are. Lack of VRAM causes these abnormalities, and opportunists like NX Gamer have carved themselves a nice slice out of this problem.

The entire reason the game constantly hammers the PCIe bus is not how the engine is built, but that the game artificially limits itself to a maximum of 6.4 GB of VRAM and uses normal RAM as a substitute. It is not doing anything fancy: no game on PC should ever rely on constantly transferring texture data on the fly from RAM to VRAM. PCIe 4 won't solve that either; RAM-to-VRAM will always be slow, will always stall the GPU into worse performance, and will always create more problems.

In this specific case, it happens because the GPU does not have the VRAM the game demands at 4K/RT + very high textures. This config requires a solid, uninterrupted 10 GB buffer, which the RX 6800 has; it therefore does not run into similar performance issues.

If he actually had the courage to test and compare the RX 6800 to the PS5 with RT enabled in similar scenes, he would also see that the PS5 is not "outperforming" the other GPUs, as it would be decimated by the RX 6800. Notice how he cleverly avoided a direct comparison between the RX 6800 and the PS5; it would undermine everything he talked about in the very same video.

The PS5 only "seemingly" destroys the 2070 because the buffer is at its limits.

Just my points and observations; you can do whatever you want with this info. I don't like the dude, but sadly 8 gigs of VRAM simply cannot enable the features the PS5 can enable with its uncompromised 10 GB budget, which creates anomalies like this. And it will continue to happen, especially if developers like Nixxes decide to artificially limit VRAM utilization to 6.4 GB, supposedly to leave "available VRAM" for background operations, even if you have nothing running in the background.

The mere existence of the RTX 3060 destroys the entire point of this video. If he had tested a 3060:

- It would match and exceed PS5 performance (unlike the 8 GB 2070)
- It would not hammer the PCIe bus, since it has enough VRAM budget (unlike the 8 GB 2070)

Here's how the 3060 performs at NATIVE 4K (similar to PS5's Fidelity mode) with similar RT settings:

It gets 35-40 frames, just like the PS5 does. Look at his 2070 video, constantly showing irregular frametimes, whereas the 3060 gets a smooth ride all around.

As I said, it is indeed the 2070 that is the problematic part here.

I have no idea how 8 GB RTX GPUs will age; considering these events, I'd simply say stay away from them for your own good. Wait for the 12 GB 4070.

Great post. The fact that this guy is being commissioned to make videos that could be viewed by millions of people and he doesn't even have a grasp of VRAM bottlenecks is agonising. I'm sure he is a nice guy, but these videos are like someone with a slight grasp of something spreading bullshit to the masses, and we see it far too much across many parts of media and social media nowadays... he's basically a flat earther and being paid for it.

...the fact he is willing to do this to get paid makes me change my stance from nice guy......

....to massive chode!
 

MidGenRefresh

*Refreshes biennially
As soon as my Steam version of this game was installed, I clicked uninstall on PS5 because you know it will run and look better on PC.
 

DenchDeckard

Moderated wildly
Another question: if the fear is that there are VRAM limitations and that's what's holding back the GPUs, how do we know the same thing isn't happening to the PS5, and that if it had, say, 32GB of VRAM it would mostly be in the 50s or even 60s?

I mean, can we be sure that the PS5 at its settings isn't requiring, say, 20GB of VRAM, and that's why it's not mostly in the 60s range?
Because the game was coded purely for a PlayStation console. Insomniac know exactly what they are doing and what hardware they are working with. Nixxes is then having to port that code to PC.

I expect that in the coming years PC will be in Sony's mind from a project's inception, and PC ports will improve. Does anyone remember Microsoft's first ports to PC? Forza Horizon etc. had weird stutters and performance issues at first. This was fixed with patches and improved in the sequels.
 

Loxus

Member
3060 is not outperforming the 2070
ps5 is not outperforming the 2070

2070 is underperforming compared to all its equivalent-power hardware
even 3080 underperforms compared to 2080ti at 4K.


Practically, VRAM-constrained cards cannot perform as they should. Clear as crystal.

vUDE31i.jpg


The reason the 3080 also buckles at 4K is that the game only uses 80% of the total available VRAM.
As a result:
8 GB cards have a 6.4 GB budget,
10 GB cards have an 8 GB budget (and 4K + ray tracing breaches the budget, hence it underperforms),
11 GB cards have an 8.8 GB budget (still not enough, but better than 8),
12 GB cards have a 9.6 GB budget (the only budget that can truly match the PS5's allocated memory).


The VRAM constraint buckles the 3080 to the point where it performs almost like a 2080 Ti, whereas the 3070 is constrained so badly that it drops nearly 30% of its performance.

The 2070/2070S also suffers a ~30% performance drop, which is why it falls below the 30s and often hangs around 25 in his obnoxious video.

These problems 100% exist. Steam discussions are full of people saying they keep having performance issues after 15 minutes of playtime. The reason all reviewers seem to ignore this is that it would create a PR nightmare for NVIDIA. Even the 10 GB 3080 falters, to the point where it performs like a 2080 Ti.

All the issues the community experiences stem from the exact same VRAM conundrum.
I don't know why it feels like you're misleading this thread. It rarely ever gets to 7GB usage on the 2070. Normally the game sees that the GPU has more memory, so it stores more files (just in case they're needed) in GDDR instead of RAM.

It performs as it should. That extra 4GB GDDR doesn't make the 3060 perform way better than the 2070.
rt69hVS.jpg
fGLyDc9.jpg


With Ray-Tracing
FjknUE0.jpg
RD5jZBu.jpg


If this game were to be bottlenecked, it would be because of the CPU.
 

Corndog

Banned
Also, the 3070 has to do more work than the PS5 in this game because of the construction of this port, which is why it's weird they consider these GPUs underutilized.
I'm sure they have to build some middleware that bridges Sony's API (GNM?) to either DirectX or Vulkan. That in itself will introduce some overhead.
 

NXGamer

Member
The game is VRAM-bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory). The game then constantly begins to use normal RAM as a substitute for VRAM.

The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.

Not once in his video did he manage to comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.

Er... did you even watch the video? I literally call this out and explain it, with a chapter entitled "Memory" within the video. I show and discuss the VRAM issues, the low textures, the lower mips, how it is worse than the PS5 and the bigger 16 GB of the RX 6800. Your post does nothing but confirm what my video covers.

I then discuss the VRAM-to-system-RAM issue, bandwidth and data-bound states, and with the 750 Ti I restate what I said last gen: 2 GB was not going to cut it, and 8 GB will not now.

I get frustrated when people attack facts with no logic. Your argument is, "Well, if the GPU had more VRAM than it does, it would be performing better!"
Well yeah, of course. This "argument" (which it clearly is not) amounts to: if my Fiesta had a Ferrari engine, it would be able to beat a Porsche. I see this pigeonholed logic a great deal in comments, and it misses the point of these tests and of how tests should be done. The fact is you cannot buy a 2070 or 3070 with anything other than 8 GB, so this game, in this mode, on this card, performs as shown. All stated clearly in the video. You are arguing the same old "it is not a fair test"; this is not about that, nor should it ever be. This is about what and where the PS5 is performing. I do not see you and others here arguing that DF using a 12900K with an RX 580 is madness, completely removed from any real system, do you? My rig here is a real example of what actually exists, and it is around the same level as the consoles' target.

Even that aside, you are skipping the other modes: the flat Performance Mode with no RT has clear GPU-bound points that still show a deficit against the PS5 when not CPU-constrained.

In this specific shot, performance drops because the card runs out of VRAM, not because of "PS5 having better performance"
I mean, just read this comment: the GPU that is performing worse here has nothing to do with the PS5 performing better???!!!!????

The GPU can be and often is memory-bound, but NOT 100% of the time, and it is not the only reason; as noted above, the 3060 with more VRAM is not suddenly leaping ahead in performance.
 
Last edited:

rofif

Can’t Git Gud
The game is VRAM-bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory). The game then constantly begins to use normal RAM as a substitute for VRAM.

The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.

Not once in his video did he manage to comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.

In this specific shot, performance drops because the card runs out of VRAM, not because of "PS5 having better performance".

qvxpHqa.jpg


I actually proved it, in the exact same intro scene:

Even high textures cause a performance drop. With low textures, I manage to get 56 frames. So an 8 GB buffer cannot even fit high textures + ray tracing together at 4K without a performance drop.

Only at native 1440p are 8 GB + very high textures usable, and only in short bursts of playtime. (Not 4K + upscaling; 4K + upscaling uses 4K LODs and therefore stresses VRAM further.)

This is not to say NVIDIA and RTX cards are not to blame here. They are. Lack of VRAM causes these abnormalities, and opportunists like NX Gamer have carved themselves a nice slice out of this problem.

The entire reason the game constantly hammers the PCIe bus is not how the engine is built, but that the game artificially limits itself to a maximum of 6.4 GB of VRAM and uses normal RAM as a substitute. It is not doing anything fancy: no game on PC should ever rely on constantly transferring texture data on the fly from RAM to VRAM. PCIe 4 won't solve that either; RAM-to-VRAM will always be slow, will always stall the GPU into worse performance, and will always create more problems.

In this specific case, it happens because the GPU does not have the VRAM the game demands at 4K/RT + very high textures. This config requires a solid, uninterrupted 10 GB buffer, which the RX 6800 has; it therefore does not run into similar performance issues.

If he actually had the courage to test and compare the RX 6800 to the PS5 with RT enabled in similar scenes, he would also see that the PS5 is not "outperforming" the other GPUs, as it would be decimated by the RX 6800. Notice how he cleverly avoided a direct comparison between the RX 6800 and the PS5; it would undermine everything he talked about in the very same video.

The PS5 only "seemingly" destroys the 2070 because the buffer is at its limits.

Just my points and observations; you can do whatever you want with this info. I don't like the dude, but sadly 8 gigs of VRAM simply cannot enable the features the PS5 can enable with its uncompromised 10 GB budget, which creates anomalies like this. And it will continue to happen, especially if developers like Nixxes decide to artificially limit VRAM utilization to 6.4 GB, supposedly to leave "available VRAM" for background operations, even if you have nothing running in the background.

The mere existence of the RTX 3060 destroys the entire point of this video. If he had tested a 3060:

- It would match and exceed PS5 performance (unlike the 8 GB 2070)
- It would not hammer the PCIe bus, since it has enough VRAM budget (unlike the 8 GB 2070)

Here's how the 3060 performs at NATIVE 4K (similar to PS5's Fidelity mode) with similar RT settings:

It gets 35-40 frames, just like the PS5 does. Look at his 2070 video, constantly showing irregular frametimes, whereas the 3060 gets a smooth ride all around.

As I said, it is indeed the 2070 that is the problematic part here.

I have no idea how 8 GB RTX GPUs will age; considering these events, I'd simply say stay away from them for your own good. Wait for the 12 GB 4070.

There is nothing controversial about the PS5 being faster than a 2070... and it's doing that at 200W and $400.
Above that, PCs can scale better. I don't think anyone, or NXG, is arguing that.
VRAM or not, the PS5 is faster than the 2070 in this case by a good margin.

The 3060 doesn't destroy anything. It's a $500 GPU on its own, so of course you would expect it to match the PS5 or, ideally, be quite a bit faster.
And even with a best-spec PC, there are still some texture streaming issues and a few other tiny drawbacks. Nitpicking, but still.

You make it sound like NXG dropped some controversy here...
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
There is nothing controversial about the PS5 being faster than a 2070... and it's doing that at 200W and $400.
Above that, PCs can scale better. I don't think anyone, or NXG, is arguing that.
VRAM or not, the PS5 is faster than the 2070 in this case by a good margin.

The 3060 doesn't destroy anything. It's a $500 GPU on its own, so of course you would expect it to match the PS5 or, ideally, be quite a bit faster.
And even with a best-spec PC, there are still some texture streaming issues and a few other tiny drawbacks. Nitpicking, but still.

You make it sound like NXG dropped some controversy here...
What is happening on PC (and also why I like closed-box consoles with fixed hardware and the concept of generations ( ;) ), as well as well-documented, focused, very low-level access) is that it takes a lot longer for breakthroughs to happen in an ecosystem where hardware variety, pace of iteration, and interoperability are key.

DirectStorage with GPU decompression acceleration is still months to a year or so away, and its tooling took forever to be declared stable by MS. DirectStorage without GPU acceleration is still not widely used by many devs, if any at all.

You might have some modders or some brave new PC devs as early adopters, and that is the fun of having a PC. But the downside of PCs and mobiles (Android or iOS devices too, to some extent) is that it sometimes takes many more years for disruptive new tech to launch (as it needs to work with a huge variety of hardware and deal with/harmonise integration with many vendors), and when it launches it may not get much adoption, if any at all.

If what some Xbox fans say is true, that Xbox first-party titles are still not using much of, or pushing, their system-defining features like DirectStorage, it would not be too surprising in a phase where they are developed multiplatform with the PC in mind on day 1, rather than ported to the PC a year or so later by a dedicated team.
 
Last edited:

Darius87

Member
Making excuses like "it doesn't have enough VRAM, so it performs worse than the PS5 version" is like saying the PS5 performs worse than a 3090 because it doesn't have enough compute.
It is what it is: RAM is also hardware, and in my view it's more valuable than computation. It's not Sony's fault that NVIDIA or AMD made their cards without enough VRAM for future games; Sony went above and beyond with the SSD, which equates to even more RAM.
 

hlm666

Member
Alex doesn't have credibility on benchmarks, beyond emotionally attacking people on ResetEra or Twitter who question his findings or tests (even when those people are proven right later). He never updates info and does way too much guesswork; I would trust Gamers Nexus benchmarks before I trust any "benchmark" from Alex, and that's saying a lot. Alex should stick to what he is good at, which is discussing the construction of games (especially on the PC side) and tech implementations, as well as engine limitations or deficiencies. His benchmarking, and especially his analysis of things like resolution, graphical settings, and differences, has always been not only faulty but far more "I think" or "I believe" than anything close to objective. And that's without even mentioning his awful testing methods, which add so many variables: not testing each setup with the same parts with only the GPU differing, or using different drivers, OSes, etc. I'm not saying NX Gamer is perfect or even objective, and he definitely shouldn't be the only source, but my god, at least he is consistent and doesn't dip his toes into things he has zero clue or proficiency in, nor does he attack people who disagree with him. At least he shows how he came to his conclusions or findings, unlike Alex's "I think".
NXG does the exact same shit, only more often. He royally screws up, then does a Twitter poll to say people can't see the difference so his mistakes don't matter... yeah, that's a better way to do things, sure.
 
Er... did you even watch the video? I literally call this out and explain it, with a chapter titled "Memory" within the video. I show and discuss the VRAM issues, low textures, lower mips, how it is worse than the PS5 and the bigger 16GB of the RX 6800. Your post does nothing but confirm what my video covers.

I then discuss the VRAM-to-sysram issue, bandwidth and data bounds, and state for the 750Ti what I said last gen: 2GB was not going to cut it then, and 8GB will not now.

I get frustrated when people attack facts with no logic. Your argument is, "Well, if the GPU had more VRAM than it does, it would be performing better!"
Well yeah, of course. That argument is like saying that if my Fiesta had a Ferrari engine, it would be able to beat a Porsche (which it clearly is not going to). I see this pigeonholed logic a great deal in comments, and it misses the point of these tests and of how tests should be done. The fact is you cannot buy a 2070 or 3070 with anything other than 8GB, so this game, in this mode, on this card, performs as shown. All stated clearly in the video. You are arguing the same old "it is not a fair test"; this is not about that, nor should it ever be. This is about what the PS5 is doing and where it is performing. I do not see you and others here arguing that DF using a 12900K with an RX580 is madness and completely off from what a real system would be, do you? My rig here is a real example of what exists and is around the same level as the consoles' target.

Even that aside, you are skipping the other modes, where the flat Performance Mode (no RT) has clear GPU-bound points that still show a deficit to the PS5 when not CPU-constrained.


I mean, just read this comment: the GPU that is performing worse here has nothing to do with the PS5 performing better?!?!

The GPU can be, and often is, memory bound, but NOT 100% of the time, and that is not the only reason; as noted above, the 3060 with more VRAM is not suddenly leaping ahead in performance.
First of all, in your video you grossly misrepresented the number of PCs with GPUs more powerful than a 2070 (Super). You used percentages based off concurrent daily users instead of total monthly users, of which Steam has more than 135M as of last year. The same goes for CPUs with 8 cores or more, which is around 20%.

15% of 135M = ~20M PCs with GPUs better than PS5
20% of 135M = ~27M PCs with 8core+ CPUs
65% of 135M = ~87M PCs with 16GB+ RAM

So it's fair to say that there's around ~20M PCs that are of equivalent or greater specs than PS5 used on a monthly basis on Steam.
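To spell out that arithmetic (the 135M monthly-user figure and the survey shares are the assumptions here; landing on ~88M instead of ~87M is just rounding):

```python
# Sanity-checking the estimates above. The 135M monthly-user figure and the
# survey percentages come from the post itself; treat them as assumptions.
steam_monthly_users = 135_000_000

shares = {
    "GPU at or above ~2070S (PS5-class)": 0.15,
    "CPU with 8+ cores": 0.20,
    "16GB+ system RAM": 0.65,
}

for label, share in shares.items():
    print(f"{label}: ~{share * steam_monthly_users / 1e6:.0f}M PCs")
```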

As for your point about people not arguing about Digital Foundry using a 12900K with an RX580: that's because they are removing any and all CPU bottlenecks from the equation to test pure GPU performance, which is what people who test specific pieces of hardware do. They're very clear about what they are testing. Your method of testing a "system" rather than isolating a particular component is what leads everyone to constantly mention how you're bottlenecking your system. I mean, even here, you KNOW that your configurations are bottlenecked in SOME way, whether that's CPU, or VRAM, or a combination. Your justification for continuing to use them because they're "around the same level as consoles" doesn't change the fact that you're bottlenecking performance, and that you know you are.

Finally, call your damn 2070 what it is: a 2070. The fact that you go out of your way to OC it (which still doesn't match a 2070 Super) and continually present it as "2070 Super-like" is because you want to present the PS5 as just thaaaaaat much more impressive. C'mon now, it's things like that which cause people to say the things they do.

/edit\ Also, the DRS in this game is fucky and can often cause performance fluctuations where there are none without it. Most of your testing is done with DRS active, and it doesn't work the same way it does on consoles, so the testing is "eh" to begin with.
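To illustrate the kind of fluctuation I mean, here's a toy model (my own sketch, not the game's actual DRS logic; both timing numbers are made up for illustration) of a naive resolution controller that reacts a frame late in fixed steps. The scene load is perfectly steady, yet frame times bounce around:

```python
# Toy DRS model: a bang-bang controller that reacts to *last* frame's time
# in fixed 10% notches. Scene cost is constant, but frame times oscillate.
TARGET_MS = 16.7           # 60fps frame budget
COST_AT_NATIVE_MS = 18.0   # assumed steady scene cost at 100% resolution

scale = 1.0
prev_ms = COST_AT_NATIVE_MS  # seed: assume the first frame ran at native
for frame in range(10):
    frame_ms = COST_AT_NATIVE_MS * scale  # pretend cost scales with pixels
    print(f"frame {frame}: scale {scale:.2f} -> {frame_ms:5.2f} ms")
    if prev_ms > TARGET_MS:               # last frame missed budget: step down
        scale = max(0.5, scale - 0.1)
    else:                                 # last frame had headroom: step up
        scale = min(1.0, scale + 0.1)
    prev_ms = frame_ms
```

Run it and the frame time cycles between ~14.4ms and 18ms forever, because the controller always overshoots the 16.7ms target in one direction or the other.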
 
Last edited:

NXGamer

Member
First of all, in your video you grossly misrepresented the number of PCs with GPUs more powerful than a 2070 (Super). You used percentages based off concurrent daily users instead of total monthly users, of which Steam has more than 135M as of last year. The same goes for CPUs with 8 cores or more, which is around 20%.

15% of 135M = ~20M PCs with GPUs better than PS5
20% of 135M = ~27M PCs with 8core+ CPUs
65% of 135M = ~87M PCs with 16GB+ RAM

So it's fair to say that there's around ~20M PCs that are of equivalent or greater specs than PS5 used on a monthly basis on Steam.

As for your point about people not arguing about Digital Foundry using a 12900K with an RX580: that's because they are removing any and all CPU bottlenecks from the equation to test pure GPU performance, which is what people who test specific pieces of hardware do. They're very clear about what they are testing. Your method of testing a "system" rather than isolating a particular component is what leads everyone to constantly mention how you're bottlenecking your system. I mean, even here, you KNOW that your configurations are bottlenecked in SOME way, whether that's CPU, or VRAM, or a combination. Your justification for continuing to use them because they're "around the same level as consoles" doesn't change the fact that you're bottlenecking performance, and that you know you are.

Finally, call your damn 2070 what it is: a 2070. The fact that you go out of your way to OC it (which still doesn't match a 2070 Super) and continually present it as "2070 Super-like" is because you want to present the PS5 as just thaaaaaat much more impressive. C'mon now, it's things like that which cause people to say the things they do.
OK, the figures were based on the concurrent users, as I noted in the video, and then stated as approximate, showing the stats from the August survey. But yes, for argument's sake, let's take all 135 million users as unique (they are not): as you state, that is still ~20mil PCs around or above PS5/SX. So that is already a 10-million-smaller target base for the PC market, at best. The point in the video was not the numbers per se, but the target baseline for multi-platform titles moving forward; again, all in the video.

Second, I test that also, and even drop resolutions to 720p for CPU tests, etc. Nothing new here. But come on, mate, they found a 4GB 750Ti to test a 750Ti; are you seriously saying that is not some specific angle?

My point is not to test the GPU in this title, but the GAME with multiple configs. Even if I stuck a 5950X in that rig, at the same Fidelity-matched dynamic 4K shown, the 2070's results would not change, and that IS testing the GPU. I clearly call that out in the video, saying ONLY that mode is a GPU test.

Again, all I have read above is: my test is valid, and you agree, but it's not fair because it paints PC in an unfair light, since others have better hardware?
 
Last edited:

Hydroxy

Member
This is such a great port. I have locked it to 40fps in the Nvidia Control Panel; on high settings at 1080p with quality FSR, it stays at 40fps most of the time on my Nvidia 1650 laptop.
 

hlm666

Member
OK, the figures were based on the concurrent users, as I noted in the video, and then stated as approximate, showing the stats from the August survey. But yes, for argument's sake, let's take all 135 million users as unique (they are not): as you state, that is still ~20mil PCs around or above PS5/SX. So that is already a 10-million-smaller target base for the PC market, at best. The point in the video was not the numbers per se, but the target baseline for multi-platform titles moving forward; again, all in the video.
Can you explain how these dGPU shipments since 2021 equal only 20 million GPUs stronger than a PS5, at best? Turing ceased production before Ampere even released in Sep 2020; I'm not sure when RDNA stopped. Basing the whole PC userbase off Steam seems a little broken, no? CoD launches on Blizzard's launcher; whatever is popular on PC on Epic launches over there; Game Pass is the same deal, however many people game on that. Apex addicts would still just launch through EA; maybe some moved to Steam, but I'm guessing a lot of those Steam users were new. Not everyone with a gaming PC launches Steam every month, and we don't even know how many just hit cancel when the survey dialog pops up (I do; screw giving companies my data if they don't compensate me for it).


wfSNSdFsUnhBC64WYjasMT.png
 

NXGamer

Member
Can you explain how these dGPU shipments since 2021 equal only 20 million GPUs stronger than a PS5, at best? Turing ceased production before Ampere even released in Sep 2020; I'm not sure when RDNA stopped. Basing the whole PC userbase off Steam seems a little broken, no? CoD launches on Blizzard's launcher; whatever is popular on PC on Epic launches over there; Game Pass is the same deal, however many people game on that. Apex addicts would still just launch through EA; maybe some moved to Steam, but I'm guessing a lot of those Steam users were new. Not everyone with a gaming PC launches Steam every month, and we don't even know how many just hit cancel when the survey dialog pops up (I do; screw giving companies my data if they don't compensate me for it).


wfSNSdFsUnhBC64WYjasMT.png
Easy
Load Up Gold Rush GIF by Discovery
200.gif
 
I do not see you and others here arguing that DF using a 12900K with an RX580 is madness and completely off from what a real system would be, do you? My rig here is a real example of what exists and is around the same level as the consoles' target.

Exactly. And he is constantly doing those kinds of comparisons: using that CPU to test only the GPU on PC and claiming it's a fair benchmark against the PS5's GPU (not CPU + GPU). How can people not see how flawed the test is when we actually know the PS5's CPU is at about Ryzen 1700X or 2700 level? And we also know (thanks to them) that only 6.5 CPU cores are dedicated to the game on PS5.
 

yamaci17

Member
I don't know why, but it feels like you're misleading this thread. It rarely ever gets to 7GB usage on the 2070. Normally the game sees that the GPU has more memory, so it stores more files (just in case they're needed) in GDDR instead of RAM.

It performs as it should. That extra 4GB of GDDR doesn't make the 3060 perform way better than the 2070.


With Ray-Tracing


If this game were to be bottlenecked anywhere, it would be by the CPU.


It does. Hardware Unboxed's tests are done with high textures and are not run as extensively as PCGH's were.

With PCGH's extended tests, the 3080 chokes so much it performs like a 2080Ti.
 

yamaci17

Member
I thought I was bringing up the CPU in this game because it's arguably the most important triple-A game released recently, besides Cyberpunk, with heavy CPU utilization, and in benchmarks that matters a lot. This isn't like Death Stranding, where there isn't a radical change as you scale up; you can easily lose 10+ frames in this game depending on your CPU, which is why pairing the GPU with an inequitable CPU is off, to say the least. We wouldn't bench Spider-Man on a 3090 with a 12900K and then run that same benchmark on a 3060 with a 12600K; we would of course use the same CPU. I do agree it's fair to use a CPU slightly better than the one in the PS5, but we absolutely should not be using a 2x or especially 3x better CPU than the PS5's; that makes things inequivalent and adds a needless variable. I've personally never really bought that the 3070 matches the 2080Ti; that always felt like Nvidia marketing propped up by other sources. It can only perform like a 2080Ti in certain DLSS and RT benchmarks, basically never in rasterization, and usually falls between the Ti and the Super. I've always felt the 3070Ti is a closer match, at least equaling and sometimes surpassing the 2080Ti, which is why I disagree with you bringing that GPU up (it's not a flex to say the 2080Ti outperforms; that's the expected result, and not doing so would only look bad for the GPU).
Dude, c'mon. You're just overstretching things here. Even in the worst-case scenario, when not constrained by VRAM, the 2080Ti is only 4-7% faster than the 3070. In Spider-Man's case, the gap widens to 28%. I have literal PROOF that the 2080Ti performs like a 3080 at 4K with ray tracing and outperforms the 3070 by 30%, yet you, NX Gamer, and the other dude refuse to acknowledge it.

As long as you don't respect PCGH's benchmark and give it proper credit, I won't discuss this further with any of them, you included.

Go do your own tests with VERY HIGH textures at 4K with ray tracing, with 10+ minutes of playtesting, if you want to discredit PCGH's findings. I don't care about Hardware Unboxed's benchmarks because 1) they're not done extensively and 2) they're not done with very high textures.


1440p ray tracing;
Nr3AxIy.jpg


4k ray tracing

vdo4StG.jpg



Anyone with an IQ above 70 should understand this is a huge VRAM constraint, yet we still have people like NX Gamer saying "more memory won't get the 3060 more performance, hurr durr". No shit. The 2070 is underperforming; the 3060 is not gaining extra performance.

"There is nothing controversial for ps5 to be faster than 2070.... and it's using 200W at that at 400$."

Says the guy who turns every situation like this into a controversy aimed at the PC platform. The entire topic is a controversy created by NX Gamer and his false findings. Yes, he did. He very much did drop a controversy here. He dabbles in things he does not understand.

He thinks he "talked" about memory situations, yet he still doesn't understand: none of the issues he talks about happen when the GPU has enough budget. Texture streaming issues, PCIe hammering, all of them are caused by a huge VRAM bottleneck. What he fails to understand is that he somehow believes this happens by design, and that therefore the PC is faulty here. It is not. If you have a $329 MSRP 3060, none of that "design" happens. The game runs just like any other game when you have enough VRAM. Simple as that.

NX Gamer, you're simply a manipulator who spun a VRAM bottleneck into your own weird scenarios. The memory things you talked about have nothing to do with the memory situation I'm talking about. This is a VRAM bottleneck through and through, and the solution for 8-gig cards is not to reduce resolution or settings, but to reduce texture quality. You really fail to understand: in such a situation, you simply won't play with those very high textures. You will drop to high instead and get huge performance back. No one will give up 50%+ of their GPU performance to the "stalls" created by constant RAM-to-VRAM transfers on PC.

Also, 2 GB was pitted against the 8 GB total budget consoles had back then, of which 5.5 GB was available to games and most likely 3.5-4 GB to GPU operations; in this situation, 8 GB is pitted against the 16 GB total budget consoles have now, which most likely leaves 10 GB for GPU operations. 2 GB to 3.5 GB is a long stretch, but 8 GB to 10 GB isn't. You have to think in percentages here; if you were a logical person you would, but since you're not, you cannot see past your ignorance. 3.5 GB is 75% more than 2 GB, which is huge, but 10 GB is only 25% more than 8 GB. For the entire generation, the 1060 3 GB managed to run almost all console ports fine, even with ultra textures, and 4 GB GPUs still play every game okay at console-matched settings. Saying 8 GB will die is like saying 4 GB died last generation. The 2 GB you're talking about matches the current gen's 4 GB cards, which will rightfully age similarly to how 2 GB did.

The fact that you're trying to bridge the 2 GB GPUs to 8 GB GPUs is also hugely laughable. For the entire generation, 4 GB GPUs did fine, and 8 GB GPUs will do fine too, with small texture reductions here and there. Even in this game, dropping from very high to high textures largely solves every issue you can come across. This is not even a debate, beyond the fact that, with very high textures, an 8 GB buffer is not enough.
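Here's that percentage argument as plain arithmetic (the 3.5 GB and 10 GB console GPU budgets are the estimates from above, not official figures):

```python
# The generational VRAM argument as arithmetic. The console GPU-memory
# budgets (3.5GB last gen, 10GB now) are estimates, not official figures.
def pct_increase(card_gb: float, console_gb: float) -> float:
    return (console_gb - card_gb) / card_gb * 100

last_gen = pct_increase(2.0, 3.5)    # 2GB card vs ~3.5GB console GPU budget
this_gen = pct_increase(8.0, 10.0)   # 8GB card vs ~10GB console GPU budget

print(f"last gen: console budget ~{last_gen:.0f}% above a 2GB card")   # ~75%
print(f"this gen: console budget ~{this_gen:.0f}% above an 8GB card")  # ~25%
```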
 

yamaci17

Member
There is nothing controversial about the PS5 being faster than the 2070... and it's using 200W at that, at $400.
Above that, PCs can scale better. I don't think anyone, or NXG, is arguing that.
VRAM or not, the PS5 is faster than the 2070 in this case by a good margin.

The 3060 doesn't destroy anything. It's a $500 GPU on its own, so of course you would expect it to match the PS5, or ideally be quite a bit faster.
And even with a best-spec PC, there are still some texture streaming issues and a few other tiny drawbacks. Nitpicking, but still.

You make it sound like NXG dropped some controversy here...
No, you still don't get it. If you use high textures with all other settings matched to PS5, a 2070 will match PS5 performance.

That "good margin" is created by a huge VRAM bottleneck.

You can argue all you want; texture scaling settings are there for a reason.
 
Last edited:

yamaci17

Member
I don't know why, but it feels like you're misleading this thread. It rarely ever gets to 7GB usage on the 2070. Normally the game sees that the GPU has more memory, so it stores more files (just in case they're needed) in GDDR instead of RAM.

It performs as it should. That extra 4GB of GDDR doesn't make the 3060 perform way better than the 2070.


With Ray-Tracing


If this game were to be bottlenecked anywhere, it would be by the CPU.

Time and TIME again I have explained that the GAME caps out at 80% VRAM utilization. Why do you people refuse to understand?

I have literal video proof that performance skyrockets once textures are set to High. How can you deny such concrete proof and still push your own agendas?



44:05

"5.8 GB" VRAM usage
13 GB RAM usage (3.5 GB of system memory used as VRAM)
39 FPS at 99% utilization (actually, the GPU is heavily stalling)

44:23

"5.2 GB" VRAM usage
11 GB RAM usage (substitution memory is no longer used)
60 FPS at 80% utilization (stalls are gone, the GPU performs as it should)
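If you want intuition for why a 3.5 GB spill wrecks the frame rate that badly, here's a toy bandwidth model (my own sketch; both rates are illustrative assumptions, not measurements): once a slice of the working set has to come over PCIe instead of from VRAM, effective bandwidth collapses toward the PCIe rate.

```python
# Toy model: effective memory bandwidth when a fraction of the GPU's working
# set spills to system RAM and must be fetched over PCIe. Rates are assumed.
VRAM_BW = 448.0   # assumed GB/s (2070-class GDDR6)
PCIE_BW = 13.0    # assumed effective PCIe host-to-device rate, GB/s

def effective_bw(spill_fraction: float) -> float:
    # time-weighted (harmonic) mix: each byte comes from VRAM or over PCIe
    return 1.0 / (spill_fraction / PCIE_BW + (1.0 - spill_fraction) / VRAM_BW)

for f in (0.0, 0.05, 0.15, 0.35):
    print(f"{f:4.0%} of working set spilled -> ~{effective_bw(f):5.1f} GB/s effective")
```

Even a 5% spill roughly thirds the effective bandwidth in this model, and ~35% (about the 3.5 GB out of a ~9 GB working set above) leaves the GPU starved, which matches the "99% utilization but stalling" behaviour.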

Seeing "6.5-7 GB usage" is no meaningful metric in this game. If you don't have enough budget the game asks for with very high textures, you will run into the exact same issue where you lose %60-80 of your GPU performance. The game demands 9.5 GB+ for very high textures at 4K. simple. as. that. if you don't have it, you will have performance issues. end of the discussion

If not a single one of you acknowledges the situation happening in the video, I won't reply further. Simple as that. Explain how the CPU or any other bottleneck could cause a 70% performance difference between high and very high textures, and then we can discuss. Until then, I'm off. It is impossible to discuss anything sensible with you people.

I have literal proof that very high textures cause a performance drop on an 8-gig card.

 
Last edited:

Darius87

Member
As for your point about people not arguing about Digital Foundry using a 12900K with an RX580: that's because they are removing any and all CPU bottlenecks from the equation to test pure GPU performance, which is what people who test specific pieces of hardware do. They're very clear about what they are testing. Your method of testing a "system" rather than isolating a particular component is what leads everyone to constantly mention how you're bottlenecking your system. I mean, even here, you KNOW that your configurations are bottlenecked in SOME way, whether that's CPU, or VRAM, or a combination. Your justification for continuing to use them because they're "around the same level as consoles" doesn't change the fact that you're bottlenecking performance, and that you know you are.
They're removing bottlenecks for the PC while comparing it to a PS5 :messenger_grinning_smiling: where you can't remove bottlenecks, and of course a CPU which costs as much as a PS5 or more beats the PS5; that's what DF wants to see. The fairest comparison would be PS5 vs the PC spec closest to PS5, without any bottleneck-free parts, to see what PC is or isn't needed to match the PS5. Otherwise, remove consoles from the comparison, or be proud like DF and act like a $1000 PC beating a $500 console means something.
 

rofif

Can’t Git Gud
No, you still don't get it. If you use high textures with all other settings matched to PS5, a 2070 will match PS5 performance.

That "good margin" is created by a huge VRAM bottleneck.

You can argue all you want; texture scaling settings are there for a reason.
It really doesn't matter. The 2070 is what it is. I don't want to lower the textures.
 