
[NxGamer] Spider-Man Remastered PC vs PS5 vs Steam Deck vs PS4 vs 750Ti - Technical Review & Comparison

SlimySnake

Flashless at the Golden Globes
I don't know what to say about NXG's review.

He pushes the PCIe bandwidth narrative, but he's testing with a 2070, which has neither ReBAR nor PCIe 4.0. Then he continues with an RX 6800 and R5 3600, but it's been shown that RDNA2 benefits most from Zen 3 or Intel 11th gen or newer for ReBAR.

He's also testing a 2070 + R7 2700X. All fine and good, but Turing and Ampere kinda hate pre-Zen 3 CPUs for whatever reason. Testing a 5700 XT for pure rasterization with the 2700X, or a 6600 XT with the 2700X, would've been a more accurate take on the situation.

Then again, with how heavily the game streams and decompresses data, it seems to completely overwhelm/invalidate RDNA2's Infinity Cache, and the cards start acting like pure GDDR6 cards.

What a conundrum.
His testing methodology drives me mad. He keeps pairing the 2700 with his 2070, holding back the 2070 in virtually every single comparison he's made in the last two years. Now he's trying to say that his CPU is in the top 10%. Come on.

His 2070 OC is not a 2070 Super either. I just don't understand why he keeps saying this when we can see the OC clocks stay at 1950 MHz during gameplay. The 2070 is a 36 CU GPU; the 2070 Super is 40 CUs at the same damn clocks. The OC is simply not on par with the 2070 Super.

Here is a 2070 Super paired with a 3600. You can see the clocks there: 1950-1980 MHz during gameplay. The card retains its 40 CU advantage. That's 8.9 tflops for his OC card vs 9.9 tflops for a 2070 Super at 1.95 GHz.
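
For anyone checking the math: these tflops figures fall straight out of shader count × 2 ops per clock × clock speed. A quick sketch in Python (the shader counts and the PS5 clock are public specs; the 1.95 GHz is the observed gameplay clock from above):

```python
# FP32 TFLOPS = shaders x 2 ops/clock (FMA) x clock in GHz / 1000
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

print(tflops(2304, 1.95))  # RTX 2070: 36 SMs x 64 = 2304 shaders @ 1.95 GHz -> ~9.0 (the "8.9")
print(tflops(2560, 1.95))  # RTX 2070 Super: 40 SMs = 2560 shaders          -> ~10.0 (the "9.9")
print(tflops(2304, 2.23))  # PS5: 36 CUs = 2304 shaders @ 2.23 GHz          -> ~10.3
```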



With all that said, using the 2700 to gauge PS5 GPU performance is probably better than using a better CPU like Alex does. Both of them incorrectly use the excuse that these games are not CPU bottlenecked, but we have seen the same cards perform way better when paired with a better CPU than the 2700. I've asked NXGamer to use the 2070 in his 3600 PC, but he refuses to do so. Regardless, the PS5 CPU is roughly on par with the 2700, so it stands to reason that its GPU is being held back for the same reasons NX Gamer's 2070 is not performing as well.

I think NX Gamer's PS5 to PC comparisons would be far more accurate if he started using CPUs and GPUs that are more or less equivalent to the PS5. He's halfway there with the CPU; he simply needs to go out and get the 10.6 tflops 6600 XT. Or the 13.1 tflops 6700 XT and downclock it to hit 10.23 tflops. Comparing an 8.9 tflops Nvidia card to a 10.2 tflops PS5 GPU just doesn't make sense. Neither does comparing it to a 6 core/12 thread 3600 paired up with a 16 tflops 6800.

Alex is completely fucking hopeless, but NX Gamer is very close to getting these things right. I saw the 6600 XT for just $250 the other day. If these fools don't buy it, I might go and grab it myself. But my 8 core/16 thread CPU runs at 4.9-5.1 GHz, so it won't be an accurate comparison.
 

Mr Moose

Member
I will always believe DF over anyone else

vKO3cdf.jpg

"The biggest upgrade of all? Hardware-accelerated ray tracing support. I'm happy to see a range of granular settings here, but very high exceeds the high setting typically used by PS5. Building geometry boiled down into flat textures on consoles are fully modelled on PC, texture quality is on an altogether different level quality, while the amount of associated world detail - and shadows - also get fully reflected, sometimes where PS5 has no reflections at all. Nixxes also offers PC users the chance to push out draw distance on objects/crowds/traffic beyond the console standard - and that ties into the level of detail/crowd/traffic settings I've already discussed. Just remember that the more you push RT, the higher the load on both GPU and CPU.

The biggest upgrade offered by PC concerns ray tracing, where we're looking at a night and day improvement in every single way.
Very high vs high (PS5). DF's "optimised settings" are high, the same as PS5. You need a beefy GPU/CPU if you want to go the extra mile.
Here is a 2070 Super paired with a 3600. You can see the clocks there: 1950-1980 MHz during gameplay. The card retains its 40 CU advantage. That's 8.9 tflops for his OC card vs 9.9 tflops for a 2070 Super at 1.95 GHz.


That's using DLSS.
 

SlimySnake

Flashless at the Golden Globes
That's using DLSS.
I know. I am not using that video as a comparison to his results. I'm pointing out the average clocks when running this game on a 2070 Super. They are identical to his 2070 OC's clocks. You do the math: his 2070 is NOT 9.4 tflops during gameplay when it's averaging 1950 MHz. It's 8.9 tflops vs the 9.9 tflops of the 2070 Super.
 

Mr Moose

Member
I know. I am not using that video as a comparison to his results. I'm pointing out the average clocks when running this game on a 2070 Super. They are identical to his 2070 OC's clocks. You do the math: his 2070 is NOT 9.4 tflops during gameplay when it's averaging 1950 MHz. It's 8.9 tflops vs the 9.9 tflops of the 2070 Super.
It's 8.8-9.3 TF.
8.8 at 1920 MHz, 9.3 at 2025 MHz (from the quick check I did; I will try to see what the max MHz is).
Seems to be 2010-2025 MHz during gameplay.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
I found some screenshots I had taken in 2020 comparing the pedestrian density in Performance vs Fidelity mode. The insane number of pedestrians in Fidelity mode was the first thing that jumped out at me. I was in a fight, throwing around a bunch of mailboxes and trash cans, and they all reacted very convincingly. I don't think I've seen 100+ pedestrians in a city street at once since AC Unity. It was extremely impressive even in Performance mode.

What Alex is doing here is bizarre. This is not equivalent to PS5's performance mode. The NPC and traffic density should both be hitting the CPU hard, and even the GPU, since they all cast shadows and the vehicles all cast RT reflections. How the fuck are these optimized settings anyway when you are downgrading the image and the entire experience so severely? The whole point of optimized settings is to have the image looking virtually the same while making cutbacks you wouldn't even notice.

Fidelity:
EsoI5tdXAAISvOz

Performance:

EsoI0BFXUAU_ccO


Alex:
jOISPZt.png
 

SlimySnake

Flashless at the Golden Globes
Seems to be 2010-2025 MHz during gameplay.
Where? I've timestamped the gameplay for you while he's flying around. It's 1950 MHz locked. It drops to 1860 MHz when he enters cutscenes, only jumping above 1950 MHz in some very rare cutscenes.

During gameplay it is always 1950 MHz. The 2070 Super is the same 1950 MHz with jumps over that. This is true for virtually every single 20 series card. I had a 2080; I've seen it go over 2000 MHz in some cases, but on average it's a 1950 MHz card.

 

Mr Moose

Member
Where? I've timestamped the gameplay for you while he's flying around. It's 1950 MHz locked. It drops to 1860 MHz when he enters cutscenes, only jumping above 1950 MHz in some very rare cutscenes.

During gameplay it is always 1950 MHz. The 2070 Super is the same 1950 MHz with jumps over that. This is true for virtually every single 20 series card. I had a 2080; I've seen it go over 2000 MHz in some cases, but on average it's a 1950 MHz card.




With 1860-2025 MHz, that would put it roughly between 8.6 and 9.3 TF. The Super would be roughly 10.
Is that where this pic is from? Alex? Deserted as F.
 
Last edited:

ChiefDada

Gold Member
And he mentions the PS5 hUMA memory system is some significant advantage.
hUMA was how the PS4 + X1 did memory, and yet it didn't really prove that advantageous compared to comparable PCs back then.


Well yes, that's because consoles were still behind the curve: the HDD, a garbage CPU, and an OK GPU. So a unified memory setup with specific benefits wasn't enough to close the gap in any meaningful way.

Now that the console SSD I/O setup is superior and the CPU/GPU are respectable, these factors matter and make comparisons interesting (at least for the here and now).
 

winjer

Gold Member
His testing methodology drives me mad. He keeps pairing the 2700 with his 2070, holding back the 2070 in virtually every single comparison he's made in the last two years. […]


It's much worse than that. As shown in previous benchmarks he has done, his PC has serious performance problems: very low performance from his 2700X and 3600, and a system that uses much more RAM than it should.
Alex's testing has problems and he is very biased, but his testing methods are not as fucked up as NXGamer's. Not even close.
I don't understand why people still post his crap analysis, knowing that his PC is completely broken.
 
I will always believe DF over anyone else […]

Exactly. What's the point in making all those comparisons if you can't choose RT settings as low as the PS5's?

The diff is staggering:
vKO3cdf.jpeg
 

Mr Moose

Member
It's much worse than that. As shown in previous benchmarks he has done, his PC has serious performance problems: very low performance from his 2700X and 3600, and a system that uses much more RAM than it should.
Alex's testing has problems and he is very biased, but his testing methods are not as fucked up as NXGamer's. Not even close.
I don't understand why people still post his crap analysis, knowing that his PC is completely broken.
Forgot to enable XMP.
Exactly. What's the point in making all those comparisons if you can't choose RT settings as low as the PS5's?

The diff is staggering:
vKO3cdf.jpeg
High vs very high.
 

winjer

Gold Member
Forgot to enable XMP.

Considering the gap in performance and his memory usage, probably much more than that. Maybe a system filled with bloatware, or several programs running in the background. Or even an old, broken installation of Windows.
When he still participated here, some users tried to help him fix his PC. But he refused, saying everyone else was wrong.
Regardless, his results with his PC are all invalid and should not be taken seriously.
The guy is very incompetent at benchmarking.
 

Sleepwalker

Member
It's much worse than that. As shown in previous benchmarks he has done, his PC has serious performance problems: very low performance from his 2700X and 3600, and a system that uses much more RAM than it should.
Alex's testing has problems and he is very biased, but his testing methods are not as fucked up as NXGamer's. Not even close.
I don't understand why people still post his crap analysis, knowing that his PC is completely broken.
Everyone knows why.
 
Again, it shows how biased DF are. If I understood correctly, they compared PC uncapped vs PS5 capped, while completely disregarding the unstable framerate (sub-60fps drops) on PC, with a bit of cherry-picking in the selected scenes.
It's borderline dishonest. On the other hand, NXGamer is doing the fairest comparison, as usual. Like in Death Stranding, where he found the PS5 was often near 3700X level.
Alex already shows his bias when he does GPU benchmarks against a console without a console-equivalent CPU.
 
I will always believe DF over anyone else […]
That’s a 3090 my guy not pc in general you would hope there is some differences between a 3090 and ps5
 
An unfair comparison, since it was optimized for the 2060. Both consoles generally perform higher than a 2060 in other titles. It even goes as far as a 2070 Super if the game is more raster heavy and less RT is used.
It's always at least a 2070 Super in pure rasterization performance; it's in RT where it drops to the 2070 Super range.
 
They get accused of going both ways, and maybe there is some bias, but with the exception of John I think it's mostly incompetence.
In this case I think they straight up forgot about the PS5 VRR modes, and in Alex's case he deflated the settings as much as possible, like crowd and traffic density.
 
The 3070 result is interesting, but this could be due to the ray tracing tech being designed to take advantage of the PS5 hardware. Pretty much every other ray tracing game offers way better performance than even the 2080, which is on par with if not better than the PS5 in standard rasterization games.
The 2080 is usually equal to the PS5 in pure rasterization, at least in games that aren't Nvidia sponsored.
 
I don't know what to say about NXG's review. […]
Why would he not test a GPU benchmark with a Zen 2 CPU like the PS5's? Do we test 3090s with a 12900K vs a 3050 with a 12600K?
 

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
Lol, and he wants to appear professional when the clown doesn't even have a 2070 Super.
 
His testing methodology drives me mad. He keeps pairing the 2700 with his 2070, holding back the 2070 in virtually every single comparison he's made in the last two years. […]

If we are talking about being held back, how do we know the Zen 2 in the PS5 isn't holding the GPU back? For all we know, we could get much better performance out of the PS5 GPU if it had Zen 3 in it.
 

Sosokrates

Report me if I continue to console war
Well yes, that's because consoles were still behind the curve: the HDD, a garbage CPU, and an OK GPU. So a unified memory setup with specific benefits wasn't enough to close the gap in any meaningful way.

Now that the console SSD I/O setup is superior and the CPU/GPU are respectable, these factors matter and make comparisons interesting (at least for the here and now).

Yeah, not currently, but with DirectStorage and modern PC tech I think you will get similar results, probably better with the best tech.
The thing is, though, we already know there can be some advantages in closed systems, but they're going to be trivial.

At the end of the day, a modern PC comparable to the PS5 is going to give a very similar experience. The PS5 does not have custom implementations or hardware that will cause a significant performance improvement across the board. NX even says the PS5 won't have a performance advantage across all games, so I don't really know why he's bringing it up.
 

SlimySnake

Flashless at the Golden Globes
If we are talking about being held back, how do we know the Zen 2 in the PS5 isn't holding the GPU back? For all we know, we could get much better performance out of the PS5 GPU if it had Zen 3 in it.
That's precisely why they should be using a Zen 2 CPU in their testing. I only suggested the 2700 because the PS5 has a cut-down version of the Zen 2 8 core/16 thread CPU. It only has 8 MB of L3 cache, which limits its performance profile. So it performs like a Zen 1 CPU such as the 2700, instead of the Zen 2 3000 series CPUs like the 3600.

There was a leak of a PS5 SoC a few years ago, and someone ran benchmarks which indicated the PS5 CPU performs roughly on par with a 2700.
 

ChiefDada

Gold Member
Yeah, not currently, but with DirectStorage and modern PC tech I think you will get similar results, probably better with the best tech.
The thing is, though, we already know there can be some advantages in closed systems, but they're going to be trivial.

I don't think the PC/PS5 delta is trivial here; I didn't think a cross-gen game such as Spider-Man was going to reveal the PC memory shortcomings in such a drastic way. But if you think it's trivial, hey, we can all have different opinions.

At the end of the day, a modern PC comparable to the PS5 is going to give a very similar experience. The PS5 does not have custom implementations or hardware that will cause a significant performance improvement across the board. NX even says the PS5 won't have a performance advantage across all games, so I don't really know why he's bringing it up.

Yeah, sure, but he also says the exact opposite for 1st party - he expects PC real-world performance to be worse next to PS5 as the generation continues, particularly with Sony 1st party games that eventually make their way to PC. That is why he predicts a long period between a PS5 game and the subsequent PC release. I tend to agree with him. If I cared much for multiplatform, I probably wouldn't own a PS5. I think most PC players are more interested in Sony's 1st party roadmap of PC ports than any others.
 
I really laughed when he figured that ~3M PC gamers have higher end GPUs than the 2070... lmao.

lol.png


He's using DAILY CONCURRENT ACTIVE USERS... and basing the percentages off of that... not the MONTHLY ACTIVE users of Steam, which is ~135 million. The Steam hardware survey is done monthly.

About 15% of Steam is on hardware better than an RTX 2070; 15% of 135M = ~20M users with GPUs beating the PS5... and that's RTX alone, not counting AMD at all.
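
The gap comes straight out of which population you multiply the survey percentage against. A throwaway sketch using the figures above (the ~20M daily concurrent base is inferred from his ~3M number, not an official Steam stat):

```python
# Figures quoted in the post above, not official Steam numbers.
monthly_active = 135_000_000     # Steam monthly active users (the survey's actual population)
daily_concurrent = 20_000_000    # rough concurrent base implied by his ~3M figure
share_above_2070 = 0.15          # share of surveyed users on GPUs above an RTX 2070

print(monthly_active * share_above_2070 / 1e6)    # ~20M users
print(daily_concurrent * share_above_2070 / 1e6)  # ~3M -- the number he arrived at
```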
 

Sosokrates

Report me if I continue to console war
PC memory shortcomings
The thing is, though, we don't know if it's that; it could just be that more work is needed on the development side.
Yeah sure but he also says the exact opposite for 1st party - he expects PC real world performance to be worse next to PS5 as the generation continues, particularly with Sony 1st party

If Sony first party tailors a game to the PS5, of course it's not going to run as well on PC, and Sony probably won't go the extra mile to further optimise the PC version. That does not mean the PS5 has some inherently superior hardware in it that can't be matched or surpassed by slightly more powerful PC hardware.

The funny thing about NX is that he thinks the PS5 is built in a way which gives it a significant advantage over comparable PC hardware.
And while I agree consoles do have some slight advantages over PC in real-world performance, it's trivial.

The guy really just wants to be redeemed for saying the PS5 is something special when it was first revealed. The guy has a pretty big ego.
 
I really laughed when he figured that ~3M PC gamers have higher end GPUs than the 2070... lmao. […]

Shocked Oh No GIF by Yêu Lu
 

yamaci17

Member
The thing is, though, we don't know if it's that; it could just be that more work is needed on the development side. […]
The game is VRAM bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory); the game then constantly starts using normal RAM as a substitute for VRAM.

The test is all wrong, and he clearly failed to understand what made his 2070 perform like crap.

Not once in his video did he comment on full VRAM utilization. The performance difference between the 2070 and the PS5 stems entirely from running out of VRAM.

In this specific shot, performance dropped because of running out of VRAM, not because "PS5 has better performance".

qvxpHqa.jpg



I actually proved it in the exact same intro scene:


Even high textures cause a performance drop. With low textures, I manage to get 56 frames. So an 8 GB buffer cannot even fit high textures + ray tracing together at 4K without a performance drop.

Only at native 1440p are 8 GB + very high textures usable, and only in short bursts of playtime. (Not 4K + upscaling; 4K + upscaling uses 4K LODs and therefore stresses VRAM further.)

This is not to say NVIDIA and RTX cards are not to blame here. They are. Lack of VRAM causes these abnormalities, and opportunists like NX Gamer took a nice slice out of this problem for themselves.

The entire reason the game constantly hammers PCIe is not how the engine is built, but that the game artificially limits itself to a maximum of 6.4 GB of VRAM and uses normal RAM as a substitute. It is not doing anything fancy: no game on PC should ever rely on constantly transferring texture data on the fly from RAM to VRAM. PCIe 4 won't solve that either; RAM to VRAM will always be slow, will always stall the GPU into worse performance, and will always create more problems.
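
To put rough numbers on why that RAM-to-VRAM path stalls the GPU, here's a back-of-the-envelope sketch using spec-sheet bandwidths (public figures, not measurements from the video):

```python
# Spec-sheet peak bandwidths; sustained real-world throughput is lower.
pcie3_x16 = 15.75    # GB/s, PCIe 3.0 x16 one-way (the 2070's bus)
pcie4_x16 = 31.5     # GB/s, PCIe 4.0 x16 -- still nowhere near VRAM speed
gddr6_2070 = 448.0   # GB/s, RTX 2070 local VRAM (256-bit, 14 Gbps GDDR6)

print(gddr6_2070 / pcie3_x16)  # ~28x: cost of serving texture data from system RAM
print(gddr6_2070 / pcie4_x16)  # ~14x: PCIe 4 halves the pain, doesn't remove it
```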

In this specific case, this happens because the GPU does not have enough VRAM for what the game demands at 4K/RT + very high textures. This config requires a solid, uninterrupted 10 GB buffer, which the RX 6800 has, so it does not run into similar performance issues.

If he actually had the courage to test and compare the RX 6800 to the PS5 with RT enabled in similar scenes, he would also see that the PS5 is not "outperforming" the other GPUs, as it would be decimated by the RX 6800. Notice how he cleverly avoided making a direct comparison between the RX 6800 and the PS5; it would undermine everything he talked about in the very same video.

The PS5 only "seemingly" destroys the 2070 because the buffer is at its limits.

Just my points and observations; you can do whatever you want with this info. I don't like the dude, but sadly 8 gigs of VRAM simply cannot enable the features that the PS5 can with its uncompromised 10 GB budget, which creates anomalies like this. And it will continue to happen, especially if developers like Nixxes decide to artificially limit VRAM utilization to 6.4 GB, supposedly to leave "available VRAM" for background operations, even when you have nothing running in the background.

The RTX 3060 existing alone destroys the entire point of this video. If he tested a 3060:

- It would match and exceed PS5 performance (unlike the 8 GB 2070)
- It would not hammer PCIe, since it has enough VRAM budget (unlike the 8 GB 2070)

Here's how the 3060 performs at NATIVE 4K (similar to PS5's fidelity mode) with similar RT settings:



It gets 35-40 frames, just like the PS5 does. Look at his 2070 video, with constantly irregular frametimes, whereas the 3060 has a smooth ride all around.

As I said, it is indeed the 2070 that's the problematic part here.

I have no idea how 8 gig RTX GPUs will age considering these events; I'd simply say stay away from them for your own good. Wait for the 12 GB 4070.
 
Last edited:

Tchu-Espresso

likes mayo on everthing and can't dance
The game is VRAM bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory) […]

PS5 seems to be above 40fps most of the time in unlocked fidelity mode.
 

yamaci17

Member
PS5 seems to be above 40fps most of the time in unlocked fidelity mode.
It mostly performs great compared to the 2070. No need to comment further: the game clearly uses a clear-cut 9.1-9.2 GB on the 3060 with the exact same settings used on the 2070 in his video. See the 2070 video again and see how it is stranded at 6.4 GB. That deficit, a whopping 2.5-3 GB worth of data, is being compensated by RAM-to-VRAM transactions, which stall the 2070 into much worse performance.
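
If anyone wants to check this on their own card, the cap is easy to observe by polling the driver while the game runs. A minimal sketch using NVIDIA's NVML Python bindings (an illustration, not something from his video):

```python
import time
import pynvml  # NVIDIA NVML bindings: pip install nvidia-ml-py

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

# Sample VRAM usage once a second while the game runs; an 8 GB card
# pinned near ~6.4 GB would show the cap described above.
for _ in range(10):
    mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
    print(f"{mem.used / 2**30:.2f} GiB used / {mem.total / 2**30:.2f} GiB total")
    time.sleep(1)

pynvml.nvmlShutdown()
```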
 

Hoddi

Member
It mostly performs great compared to the 2070. No need to comment further: the game clearly uses a clear-cut 9.1-9.2 GB on the 3060 with the exact same settings used on the 2070 in his video. See the 2070 video again and see how it is stranded at 6.4 GB. That deficit, a whopping 2.5-3 GB worth of data, is being compensated by RAM-to-VRAM transactions, which stall the 2070 into much worse performance.
This is 100% the issue. It's blatantly obvious that his card is running out of VRAM, because the PCIe bus would never be transferring at 8-16 GB/s if it weren't. No surprise that it doesn't happen on higher-VRAM GPUs.

I mean, this is absolute beginner's stuff.
 
Last edited:

Loxus

Member
The game is VRAM bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory) […]

I'm not too sure what's going on as I haven't fully watched the video yet, but the 2070 and the 3060 basically have the same performance.

4rCzIUU.jpg


I always thought it was a given that PS5 performance was somewhere between a 2070 Super and a 2080 - a 6700 non-XT, if that were a card.
I guess it depends on the game.

zAtvcQX.jpg


I never thought I would hear of the PS5 performing as well as a 3070 though; that's crazy if true.
 
Last edited:
The game is VRAM bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory) […]

That dude in your vid is using a better-than-PS5 CPU instead of an equivalent one. Let me see a 3060 tested with either a 2700X or a 3700X.
 
The game is VRAM bottlenecked almost all the time if you enable RT on an 8 GB card (primarily because the game limits itself to using only 6.4 GB of memory) […]

Also, even in that vid you're showing, he's mostly under 40, unlike the PS5, which is always above 40 and usually hovering in the 45-50 range.
 

yamaci17

Member
Also, even in that vid you're showing, he's mostly under 40, unlike the PS5, which is always above 40 and usually hovering in the 45-50 range.
Different locations / different regions. Just a few seconds later the 3060 too shoots above 40.

A 2080 Ti will easily destroy and wreck the PS5 in this game with matched settings at 4K; there is simply no competition.

Don't twist my words or the videos I've linked; I just sent that as a proof of concept. In NX Gamer's video, he claims the 2070 is hovering around 25 whereas the PS5 gets 45, and this naturally puts the PS5 at something like a 3070 according to him, whereas in reality the 3060 is capable of pushing 35/40+ in the exact same location, meaning the 2070 would be capable too, if it had enough VRAM.
 

yamaci17

Member
I'm not too sure what's going on as I haven't fully watched the video yet, but the 2070 and the 3060 basically have the same performance. […]
The 3060 is not outperforming the 2070.
The PS5 is not outperforming the 2070.

The 2070 is underperforming compared to all its equivalent-power hardware.
Even the 3080 underperforms compared to the 2080 Ti at 4K.

Practically, VRAM-constrained cards cannot perform as they should. Clear as crystal.

vUDE31i.jpg


The reason the 3080 also buckles at 4K is because the game only uses a total of 80% of available VRAM.
As a result:
8 GB cards have a 6.4 GB budget,
10 GB cards have an 8 GB budget (and 4K + ray tracing breaches the budget, hence it underperforms),
11 GB cards have an 8.8 GB budget (still not enough, but better than 8),
12 GB cards have a 9.6 GB budget (the only proper budget that can truly match the PS5's allocated memory).
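
That flat 80% cap is easy to tabulate against the ~9.1-9.2 GB working set observed on the 3060 earlier. A quick sketch (the cap and working set are the numbers claimed above, not confirmed by Nixxes):

```python
# Hypothetical flat 80% in-game VRAM cap, per the claim above; the ~9.2 GB
# working set is the unconstrained allocation observed on the 12 GB 3060.
CAP = 0.80
working_set = 9.2  # GB

for vram in (8, 10, 11, 12):
    budget = vram * CAP
    status = "fits" if budget >= working_set else f"short by {working_set - budget:.1f} GB"
    print(f"{vram} GB card -> {budget:.1f} GB budget ({status})")
```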


The VRAM constraint buckles the 3080 to the point where it performs almost like a 2080 Ti, whereas the 3070 is constrained so much that it drops nearly 30% of its performance.

The 2070/2070S also drop about 30%, which is why they fall below 30 and often hang around 25 in his obnoxious video.

These problems 100% exist. Steam discussions are full of people saying they keep having performance issues after 15 minutes of playtime. The entire reason all reviewers seem to ignore this is that it would create a PR nightmare for NVIDIA. Even the 10 gig 3080 falters, to the point where it performs like a 2080 Ti.

All the issues the community experiences stem from the exact same VRAM conundrum.
 
Last edited: