
[NxGamer] Spider-Man Remastered PC vs PS5 vs Steam Deck vs PS4 vs 750Ti - Technical Review & Comparison

Man, if I can provide native 4K gameplay with high textures, then he would say I should use very high textures, while ignoring the 2080 Ti's 65+ FPS average since that rig has a 10900K; but if I can provide a 60+ FPS average with a 2700X and a 3070, just with high textures, again it is not valid.

I really wish the 3070 had 10 or 12 gigs; it truly saddens me to have to keep explaining all this. NVIDIA aimed this card at 1440p, which makes it a laughing stock at native 4K in niche situations like this, which NX Gamer preys upon.



I literally cannot get above 40 frames with VH textures. Simple as that. This is simply a limitation of VRAM, and it is not by DESIGN. It is what happens to all games universally when you RUN OUT OF VRAM...

Spoiler: I tried 1080p resolution on the 3080, and that low-texture streaming bug is still there.
 

ChiefDada

Gold Member
I really wish the 3070 had 10 or 12 gigs; it truly saddens me to have to keep explaining all this. NVIDIA aimed this card at 1440p, which makes it a laughing stock at native 4K in niche situations like this, which NX Gamer preys upon.

The bolded is just about the only part of your argument I don't understand/agree with; the GPU and VRAM are intertwined, and you are commenting as if they're not. I really wish the consoles had dedicated RT cores, but that doesn't mean I'm against comparing the 30 series with consoles for RT-heavy workloads. All hardware design choices have sacrifices.
 

01011001

Banned
I really wish the 3070 had 10 or 12 gigs; it truly saddens me to have to keep explaining all this. NVIDIA aimed this card at 1440p, which makes it a laughing stock at native 4K in niche situations like this, which NX Gamer preys upon.

Well, targeting 1440p makes a lot of sense. 1080p and 1440p are by far the most used resolutions on PC, after all, and for good reason: at the size of the typical PC monitor (25" to 27"), 1440p is more than enough and will also help you keep settings high without introducing unwanted resolution scaling.

The bigger issue might be the more VRAM-demanding games of the future. But maybe Nvidia was betting on DirectStorage helping their cards out in the long run.

edit: Steam HW Survey August 2022
 
Last edited:
I don't care; I don't know what the DLSS resolution is. It is unknown. It is also the high preset, which sets weather particles to high, which causes a huge 25% drop in the GPU render budget. That does not happen on PS5, because it uses medium weather particles.

I don't care about dynamic resolution; you can never know what resolution it drops to, or maybe it does not drop at all. The 2080 Ti is almost powerful enough to push native 4K 60 FPS in this game without dropping resolution.

You downplay 60 to 57-58 and then completely disregard 35-40 FPS drops on PS5 and say it proudly averages 45-50.

Bias is too strong with you; if you do not dial it back, I will have to ignore you and stop replying to you.


playsaves3 it is set to medium for all PS5 modes

You can literally prove it by looking at the title screen in idle. Only high particles create extra clouds in the background, whereas all PS5 modes have the static, non-clouded background, just like medium particles.
That's strange; NX Gamer mentioned it was high on the fidelity mode specifically, and the PS5 never drops to 35 FPS. I literally don't know if you saw something different, but the absolute lowest it can hit is right at 40.
 
What does this have to do with resolution now? You claim, or insinuate, that the good performance is down to the CPU. Anyone who sends you a successful 2080 Ti performance video, you refute by saying it has a strong CPU, implying that the 2080 Ti has nothing to do with it and the CPU alone makes it successful.

This video is proof that a 2700X is able to almost lock to a perfect 60 with RT enabled, regardless of the GPU or resolution being used. I don't even remember the resolution. If I had a 2080 Ti, I would test it out; instead I have the 8 GB variant of it. Naturally VRAM bogs down hard at 4K, so I won't even bother sending you a video of that, because you're too biased to even give credence to what I claim.
I'm talking about the equal-performance thing.
 

yamaci17

Member
That's strange; NX Gamer mentioned it was high on the fidelity mode specifically, and the PS5 never drops to 35 FPS. I literally don't know if you saw something different, but the absolute lowest it can hit is right at 40.
Okay bro, it never drops to 35 and drops to 40, okay? NX Gamer is free to prove that PS5 uses high weather particles by making cross-comparisons with PC.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Is that gaming or synthetic benchmarks?
Cinebench. They tried to do in-game benchmarks, but the chip only supports PCIe 2.0 bandwidth, which destroyed the performance of their 3090 systems, so any gaming tests were inconclusive. And since AMD disabled the iGPU cores, they weren't able to test any games on the actual chip itself. AMD does not recommend this CPU for gaming anyway; it's for desktop applications only.

[image: Cinebench benchmark chart]


As you can see, the 2700X is roughly 4% faster than the PS5 CPU, and the 2700 is roughly 5% slower than the 2700X, so it's a very good CPU to pair with the 32 CU, 10.6 TFLOPs 6600 XT or the 40 CU, 13.1 TFLOPs 6700 XT, which can be downclocked to get to 10.23 TFLOPs. The 6700 XT has higher bandwidth, which should bring it in line with the PS5's bandwidth, so that might be just a bit more accurate, but I haven't seen any 6600 XT vs 6700 XT benchmarks hampered by the lower memory bandwidth.
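If you want to sanity-check those TFLOPs numbers, it's the standard GCN/RDNA formula (CUs × 64 shaders × 2 ops per clock × clock speed); a rough sketch below, with the clock speeds being my own approximations:

def tflops(cus, clock_ghz):
    # CUs x 64 shaders x 2 ops (FMA) per clock x GHz -> TFLOPs
    return cus * 64 * 2 * clock_ghz / 1000

print(round(tflops(36, 2.23), 2))  # PS5: 36 CUs @ 2.23 GHz -> ~10.28
print(round(tflops(32, 2.59), 2))  # 6600 XT: 32 CUs @ ~2.59 GHz -> ~10.61
print(round(tflops(40, 2.56), 2))  # 6700 XT: 40 CUs @ ~2.56 GHz -> ~13.11
print(round(tflops(40, 2.00), 2))  # 6700 XT downclocked to ~2.0 GHz -> ~10.24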
 
Last edited:

s-bojan

Banned
I have an 8700K, 3000 MHz RAM, and a 3070, which I run in an x8 slot (as I've damaged the x16 one).
In most games I get better performance than on my PS5. Sometimes it's close, sometimes the difference is really large.

Far Cry 6 was the first game where the PS5 offered something that my PC could not: high textures.
I was not able to run Spider-Man with RT at a locked 1080p/60, even with textures set to high. This was the first game where I felt that playing on PS5 would be preferable.

My PC has many potential bottlenecks; honestly, I am not even sure why the performance is like that. But I really hope this is not a sign of things to come.
 
Last edited:
Okay bro, it never drops to 35 and drops to 40, okay? NX Gamer is free to prove that PS5 uses high weather particles by making cross-comparisons with PC.
Dude, I'm not personally offended, and I've appreciated having this discussion with you even with our disagreements. More people have the opportunity to do the PC vs PS5 VRR-mode tests, but no one but NX has bothered, so he is left as our sole source.
 
Cinebench. They tried to do in-game benchmarks, but the chip only supports PCIe 2.0 bandwidth, which destroyed the performance of their 3090 systems, so any gaming tests were inconclusive. And since AMD disabled the iGPU cores, they weren't able to test any games on the actual chip itself. AMD does not recommend this CPU for gaming anyway; it's for desktop applications only.

[image: Cinebench benchmark chart]


As you can see, the 2700X is roughly 4% faster than the PS5 CPU, and the 2700 is roughly 5% slower than the 2700X, so it's a very good CPU to pair with the 32 CU, 10.6 TFLOPs 6600 XT or the 40 CU, 13.1 TFLOPs 6700 XT, which can be downclocked to get to 10.23 TFLOPs. The 6700 XT has higher bandwidth, which should bring it in line with the PS5's bandwidth, so that might be just a bit more accurate, but I haven't seen any 6600 XT vs 6700 XT benchmarks hampered by the lower memory bandwidth.
The 6650 XT is the closer match to PS5, I'd say; the 6700 XT is more Series X level.
 

yamaci17

Member
Dude, I'm not personally offended, and I've appreciated having this discussion with you even with our disagreements. More people have the opportunity to do the PC vs PS5 VRR-mode tests, but no one but NX has bothered, so he is left as our sole source.
He's free to compare an RX 6800/Ryzen 3600 to PS5 in the exact same intro scene. Somehow he avoided doing that. Don't you think that is convenient?

If the RX 6800 ends up being much better than PS5 in matched scenes, then it clearly proves my point that the PS5 is not overachieving; it is the 3070 that is underperforming. The only way for PS5 to be overachieving is to get RT performance equivalent to the 2080 Ti's, and that is not happening either.
 
Last edited:

yamaci17

Member
The 6650 XT is the closer match to PS5, I'd say; the 6700 XT is more Series X level.
You will probably hate me for saying it, but at this point I don't give any respect to the 6600 XT / 6650 XT either. Initially I also thought they'd match PS5, but not anymore. At 1080p/1300p they can, but at 4K the 6600 XT/6650 XT shit the bed with their 128-bit interface and weak memory bandwidth, one being 280 GB/s (6650 XT) and the other 256 GB/s (6600 XT). The 6600 XT and 3060 Ti literally tie at 1080p, yet at 4K the 3060 Ti roflstomps it. There is something very wrong with those GPUs at higher resolutions that bogs down their compute capability.

The 6600 XT literally gets destroyed in Spider-Man at 4K, due to how badly its memory subsystem bottlenecks it.

At this point the only proper GPU we can compare to PS5, up to a point, is the 6700 XT (you can calculate how much better the 6700 XT is than the PS5, how much better it should theoretically be, and make your overachieving calculations from there). On the NVIDIA side, we have the 3060, which undershoots, and the 3080, which overshoots.
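For reference, the bandwidth figures above are simple to compute (GB/s = bus width in bits / 8 × effective memory speed in Gbps; the memory speeds below are the usual published specs, so treat them as approximate):

def mem_bw_gbs(bus_bits, gbps):
    # bus width in bytes per transfer, times effective Gbps per pin
    return bus_bits / 8 * gbps

print(mem_bw_gbs(128, 16.0))  # 6600 XT: 256 GB/s
print(mem_bw_gbs(128, 17.5))  # 6650 XT: 280 GB/s
print(mem_bw_gbs(192, 16.0))  # 6700 XT: 384 GB/s (plus Infinity Cache)
print(mem_bw_gbs(256, 14.0))  # 3060 Ti / PS5: 448 GB/s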
 
Last edited:
He's free to compare an RX 6800/Ryzen 3600 to PS5 in the exact same intro scene. Somehow he avoided doing that. Don't you think that is convenient?

If the RX 6800 ends up being much better than PS5 in matched scenes, then it clearly proves my point that the PS5 is not overachieving; it is the 3070 that is underperforming. The only way for PS5 to be overachieving is to get RT performance equivalent to the 2080 Ti's, and that is not happening either.

As I said, high particles put 25-30% extra strain on the GPU, which hugely skews the entire comparison towards PS5; by extension, all of the videos you've linked so far become invalid due to that single setting.
He mentioned in the video he wasn't testing every last configuration and then doing comparisons, because that would make this well over an hour-long video. You can look at the results, though; he mentioned it's 10-20% faster on average, and it holds up, with the framerate hovering in the 50-55 region mostly.
 

yamaci17

Member
He mentioned in the video he wasn't testing every last configuration and then doing comparisons, because that would make this well over an hour-long video. You can look at the results, though; he mentioned it's 10-20% faster on average, and it holds up, with the framerate hovering in the 50-55 region mostly.
BTW, I confirmed that the PS5 fidelity mode uses high particles. I retract what I said about that.

I now started to wonder whether it is really native 4K or not with that VRR mode. It is interesting. Supposedly fidelity mode has a dynamic range between 1584p and 2160p, with a cap of 30, to actually hold the 30 frames. Is there someone who can share a couple of lossless screenshots from Spider-Man Remastered in the VRR/fidelity unlocked-FPS mode? I can do the pixel counting if someone can provide shots. I tried to get some estimates from the ElAnalista and NXG videos, but the videos are too lossy for that.
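For anyone curious, a rough sketch of how the counting works (my own illustration of the manual technique, not anyone's actual tooling): on a lossless shot you find a hard, near-horizontal aliased edge and measure how many output pixels each stair-step spans; the step length is roughly the upscale factor.

def estimate_internal_res(out_w, out_h, step_px):
    # if one aliasing stair-step spans `step_px` output pixels, the image was
    # upscaled by ~step_px, so internal res ≈ output res / step_px
    return round(out_w / step_px), round(out_h / step_px)

print(estimate_internal_res(3840, 2160, 1.0))          # 1 px steps -> native 4K
print(estimate_internal_res(3840, 2160, 2160 / 1584))  # ~1.36 px steps -> ~1584p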
 
Last edited:

Corndog

Banned
Are you not realizing why it's bad to allow a PC not to be bottlenecked on the CPU side but potentially allow the console to be? Or that even when testing GPUs against each other, they have been caught using different CPU spreads?
I think you should always use the best cpu available so as not to hamstring the gpu. You are not really testing the gpu if it gets limited by the cpu.
 

Md Ray

Member
Cinebench. They tried to do in-game benchmarks, but the chip only supports PCIe 2.0 bandwidth, which destroyed the performance of their 3090 systems, so any gaming tests were inconclusive. And since AMD disabled the iGPU cores, they weren't able to test any games on the actual chip itself. AMD does not recommend this CPU for gaming anyway; it's for desktop applications only.

[image: Cinebench benchmark chart]


As you can see, the 2700X is roughly 4% faster than the PS5 CPU, and the 2700 is roughly 5% slower than the 2700X, so it's a very good CPU to pair with the 32 CU, 10.6 TFLOPs 6600 XT or the 40 CU, 13.1 TFLOPs 6700 XT, which can be downclocked to get to 10.23 TFLOPs. The 6700 XT has higher bandwidth, which should bring it in line with the PS5's bandwidth, so that might be just a bit more accurate, but I haven't seen any 6600 XT vs 6700 XT benchmarks hampered by the lower memory bandwidth.
Cinebench is not representative of real-world gaming performance; don't judge gaming perf from CB, as it is pretty much dependent on core/thread count and very sensitive to core clocks. Things like system memory bandwidth, the amount of cache and its latency, and DRAM latency do not matter at all there, whereas for gaming they absolutely do! Gaming workloads are often nothing like CB. Spider-Man, for example, heavily thrashes system memory bandwidth. Had the PS5 SoC with its GDDR6 NOT been limited by PCIe 2.0, it would have performed significantly faster than the 2700X with DDR4 in this game, due to the vastly higher system memory bandwidth GDDR6 offers the CPU. So I don't think you can say the PS5 CPU performs roughly equivalent to a 2700 just from looking at their Cinebench perf.
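To put rough numbers on that bandwidth gap (back-of-envelope, my own figures): dual-channel DDR4 gives channels × 64 bits × MT/s / 8, versus the PS5's 256-bit GDDR6 at 14 Gbps, shared between CPU and GPU:

def ddr_bw_gbs(channels, bus_bits, mts):
    # channels x bus width in bytes x mega-transfers/s -> GB/s
    return channels * bus_bits / 8 * mts / 1000

print(ddr_bw_gbs(2, 64, 3200))  # dual-channel DDR4-3200: 51.2 GB/s
print(256 / 8 * 14)             # PS5 GDDR6: 448 GB/s (shared with the GPU)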
 

yamaci17

Member
Cinebench is not representative of real-world gaming performance; don't judge gaming perf from CB, as it is pretty much dependent on core/thread count and very sensitive to core clocks. Things like system memory bandwidth, the amount of cache and its latency, and DRAM latency do not matter at all there, whereas for gaming they absolutely do! Gaming workloads are often nothing like CB. Spider-Man, for example, heavily thrashes system memory bandwidth. Had the PS5 SoC with its GDDR6 NOT been limited by PCIe 2.0, it would have performed significantly faster than the 2700X with DDR4 in this game, due to the vastly higher system memory bandwidth GDDR6 offers the CPU. So I don't think you can say the PS5 CPU performs roughly equivalent to a 2700 just from looking at their Cinebench perf.
Sure as hell the PS5 outperforms the 2700X in this game.

My 2700X tops out at 55-65 FPS CPU-bound with RT enabled, whereas PS5 can easily hit framerates upwards of 75+ with its unlocked VRR mode at dynamic 1440p. Maybe the decompression hammers the CPU, that could be a reason too, but in the end it is what it is; I would say a 3700X is a better match in the case of Spider-Man ray tracing. (I'm specifically talking about swinging in the open world. Indoor scenes, or sticking to a wall, are another story, where you get better CPU-bound performance.)
 
Last edited:

lukilladog

Member
I have an 8700K, 3000 MHz RAM, and a 3070, which I run in an x8 slot (as I've damaged the x16 one).
In most games I get better performance than on my PS5. Sometimes it's close, sometimes the difference is really large.

Far Cry 6 was the first game where the PS5 offered something that my PC could not: high textures.
I was not able to run Spider-Man with RT at a locked 1080p/60, even with textures set to high. This was the first game where I felt that playing on PS5 would be preferable.

My PC has many potential bottlenecks; honestly, I am not even sure why the performance is like that. But I really hope this is not a sign of things to come.

Do you mean the HD textures in Far Cry 6?... Some area in specific? On my PC it seems to jump between 7000 and 7500 MB; it must be transferring textures all the time, and your PCIe port doesn't cut it.
 
You will probably hate me for saying it, but at this point I don't give any respect to the 6600 XT / 6650 XT either. Initially I also thought they'd match PS5, but not anymore. At 1080p/1300p they can, but at 4K the 6600 XT/6650 XT shit the bed with their 128-bit interface and weak memory bandwidth, one being 280 GB/s (6650 XT) and the other 256 GB/s (6600 XT).

The 6600 XT literally gets destroyed in Spider-Man at 4K, due to how badly its memory subsystem bottlenecks it.

At this point the only proper GPU we can compare to PS5, up to a point, is the 6700 XT (you can calculate how much better the 6700 XT is than the PS5, how much better it should theoretically be, and make your overachieving calculations from there). On the NVIDIA side, we have the 3060, which undershoots, and the 3080, which overshoots.
Is this actually true? I thought AMD GPUs had the better bus.
I think you should always use the best cpu available so as not to hamstring the gpu. You are not really testing the gpu if it gets limited by the cpu.
Yeah, 'cause PC is afforded that luxury but not console… totally equivalent though.
 

SlimySnake

Flashless at the Golden Globes
I think you should always use the best cpu available so as not to hamstring the gpu. You are not really testing the gpu if it gets limited by the cpu.
When you are doing PC GPU-to-GPU comparisons, yes, but this DF fool doesn't realize that the Zen 2 CPUs in the consoles are heavily cut-down versions of the PC parts that also have to share precious memory bandwidth with the GPUs.

Using a 12-core, 24-thread CPU running at 5.2 GHz in a comparison with console GPUs is fucking retarded. He should instead be handicapping the PC GPUs with shitty CPUs like the 2700 to see just where the console GPUs stand relative to PC GPUs. The guy really badly wants to push the narrative that the console GPUs are equivalent to a 2060, and any time they outperform it he goes into panic mode.
 
Last edited:
When you are doing PC GPU-to-GPU comparisons, yes, but this DF fool doesn't realize that the Zen 2 CPUs in the consoles are heavily cut-down versions of the PC parts that also have to share precious memory bandwidth with the GPUs.

Using a 12-core, 24-thread CPU running at 5.2 GHz in a comparison with console GPUs is fucking retarded. He should instead be handicapping the PC GPUs with shitty CPUs like the 2700 to see just where the console GPUs stand relative to PC GPUs. The guy really badly wants to push the narrative that the console GPUs are equivalent to a 2060, and any time they outperform it he goes into panic mode.
I think we all sometimes take all DF videos as "benchmarking" PS5 power vs PC, but DF's goal was different. It was about what kind of GPU and CPU one would need to get a "PS5-like" experience. That's a very different goal from pure benchmarking, because Spider-Man isn't suitable for pure benchmarking.

A 2060S (not 2060) can, as usual, provide a ballpark PS5-level experience on the GPU side in this game, but with a caveat: the CPU has to be stronger than usual, since Spider-Man suffers from PC-specific CPU-side optimisation issues (related to RT, decompression, etc.). They illustrated this by pairing the 2060S with an overpowered CPU (thus brute-forcing through the CPU optimisation issues) and showing that, unconstrained, the 2060S is enough to deliver a PS5-level experience. (Again, this was NOT a benchmarking exercise, just an analysis of the type of GPU needed to provide a PS5-like experience.) Once that's done, they show how a moderate CPU like the 3600 underperforms in the game. I think this is a rational way to test and illustrate their goal.
 
I think we all sometimes take all DF videos as "benchmarking" PS5 power vs PC, but DF's goal was different. It was about what kind of GPU and CPU one would need to get a "PS5-like" experience. That's a very different goal from pure benchmarking, because Spider-Man isn't suitable for pure benchmarking.

A 2060S (not 2060) can, as usual, provide a ballpark PS5-level experience on the GPU side in this game, but with a caveat: the CPU has to be stronger than usual, since Spider-Man suffers from PC-specific CPU-side optimisation issues (related to RT, decompression, etc.). They illustrated this by pairing the 2060S with an overpowered CPU (thus brute-forcing through the CPU optimisation issues) and showing that, unconstrained, the 2060S is enough to deliver a PS5-level experience. (Again, this was NOT a benchmarking exercise, just an analysis of the type of GPU needed to provide a PS5-like experience.) Once that's done, they show how a moderate CPU like the 3600 underperforms in the game. I think this is a rational way to test and illustrate their goal.
Eh, they didn't show that; you can get a PS5-like experience on the visual front but not on the performance front.
 

sn0man

Member
That's a long video to excuse a sub-par PC port.


I kind of buy that the console can achieve more because you can code a little closer to a single console SKU vs dozens of PC configurations. Which is kind of the opposite of what you're saying.
 
Eh, they didn't show that; you can get a PS5-like experience on the visual front but not on the performance front.
That's because the settings don't match. If Nixxes had included an "original" setting like what's available in Death Stranding or HZD, it would have been far easier. So they gave ballpark PS5 settings. In some cases those settings give better quality on PC (better AF and better-looking RT), but in others they are worse (crowd density etc.). But it's close enough. Ballpark. And a 2060S CAN get that ballpark PS5-level quality when paired with a stronger CPU to overcome this port's CPU optimisation issues. It's that simple.
 

Dampf

Member
Sure as hell the PS5 outperforms the 2700X in this game.

My 2700X tops out at 55-65 FPS CPU-bound with RT enabled, whereas PS5 can easily hit framerates upwards of 75+ with its unlocked VRR mode at dynamic 1440p. Maybe the decompression hammers the CPU, that could be a reason too, but in the end it is what it is; I would say a 3700X is a better match in the case of Spider-Man ray tracing. (I'm specifically talking about swinging in the open world. Indoor scenes, or sticking to a wall, are another story, where you get better CPU-bound performance.)
Keep in mind the PS5 uses lower settings than PC at max, which you are probably using.

[image: PS5-equivalent settings table]


From DF. Try using them.

That is the case with nearly all games, BTW. People compare PC performance at max settings to PS5 and wonder why the PS5 is so much faster, when the PS5 is actually running a combination of low, medium and high settings.
 
Last edited:

DenchDeckard

Moderated wildly
Would be really funny if Nixxes did some extra work and managed to patch the performance. I wonder if NXGamer would then revisit...
 
Last edited:

Dampf

Member
Yes I agree with the others here that this is likely a VRAM constraint.

Setting the 2070 to 4K resolution will completely overflow its VRAM, even with DRS on. Remember, setting it to 4K also means the textures are using the mip level bias appropriate for 4K, regardless of whether DRS is enabled or not.

The PS5 should have around 7-9 GB available as video memory, and the game is apparently set to use 6.7 GB at maximum on 8 GB PC GPUs, so the reason why the PS5 performs much better in NXGamer's video is quite obvious.
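A rough illustration of why the 4K mip bias alone hurts (my own numbers and assumptions, not Nixxes' actual streaming logic; BC7-compressed textures are about 1 byte per texel):

def mip_chain_mb(top_dim, bytes_per_texel=1.0, skip_top_mips=0):
    # total size of a square texture's mip chain, optionally leaving the
    # highest-resolution mips unloaded (as a lower-res target would)
    total, dim = 0, top_dim >> skip_top_mips
    while dim >= 1:
        total += dim * dim * bytes_per_texel
        dim //= 2
    return total / 2**20

print(round(mip_chain_mb(4096), 1))                   # full chain: ~21.3 MB
print(round(mip_chain_mb(4096, skip_top_mips=1), 1))  # mip 0 unloaded: ~5.3 MB

Dropping just the top mip cuts roughly 75% per texture, which is why a lower-res target fits comfortably in 8 GB while a 4K target with very high textures blows past the ~6.7 GB budget.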

NXGamer, were you using the PS5-equivalent settings I mentioned above? Could you try them again, but this time use low texture quality and report back your findings? Thank you!
 
Last edited:

Loxus

Member
I think we all sometimes take all DF videos as "benchmarking" PS5 power Vs PC, DFs goal was different. It was about what kind of GPU and CPU would one need to get a "PS5 like" experience. That's a very different goal to pure benchmarking, because Spiderman isn't suitable for pure benchmarking.

A 2060s (not 2060) can as usual provide ballpark PS5 level experience on the GPU side in this game but with a caveat: the CPU has to be stronger than usual since Spiderman suffers from PC specific CPU side optimisation issues (related to RT, decompression etc). They illustrated this by pairing the 2060s with an overpowered CPU (thus bruteforcing through the CPU optimisation issues) and showing that unconstrained, the 2060s is enough to deliver a PS5 level experience. (Again, this was NOT a benchmarking exercise, just a analysis of the type of GPU needed to provide a PS5 like experience). Once that's done they show how a moderate CPU like the 3600 underperformes in the game. I think this is a rational way to test and illustrate their goal.
I decided to re-watch Digital Foundry's Control Performance reviews and realized something.

In this video, we see the PS5's RT uncapped performance.


[image: Control on PS5, uncapped RT mode performance capture]

Most of their benchmarks show the PS5 sitting in the mid 40s, 45 fps on average.

In this article by Digital Foundry, we can see the PS5 console settings applied to the 6600 XT, which gives an average of 32 fps.
AMD Radeon RX 6600 XT review: ray-tracing performance

Control Benchmarks
[image: Control benchmark chart]


If the console settings were applied across the board, would this give us a proper example of the PS5's GPU RT performance compared to other GPUs?

The console settings seem to give an 11 fps improvement.
6600XT: 32fps
6700XT: 36fps
6800: 44fps
6800XT: 49fps
6900XT: 52fps

What's even more bizarre, this is the area used for the benchmarking by Digital Foundry.


And this is the same area on PS5.
[image: the same benchmark area on PS5]


There is one problem though.
The RT on PS5 is checkerboarded, while on PC it's full resolution and cannot be lowered.
So to be fair, I added 10 fps in an attempt to match PS5's checkerboarded RT.

Console settings with checkerboarded RT?
6600XT: 42fps
6700XT: 46fps
PS5: 49fps
3060: 50fps
6800: 54fps
6800XT: 59fps
6900XT: 62fps

All of the above is just a thought experiment.
In my opinion, with good optimisation:
PS5 RT off = 2070 Super - 2080 Super range.
PS5 RT on = 2060 - 2070 range.

One thing to note: the PS5's performance is the sum of all its parts working together to punch above its weight. This applies to the Xbox Series consoles as well.
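One way to sanity-check that +10 fps fudge (my own model, not DF data): if RT work is some fraction of the frame time and checkerboarding roughly halves that work, then:

def checkerboard_fps(full_rt_fps, rt_frac=0.5):
    # halve the RT share of each frame, keep the rest unchanged
    frame_ms = 1000 / full_rt_fps
    new_ms = frame_ms * (1 - rt_frac) + frame_ms * rt_frac / 2
    return 1000 / new_ms

print(round(checkerboard_fps(32), 1))  # 6600 XT: 32 -> ~42.7 fps (+10.7)
print(round(checkerboard_fps(36), 1))  # 6700 XT: 36 -> ~48.0 fps (+12.0)

Assuming RT is about half the frame cost, the +10 fps guess lands in the right ballpark for the lower cards.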
 
Last edited:

yamaci17

Member
Keep in mind the PS5 uses lower settings than PC at max, which you are probably using.

[image: PS5-equivalent settings table]


From DF. Try using them.

That is the case with nearly all games, BTW. People compare PC performance at max settings to PS5 and wonder why the PS5 is so much faster, when the PS5 is actually running a combination of low, medium and high settings.
No, I'm using exactly the PS5-equivalent settings xd
 

yamaci17

Member
I decided to re-watch Digital Foundry's Control Performance reviews and realized something.

In this video, we see the PS5's RT uncapped performance.




Console settings with checkerboarded RT?
6600XT: 42fps
6700XT: 46fps
PS5: 49fps
3060: 50fps
6800: 54fps
6800XT: 59fps
6900XT: 62fps

All of the above is just a thought experiment.
In my opinion, with good optimisation:
PS5 RT off = 2070 Super - 2080 Super range.
PS5 RT on = 2060 - 2070 range.

One thing to note: the PS5's performance is the sum of all its parts working together to punch above its weight. This applies to the Xbox Series consoles as well.


This has always been the case, and it still is. The PS5 performs just a tad above the 3060 in Spider-Man with RT enabled. Since Spider-Man's RT implementation is lighter than in most other games, the PS5's rasterization can shine through. You can see this with AMD GPUs versus NVIDIA GPUs in light RT games such as Far Cry 6. Even in this game, the RX 6700 XT and above are highly competitive in ray tracing against NV cards.

What happens here, however, is that people misuse this for their own arguments: the 2070 is 2x slower than the PS5 in an abnormal case of VRAM bottleneck, and people take that fact to claim the PS5 is 1.5-2x faster than the 2070 even with RT.

Individually, it is correct: at native 4K with very high textures, the PS5 does indeed beat the 2070. But it still does not change the fact that the PS5's performance level is near a 3060, which would practically be a 2070 if not for VRAM bottlenecks.

Let's imagine we have five people:

Person A can prepare 10 dishes per hour (PS5)
Person B can prepare 10 dishes per hour (2070)
Person C can prepare 10 dishes per hour (3060)
Person D can prepare 15 dishes per hour (3070)
Person E can prepare 15 dishes per hour (2080ti)

Person B (2070) and Person C (3060) are twins, but one of them buckles under stress: Person B.
Up until that stress point, people who support Person A are constantly reminded how they can only prepare as many dishes as Person B and Person C.

Suddenly, Person B buckles under stress and starts to prepare 5 dishes per hour. It is astounding; everyone knew him as the person who did 10 dishes per hour.

Person D also drops from 15 dishes to 10 dishes, which puts him in a comical situation.

But in the same stressful situation, Person C and Person E just keep doing their usual dishes per hour. Person A also keeps doing the exact same 10 dishes per hour. The relation between Person A and Persons C and E stays exactly the same.

In the end: Person A is not "overperforming", and neither are Person C and Person E.
If Person A were suddenly preparing 15 dishes per hour like Person E, you could say Person A did more work to get a better result. However, that is not the case.

The entire case is Person B and Person D buckling heavily under stress (a huge VRAM bottleneck).

As I said countless other times: there are people in this very thread who keep asking reviewers to use a 2700X for a proper match, yet now all of a sudden it's "but the 3070 is a package!!1 it is not Sony's fault!1". Whenever I told them that a high-end rig is also a package that comes with a faster Zen 3 CPU, they retaliated by saying it is unfair. Now, when you ask them to fairly compare the PS5 with a GPU that has enough VRAM budget, they refuse to acknowledge it.
 

yamaci17

Member
playsaves3

My final post to you, to make you understand for good this time. Do note that I have a 2700X, the so-called PS5-capable CPU.

[image: comparison scene, PS5 vs 2070]


Let's see. In this scene we supposedly see the PS5 being 38% faster than the 2070; therefore, NX Gamer claims the PS5 is almost performing like a 3070 here.

Let's see, with low textures but matched PS5 settings (high particles, high RT, 10 RT object range and so on):

The 3070 renders a whopping 56 frames. This is the raw brute power the 3070 has. It is 30% faster than the PS5, which is what it should be.

[screenshot: 3070 with low textures]


If I use very high textures:
[screenshot: 3070 with very high textures]



I lose 36% of my frames.
Practically the entire advantage the 3070 has over the PS5.

Now, what I'm trying to convey to you is that if the 3070 had 12 GB of memory, this would not happen. Because this does not happen to the 2080 Ti; it always performs 30% above the PS5, in almost every case.

In that shot with the 2070 and PS5, the 2070 is handicapped by 30%. If it did not have the huge VRAM bottleneck, it would perform almost on par with the PS5, maybe only trailing 5% behind it, instead of trailing 38% behind.
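Quick check that the percentages above hang together (using only the numbers in this post; the PS5 figure is implied, not measured):

fps_3070_low_tex = 56
fps_ps5 = fps_3070_low_tex / 1.30                # "30% faster than PS5" -> ~43.1
fps_3070_vh_tex = fps_3070_low_tex * (1 - 0.36)  # "I lose 36%" -> ~35.8

print(round(fps_ps5, 1), round(fps_3070_vh_tex, 1))
print(round(fps_ps5 / fps_3070_vh_tex, 2))  # PS5 now ~1.2x the VRAM-starved 3070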

As I said, this is a VRAM bottleneck issue; it has nothing to do with architectural differences. VRAM-bound bottlenecks have been known to cause severe framerate drops since the mid-2000s. It is common knowledge that this is what happens when you run out of VRAM.

This is as clear as I can get.

Both my 2700X and my 3070 are capable of pushing a 56 FPS average there. If I had a 2080 Ti instead of a 3070, I would get a 56-57 FPS average with VERY HIGH textures. But I cannot; I'm hugely constrained by VRAM. A very simple concept, misused by NX Gamer as the irrelevant foundation for his comical deductions.

When we compare the PS5 to equivalent GPUs, we compare their raw strength and power. But when that power is strangled by another factor, i.e. VRAM, things change. The 3070 is not performing like the usual 3070 in this case; if it had more VRAM, it would.

The GPU has the grunt; it just cannot show it, because it stalls and waits for the VRAM to do its job. It's not a normal thing to happen; you never want this to happen. You either lower the texture resolution or the game resolution.

Yes, in the end, the 8 GB 2070 cannot match the PS5 experience at 4K. No one denies this. But it does not change the fact that the PS5 is not overperforming; it is the 8 GB GPUs that are underperforming, compared to PS5.
 

avin

Member
I think I get it. NXGamer is looking at a PS5 game with a poorly optimized PC port and, further, is using hardware configurations on which it's likely to run especially poorly. But I end up learning a lot from these threads, so at least for me, there's a major upside.

avin
 
Last edited:

Ywap

Member
I'm playing on a 12700K / 3080 / 32 GB DDR4 and have to play at 1080p 50 Hz with their own upsampling method enabled (it looks best to me) to eliminate framerate drops at max settings.

It doesn't matter which upsampling method I enable; the game still drops below 60 frequently at 60 Hz.

Still, I can't complain; the game looks and plays great at 50 Hz for a BFI OLED addict like me.
 
Last edited:
I decided to re-watch Digital Foundry's Control Performance reviews and realized something.

In this video, we see the PS5's RT uncapped performance.


[image: Control on PS5, uncapped RT mode performance capture]

Most of their benchmarks show the PS5 sitting in the mid 40s, 45 fps on average.

In this article by Digital Foundry, we can see the PS5 console settings applied to the 6600 XT, which gives an average of 32 fps.
AMD Radeon RX 6600 XT review: ray-tracing performance

Control Benchmarks
[image: Control benchmark chart]


If the console settings were applied across the board, would this give us a proper example of the PS5's GPU RT performance compared to other GPUs?

The console settings seem to give an 11 fps improvement.
6600XT: 32fps
6700XT: 36fps
6800: 44fps
6800XT: 49fps
6900XT: 52fps

What's even more bizarre, this is the area used for the benchmarking by Digital Foundry.


And this is the same area on PS5.
[image: the same benchmark area on PS5]


There is one problem though.
The RT on PS5 is checkerboarded, while on PC it's full resolution and cannot be lowered.
So to be fair, I added 10 fps in an attempt to match PS5's checkerboarded RT.

Console settings with checkerboarded RT?
6600XT: 42fps
6700XT: 46fps
PS5: 49fps
3060: 50fps
6800: 54fps
6800XT: 59fps
6900XT: 62fps

All of the above is just a thought experiment.
In my opinion, with good optimisation:
PS5 RT off = 2070 Super - 2080 Super range.
PS5 RT on = 2060 - 2070 range.

One thing to note: the PS5's performance is the sum of all its parts working together to punch above its weight. This applies to the Xbox Series consoles as well.




Hmmm... why are you posting a PC ultra RT benchmark vs PS5 RT, knowing full well that you can get well over 100 fps on PC at such low settings?

[image: Control benchmark at lower settings]
 
playsaves3

My final post to you, to make you understand for good this time. Do note that I have a 2700X, the so-called PS5-capable CPU.

[image: comparison scene, PS5 vs 2070]


Let's see. In this scene we supposedly see the PS5 being 38% faster than the 2070; therefore, NX Gamer claims the PS5 is almost performing like a 3070 here.

Let's see, with low textures but matched PS5 settings (high particles, high RT, 10 RT object range and so on):

The 3070 renders a whopping 56 frames. This is the raw brute power the 3070 has. It is 30% faster than the PS5, which is what it should be.

[screenshot: 3070 with low textures]


If I use very high textures:
[screenshot: 3070 with very high textures]



I lose 36% of my frames.
Practically the entire advantage the 3070 has over the PS5.

Now, what I'm trying to convey to you is that if the 3070 had 12 GB of memory, this would not happen. Because this does not happen to the 2080 Ti; it always performs 30% above the PS5, in almost every case.

In that shot with the 2070 and PS5, the 2070 is handicapped by 30%. If it did not have the huge VRAM bottleneck, it would perform almost on par with the PS5, maybe only trailing 5% behind it, instead of trailing 38% behind.

As I said, this is a VRAM bottleneck issue; it has nothing to do with architectural differences. VRAM-bound bottlenecks have been known to cause severe framerate drops since the mid-2000s. It is common knowledge that this is what happens when you run out of VRAM.

This is as clear as I can get.

Both my 2700X and my 3070 are capable of pushing a 56 FPS average there. If I had a 2080 Ti instead of a 3070, I would get a 56-57 FPS average with VERY HIGH textures. But I cannot; I'm hugely constrained by VRAM. A very simple concept, misused by NX Gamer as the irrelevant foundation for his comical deductions.

When we compare the PS5 to equivalent GPUs, we compare their raw strength and power. But when that power is strangled by another factor, i.e. VRAM, things change. The 3070 is not performing like the usual 3070 in this case; if it had more VRAM, it would.

The GPU has the grunt; it just cannot show it, because it stalls and waits for the VRAM to do its job. It's not a normal thing to happen; you never want this to happen. You either lower the texture resolution or the game resolution.

Yes, in the end, the 8 GB 2070 cannot match the PS5 experience at 4K. No one denies this. But it does not change the fact that the PS5 is not overperforming; it is the 8 GB GPUs that are underperforming, compared to PS5.
Bit off topic, but it's weird that NX Gamer's framerate average is lower than ElAnalistaDeBits' for the PS5 VRR mode. I'd assume NX Gamer's is more accurate, though.
 
That's because the settings don't match. If Nixxes had included an "original" setting like what's available in Death Stranding or HZD, it would have been far easier. So they gave ballpark PS5 settings. In some cases those settings give better quality on PC (better AF and better-looking RT), but in others they are worse (crowd density etc.). But it's close enough. Ballpark. And a 2060S CAN get that ballpark PS5-level quality when paired with a stronger CPU to overcome this port's CPU optimisation issues. It's that simple.
NX Gamer actually tested a lighter setting in his benchmark than what the PS5 uses, since it's a custom in-between, to drive his point even further (for example, one of the settings in perf RT is between medium and high, so he used the medium setting on the PC, so there are no excuses for it). He usually does this in his settings benchmarks, to show the bare minimum difference between PC and console. So it's showing you didn't watch the video and take whatever DF says at face value.
 
NX Gamer actually tested a lighter setting in his benchmark than what the PS5 uses, since it's a custom in-between, to drive his point even further (for example, one of the settings in perf RT is between medium and high, so he used the medium setting on the PC, so there are no excuses for it). He usually does this in his settings benchmarks, to show the bare minimum difference between PC and console. So it's showing you didn't watch the video and take whatever DF says at face value.
NXG's testing methodologies are extremely suspect, as many have pointed out already (with examples). I'm not going to repeat them. But hey, you are free to believe what you want to believe.
 
NXG's testing methodologies are extremely suspect, as many have pointed out already (with examples). I'm not going to repeat them. But hey, you are free to believe what you want to believe.
No, your original claim was that it was unfair to PC because he was using a higher setting there than on console, when it's actually the opposite: he alleviates any extra load for PC when there are discrepancies. In actuality, if there were an exact PC setting for what the PS5 uses in that mode, the performance would be EVEN WORSE than what he showed on screen.
 

Rubim

Member
No, your original claim was that it was unfair to PC because he was using a higher setting there than on console, when it's actually the opposite: he alleviates any extra load for PC when there are discrepancies. In actuality, if there were an exact PC setting for what the PS5 uses in that mode, the performance would be EVEN WORSE than what he showed on screen.
The unfair side is more that his graphics card is severely limited by his CPU.

But that mipmap issue happens even on a 3080 Ti 24 GB.
So is it a memory issue?
Is it a PCIe budget issue?

What's actually going on?
 
The unfair side is more that his graphics card is severely limited by his CPU.

But that mipmap issue happens even on a 3080 Ti 24 GB.
So is it a memory issue?
Is it a PCIe budget issue?

What's actually going on?
I wasn't talking about mipmaps in my post; I was solely talking about graphical settings and the custom preset on console. As for CPU limits, that GPU is being limited by it just as much as the PS5 is by its CPU, so it's still equal in the end.
 

Loxus

Member


Hmmm... why are you posting a PC ultra RT benchmark vs PS5 RT, knowing full well that you can get well over 100 fps on PC at such low settings?

[image: Control benchmark at lower settings]
Posts like these are how you know the person didn't even bother to read what you posted.

If you had bothered to read everything, you would have understood that I used the PS5 console settings from Digital Foundry, which has the 6600 XT @ 32 fps and the PS5 @ 49 fps.

It can't be that hard; all of us went to school, correct?
 
Last edited: