
[NxGamer] Spider-Man Remastered PC vs PS5 vs Steam Deck vs PS4 vs 750Ti - Technical Review & Comparison

yamaci17

Member
It really doesn't matter. 2070 is what it is. I don't want to lower the textures
You knew what product you purchased. You can practically trade it for a 3060 and get your precious maxed textures with the exact same compute power (maybe a bit more for the 3060); in the second-hand market here I'm seeing 3060s on par with 2070s in price. Maybe this isn't an ideal solution either, but it is what it is.

I can, for example, easily trade my 3070 for a 6700 XT and get money on top of it if I want my ultra textures in Spider-Man. It's not like you have no choices and are trapped with a GPU. It is what it is; the 2 GB budget gap is there.

In this case the gap is 3.6 GB, because the developer put an active 80% allocation cap in place for whatever reason. If the game actually tried to use all available VRAM, it would of course perform better than what we see in VRAM-constrained situations. But that's a choice they made.
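Rough numbers behind that 3.6 GB figure, as a back-of-envelope Python sketch (the ~10 GB game-visible budget assumed for the PS5 is an illustrative assumption, not something stated in the video):

gpu_vram_gb = 8.0          # RTX 2070 / 3070 class cards
allocation_cap = 0.80      # the ~80% cap the port reportedly enforces
usable_vram_gb = gpu_vram_gb * allocation_cap   # 6.4 GB the game will actually touch
ps5_game_memory_gb = 10.0  # assumed VRAM-like budget a PS5 title can lean on
gap_gb = ps5_game_memory_gb - usable_vram_gb    # 3.6 GB, matching the figure above
print(f"usable: {usable_vram_gb:.1f} GB, gap vs PS5: {gap_gb:.1f} GB")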

As I said, you could claim the PS4 outperforms a 2 GB GTX 770, but no one says that. Instead we say the PS4 performs like a 4 GB GTX 1050.

Even NX Gamer's video is laughable in that respect. He specifically picked the 2 GB 750 Ti to show how badly it stacks up against the PS4, whereas DF managed to find a 4 GB 750 Ti, and theirs managed 900p/30 FPS with respectable texture quality. The 750 Ti is a dead product anyway.

Even a year ago I was personally telling people to stay away from 8 GB models. Whenever I said this, I was met with criticism that VRAM isn't that important, that the 3060 can't utilize it, and so on. I'm the last person you can talk to like this, since I've always known and openly said that 8 GB is not enough to match all the bells and whistles a PS5 might have. If you had asked my advice back at the start of 2021, I'd have told you exactly this: if you want actually good 3070 performance at 4K in Spider-Man, you have to play with High textures. That's it, quite simply. 2070 or 3070 Ti doesn't matter.

I've done my part. I've never said the 2070 Super/3070 would match the PS5 all the time; they matched the PS5 in the non-VRAM-constrained games of the past, but we're not living in the past either. And the 3060 may not provide enough raster power to properly match a PS5 either. So the best bet is to wait for the 4000 series and see if they actually sell GPUs at sane prices. Get a 12 GB 4070 and it can perform above the PS5 until the end of the generation, just like the GTX 970 performed above the PS4 for the entire generation, whereas the GTX 770 performed like dogshit.
 

yamaci17

Member


At 1440p this issue doesn't happen the way it does at 4K. An 8 GB budget was never marketed at or aimed at 4K. The PS5 is capable of it, yes, and has enough budget. The 3070 is simply a gimped 4K-capable GPU that is heavily constrained by its VRAM.

If you play the game at the intended 1440p resolution, you will get the performance you're supposed to get most of the time, even with Very High textures.

If you play at 4K, even with DLSS or FSR, you will run into a VRAM bottleneck. Simple as that. This is not an excuse; it's just the reality.

Notice how he cleverly avoided proper GPU-bound comparisons between the 2070 and the PS5 at 1440p, since he is heavily CPU-bottlenecked at that resolution. Good times. At 1440p, with no VRAM constraints, the 2070 would actually match the PS5.

As I showed in the video, the 3070, by "raw power", is capable of native 4K ray tracing at 60 or so frames per second (compared to the PS5's average of 40-45 FPS). That's just the expected gap, given that RT runs better on NVIDIA hardware.

with "very high" textures, GPU gets stalled. it is not its real performance, simple as that. gpu gets stalls to a point it loses around %50 of its performance, dropping to 40s, which is where "hurr durr ps5 performs like a 3070 now!".

let me "fix" your statements:

The PS5 performs similarly to a 3060.
The PS5 performs similarly to an 8 GB 3070 WHICH is heavily stalled and VRAM-constrained at 4K, in a situation where it loses 50% of its performance.
The 3070 performs 40% above the PS5 when it is not constrained by VRAM.
The 2070 performs the same as or similar to the PS5 when it is not constrained by VRAM.


PS5 is not "punching above its weight". if it did, it would outperform 3060 by a huge margin. or it would actually match 2080ti's performance. as I said, you and some others statements are simply disingenious. 2070/3070 underperforms and you know it. You can say it is hardware as a package, it does not matter. We're purely talking about matched hardware here.

You and everyone else keep blaming DF for using high-end CPUs and claim they should use a 2700/3700X instead. You want the PC hardware to properly match the PS5. Yet you see no bias or wrongness in matching the PS5 with only a 6.5 GB VRAM budget (out of 8 GB total).

If you so badly want a matched CPU to compare against the PS5, then I demand a matched VRAM BUDGET to properly compare against the PS5. The 3060 fits the bill, and it almost matches the PS5, or stays just a bit below it, whereas the VRAM-constrained 2070 falls way below that and stalls to frame rates under 25. The 2070 was a high-end GPU whereas the 3060 is a midrange GPU. By the end of the year it will be sold for comical prices, and at this point anyone who cares about maximum texture quality at 4K should stay away from 8 GB or ditch it.

At native 1440p, the intended resolution for 8 GB cards, you do not run into this issue. I've had this discussion with you before, you being the overlord console master race saying the 2060 is not a gaming GPU, should not be used, and is below console specs. I told you that in my mind it's a GPU designed for 1080p; you retaliated by saying 1080p is not a gaming resolution. You even went to the extreme of saying 1440p is a shit resolution. You simply do not respect people's choices, so I don't expect you to understand that there can be GPUs that have:

- A lower VRAM budget than the PS5 and target lower resolutions
- Less GPU power than the PS5 and target lower resolutions

The case of the 3070 is intriguing. It is indeed capable of more than the PS5, but yes, at 4K it will constantly be challenged by its memory budget. NVIDIA probably planned for it to be used in the 1440p/monitor space, which is a space you do not like, enjoy, or respect.

I've always said that even running a 4K monitor has its own VRAM tax. If you want a properly 4K-capable machine, you either go the 3060 route, where you have to lower settings enormously to get playable framerates, or get a 3080 or above, which is very expensive. In that case, yes, for Spider-Man the PS5 is the best price-to-performance 4K-capable gaming device. That is crystal clear as well. Future GPU generations might change that if NVIDIA stops being stingy with VRAM budgets. From the looks of it they plan to push an 8 GB 4060 and a 10 GB 4070, which is comical, and at that point I'd still say stay away from them, seeing how Nixxes and some other devs artificially limit the VRAM budget to 80%; you simply need 12 GB to make sure you have as much budget as the PS5 does.
 
No… Battaglia making an unfair PC biased comparison? No way…

Wasn't Battaglia part of that secret Xbox discord with Major Nelson and others that was trying to spread FUD about PS5 pre-launch?
 

Skifi28

Member
Just a single $399 2020 box (case, cooling, motherboard, CPU+GPU, RAM, high performance SSD, one controller, OS, etc…)… yeah not doing anything noteworthy ;).
I think the most noteworthy part of all is the performance with RT on. AMD's solution kind of sucks at RT, as we've seen time and again, yet the framerate with RT on is equivalent to RTX cards in this case. That feels quite impressive; if you were to make an RT comparison in Spider-Man with a 6600 XT, which is around the same level of GPU power, it'd be almost sad for the desktop GPU.

I've always found it odd how PC-to-console comparisons are always done with NVIDIA GPUs, forgetting AMD exists, even though it'd be far more accurate in many cases.
 

yamaci17

Member
Yet the framerate with RT on is equivalent to a VRAM-CONSTRAINED 8 GB RTX card with Very High textures.

With High textures, the 2070 still gets PS5-equivalent performance instead of being way behind it.
 

Kenpachii

Member
3090 = 1440p/4K/8K card
3080 = 1440p/4K card (VRAM limitations)
3070 = 1080p/1440p card
3060 = 1080p card
2060 = 1080p card
2070 = 1080p card
2080 = 1080p/1440p card
2080 Ti = 1440p/4K card

Testing the 2070/2060/3060/3070 at 4K is fun and all, but it's not what they're built for, so it's pointless.

Also, why use Quality DLSS at 4K to start with? Drop it to Performance or whatever is in the middle and you will see the VRAM tax reduce anyway, as the internal resolution is simply lower.
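For reference, here is what those DLSS presets render at internally for a 4K output, using the standard DLSS scale factors; only the resolution-dependent buffers shrink with them, textures do not, so treat the VRAM saving as partial (a quick Python sketch):

output_w, output_h = 3840, 2160
scales = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5}
for mode, s in scales.items():
    w, h = round(output_w * s), round(output_h * s)
    share = (w * h) / (output_w * output_h)
    print(f"{mode}: {w}x{h} internal (~{share:.0%} of the 4K pixel count)")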

Testing a 2070 at 4K is like testing a PS5 at 8K: dumb as hell.

Also, no, the PS5 does not have plenty of VRAM; 10 GB is a joke and always was, and 16 GB is the bare minimum, especially with RT.
 

yamaci17

Member
by the way "so called expertise" nx gamer says this happens because of not having enough VRAM.

His words:

"as you can see here rtx 2070 vs ps5, not only the resolution is higher on the ps5, entire texture layers are missing and PC here never catches "up"

[screenshot: RTX 2070 vs PS5 texture comparison]


Yet it also happens on a beefy 3090 with a high-end PCIe 4.0 12900K rig:


[screenshot from 3:56 of the embedded video]



This has nothing to do with VRAM; it is simply texture streaming lag. Their engine is not properly optimized for PC architectures. You can argue all you want. DirectStorage and similar tech should address potential problems like this.
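To make "texture streaming lag" concrete, here is a deliberately toy Python sketch (hypothetical code, not Insomniac's engine): the renderer keeps sampling whatever mip level is already resident and only switches to the sharper one once the asynchronous load finishes, so a slow or badly scheduled I/O path shows blurry textures for a while even when plenty of VRAM is free.

import threading, time

class StreamedTexture:
    # Hypothetical mip-streamed texture: low-detail mips are always resident,
    # high-detail mips arrive asynchronously from storage.
    def __init__(self, name, resident_mip=4):
        self.name = name
        self.resident_mip = resident_mip   # mip 0 = full resolution
    def request(self, mip, io_latency_s):
        # Kick off an async "read + decompress + upload"; until it lands,
        # the renderer keeps using the blurrier mip it already has.
        def upload():
            time.sleep(io_latency_s)
            self.resident_mip = mip
        threading.Thread(target=upload, daemon=True).start()
    def sample(self):
        return self.resident_mip           # what the current frame actually shows

tex = StreamedTexture("building_facade")
tex.request(mip=0, io_latency_s=0.5)       # camera moved close, want full detail
for frame in range(4):
    print(f"frame {frame}: showing mip {tex.sample()}")
    time.sleep(0.2)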
 

Kenpachii

Member
by the way "so called expertise" nx gamer says this happens because of not having enough VRAM.

his words

"as you can see here rtx 2070 vs ps5, not only the resolution is higher on the ps5, entire texture layers are missing and PC here never catches "up"

cgy70uV.jpg


Yet it also happens on a beefy 3090 with aan highend pcie 4 12900k rig


3:56

5rxTXiZ.jpg



this has nothing to do with VRAM, it is simply texture streaming lag. their engine is not properly optimized for PC architectures. you can argue all you want. DirectStorage and similar tech should adress potential problems like this.


So basically a shit port then.
 
by the way "so called expertise" nx gamer says this happens because of not having enough VRAM.

his words

"as you can see here rtx 2070 vs ps5, not only the resolution is higher on the ps5, entire texture layers are missing and PC here never catches "up"

cgy70uV.jpg


Yet it also happens on a beefy 3090 with aan highend pcie 4 12900k rig


3:56

5rxTXiZ.jpg



this has nothing to do with VRAM, it is simply texture streaming lag. their engine is not properly optimized for PC architectures. you can argue all you want. DirectStorage and similar tech should adress potential problems like this.

It has to do with I/O as a whole. PS5's I/O is untouchable for now, even against the most expensive CPU. This is what Epic's boss was trying to tell everybody.
 

yamaci17

Member
It has to do with I/O as a whole. PS5's I/O is untouchable for now, even against the most expensive CPU. This is what Epic's boss was trying to tell everybody.
Well, then we have to wait for DirectStorage. It is specifically aimed at stuff like this, so it should help. I'm not saying it definitely will, but it is surely an attempt.

NVIDIA already has its tensor cores to handle I/O-related stuff; RTX IO is very much real and exists, and it will most likely be used to accelerate DirectStorage. I'm sure AMD will come up with something, considering how many spare compute units their PC GPUs now have.

The problem is that Microsoft fails to deliver.
 
DirectStorage without GPU decompression won't make a big difference, as we can already see in some benchmarks (you need 5-10x faster I/O to get near PS5 speed; 20% faster is meaningless here).

GPU decompression is still theoretical and probably won't be industry-ready before next generation, if we are lucky. And you think Cerny is not going to improve his design further for the PS6, or even a PS5 Pro?
 

ChiefDada

Gold Member
This has nothing to do with VRAM; it is simply texture streaming lag. Their engine is not properly optimized for PC architectures. You can argue all you want. DirectStorage and similar tech should address potential problems like this.

You and NX seem to be saying the same thing, so I don't understand your beef with him. Your explanation for why the port isn't optimized for PC comes down to Nixxes not implementing a DirectStorage solution that doesn't exist yet. How can they optimize for tech that doesn't exist yet?
 
I think the main reason (as seen on page one) is that his comparisons make the PS5 look way stronger than it actually is.
We need to learn the difference between "stronger" (the PS5 isn't) and what he actually says: that the PS5 can perform better in a lot of instances. PCs flat out have to do more work on an equivalent task than the PS5 because of the way the port is constructed, so it takes more powerful hardware to equal the same performance.
 

This issue is present only in Sony game ports. I noticed it in the Steam Deck vs PS4 comparison made by DF as well.

Hopefully more outlets begin to highlight it so Sony can start working on a fix.
 
I'm genuinely a little disappointed in this forum; I thought it was above attacking creators the way ResetEra or other forums are. I'm also trying to figure out if anyone actually watched your video. You say multiple times that it's not that the PS5 has more grunt (you literally start the video by saying more powerful hardware is needed for equal performance); it's specifically that, with the construction differences between the two versions, the PC has to do more work than the PS5 on the exact same thing, which means the PS5 punches above its weight. I really appreciated this analysis, man. As an autistic guy (though I'm not into identity politics), your form of analysis is genuinely addicting, only made better by its accuracy. If I can add one thing, I think it would be cool to add more configurations to your tests in the future; I would have loved to see a 3090 or 6900 XT at PS5 settings and what frame rates they get, for example (that is, if you have those GPUs).
 

reinking

Gold Member
I have always felt the console comparison videos were unnecessary in most cases, because the differences are usually minimal and they only lead to console warring. PC to console is just stupid. I can understand comparing different PC setups to give an indication of performance on different systems. I can even understand a mention of console performance as a reference point against a PC, but there are so damn many variables in PC gaming that I will never understand people's fascination with these kinds of comparisons.
 
I'm sure they had to build some middleware that bridges Sony's API (GNM?) to either DirectX or Vulkan. That in itself will introduce some overhead.
Exactly. I would say if anything the 3070 is over-utilized, which is why the performance is the way it is.
 

ChiefDada

Gold Member
We need to learn the difference between "stronger" (the PS5 isn't) and what he actually says: that the PS5 can perform better in a lot of instances. PCs flat out have to do more work on an equivalent task than the PS5 because of the way the port is constructed, so it takes more powerful hardware to equal the same performance.

For the life of me I can't understand why this concept is so difficult for certain hard heads to understand.
 
For the life of me I can't understand why this concept is so difficult for certain hard heads to understand.
With how defensive this thread is, and especially the dumpster fire that is the Beyond3D forums, I think people want their outrage, so they put words in your mouth. I think the 3070 is actually performing quite impressively when you consider the load it has to deal with.
 
Er... did you even watch the video? I literally call this out and explain it, with a chapter entitled Memory within the video. I show and discuss the VRAM issues, low textures, lower mips, and how it is worse than the PS5 and the bigger 16 GB of the RX 6800. Your post does nothing but confirm what my video covers.

I then discuss the VRAM-to-system-RAM issue, bandwidth and being data-bound, and state with the 750 Ti what I said last gen: 2 GB was not going to cut it, and 8 GB will not now.

I get frustrated when people attack facts with no logic. Your argument is, "Well, if the GPU had more VRAM than it does, it would be performing better!"
Well yeah, of course. This argument (which it clearly is not) is like saying that if my Fiesta had a Ferrari engine, it would be able to beat a Porsche. I see this pigeonholed logic a great deal in comments, and it misses the point of these tests and of how tests should be done. The fact is you cannot buy a 2070 or 3070 with anything other than 8 GB, so this game, in this mode, on this card, performs as shown. All of that is stated clearly in the video. You are arguing the same old "it is not a fair test"; this is not about that, nor should it ever be. This is about what and where the PS5 is performing. I do not see you and others here arguing that DF using a 12900K with an RX 580 is madness and completely removed from what a real system would be, do you? My rig here is a real example of what exists and is around the same level the consoles target.

Even that aside, you are skipping the other modes; the flat Performance Mode with NO RT has clear GPU-bound points that still show a deficit to the PS5 when not CPU-constrained.


I mean, just read this comment: the GPU that is performing worse here has nothing to do with the PS5 performing better?!

The GPU can be, and often is, memory-bound, but NOT 100% of the time, and that is not the only reason; as noted above, the 3060 with more VRAM is not suddenly leaping ahead in performance.
I do appreciate you calling out the 12900K thing, especially in what's supposed to be an equivalent GPU benchmark.
 
There is nothing controversial about the PS5 being faster than a 2070... and it's doing that at 200W and $400.
Above that, PCs can scale better. I don't think anyone, NXG included, is arguing against that.
VRAM or not, the PS5 is faster than the 2070 in this case by a good margin.

The 3060 doesn't destroy anything. It's a $500 GPU on its own, so of course you would expect it to match the PS5 or, ideally, be quite a bit faster.
And even with the best-spec PC, there are still some texture streaming issues and a few other tiny drawbacks. Nitpicking, but still.

You make it sound like NXG dropped some controversy here...
Also, as we see, the 3060 is regularly under 40 FPS while the PS5 never goes below 40 in any mode (and usually hovers in the 45-50 range), so I'm not seeing how it's equaling the PS5, let alone destroying it.
 

yamaci17

Member
With how defensive this thread is, and especially the dumpster fire that is the Beyond3D forums, I think people want their outrage, so they put words in your mouth. I think the 3070 is actually performing quite impressively when you consider the load it has to deal with.
No, you still don't understand. The 3070 UNDERPERFORMS compared to all non-VRAM-constrained cards. The 3070 literally performs almost like a 3060 at native 4K with ray tracing. It is not about the 3070 versus the PS5, or the PS5 versus the 3070 or 3060. It is about the 3070 itself underperforming compared to every GPU that has a higher VRAM budget than IT does.

How many more times do I have to explain this to you? Why do you refuse to understand it?


Being VRAM-bottlenecked usually costs a card so much that the unconstrained version ends up 50% to 100% faster, sometimes even more. In this case, all 8 GB RTX GPUs are handicapped by around 50% at 4K. Clear as that.



See how the 4 GB version gets 1.5x-2x the performance of the 2 GB variant. The card is hugely bottlenecked by its VRAM, to the point that it loses half of its performance.
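Just to keep the percentages straight (plain arithmetic, no new data): a card that is 1.5x-2x slower than its unconstrained twin is delivering 50-67% of its potential, i.e. it is "losing" roughly a third to a half of its performance.

for speedup in (1.5, 2.0):
    kept = 1 / speedup        # fraction of potential frame rate the constrained card keeps
    lost = 1 - kept           # fraction of potential frame rate it gives up
    print(f"{speedup:.1f}x faster unconstrained -> constrained card keeps {kept:.0%}, loses {lost:.0%}")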

This should be a basic concept for you to understand. You cannot. You still say this:

"I think the 3070 is actually performing quite impressively when you consider the load it has to deal with."

This is LARGELY false. If the 3070 had a theoretical 12 GB of VRAM, it would perform way above the PS5 in this title. That theoretical 3070 happens to be the 2080 Ti, BECAUSE it literally is one: the 3070 is basically a 2080 Ti crammed into an 8 GB buffer. You can check their benchmarks; 99% of the time they're equal cards. That is not false marketing or an empty claim. The 2080 Ti was targeted at 4K, whereas NVIDIA prepped the 8 GB variant for 1440p, and at 4K, clearly, 8 GB is not cutting it. No one plays that way with a 3070; you would do better to play at 1440p. You quite literally "bleed" 50% of your potential performance at 4K to the VRAM bottleneck. That is not an efficient way to handle this issue. You either suck it up and reduce the resolution, or reduce the texture quality.

Quite literally, Insomniac themselves suggest using High textures for 8 GB cards. Enough said.
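The practical takeaway, sketched as a trivial rule of thumb in Python (the 8 GB -> High mapping reflects the Insomniac guidance mentioned above; the other tiers are purely illustrative assumptions):

def suggested_texture_preset(vram_gb: float) -> str:
    # 8 GB cards -> High at 4K per the guidance above; >= 10 GB -> Very High.
    # Thresholds below 8 GB are illustrative guesses, not official.
    if vram_gb >= 10:
        return "Very High"
    if vram_gb >= 8:
        return "High"
    if vram_gb >= 6:
        return "Medium"
    return "Low"

for card, vram in [("RTX 3070", 8), ("RTX 3060", 12), ("RTX 2080 Ti", 11), ("GTX 1060 6GB", 6)]:
    print(f"{card}: {suggested_texture_preset(vram)}")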

"so im not seeing how it’s equaling the ps5 let alone destroying it"

I never said it was destroying it, but it is close enough. The 3060 is way behind the PS5 in terms of rasterization. This game's ray tracing is not "heavy", unlike what some would believe. Naturally the PS5, with its extra raster power over the 3060, manages to end up a tad above the 3060 despite the 3060's RT advantage. Them being very close, however, is proof enough that the 3070 underperforms rather than the PS5 overperforming.

You kept saying the PS5 is overachieving. However, I never saw you say the 3060 overachieves compared to the 3070/2070, because that would not help your narrative, nor would you like it to be that way. This alone proves your bias in this discussion.

Just go get the hardware and do a 3060 vs 3070 vs 2080 Ti three-way comparison at native 4K. See what's what. Hardware Unboxed won't do that, or it would hurt NVIDIA's PR ego. I can, because I do not care about 8 GB's pride or NVIDIA's PR. You clearly meddle with things you don't understand. I presented you an openly available benchmark where the 2080 Ti overtakes the 3070 by 36% and the 3070 ends up dangerously close to the 3060, yet you ignore it.
 

lukilladog

Member
Meh, bad PC ports have always been a reality; they are using a half-assed streaming solution on PC, there's not much to it. Don't give me this "king of ports" argument; every game with a custom renderer presents different challenges.
 

yamaci17

Member
Can anyone with a 3060 please swoop into the topic? Make a video of the intro scene at native 4K with RT set to High. Let's see how it fares against the 2070, and whether it drops into the 20s like the 2070 does.
 

I just looked up a 2080 Ti benchmark, and it was one paired with a 10900K (which, mind you, is a more than 2x better CPU than the PS5's) at 4K with DLSS and dynamic res (all compromises compared to the PS5), and it's still falling below 60 at the same settings (usually hovering in the 55-60 range). That is not "much better"; it seems to be performing around 15% better in this instance.
 

yamaci17

Member
It is much better. You're just skewing your own perceptions. The PS5 still applies a certain kind of dynamic res even in VRR mode; if it didn't, it would not drop to 1512p in some cases. We never see it drop below 30 in any case, yet we know it drops to 1512p.


You're now downplaying an almost locked 60 against a framerate profile that sits between 35 and 45.

Also, send me that video and let me see if it actually matches PS5 settings.
 
NXG does the exact same shit, only more often; he royally screws up and then runs a Twitter poll saying people can't see the difference, so his mistakes don't matter... yeah, that's a better way to do things, sure.
NX Gamer attacks creators and calls them virgins when they disagree with him? I'd love to see Alex show how he arrived at his findings instead of just stating them, because that was one thing that always stood out about NX Gamer's analysis. Hell, NX Gamer literally bought a VRR capture card, something Digital Foundry didn't even do, so he can get benchmarks as accurate as possible in this mode.
 
First of all, in your video you grossly misrepresented the number of PCs with GPUs more powerful than a 2070 (Super). You used percentages based on concurrent daily users instead of total monthly users, of which Steam has more than 135M as of last year. The same goes for CPUs with 8 cores or more, which is around 20%.

15% of 135M = ~20M PCs with GPUs better than PS5
20% of 135M = ~27M PCs with 8core+ CPUs
65% of 135M = ~87M PCs with 16GB+ RAM

So it's fair to say there are around 20M PCs of equivalent or greater specs than the PS5 in use on Steam on a monthly basis.
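Quick sanity check of the arithmetic in that list (the 135M monthly-user figure and the percentage shares are the quoted poster's claims, taken at face value):

monthly_users_m = 135   # claimed Steam monthly active users, in millions
shares = {
    "GPUs above 2070-class": 0.15,
    "8+ core CPUs": 0.20,
    "16 GB+ RAM": 0.65,
}
for label, share in shares.items():
    print(f"{label}: ~{monthly_users_m * share:.0f}M")   # ~20M, 27M, ~88M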

About your point about people not arguing against Digital Foundry using a 12900K with an RX 580... that's because they are removing any and all CPU bottlenecks from the equation to test pure GPU performance, which is what people who test specific pieces of hardware do. They're very clear about what they are testing. Your method of testing a "system" rather than isolating a particular component is what leads everyone to constantly mention how you're bottlenecking your system. I mean, even here, you KNOW that your configurations are bottlenecked in SOME way, whether that's CPU, or VRAM, or a combination. Your justification for continuing to use them because they're "around the same level as consoles" doesn't change the fact that you're bottlenecking performance, and you know you are.

Finally, call your damn 2070 what it is... it's a 2070. The fact that you go out of your way to OC it (which still doesn't match a 2070 Super) and continually present it as "2070 Super-like" is because you want the PS5 to look just thaaaaaat much more impressive. C'mon now, it's things like that which cause people to say the things they do.

/edit\ Also, the DRS in this game is fucky and can often cause performance fluctuations where there are none without it. Most of your testing is done with DRS active, and it doesn't work the same way it does on consoles, so the testing is "eh" to begin with.
Do you not realize why it's bad to let the PC be free of CPU bottlenecks while potentially allowing the console to have them? Even when testing GPUs against each other, they have been caught using different CPU setups.
 

yamaci17

Member
I just looked up a 2080 Ti benchmark, and it was one paired with a 10900K (which, mind you, is a more than 2x better CPU than the PS5's) at 4K with DLSS and dynamic res (all compromises compared to the PS5), and it's still falling below 60 at the same settings (usually hovering in the 55-60 range). That is not "much better"; it seems to be performing around 15% better in this instance.
You're still being petty. If a person has a 2080 Ti, they won't have a 3700X paired with it. Either understand this fact or just leave these discussions already.

Also, the 10900K is not "2x" better than the PS5's CPU. Core for core, it is at best 40% faster, and having 10 cores is not meaningful when a game does not scale beyond 8 cores.
 
They're removing bottlenecks for the PC while comparing it to the PS5 :messenger_grinning_smiling: where you can't remove bottlenecks, and of course a CPU that costs as much as a PS5 or more beats the PS5; that's what DF wants to see. So the fairest comparison would be PS5 vs a PC of the closest spec to the PS5, with no bottleneck-free parts, to see what PC is or isn't needed to match the PS5. Otherwise, remove consoles from the comparison, or be proud like DF and gloat about how a $1000 PC beats a $500 console.
Hence why I said this is not an equivalent benchmark, because as far as I'm aware the PS5 doesn't have a 12900K in it.
 

yamaci17

Member
Hence why I said this is not an equivalent benchmark, because as far as I'm aware the PS5 doesn't have a 12900K in it.
The 3070 doesn't have a 10 GB VRAM budget, innit?

Yet you see no wrongness in comparing them.

See your hypocrisy?

I literally sent you a video back on release day, proof of concept that a 2700X never drops below 55 with ray tracing enabled.



I will do so again, so that you can stop the 10900K CPU talk.
 
Literally where are you seeing a drop to 35 on PS5? Literally where? The average we see is 45-50. Also, it's not a locked 60; I said it's regularly below 60, and again that's while using things like dynamic res and reconstruction, neither of which the PS5 uses, and even then it still can't hold 60, and that's with a CPU MUCH better than the PS5's. What's sad is I would have believed you and stayed ignorant about it if you hadn't kept pointing to it as an example.
 

That's something within the GPU, not an entirely different part of the PC configuration, so it's not comparable. A better comparison would be pairing the 3070 with an HDD while the PS5 has an SSD.
 

yamaci17

Member
Literally where are you seeing a drop to 35 on PS5? Literally where? The average we see is 45-50. Also, it's not a locked 60; I said it's regularly below 60, and again that's while using things like dynamic res and reconstruction, neither of which the PS5 uses, and even then it still can't hold 60, and that's with a CPU MUCH better than the PS5's. What's sad is I would have believed you and stayed ignorant about it if you hadn't kept pointing to it as an example.
60 vs 45 is a 33% difference.

That's literally the difference between a 3070/2080 Ti and the PS5.

You still haven't sent me the video.
 

NeoGAF has this weird thing where it won't let me respond again for several minutes.
 

What's the resolution in that vid?
 

yamaci17

Member
NeoGAF has this weird thing where it won't let me respond again for several minutes.

Where are the settings shown? Don't waste my time. Also, if they enabled higher weather particles, that alone costs 25% of the GPU render budget with the "High" preset, and that is set to Medium on PS5.
 
Where are the settings shown? Don't waste my time. Also, if they enabled higher weather particles, that alone costs 25% of the GPU render budget with the "High" preset, and that is set to Medium on PS5.
Isn't it set to Medium in the performance modes on PS5, not the fidelity mode?
 

yamaci17

Member
The average isn't 60, it's more like 57-58, although it can hit 60, and again that's with both DLSS and dynamic res, on top of a better CPU.
I don't care; I don't know what the DLSS resolution is. It is unknown. It is also the High preset, which sets weather particles to High, which causes a huge 25% GPU render budget drop that does not happen on PS5, because the PS5 uses Medium weather particles.

I don't care about dynamic resolution; you can never know what resolution it drops to, or maybe it doesn't drop at all. The 2080 Ti is almost powerful enough to push native 4K at 60 FPS in this game without dropping resolution.

You downplay 60 to 57-58 and then completely disregard the 35-40 FPS drops on PS5, proudly saying it averages 45-50.

The bias is too strong with you; if you don't dial it back, I will have to ignore you and stop replying.


playsaves3: it is set to Medium for all PS5 modes.

You can literally prove it by looking at the title screen while idle. Only High particles create extra clouds in the background, whereas all PS5 modes have the static, non-clouded background, just like Medium particles.
 

yamaci17

Member
What's the resolution in that vid?
What does this have to do with resolution now? You claim, or insinuate, that the good performance is down to the CPU. Anyone who sends you a successful 2080 Ti performance video, you refute by saying it has a strong CPU, implying the 2080 Ti has nothing to do with it and the CPU is what makes it successful.

This video is proof that a 2700X is able to almost lock to a perfect 60 with RT enabled, regardless of the GPU or resolution being used. I don't even remember the resolution. If I had a 2080 Ti I would test it; instead I have the 8 GB variant of it. Naturally the VRAM bogs it down hard at 4K, so I won't even bother sending you a video of that, because you're too biased to give any credence to what I claim.
 

yamaci17

Member
Poor yamaci17 getting bombarded by some guy's alt :messenger_grinning_sweat: Just leave him in his fantasy land.
I mean, I can provide 4K native High-texture gameplay, and then he would say I should use Very High textures, while ignoring the 2080 Ti's 65+ FPS average because it has a 10900K; but if I provide a 60+ FPS average with a 2700X and a 3070, albeit with High textures, again it is not valid.

I really wish the 3070 had 10 or 12 gigs; it truly saddens me to have to keep explaining all this. NVIDIA aimed this card at 1440p, which makes it a laughing stock at native 4K in niche situations like this, which NX Gamer preys upon.



I literally cannot get above 40 frames with Very High textures. Simple as that. This is simply a limitation of VRAM, and it is not by DESIGN. It is what happens universally to all games when you RUN OUT OF VRAM...
 