
DF: Control PS5 Vs Xbox Series X Raytracing Benchmark

16% is a fairly big difference, to be honest, and when you look at where the SX sits frame-rate-wise as they go around, it's significant. If you take the PS5 hitting 45fps and add 16% on top of that, it's a big enough gap.
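For what that works out to (a minimal sketch; the 45fps figure is the example above and 16% is DF's quoted average, not a per-scene measurement):

```python
# Illustration of the quoted figures (assumption: 45 fps is the example dip
# above and 16% is DF's reported average advantage, not a per-scene number).
ps5_fps = 45
xsx_fps = ps5_fps * 1.16
print(f"{xsx_fps:.1f} fps")  # ~52.2 fps, i.e. roughly 7 extra frames per second
```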

A massive difference? Nah, I don't think it's that big, to be honest. If one system were running it at a locked 60 and the other at an unstable 30, I would call that a massive difference. But this comparison shows that they are really close to each other. Just my opinion, BTW, but I wouldn't celebrate a 16% advantage as massive. Now, the XSX versus the XSS, that's a huge difference.

Also keep in mind we are talking about photo mode, which isn't representative of actual gameplay.
 

J_Gamer.exe

Member
Daft is looking at the video and seeing only what you want to see, which is what you have done.

If it's an area of CPU strain, which doesn't seem likely anyway, then it's affecting the Xbox more, since its frame rate falls in line with the PS5's instead of staying ahead of it. That implies a bottleneck on that platform.

Just as in heavy GPU areas, you'd expect to see similar levels of drops on both.
 

Fredrik

Member
No, but really, people measuring performance in photo mode? How brain-dead do you have to be to do that versus real gameplay, where you're actually playing?
Did you watch the video? As DF says, this is purely academic, more of a benchmark of GPU and ray-tracing capability than anything else.

There is already a gameplay comparison thread if you'd rather talk about gameplay performance.

For me this is interesting since we finally have something similar to 3DMark on consoles, which to my knowledge is a first. This is the first example where the GPU power advantage of the XSX is actually visible with resolution and graphics settings the same.

There is still the possibility that one version is coded better, of course, but it's still an interesting showcase. All we've had prior to this is PR.

But as for the actual game I wouldn’t play it on XSX, or PS5. This is a PC game.
 

Concern

Member


People would really rather cry about photo mode than play games, huh?

If it's really "just photo mode" and means absolutely nothing, why is this thread looking like the Hitman thread? Lol

We don't love our favorite console, we just hate yours 🤣🤣
 

JonkyDonk

Member


People would really rather cry about photo mode than play games, huh?

If it's really "just photo mode" and means absolutely nothing, why is this thread looking like the Hitman thread? Lol

We don't love our favorite console, we just hate yours 🤣🤣
There are as many posts celebrating this as there are disregarding it; that's why this thread is the way it is.
 

DJ12

Member
Oh its "They" again. Yeah all PlayStation fans across the globe clubbed together to make this video.

Let's ignore the fact kingthrash has been calling them out for years.

How about addressing the differences spotted and coming up with your own explanation?


Damn, you'd have to be a new level of daft not to see why it would refer to Xbox, given it led in the prior scenes. What sort of logic says the PS5 was behind and is now level, so maybe the PS5 is bottlenecked in this scene, when it's the SX whose percentage lead is now lower than before? LOL. If it were hitting both, you'd expect the PS5 to drop by a similar percentage as in the other scenes, assuming the CPUs were equal.

Let's not forget the proof that they are shilling that was provided in the next-gen thread.
 

phil_t98

#SonyToo
A massive difference? Nah, I don't think it's that big, to be honest. If one system were running it at a locked 60 and the other at an unstable 30, I would call that a massive difference. But this comparison shows that they are really close to each other. Just my opinion, BTW, but I wouldn't celebrate a 16% advantage as massive. Now, the XSX versus the XSS, that's a huge difference.

Also keep in mind we are talking about photo mode, which isn't representative of actual gameplay.

We are talking about photo mode, which they have used as a benchmark.
 

Riky

$MSFT
But certainly not enough for it to be a big difference. I understand why the developers went with the same settings on both. Pretty sure many other games will do the same except for the ones that favor the XSX architecture more. That's what I've learned from this.

It's not enough to get a steady 60fps, although they did note that a lot of samples were unusable because the Xbox was pinned at 60fps, so it could possibly have gone higher.
Hopefully, when devs start leveraging tools like Tier 2 VRS, mesh shaders and Sampler Feedback Streaming, games like this will be able to hit a constant 60fps.
 
Did you watch the video? As DF says, this is purely academic, more of a benchmark of GPU and ray-tracing capability than anything else.

I also learned another thing from this comparison. It shows that the developers have access to the extra GPU grunt the XSX has. So when it seems to underperform in games, it's not because the developers can't access the extra GPU power. Either the developers are choosing to target parity, or there's something else affecting performance.

I know that actual gameplay and photo mode don't work the same. It's possible that whatever is limiting performance isn't there in photo mode.
 

phil_t98

#SonyToo
If it's an area of CPU strain, which doesn't seem likely anyway, then it's affecting the Xbox more, since its frame rate falls in line with the PS5's instead of staying ahead of it. That implies a bottleneck on that platform.

Just as in heavy GPU areas, you'd expect to see similar levels of drops on both.

So what you're saying is the Xbox Series X has more overhead, except in that area where it drops to PS5 levels of performance? Is that right?
 

Elog

Member
Nobody? Really? I remember one guy who constantly claimed that the Xbox GPU was in fact a badly designed server blade. I can pull up all the quotes if you like, but we both know I don't need to.
You are not a great listener, Riky, but at some point the penny needs to drop.

Let's be generous and say that the XSX and the PS5 have so far, on average, performed equally across a fairly decent sample of cross-generation titles (as stated, it has been very close between the two, but with a small edge - so far - to the PS5).

On paper the XSX has a more powerful GPU. Where did that power go? Is Microsoft that bad at hardware design? Of course not. The answer is exactly what you are alluding to above. The XSX has dual-speed memory for one reason only: MS wanted the same board to be used for the server blade as for the XSX, which resulted in the same memory controller silicon but with slightly less memory on the XSX than on the server blade. The same goes for the widest shader array in any console GPU so far: if MS had only needed to design the XSX, it would not have been that wide, since it results in intermittent bottlenecking of the array.

MS made these choices to drive down the manufacturing cost of both the XSX and the server board, while also ensuring hardware-based backward compatibility.

That is one of the main reasons why you more or less see parity between these two systems.
 
It's not enough to get a steady 60fps, although they did note that a lot of samples were unusable because the Xbox was pinned at 60fps, so it could possibly have gone higher.
Hopefully, when devs start leveraging tools like Tier 2 VRS, mesh shaders and Sampler Feedback Streaming, games like this will be able to hit a constant 60fps.

Seems like it's the same case with the PS5 as developers get used to the hardware and its customizations. I honestly don't see either system just stagnating.
 
I can't imagine this game is CPU bound on console unless their CPUs are much weaker than expected compared to desktop Ryzen. I've been messing about this morning on PC and it is always GPU bound for me no matter what is happening on screen.

I can get as low as 33fps in the corridor of doom with everything maxed out and no DLSS. This almost doubles to 59fps with DLSS Quality, which suggests this scene is crushing GPUs for some reason. If I set it to console settings and lower the render resolution, I can max out at 144fps (and probably higher if I took v-sync off), which surely wouldn't be the case if this scene were CPU bound.
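The reasoning in that test can be framed as a quick rule of thumb (a minimal sketch using the numbers above; the 30% threshold and the second example are illustrative assumptions, not DF's methodology):

```python
# If frame rate scales strongly when only the render resolution / upscaling
# changes, the scene is GPU bound; if it barely moves, suspect a CPU (or other)
# limit. The threshold is an illustrative assumption.
def likely_gpu_bound(fps_before: float, fps_after_res_drop: float,
                     threshold: float = 1.3) -> bool:
    """True if dropping the render resolution gained more than ~30% frame rate."""
    return fps_after_res_drop / fps_before >= threshold

print(likely_gpu_bound(33, 59))    # True  - the DLSS Quality result quoted above
print(likely_gpu_bound(140, 144))  # False - hypothetical case: fps barely moves
```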
 

Panajev2001a

GAF's Pleasant Genius
This post will age very, very badly... saved for future crow-eating.
Sure. I'm just not sure how you will cherry-pick and stretch things to make "expected the consoles to be pretty close, with the XSX GPU pulling ahead in several scenarios" sound like the most clickbaity, crow-serving statement ever... good thing people like you and Riky aren't focused on petty warring.
 

JonkyDonk

Member
It only took three posts for fanboys to start filling this thread with tears. That's why it is the way it is.

If it's so insignificant, why are there so many emotional meltdowns in here over this?

I will say it provides quite the entertainment regardless, though.
You are selectively seeing what you want to see because of your own biases. There are plenty of asinine Xbox fanboys exaggerating the significance of this to keep the stupid argument going. And there are also a fair number of posts like yours, mocking the length of the thread, which also keeps the thread going. And I'm doing it too by responding to you.
 

phil_t98

#SonyToo
A massive difference? Nah, I don't think it's that big, to be honest. If one system were running it at a locked 60 and the other at an unstable 30, I would call that a massive difference. But this comparison shows that they are really close to each other. Just my opinion, BTW, but I wouldn't celebrate a 16% advantage as massive. Now, the XSX versus the XSS, that's a huge difference.

Also keep in mind we are talking about photo mode, which isn't representative of actual gameplay.

If you watched the video, he says at the beginning that he had to discard a lot of scenes because the Xbox was constantly hitting 60fps, so there was a big difference. He averaged it at 16%, and in a lot of places it was hitting higher.
 

Concern

Member
You are selectively seeing what you want to see because of your own biases. There are plenty of asinine Xbox fanboys exaggerating the significance of this to keep the stupid argument going. And there are also a fair number of posts like yours, mocking the length of the thread, which also keeps the thread going. And I'm doing it too by responding to you.


It goes both ways, I know that. Still, it's nothing compared to the dumb shit in other threads. When it goes the other way, the roles are reversed.

But if the argument is "lolz issa photo mode", then why are they so triggered about it?

Let's not forget it was PS fanboys who made a fake Hitman 3 frame-rate-drop video just because of the DF face-off.
 
If you watched the video, he says at the beginning that he had to discard a lot of scenes because the Xbox was constantly hitting 60fps, so there was a big difference. He averaged it at 16%, and in a lot of places it was hitting higher.

So why do both versions have identical settings then if the difference was so massive between the two?
 

phil_t98

#SonyToo
It goes both ways, I know that. Still, it's nothing compared to the dumb shit in other threads. When it goes the other way, the roles are reversed.

But if the argument is "lolz issa photo mode", then why are they so triggered about it?

Let's not forget it was PS fanboys who made a fake Hitman 3 frame-rate-drop video just because of the DF face-off.

The funny thing was, if that video had shown 0fps when he was stood still it might have been believable, but of course it went too far lol
 

Riky

$MSFT
You are not a great listener, Riky, but at some point the penny needs to drop.

Let's be generous and say that the XSX and the PS5 have so far, on average, performed equally across a fairly decent sample of cross-generation titles (as stated, it has been very close between the two, but with a small edge - so far - to the PS5).

On paper the XSX has a more powerful GPU. Where did that power go? Is Microsoft that bad at hardware design? Of course not. The answer is exactly what you are alluding to above. The XSX has dual-speed memory for one reason only: MS wanted the same board to be used for the server blade as for the XSX, which resulted in the same memory controller silicon but with slightly less memory on the XSX than on the server blade. The same goes for the widest shader array in any console GPU so far: if MS had only needed to design the XSX, it would not have been that wide, since it results in intermittent bottlenecking of the array.

MS made these choices to drive down the manufacturing cost of both the XSX and the server board, while also ensuring hardware-based backward compatibility.

That is one of the main reasons why you more or less see parity between these two systems.

That's one interpretation of their choices, but it's not the only one, and not what the people who designed the machine say.
They said the "split memory", which isn't actually split in the traditional sense of the word, is down to developers asking for faster, optimized VRAM. Since, say, Valhalla on PC only uses about 6.5GB, and that's the most I've heard of so far, I seriously doubt it's a problem; maybe it takes more work, but that doesn't make it a bottleneck. I've got a PC with 8GB of VRAM and I've never seen it get anywhere near that.

As for why the Xbox hasn't shown an advantage in every game, well, traditionally that hasn't happened in previous gens either. There were occasions when the PS2 outperformed the Xbox, or the Xbox One outperformed the PS4, etc. It's not unheard of, and it didn't make the PS4 or the original Xbox badly designed.

My take, as I've explained before, is that the games so far have all started as last-gen games. To get ready for launch, and to mitigate the problems of working from home, devs have taken Pro and X1X versions and upgraded them for next gen; just because a game released after the new consoles doesn't make it next gen.
Xbox has had a bigger change in development environment, and if you look at the specs going from Pro to PS5 in compute units etc., it would be a lot easier to port and unlock extra performance there, whereas on Xbox you have a different memory setup and a wider GPU.
Once we had a game that wasn't rushed for launch, in Hitman 3, Xbox had a big advantage: a 44% resolution advantage and higher shadow settings, a bigger gap than the paper metric.
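For reference, the quoted 44% lines up with the pixel counts reported for Hitman 3 (a minimal sketch; it assumes the reported 2160p on Series X versus 1800p on PS5):

```python
# Pixel-count arithmetic behind the quoted "44% resolution advantage"
# (assumes the reported 3840x2160 on Series X vs 3200x1800 on PS5 for Hitman 3).
xsx_pixels = 3840 * 2160   # 8,294,400
ps5_pixels = 3200 * 1800   # 5,760,000
print(f"{xsx_pixels / ps5_pixels - 1:.0%}")  # 44%
```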
 

Elog

Member
Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.
You need to be more intellectually honest here. In the 30fps RT mode for Control, you are right that the FPS is capped. Interestingly, you see slightly more frame-rate dips on the XSX in this mode than on the PS5.

What is the logical conclusion from this? More dips at a capped 30fps, but a higher average in uncapped photo mode? That most likely means the XSX is bottlenecking more often than the PS5, which is logical given the design.
 
You tell me. Did you see that bit of the video, though?

Because it's photo mode and not actual gameplay?

I mean, if there were a massive difference between the two, the developers could easily push higher settings on the XSX. Yet they are virtually identical.

You need to be more intellectually honest here. In the 30fps RT mode for Control, you are right that the FPS is capped. Interestingly, you see slightly more frame-rate dips on the XSX in this mode than on the PS5.

What is the logical conclusion from this? More dips at a capped 30fps, but a higher average in uncapped photo mode? That most likely means the XSX is bottlenecking more often than the PS5, which is logical given the design.

Photo mode is definitely missing some things compared to actual gameplay, which is why I believe they didn't boost the settings on the XSX version. If they had that much headroom, you would definitely see the XSX run the game at higher settings, like a 1440p versus native 1800p kind of situation with some settings set a little higher.
 

phil_t98

#SonyToo
Because it's photo mode and not actual gameplay?

I mean, if there were a massive difference between the two, the developers could easily push higher settings on the XSX. Yet they are virtually identical.

Did you see the bit where he said that in this benchmark, where he was using photo mode to benchmark, he couldn't compare certain sections because the Xbox was constantly hitting 60fps where the PS5 was not? It was a simple question.
 

JonkyDonk

Member
It goes both ways, I know that. Still, it's nothing compared to the dumb shit in other threads. When it goes the other way, the roles are reversed.

But if the argument is "lolz issa photo mode", then why are they so triggered about it?

Let's not forget it was PS fanboys who made a fake Hitman 3 frame-rate-drop video just because of the DF face-off.
People are 'triggered' because the conclusion a lot of people have walked away from this DF video with is that the XSX is outperforming the PS5 in this game. But that's obviously not true, as the initial DF analysis showed. The narrative of how this game actually performs on these consoles has been distorted by this abstracted photo-mode benchmark. And now there is a new narrative that Remedy held back the XSX version because of the PS5, which is also not true, for several reasons that have already been discussed.

I don't know what Hitman has to do with this, or why you think that thread was any different from other frenzied DF threads we've had. But many fake benchmarks have been going around since this gen started, from random YouTube channels. They are made for clickbait; there is no fanboy conspiracy behind them.
 

Fredrik

Member
Sooooo... why isn't it on DF, though? They are the only ones echoing the devs. Lol... I mean, even so, it still doesn't explain the missing textures and lower reflective effects throughout. The dev said identical, just like DF said... both of them lied.
Have you played the game?

You're analysing an analysis video of a highly dynamic game on YouTube. Claiming the devs are lying after that is not very serious, imho. Like I said, at least start the game and have a look around, try to replicate one of the scenes DF showed, and you'll understand what we're dealing with here.

The texture was missing due to a bug on one side of the pillar; not sure about the puddle and floor rail thing, some Xbox gamer can probably check that. I didn't see anything else in your video that I couldn't explain just from having played the game and knowing the crazy dynamic shit this game does. Besides the physics and gfx filters and all that, you have smoke, ambient occlusion, particles and all kinds of dynamic effects all over the place, making things look different pretty much every second unless you're standing in the exact position and go into photo mode at the exact same second.

I tried matching the pillar scene on PC and getting everything the same, the light, shadows and reflections etc. It was... not easy, I gave up.

Someone explained the missing objects in the mirror on PS5, which I think you forgot to mention, by pointing to a randomizer that places the debris you blow around when launching something. I haven't double-checked that, but it's probably right.

All in all, just chill. It’s just a weird little GPU benchmark.
 
Did you see the bit where he said that in this benchmark, where he was using photo mode to benchmark, he couldn't compare certain sections because the Xbox was constantly hitting 60fps where the PS5 was not? It was a simple question.

But why didn't that translate to the gameplay?

That's my question.

You're suggesting that the developers could have pushed the settings higher but chose to go with parity. I don't believe that's the case.

Is that simple enough for you?
 

Elog

Member
That's one interpretation of their choices, but it's not the only one, and not what the people who designed the machine say.
They said the "split memory", which isn't actually split in the traditional sense of the word, is down to developers asking for faster, optimized VRAM. Since, say, Valhalla on PC only uses about 6.5GB, and that's the most I've heard of so far, I seriously doubt it's a problem; maybe it takes more work, but that doesn't make it a bottleneck. I've got a PC with 8GB of VRAM and I've never seen it get anywhere near that.

As for why the Xbox hasn't shown an advantage in every game, well, traditionally that hasn't happened in previous gens either. There were occasions when the PS2 outperformed the Xbox, or the Xbox One outperformed the PS4, etc. It's not unheard of, and it didn't make the PS4 or the original Xbox badly designed.

My take, as I've explained before, is that the games so far have all started as last-gen games. To get ready for launch, and to mitigate the problems of working from home, devs have taken Pro and X1X versions and upgraded them for next gen; just because a game released after the new consoles doesn't make it next gen.
Xbox has had a bigger change in development environment, and if you look at the specs going from Pro to PS5 in compute units etc., it would be a lot easier to port and unlock extra performance there, whereas on Xbox you have a different memory setup and a wider GPU.
Once we had a game that wasn't rushed for launch, in Hitman 3, Xbox had a big advantage: a 44% resolution advantage and higher shadow settings, a bigger gap than the paper metric.
I have actually been surprised by the PS5's performance this early on. Under the hood, the XSX is made to play current games and engines better than the PS5, given the hardware compatibility and the fact that there are no changes to the rendering pipeline in hardware apart from silicon upgrades using the latest off-the-shelf solutions from AMD. The PS5, on the other hand, has significant changes in the rendering pipeline if you want to use the GPU efficiently, since it is centred around the new Geometry Engine - it can emulate the old way, but then you leave a lot of silicon behind.

To what extent this is because the PS5 is brilliantly designed, the MS GDK is underdeveloped, or the XSX bottlenecks even more than expected is something the future will show.
 

Elog

Member
Photo mode is definitely missing some things compared to actual gameplay, which is why I believe they didn't boost the settings on the XSX version. If they had that much headroom, you would definitely see the XSX run the game at higher settings, like a 1440p versus native 1800p kind of situation with some settings set a little higher.
The question is really how much the frame rate jumps up and down on the XSX - the variability. An almost stable 30fps is better from an experience point of view than a frame rate jumping between 25 and 45, even if the average were higher (as an example).
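That trade-off is easier to see in frame times (a minimal sketch; both traces are invented for illustration, not measured data):

```python
import statistics

def summarize(fps_samples):
    """Return average fps and the spread (std dev) of frame times in ms."""
    frame_times = [1000 / f for f in fps_samples]
    return statistics.mean(fps_samples), statistics.pstdev(frame_times)

stable = [30] * 8                          # locked 30 fps
swingy = [25, 45, 28, 42, 26, 44, 27, 43]  # higher average, big swings

for name, trace in (("stable", stable), ("swingy", swingy)):
    avg, jitter = summarize(trace)
    print(f"{name}: avg {avg:.1f} fps, frame-time jitter {jitter:.1f} ms")
# The swingy trace averages ~35 fps, but its frame pacing is far less even,
# which is what tends to feel worse in play.
```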
 

Concern

Member
People are 'triggered' because the conclusion a lot of people have walked away from this DF video with is that the XSX is outperforming the PS5 in this game. But that's obviously not true, as the initial DF analysis showed. The narrative of how this game actually performs on these consoles has been distorted by this abstracted photo-mode benchmark. And now there is a new narrative that Remedy held back the XSX version because of the PS5, which is also not true, for several reasons that have already been discussed.

I don't know what Hitman has to do with this, or why you think that thread was any different from other frenzied DF threads we've had. But many fake benchmarks have been going around since this gen started, from random YouTube channels. They are made for clickbait; there is no fanboy conspiracy behind them.


And? It's their own fault for being triggered. Hitman is a perfect example of how far fanboys will go for a "win" in these face-offs. Making a fake frame-rate video is embarrassing.

It's a benchmark separate from the game's face-off. Yet people are still bickering and bitching about it, no matter how insignificant they claim it is.
 
The question is really how much the frame rate jumps up and down on the XSX - the variability. An almost stable 30fps is better from an experience point of view than a frame rate jumping between 25 and 45, even if the average were higher (as an example).

That's why I don't see how there's a massive difference between the two. If there really were that much headroom on the XSX, the developers would increase the settings. I remember they did that with Hitman 3 even though it had a marketing deal with Sony. I don't see what's stopping Remedy from doing the same, unless increasing the settings ruins their performance target.

Edit: They can always patch in higher settings later, like the Dirt developers did, but at the moment this is what we have.
 

JonkyDonk

Member
And? It's their own fault for being triggered. Hitman is a perfect example of how far fanboys will go for a "win" in these face-offs. Making a fake frame-rate video is embarrassing.

It's a benchmark separate from the game's face-off. Yet people are still bickering and bitching about it, no matter how insignificant they claim it is.
A lot of the activity spikes on this forum happen around new DF threads, especially one that is controversial in some way. And there is not much else to talk about until the next DF thread. That's just the nature of this community.
 

ethomaz

Banned
And you can rarely determine performance from capped framerates.
So why does even the capped Series X have frame-rate drops, even without the UI bug they talked about?

Imagine when the tools can only be used in photo mode...

The reality is that something else is a bottleneck in the Series X that doesn't allow full use of the GPU, as a dozen or so games already evidence, and this "benchmark" finally showed what we have speculated since the launch of the consoles.
 

Fredrik

Member
I also learned another thing from this comparison. It shows that the developers have access to the extra GPU grunt the XSX has. So when it seems to underperform in games, it's not because the developers can't access the extra GPU power. Either the developers are choosing to target parity, or there's something else affecting performance.

I know that actual gameplay and photo mode don't work the same. It's possible that whatever is limiting performance isn't there in photo mode.
As I said earlier, I think it's simple. These are just my ideas, so don't take them for anything more.

The PS5 has a higher-clocked GPU. That's it.

In certain scenarios, in certain games, that will be more beneficial for performance. It's the same when you overclock on PC.

But once real heavy lifting is needed you still need muscles, and when devs go there we'll likely see the XSX push ahead. That doesn't mean the XSX will be ahead all the time, or even half the time, going forward, because while Hitman 3 and this Control benchmark show that there is power under the hood of the XSX waiting to be unleashed, they also show that in certain situations a higher-clocked GPU is very helpful.

Which leads me to this...
Why did MS stay at 1.8GHz when the same GPU architecture on PC can do 2GHz as a base clock? The cooling seems great, power should be enough too, bandwidth is good. Why the low clock?
 
Do I need credibility when the facts are straight in your face?
I mean... Yes, you do. Credibility is everything. One fanboy comparison is much like any other - worthless. Mere fuel for the console warriors.

I’m sure DF, who you’re accusing of being ‘dishonest’ would point out that their ‘facts were straight in your face’, had they not been run out of town here by manic fanboys.

I’ve never watched one of your videos by the way, good luck with all that. If you ever become unbiased I’ll maybe check one out then.
 
You are selectively seeing what you want to see because of your own biases. There are plenty of asinine Xbox fanboys exaggerating the significance of this to keep the stupid argument going. And there are also a fair number of posts like yours, mocking the length of the thread, which also keeps the thread going. And I'm doing it too by responding to you.

Man, you fucking smashed it.
 
As I said earlier, I think it's simple. These are just my ideas, so don't take them for anything more.

The PS5 has a higher-clocked GPU. That's it.

In certain scenarios, in certain games, that will be more beneficial for performance. It's the same when you overclock on PC.

But once real heavy lifting is needed you still need muscles, and when devs go there we'll likely see the XSX push ahead. That doesn't mean the XSX will be ahead all the time, or even half the time, going forward, because while Hitman 3 and this Control benchmark show that there is power under the hood of the XSX waiting to be unleashed, they also show that in certain situations a higher-clocked GPU is very helpful.

Which leads me to this...
Why did MS stay at 1.8GHz when the same GPU architecture on PC can do 2GHz as a base clock? The cooling seems great, power should be enough too, bandwidth is good. Why the low clock?

It's just weird that the developers have access to all this untapped power in the XSX's GPU, which photo mode proves it has, but when it comes to regular gameplay the results are different. I know that photo mode isn't doing everything normal gameplay does; there are calculations made in regular gameplay that aren't needed in photo mode.

My guess is that the game wasn't giving them the performance they wanted even though they had access to that extra GPU grunt, so they made the settings what they are to reach that performance target.

It's the only thing that could explain why they couldn't use the headroom that was seen in photo mode.

As for the low clocks, I always believed it was due to the limits of their PSU and cooling solution. Increasing the clock speed isn't really feasible, otherwise they would have done it already.
 

FALCON_KICK

Member
Is this really surprising?

The Xbox Series X has 52 CUs, which means that for a frozen, non-updating 3D scene the GPU is able to fill the CUs completely and thereby get close to its 12 TFLOPS.

Now, if you push the target to 120fps for a dynamic 3D scene in motion, which needs to be updated roughly every 8ms, the Xbox Series X will not be able to fill its CUs effectively within that time frame, while the PS5, with its higher frequency but fewer CUs, will be able to beat it.

Now, if games were made for less than 30Hz, say 15-20Hz, then the Xbox Series X would be edging out the PS5 in almost all games.
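A rough sketch of the numbers behind this argument (the CU counts and clocks are the public specs; the per-CU arithmetic and frame budgets are standard back-of-the-envelope figures, not anything from the DF video):

```python
# Theoretical FP32 throughput: CUs * 64 lanes * 2 ops/clock * clock (GHz) = TFLOPS
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

print(f"XSX: {tflops(52, 1.825):.2f} TFLOPS")  # ~12.15
print(f"PS5: {tflops(36, 2.23):.2f} TFLOPS")   # ~10.28 at its maximum boost clock

# Frame-time budget shrinks as the target frame rate rises, leaving less time
# per frame to keep a wide GPU fully occupied.
for hz in (30, 60, 120):
    print(f"{hz} Hz -> {1000 / hz:.1f} ms per frame")
```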
 

MonarchJT

Banned
It is a specific scenario. Stop being delusional.

Directly from the mouth of Lance McDonald about the DF test:

''This is fascinating. It's important to remember that when photo mode is engaged, CPU load is absolutely miniscule compared to gameplay (something I learned while debugging the game myself), so this is basically just a fantastic measurement of the differential in pure graphics hardware at play here. Really cool test.''

For your information, he is the guy who made 60fps Bloodborne possible.


But I'm sure we should trust an absolute nobody writing on a forum in the throes of a fanboy crisis, because he can't accept that his favorite box doesn't perform as he would like, over Lance and all of Digital Foundry.
 

Riky

$MSFT
As I said earlier, I think it's simple. These are just my ideas, so don't take them for anything more.

The PS5 has a higher-clocked GPU. That's it.

In certain scenarios, in certain games, that will be more beneficial for performance. It's the same when you overclock on PC.

But once real heavy lifting is needed you still need muscles, and when devs go there we'll likely see the XSX push ahead. That doesn't mean the XSX will be ahead all the time, or even half the time, going forward, because while Hitman 3 and this Control benchmark show that there is power under the hood of the XSX waiting to be unleashed, they also show that in certain situations a higher-clocked GPU is very helpful.

Which leads me to this...
Why did MS stay at 1.8GHz when the same GPU architecture on PC can do 2GHz as a base clock? The cooling seems great, power should be enough too, bandwidth is good. Why the low clock?

It isn't low really, just lower. The game clock of the RX 6800 is a little lower than the Series X's. Maybe they just wanted double the power of the X1X, and that's why they settled on it, as it gives them exactly that.
 

pixelbox

Member
Isolating the GPU from the CPU in a SmartShift setup makes this an invalid analysis. The system isn't a traditional one where each chip receives constant power. Although it can work that way, it's designed to hand power to the chip that needs it when one is being stressed. Normal usage is happening in this comparison, which wouldn't trigger the need to draw power from the CPU. This could be a case of the GPU working at a stock, non-boosted clock. Also note how consistent the frame graph is on the PS5 versus the SX.

60A4zmE.jpg

 