
PS5 and Xbox Series X GPU TFLOPs Difference “Probably” Won’t Matter in the Long Run – Hellpoint Developer

While 2 TF, or 18%, don't seem like big numbers, let's all remember what could be accomplished with 1.8 TF. I feel like there is a certain level of fidelity that will be missing from PS5 games. On the flip side, with the PS5 being able to load so many assets all at once, I feel like LOD balancing will be less of an issue than on Xbox. Not that it will be a huge issue on either machine, since it all comes down to how the game is designed.
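For reference, here's a quick sanity check on where the "2 TF or 18%" figures come from, using the two consoles' advertised peak numbers (marketing figures, not sustained real-world throughput):

```python
# Headline peak compute figures (advertised, not sustained real-world numbers).
xsx_tf = 12.155
ps5_tf = 10.28

delta = xsx_tf - ps5_tf                  # absolute gap in teraflops
advantage = (xsx_tf / ps5_tf - 1) * 100  # Series X advantage over PS5

print(f"Absolute gap: {delta:.3f} TF")   # Absolute gap: 1.875 TF
print(f"Relative gap: {advantage:.1f}%") # Relative gap: 18.2%
```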

I think the biggest difference will be that PS5 will have lots of objects on screen with less detail, whereas Xbox will have fewer objects on screen with more detail.
Actually, the Xbox will have more things on screen AND more detail.
 

silent head

Member
10.3 is boost mode (overclocked) and not consistent.
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense, as most game engines right now are architected with the low-performance Jaguar in mind - even a doubling of throughput (i.e. 60fps vs 30fps) would hardly tax PS5's Zen 2 cores.
 

S0ULZB0URNE

Member
Sure, Xbox One ran at 900p vs 1080p on PS4, with less than 1 TF difference; not a big difference...
Anywho, excited for both upcoming consoles.
It was a big difference, PS4 vs Xbox One, as was Xbox One X vs PS4 Pro.

I see 3rd-party games having a small advantage on XSX and PS5 having the best-looking exclusives in gaming.
 
Several developers speaking to Digital Foundry have stated that their current PS5 work sees them throttling back the CPU in order to ensure a sustained 2.23GHz clock on the graphics core. It makes perfect sense, as most game engines right now are architected with the low-performance Jaguar in mind - even a doubling of throughput (i.e. 60fps vs 30fps) would hardly tax PS5's Zen 2 cores.

Kinda sucks that the CPU has to be throttled back though
 

Trogdor1123

Member
Probably true; the difference between Xbox One and PS4 wasn't much in most cases. This will probably be even less pronounced given diminishing returns.
 

DForce

NaughtyDog Defense Force
900p vs 1080p and 1440p vs 2160p are big differences.
If someone says no to either...
They have a shitty display or they need to get an eye exam.
Xbox One X almost doubled the PS4 Pro's resolution in most cases. The PS4 Pro also had less memory and couldn't run 4K textures like the X, which made the image quality worse.

900p vs 1080p is a smaller gap, not nearly the same size.
720p vs 1080p is what we saw early on between both consoles.
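If anyone wants the raw pixel math behind those comparisons, here's a quick back-of-the-envelope check (standard 16:9 resolutions assumed):

```python
# Pixel counts for the standard 16:9 resolutions mentioned above.
resolutions = {
    "720p":  (1280, 720),
    "900p":  (1600, 900),
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

def pixels(name):
    w, h = resolutions[name]
    return w * h

for low, high in [("900p", "1080p"), ("1440p", "2160p"), ("720p", "1080p")]:
    ratio = pixels(high) / pixels(low)
    print(f"{high} pushes {ratio:.2f}x the pixels of {low}")

# 1080p pushes 1.44x the pixels of 900p
# 2160p pushes 2.25x the pixels of 1440p
# 1080p pushes 2.25x the pixels of 720p
```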
 
It is not factual, no matter how much you wish it to be. Getting angry does not give your point any more weight.
The "PS5 as a cheap, reactionary, half-hearted step" narrative has not stuck; stop throwing it at the wall (... and disingenuously pretending you are doing anything but that).
I'm just curious. Is the 18% power delta the PS5 running at maximum performance? Is it possible for the PS5 to perform any less than its maximum performance? Microsoft made a big deal about a fixed clock rate of the CPU and GPU. The PS5 has variable clocks. Unless the 18% represents the PS5 running at its lowest frequency, why is it not a possibility that the delta could be bigger than 18%?
 

oldergamer

Member
Let's just sum this thread up: the difference in TFLOPs won't matter to his game because he isn't really pushing either console to its limits.

That's basically what it comes down to... and it is a valid point. Not every game is going to push what can be done, and people here need to realize that.
 

Mista

Banned
Let's just sum this thread up: the difference in TFLOPs won't matter to his game because he isn't really pushing either console to its limits.

That's basically what it comes down to... and it is a valid point. Not every game is going to push what can be done, and people here need to realize that.
Well said, sir.
 

Vawn

Banned
Let's just sum this thread up: the difference in TFLOPs won't matter to his game because he isn't really pushing either console to its limits.

That's basically what it comes down to... and it is a valid point. Not every game is going to push what can be done, and people here need to realize that.

Yup. Most games don't even push the less-than-2 TF of the base PS4, because it's too expensive to make games at that level, regardless of the hardware.

The number of games that will use the difference between 10.3 and 12 TF will be basically nil. Things like a 2x faster SSD will make a more noticeable difference for most games.
 

DForce

NaughtyDog Defense Force
I'm just curious. Is the 18% power delta the PS5 running at maximum performance? Is it possible for the PS5 to perform any less than its maximum performance? Microsoft made a big deal about a fixed clock rate of the CPU and GPU. The PS5 has variable clocks. Unless the 18% represents the PS5 running at its lowest frequency, why is it not a possibility that the delta could be bigger than 18%?
If the worst-case scenario drops the frequency by about 2%, that's a very small drop. That would still leave roughly an 18-21% gap, depending on where it's dropping. It's also likely that devs will maximize GPU performance and have games use the full 10.28 TF.


There's about a 24% difference between an RTX 2070 Super and an RTX 2080 Super: the 2070 Super is roughly 9 TF and the 2080 Super is roughly 11.1 TF.

If you look at reviews of these cards, you'll see that the gap in TF doesn't translate into an equally large gap in actual performance.
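A minimal sketch of where that 18-21% range comes from, assuming TF scales linearly with GPU clock (a simplification; real performance won't track TF exactly):

```python
# Rough estimate of the Series X / PS5 gap if PS5's variable clock dips slightly.
# Assumes TF scales linearly with GPU clock (a simplification).
XSX_TF = 12.155
PS5_TF_PEAK = 10.28

for clock_drop in (0.00, 0.01, 0.02):   # 0%, 1%, 2% downclock
    ps5_tf = PS5_TF_PEAK * (1 - clock_drop)
    gap = (XSX_TF / ps5_tf - 1) * 100
    print(f"{clock_drop:.0%} drop -> PS5 {ps5_tf:.2f} TF, gap {gap:.1f}%")

# 0% drop -> PS5 10.28 TF, gap 18.2%
# 1% drop -> PS5 10.18 TF, gap 19.4%
# 2% drop -> PS5 10.07 TF, gap 20.7%
```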
 
If the worst-case scenario drops the frequency by about 2%, that's a very small drop. That would still leave roughly an 18-21% gap, depending on where it's dropping. It's also likely that devs will maximize GPU performance and have games use the full 10.28 TF.


There's about a 24% difference between an RTX 2070 Super and an RTX 2080 Super: the 2070 Super is roughly 9 TF and the 2080 Super is roughly 11.1 TF.

If you look at reviews of these cards, you'll see that the gap in TF doesn't translate into an equally large gap in actual performance.
This is yet to be seen. What happens when something heavily needs to prioritize the CPU? What happens when the GPU and CPU are both being hammered? It seems like a really dumb way to go about things compared to a fixed-clock system where you can draw on everything all at once, without question, without issue, without complication, without sacrifice.
 

TGO

Hype Train conductor. Works harder than it steams.
Hellpoint Dev be all like
[reaction GIF]
 

PocoJoe

Banned
Yes, there are other factors this time too; still, the difference is even bigger now... lol
I don't get what you are saying... Are you saying there's not a difference, or what?

You can't be serious?

Or are you one of those people that don't understand percentages and ratios at all?

1 vs 2

10 vs 12

110 vs 120

1100 vs 1200

The absolute difference gets "bigger", but the proportional difference gets smaller and smaller.
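Spelling that ladder out (trivial, but it makes the point concrete):

```python
# The ladder above, in numbers: absolute difference vs relative difference.
pairs = [(1, 2), (10, 12), (110, 120), (1100, 1200)]

for a, b in pairs:
    absolute = b - a
    relative = (b / a - 1) * 100
    print(f"{a} vs {b}: absolute {absolute}, relative {relative:.1f}%")

# 1 vs 2: absolute 1, relative 100.0%
# 10 vs 12: absolute 2, relative 20.0%
# 110 vs 120: absolute 10, relative 9.1%
# 1100 vs 1200: absolute 100, relative 9.1%
```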
 
Yup. Most games don't even push the less-than-2 TF of the base PS4, because it's too expensive to make games at that level, regardless of the hardware.

The number of games that will use the difference between 10.3 and 12 TF will be basically nil. Things like a 2x faster SSD will make a more noticeable difference for most games.
“Advantage of Console A doesn't matter because neither console will be pushed to its limits, so you won't see a difference. But Console B's advantage makes all the difference and will be used at max 100% of the time.”
Am I doing this right?
 
You can't be serious?

Or are you one of those people that don't understand percentages and ratios at all?

1 vs 2

10 vs 12

110 vs 120

1100 vs 1200

The absolute difference gets "bigger", but the proportional difference gets smaller and smaller.
This is fine logic until you throw in the reality of what that difference means in context.

The Series X GPU's advantage over the PlayStation 5 is anywhere from one and a half to two PlayStation 4 GPUs' worth of power.

Percentages need context, and with that in mind it doesn't seem so small.
 
Can't wait for the first games to be shown and everyone having to break out microscopes to try to find the differences.

But... the SSD in PS5 is supposed to make games have less-than-zero load times and allow for experiences and game worlds not possible on XSX. Why would anyone need a microscope to find any difference?
 

MCplayer

Member
You can't be serious?

Or are you one of those people that don't understand percentages and ratios at all?

1 vs 2

10 vs 12

110 vs 120

1100 vs 1200

The absolute difference gets "bigger", but the proportional difference gets smaller and smaller.
Dude, I do understand; I just don't know where he stands. Do you think the Xbox One's difference from the PS4 (the OG consoles) is bigger than the PS5's from the Xbox Series X?
 
I find it humorous that now that we know Lockhart is about to be unveiled, suddenly TFs don't matter, Lockhart isn't going to hold back development, and the SSD and GPU are the real stars of next gen. Now other devs are starting to echo the same sentiments. Interesting. Veeeeeeery interesting.
 

pixelation

Member
Sure won't matter to me. PS2 was less powerful than the Xbox? I went where the games were at. PS3 was worse for 3rd-party games? Who gives a fuck, I still went where the games were at. PS4 was more powerful than the base XBO? Still went where the games were at. PS5 less powerful than the XBSX? Who the fuck cares, I will go with PS5 because I know they always bring it when it comes to the games. Sure, MS has the power, but so what? They're not gonna get me with a carrot that promises awesome new games (that are still way off in the future); they have been dangling that carrot in front of us for years now.
 

MCplayer

Member
Sure won't matter to me. PS2 was less powerful than the Xbox? I went where the games were at. PS3 was worse for 3rd-party games? Who gives a fuck, I still went where the games were at. PS4 was more powerful than the base XBO? Still went where the games were at. PS5 less powerful than the XBSX? Who the fuck cares, I will go with PS5 because I know they always bring it when it comes to the games. Sure, MS has the power, but so what? They're not gonna get me with a carrot that promises awesome new games (that are still way off in the future); they have been dangling that carrot in front of us for years now.
Yes, there would be no consoles without games. I will get both.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Why is EVERYONE in the whole INDUSTRY - every dev, every journalist - trying HARD AS FUCK to downplay the huge power difference and the relevance of power? Like, suddenly power doesn't matter?! wtf is going on? Right before the PS5 GDC talk the narrative was completely different. What is going on? This is crazy.

It wasn't even remotely like this during the XBOX ONE reveal. lol
More like they know more than armchair devs, and the power difference is tiny and won't matter.
 

93xfan

Banned
Hello, "brand new member" to these forums.

That was, what, a 40% difference? What is it this time? 10.3 to 12.

10.28 variable (though mostly steady if the CPU isn’t stressed) to 12.155 if you want to be accurate. Probably won’t be a major difference alone.

More CUs for ray tracing may be something that makes a difference, though.

Also, variable rate shading is important as well, and Sony and Cerny have been quiet on whether they support it.

My guess is we will see better resolutions and some better effects in GPU-constrained games, and that may not make much of a difference. If the game is heavy on the CPU side, that could be a different story.

Supposed ray tracing advantages may be the big winner for MS.
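For anyone wondering where 10.28 and 12.155 come from: they fall out of the standard GPU compute formula, using the publicly announced CU counts and clocks:

```python
# TFLOPs = CUs * 64 shader lanes per CU * 2 FLOPs per lane per clock (FMA) * clock
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)    # 36 CUs at up to 2.23 GHz (variable)
xsx = tflops(52, 1.825)   # 52 CUs at a fixed 1.825 GHz

print(f"PS5: {ps5:.2f} TF")   # PS5: 10.28 TF
print(f"XSX: {xsx:.2f} TF")   # XSX: 12.15 TF
```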
 

Romulus

Member
Sure, Xbox One ran at 900p vs 1080p on PS4, with less than 1 TF difference; not a big difference...
Anywho, excited for both upcoming consoles.

A better comparison for this gen would be PS4 having 1.2 TF and Xbox One having 1 TF, if you want to compare it to XSX and PS5.

The real TF difference between PS4 and Xbox One was nearly half the Xbox One's total power. Think about that. Now, 1.8 TF is almost nothing when you look at the overall picture between 10.2 and 12.
But the even bigger takeaway from PS4 vs XB1 was that those consoles ran native resolutions until fairly recently, so that difference in resolution, while small, was more noticeable than it will be now. It will matter even less this time because of modern dynamic-resolution and checkerboard techniques. So you have a smaller percentage of power difference AND far better tactics for minimizing image quality loss. I wouldn't be surprised if 90% of games are indistinguishable to the trained eye, if not more.
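To put a rough number on that: here's a sketch of what an ~18% compute gap could mean in resolution terms, assuming pixel count scales linearly with TF (a common rule of thumb, not a guarantee):

```python
import math

XSX_TF, PS5_TF = 12.155, 10.28
target = (3840, 2160)                # suppose XSX renders native 4K

# Scale pixel count by the TF ratio, keeping the 16:9 aspect ratio.
scale = math.sqrt(PS5_TF / XSX_TF)   # per-axis scale factor
w, h = round(target[0] * scale), round(target[1] * scale)
print(f"PS5 equivalent: ~{w}x{h}")   # PS5 equivalent: ~3531x1986

# A sub-10% drop per axis: exactly the kind of gap dynamic resolution hides.
```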
 
It's 18% at minimum, and that percentage will scale upward depending on how far the PlayStation 5's GPU scales its frequency downward.

You cannot correct people; you have the wrong idea of how it works.

The whole purpose of having a variable frequency is to use only what is needed to run the game. That saves power and heat generation, and it exploits the simple fact that games vary in complexity frame by frame and scene by scene.

If PS5's GPU downclocks, it's because the scene doesn't require the full GPU for rendering; if the scene is complex, it uses the full clock. As simple as that, and it scales automatically, with no special options needed from developers. If you run an indie game, the console will run much cooler, since it can run the game at full speed using very little power. If you run, say, FF7R chapter 2, it can lower its clock in simple scenes like menus and passages and run at full clock when playing a complex scene that requires it, like a battle. The game will always run consistently, using as much as it needs.
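A toy model of the scheme being described here; the load numbers and the 0.85 clock floor are invented for illustration, not Sony's actual algorithm:

```python
# Toy model only: clock follows how much of the GPU the current scene needs,
# capped at the rated maximum. All numbers below are invented for illustration.
MAX_CLOCK_GHZ = 2.23  # PS5's advertised peak GPU clock

def gpu_clock(frame_load):
    """frame_load: fraction of full GPU throughput the scene needs (0.0 to 1.0)."""
    return MAX_CLOCK_GHZ * max(0.85, min(1.0, frame_load))  # assumed 0.85 floor

for scene, load in [("menu", 0.30), ("exploration", 0.90), ("boss battle", 1.00)]:
    print(f"{scene}: {gpu_clock(load):.2f} GHz")

# menu: 1.90 GHz
# exploration: 2.01 GHz
# boss battle: 2.23 GHz
```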
 
That's... not how percentages work.
PS3 had about 15 PS2s' worth more compute than the XB360 - that didn't make that difference (50%) larger than the jump from PS2 to XB (~300%).
Of course that's exactly how it works, because the way it's been framed lacks any relevant context.

The reality is that 18%-XX% is the equivalent of one and a half to two PlayStation 4 GPUs stacked onto the PlayStation 5's GPU.

You can frame it as only 18%, but for modern rendering, the context of tying that number to actual hardware on the market today is quite substantive.

You cannot correct people; you have the wrong idea of how it works.

The whole purpose of having a variable frequency is to use only what is needed to run the game. That saves power and heat generation, and it exploits the simple fact that games vary in complexity frame by frame and scene by scene.

If PS5's GPU downclocks, it's because the scene doesn't require the full GPU for rendering; if the scene is complex, it uses the full clock. As simple as that, and it scales automatically, with no special options needed from developers. If you run an indie game, the console will run much cooler, since it can run the game at full speed using very little power. If you run, say, FF7R chapter 2, it can lower its clock in simple scenes like menus and passages and run at full clock when playing a complex scene that requires it, like a battle. The game will always run consistently, using as much as it needs.
This isn't accurate whatsoever, because if both the GPU and CPU get hammered, priority will have to be given to one or the other, and one component will suffer a loss in performance capability.
 