
PS5 and Xbox Series X GPU TFLOPs Difference “Probably” Won’t Matter in the Long Run – Hellpoint Developer

MCplayer

Member
A better comparison for this gen would be PS4 having 1.2 TF and Xbox One having 1 TF, if you want to compare it to XSX and PS5.

The real TF difference between PS4 and Xbox One was nearly half the Xbox One's total power. Think about that. Now, a ~1.8 TF gap is almost nothing when you look at the overall picture of 10.2 vs 12.
But the even bigger takeaway from PS4 vs XB1 is that those consoles were running native resolutions until fairly recently, so that difference in resolution, while small, was more noticeable than it will be now. It will be even less noticeable this time because of modern dynamic resolution and checkerboard rendering tactics. So you have a smaller percentage of power difference AND far better tactics for hiding image quality loss. I wouldn't be surprised if 90% of games are indistinguishable to the trained eye, if not more.
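To make the dynamic resolution point concrete, here is a minimal sketch (Python, purely illustrative, not any engine's actual code) of the kind of feedback loop a dynamic resolution scaler uses: if the last frame blew past the frame budget, shrink the render scale a little; if there is headroom, drift back toward native. The budget, bounds, and smoothing factor are made-up numbers.

```python
# Toy dynamic resolution scaler (illustrative only, not any engine's real code).
# Assumption: GPU frame time scales roughly linearly with rendered pixel count.

TARGET_FRAME_MS = 16.7           # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.7, 1.0  # e.g. never drop below 70% of native per axis

def next_resolution_scale(current_scale: float, last_gpu_ms: float) -> float:
    """Pick the per-axis render scale for the next frame from the last frame's GPU time."""
    headroom = TARGET_FRAME_MS / max(last_gpu_ms, 1e-3)
    ideal_scale = current_scale * headroom ** 0.5        # sqrt because scale is per axis
    new_scale = 0.8 * current_scale + 0.2 * ideal_scale  # smooth to avoid visible pumping
    return min(MAX_SCALE, max(MIN_SCALE, new_scale))

# Example: last frame took 19 ms at full resolution, so back off slightly.
print(next_resolution_scale(1.0, 19.0))   # ~0.99, settling lower over a few frames
```

With a loop like this running on both consoles, a given compute gap tends to show up as a slightly lower average render scale on the weaker machine rather than as anything obvious on screen, which is the point being made above.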


OK, and what about having more CUs compared to a higher frequency? Which is more beneficial?
Agreed on the rendering techniques; it would be amazing if AMD could develop a DLSS-equivalent feature for the next-gen consoles.
 

Romulus

Member
The reality is that 18% - XX% is the equivalent of one and a half to two PlayStation 4 GPUs stacked onto the PlayStation 5's GPU.

You can frame it as only being 18%, but in the context of modern rendering, the association to actual hardware on the market today is quite substantive.


Why is PS4 1TF now?

Anyway, talking about framing, the 18% number is probably like 50 PS2s in terms of horsepower, but that doesn't really add any perspective on power for the current gen. The PS2 and PS4 were machines from a different time with different targets. Those little past-gen TF numbers get chewed up quickly the higher you go with more demanding resolution, lighting, etc.
 
Why is PS4 1TF now?

Anyway, talking about framing, the 18% number is probably like 50 PS2s in terms of horsepower, but that doesn't really add any perspective on power for the current gen. The PS2 and PS4 were machines from a different time with different targets. Those little past-gen TF numbers get chewed up quickly the higher you go with more demanding resolution, lighting, etc.
The system has at minimum a 1.87 teraflop advantage on RDNA 2.0 architecture. RDNA 1.0 alone has a roughly 1.25x uplift in computational performance over GCN, which equates to about 2.33 GCN teraflops. Account for the variable teraflop figure of the PS5 and the architectural uplifts in performance, and the gap is anywhere from 1.5 to 2 PlayStation 4 GPUs in excess of the PlayStation 5's.
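For anyone following along, the arithmetic in that post applies the roughly 1.25x per-teraflop uplift it cites for RDNA over GCN as a flat multiplier, which is of course a simplification. A quick sketch:

```python
# Sketch of the "GCN-equivalent teraflops" arithmetic from the post above.
# Assumption: a flat 1.25x perf-per-TF uplift for RDNA over GCN (a simplification).

RDNA_OVER_GCN = 1.25

def gcn_equivalent(rdna_tflops: float) -> float:
    return rdna_tflops * RDNA_OVER_GCN

gap_rdna = 1.87                      # the ~1.87 TF Series X advantage cited above
print(gcn_equivalent(gap_rdna))      # ~2.34, the "2.33 GCN teraflops" in the post
```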
 
Last edited:

Romulus

Member
OK, and what about having more CUs compared to a higher frequency? Which is more beneficial?
Agreed on the rendering techniques; it would be amazing if AMD could develop a DLSS-equivalent feature for the next-gen consoles.

I think the PS5 has fewer or about the same number of CUs as the PS4 Pro and X1X; will it underperform those? Probably not.
 
Last edited:

MCplayer

Member
OK, so guys, help a dumb idiot here start all over again on this subject; I'm trying to figure all of this out after reading all of the posts in this thread. (I might have learned about RDNA GPUs wrong.)

Please explain to me how an RDNA GPU differs from a GCN GPU. I know RDNA can't be directly compared to GCN, since at the "same TF" RDNA has significantly more performance.

And how do these next-gen GPUs, XSX and PS5, differ from each other?
 

nikolino840

Member
Can’t wait for the first games to be shown and everyone having to break out microscopes to try to find the differences.
Like always, I think... but I remember in the first years of PS4/One speaking with people who would say, "Xbox sucks, PS4 is more powerful and the graphics are better."
 
OK, so guys, help a dumb idiot here start all over again on this subject; I'm trying to figure all of this out after reading all of the posts in this thread. (I might have learned about RDNA GPUs wrong.)

Please explain to me how an RDNA GPU differs from a GCN GPU. I know RDNA can't be directly compared to GCN, since at the "same TF" RDNA has significantly more performance.

And how do these next-gen GPUs, XSX and PS5, differ from each other?
It's a significantly better and more efficient architecture; you get more out of less.

As an example, if the Xbox Series X on RDNA 1.0 architecture were its stated 12.155 teraflops, that would essentially equate to about 15.2 teraflops of performance on GCN.
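As for how the two next-gen GPUs actually arrive at their headline numbers: the teraflop figure is just shader count times clock speed. A quick sketch using the publicly stated configurations (52 CUs at 1825 MHz for Series X, 36 CUs at up to 2230 MHz for PS5); the 64 lanes per CU and 2 FLOPs per lane per clock are standard for these AMD architectures.

```python
# Peak FP32 compute: CUs x 64 shader lanes x 2 FLOPs per lane per clock x clock (GHz).
def peak_tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000.0

print(peak_tflops(52, 1.825))  # ~12.15 TF - Xbox Series X (fixed clock)
print(peak_tflops(36, 2.23))   # ~10.28 TF - PS5 (at its maximum variable clock)
```

This is also why the earlier "more CUs vs higher frequency" question has no one-line answer: both feed the same product, but a wider, slower GPU and a narrower, faster one can behave differently outside raw compute (fill rate, cache behaviour, keeping all the CUs fed), none of which the teraflop number captures.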
 

MCplayer

Member
It's a significantly better and more efficient architecture; you get more out of less.

As an example, if the Xbox Series X on RDNA 1.0 architecture were its stated 12.155 teraflops, that would essentially equate to about 15.2 teraflops of performance on GCN.
OK, and what about 10.07 - 10.275 TF compared to 12.115 TF RDNA 2.0 (1.85)? It isn't as big of a jump as, say, Xbox One to PS4, or is it?
 
Of course that's exactly how it works because the way it's been framed lacks any relevant context.

The reality is that 18% - XX% is the equivalent of one and a half to two PlayStation 4 GPUs stacked onto the PlayStation 5's GPU.

You can frame it as only being 18%, but in the context of modern rendering, the association to actual hardware on the market today is quite substantive.

:messenger_grinning_smiling:
stop the mental gymnastics

You can use Game Boys if you want; the difference is the same.



This isn't accurate whatsoever, because if both the GPU and CPU get hammered, priority will have to be taken from one or the other and a component will suffer a loss in performance capability.

There is enough power budget for both to run at full clock if needed.

CPUs are less of a problem in consoles: there is no interference from background OS programs, draw calls are a non-issue, and this generation the CPU actually has less work, as workloads like physics have moved to the GPU and you can issue draw calls from different cores (remember how DX12 reduced CPU power consumption on PC because of this).
 
OK, and what about 10.07 - 10.275 TF compared to 12.115 TF RDNA 2.0 (1.85)? It isn't as big of a jump as, say, Xbox One to PS4, or is it?
The percentage jump is not as large, but don't let anyone downplay the contextual implications of the capability behind that advantage.

It's still a sizable increase in compute; it can allow for higher resolution targets, higher graphical ceilings, and less impact on RT performance. It's not small.
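To put a rough number on "higher resolution targets": if you assume pixel cost scales linearly with compute (a crude assumption, since not everything in a frame is resolution-bound), an ~18% compute advantage buys only a modest resolution bump. A back-of-envelope sketch:

```python
# Rough sketch: what an ~18% compute gap buys in resolution, assuming pixel cost
# scales linearly with compute (a crude model; real frames aren't fully resolution-bound).
compute_ratio = 12.155 / 10.28           # ~1.18
base_w, base_h = 3840, 2160              # suppose the stronger GPU targets native 4K
scale = (1 / compute_ratio) ** 0.5       # per-axis scale for the same frame time
print(round(base_w * scale), round(base_h * scale))   # ~3531 x 1986, i.e. roughly "1980p"
```

Which is the sense in which both sides of this thread can be right: it is a real amount of extra compute, and it is also the kind of gap that mostly disappears behind dynamic resolution and reconstruction.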
 
OK, and what about 10.07 - 10.275 TF compared to 12.115 TF RDNA 2.0 (1.85)? It isn't as big of a jump as, say, Xbox One to PS4, or is it?
People will say "only 18% difference," but that's not accurate; 2 teraflops is a really sizable difference, more computing power than the Xbox One and PS4 combined.
:messenger_grinning_smiling:
stop the mental gymnastics

You can use Game Boys if you want; the difference is the same.





There is enough power budget for both to run at full clock if needed.

CPUs are less of a problem in consoles: there is no interference from background OS programs, draw calls are a non-issue, and this generation the CPU actually has less work, as workloads like physics have moved to the GPU and you can issue draw calls from different cores (remember how DX12 reduced CPU power consumption on PC because of this).
If there is enough power budget for both to run at full clock, then why is that not the case? Why bother with a variable clock instead of a fixed one?
 
Last edited:

Romulus

Member
People will say "only 18% difference," but that's not accurate; 2 teraflops is a really sizable difference, more computing power than the Xbox One and PS4 combined.


The reason people say the gap is only 18% is that the targets are different now. It's much more taxing, lol. You can tape 100 GameCubes to a PS5 and it won't make much difference, because in this era the power from that time is near irrelevant. So when you say things like "Xbox One and PS4 combined," it really doesn't mean much at all either.
 
Last edited:
The reason people say the gap is only 18% is that the targets are different now. It's much more taxing, lol. You can tape 100 GameCubes to a PS5 and it won't make much difference, because in this era the power from that time is near irrelevant. So when you say things like "Xbox One and PS4 combined," it really doesn't mean much at all either.
Every single teraflop matters when ray tracing is supposed to be a thing on these consoles.
 
The reason people say the gap is only 18% is that the targets are different now. It's much more taxing, lol. You can tape 100 GameCubes to a PS5 and it won't make much difference, because in this era the power from that time is near irrelevant. So when you say things like "Xbox One and PS4 combined," it really doesn't mean much at all either.
Of course it does, because those are still the forefront of console gaming RIGHT NOW, so their computational impact is well understood. Having a system with the combined GPU surplus of a PS4 and Xbox One held over another system is a big deal.

There are a lot of different implications there; that's no small amount of compute, especially when you consider what each of those systems' GPUs, even taken separately, is capable of rendering.
 

Romulus

Member
Of course it does, because those are still the forefront of console gaming RIGHT NOW, so their computational impact is well understood. Having a system with the combined GPU surplus of a PS4 and Xbox One held over another system is a big deal.

There are a lot of different implications there; that's no small amount of compute, especially when you consider what each of those systems' GPUs, even taken separately, is capable of rendering.

Those machines are only at the forefront by default. Long generation. And they were underpowered on launch day, so that says even less.
 
So let me be clear. So what I'm hearing from people actually working on these things is that the Xbox is not significantly more powerful than the PlayStation, despite this teraflops number, and that the teraflops -- it might be a useful measure of comparison in some ways, but ultimately it's a theoretical max speed, and there are so many things that could come between where you are trying to get and what you are actually able to do, to the point where the GPU could have X number of flops that it can actually perform, but if the developer isn't able to actually access all of it for whatever reason, then it doesn't even matter, and there are so many other variables here that go into it.

 
People will say "only 18% difference," but that's not accurate; 2 teraflops is a really sizable difference, more computing power than the Xbox One and PS4 combined.

If there is enough power budget for both to run at full clock, then why is that not the case? Why bother with a variable clock instead of a fixed one?

Mark Cerny commented on how they use it to save power (which also reduces heat generation). The frequency is based on workload, not on heat generation or power consumption, so the PS5 will perform the same no matter the room temperature; the variations will be the same. You can save power here and there and generate less heat. No game works the CPU and GPU at 100% all the time, and even then "busy" doesn't mean using every bit of the silicon (which is impossible on a CPU anyway, as you don't run the whole instruction set every clock XD). Games work in a pipeline generating frames for a display with a maximum Hz, and different scenes, even different frames, don't have the same cost.

Cerny also mentioned AMD's SmartShift in the Eurogamer interview, so the tech, or at least part of it, comes from AMD, which also makes the CPU and GPU of the console.
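For anyone trying to picture what "shifting the power budget based on workload" means, here is a deliberately toy sketch. It is not Sony's or AMD's actual algorithm and the wattage numbers are invented; it only illustrates the idea that a fixed combined budget can be handed out to whichever block needs it, with clocks backing off only when the combined demand exceeds the budget.

```python
# Toy illustration of a SmartShift-style shared power budget (NOT the real algorithm).
# All wattage figures are made up for illustration.

TOTAL_BUDGET_W = 200.0   # hypothetical combined CPU+GPU budget

def allocate(cpu_demand_w: float, gpu_demand_w: float):
    """Grant both demands if they fit; otherwise scale both back proportionally
    so the total stays inside the fixed budget (i.e. a small downclock)."""
    total = cpu_demand_w + gpu_demand_w
    if total <= TOTAL_BUDGET_W:
        return cpu_demand_w, gpu_demand_w            # both run at full clocks
    scale = TOTAL_BUDGET_W / total
    return cpu_demand_w * scale, gpu_demand_w * scale

print(allocate(40.0, 150.0))   # fits: (40.0, 150.0), no downclock needed
print(allocate(70.0, 160.0))   # over budget: (~60.9, ~139.1), both shaved slightly
```

The disagreement in the thread is really about how often the second case happens in real games; Cerny's public claim is that it is rare and that the required downclock is only a couple of percent.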
 
Those machines are only at the forefront by default. Long generation. And they were underpowered on launch day, so that says even less.
Downplay it all you want, the Series X has a GPU which could effectively render both this....

[image: 30376161547_4503e1ea28_o.png]


and this

[image: GodOfWar_01_Resolution.png]


AND render exactly what the PlayStation 5 is rendering at the same time, with additional RT, and you really want to say that's nothing? Having the GPU surplus of an Xbox One and a PS4 combined, plus additional RT hardware, is a big deal no matter what you say...
 
Last edited:
Mark Cerny commented on how they use it to save power (which also reduces heat generation). The frequency is based on workload, not on heat generation or power consumption, so the PS5 will perform the same no matter the room temperature; the variations will be the same. You can save power here and there and generate less heat. No game works the CPU and GPU at 100% all the time, and even then "busy" doesn't mean using every bit of the silicon (which is impossible on a CPU anyway, as you don't run the whole instruction set every clock XD). Games work in a pipeline generating frames for a display with a maximum Hz, and different scenes, even different frames, don't have the same cost.

Cerny also mentioned AMD's SmartShift in the Eurogamer interview, so the tech, or at least part of it, comes from AMD, which also makes the CPU and GPU of the console.
My god
 
F

Foamy

Unconfirmed Member
Knock off juvenile “Sony Pony” or “Xbox” console war theatrics. It serves nothing but goofy arguments.
Wow. This is just sad now. A nobody from a nobody Kickstarter developer with zero history props up the PS5, and Ponys cling to it in desperation and take it as gospel.
And Hellpoint... as far as graphics go... looks like ass.
 
Last edited by a moderator:

Deto

Banned
People will say "only 18% difference," but that's not accurate; 2 teraflops is a really sizable difference, more computing power than the Xbox One and PS4 combined.

If there is enough power budget for both to run at full clock, then why is that not the case? Why bother with a variable clock instead of a fixed one?


The comparison is using percentages.
 

Deto

Banned
Of course that's exactly how it works because the way it's been framed lacks any relevant context.

The reality is that 18% - XX% is the equivalent of one and a half to two PlayStation 4 GPUs stacked onto the PlayStation 5's GPU.

You can frame it as only being 18%, but in the context of modern rendering, the association to actual hardware on the market today is quite substantive.

This isn't accurate whatsoever, because if both the GPU and CPU get hammered, priority will have to be taken from one or the other and a component will suffer a loss in performance capability.

Are you ignorant or a troll?


With all this nonsense, do you think you will convince anyone? You look like a clown playing to the fanatical Xbox audience.
 
Last edited:

Romulus

Member
Downplay it all you want, the Series X has a GPU which could effectively render both this....

[image: 30376161547_4503e1ea28_o.png]


and this

[image: GodOfWar_01_Resolution.png]


AND render exactly what the PlayStation 5 is rendering at the same time, with additional RT, and you really want to say that's nothing? Having the GPU surplus of an Xbox One and a PS4 combined, plus additional RT hardware, is a big deal no matter what you say...

That's so ridiculous. I could post incredible pictures of Chronicles of Riddick on Xbox or Rogue Squadron on GameCube. "Imagine, the XSX can render this hundreds of times!" Which is accurate, but so what?
 

bitbydeath

Member
but... the SSD in PS5 is supposed to make games have less than 0 load times and allow for experiences and game worlds not possible on XSX. Why would anyone need a microscope to find any difference?

In 3rd-party comparisons, load times will obviously be different.
 
Last edited:

Deto

Banned
That's so ridiculous. I could post incredible pictures of Chronicles of Riddick on Xbox or Rogue Squadron on GameCube. "Imagine, the XSX can render this hundreds of times!" Which is accurate, but so what?


I'm trying to imagine what his purpose is here. Speaking nonsense, with everyone knowing it's bullshit.
What does he think? That if he writes this here, it will become reality?


I was really curious.

What you write is wrong and irrelevant. So what is the joy in writing nonsense?

You won't even have a name to stand behind, because when the games come out you'll have to disappear.
 
Last edited:
What's the argument?
The argument that opposes the math (using percentages)?

What do you want? To talk nonsense and convince someone in this thread?

What you write is irrelevant and wrong, and it's not going to make any difference, except for smart people laughing at the nonsense you write.
You can't make an argument; our interaction ends here.

That's so ridiculous. I could post incredible pictures of Chronicles of Riddick on Xbox or Rogue Squadron on GameCube. "Imagine, the XSX can render this hundreds of times!" Which is accurate, but so what?
If you extrapolate the rendering load being put on the hardware? Yes, it feasibly could.

You're absolutely missing the point.
 
Last edited:

Deto

Banned
You can't make an argument; our interaction ends here.


If he were a store salesman, being an Xbox fanboy, he could cheat some poor guy with that nonsense and make an extra sale of an Xbox SX.

But here? What do you expect to accomplish with comparisons that ignore basic mathematics?


I was really curious.
What you write is wrong and irrelevant. So what is the joy in writing nonsense?
You won't even have a name to stand behind, because when the games come out you'll have to disappear.

"If I write on the internet, it becomes reality". This is what you want? delude yourself?


Do you just want to spoil the discussions of anyone interested in the PS5?
"Since I am unhappy, I will ruin the happiness of others on the internet."
Guess what? It will not happen; whoever is in the PS5 hype will not stop being in the hype because of this nonsense that you write.


You don't care about reality, just rhetoric and narrative. So there is no arguing with you; it's like arguing with an anti-vaxxer.
 
Last edited:

Vroadstar

Member
Are you ignorant or a troll?


With all this nonsense, do you think you will convince anyone? You look like a clown playing to the fanatical Xbox audience.

The guy just came back from being banned for creating FUD and spewing BS; not surprised he is at it again, it's his MO. Learn to ignore him. He and G Goliathy both just came back from spreading FUD, X fans both, embarrassing.
 
Last edited:
The guy just came back from being banned for creating FUD and spewing BS; not surprised he is at it again, it's his MO. Learn to ignore him. He and G Goliathy both just came back from spreading FUD, X fans both, embarrassing.
This is a lie. I was banned because I became exceedingly combative, as if any of this is relevant to the previous conversation. Please leave the ad hominems at the door; they're not conducive to this discussion.
 

DForce

NaughtyDog Defense Force
People will say "only 18% difference," but that's not accurate; 2 teraflops is a really sizable difference, more computing power than the Xbox One and PS4 combined.

If there is enough power budget for both to run at full clock, then why is that not the case? Why bother with a variable clock instead of a fixed one?

OK, and what about 10.07 - 10.275 TF compared to 12.115 TF RDNA 2.0 (1.85)? It isn't as big of a jump as, say, Xbox One to PS4, or is it?
RTX 2070 Super - 9.062 TF
RTX 2080 Super - 11.15 TF

Difference is 2.088.

It's not a sizable difference when the gap is 2 TF or even 1.8 TF.

You can compare an RX 5700 vs an RX 5700 XT, which is a 1.8 TF difference. Still, there's no sizable difference.

This is just about TF performance alone, and I just don't see why people think a 2 TF difference between these two cards is a huge difference. It's only 18%, and like others have said, the PS4/XB1 had around a 40% difference.

You're getting a 10-12 FPS difference compared to the higher card. The PS5 will have to lower the resolution *slightly* if it can't hit the 60 fps or 30 fps target while rendering at the highest resolution.
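Since the thread keeps switching between absolute teraflops and percentages, here is the percentage arithmetic for the pairs mentioned above in one place (peak figures only, same caveat as everywhere else in this thread). Note the 2070 Super / 2080 Super gap is actually a slightly larger percentage than the console gap, which if anything reinforces the point about a modest FPS delta.

```python
# Relative gaps for the GPU pairs discussed above (peak TF figures only).
def gap_pct(low_tf: float, high_tf: float) -> float:
    return (high_tf - low_tf) / low_tf * 100

print(round(gap_pct(9.062, 11.15), 1))   # RTX 2070 Super vs 2080 Super: ~23.0%
print(round(gap_pct(10.28, 12.155), 1))  # PS5 vs Xbox Series X: ~18.2%
print(round(gap_pct(1.31, 1.84), 1))     # Xbox One vs PS4: ~40.5%
```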
 
the reality here is only one was vanished for spreading FUD :messenger_winking:
You should try lying less; in this day and age I'm amazed you didn't think someone would pull up the internet cache.

[image: jmz21hJ.png]



As I said, I got exceedingly combative and paid the price for it. Any other things you want to lie about?
 

MCplayer

Member
This is a lie. I was banned because I became exceedingly combative, as if any of this is relevant to the previous conversation. Please leave the ad hominems at the door; they're not conducive to this discussion.

I don't know your history but you seem polite compared to some others. (y)

RTX 2070 Super - 9.062 TF
RTX 2080 Super - 11.15 TF

Difference is 2.088.

It's not a sizable difference when the gap is 2 TF or even 1.8 TF.

You can compare an RX 5700 vs an RX 5700 XT, which is a 1.8 TF difference. Still, there's no sizable difference.

This is just about TF performance alone, and I just don't see why people think a 2 TF difference between these two cards is a huge difference. It's only 18%, and like others have said, the PS4/XB1 had around a 40% difference.

You're getting a 10-12 FPS difference compared to the higher card. The PS5 will have to lower the resolution *slightly* if it can't hit the 60 fps or 30 fps target while rendering at the highest resolution.


OK, good. Now let's look at each console's exclusive features, like DX12 Ultimate and Sony's API, etc. But if the gap isn't that big, good.
 
Last edited:
RTX 2070 Super - 9.062 TF
RTX 2080 Super - 11.15 TF

Difference is 2.088.

It's not a sizable difference when the gap is 2 TF or even 1.8 TF.

You can compare an RX 5700 vs an RX 5700 XT, which is a 1.8 TF difference. Still, there's no sizable difference.

This is just about TF performance alone, and I just don't see why people think a 2 TF difference between these two cards is a huge difference. It's only 18%, and like others have said, the PS4/XB1 had around a 40% difference.

You're getting a 10-12 FPS difference compared to the higher card. The PS5 will have to lower the resolution *slightly* if it can't hit the 60 fps or 30 fps target while rendering at the highest resolution.
The gap between the 2070 Super and the 2080 Super goes anywhere from 5 to 30+ FPS. It can be a small difference; it can also be quite a massive difference.
 

MCplayer

Member
The gap between the 2070 Super and the 2080 Super goes anywhere from 5 to 30+ FPS. It can be a small difference; it can also be quite a massive difference.
I did notice that. If we check both the 5700 vs 5700 XT and the 2070 Super vs 2080 Super, which is closer to the next-gen consoles, taking into account CU count and frequency, even though the next-gen consoles are RDNA 2.0 GPUs?
 
Last edited: