
Developers Begin to Weigh in on the Power Gap Between the Xbox Series X and PlayStation 5. Series X taking the power crown "from all accounts".

S0ULZB0URNE

Member
Apparently AMD has this backward and the base clock is actually not the base clock, the boost clock is...




It's almost as if AMD has already defined these things or something, strange...

GPU: 10.28 TFLOPs, 36 CUs at 2.23GHz (variable frequency)
Boost clock is a different thing, one that neither Sony nor MS has mentioned in their specs.
Both consoles use CUSTOM GPUs and cannot be directly compared to AMD's PC cards.
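For what it's worth, the 10.28 TF figure falls straight out of that spec line with standard RDNA math (64 shaders per CU, 2 FLOPs per shader per clock), so whatever label you put on the 2.23GHz, it is the clock Sony is quoting:

# FP32 TFLOPs = CUs x 64 shaders/CU x 2 FLOPs/clock x clock (GHz) / 1000
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

print(tflops(36, 2.230))   # PS5: ~10.28
print(tflops(52, 1.825))   # XSX: ~12.15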
 

Hendrick's

If only my penis was as big as my GamerScore!
POWER draw can drop 10% from just a couple percent drop in clocks.
2230 is the base clock, which can drop a couple percent in PERFORMANCE:
2230MHz - 2% = 2185.4MHz
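A quick check of what that worst case does to the TF figure (the couple-percent dip is Cerny's own number, so treat this as a ceiling, not a measurement):

peak_mhz = 2230
dipped_mhz = peak_mhz * 0.98            # 2185.4 MHz
tf = lambda mhz: 36 * 64 * 2 * mhz / 1e6
print(tf(peak_mhz), tf(dipped_mhz))     # ~10.28 TF -> ~10.07 TF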
We can see who actually listened to what Cerny said and who didn't.
How much does the CPU clock down when the GPU is boosted?
 

Stuart360

Member
How much does the CPU clock down when the GPU is boosted?
I'm also interested to know what the base CPU clock speed is on the PS5; all we know is the 3.5GHz boost mode. If the base clock is, say, 3.2GHz, then a 400+MHz uptick for the XSX CPU is nothing to sniff at. I have a 300MHz overclock on my 6700K and the difference is very noticeable.
 

Deto

Banned
Talking shit? No, but I'm definitely wading through it.

Yes, I am outright calling Mark Cerny a liar


The PlayStation 5 was, without any shadow of a doubt, a 3.5GHz, 9.X-teraflop fixed-frequency system. Its voltage was matched accordingly and its thermals were regulated to handle those figures. At some point far into the development of the system, Sony clearly got wind of Microsoft's system capability and, being too far along in their design to rework it, they had to manipulate it. They had to implement broader cooling and higher voltages, and implement a set of functions which allows the GPU to push harder at the cost of CPU cycles and the CPU to push harder at the cost of GPU cycles. Both cannot be true at the same time; one has to give for the other to excel.


Again, no one would intentionally design a system this way, because there's no advantage to it. It's nonsensical design; there's no leg up over fixed operation. If the system were locked at 2.23GHz and 3.5GHz, it would be better. However, given the above, they cannot do that, because the system was never designed to operate at those heights. This is a through-and-through reworking to try and close a very large divide with their competitor, not intelligent or originally planned system operation.


Were you sexually abused by the PlayStation?
 

Hendrick's

If only my penis was as big as my GamerScore!
I don't think Cerny is a liar, but he certainly is being told what to say. But ultimately, the specs Sony released for the APU are max boosted clocks.

Since the CPU and GPU cannot both be fully boosted at the same time, and because you cannot sustain max boost indefinitely, those specs are purposefully misleading.
 

Deto

Banned
I don't think Cerny is a liar, but he certainly is being told what to say. But ultimately, the specs Sony released for the APU are max boosted clocks.

Since the CPU and GPU cannot both be fully boosted at the same time, and because you cannot sustain max boost indefinitely, those specs are purposefully misleading.


Cerny said that both are running at full speed simultaneously.


Did you just say he is lying?
 
And yet it didn't have everything correct?

wow
Context was missing. But the chips were the chips. Like the leaks showed the XSX having 56CUs. But was that a dev kit using the full 56CUs, or was that a retail GPU? In hindsight, it was the full chip. The PS5 chip was tested at its back-compat clocks and at native, so it didn't show RT. We don't know what tests were being done.
The leak was correct and real, however.
It showed the PS5 to be 36CUs. Sony people here said it was BS that Sony would have a 36CU chip clocked so high. And yet here we are.
 
I thought the PS5 APU clocks were known?
The CPU and GPU cannot both run at their peak frequency at the same time. If the CPU is running at 3.5GHz, then the GPU can't run at 2.23GHz.
So devs have a power budget they have to work to, and they have to make sure the GPU and CPU stick to the ratio of speeds between them.
As an example:
CPU running at 3.5GHz, the GPU can only go up to 2.0GHz. CPU running at 3.4GHz, the GPU can run up to 2.1GHz. CPU running at 3.3GHz, the GPU can run up to 2.2GHz.
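Nobody outside Sony knows the real curve, but as a toy model of the kind of table devs might work against (the clock pairs are the made-up ones from the example above, not real figures):

# Hypothetical power-budget table: CPU clock (GHz) -> max GPU clock (GHz).
# Purely illustrative; the actual PS5 pairs have not been published.
BUDGET = {3.5: 2.0, 3.4: 2.1, 3.3: 2.2}

def max_gpu_clock(cpu_ghz):
    # Take the tightest cap implied by any table entry at or below the CPU clock.
    caps = [gpu for cpu, gpu in BUDGET.items() if cpu_ghz >= cpu]
    return min(caps, default=2.23)  # below 3.3GHz, assume the GPU can hit max

print(max_gpu_clock(3.5))  # 2.0
print(max_gpu_clock(3.3))  # 2.2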
 
I thought the PS5 APU clocks were known?
The CPU and GPU cannot both run at their peak frequency at the same time. If the CPU is running at 3.5GHz, then the GPU can't run at 2.23GHz.
So devs have a power budget they have to work to, and they have to make sure the GPU and CPU stick to the ratio of speeds between them.
As an example:
CPU running at 3.5GHz, the GPU can only go up to 2.0GHz. CPU running at 3.4GHz, the GPU can run up to 2.1GHz. CPU running at 3.3GHz, the GPU can run up to 2.2GHz.
This seems like it's going to cause an absurd number of complications for development, given the volatility of game code.
 

thelastword

Banned
I really don't understand this.....XBOX fans have 12.1TF and they are still not satisfied, they don't want Sony fans talking about the SSD, the sound chip and all that custom hardware that mitigates the 17% TF divide, they don't want us to talk about the higher clocks that will benefit everything in the rendering pipeline......They just want us to drone out 12.1 > 10.3.....Hell, some even insist that we should say 12.1 > 9.2TF......Even after the official reveal, even with the official specs out, they resort to FUD........
 
I really don't understand this.....XBOX fans have 12.1TF and they are still not satisfied, they don't want Sony fans talking about the SSD, the sound chip and all that custom hardware that mitigates the 17% TF divide, they don't want us to talk about the higher clocks that will benefit everything in the rendering pipeline......They just want us to drone out 12.1 > 10.3.....Hell, some even insist that we should say 12.1 > 9.2TF......Even after the official reveal, even with the official specs out, they resort to FUD........
This sounds like a personal problem; my professional suggestion would be to take a few weeks off and do some research.
 

MurfHey

Member
I really don't understand this.....XBOX fans have 12.1TF and they are still not satisfied, they don't want Sony fans talking about the SSD, the sound chip and all that custom hardware that mitigates the 17% TF divide, they don't want us to talk about the higher clocks that will benefit everything in the rendering pipeline......They just want us to drone out 12.1 > 10.3.....Hell, some even insist that we should say 12.1 > 9.2TF......Even after the official reveal, even with the official specs out, they resort to FUD........
This is the problem with fanboys. The same can be said about other products: cars, TVs, phones.

They will defend it as much as they can because brand loyalty is more important than the actual device. They get off on what's bigger and faster. It's toxic as all get-out. They rely on comparison videos to justify spending X dollars on the new box instead of just enjoying it. Who cares if it loads faster? Who cares if it has a higher resolution? I still play my N64 and PS3 and enjoy them very much... people need to just enjoy their games and be thankful they are healthy!

 
Good question
...below
Since the CPU and GPU cannot both be fully boosted at the same time, and because you cannot sustain max boost indefinitely, those specs are purposefully misleading.
... below
The CPU and GPU cannot both run at their peak frequency at the same time. If the CPU is running at 3.5GHz, then the GPU can't run at 2.23GHz.
Cerny seemed to imply that both the CPU and GPU would be running at max speed most of the time, suggesting that only the use of certain power-hungry instructions might cause slight drops in frequency, in a predictable manner.
 
I really don't understand this.....XBOX fans have 12.1TF and they are still not satisfied, they don't want Sony fans talking about the SSD, the sound chip and all that custom hardware that mitigates the 17% TF divide, they don't want us to talk about the higher clocks that will benefit everything in the rendering pipeline......They just want us to drone out 12.1 > 10.3.....Hell, some even insist that we should say 12.1 > 9.2TF......Even after the official reveal, even with the official specs out, they resort to FUD........
The sound chip and SSD are not going to give you any ability to mitigate the power difference. They just aren't.
Sure, the higher clocks can do things like improve pixel fill rate, but it doesn't match the fill rate of the XSX; it's still less. It does close the gap some.
The gap between the two systems is only 18%, smaller than between the PS4 and Xbone, and smaller than between the Pro and One X.
Sony peeps shouldn't be trying to argue that the SSD is going to give the PS5 any graphics advantage over the XSX, because that is what you have called FUD.
What you should argue, and I would agree, is that you won't really be able to tell the difference between the two consoles. Most PS5 titles will run 4K60, and any additional improvements in the XSX version won't really be noticeable unless DF are zooming in and comparing.
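Where the 18% comes from, if anyone wants to check the math:

# XSX compute over PS5 compute, using the official TF figures
print((12.15 / 10.28 - 1) * 100)   # ~18.2%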
 
...below

... below

Cerny seemed to imply that both the CPU and GPU would be running at max speed most of the time, suggesting that only the use of certain power-hungry instructions might cause slight drops in frequency, in a predictable manner.
He also showed AMD SmartShift. He said that any unused CPU power could be shifted to the GPU to allow it to get some extra performance. And considering the max GPU speed is 2.23GHz, that implies it won't run that high if there is no leftover CPU headroom.
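AMD hasn't published how SmartShift actually decides, but the idea he described is simple enough to sketch (every wattage below is invented for illustration):

# Toy SmartShift: hand any unused CPU power budget to the GPU each tick.
TOTAL_W = 200.0
CPU_MAX_W, GPU_MAX_W = 60.0, 180.0

def allocate(cpu_demand_w):
    cpu_w = min(cpu_demand_w, CPU_MAX_W)
    gpu_w = min(TOTAL_W - cpu_w, GPU_MAX_W)  # GPU gets whatever the CPU left
    return cpu_w, gpu_w

print(allocate(60.0))  # (60.0, 140.0) - busy CPU, GPU held back
print(allocate(35.0))  # (35.0, 165.0) - idle CPU headroom shifted to GPU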
 

thelastword

Banned
The sound chip and SSD are not going to give you any ability to mitigate the power difference. It just isn't.
Sure the higher clocks can do things like improve pixel fill rates, but it doesn't match the fill rate of the XSX, it's still less. It does close the gap some.
The gap between the two systems is only 18%, smaller than the PS4 and Xbone, and smaller than the Pro and One X.
Sony peeps shouldn't be trying to argue that the SSD is going to give the PS5 any graphics advantage over the XSX, because that is what you have called FUD.
What you should argue, and I would agree with, is that you won't really be able to tell the difference between the two consoles. Most PS5 titles will run 4k60, and any additional improvements to the XSX version won't really be noticeable unless DF are zooming in and comparing.
Even now there is some truth to that, where I've seen higher resolutions come out blurrier, like RE2, DMC5 and RE3, which are all blurrier on the XBONEX because of the AA method......Remember Quincunx on the PS3? We never heard the end of it......So in a number of cases the final image is really a toss-up based on AA and PP, so the winner most times is framerate, because I'll tell you something.....I have a 4K monitor, I can go from 4K to 1080p on games......You can adjust to resolution much quicker than you can adjust to slower frames.....

Try playing a game at 4K 30fps and then play it at 1440p or 1080p 60fps; it's immediately noticeable, because after the first few minutes of looking at the image at max, when you are playing for hours on end or even the first hour....how it responds is paramount...

Ideally, high rez + high frames + best effects and graphics quality is what I prefer......Looking back from 2013, the console which did this most consistently was the base PS4.....The PRO doesn't have enough bandwidth most times to chase resolutions as high as the XBONEX, but within its confines it maintains a more consistent framerate, so it's doing its job quite fine too....

I think next gen we can all be sure that Cerny has ensured the PS5 will be a 4K native machine; he's put everything in that console to ensure that......I don't think we will see 4K on Series X and 1800p on PS5 as people are saying; both have enough bandwidth to do 4K, and each will shine relative to where it is strong.....PS5 may win lots of framerate, texture, loadtime and sound faceoffs, based on how fast its pipelines are and how it's designed......and for first parties, gameplay/level design may also favor PS5......Series X will win its share too based on its extra CUs; maybe some elements of raytracing may be better there....Even then, the RT thing we are not sure about yet: how is the workload lifted, more CUs vs higher clocks? Most developers seem to be having no problem on PS5, so I would assume PS5 will be very competent in RT, seeing what Cerny said of certain games in his GDC presentation and what the Quantum Error and Godfall guys have been saying......And let's not forget, Polyphony Digital will school everyone with raytracing......

At this point I weep for what a potential Driveclub 2 would look like on PS5......I was always impressed with how fast I could redo a race in DC, how fast the menus were, and how expansive and detailed the tracks were in DC on PS4......I look at the PS5 and this thing looks like it was built to make Driveclub 2......Sheds a tear....
 

semicool

Banned
Even now there is some truth to that, where I've seen higher resolutions come out blurrier, like RE2, DMC5 and RE3, which are all blurrier on the XBONEX because of the AA method......Remember Quincunx on the PS3? We never heard the end of it......So in a number of cases the final image is really a toss-up based on AA and PP, so the winner most times is framerate, because I'll tell you something.....I have a 4K monitor, I can go from 4K to 1080p on games......You can adjust to resolution much quicker than you can adjust to slower frames.....

Try playing a game at 4K 30fps and then play it at 1440p or 1080p 60fps; it's immediately noticeable, because after the first few minutes of looking at the image at max, when you are playing for hours on end or even the first hour....how it responds is paramount...

Ideally, high rez + high frames + best effects and graphics quality is what I prefer......Looking back from 2013, the console which did this most consistently was the base PS4.....The PRO doesn't have enough bandwidth most times to chase resolutions as high as the XBONEX, but within its confines it maintains a more consistent framerate, so it's doing its job quite fine too....

I think next gen we can all be sure that Cerny has ensured the PS5 will be a 4K native machine; he's put everything in that console to ensure that......I don't think we will see 4K on Series X and 1800p on PS5 as people are saying; both have enough bandwidth to do 4K, and each will shine relative to where it is strong.....PS5 may win lots of framerate, texture, loadtime and sound faceoffs, based on how fast its pipelines are and how it's designed......and for first parties, gameplay/level design may also favor PS5......Series X will win its share too based on its extra CUs; maybe some elements of raytracing may be better there....Even then, the RT thing we are not sure about yet: how is the workload lifted, more CUs vs higher clocks? Most developers seem to be having no problem on PS5, so I would assume PS5 will be very competent in RT, seeing what Cerny said of certain games in his GDC presentation and what the Quantum Error and Godfall guys have been saying......And let's not forget, Polyphony Digital will school everyone with raytracing......

At this point I weep for what a potential Driveclub 2 would look like on PS5......I was always impressed with how fast I could redo a race in DC, how fast the menus were, and how expansive and detailed the tracks were in DC on PS4......I look at the PS5 and this thing looks like it was built to make Driveclub 2......Sheds a tear....
Wat?
 
Even now there is some truth to that, where I've seen higher resolutions come out blurrier, like RE2, DMC5 and RE3, which are all blurrier on the XBONEX because of the AA method......Remember Quincunx on the PS3? We never heard the end of it......So in a number of cases the final image is really a toss-up based on AA and PP, so the winner most times is framerate, because I'll tell you something.....I have a 4K monitor, I can go from 4K to 1080p on games......You can adjust to resolution much quicker than you can adjust to slower frames.....

Try playing a game at 4K 30fps and then play it at 1440p or 1080p 60fps; it's immediately noticeable, because after the first few minutes of looking at the image at max, when you are playing for hours on end or even the first hour....how it responds is paramount...

Ideally, high rez + high frames + best effects and graphics quality is what I prefer......Looking back from 2013, the console which did this most consistently was the base PS4.....The PRO doesn't have enough bandwidth most times to chase resolutions as high as the XBONEX, but within its confines it maintains a more consistent framerate, so it's doing its job quite fine too....

I think next gen we can all be sure that Cerny has ensured the PS5 will be a 4K native machine; he's put everything in that console to ensure that......I don't think we will see 4K on Series X and 1800p on PS5 as people are saying; both have enough bandwidth to do 4K, and each will shine relative to where it is strong.....PS5 may win lots of framerate, texture, loadtime and sound faceoffs, based on how fast its pipelines are and how it's designed......and for first parties, gameplay/level design may also favor PS5......Series X will win its share too based on its extra CUs; maybe some elements of raytracing may be better there....Even then, the RT thing we are not sure about yet: how is the workload lifted, more CUs vs higher clocks? Most developers seem to be having no problem on PS5, so I would assume PS5 will be very competent in RT, seeing what Cerny said of certain games in his GDC presentation and what the Quantum Error and Godfall guys have been saying......And let's not forget, Polyphony Digital will school everyone with raytracing......

At this point I weep for what a potential Driveclub 2 would look like on PS5......I was always impressed with how fast I could redo a race in DC, how fast the menus were, and how expansive and detailed the tracks were in DC on PS4......I look at the PS5 and this thing looks like it was built to make Driveclub 2......Sheds a tear....
I saw RDR2 running side by side on Pro and X, and if I'm honest I couldn't see a difference.
I think without a doubt the PS5 will do 4K, as the X did in a lot of instances, and the PS5 will be the equivalent of 15-16 TFLOPs of the X's architecture, while the CPU can do 60fps easily compared to the Jaguar CPU.
 

SonGoku

Member
How much does the CPU clock down when the GPU is boosted?
Depends on the load, actually. Both the GPU & CPU can remain at peak frequency simultaneously; in fact, they'll sustain peak frequency, or close to it, most of the time.
Frequency changes are triggered by loads: type and intensity.

Also don't forget that since the clocks are capped, they have the power reserves to go higher, meaning there's a buffer of power available*
*
Just in case you aren't aware: frequency drops when the power allocated to the GPU/CPU isn't enough to sustain the load.
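That's the key difference from PC-style boost: per Cerny, the clock is a deterministic function of workload activity, not of chip temperature, so every console behaves identically. A minimal sketch of that idea, with made-up wattages:

GPU_MAX_GHZ = 2.23
GPU_POWER_CAP_W = 180.0   # invented cap; Sony hasn't given one

def gpu_clock(power_at_max_clock_w):
    if power_at_max_clock_w <= GPU_POWER_CAP_W:
        return GPU_MAX_GHZ  # typical case: full clock is sustained
    # Over budget: pull the clock down until the load fits. The real curve is
    # steeper than this linear scaling, since voltage drops with frequency too.
    return GPU_MAX_GHZ * GPU_POWER_CAP_W / power_at_max_clock_w

print(gpu_clock(170.0))  # 2.23 - typical load holds peak
print(gpu_clock(195.0))  # ~2.06 - power-hungry instruction mix dips a few %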
 

Kagey K

Banned
To go back to what I was saying earlier, this is a STAGGERING difference lol



The difference between the PS5 and XseX will likely be negligible at best, and a slight lead for the XseX at worst ;)

That Xbox video is exactly when I dropped PC and went to console. I got tired of upgrading cards and fucking around with drivers.

I was at a party one night and a couple of guys had just got an Xbox and were beating each other at DOA. After playing a few rounds I was convinced. My PC was trash and I was wasting more money investing further into it.

The console was barely more expensive than the Voodoo FX card at the time, and it gave way more back in the long run.
 

Kagey K

Banned
Kagey K Do you still feel the same way about PC gaming? Now that it's much more user-friendly (an HD 7970/2600K combo was solid for most of the gen).
Haven’t gone back. I work on a PC all day and don’t want to sit in front of one at home as well.

I've dabbled a bit since then, as I've had decent laptops, but I can't see myself getting back into the building scene and chasing those high-end specs again.

It's not like it was back then, when I was playing Hexen/Heretic/Descent vs 2D platformers on console.

Maybe if PC took a giant leap over consoles again and were delivering something other than better graphics on the same games, I’d come back.
 

Neo_game

Member
I don't think Cerny is a liar, but he certainly is being told what to say. But ultimately, the specs Sony released for the APU are max boosted clocks.

Since the CPU and GPU cannot both be fully boosted at the same time, and because you cannot sustain max boost indefinitely, those specs are purposefully misleading.

He is the system architect, so nobody needs to tell him what to say about PS5 specs. Moreover, this was recorded for GDC, and most gamers do not necessarily want to listen to a 50-minute lecture; it was not necessarily marketed towards them. Also, he clearly said the drop in GPU clocks is going to be only a couple of percent. There is no game that uses 100% of the power of either the GPU or CPU all the time, let alone both at the same time.
 

-kb-

Member
I don't think Cerny is a liar, but he certainly is being told what to say. But ultimately, the specs Sony released for the APU are max boosted clocks.

Since the CPU and GPU cannot both be fully boosted at the same time, and because you cannot sustain max boost indefinitely, those specs are purposefully misleading.

I'm failing to see where any of the information in your post has come from, other than thin air. We have no idea about the relation between the CPU and GPU clocks, or what the power draw levels even are. Anyone who says they know at this point in time is lying, or has internal leaks.

We will have to wait and see how they relate.
 

Kagey K

Banned
I'm failing to see where any of the information in your post has come from, other than thin air. We have no idea about the relation between the CPU and GPU clocks, or what the power draw levels even are. Anyone who says they know at this point in time is lying, or has internal leaks.

We will have to wait and see how they relate.
So let’s say he’s right? What are the repercussions?

What if he’s wrong? What are the repercussions?

Can you even explain the difference?
 
That Xbox video is exactly when I dropped PC and went to console. I got tired of upgrading cards and fucking around with drivers.

I was at a party one night and a couple of guys had just got an Xbox and were beating each other at DOA. After playing a few rounds I was convinced. My PC was trash and I was wasting more money investing further into it.

The console was barely more expensive than the Voodoo FX card at the time, and it gave way more back in the long run.
Did they still even make the Voodoo FX at that time? I think I had a GeForce2 GTS and upgraded to a 9700 Pro when the OG Xbox came out.
 

Numenorean

Member
I'm a PlayStation guy, but I'm intrigued by the differences in both architectures. In the end it comes down to price. All that supposed Xbox power has to cost more, unless Microsoft wants to lose big money for the first 3 years at least.
 