These numbers are not true.
Power does not scale linearly with clock frequency, and in particular, the GPU and CPU scale very differently - the implied ratios suggest the same scaling for both, which doesn't make sense.
Thank you for emphasizing how ridiculous the claim is.
Yep, this is my point: compared to what we have been told by Cerny, these numbers are completely ridiculous and basically paint the picture of a completely different machine.
About your point on non-linearity, of course you are correct; the relation between the two is cubic. You might want to watch the video segment I attach below.
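To see why cubic matters, here is a minimal sketch of the standard dynamic power model, P = C * V^2 * f. Assuming voltage scales roughly linearly with frequency near the top of the curve (a simplification for illustration, not actual PS5 silicon data), power grows with roughly the cube of frequency:

```python
# Toy model of dynamic power vs. clock frequency.
# Assumption: voltage tracks frequency linearly near the top of the
# voltage/frequency curve, so P = C * V^2 * f ~ f^3 (the constant C
# cancels out when comparing ratios).

def relative_power(f_ratio: float) -> float:
    """Power relative to baseline for a given frequency ratio."""
    v_ratio = f_ratio  # simplifying assumption: V ~ f
    return v_ratio ** 2 * f_ratio

for pct in (100, 98, 95, 90):
    f = pct / 100
    print(f"{pct}% clock -> {relative_power(f) * 100:.1f}% power")
```

Under this model a 5% clock reduction already cuts power by about 14%, which is exactly why linear-looking ratios in the numbers under discussion don't add up.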
No, it wouldn't mean those things.
If the profiles are real, then those are modes you can test; that's all it means.
1. My math is good, and these are the percentages.
2. If by "test" you mean "develop", then I agree.
There is no "if this is true". It's patently false.
Saying shit like "If this is true [blah blah blah] disappointing" lends more credence to this abject horseshit than it deserves.
He specifically stated that modulating frequency by a few percent is enough to keep it within the targeted power envelope.
Which, as anyone who has actually seen a voltage/frequency curve will know, is a fact.
Frequency increases near or over the top of the hardware's limit, though, do little for performance and only add heat and power consumption.
And we know that both the CPU and GPU are clocked to their thermal limits, per Cerny.
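To put rough numbers on "a few percent", using the same cubic toy model sketched above (an illustrative assumption, not measured data):

```python
# Power saved by small clock drops under the cubic toy model (P ~ f^3).
for drop_pct in (1, 2, 3, 5):
    f = 1 - drop_pct / 100
    saving = (1 - f ** 3) * 100
    print(f"-{drop_pct}% clock -> ~{saving:.0f}% less power")
```

So a 2-3% downclock frees up roughly 6-9% of the power budget, which is consistent with what Cerny described.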
So, for all of you,
here is a part of the second PlayStation 5 article DF published, about three weeks after Cerny's presentation and after having talked to developers.
You will see that DF has been told by developers that they are throttling back the CPU in order to keep the sustained GPU clockspeed. And these are crossgen games we are talking about. In the written article that accompanies the video, it was noted that there are some questions about how this will pan out as development moves to next-gen games, where the reliance on the worthless Jaguar CPUs will be cut.
Furthermore, even if DF is being silky smooth in what they are saying here, I have included in this video the part where they compare the performance of a 36 CU GPU at 2.1GHz with another GPU that has only 4 CUs more and runs at 200+MHz less. Just 4 extra CUs were enough to more than compensate for the big difference in clockspeed.
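For a rough sense of why 4 CUs can make up for the clock deficit, here is a back-of-the-envelope peak-throughput comparison. I am assuming RDNA-style CUs (64 shader ALUs each, 2 FLOPs per ALU per clock via FMA) and reading "200+MHz less" as roughly 1.9GHz; neither figure is a quoted spec:

```python
# Back-of-the-envelope peak FP32 throughput.
# Assumptions: 64 ALUs per CU, 2 FLOPs per ALU per clock (FMA);
# 1.9 GHz is my estimate for "200+ MHz less than 2.1 GHz".

def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000  # ops/clock * GHz -> TFLOPS

print(f"36 CU @ 2.1 GHz: {tflops(36, 2.1):.2f} TFLOPS")  # ~9.68
print(f"40 CU @ 1.9 GHz: {tflops(40, 1.9):.2f} TFLOPS")  # ~9.73
```

On paper the two parts are nearly identical, so if the wider, slower GPU pulls ahead in practice, that says real performance stops tracking clock speed near the top of the curve.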
Also worth noting is that Leadbetter specifically asked Cerny if there is a limit on the lowest possible frequencies, and Cerny did not answer.