
Xbox Series X’s BCPack Texture Compression Technique 'might be' better than the PS5’s Kraken

Variable Frequency for CPU and GPU 🤭

Most of the time. You probably didn't listen to what Cerny said, maybe even deliberately. Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies. Both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. Based on NXG's analysis, it's about 50 MHz. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
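Here's a rough back-of-the-envelope sketch of why a small downclock saves so much power. A minimal sketch, assuming the textbook model where dynamic power scales with frequency times voltage squared and voltage scales roughly with frequency; the PS5's real voltage/frequency curve isn't public, so the numbers are illustrative only:

# Hedged sketch: dynamic power ~ f * V^2, with V roughly proportional to f,
# so power scales roughly with f^3. Illustrative, not official PS5 data.
def relative_power(clock_scale):
    """Relative dynamic power after scaling the clock by clock_scale."""
    return clock_scale ** 3  # f * V^2 with V ~ f

drop = 0.98  # the ~2% downclock Cerny mentioned
print(f"2% clock drop -> ~{(1 - relative_power(drop)) * 100:.1f}% less power")
# -> ~5.9% under this crude f^3 model; Cerny's ~10% claim just implies the
#    real curve is even steeper near the top clock.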

Btw. welcome to GAF.
 

oldergamer

Member
While I think MS does have better compression than Sony, and I'm guessing they might be able to avoid copying the decompressed texture from the SSD to video memory (if that is even possible), I don't know if I see this as a dark horse. If it saves a ton of bandwidth or has some other benefit I could be wrong, but we need more info.
 

JägerSeNNA

Banned
Most of the time. You probably didn't listen to what Cerny said, maybe even deliberately. Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies. Both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. Based on NXG's analysis, it's about 50 MHz. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
According to MS, cloud computing allegedly would be a game changer and eSRAM was a revolution back in 2013. All of that became a big nothing. Don't come to me with "according to Cerny" plus NXGamer, please no. You have to tell me first of all: how are they going to keep this thing cool at those speeds when both the CPU and GPU are maxed out? Waiting for a logical answer. Don't come to me please with "but someone said that".
 

RaySoft

Member
No matter how they spin it, the only thing that matters is this: how long does it take to load the same losslessly compressed texture from the SSD until it's readily available in memory, uncompressed?
That's the true benchmark of the whole system (SSD bandwidth, compression ratios, I/O controllers, RAM bandwidth, and h/w decompressors).
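As a toy illustration of that benchmark (the effective rates below are assumptions pieced together from figures floating around this thread and public specs, not measurements):

# Hypothetical load-time comparison; effective rates (GB/s of uncompressed
# data landing in RAM) are assumed, not measured.
ASSETS_GB = 10.0  # uncompressed texture data we want resident in RAM

effective_rates = {
    "PS5 / Kraken (typical)": 9.0,     # the 8-9 GB/s average often quoted
    "PS5 / Kraken (best case)": 22.0,  # Cerny's quoted peak
    "XSX / BCPack (quoted)": 4.8,      # MS's 2.4 GB/s raw at ~2:1
}

for system, rate in effective_rates.items():
    print(f"{system}: {ASSETS_GB / rate:.2f} s to load {ASSETS_GB:.0f} GB")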

I get the notion that Cerny is delivering more real-world numbers in his presentation, whereas MS usually provides max theoretical numbers (my opinion).
Devs have come out and said they've gotten better compression ratios than Mark stated, around 10-11 GB/s.
But I guess Cerny's numbers are better to use, since they will probably be a more accurate representation of what to expect most of the time (hence why they were used in the presentation).

Either way.. even IF BCPack does have a better compression ratio than Kraken, that would only be one link in the chain, and as we all know, a chain is only as strong as its weakest link.
 

Deto

Banned
According to MS, cloud computing allegedly would be a game changer and eSRAM was a revolution back in 2013. All of that became a big nothing. Don't come to me with "according to Cerny" plus NXGamer, please no. You have to tell me first of all: how are they going to keep this thing cool at those speeds when both the CPU and GPU are maxed out? Waiting for a logical answer. Don't come to me please with "but someone said that".


Microsoft never talked about eSRAM being revolutionary. I know, you want to compare SRAM with an SSD?
See, you read Microsoft's mind, then Sony's.
By that logic, Sony doesn't even release AAA SP games, because MS doesn't either.
 
According to MS, cloud computing allegedly would be a game changer and eSRAM was a revolution back in 2013. All of that became a big nothing. Don't come to me with "according to Cerny" plus NXGamer, please no. You have to tell me first of all: how are they going to keep this thing cool at those speeds when both the CPU and GPU are maxed out? Waiting for a logical answer. Don't come to me please with "but someone said that".

Cerny is a designer and system architect, and beyond that. How are they going to keep this thing cool? Keeping the CPU and GPU frequency high most of the time has nothing to do with thermals. Cerny explained that during the GDC presentation. Yeah, looks like you didn't listen to what Cerny said about the cooling solution. We are ending this discussion right now.
 

ZywyPL

Banned
Both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap

That's the thing - most 3rd party games are fucking horrible when it comes to optimization, not to mention various bugs and crashes, so there is a huge risk/possibility those games will blow past the workload cap, effectively making the PS5 drop its clocks on both the CPU and GPU, not vice versa, while providing nothing in graphical fidelity in return. Take Doom vs Fallout for example: same hardware, vastly different results.
 

JägerSeNNA

Banned
Cerny is a designer and system architect, and beyond that. How are they going to keep this thing cool? Keeping the CPU and GPU frequency high most of the time has nothing to do with thermals. Yeah, looks like you didn't listen to what Cerny said about the cooling solution. We are ending this discussion right now.
The people who claimed that cloud computing would be a game changer were also Xbox engineers. 😂 Yes, I watched his tech talk. If it was only a 2% reduction in core speeds, he wouldn't even mention it. Believe what you want to believe.
 

StreetsofBeige

Gold Member
Variable Frequency for CPU and GPU 🤭
Most of the time. Since you probably didn't listen what Cerny said, maybe even deliberately. Cerny stated clearly they expect both CPU and GPU to spend most of their time at their top frequencies. Both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. It that happens, downclock will be minor. Based NXG from his analysis, it's about 50 Mhz. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT 10% drop in performance.

Btw. welcome to GAF.
[GIF]
 
The people who claimed that cloud computing would be a game changer were also Xbox engineers. 😂 Yes, I watched his tech talk. If it was only a 2% reduction in core speeds, he wouldn't even mention it. Believe what you want to believe.
Major Nelson surely isn't an engineer, yet he talked about the Power Of The Cloud:


Btw. are these engineers???

[image: XboxOneCloud_04.jpg]


Percentages matter, and he said a few percent; that's also why he mentioned that the clocks are variable. Otherwise, it would be misleading.

Oh, you again with that crap. I explained that to you yesterday, yet you keep providing the same crap over and over, Xbox fan!
 

StreetsofBeige

Gold Member
Most of the time. You probably didn't listen to what Cerny said, maybe even deliberately. Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies. Both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. Based on NXG's analysis, it's about 50 MHz. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.

Btw. welcome to GAF.
Oh, you again with that crap. I explained that to you yesterday, yet you keep providing the same crap over and over, Xbox fan!
How is it crap?

As Cerny and even you say, PS5 CPU/GPU may have to adjust when things get too tough, as there's a power cap.

How can you downplay the GIF when you're even going off about MHz and clock cycle reductions? If the system is so great, just have both the CPU and GPU run at max speed and not worry about thermals like every other console does.

Also, you still haven't given a definition of what "most of the time" means. And Cerny beat around the bush too. A vague claim which can mean anything.
 

semicool

Banned
The best-case scenario Cerny talks about is 22 GB/s.

8-9 GB/s is the average/typical.
What about the memory bus feeding the XSX GPU at a MUCH HIGHER bandwidth? That's 560 GB/s vs 448 GB/s, a 112 GB/s advantage for the X feeding the memory pipeline, versus a 22 GB/s best case (really 8-9, like you stated) vs 6 GB/s for the PS5. That's a delta of 112 GB/s for the X versus a likely delta of 3 GB/s (9 - 6) for the PS5... much more of an advantage for the Series X. Not all apples to apples, but we're still talking about the memory system setup and bandwidth with the same destinations. I don't think the XSX SSD will be as much of a bottleneck as the PS5's RAM bandwidth will be for it, IMO.
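Laying those quoted numbers side by side (taking the figures at face value; whether a RAM delta and an SSD delta are even comparable is the crux of the argument):

# The deltas as quoted in the post above; all units are GB/s.
xsx_ram_bw, ps5_ram_bw = 560, 448  # GPU-facing RAM bandwidth (XSX fast pool)
ps5_ssd, xsx_ssd = 9, 6            # rough effective SSD rates assumed above

print(f"RAM delta (XSX advantage): {xsx_ram_bw - ps5_ram_bw} GB/s")  # 112
print(f"SSD delta (PS5 advantage): {ps5_ssd - xsx_ssd} GB/s")        # 3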
 

Goliathy

Banned
How is it crap?

As Cerny and even you say, PS5 CPU/GPU may have to adjust when things get too tough, as there's a power cap.

How can you downplay the GIF when you're even going off about MHz and clock cycle reductions? If the system is so great, just have both the CPU and GPU run at max speed and not worry about thermals like every other console does.

Also, you still haven't given a definition of what "most of the time" means. And Cerny beat around the bush too. A vague claim which can mean anything.

I think what he meant by "this crap" is that the SSD is even included there. Because the SSD is not even close to as important as the CPU and GPU.

Think about it: take the base XBOX ONE, don't change the CPU and GPU, just swap in an SSD. Will there be a significant change other than loading times and maybe less pop-in? No.

Now change the CPU and GPU, will there be a significant change? HELL YEAH. HUGE difference.
Even just changing the CPU will change games DRASTICALLY.

The SSD is nice for loading times, a snappier OS, and maybe for open world games. That's it. CPU and GPU. That's where the important stuff is.
 

StreetsofBeige

Gold Member
I think what he meant by "this crap" is that the SSD is even included there. Because the SSD is not even close to as important as the CPU and GPU.

Think about it: take the base XBOX ONE, don't change the CPU and GPU, just swap in an SSD. Will there be a significant change other than loading times and maybe less pop-in? No.

Now change the CPU and GPU, will there be a significant change? HELL YEAH. HUGE difference.
Even just changing the CPU will change games DRASTICALLY.

The SSD is nice for loading times, a snappier OS, and maybe for open world games. That's it. CPU and GPU. That's where the important stuff is.
I agree.

And most games aren't even open world with giant landscapes and tons of NPCs/monsters lurking every 50 ft.

Also, it's not like SeX doesn't have an SSD. It has one too, with its own decompression algorithms. It's just that the raw speeds are lower. But let's not make it out like the PS5 has a 5.5 GB/s SSD and SeX is running a 5400 rpm drive.

Who knew a 3.1 GB/s gap in SSD speed would be the holy grail of next-gen gaming.
 

Dory16

Banned
So on one hand the PS5 is stronger than the XSX, because the XSX is merely bruteforcing and the PS5 is much more elegant..... buuuuuut on the other hand, the XSX is faster than the PS5, because PS5 is merely bruteforcing and the XSX is much more elegant.

Or maybe.. just maybe.. both machines are incredibly well optimized with different strengths, where Sony has opted for a bit slower APU with an extreme SSD, while MS has chosen a bit faster APU with a slightly less extreme SSD.
It’s certainly the first time in console history that I see persistent storage even being considered as a power ingredient. On PCs
Most of the time. You probably didn't listen to what Cerny said, maybe even deliberately. Cerny stated clearly that they expect both the CPU and GPU to spend most of their time at their top frequencies. Both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. Based on NXG's analysis, it's about 50 MHz. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.

Btw. welcome to GAF.
First of all, he's making the PS5 consume the same number of watts every time it's on so the cooling can be calibrated. That's already bad for your electricity bill. You should want him to come up with a dynamic and reliable cooling solution instead. You're paying for it.

Secondly, I expect a manufacturer to KNOW where the CPU and GPU of the product they're selling me for $400-500 will spend their entire time, or at least most of their time. It's their job to know. It's not enough to tell me that you EXPECT the car you're selling me to reach 140 mph when I press on the gas. Especially since the technology isn't new: they have adapted AMD SmartShift, which is in high-end laptops right now. Simulate the load and tell me what happens. If it's vague, then there's the plague.
 
How is it crap?

As Cerny and even you say, PS5 CPU/GPU may have to adjust when things get too tough, as there's a power cap.

How can you downplay the GIF when you're even going off about MHz and clock cycle reductions? If the system is so great, just have both the CPU and GPU run at max speed and not worry about thermals like every other console does.

Also, you still haven't given a definition of what "most of the time" means. And Cerny beat around the bush too. A vague claim which can mean anything.

It's crap since you Xbone fans are trying to push the narrative that the PS5 is a 9.2 TF console, which is FUD in every single way.

I'll just copy the same crap which I told you a few days ago:

Like Cerny stated, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
If a game can sustain max resolution most of the time (like FC5 on X1X is at 4K most of the time; even John Linneman couldn't notice drops in resolution, yet VGTech did), then surely the console will be at its power peak most of the time. Like in the bunch of DF and NXG game comparisons where it was mentioned so often: resolution drops are a rarity (depending on the platform).
 
What about the memory bus feeding the XSX GPU at a MUCH HIGHER bandwidth? That's 560 GB/s vs 448 GB/s, a 112 GB/s advantage for the X feeding the memory pipeline, versus a 22 GB/s best case (really 8-9, like you stated) vs 6 GB/s for the PS5. That's a delta of 112 GB/s for the X versus a likely delta of 3 GB/s (9 - 6) for the PS5... much more of an advantage for the Series X. Not all apples to apples, but we're still talking about the memory system setup and bandwidth with the same destinations. I don't think the XSX SSD will be as much of a bottleneck as the PS5's RAM bandwidth will be for it, IMO.

Hello, hello, I'm an XSX and I have another 3.5 GB for games, and it's running at 336 GB/s.
 
it_wasn't-me is having a very hard time coping with reality, it seems. It's ok, dude. The PS5 is still a solid machine that will produce good games.
 

KINGMOKU

Member
So on one hand the PS5 is stronger than the XSX, because the XSX is merely bruteforcing and the PS5 is much more elegant..... buuuuuut on the other hand, the XSX is faster than the PS5, because PS5 is merely bruteforcing and the XSX is much more elegant.

Or maybe.. just maybe.. both machines are incredibly well optimized with different strengths, where Sony has opted for a bit slower APU with an extreme SSD, while MS has chosen a bit faster APU with a slightly less extreme SSD.
What the hell is wrong with you? Are you trying to destroy the insane discussion happening, and the console wars, with your rational and reasonable take?

You're a monster.
 

StreetsofBeige

Gold Member
It's crap since you Xbone fans are trying to push the narrative that the PS5 is a 9.2 TF console, which is FUD in every single way.

I'll just copy the same crap which I told you a few days ago:

Like Cerny stated, both CPU and GPU can work at their top frequency together as long as the total workload does not exceed the power cap. If that happens, the downclock will be minor. It's sufficient to lower clocks by 2% to lower power consumption by 10%. Dropping power consumption by 10% IS NOT a 10% drop in performance.
If a game can sustain max resolution most of the time (like FC5 on X1X is at 4K most of the time; even John Linneman couldn't notice drops in resolution, yet VGTech did), then surely the console will be at its power peak most of the time. Like in the bunch of DF and NXG game comparisons where it was mentioned so often: resolution drops are a rarity (depending on the platform).
Who's saying it's 9.2 TF lately? Since they revealed it maxing out at 10.3 TF with throttling, 9.2 TF seems way understated.

But there's no denying the PS5 has issues with maxing out cores and workload; that's why you and Cerny have been pushing variable speeds and 2% downclocks or whatever, with this theoretically vague "most of the time" statement.

Just accept it. Everyone has. The PS5 can't max out its GHz at full workload like every other console can.

If Sony had designed this system better, with better specs to begin with, you wouldn't even need a console GPU running at 2.23 GHz. No other console (even SeX) has a GPU even touching 2 GHz.
 
Who's saying it's 9.2 TF lately? Since they revealed it maxing out at 10.3 TF with throttling, 9.2 TF seems way understated.

But there's no denying the PS5 has issues with maxing out cores and workload; that's why you and Cerny have been pushing variable speeds and 2% downclocks or whatever, with this theoretically vague "most of the time" statement.

Just accept it. Everyone has. The PS5 can't max out its GHz at full workload like every other console can.

If Sony had designed this system better, with better specs to begin with, you wouldn't even need a console GPU running at 2.23 GHz. No other console (even SeX) has a GPU even touching 2 GHz.

Oh, I've accepted it. EDIT: the GPU and CPU will be at peak frequency most of the time; it adjusts clocks based on the activity of the chip, not temperature.
But Xbox fans should stop spreading FUD about it.
Mind you that for all consoles with "locked" specs, the numbers are theoretical. I think NXG mentioned that in his recent analysis.
 

-Arcadia-

Banned
From a brief skim, is this where we’re really at? Arguing about texture compression methods for 5 pages? : P
 

StreetsofBeige

Gold Member
Mind you that for all consoles with "locked" specs, the numbers are theoretical. I think NXG mentioned that in his recent analysis.
I'm sure they are, even more so for specs which need to downclock so the system doesn't melt down.

You can keep going all day defending this odd variable CPU/GPU throttling, but there's no denying that if Sony had just made a more powerful system right off the bat like MS did, you wouldn't even need a GPU busting out at 2.23 GHz to begin with.

SeX gets by perfectly fine with a GPU at only 1.825 GHz.
 

Gargus

Banned
Ok great, so they will have great compression on their lackluster games. I mean, how good is a mediocre game if it's compressed in a better way? Will it somehow be a good game?
 
I'm sure they are, even more so for specs which need to downclock so the system doesn't melt down.

You can keep going all day defending this odd variable CPU/GPU throttling, but there's no denying that if Sony had just made a more powerful system right off the bat like MS did, you wouldn't even need a GPU busting out at 2.23 GHz to begin with.

SeX gets by perfectly fine with a GPU at only 1.825 GHz.

Defending. LOL! Why do Xbox fans keep spreading FUD then? I presume it's not OK to defend it, but spreading FUD and crap is OK... The GPU and CPU will be at peak frequency most of the time; it adjusts clocks based on the activity of the chip, not temperature.
 
Ok great, so they will have great compression on their lackluster games. I mean, how good is a mediocre game if it's compressed in a better way? Will it somehow be a good game?
Are all 3rd party games mediocre too?

So you saw Halo 6, Hellblade 2, Playground's new RPG, Forza 9, Obsidian's new big game, The Initiative's new game, and all the other 2nd party games MS has deals with? Damn son, you're the next Miss Cleo!

That Ori and the Will of the Wisps sure is a mediocre game with that 90 Metacritic score.
 

icebomb

Banned
Are all 3rd party games mediocre too?

So you saw Halo 6, Hellblade 2, Playground's new RPG, Forza 9, Obsidian's new big game, The Initiative's new game, and all the other 2nd party games MS has deals with? Damn son, you're the next Miss Cleo!

That Ori and the Will of the Wisps sure is a mediocre game with that 90 Metacritic score.
Not enough Netflix 3rd-person action adventures, and every gamer knows only those are good games.
 

quest

Not Banned from OT
Defending. LOL! Why do Xbox fans keep spreading FUD then? I presume it's not OK to defend it, but spreading FUD and crap is OK... The GPU and CPU will be at peak frequency most of the time; it adjusts clocks based on the activity of the chip, not temperature.
What is "most of the time", 51%? Sony won't give numbers like they will for the SSD. Cerny never said 2%, he said "a couple"; he was very clever to avoid specific numbers on the downclock. "Minor", "a couple", and "most of the time" were what he used. Show us the numbers, just like the SSD. What is the max downclock, what workloads cause issues, etc.? Quit hiding the facts like Microsoft in 2013.
 
What is "most of the time", 51%? Sony won't give numbers like they will for the SSD. Cerny never said 2%, he said "a couple"; he was very clever to avoid specific numbers on the downclock. "Minor", "a couple", and "most of the time" were what he used. Show us the numbers, just like the SSD. What is the max downclock, what workloads cause issues, etc.? Quit hiding the facts like Microsoft in 2013.

He said a few/a couple percent. And a few/a couple percent ISN'T 10%; it's from 1 to 3 or 4, and a bunch of tech breakdowns assume a couple means 2 or 3. Based on that, NXG thinks it's around 50 MHz.
 

Imtjnotu

Member
So on one hand the PS5 is stronger than the XSX, because the XSX is merely bruteforcing and the PS5 is much more elegant..... buuuuuut on the other hand, the XSX is faster than the PS5, because PS5 is merely bruteforcing and the XSX is much more elegant.

Or maybe.. just maybe.. both machines are incredibly well optimized with different strengths, where Sony has opted for a bit slower APU with an extreme SSD, while MS has chosen a bit faster APU with a slightly less extreme SSD.
Imma need you to tone down that sound logic of yours.
 
That's why it's "he thinks", because there is nothing more than estimates before we see some actual numbers.

Btw. Michael (NXG) has been working as a software & hardware engineer for a long time. Surely he knows something, better than the rest of us here. It's based on what Cerny himself said about only needing a couple percent of a frequency drop when the system hits its set power limit, a tiny drop which would claw back a huge amount of power (10%). Based on that, NXGamer has calculated around 50 MHz.
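For what it's worth, the arithmetic behind that ~50 MHz estimate is simple (assuming "a couple percent" means roughly 2% off the 2.23 GHz GPU cap):

# Back-of-envelope check of the ~50 MHz figure; the 2% is an assumption.
gpu_cap_mhz = 2230    # PS5 GPU top frequency
drop_fraction = 0.02  # "a couple percent", per Cerny's wording
print(f"~{gpu_cap_mhz * drop_fraction:.0f} MHz downclock")  # ~45 MHz, roughly 50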
 
What about the memory bus feeding the XSX GPU at a MUCH HIGHER bandwidth? That's 560 GB/s vs 448 GB/s, a 112 GB/s advantage for the X feeding the memory pipeline, versus a 22 GB/s best case (really 8-9, like you stated) vs 6 GB/s for the PS5. That's a delta of 112 GB/s for the X versus a likely delta of 3 GB/s (9 - 6) for the PS5... much more of an advantage for the Series X. Not all apples to apples, but we're still talking about the memory system setup and bandwidth with the same destinations. I don't think the XSX SSD will be as much of a bottleneck as the PS5's RAM bandwidth will be for it, IMO.

Thank you for pointing out that it's about the overall pipeline throughput! Pumping up the storage is not the only part that matters. Microsoft pumped up their memory bus.
 

Goliathy

Banned
It's sufficient to lower clocks by 2% to lower power consumption by 10%.


What if you need to drop power consumption by more than 10%? What if the developer wants a consistent experience for the gamer? Wouldn't they design the game around the lowest possible clock under all circumstances?
 
I was once in the group saying my 720p games looked as fine as 1080p games on PS4. We were wrong, but our console was the weaker one that time. I understand why Sony fans are doing this right now. I totally understand.

It's justifiable panic for PlayStation fans. The last 2 AAA SP games launched flopped (Days Gone, Death Stranding), and now the PS5 is doomed to be the weaker console for the next 7 years. That doesn't even get into the idea that MS first party will start pumping out 90+ Metacritic titles on a regular basis by 2021.
 

ethomaz

Banned
What about the memory bus feeding the XSX GPU at a MUCH HIGHER bandwidth? That's 560 GB/s vs 448 GB/s, a 112 GB/s advantage for the X feeding the memory pipeline, versus a 22 GB/s best case (really 8-9, like you stated) vs 6 GB/s for the PS5. That's a delta of 112 GB/s for the X versus a likely delta of 3 GB/s (9 - 6) for the PS5... much more of an advantage for the Series X. Not all apples to apples, but we're still talking about the memory system setup and bandwidth with the same destinations. I don't think the XSX SSD will be as much of a bottleneck as the PS5's RAM bandwidth will be for it, IMO.
Memory bandwidth doesn't make a difference for streaming data from SSD to RAM... the SSD data will use a very small part of the bandwidth.

Like on PS5: if you are constantly streaming data from SSD to RAM at 8-9 GB/s, it will leave ~440 GB/s to be shared by the CPU and GPU.
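A quick sanity check of that point (a simplification that treats streaming traffic as directly subtracting from the shared bus):

# Simplified model: SSD-to-RAM streaming eats into total RAM bandwidth.
ps5_ram_bw = 448   # GB/s, PS5 unified GDDR6
ssd_stream = 8.5   # GB/s, assumed typical compressed streaming rate
print(f"{ps5_ram_bw - ssd_stream:.1f} GB/s left for CPU + GPU")  # 439.5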
 

darkinstinct

...lacks reading comprehension.
Who knew a 3.1 GB/s gap in SSD speed would be the holy grail of next-gen gaming.

When it's all you've got, you cling to it as if your life depends on it. The PS5's SSD speed is Xbox One's CPU clock speed + eSRAM. Everybody knows it won't do anything, but half a year of make-believe and seven years of suffering are better than seven and a half years of suffering. The roles have truly turned around from 2013; when the PS5 ends up being $499 against a $399 XSX because of an expensive and useless SSD, eyeballs will melt. The PS5 SSD is basically this gen's Kinect: a great idea that will get barely any use and just make the console more expensive.

And funnily enough: Microsoft implemented eSRAM in the Xbox One to make up for a lack of memory bandwidth, and now Sony does the same with the PS5. Just another piece of the puzzle that will make the PS5 more expensive while being less powerful.
 
What if you need to drop power consumption by more than 10%? What if the developer wants a consistent experience for the gamer? Wouldn't they design the game around the lowest possible clock under all circumstances?

What if that won't happen? Which, based on what Cerny said and how he designed it, it won't.
 
When it's all you've got, you cling to it as if your life depends on it. The PS5's SSD speed is Xbox One's CPU clock speed + eSRAM. Everybody knows it won't do anything, but half a year of make-believe and seven years of suffering are better than seven and a half years of suffering. The roles have truly turned around from 2013; when the PS5 ends up being $499 against a $399 XSX because of an expensive and useless SSD, eyeballs will melt. The PS5 SSD is basically this gen's Kinect: a great idea that will get barely any use and just make the console more expensive.

And funnily enough: Microsoft implemented eSRAM in the Xbox One to make up for a lack of memory bandwidth, and now Sony does the same with the PS5. Just another piece of the puzzle that will make the PS5 more expensive while being less powerful.

I see, so that's how an Xbox fan thinks. Appropriate.
 

darkinstinct

...lacks reading comprehension.
Btw. Michael (NXG) has been working as a software & hardware engineer for a long time. Surely he knows something, better than the rest of us here. It's based on what Cerny himself said about only needing a couple percent of a frequency drop when the system hits its set power limit, a tiny drop which would claw back a huge amount of power (10%). Based on that, NXGamer has calculated around 50 MHz.

If that were the case, they would just go with fixed clocks. Makes no sense to have a variable rate with such minor changes. Which means he is wrong.
 