
Matrix Awakens Power Consumption Comparison. XSS vs XSX vs PS5.

tommib

Member
Series S beast level:

Cat Kitten GIF by Demic
Adorable. Thinking of getting one of those but my new leather couch! 💀
 

mejin

Member
Ah, power consumption. I remember trolls being very concerned, saying the PS5 wouldn't be able to keep up its clocks or power and would have to downclock, even though Cerny indicated it rarely ever happened and only by a fraction. Good concern-trolling times.
I hope this and the games have finally shut that up.

PS5 has the most optimized customizations thanks to Cerny.

No first party help needed, just plug and play. Beast tech.
 

Tripolygon

Banned
1. I would have thought there would be at least a 20W to 30W difference between the PS5 and XSX but it’s closer to 10W to 15W.

2. Dynamic frequency scaling allows you to get more out of a processor. The XSS would have benefited a lot from it: if they had clocked it higher and used DFS, they would have been able to push the GPU to Xbox One X performance levels while maintaining around 100W power draw (a rough sketch of the clock/power trade-off follows below).

3. Confirming again that the SoC has enough power to maintain both the CPU and GPU at higher clocks if it needs to, even when the engine is compute heavy.

Overall, an excellent design decision on Sony's part to push the SoC and gain more performance out of a smaller chip.
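On that clock/power trade-off: a minimal sketch using the standard first-order CMOS dynamic power relation P ≈ C·V²·f. The effective-capacitance constant and the voltage/frequency pairs are illustrative assumptions, not real PS5/XSS silicon figures:

    # First-order dynamic power model: P ~ C * V^2 * f.
    # The effective-capacitance constant and the voltage/frequency pairs
    # are illustrative assumptions, not real console silicon figures.

    def dynamic_power(cap_eff, voltage, freq_ghz):
        """Relative dynamic power for a given effective capacitance,
        core voltage (V) and clock (GHz)."""
        return cap_eff * voltage ** 2 * freq_ghz

    base  = dynamic_power(cap_eff=1.0, voltage=0.95, freq_ghz=1.565)  # XSS-like clock
    boost = dynamic_power(cap_eff=1.0, voltage=1.05, freq_ghz=1.825)  # higher clock, bumped voltage

    print(f"{boost / base:.2f}x the power for a {1.825 / 1.565:.2f}x clock increase")
    # ~1.42x the power for ~1.17x the clock: frequency scales power linearly, but the
    # voltage needed to hold that frequency scales it quadratically, which is why a
    # DFS governor sheds clock speed when the workload turns power-hungry.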
 

SSfox

Member
Is 220W considered high, low, or medium for a gaming console? I'm not sure, but I think the PS4 was around 80W, so the PS5 is obviously above that, but how is it perceived in general?

EDIT: Also, it seems he is using the PS5 launch model (not the revised model).
 
Last edited:

SlimySnake

Flashless at the Golden Globes
Is 220W considered high, low, or medium for a gaming console? I'm not sure, but I think the PS4 was around 80W, so the PS5 is obviously above that, but how is it perceived in general?

EDIT: Also, it seems he is using the PS5 launch model (not the revised model).
The PS4 Slim might be 80W, but the main SKU was 150 watts. The PS4 Pro was also 150 watts, the X1X topped out at 170, and the PS3 was around 200 watts. Both consoles are pushing the limits of a console TDP, and that's a good thing.

PS5 launch models, and all other models until the slim launches, should consume the same amount of power. All they changed was the cooling solution; the games and the GPU should continue to consume the same amount of power.
 

SSfox

Member
The PS4 Slim might be 80W, but the main SKU was 150 watts. The PS4 Pro was also 150 watts, the X1X topped out at 170, and the PS3 was around 200 watts. Both consoles are pushing the limits of a console TDP, and that's a good thing.

PS5 launch models, and all other models until the slim launches, should consume the same amount of power. All they changed was the cooling solution; the games and the GPU should continue to consume the same amount of power.
PS3 was 200W?? WTF, that's above the PS4's and way above what I thought. Interesting, thanks for the info.
 

SlimySnake

Flashless at the Golden Globes
1. I would have thought there would be at least a 20W to 30W difference between the PS5 and XSX but it’s closer to 10W to 15W.
Some PS5 games go up to 230 watts! What's probably happening here is that the demo is capped at 30 fps, so the XSX and PS5 GPUs are not being maxed out. It would be interesting to see what the power consumption looks like when they are flying through the environment, which is when the framerate consistently drops.
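To illustrate the 30 fps cap point, a toy utilization model; the wattages and frame times are entirely hypothetical, just to show why an uncapped, GPU-bound flythrough would pull more power than a capped scene:

    # Toy model: at a 30 fps cap a GPU that finishes the frame early sits partly
    # idle, so average draw lands between idle and full load. All wattages and
    # frame times here are illustrative assumptions, not measurements.

    def avg_power(idle_w, full_load_w, gpu_busy_ms, frame_budget_ms):
        utilization = min(gpu_busy_ms / frame_budget_ms, 1.0)
        return idle_w + (full_load_w - idle_w) * utilization

    capped   = avg_power(idle_w=90, full_load_w=230, gpu_busy_ms=25.0, frame_budget_ms=33.3)
    uncapped = avg_power(idle_w=90, full_load_w=230, gpu_busy_ms=33.3, frame_budget_ms=33.3)

    print(f"capped 30 fps scene:  ~{capped:.0f} W")   # GPU idles ~25% of each frame
    print(f"GPU-bound flythrough: ~{uncapped:.0f} W") # GPU busy the whole frame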
 

onesvenus

Member
Ah, power consumption. I remember trolls being very concerned, saying the PS5 wouldn't be able to keep up its clocks or power and would have to downclock, even though Cerny indicated it rarely ever happened and only by a fraction. Good concern-trolling times.
I hope this and the games have finally shut that up.
I also remember people saying that SmartShift would enable using less power for the same results as the XSX. It seems that's not true either.
 

Jaysen

Banned
Crazy how well designed the Xbox is on all fronts. Size, power, heat, power draw, noise levels.

PS5 is well designed too but deffo pushes the hardware more.
Yep. The demo runs silently on my Series X. On my PS5 the fan is immediately noticeable, which is fine I guess, but then the coil whine starts ticking like crazy, which tells me I should never get a UE5 game for the PS5.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
PS3 was 200W?? WTF, that's above the PS4's and way above what I thought. Interesting, thanks for the info.
More detail here. The PS4 was a cheaper console designed for an uncertain period for gaming post recession. Gaming is in a much better state today.

Although the PlayStation 3 has a 380 W power supply, it only consumes 170-200 W in game mode with the 90 nm Cell CPU. The newer 40GB PlayStation 3 model, which comes with a 65 nm Cell CPU/90 nm RSX, consumes 120-140 W during normal use. The PlayStation 3 Slim, with a 45 nm Cell CPU/40 nm RSX, consumes 65-84 W during normal use.


In this table you can see exactly how much power each PS3 model consumes in different modes:


Mode                PS3         PS3 Slim
Standby             1.22 W      0.36 W
Idle operation      171.35 W    75.19 W
Blu-ray             172.79 W    80.90 W
YouTube             181.28 W    85.08 W
Play (idle)         200.84 W    95.35 W
Play (full load)    205.90 W    100.04 W



I wrote a similar article for PS4 power consumption, so if you are interested, you can read it: How Much Electricity Does a PS4 Use. If we compare PS3 power consumption with other popular consoles, we can see that the Xbox 360 uses 187 watts in gaming mode, while the Nintendo Wii uses only 40 watts. Let's go a little deeper into the topic.
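For anyone wondering what those wattages mean on an electricity bill, a quick back-of-the-envelope calculation; the hours per day and price per kWh are placeholder assumptions you would swap for your own:

    # Rough yearly running cost from an average gaming power draw.
    # Hours per day and electricity price are placeholder assumptions.

    def yearly_cost(avg_watts, hours_per_day=2, price_per_kwh=0.15):
        kwh_per_year = avg_watts / 1000 * hours_per_day * 365
        return kwh_per_year * price_per_kwh

    for name, watts in [("PS3 (launch)", 200), ("PS3 Slim", 96), ("PS5", 220)]:
        print(f"{name}: ~${yearly_cost(watts):.0f}/year at 2h/day, $0.15/kWh")
    # PS3 (launch) ~$22/year, PS3 Slim ~$11/year, PS5 ~$24/year with these assumptions.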

 

SlimySnake

Flashless at the Golden Globes
People caring about console power draw is the weirdest thing to me.
Why? For techies, stuff like this is fascinating. We have been talking about this for over two years now on this very board.

This thread is from 2019 when the PS5 benchmarks first leaked.


[Image: GPU-only power scaling chart]

[Image: benchmark results chart]


These tests were done on the only RDNA card available back then. It's obvious that RDNA 2.0 brought a massive improvement in perf/watt, which has allowed Sony to go for such high clocks in a console box.

I am too lazy to go back and pull up my posts, but I dismissed the Gonzalo and Oberon rumors by listing these benchmarks, because there was no way an RDNA 1.0 card like Oberon or Gonzalo would ever fit in a console with a CPU, RAM and SSD all consuming power. That would've made it a near 300W console.

This is fascinating stuff, not just for techies but also for what it means for Pro or mid-gen console refreshes. The XSX engineers might want to take a look at the PS5's variable clocks, and higher clocks in general, and see if they are indeed leaving performance on the table. That would mean better gaming performance in the future.
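The "near 300W console" dismissal above is just a component-budget addition. A rough sketch of that kind of estimate; every number below is an assumed budget for illustration, not a measured figure:

    # Back-of-the-envelope system power estimate of the kind used to sanity-check
    # the early RDNA 1.0 rumours. Every budget below is an assumption for illustration.

    component_budget_w = {
        "GPU (RDNA 1.0 at console clocks, est.)": 180,
        "CPU (8-core Zen 2, power-limited)": 40,
        "GDDR6 + SSD + I/O": 35,
        "Fan, PSU losses, misc.": 25,
    }

    total = sum(component_budget_w.values())
    print(f"estimated wall draw: ~{total} W")
    # ~280 W: implausible for a console box, which is why a big perf/watt jump
    # (RDNA 2) had to be part of the real answer.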
 
Last edited:

LordOfChaos

Member
Are people still unaware nearly a year later? Microsoft went with a larger chip at lower clocks; Sony went with a smaller chip and a higher-clocked GPU, so the natural result is that the PS5 uses a bit more power to get where it is: 52 CUs vs 36, and 1.825 GHz vs up to 2.23 GHz. Higher clocks increase power use more than a wider design does. The PS5 also uses an instruction-mix-based rather than a thermal-based variable clock system: the power controller looks at the instruction mix and adjusts clocks based on the expected power draw, sometimes for just milliseconds within a frame, and it can generally always run at peak. The design is meant to avoid having to build the system for the worst-case instruction mix; instead of designing the cooling and everything else for that, they can just drop the clocks for a few milliseconds when those rare worst-case power situations come up.

"Using it better" and "Stressing it more" are both bad takes; this is exactly what we expected once we knew the specs.

Slightly different designs, but as far as I can see, the results are trading blows and are largely very similar.
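A minimal sketch of the deterministic, activity-based clock behaviour described above; the budget, clock range and power model are invented for illustration, and the real PS5 controller works from a far more detailed silicon power model:

    # Toy "instruction-mix" clock governor: estimate power from workload activity and
    # shed frequency only when the estimate exceeds a fixed power budget.
    # All numbers are illustrative assumptions, not real PS5 parameters.

    POWER_BUDGET_W = 200.0
    MAX_CLOCK_GHZ = 2.23
    MIN_CLOCK_GHZ = 2.0

    def estimated_power(activity, clock_ghz):
        """Pretend power model: heavier instruction mixes (activity closer to 1.0)
        draw more power at a given clock."""
        return 60.0 + 90.0 * activity * clock_ghz

    def pick_clock(activity):
        clock = MAX_CLOCK_GHZ
        # Step the clock down a notch at a time until the modelled power fits the budget.
        while estimated_power(activity, clock) > POWER_BUDGET_W and clock > MIN_CLOCK_GHZ:
            clock -= 0.01
        return round(clock, 2)

    for activity in (0.5, 0.65, 0.72):  # typical frame, heavy frame, worst-case burst
        print(f"activity {activity}: {pick_clock(activity)} GHz")
    # Typical and heavy mixes hold 2.23 GHz; only the rare worst-case mix sheds a
    # little clock, and because the decision depends only on the workload (not on
    # temperature), every console behaves the same way.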
 
Last edited:

DarkestHour

Banned
I'm a "techie", hell, I do it for a living. I still don't care about one console with different components drawing 180w vs another one with different components drawing 200w. What is interesting to me is when die shrinks occur or other optimizations of the same console. SlimySnake's comparison of PS3 to PS3 slim is cool, but the power draw of Series S to X to PS5 has zero influence of my decision to get any of those consoles.
 

oldergamer

Member
I'm a "techie", hell, I do it for a living. I still don't care about one console with different components drawing 180w vs another one with different components drawing 200w. What is interesting to me is when die shrinks occur or other optimizations of the same console. SlimySnake's comparison of PS3 to PS3 slim is cool, but the power draw of Series S to X to PS5 has zero influence of my decision to get any of those consoles.
This! It's fucking stupid for people to be making list wars out of power consumption.
 

hlm666

Member
Engines need rewrites for more compute. The best thing MS could do is write their own multi-platform engine, call it Unrealer Engine (also for PC), include mesh shading, VRS, SFS, DirectStorage, basically all the RDNA 2 features (or 4), and release it with the GDK and the next console launch in beta, so that devs can have the engine 1.5 years before release. They could easily take on Epic if they want.

At least then you know all the features are used. Let The Coalition build the engine.
That Matrix demo is using hardware RT and mesh/primitive shaders already, and Microsoft can add to the Unreal Engine code base like The Coalition already did with this, and like Nvidia did with RT and DLSS. Epic are pretty good about letting companies add to it.
 

Godfavor

Member
Are people still unaware nearly a year later? Microsoft went with a larger chip at lower clocks; Sony went with a smaller chip and a higher-clocked GPU, so the natural result is that the PS5 uses a bit more power to get where it is: 52 CUs vs 36, and 1.825 GHz vs up to 2.23 GHz. Higher clocks increase power use more than a wider design does. The PS5 also uses an instruction-mix-based rather than a thermal-based variable clock system: the power controller looks at the instruction mix and adjusts clocks based on the expected power draw, sometimes for just milliseconds within a frame, and it can generally always run at peak. The design is meant to avoid having to build the system for the worst-case instruction mix; instead of designing the cooling and everything else for that, they can just drop the clocks for a few milliseconds when those rare worst-case power situations come up.

"Using it better" and "Stressing it more" are both bad takes; this is exactly what we expected once we knew the specs.

Slightly different designs, but as far as I can see, the results are trading blows and are largely very similar.
Microsoft should also have gone with variable clocks depending on power consumption and stayed in that 200-220W range more consistently. They could have boosted the clocks without changing the thermal design.
 

Tripolygon

Banned
Some PS5 games go up to 230 watts! What's probably happening here is that the demo is capped at 30 fps, so the XSX and PS5 GPUs are not being maxed out. It would be interesting to see what the power consumption looks like when they are flying through the environment, which is when the framerate consistently drops.
Yeah, Metro seems to be the only game I've seen so far that draws closer to 230W. There was a brief spike in the Matrix demo that hit 220W. I don't foresee any of the consoles drawing more than ~230W during normal gameplay, 60fps or not. One thing Mark Cerny was pushing for was developers optimizing for power draw, which is rather interesting.

With PSVR2 we will see the system power draw increase because of the HMD. That 350W PSU makes sense for the PS5.
 
Last edited:

Deerock71

Member
You are in a Matrix thread, you should know the answer, coppertop.

[Image: matrix-morpheus GIF]
FTFY

EDIT - This is another one of those weird aspects of these side-by-side comparisons that actually impresses me more about the XSS than anything else. That little thing gives you 90% of what the other two do, but draws half the power. Kudos to those wizards at Microsoft!
 
Last edited:

LordOfChaos

Member
Microsoft should also have gone with variable clocks depending on power consumption and stayed in that 200-220W range more consistently. They could have boosted the clocks without changing the thermal design.

I think variable clocks will become the norm in consoles going forward, sure. It's already a pretty similar AMD technology, though with the spin that it has to be consistent run-to-run on consoles. For any given design, you'll just be able to eke out a bit more when it's variable.
 

Neo_game

Member
I also remember people saying that SmartShift would enable using less power for the same results as the XSX. It seems that's not true either.

I will not be surprised if the PS5 is using pretty much the same power in most games. The PS5 chip is approx 20% smaller but more efficient.

Crazy how well designed the Xbox is on all fronts. Size, power, heat, power draw, noise levels.

PS5 is well designed too but deffo pushes the hardware more.

In games I do not think there will be much difference; maybe the SX will have slightly higher resolution in some cases, but since the SX has the bigger GPU, it should do better in a tech demo.
 

Panajev2001a

GAF's Pleasant Genius
Are people still unaware nearly a year later? Microsoft went with a larger chip at lower clocks; Sony went with a smaller chip and a higher-clocked GPU, so the natural result is that the PS5 uses a bit more power to get where it is: 52 CUs vs 36, and 1.825 GHz vs up to 2.23 GHz. Higher clocks increase power use more than a wider design does. The PS5 also uses an instruction-mix-based rather than a thermal-based variable clock system: the power controller looks at the instruction mix and adjusts clocks based on the expected power draw, sometimes for just milliseconds within a frame, and it can generally always run at peak. The design is meant to avoid having to build the system for the worst-case instruction mix; instead of designing the cooling and everything else for that, they can just drop the clocks for a few milliseconds when those rare worst-case power situations come up.

"Using it better" and "Stressing it more" are both bad takes; this is exactly what we expected once we knew the specs.

Slightly different designs, but as far as I can see, the results are trading blows and are largely very similar.

What is more interesting is the sweet spot in terms of frequency and voltage for the chip. You have a bigger CU advantage on the XSX over the PS5 than you have a clock advantage on the PS5 over the XSX. Does that mean that to allow that clock speed they increased the voltage a tad (or that MS was able to lower voltage)? That would explain the power increase.
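Spelling out the ratios the post is talking about, from the public launch specs (paper numbers only; nothing about voltage follows from these alone):

    # Paper throughput from the public specs: CUs * 64 lanes * 2 ops/clock * clock.
    xsx_tflops = 52 * 64 * 2 * 1.825e9 / 1e12   # ~12.15 TF
    ps5_tflops = 36 * 64 * 2 * 2.23e9 / 1e12    # ~10.28 TF

    print(f"CU advantage (XSX over PS5):    {52 / 36:.2f}x")      # ~1.44x
    print(f"Clock advantage (PS5 over XSX): {2.23 / 1.825:.2f}x") # ~1.22x
    print(f"paper TFLOPS: XSX {xsx_tflops:.2f} vs PS5 {ps5_tflops:.2f}")
    # The CU advantage outweighs the clock advantage on paper, which is what makes
    # the near-identical wall power draw interesting.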
 
Now that I have both a Series X and a PS5, I can finally say firsthand that both consoles are indeed super quiet with all this power going on. I look forward to more power in the future. I look forward even more to actual games that USE all this power. My old PC built years ago still chugs along using 3 times the wattage and 20 times the dB. :) My next PC build will be water cooled, I think. Leaks in my future.

.
 
The troll take is that there are 15 watts of performance left on the table on the X, which lazy and dim developers fail to take advantage of. In all seriousness, I think both consoles seem fine hardware-wise, but supposedly better yields probably help the PS5 push more units.

Not necessarily a troll take; while it's a different type of system, you can easily find gaming laptops that have throttled performance due to their chips being capped at a certain TDP profile. Between laptops with the same CPU and GPU, the one with the higher-TDP CPU and/or GPU tends to get better, more stable performance. That also usually comes with faster clocks.
 
Last edited:

Sosokrates

Report me if I continue to console war
To say it's un-optimized is kind of nonsense when an MS-owned team helped work on it. You seriously think they stepped in to help optimize and not really optimize? lol, come on man, work with me here.

Even the best devs have to optimise.
 

yurinka

Member
Interesting. It's also crazy to see consoles deliver this stuff at this performance considering they consume way less power than a gaming PC. They are super well optimized, and it makes me wonder what is going on with PC manufacturers.

I mean, if the GPU, CPU, and memory are pretty similar in terms of technology and performance between these consoles and a PC, why do similar PCs consume way more?
 
Last edited:

Tripolygon

Banned
Interesting. It's also crazy to see consoles deliver this stuff at this performance considering they consume way less power than a gaming PC. They are super well optimized, and it makes me wonder what is going on with PC manufacturers.

I mean, if the GPU, CPU, and memory are pretty similar in terms of technology and performance between these consoles and a PC, why do similar PCs consume way more?
Yup, a single PC GPU can draw more power than the entire console system.
 

yurinka

Member
Yup, a single PC GPU can draw more power than the entire console system.
Yep, but why? I mean, let's say the PC one is a similar Zen 2 with a similar clock. Why does it draw way more power than the console one? Maybe embedding it into an APU reduces the power needed? I have no idea.

I mean, in terms of cooling, PCs have monster, noisy coolers or liquid stuff that I assume do a better job than the console one. Or isn't that the case?
 

Tripolygon

Banned
Yep, but why? I mean, let's say the PC one is a similar Zen 2 with a similar clock. Why does it draw way more power than the console one? Maybe embedding it into an APU reduces the power needed? I have no idea.

I mean, in terms of cooling, PCs have monster, noisy coolers or liquid stuff that I assume do a better job than the console one. Or isn't that the case?
The architecture is optimized differently. Think of consoles like laptops: they are optimized around power and thermal constraints. In the case of consoles, you have a CPU and GPU combined in one package, so they have to make sure the power draw stays low.
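As a rough illustration of that point, compare the whole-console wall draw reported in this thread against vendor power ratings for roughly comparable discrete PC parts. The PC figures are published TDP/TBP ratings, not measured wall draw, so treat this as ballpark only:

    # Ballpark comparison: whole-console wall draw vs vendor power ratings for
    # roughly comparable discrete PC parts. TDP/TBP ratings are not measured
    # wall power, so treat this as illustrative only.

    console_wall_draw_w = 220  # PS5 in this demo, per the thread

    pc_parts_rated_w = {
        "Ryzen 7 3700X (65 W TDP)": 65,
        "RX 5700 XT (225 W TBP)": 225,
        "RAM, SSD, fans, board (est.)": 40,
    }

    print(f"console at the wall:        ~{console_wall_draw_w} W")
    print(f"comparable PC parts, rated: ~{sum(pc_parts_rated_w.values())} W")
    # The discrete GPU alone is rated above the entire console, largely because the
    # console APU shares one power/thermal budget and is tuned for a fixed target.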
 

Riky

$MSFT
It's almost as if MS had time to improve the efficiency of an existing GPU and Sony had to design a new one from the beginning :messenger_beaming:

You've said this before and it's stupid as usual.

Where is the "existing" GPU for Series X? No RDNA2 released GPU has the same number of compute units or clock speed of the Series X, also the whole setup of memory to the GPU is totally different from any PC part.
On top of that Microsoft adjusted the cores for ML and added filters for the SFS implementation on top of the RDNA2 implementation.
You just say things with no evidence again, I'm sure the terms of this site say you have to back these sort of claims up with evidence.
 
Last edited:

Riky

$MSFT
I also agree that I would have liked to see the Series X GPU hit 2 GHz or 2.1, but there must be a reason MS went with the 1.8 that we are unaware of.

Jason Ronald was asked this; basically, it was to give exactly twice the GPU performance of the Xbox One X, which is why the clock is set exactly where it is. They even talked about a console with more compute units and an even lower clock at one point, but I think that increased the die size too much.
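The "exactly twice the One X" target checks out on paper from the public specs:

    # Paper GPU throughput: CUs * 64 lanes * 2 ops/clock * clock.
    one_x_tflops = 40 * 64 * 2 * 1.172e9 / 1e12   # ~6.0 TF
    xsx_tflops   = 52 * 64 * 2 * 1.825e9 / 1e12   # ~12.15 TF

    print(f"Xbox One X: {one_x_tflops:.2f} TF")
    print(f"Series X:   {xsx_tflops:.2f} TF ({xsx_tflops / one_x_tflops:.2f}x)")
    # ~2.02x on paper, matching the stated "exactly double the One X" target
    # behind the 1.825 GHz clock choice.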
 

DeepEnigma

Gold Member
More detail here. The PS4 was a cheaper console designed for an uncertain period for gaming post recession. Gaming is in a much better state today.

Although the PlayStation 3 has a 380 W power supply, it only consumes 170-200 W in game mode with the 90 nm Cell CPU. The newer 40GB PlayStation 3 model, which comes with a 65 nm Cell CPU/90 nm RSX, consumes 120-140 W during normal use. The PlayStation 3 Slim, with a 45 nm Cell CPU/40 nm RSX, consumes 65-84 W during normal use.


In this table you can see exactly how much power each PS3 model consumes in different modes:


Mode                PS3         PS3 Slim
Standby             1.22 W      0.36 W
Idle operation      171.35 W    75.19 W
Blu-ray             172.79 W    80.90 W
YouTube             181.28 W    85.08 W
Play (idle)         200.84 W    95.35 W
Play (full load)    205.90 W    100.04 W



I wrote a similar article for PS4 power consumption, so if you are interested, you can read it: How Much Electricity Does a PS4 Use. If we compare PS3 power consumption with other popular consoles, we can see that the Xbox 360 uses 187 watts in gaming mode, while the Nintendo Wii uses only 40 watts. Let's go a little deeper into the topic.

Thanks for this. The PS3 Super Slim I have is super quiet and does not give off much heat either. Almost like it's not even on when playing digital games.
 

Imtjnotu

Member
Interesting, the XSX version seems unoptimised; it should not be getting such drastic dips.

We can see here that Gears 5 is a lot more stable and using more power; in gameplay it doesn't go below 190W, and during battle it's around 200-210W.


Unreal 4 vs 5. Not the same thing my man
 