
[Power Analysis] What happens if you spec a PC like a PS5?

ZywyPL

Banned
Amazing effort. But why would anyone build a PC specced like a PS5 and pay more? If you're burning more money anyway, better shoot for a 6800XT or 6900XT or one of the nVidia cards.

Some people just don't want the comfy couch experience and can't imagine gaming on anything other than M+KB. Also, access to different genres that aren't available on consoles. Free online. Cheaper games. 60FPS. The list just goes on and on.
 

rodrigolfp

Haptic Gamepads 4 Life
Well, today even phones can do most of the stuff, but when building a PC you'd better aim for high specs, as you get outdated easily. 32GB+ RAM and at least a 6800XT/3080 if you want a decent upgrade over consoles.
Phones can give me, for example, native mouse+kb support for all my shooters since Wolf3D???
 
Last edited:

Bo_Hazem

Banned
Phones can give me, for example, native mouse+kb support for all my shooters since Wolf3D???

stare GIF
 

Fredrik

Member
Well, today even phones can do most of the stuff, but when building a PC you'd better aim for high specs, as you get outdated easily. 32GB+ RAM and at least a 6800XT/3080 if you want a decent upgrade over consoles.
As I said, on PC you have visual settings. Consoles could have settings menus, but devs don't think console gamers can handle them, so consoles don't get them. And with settings you don't have to accept sub-60fps at mid graphics settings because some dev thinks the most important thing about the game is rendering it in 4K. That's why console gamers fight for days over 3 fps and preset-settings differences between the consoles instead of just going into a menu, flicking a switch and playing the game.

Plus, if you want to go there. Mods. Tweaks. Upgradability. Overclocking. Triple screen gaming. Etc etc.

It’s all awesome! ❤️
 

Bo_Hazem

Banned
As I said, on PC you have visual settings. Consoles could have settings menus, but devs don't think console gamers can handle them, so consoles don't get them. And with settings you don't have to accept sub-60fps at mid graphics settings because some dev thinks the most important thing about the game is rendering it in 4K. That's why console gamers fight for days over 3 fps and preset-settings differences between the consoles instead of just going into a menu, flicking a switch and playing the game.

Plus, if you want to go there. Mods. Tweaks. Upgradability. Overclocking. Triple screen gaming. Etc etc.

It’s all awesome! ❤️

That's a solid point, especially since most PC gamers play on tiny monitors anyway. I have a gaming PC, but it's used as a workstation, not for gaming, so I don't wanna go there ;)
 

keram

Member
[disclaimer: this is a work in progress post]

Hey fellows,
so some may remember my old Gonzalo thread, where I simulated the then-rumoured PS5 prototype with the newly released RX 5700XT graphics card. Back in summer 2019 we knew nothing official about the PS5 specs. Today we know a whole lot more, ranging from architecture details to clockspeeds.


Last week the RX 6700XT released and I snatched one up to do a follow-up analysis - now with an actually comparable RDNA2-based GPU and the now-known CPU configuration.

fotom1jx3.jpg


This is the worst overclocker I've ever had (at stock it doesn't even quite reach the reference card's clocks), and it seems Sapphire bins by ASIC quality for their better range of cards. In this case that's a good thing, because it means this is pretty much the worst-case scenario for efficiency.

So before showing the first results and explaining the methodology, here's a summary of what we are dealing with.

gpuspec4pkjj.png


cpuspecdjjlf.png


*in the case of the PS5 that's the total observed power draw from the wall, which equates to the sum of all components' TDPs [on the AC side]


Further used PC-Components:

650W Gold PSU
2x8GB Samsung B-die running 3600Mhz
B450 Motherboard
Sata SSD
Sata HDD
additional fans


Methodology:

The CPU gets underclocked to PS5 levels (and undervolted to get realistic working conditions for a console environment - SoC voltages for the PS5 sadly aren't known atm afaik)

cpuundervoltsettingsozktn.png


The GPU is also underclocked and slightly undervolted. To understand why the frequency target is set to 2.315 GHz instead of 2.23, you have to know that there is an offset within the AMD Navi boosting algorithm that prevents it from reaching its frequency goal. To achieve a real clock of ~2.23 GHz I needed to adjust the frequency goal as shown (f_goal might vary depending on load):

gpuundervoltsettingsp2ktw.png
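
For clarity, here's the offset arithmetic as a tiny sketch; the 85 MHz figure is just what my card showed and will vary with load:

```python
# rough sketch of the Navi boost offset described above: the wattman
# frequency target (f_goal) sits ~85 MHz above the real sustained clock
# on my card, so to land at the PS5's 2.23 GHz you have to aim higher
F_REAL_TARGET = 2230   # MHz, the PS5 GPU clock we want to replicate
OBSERVED_OFFSET = 85   # MHz, assumption based on this card's behaviour

f_goal = F_REAL_TARGET + OBSERVED_OFFSET
print(f"set f_goal to ~{f_goal} MHz")   # ~2315 MHz, as in the screenshot
```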


Power is monitored at the wall, as well as in software via the driver sensor outputs of the measured rails.

Please note that because of the Infinity Cache the RX 6700XT is expected to be a bit more efficient than the PS5. On the other hand, my PC has a lot more auxiliary power load than a console (two RAM pools, more fans, I/O etc.).

Results (sneak peek):

Dirt 5 Ultra - Resolution scaling off

GPU frequency
dirt5freq1eka4.png


FPS
dirt5fpsqekf5.png


Total GPU Power (driver)*
dirt5graphicspowerobkm6.png


*please note: the RX 6700XT measures more rails than the first-gen Navi 5700XT, hence the total power output from the driver isn't comparable!

Power consumption whole system (at the wall)*:
dirt5wallpowernmkl8.jpg


It hovers around 200W constantly (video to follow if requested) without much fluctuation (+/-5W).

*please note that this figure is not directly comparable to the driver-side measurement because of power conversion / PSU efficiency losses. 200W at the wall roughly equates to 180W on the 12V DC side of things.
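
For reference, the AC-to-DC conversion in code form; the ~90% efficiency is an assumption for a Gold unit at this load level, not a measured value:

```python
# minimal sketch: estimating DC-side power from the wall figure,
# assuming ~90% PSU efficiency (650W Gold unit at roughly 30% load)
PSU_EFFICIENCY = 0.90

def ac_to_dc(wall_watts: float) -> float:
    """Estimate DC-side power from the figure measured at the wall."""
    return wall_watts * PSU_EFFICIENCY

print(ac_to_dc(200))   # ~180.0 W, matching the estimate above
```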
What a timewaste ... PS5 != GPU :pie_eyeroll:
 
Last edited:

Fredrik

Member
That's a solid point, especially since most PC gamers play on tiny monitors anyway. I have a gaming PC, but it's used as a workstation, not for gaming, so I don't wanna go there ;)
Yeah, I really like that you have the choice to use the power any way you want. I could absolutely play in 4K, but as of right now I play on a 1080p G-Sync screen at 60-100+ fps at Ultra settings, or power up two screens at the sides for triple-screen racing at ~60fps with slightly lower settings.

I don’t think consoles suck or anything like that, I use consoles a lot, like when I want to play in the living room, maybe to get home theater sound or to slouch in a couch, or when I play with the kids, or when there is no PC version, etc.

But PC gaming definitely has its benefits for me, even if I don't have a 3090.
 
it looks like I could have done a better job explaining what this thread is about :messenger_grinning_sweat:
This thread is first and foremost about analyzing the power characteristics of the architectures used and the price you have to pay to run under low-power conditions.

nevertheless I'm somewhat disappointed by some contributions here (looking at you Bo_Hazem Bo_Hazem )
 

scydrex

Member
Your hardware magazine is a bunch of idiots. That's a fact.

For 2000 euros you can get:

16-core top-end Ryzen 5950X, easily more than double the performance.
6800XT, easily more than double the performance.
980 Pro 1TB 7GB/s SSD
32GB of memory

How is that system comparable to what the PS5 has?

Look, the PS5's value is solid, and for people that don't want to deal with PCs it's great value. But saying it's a 2k PC is just utterly laughable.

Where I live, add to that price 38% tax for products or items bought on the internet that cost US$200 or more. To those 2000 euros, add 760 in taxes for a total of 2760 euros...
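
The math, for the doubters (a trivial sketch of the quoted figures):

```python
# the tax math from above: 38% on imported items costing US$200 or more
base = 2000
print(base + base * 0.38)   # 2760.0 euros
```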
 
Last edited:

yamaci17

Member
The 3700X will be an inferior CPU 4 years from now. The PS5 won't be.

The PS4-equivalent GPU is literally a 750 Ti, and the CPU is literally 8 Jaguar cores at 1.6 GHz.

Try to play Red Dead Redemption 2 on such a system.

1) The 750 Ti can't push 1080p
2) It can't push any settings beyond low
3) It struggles to even lock 30 fps at 720p low-med settings
4) It will do low textures regardless, due to its weak 2 GB of VRAM
5) And before you even get to the 750 Ti, those 8 Jaguar cores at 1.6 GHz will barely render 10-15 fps in the game



This is what happens when you try to run games on an equivalent CPU.

This is what will happen to the precious 3700X. It will render 30-40 fps while the PS5 easily locks to 60 fps even 5 years later (at lower resolutions - that part is on the GPU).

And people here are still talking about using lower-cache CPUs to match the PS5, lmao.

3 years later, even a 5800x won't be able to match a PS5/Xbox Series X.

Games lose 1.5-2x performance when ported from console to PC. Almost 4 out of 5 games do this. There are only a few instances where that's not the case.

The only reason the 3700X can still keep up is that we're in a cross-gen period. 4 years from now there will be CPUs that are 80-110% faster than the 3700X, and it will be inferior.

I'm not even going to talk about the 6700XT. It will end up far behind the PS5.

Imagine playing games with a 1.8 TFLOPS GCN2 GPU from 2013. Yeah, good luck running at 1080p 18-25 fps. Or get back to me when you can run RDR2 at native 4K with good-looking settings at a locked, smooth 30 fps on an RX 580 (practically an Xbox One X). Oh noes, you can't, because the 580 will render 18-21 fps. Yeah.

Console-equivalent CPUs are always doomed to suffer. Only 1.5-2 years after a console launch do we get much stronger hardware than the consoles. This time it's much more brutal than with the PS4: the PS4 and Xbone had a weak GPU and CPU to start with, while this time they're very competitive. They will make lots of PCs obsolete thanks to bad ports.
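
For reference, the TFLOPS figures thrown around here come from the usual FP32 formula (CUs x 64 shaders x 2 ops/clock x clock); a quick sketch with the commonly cited specs, not measurements:

```python
# FP32 TFLOPS = CUs x 64 shaders x 2 ops/clock x clock (commonly cited specs)
def tflops(cus: int, mhz: float) -> float:
    return cus * 64 * 2 * mhz * 1e6 / 1e12

print(round(tflops(18, 800), 2))    # PS4:    ~1.84 (the "1.8 tflops GCN" GPU)
print(round(tflops(36, 2230), 2))   # PS5:    ~10.28
print(round(tflops(40, 2424), 2))   # 6700XT: ~12.41 at its rated game clock
```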
 

yamaci17

Member
If you have Cyberpunk, you can use these shortcut commands to directly access the PS5 graphical levels (you need to set your resolution beforehand; I would say 1296p is a good place to start)


ConsoleEarlyNextGen
ConsoleEarlyNextGenQuality
 

Bo_Hazem

Banned
it looks like I could have done a better job explaining what this thread is about :messenger_grinning_sweat:


nevertheless I'm somewhat disappointed by some contributions here (looking at you Bo_Hazem Bo_Hazem )

It's a wonderful thread, but it's a curious question that got some good answers from Fredrik Fredrik . Personally, if PC gaming were my thing I'd always shoot for high-end PC gaming, not low-end specs like many PC-master-race wannabes do.

Yeah, I really like that you have the choice to use the power any way you want. I could absolutely play in 4K, but as of right now I play on a 1080p G-Sync screen at 60-100+ fps at Ultra settings, or power up two screens at the sides for triple-screen racing at ~60fps with slightly lower settings.

I don’t think consoles suck or anything like that, I use consoles a lot, like when I want to play in the living room, maybe to get home theater sound or to slouch in a couch, or when I play with the kids, or when there is no PC version, etc.

But PC gaming definitely has its benefits for me, even if I don't have a 3090.

I really appreciate your input and rational points of view, and I'm totally with you. As for PC gaming, if there is ever a game that I really think I need to play then I'll play it, like something too good from Xbox, or something like the scam Star Citizen. So far I would rather have played on the PS4 Pro before, and now the PS5, than on my PC with its Radeon VII (a wonderful workstation card with 16GB of HBM2 VRAM at 1TB/s bandwidth, and a very good 4K gaming card).
 
Last edited:
In detail there are multiple differences, and they all impact perf/watt a bit, but in the grand scheme of things I would agree with Zathalus: it doesn't really matter much, and RDNA2 vs. custom RDNA2 is a fair comparison.
It's very interesting that both systems are close together in terms of power draw.
Now it would be fascinating to see how performance compares.

cheers


More in-depth results follow. After more stability testing I changed the base spec slightly (3700X at 1.0V, 6700XT at 1140mV at f_goal). This time around, the well-known synthetic benchmark Fire Strike:

Scores vs System Power:

firestrikegraphicsscohujwb.png


powerfromwalllyj7f.png





Frequency:
firestrikefrequencycajo7.png


GPU Power:
firestrikegpupowerpkko2.png


Graphics Test 1 is the more power-hungry of the two tests; I gathered the wall power figure during GT1. That said, GT2 seems more representative of a real game workload (the frequency offset seemed to shrink during GT2, which also resulted in very slightly higher clocks).
 
It's a wonderful thread, but it's a curious question that got some good answers from Fredrik Fredrik . Personally, if PC gaming were my thing I'd always shoot for high-end PC gaming, not low-end specs like many PC-master-race wannabes do.
this thread is much more about the PS5 than it is about PCs. I'm a little sad that people don't get that while reading. Well, I hope it will become clearer when I do the interpretation after gathering the data.
 
ok community, I need your help:

  1. Does anyone still have a 5700XT lying around? I sadly don't have one anymore. I'd need a screenshot of HWiNFO64 showing all relevant GPU rails under load
  2. Can you help me find a scenario / data where we can actually replicate the PS5 at full throttle?
Deacon made some good suggestions already
I don't know of any free demos right now, but some games you could try are Nioh 1 or 2, Dirt 5, AC: Valhalla, DMC5, etc.

DIRT5 doesn't hit the 120fps target framerate in that mode most of the time, does it? Can we replicate the settings, or is dynamic res in play here?

I have WD: Legion for the PC... that's locked at 30Hz on consoles. Maybe there are some stress points where it dips in a replicable manner? There was a config file with the exact console settings, right?
 

yamaci17

Member
ok community, I need your help:

  1. Does anyone still have a 5700XT lying around? I sadly don't have one anymore. I'd need a screenshot of HWiNFO64 showing all relevant GPU rails under load
  2. Can you help me find a scenario / data where we can actually replicate the PS5 at full throttle?
Deacon made some good suggestions already


DIRT5 doesn't hit the 120fps target framerate in that mode most of the time, does it? Can we replicate the settings, or is dynamic res in play here?

I have WD: Legion for the PC... that's locked at 30Hz on consoles. Maybe there are some stress points where it dips in a replicable manner? There was a config file with the exact console settings, right?
Rainbow Six Siege with FPS unlocked (I've heard the console version can have its framerate unlocked from the menu, and you can even observe the frame rate)

Here's an RTX 3070 + R7 2700 at 1440p (upscaled), med-high mixed settings
fwFUiKy.png

the game is amazingly well threaded, and I would assume it would be the same for the PS5 as well, if you can unlock frames, that is

theoretically this game can max out both the CPU and GPU of the PS5
 
Last edited:
ok duders, now it's getting serious:


this time around I've compiled the power draw data from the different rail sensors available.


the following is once again the Dirt 5 benchmark, but this time with ray tracing enabled, since they added it in the last patch and I wanted to try it out (at stock settings, enabling RT increased power consumption by around 3W on the GPU side)

PC as described in Post 1 at stock settings:

dirt5rton0hjkw.png



PC at PS5 spec:

dirt5rtonps5spece7kgh.png


I've added the approximate power draw from the wall for convenience. The delta between the measured rails and the wall power figure mainly comprises motherboard and RAM consumption, as well as PSU conversion losses.

On the performance side: once again you lose around 10% fps while reducing power consumption by nearly 100 watts.
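
As a back-of-envelope perf/watt figure - assuming ~300W stock vs ~200W at PS5 spec from the wall readings above, and the ~10% fps loss (exact fps numbers are in the screenshots):

```python
# rough perf/watt comparison for this Dirt 5 RT run (assumed wall figures)
stock    = {"fps": 1.00, "watts": 300}   # fps normalized to the stock run
ps5_spec = {"fps": 0.90, "watts": 200}

gain = (ps5_spec["fps"] / ps5_spec["watts"]) / (stock["fps"] / stock["watts"])
print(f"{gain:.2f}x perf/watt at PS5 spec")   # ~1.35x
```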
 
Last edited:
Rainbow Six Siege with FPS unlocked (I've heard the console version can have its framerate unlocked from the menu, and you can even observe the frame rate)

Here's an RTX 3070 + R7 2700 at 1440p (upscaled), med-high mixed settings
fwFUiKy.png

the game is amazingly well threaded, and I would assume it would be the same for the PS5 as well, if you can unlock frames, that is

theoretically this game can max out both the CPU and GPU of the PS5

great idea. I snatched that up today (8 EUR on the Ubisoft Store spring sale right now). I suspect we might get somewhat CPU-limited though. Any good sources for PS5 data & settings?
 

yamaci17

Member
great idea. I snatched that up today (8 EUR on the Ubisoft Store spring sale right now). I suspect we might get somewhat CPU-limited though. Any good sources for PS5 data & settings?
hey, it's in the video settings tab; you just disable vsync and choose "prioritize performance" in the same menu

you may also test with graphics mode as well

in both modes the fps will be uncapped, so the device will work to its maximum potential
 

SlimySnake

Flashless at the Golden Globes
3Dmark firestrike pretty much the same story: hovering pretty constantly around 200W

Scores:
firestrike2xjub.png
That's crazy. The very first Gonzalo leak by Apisak had the Fire Strike score at 20,000. I am guessing that's when the GPU was clocked at 1.8 GHz.

My RTX 2080's Fire Strike score with an 11700K is 24k. I know Fire Strike prefers AMD cards, but 30k is very impressive.
 
Some people just don't want the comfy couch experience and can't imagine gaming on anything other than M+KB. Also, access to different genres that aren't available on consoles. Free online. Cheaper games. 60FPS. The list just goes on and on.
My PC is hooked up to my TV via HDMI and I sometimes play games with an Xbox controller. It's pretty awesome having options.
 
Last edited:

Great Hair

Banned
Americans don't understand how expensive everything is in EU land.

USA fuel prices

Fuel (price per litre) | Date       | USD
Gasoline               | 01.11.2021 | 0.988
Diesel                 | 01.11.2021 | 0.96

France fuel prices

Fuel (price per litre) | Date       | EUR   | USD
Gasoline               | 01.11.2021 | 1.653 | 1.919
Diesel                 | 01.11.2021 | 1.576 | 1.829

USA electricity prices

Electricity (per kWh) | Date       | USD
Households            | 01.03.2021 | 0.15
Business              | 01.03.2021 | 0.109

France electricity prices

Electricity (per kWh) | Date       | EUR   | USD
Households            | 01.03.2021 | 0.18  | 0.209
Business              | 01.03.2021 | 0.126 | 0.146

Germany electricity prices

Electricity (per kWh) | Date       | EUR   | USD
Households            | 01.03.2021 | 0.32  | 0.371
Business              | 01.03.2021 | 0.21  | 0.244


 

SlimySnake

Flashless at the Golden Globes
The PS5 will have similar performance to a 6600xt.
DemonCleaner DemonCleaner If you can secure a 6600XT (10.6 TFLOPS), it might be a pretty good candidate for comparisons. It's got 4 fewer CUs, albeit with far higher clocks, but might be good for a TFLOPS-to-TFLOPS comparison.

Control on PS5 has a photo mode with an uncapped framerate; you might be able to do some like-for-like comparisons with RT on and off.
 

Sosokrates

Report me if I continue to console war
The PS5 will have similar performance to a 6600xt
DemonCleaner DemonCleaner If you can secure a 6600XT (10.6 TFLOPS), it might be a pretty good candidate for comparisons. It's got 4 fewer CUs, albeit with far higher clocks, but might be good for a TFLOPS-to-TFLOPS comparison.

Control on PS5 has a photo mode with an uncapped framerate; you might be able to do some like-for-like comparisons with RT on and off.
Interestingly, the 6600XT only performs slightly better than the 5700XT in rasterised games despite having a significantly higher clock speed.

 
Last edited:

Tqaulity

Member
That's crazy. The very first Gonzalo leak by Apisak had the Fire Strike score at 20,000. I am guessing that's when the GPU was clocked at 1.8 GHz.

My RTX 2080's Fire Strike score with an 11700K is 24k. I know Fire Strike prefers AMD cards, but 30k is very impressive.
This analysis is pretty cool overall! The takeaway I get from the data provided by DemonCleaner DemonCleaner is that the PS5-spec machine is roughly 8% slower than the stock 3700X/6700XT configuration (based on the Fire Strike benchmark alone). That's actually really impressive for a console, especially to be achieving that with so much less power draw. Now of course every workload is different and Fire Strike performs better on AMD than Nvidia, but it still shows you what is possible (even with an app that isn't really optimized for the console).

So for comparison sake, let's look at a few other GPUs paired with a 3700X CPU in FireStrike (GraphicsScore):

RX 5700XT: ~26K
RTX 2070S: ~27K
RX 6600XT: ~29K
RTX 2080: ~28K
RTX 2080S: ~30K
RTX 3060TI: ~32K
RTX 2080TI: ~39K
RX 6700XT: ~38K
RTX 3070: ~34K

Hmmm... so a PS5-level system can perform on par with an RTX 2080S as an upper bound? Sounds familiar :messenger_smirking:

Yeah, it's one synthetic benchmark which favors AMD's architecture, but that is my point... it's a best-case scenario, which is why I called it an "upper bound". Under optimal conditions, with software optimized for the hardware, the PS5 can approach 2080S performance (or even higher as time goes on). Very cool!
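
Just to make the "upper bound" explicit, a throwaway sketch placing the ~30K PS5-spec GraphicsScore in the rounded list above:

```python
# where does the ~30K PS5-spec GraphicsScore land in the list above?
scores = {"RX 5700XT": 26, "RTX 2070S": 27, "RTX 2080": 28, "RX 6600XT": 29,
          "RTX 2080S": 30, "RTX 3060TI": 32, "RTX 3070": 34, "RX 6700XT": 38,
          "RTX 2080TI": 39}
ps5_spec = 30   # approximate score from the run earlier in the thread

closest = min(scores, key=lambda gpu: abs(scores[gpu] - ps5_spec))
print(closest)   # RTX 2080S -> hence the "upper bound" remark
```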
 

Topher

Gold Member
This analysis is pretty cool overall! The takeaway I get from the data provided by DemonCleaner DemonCleaner is that the PS5-spec machine is roughly 8% slower than the stock 3700X/6700XT configuration (based on the Fire Strike benchmark alone). That's actually really impressive for a console, especially to be achieving that with so much less power draw. Now of course every workload is different and Fire Strike performs better on AMD than Nvidia, but it still shows you what is possible (even with an app that isn't really optimized for the console).

So for comparison sake, let's look at a few other GPUs paired with a 3700X CPU in FireStrike (GraphicsScore):

RX 5700XT: ~26K
RTX 2070S: ~27K
RX 6600XT: ~29K
RTX 2080: ~28K
RTX 2080S: ~30K
RTX 3060TI: ~32K
RTX 2080TI: ~39K
RX 6700XT: ~38K
RTX 3070: ~34K

Hmmm... so a PS5-level system can perform on par with an RTX 2080S as an upper bound? Sounds familiar :messenger_smirking:

Yeah, it's one synthetic benchmark which favors AMD's architecture, but that is my point... it's a best-case scenario, which is why I called it an "upper bound". Under optimal conditions, with software optimized for the hardware, the PS5 can approach 2080S performance (or even higher as time goes on). Very cool!

My PC is a 2080S and a 3700X. I'll fire up 3DMark in a bit and post results.
 

Topher

Gold Member
Ran Fire Strike. Here is my PC:

YbftA1C.png


Results:

eSfNxQf.png


I'm running Windows 11 though, which may affect the results. This score is lower than it should be.
 
Last edited:
DemonCleaner DemonCleaner If you can secure a 6600XT (10.6 TFLOPS), it might be a pretty good candidate for comparisons. It's got 4 fewer CUs, albeit with far higher clocks, but might be good for a TFLOPS-to-TFLOPS comparison.

Control on PS5 has a photo mode with an uncapped framerate; you might be able to do some like-for-like comparisons with RT on and off.

The PS5 will have similar performance to a 6600xt.


hi guys,

if I were to finally get another RDNA2 GPU for a reasonable price, I would get a 6800 to make some comparisons to the XSX. (I had a 6800 over for two weeks around Christmas last year but didn't have the time for testing... dammit)

that said, in my opinion an underclocked 6700XT is the correct choice for a comparison to the PS5, not the 6600XT... the TFLOPS numbers are deceiving in this case.
the 6600XT is bandwidth-starved compared to the PS5 in high-throughput scenarios. CU / WGP utilization should tend to be somewhat higher on the PS5 compared to RDNA2 PC GPUs; that should pretty much offset the slight difference in CU count [compared to the 6700XT].
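
to illustrate the bandwidth point, the usual GDDR6 arithmetic (bus width / 8 x per-pin data rate); the specs below are the commonly published ones, not something I measured here:

```python
# GDDR6 bandwidth = bus width / 8 x per-pin data rate (published specs)
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits / 8 * gbps_per_pin

print(bandwidth_gbs(256, 14))   # PS5:    448.0 GB/s
print(bandwidth_gbs(128, 16))   # 6600XT: 256.0 GB/s (+ 32MB Infinity Cache)
print(bandwidth_gbs(192, 16))   # 6700XT: 384.0 GB/s (+ 96MB Infinity Cache)
```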
 
Last edited:
This analysis is pretty cool overall! The takeaway I get from the data provided by DemonCleaner DemonCleaner is that the PS5-spec machine is roughly 8% slower than the stock 3700X/6700XT configuration (based on the Fire Strike benchmark alone). That's actually really impressive for a console, especially to be achieving that with so much less power draw. Now of course every workload is different and Fire Strike performs better on AMD than Nvidia, but it still shows you what is possible (even with an app that isn't really optimized for the console).

So for comparison sake, let's look at a few other GPUs paired with a 3700X CPU in FireStrike (GraphicsScore):

RX 5700XT: ~26K
RTX 2070S: ~27K
RX 6600XT: ~29K
RTX 2080: ~28K
RTX 2080S: ~30K
RTX 3060TI: ~32K
RTX 2080TI: ~39K
RX 6700XT: ~38K
RTX 3070: ~34K

Hmmm... so a PS5-level system can perform on par with an RTX 2080S as an upper bound? Sounds familiar :messenger_smirking:

Yeah, it's one synthetic benchmark which favors AMD's architecture, but that is my point... it's a best-case scenario, which is why I called it an "upper bound". Under optimal conditions, with software optimized for the hardware, the PS5 can approach 2080S performance (or even higher as time goes on). Very cool!

thanks for reading the thread thoroughly... happy to see that the message was understandable after all (at least for a few)


yeah, the big BUT is that Fire Strike really knows how to utilize CUs / WGPs, which is not always reflective of actual games. In Time Spy, Nvidia does a tad better comparatively. I needed to go with Fire Strike though if I wanted to compare against RDNA1, since that's the only data I had for the 5700 series (back from the Simulating Gonzalo (Rumoured NextGen/PS5 leak) | NeoGAF thread).
 

Sosokrates

Report me if I continue to console war
hi guys,

if I were to finally get another RDNA2 GPU for a reasonable price, I would get a 6800 to make some comparisons to the XSX. (I had a 6800 over for two weeks around Christmas last year but didn't have the time for testing... dammit)

that said, in my opinion an underclocked 6700XT is the correct choice for a comparison to the PS5, not the 6600XT... the TFLOPS numbers are deceiving in this case.
the 6600XT is bandwidth-starved compared to the PS5 in high-throughput scenarios. CU / WGP utilization should tend to be somewhat higher on the PS5 compared to RDNA2 PC GPUs; that should pretty much offset the slight difference in CU count [compared to the 6700XT].

Yeah, I jumped the gun. I agree a lower-clocked 6700XT is more comparable to a PS5.
The memory situation messes things up a bit. For non-ray-tracing games a 5700 clocked @2230MHz would be best, but the highest I've seen one clocked is 2100MHz.
 
Yeah, I jumped the gun. I agree a lower-clocked 6700XT is more comparable to a PS5.
The memory situation messes things up a bit. For non-ray-tracing games a 5700 clocked @2230MHz would be best, but the highest I've seen one clocked is 2100MHz.

I don't think so

- the 5700 does not have the massive pipeline redesign of RDNA2 that enabled these clocks and this power profile in the first place
- I pushed the 5700XT to a real measured 2,100 MHz back then. It stopped scaling linearly even in synthetics at that point. It would have run into all sorts of limits in real-world applications at even higher clocks, and so would the 5700 non-XT.
 

Sosokrates

Report me if I continue to console war
I don't think so

- the 5700 does not have the massive pipeline redesign of RDNA2 that enabled these clocks and this power profile in the first place
- I pushed the 5700XT to a real measured 2,100 MHz back then. It stopped scaling linearly even in synthetics at that point. It would have run into all sorts of limits in real-world applications at even higher clocks, and so would the 5700 non-XT.

The PS5 and XSX compute unit design and buses actually have quite a bit in common with RDNA1; they take features like ray tracing and hardware VRS from RDNA2.
 
Last edited:

Sosokrates

Report me if I continue to console war
can you describe what the difference is between RDNA1 and RDNA2 CUs?
I can't remember the link, but there was a recent article where someone did an analysis of a delidded PS5 and XSX APU, and I remember they said the front end was the same as RDNA1.
In certain aspects the PS5+XSX have more in common with RDNA1.

they are kind of like RDNA1 + the RT intersection units;
they are not like the PC RDNA2 design.

Also, the rasterization performance of RDNA1 and RDNA2 is very similar.
 
Last edited:

Darius87

Member
I can't remember the link, but there was a recent article where someone did an analysis of a delidded PS5 and XSX APU, and I remember they said the front end was the same as RDNA1.
In certain aspects the PS5+XSX have more in common with RDNA1.

they are kind of like RDNA1 + the RT intersection units;
they are not like the PC RDNA2 design.

Also, the rasterization performance of RDNA1 and RDNA2 is very similar.
can you spot the difference, apart from RT and the power savings in RDNA2?

03737c08-7540-4a78-940e-a660ca7fdebf.PNG
rdna-2-compute-units-100867216-orig.jpg


Despite the intense rejiggering, the fundamental RDNA 2 building blocks remain largely similar to RDNA 1’s in broad strokes—aside from the addition of dedicated ray accelerator hardware, which we’ll get to later—only scaled up much further.
https://officejo.com/rdna-2-deep-dive-whats-inside-amds-radeon-rx-6000-graphics-cards/
 
Last edited:

Sosokrates

Report me if I continue to console war
AwhMGwZ.png
AD2E42z.jpg


This just makes it easier to see.

They look the same apart from the RT unit.

Good read in the link, thanks for that.
 
Last edited:

Madjako

Member
Ok clever guys, here is the link to the article saying that in June 2020 (date of reveal of the next-gen consoles) a PC with PS5 parts would cost 1950€ and the Xbox equivalent 1700€:


"PlayStation 5, Xbox Series X or ... PC? | Config of the quarter - Canard PC Hardware 45
By Dandu | July 1, 2020 | Modified on May 31, 2021
For this quarterly configuration, we decided to offer an addition to this article: how much would a PC equivalent to the two next-gen consoles, the PlayStation 5 and the Xbox Series X, cost in June 2020?

We based ourselves on the technical information and the prices known in mid-May 2020. The configuration takes into account the two consoles, which are quite close technically. Let's start with the CPU. Both integrate an AMD CPU from the Zen 2 family, with eight cores (and SMT). Microsoft uses a fixed frequency, Sony a variable frequency depending on the load (we explain this in the dossier on page 41). Either way, the closest processor is the Ryzen 7 3700X, which costs $370. Let's move on to the GPU. Sony announces an RDNA 2 chip with 36 CUs at 2.23 GHz (a high frequency), for ~10 teraflops. Microsoft goes up to 52 CUs at 1.82 GHz, for ~12 teraflops. RDNA 2 cards do not yet exist in our PCs, but Sony's choice corresponds to a Radeon RX 5700 with a very high frequency (~500 MHz higher); Microsoft's is a bigger and faster GPU than the Radeon RX 5700 XT. As there is no such thing as a perfect choice, we have selected an overclocked version, which is priced around €500. Both consoles contain 16GB of GDDR6, so we went with 16GB of DDR4-3200. For storage, Sony integrates an 825 GB SSD, Microsoft a 1 TB SSD. Both advertise in-house controllers, but Sony seems to use PCI-Express 4.0; we therefore took two models with similar characteristics.


Accessories and format. The consoles come with an Ultra HD Blu-ray player, and we chose the Asus BW-16D1HT burner because this model physically reads "Ultra HD" discs. Likewise, they each provide a controller; we took the current versions, sold for €60, as a basis. Let's move on to the case. At the time of this writing, the design of the PlayStation 5 is not known; we selected a model with a larger footprint - purely desktop cases like in the 1990s are still rare - and a mini tower for the Xbox Series X. A Fractal Design Node 804 for Sony, a Define Mini C of the same brand for Microsoft. In both cases, a good 550 W power supply (Seasonic Focus GX Gold) will suffice. Let's move on to the motherboard. Sony uses PCI-Express 4.0, so the X570 chipset is mandatory in May 2020 (it's quite expensive). Microsoft makes do with PCI-Express 3.0, and a B450-based board will be able to fulfill this task. We must also add a Wi-Fi card, a standard in consoles for a few years now.
A high total. The total hurts quite a bit in May 2020, especially considering an expected selling price of around €500 for the consoles: we arrive at ~€1,700 for a current equivalent of the Xbox Series X and ~€1,950 for the PlayStation 5. The exercise has its flaws all the same: the Blu-ray burner is absent from modern PCs, and PCI-Express 4.0 strongly increases the price of the "Sony" config. In a few weeks, the arrival of the B550 chipset will reduce the bill by €100, for example. Without the accessories (controllers, burner, Wi-Fi) and taking a PCI-Express 3.0 SSD, the total drops to around ~€1,500. All while knowing that prices will evolve by the end of 2020, and that the PC will not stay frozen for the next three or four years.
(sorry google translation!)

so what do the naysayers say now???
Please think before answering instead of firing off the first sentence that comes to mind!!

here is the detail for the Xbox / PS5 configs:
Processor: AMD Ryzen 7 3700X - 370 €
CPU cooler: included with the CPU - 0 €
Motherboard: ASRock B450M-Pro4 / ASRock X570M Pro4 - 110/220 €
RAM: 2 × 8 GB DDR4-3200 - 90 €
GPU: AMD Radeon RX 5700 XT - 500 €
SSD: Corsair MP510 / MP600 1TB - 210/300 €
Power supply: Seasonic Focus GX 550 Gold - 105 €
Case: Fractal Design Define Mini C / Node 804 - 100/140 €
Burner: Asus BW-16D1HT - 110 €
Controller: Xbox One / DualShock 4 - 60 €
Wi-Fi card: Gigabyte GC-WBAX200 - 45 €
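
Adding up the itemized list gives roughly the article's totals (a quick sanity check; shared parts counted once, in the order listed above):

```python
# CPU, cooler, board, RAM, GPU, SSD, PSU, case, burner, controller, Wi-Fi
ps5_parts  = [370, 0, 220, 90, 500, 300, 105, 140, 110, 60, 45]
xbox_parts = [370, 0, 110, 90, 500, 210, 105, 100, 110, 60, 45]

print(sum(ps5_parts))    # 1940 -> the article's ~1950 EUR
print(sum(xbox_parts))   # 1700 -> the article's ~1700 EUR
```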
 
Last edited:

Locuza

Member
I can't remember the link, but there was a recent article where someone did an analysis of a delidded PS5 and XSX APU, and I remember they said the front end was the same as RDNA1.
In certain aspects the PS5+XSX have more in common with RDNA1.

they are kind of like RDNA1 + the RT intersection units;
they are not like the PC RDNA2 design.

Also, the rasterization performance of RDNA1 and RDNA2 is very similar.

Locuza Locuza see what you did there :messenger_grinning_sweat:
This thread?


The 3D-Frontend is indeed *configured* like on PC RDNA1 GPUs.
This doesn't mean that it's the exact same frontend without modifications.
Potentially there are different reasons why the PS5/Xbox Series use the RDNA1 building style.
1.) The hardware design for the new Frontend was not solidified enough for the PS5/Xbox Series, so they use the old design.
2.) Microsoft/Sony purposefully picked the old configuration, because it has more geometry/rasterizer pipelines than the new design.
The theoretical triangle/clock throughput is worse on the new RDNA2 design.
4 Geo&Raster pipelines, together with 4 Shader Arrays, map very well to the previous PS4 Pro and Xbox One X designs.

Personally I think option 1 is unlikely and that option 2 is close to the truth.

So far the rasterization performance of RDNA2 shows good results on average; however, I would love to see more coverage there.
I'm not aware of different microbenchmarks looking at the triangle performance.

______

Now, the 3D frontend design is just the 3D frontend design.
The Compute Unit design is another story.

PS5 and Xbox Series are clearly leveraging the physical design work which AMD did for RDNA2, otherwise the PS5 GPU would never run at 2.23GHz and both consoles would consume much more energy.
 
The PS5 and XSX compute unit design and buses actually has more in common with RDNA1, they have features like Raytracing and hardware VRS which they take from RDNA2.
man, the whole point of this thread was to show you guys that this is not the case. I'm kinda baffled how one can come to that conclusion after reading the data presented here.

well, it seems I left too much room for interpretation, so here is the gist of it:

wallpowercomparisonrdorjf6.png

* please note regarding the graph above:
- methodology is described on page 1
- the PS5 does not have Infinity Cache like the 6700XT, which makes its power draw even more impressive
- the PS5 is probably not running at quite as low voltages as my example, for yield reasons, which makes its power draw even more impressive

- in contrast to the previous graphs in this and the Gonzalo thread, this one shows real clocks, not the clock target from Wattman.


-> the important parts of the PS5's GPU can't be RDNA1, because with the V/f characteristics of RDNA1 it would draw 400W+ ... no, actually it wouldn't, because it would be impossible for it to reach those clocks in the first place.
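
for the skeptics, a rough sketch of the standard CMOS dynamic-power relation (power scales roughly with V^2 * f); the voltages below are illustrative assumptions, not measured values:

```python
# relative dynamic power going from (v1, f1) to (v2, f2): (v2/v1)^2 * (f2/f1)
def power_scale(v1: float, f1: float, v2: float, f2: float) -> float:
    return (v2 / v1) ** 2 * (f2 / f1)

# e.g. pushing a ~225W 5700XT from ~1.9 GHz / 1.0 V to 2.23 GHz, assuming
# it needed ~1.2 V to get there (if it could get there at all):
print(round(225 * power_scale(1.0, 1900, 1.2, 2230)))   # ~380 W, GPU alone
```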


AMD themselves presented that the main show is happening in the redesigned CUs:

RX6800_08_575px.jpg


so can we please stop with the RX 5700 referencing when speaking about the consoles? It's plain wrong.




This thread?


sorry man, this was just a cheap attempt to lure you back into this thread :)

I think you've done monumental work with that analysis and the subsequent video. Doesn't it make you a bit sad when some people, with their selective perception, pick out one minuscule part of the thoroughly considered and well-differentiated work you clearly provided in that video? And worse, they even manage to twist it so far out of context that they reverse the original conclusion... argh
 
Last edited:

Sosokrates

Report me if I continue to console war
man, the whole point of this thread was to show you guys that this is not the case. I'm kinda baffled how one can come to that conclusion after reading the data presented here.

well, it seems I left too much room for interpretation, so here is the gist of it:

wallpowercomparisonrdorjf6.png

* please note regarding the graph above:
- methodology is described on page 1
- the PS5 does not have Infinity Cache like the 6700XT, which makes its power draw even more impressive
- the PS5 is probably not running at quite as low voltages as my example, for yield reasons, which makes its power draw even more impressive

- in contrast to the previous graphs in this and the Gonzalo thread, this one shows real clocks, not the clock target from Wattman.


-> the important parts of the PS5's GPU can't be RDNA1, because with the V/f characteristics of RDNA1 it would draw 400W+ ... no, actually it wouldn't, because it would be impossible for it to reach those clocks in the first place.


AMD themselves presented that the main show is happening in the redesigned CUs:

RX6800_08_575px.jpg


so can we please stop with the RX 5700 referencing when speaking about the consoles? It's plain wrong.







sorry man, this was just a cheap attempt to lure you back into this thread :)

I think you've done monumental work with that analysis and the subsequent video. Doesn't it make you a bit sad when some people, with their selective perception, pick out one minuscule part of the thoroughly considered and well-differentiated work you clearly provided in that video? And worse, they even manage to twist it so far out of context that they reverse the original conclusion... argh

I edited the post; could you edit yours too, so I don't get another poster complaining about the same issue? Thanks.
 
Last edited: