
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

SlimySnake

Flashless at the Golden Globes
If it was full RDNA 2 efficiency, why is it not using its party trick? Why is it not tuned higher like the leaked full RDNA2 cards for PC?
because it has far more CUs.

think of it like this. PS5 only has 22% faster clocks at 36 CUs, but xsx has 44% more CUs. a bigger chip will always be clocked slower in consoles where you have to watch the TDP. Desktop gpus dont give a fuck about power draw. the 3080 has ~136 CUs worth of shader processors and comes in around 320w on its own. nvidia can afford to not give two fucks about the tdp because PC gamers dont.

i can promise you, you are not getting 1.825 ghz on 52 CUs in a console with RDNA 1.0 efficiency. it's flat out impossible. the 5700xt is 1.825 ghz ingame and pulls 118w on its own. a gpu with 44% more CUs would draw roughly 44% more power. thats about 170w for the gpu alone. xsx is pulling 165w for the entire console. you are not getting those thermals if the xsx wasnt using the 50% rdna 2.0 perf/watt efficiency gains.
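That estimate can be sketched in a few lines. This is a naive model (my assumption, not from the post): GPU power scaling linearly with CU count at the same clock and process, with the 118w 5700 XT figure taken from the post above.

```python
# Naive power-scaling sketch: assumes GPU power grows linearly
# with CU count at the same clock, voltage and process node.
RX_5700XT_WATTS = 118.0  # GPU-only draw at ~1.825 GHz (figure from the post)
RX_5700XT_CUS = 36
XSX_CUS = 52

cu_ratio = XSX_CUS / RX_5700XT_CUS            # ~1.44x the CUs
rdna1_estimate = RX_5700XT_WATTS * cu_ratio   # ~170 W on RDNA 1 efficiency

# AMD's claimed ~50% perf/watt gain for RDNA 2 means the same
# work costs roughly 2/3 of the power.
rdna2_estimate = rdna1_estimate / 1.5         # ~114 W

print(f"RDNA 1-style estimate: {rdna1_estimate:.0f} W")
print(f"with RDNA 2 perf/watt: {rdna2_estimate:.0f} W")
```

Which is exactly the argument: the RDNA 1 number alone would blow past the whole console's measured draw.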

[image: power-draw results chart]
 

SlimySnake

Flashless at the Golden Globes
To be fair even if the system is just 5% more powerful they can still market it as being the most powerful and not be wrong.

🤷‍♂️



Maybe that's just the limit of their cooling solution. It's the only thing that I can think of.
sure. i dont mind them marketing it. i saw some folks here link ps4 marketing posters calling the ps4 the most powerful next gen console. but i dont remember sony making a big deal out of it in the February PSM reveal, or at E3 2013. If they mentioned it, i dont remember, because they didnt try and shove that marketing point down our throats like MS has done in every conference, and every goddamn interview, and tweet. And we all know how many times aaron greenberg tweets and how phil gives out interviews like halloween candy.

If you are going to make such a big deal about it then you better be prepared to show the goods. they did this with the x1x and you know what, they did deliver far higher resolutions than the pro. so far, it seems like empty vapid marketing. and it might even be considered misleading, because if both versions are identical then whats so powerful about it? especially if the ps5 versions load faster.
 

Well they have Digital Foundry for that. I'm sure whatever differences the multiplats have Digital Foundry won't hesitate to point them out.
 
I guess. I mean, look at how Sony handled their thermals and if that could work in the XSX's form factor. Also, Sony did consider having a dual board system when designing the console.
And a vapour chamber... The point is that (imo) all decisions on the cooling and clocking of the XSX point towards the form factor. They are already paying for the vapour chamber (which is more expensive than the heatsink+LM), so why not go for higher clocks?

Nevermind, I think I know why, it’s the fixed clocks. They can’t run that high while fixed. A shame, the variable frequencies would let them get a few more TFs.
 

welsay01

Neo Member
Oh yes, I know... my main questions are... why did Xbox go with 56 CUs at a lower frequency when they could have gone with fewer CUs at a higher frequency, or just increased the frequency on their CUs to provide a bigger delta in performance?

The other thing is the NDAs: why are AMD and Sony so quiet on Navi 22 and the PS5 die?

Also Infinity Cache and primitive shaders... there seem to be some outstanding items without clarification.

I think the reason for more CUs was because they're also using the same chips for xCloud, so they could run four simultaneous XB1 sessions per chip.
 

Aceofspades

Banned
watch dogs is 4k 30 fps on both ps5 and xsx. Source is their producer on reddit during an AMA.

[image: reddit AMA screenshot]


I think this is proof that the difference between the two consoles is simply not significant enough for any kind of framerate boost or even resolution boost. 18% in teraflops would likely buy you 5 fps. so if the ps5 version is 30 fps, xsx would be 35 fps. but since both are likely over 30 fps, it doesnt really matter since they will be locked to 30 fps anyway.
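As a sanity check on that framerate math, here is the same back-of-the-envelope calculation, assuming (unrealistically, as the post itself implies) that fps scales linearly with teraflops; 12.15 and 10.28 TF are the marketed figures:

```python
# Naive fps scaling with compute -- real games are rarely purely
# GPU-compute-bound, so treat this as an upper bound on the gap.
XSX_TFLOPS = 12.15
PS5_TFLOPS = 10.28

advantage = XSX_TFLOPS / PS5_TFLOPS - 1.0   # ~0.18, i.e. ~18% more compute
ps5_fps = 30
xsx_fps = ps5_fps * (1.0 + advantage)       # ~35.5 fps

print(f"compute advantage: {advantage:.0%}")
print(f"30 fps on PS5 -> {xsx_fps:.1f} fps on XSX, all else equal")
```

So even in the best case the gap is about 5 fps, which a 30 fps cap erases entirely.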

so all this warring. 12 vs 10 tflops. 12 vs 8 tflops or 9 tflops was for nothing. i have yet to see a single game that has any kind of advantage on the xbox series x. everyone seems to be going for literally identical framerates and resolutions. we arent even getting an 18% difference in resolution like i believed we would get, because the ps5 seems to be capable enough to do native 4k at 30 fps with ray tracing. i expected 1800p vs native 4k, which is a 44% boost in pixel count, but we are getting literally identical versions of games.

we will see if the ps5 versions have framedrops or not, but right now things are not looking good for the xsx. they put so much effort into marketing having the most powerful console ever and it seems to have zero upgrades over the ps5. for now anyway. maybe things will change next holiday season when next gen battlefield, cod and ass creed arrive. But i think that will be too late. DF comparisons at the start of the gen have far more of an impact than a year or two down the line, when everyone who gives a shit about this stuff has already made their purchases.

We have said since DAY 1 of knowing both specs - and even before that - that the specs difference between the two machines is the smallest in HISTORY. People just love to ignore that.
 
And a vapour chamber... The point is that (imo) all decisions on the cooling and clocking of the XSX point towards the form factor. They are already paying for the vapour chamber (which is more expensive than the heatsink+LM), so why not go for higher clocks?

Nevermind, I think I know why, it’s the fixed clocks. They can’t run that high while fixed. A shame, the variable frequencies would let them get a few more TFs.

They designed their system around variable power consumption and not fixed. That would explain why they went with much lower clocks. Would also explain why they don't have something like Smartshift.
 

kyliethicc

Member
Oh yes, I know... my main questions are... why did Xbox go with 56 CUs at a lower frequency when they could have gone with fewer CUs at a higher frequency, or just increased the frequency on their CUs to provide a bigger delta in performance?

The other thing is the NDAs: why are AMD and Sony so quiet on Navi 22 and the PS5 die?

Also Infinity Cache and primitive shaders... there seem to be some outstanding items without clarification.
Because of the chip's secondary design for xCloud servers. Design efficiency, cost savings. 2 functions, 1 chip.

Xbox One has 14 CUs, 12 active. 14x4=56. And 56 CUs -2 DCUs disabled for yields still leaves 4 groups of the 12 active CUs they need to run 4 Xbox One games on 1 chip. 52 CUs @ 1.825 GHz got them to their 12 TF goal. They only really need 48 active CUs for their Xcloud chips if they just wanna run up to 4 Xbox One games, so maybe they could even use some of the chips that fail to yield 52 useable CUs, saving even more money.

And Xbox One games need 5 GB of RAM each. 5x4=20 GB of RAM needed. Why does the XSX use a 320 bit bus, which is used for 10 RAM chips? Well, 10 ram chips, 2 GB each, gets them the 20 GB of RAM for running 4 Xbox One games on 1 chip. So the same SOC as in the console can be setup with 20 GB instead of 16 GB when used for XCloud servers.

And it gets them just enough bandwidth, 10 GB @ 560 GB/s, for their split setup in the Series X console. It's a compromise. They didn't want to spend the extra money on 4 more GB of GDDR6 for every Series X console.
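The arithmetic in that post checks out; a quick sketch (the 14 Gbps GDDR6 pin speed is the published Series X memory speed, the rest are the figures from the post):

```python
# Checking the xCloud math from the post above.
XB1_ACTIVE_CUS = 12          # Xbox One: 14 CUs on die, 12 active
SESSIONS_PER_CHIP = 4

cus_needed = XB1_ACTIVE_CUS * SESSIONS_PER_CHIP   # 48 <= 52 active on XSX
ram_needed_gb = 5 * SESSIONS_PER_CHIP             # 5 GB per XB1 game -> 20 GB

# A 320-bit bus means 10 GDDR6 chips (32 bits each); at 14 Gbps per pin:
bandwidth_gbs = 320 * 14 / 8                      # bits -> bytes: 560 GB/s

print(cus_needed, ram_needed_gb, bandwidth_gbs)   # 48 20 560.0
```

So 52 active CUs and 10 memory chips cover four Xbox One sessions with headroom, which is the whole "2 functions, 1 chip" argument.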
 

geordiemp

Member
because it has far more CUs.

think of it like this. PS5 only has 22% faster clocks at 36 CUs, but xsx has 44% more CUs. a bigger chip will always be clocked slower in consoles where you have to watch the TDP. Desktop gpus dont give a fuck about power draw. the 3080 has ~136 CUs worth of shader processors and comes in around 320w on its own. nvidia can afford to not give two fucks about the tdp because PC gamers dont.

i can promise you, you are not getting 1.825 ghz on 52 CUs in a console with RDNA 1.0 efficiency. it's flat out impossible. the 5700xt is 1.825 ghz ingame and pulls 118w on its own. a gpu with 44% more CUs would draw roughly 44% more power. thats about 170w for the gpu alone. xsx is pulling 165w for the entire console. you are not getting those thermals if the xsx wasnt using the 50% rdna 2.0 perf/watt efficiency gains.

[image: power-draw results chart]

The XSX chip is only 360 mm² vs 308 mm² (~17% bigger); you're overestimating the power draw for the extra CUs.

The XSX is not RDNA1, but there is a story here about the 1.825 GHz which will eventually come out.

The supposed Navi 21 Lite is 56 CUs and 1.9 GHz, so it being a PC part, we will find out why it's clocked low compared to Navi 21, which is 80 CUs and supposedly hitting 2.4 GHz.

There is more to this and we will find out on the 28th anyway - my guess is Navi 21 Lite is mainly DUV 7nm and less EUV litho... as no PC part leaves power on the table... ever.

How that relates to XSX is unknown of course, but we get more information to debate.
 

THE:MILKMAN

Member
my guess is its mainly DUV 7nm and less EUV litho... as no PC part leaves power on the table... ever.
DF in their March article already confirmed the SoC is just enhanced 7nm and doesn't include any EUV:

DF said:
The processor is fabricated on an enhanced rendition of TSMC's 7nm process, which we understand rolls up a bunch of improvements to the technology, right up to but not including the new EUV-based 7nm+.

I would be a little surprised if PS5's isn't the same.
 

geordiemp

Member
DF in their March article already confirmed the SoC is just enhanced 7nm and doesn't include any EUV:



I would be a little surprised if PS5's isn't the same.

The number of EUV layers, and how many critical layers, will be on a customer-by-customer basis and is kept vague for reasons, but it's more than binning.

Why some chips are going 2.5 GHz and some 1.9 GHz must be down to more than liquid metal cooling, but we will know more soon enough. It could also be propagation delay and logic... but idk.

If the Navi 21 Lite 56 CU variant in PC space does not have a 2.4 GHz option, then we will know.
 

kyliethicc

Member
Zen 3 and the consoles are not 7nm EUV. I bet AMD is saving that newer node for just Big Navi, plus it's too expensive for the consoles to use. And AMD can only make so many PC/console chips on a given process in a given year at TSMC, cuz supply chain, etc.
 

Mahavastu

Member
i hate the fact Series X UI isnt 4k, but i also hate when games are 1440p over 1800p/4k
A UI consists mostly of single-colored text and single-colored icons with small details.
It is much easier to spot the lower resolution here with the naked eye than with upscaled photos, movies or games. Remember when you first saw text on a "retina" display?

But then, I do not know if this is a real problem for non-nerds in the real world...
 

THE:MILKMAN

Member
The number of EUV layers, and how many critical layers, will be on a customer-by-customer basis and is kept vague for reasons, but it's more than binning.

Why some chips are going 2.5 GHz and some 1.9 GHz must be down to more than liquid metal cooling, but we will know more soon enough. It could also be propagation delay and logic... but idk.

If the Navi 21 Lite 56 CU variant in PC space does not have a 2.4 GHz option, then we will know.

It could be as simple as Microsoft needed to go wide for those 4 instances of Xbox One for xCloud and BC? That meant lower clocks.

I wonder if/when we get another PS5 deep dive from Mark Cerny once any AMD NDAs are up and which outlets to look out for?
 

pasterpl

Member
So just to summarise the fud going around for the last couple of months:

first ps5 was rdna1
then ps5 was overheating
then Xsx was overheating
then xsx became rdna1

these consoles are shambles ;)

...

feels like we're going in circles

good thing that ms is acquiring sega and ubisoft this Wednesday

/s
 

geordiemp

Member
It could be as simple as Microsoft needed to go wide for those 4 instances of Xbox One for xCloud and BC? That meant lower clocks.

I wonder if/when we get another PS5 deep dive from Mark Cerny once any AMD NDAs are up and which outlets to look out for?

I was not talking about the XSX, I was talking about the PC part Navi 21 Lite, which is 56 CUs and 1.9 GHz, and whether it has a higher clock variant, and if not... why. That PC part at 1.9 is not a choice or a TIM selection... there will be a technical reason.
 

TLZ

Banned
The PS5 version is leaked, unofficial. So we better wait for an official one.

DMC5 is running with RT on PS5, with no RT on XSX so far for example. It's all over the place, with PS5 having the same or better version so far, but FPS comparisons gonna be interesting, very interesting.
Hmm.. I'm going to wait for these comparisons before buying any multiplat games.
 

gmoran

Member
MS released the GPU diagram at Hot Chips and Richard at DF said it's identical to the RDNA 1.0 GPUs.

MS themselves said that the IPC gains over last gen were 25%, which are the same IPC gains AMD touted for RDNA 1.0 over GCN. Meaning either there are no IPC gains for RDNA 2.0 over RDNA 1.0, or MS is using RDNA 1.0 CUs.

Now this is where it gets tricky. Sony is likely using the same RDNA 1.0 GPUs with RT bolted on and built on a smaller node to hit higher clocks. However, because we dont have the PS5 GPU diagram, there is still a teeny tiny bit of hope that PS5 is true RDNA 2.0 with some kind of IPC gains. I personally think it's the former and the PS5 and XSX are both RDNA 1.0 while borrowing some RDNA 2.0 features like VRS, RT and perf/watt gains.

Going through all of the twitter threads on this and it appears the original twitter post was speculative based on driver details and on the assumption that XSX GPU is navi 21.

So interesting, but I think we can be pretty sure the XSX and PS5 GPUs are RDNA 2 based, just as both teams have said.

Looking forward to 28 October.
 
So just to summarise the fud going around for the last couple of months:

first ps5 was rdna1
then ps5 was overheating
then Xsx was overheating
then xsx became rdna1

these consoles are shambles ;)

...

feels like we're going in circles

good thing that ms is acquiring sega and ubisoft this Wednesday

/s

I found it really funny when that guy tried to convince people the PS5 would drop to 5TFs.

😂
 

SlimySnake

Flashless at the Golden Globes
Going through all of the twitter threads on this and it appears the original twitter post was speculative based on driver details and on the assumption that XSX GPU is navi 21.

So interesting but I think we can be pretty sure XSX and PS5 GPU's are RDNA 2 based just as both teams have said.

Looking forward to 28 October.
its not just the teams. the lead AMD engineer on RDNA said on a leaked conference call that both were rdna 2.0 back in February. it was posted here.
 

THE:MILKMAN

Member
I was not talking about the XSX, I was talking about the PC part Navi 21 Lite, which is 56 CUs and 1.9 GHz, and whether it has a higher clock variant, and if not... why. That PC part at 1.9 is not a choice or a TIM selection... there will be a technical reason.

Honestly I'm getting confused with all the rumoured variants! I thought Navi 21 Lite = Arden = XSX?

But lets say you're right and the PC part is 1.9GHz. What are you thinking this means?
 

RaZoR No1

Member
because it has far more CUs.

think of it like this. PS5 only has 22% faster clocks at 36 CUs, but xsx has 44% more CUs. a bigger chip will always be clocked slower in consoles where you have to watch the TDP. Desktop gpus dont give a fuck about power draw. the 3080 has ~136 CUs worth of shader processors and comes in around 320w on its own. nvidia can afford to not give two fucks about the tdp because PC gamers dont.

i can promise you, you are not getting 1.825 ghz on 52 CUs in a console with RDNA 1.0 efficiency. it's flat out impossible. the 5700xt is 1.825 ghz ingame and pulls 118w on its own. a gpu with 44% more CUs would draw roughly 44% more power. thats about 170w for the gpu alone. xsx is pulling 165w for the entire console. you are not getting those thermals if the xsx wasnt using the 50% rdna 2.0 perf/watt efficiency gains.

[image: power-draw results chart]
IMO this post alone should be evidence enough.
We need to keep in mind, the PS5 has to use Smartshift and allocate some of its power to the GPU or CPU, and probably cannot max out both of them at the same time (depending on the scenario). The XSX will (according to MS) always run at full speed.
Therefore it makes sense to downclock the GPU to gain some free watts.
Otherwise the XSX would probably need a PSU around 400-450 watts and need to use Smartshift like the PS5. Now it has just a 300+ watt PSU.
At least this is what I think from the efficiency aspect.
Performance wise... we will see. They run on different APIs, and currently Sony devs are a bit more talented at squeezing as much power out of the console as possible.
 

geordiemp

Member
Honestly I'm getting confused with all the rumoured variants! I thought Navi 21 Lite = Arden = XSX?

But lets say you're right and the PC part is 1.9GHz. What are you thinking this means?

Navi 21 Lite will also be RDNA2, and it's a PC part... so the thoughts begin...

Well it does not mean it's RDNA1 logic, as the silly tweets and stuff going around suggest. But that's the important point, as the narrative was painted as "get to the power you want and stop". I dont believe that for 1 second.

EUV and DUV are interchangeable by process layer and etch layer; you might need 4-6 EUV enhanced litho layers to make a difference around gates, but it will be top secret TSMC stuff, so we will never know fully unless AMD spills the beans.

No consoles or parts are full EUV until smaller nodes, but there is some enhancement going on here to get those frequencies.
 

AeneaGames

Member
Bullshit.

The community put up with a lot of shit from you, and if it all turns sour next week there needs to be a real apology.

This “funny“ fish thing is done. A bunch of you have turned the community toxic.

Seriously now if next week doesn’t go as you planned, how do you expect to make it up to them?

Huh, what now?
Why should fishy need to apologise? Did I miss something?
What will happen this week according to him?

Oh and speaking of toxic, have we ever seen any apology from any toxic posters before? Not sure why one should expect one now. Besides, I have no idea what he's supposed to have done that was so toxic...
 