
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.
I mean the resolution might be the same, but there will be other differences.
Sure, one system may have slightly better shadows or particles or whatever, and the other might load a bit faster, but in the grand scheme of things those differences are negligible compared to a difference in resolution or FPS. That's my opinion anyway.

I'm not looking for tiny details like those to cling onto to justify my purchase of one system or the other. My justification is games. PS5 does it for me as of right now; Series X probably will once stuff like Everwild, Avowed and so on is released. Ultimately, this is what really matters to me. And again, this is just my opinion; you are welcome to disagree and we can discuss it, I love a good chat.
 

geordiemp

Member
Do you think it is a coincidence that 96 MB is exactly the size required to load 4K and 8K uncompressed textures such as Megascans straight into the L2 cache?

Only a few more days - might take longer to get the real PS5 data though. However, the standard RDNA2 CU block overview should be available on October 28. Good days.

I think the L2 size will be a combination/mix of what's needed for efficient memory bandwidth and how much data is needed for ray tracing efficiency. I don't know those numbers, but AMD will, and I bet Sony does as well, but they have a limited die budget, so compromises.
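For context on the 96 MB question, here's a quick back-of-envelope on uncompressed texture sizes (assuming plain RGBA8 at 4 bytes per texel; real Megascans assets ship in various compressed formats, so this is only a sanity check, not a statement about what Sony actually sized the cache for):

```python
# Uncompressed 2D texture footprint; a full mip chain adds roughly 1/3 on top.
def texture_bytes(width, height, bytes_per_texel=4, with_mips=False):
    base = width * height * bytes_per_texel
    return base * 4 // 3 if with_mips else base

MiB = 1024 * 1024
print(texture_bytes(4096, 4096) // MiB)                  # 4K base level: 64 MiB
print(texture_bytes(4096, 4096, with_mips=True) // MiB)  # 4K with mips: ~85 MiB
print(texture_bytes(8192, 8192) // MiB)                  # 8K base level: 256 MiB
</imports>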
 
Last edited:

Elog

Member
I think the L2 size will be a combination/mix of what's needed for efficient memory bandwidth and how much data is needed for ray tracing efficiency. I don't know those numbers, but AMD will, and I bet Sony does as well, but they have a limited die budget, so compromises.

Would be interesting to understand how memory-intensive a BVH model is in MB (to run intersects against) - I have only seen it written as 'a lot'.

Cerny made a point in the 'Road to...' that they have an I/O lane straight to the cache (highest file priority); for that to happen they must have a texture resolution in mind when sizing the cache. And then there's the UE5 demo, where they made a point of being able to use 8K textures straight out of the library. Might be reading too much into those statements, though, but that L2 cache size adds some noise. That is for sure.
 

onesvenus

Member
AMD picks up what they like afterwards....
How do we know that's the way it goes? AFAIK AMD hasn't ever picked anything that Sony proposed. That's always said in reference to the PS4 Pro, but Cerny himself said that they picked what was already on AMD's roadmap.

So for me there's no basis for saying that AMD picked anything from Sony. What's more realistic is to think that they showed their own roadmap to both Sony and Microsoft and the two chose different things. Saying that finding something that's in the PS5 also in Navi means AMD picked something proposed by Sony to use in their GPUs is a stretch with nothing to base it upon.
 

Bo_Hazem

Banned
A Sony fanboy's wet fart.



I think it's time to change the gasket. May you proceed to the maintenance room, please?

 
XSX has 4 shader arrays, Navi 22 has 4 shader arrays, PS5 has 4 shader arrays. What is wide, exactly, in your opinion?

A longer shader array would be a better description.



Big Navi has 80 CUs and 8 shader arrays rumoured; that is WIDE, and fast.

XSX - 56 CUs (4 disabled) - 4 shader arrays - 14 CUs per array
PS5 - 40 CUs (4 disabled) - 4 shader arrays - 10 CUs per array
5700XT - 40 CUs - 4 shader arrays - 10 CUs per array
Navi21 - 80 CUs - 8 shader arrays - 10 CUs per array


Interesting. More CUs in the same shader array means the same resources (cache, for example) have to feed more CUs at the same time? Maybe a potential bottleneck there.
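The per-array widths quoted above check out; a small sketch of the arithmetic (CU counts are the figures listed in the post, nothing more):

```python
# (physical CUs, disabled CUs, shader arrays) -- per the figures in the thread
gpus = {
    "XSX":    (56, 4, 4),
    "PS5":    (40, 4, 4),
    "5700XT": (40, 0, 4),
    "Navi21": (80, 0, 8),
}
for name, (physical, disabled, arrays) in gpus.items():
    per_array = physical // arrays
    print(f"{name}: {per_array} CUs per array, {physical - disabled} active")
```

Which is the point being made: XSX is the only one of the four feeding 14 CUs from each array's shared resources instead of 10.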
 
Last edited:
Yeah, I don't know much about this stuff right out of the gate. I'm just spouting extrapolation based on what I've attempted to understand.

I, personally, love the SSD. It's a fascinating thing, opinions. The fact we will see lightning fast loading is amazing to me. Conversely, my wife is not a fan. She prefers load times herself.
Just to let you know that it's not just about loading times. The SSD and I/O in the PS5 are so fast that they change how developers can utilize RAM. When you can feed RAM that quickly, you don't need as many "cold" assets sitting in RAM just in case, compared to slower storage methods.
This means you can have more assets for the GPU to draw that have more of an impact on the player's game. Not to mention all the assets being drawn can be higher quality. Both of these take up more space in RAM, but when storage is this fast it doesn't matter that much.
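To put a number on "feeding RAM quickly", a back-of-envelope using Sony's stated ~5.5 GB/s raw SSD figure (compressed throughput would be higher; the frame rates are just illustrative):

```python
def mb_per_frame(throughput_gb_s, fps):
    # How much data the drive can deliver within a single frame's time budget.
    return throughput_gb_s * 1000 / fps  # GB/s -> MB per frame

for fps in (30, 60):
    print(f"at {fps} fps: ~{mb_per_frame(5.5, fps):.0f} MB per frame")
```

Roughly 180 MB per 30 fps frame is a meaningful slice of a 16 GB console's RAM, which is the whole argument for keeping fewer "cold" assets resident.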
 

SlimySnake

Flashless at the Golden Globes
What have I missed? What's this Xbox RDNA1 thing?
MS released the GPU diagram at Hot Chips, and Richard at DF said it's identical to the RDNA 1.0 GPUs.

MS themselves said that the IPC gains over last gen were 25%, which are the same IPC gains AMD touted for RDNA 1.0 over GCN. Meaning either there are no IPC gains for RDNA 2.0 over RDNA 1.0, or MS is using RDNA 1.0 CUs.

Now this is where it gets tricky. Sony is likely using the same RDNA 1.0 GPUs with RT bolted on and built on a smaller node to hit higher clocks. However, because we don't have the PS5 GPU diagram, there is still a teeny tiny bit of hope that the PS5 is true RDNA 2.0 with some kind of IPC gains. I personally think it's the former and the PS5 and XSX are both RDNA 1.0 while borrowing some RDNA 2.0 features like VRS, RT and perf/watt gains.
 

martino

Member
MS released the GPU diagram at Hot Chips, and Richard at DF said it's identical to the RDNA 1.0 GPUs.

MS themselves said that the IPC gains over last gen were 25%, which are the same IPC gains AMD touted for RDNA 1.0 over GCN. Meaning either there are no IPC gains for RDNA 2.0 over RDNA 1.0, or MS is using RDNA 1.0 CUs.

Now this is where it gets tricky. Sony is likely using the same RDNA 1.0 GPUs with RT bolted on and built on a smaller node to hit higher clocks. However, because we don't have the PS5 GPU diagram, there is still a teeny tiny bit of hope that the PS5 is true RDNA 2.0 with some kind of IPC gains. I personally think it's the former and the PS5 and XSX are both RDNA 1.0 while borrowing some RDNA 2.0 features like VRS, RT and perf/watt gains.
to sum up
 

J_Gamer.exe

Member
Hope everybody has their popcorn ready. From the looks of things it seems more and more likely that the "PS5 is RDNA 1.5" FUD was just projection from the green camp. Just wait for confirmation that the PS5 has at least 8 MB of L2 cache.

Also, please remember that Xbox claims 25% IPC gains over the One X, which is GCN. RDNA 2 is 25% over RDNA 1, which in turn is 25% over GCN. Probably going to be moderated over this, and that's ok, but in approximately 10 days we'll know more.

Don't know a lot about this, but....

Are they not different things? IPC (instructions per clock) and performance per watt. AMD said RDNA 2 was 25% over RDNA 1, and RDNA 1 25% over GCN, making about 56% in total performance-per-watt gains once compounded.

But I thought this was different to IPC?
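The two uplifts compound multiplicatively rather than adding, so the totals people throw around are worth checking; a one-liner using the 25% figures as quoted in the thread (note AMD's official generational claims were framed as performance per watt, not IPC):

```python
def compound(*gains):
    # Successive fractional uplifts multiply: (1+g1)*(1+g2)*... - 1
    total = 1.0
    for g in gains:
        total *= 1.0 + g
    return total - 1.0

print(f"{compound(0.25, 0.25):.2%}")  # 1.25 * 1.25 = 1.5625 -> prints 56.25%
```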
 

Bryank75

Banned
I just can't see XSX being RDNA 1 or 1.5 or whatever other than RDNA 2. Seems too big of a lie for Microsoft, and they only hired Todd recently.
They wouldn't technically be lying, because Navi 21 has the CU efficiency of RDNA 1 while also having features of RDNA 2 on the rest of the die. So it is technically part of the RDNA 2 family.

While Navi 22 was apparently developed with Sony, some type of collaboration... but we do not know exactly what is in there, in contrast to the XSX, whose die was shown at Hot Chips. It all remains to be disclosed.

Right now it is speculation but there are signs pointing to it being the case.

The performance shouldn't be too different between them either way, so I wouldn't worry too much.
 

01011001

Banned
I think there is something you could take from the frequency the chips are running at....

Also Cerny's 'Road to PS5' outlined both performance bumps from GCN to RDNA 1 (25%) and again from RDNA 1 to 2 (another 25%).
I doubt he would have put so much effort into outlining that just to say that PS5 was at the RDNA 1 level.

Where did he ever compare RDNA 1 to RDNA 2? It certainly wasn't in the presentation video he did. I also don't remember him personally ever talking about the exact percentage increase in performance of the new architecture. He only talked about transistor density of the CUs.
 

Hashi

Member
No, I was answering your claim about hardware vendors not adding new things that devs have to learn.
And no, Sony doesn't make GPUs or CPUs
Show me where I wrote/said "hardware vendors not adding new things..."

Sony makes CPUs for high-end cameras (8K 120p; the UHC-8300 from 2017 can handle that), TVs (the first CPU in the world that can handle 8K, 10,000 cd/m²), etc. So they do make CPUs and semiconductors. But that's not the point.
 
They wouldn't technically be lying, because Navi 21 has the CU efficiency of RDNA 1 while also having features of RDNA 2 on the rest of the die. So it is technically part of the RDNA 2 family.

While Navi 22 was apparently developed with Sony, some type of collaboration... but we do not know exactly what is in there, in contrast to the XSX, whose die was shown at Hot Chips. It all remains to be disclosed.

Right now it is speculation but there are signs pointing to it being the case.

The performance shouldn't be too different between them either way, so I wouldn't worry too much.
So the RDNA 1.5 FUD that PS5 got back in the day. I don't know, sometimes it just feels too much like karma for me to believe it.

We'll know soon enough on the 28th (I hope Sony or AMD release a die shot of the PS5 and the new GPUs).
 

Bryank75

Banned
So the RDNA 1.5 FUD that PS5 got back in the day. I don't know, sometimes it just feels too much like karma for me to believe it.

We'll know soon enough on the 28th (I hope Sony or AMD release a die shot of the PS5 and the new GPUs).
Well that is exactly it, we need the die shot but there are clues along the way...

like here around 26 minutes: Cerny says that if a PC GPU appears on the PC market around the same time, it is not that Sony took that GPU as a component and simply put it into the PS5, but that the collaboration between the two companies was successful.... he mentions primitive shaders and cache scrubbers (maybe hinting towards Infinity Cache)

 

01011001

Banned
I think there is something you could take from the frequency the chips are running at....

Also Cerny's 'Road to PS5' outlined both performance bumps from GCN to RDNA 1 (25%) and again from RDNA 1 to 2 (another 25%).
I doubt he would have put so much effort into outlining that just to say that PS5 was at the RDNA 1 level.
Where did he ever compare RDNA 1 to RDNA 2? It certainly wasn't in the presentation video he did. I also don't remember him personally ever talking about the exact percentage increase in performance of the new architecture. He only talked about transistor density of the CUs.

So you're gonna ignore this? Point me to where he ever compares GCN to RDNA 1 and then to RDNA 2.
 

SlimySnake

Flashless at the Golden Globes
Also Cernys 'Road to PS5' outlined both performance bumps from GCN to RDNA1 (25%) and again from RDNA1 to 2 (another 25%).
Nah, Cerny used GCN 1.0 to compare the IPC gains. MS used Polaris or GCN 2.0. GCN 1.0 was used in the PS4, and the GCN 1.0 to Polaris gains were roughly 25%, while from Polaris to RDNA 1.0 it's 50%, so that's what Cerny was referring to.

MS was comparing their new chip to the X1X when they made the 25% claim.
 

Bryank75

Banned
Nah, Cerny used GCN 1.0 to compare the IPC gains. MS used Polaris or GCN 2.0. GCN 1.0 was used in the PS4, and the GCN 1.0 to Polaris gains were roughly 25%, while from Polaris to RDNA 1.0 it's 50%, so that's what Cerny was referring to.

MS was comparing their new chip to the X1X when they made the 25% claim.
If it has full RDNA 2 efficiency, why is it not using its party trick? Why is it not tuned higher, like the leaked full RDNA 2 cards for PC?
 

01011001

Banned
Well here is Lisa Su.... at the end she says XSX is based on RDNA but never says '2'.

Which is odd.

I believe it was her breakdown where she says 25% between RDNA 1 and 2... I may have gotten mixed up between the two conferences.


so you are saying you pulled that 👇 out of your ass?

Also Cerny's 'Road to PS5' outlined both performance bumps from GCN to RDNA 1 (25%) and again from RDNA 1 to 2 (another 25%).
I doubt he would have put so much effort into outlining that just to say that PS5 was at the RDNA 1 level.

and now you just sweep that under the rug and change the subject completely... great
 
Last edited:

kyliethicc

Member
Well here is Lisa Su.... at the end she says XSX is based on RDNA but never says '2'.

Which is odd.

I believe it was her breakdown where she says 25% between RDNA 1 and 2... I may have gotten mixed up between the two conferences.

In that video she says the PS5 is "next gen Radeon" and that Xbox is "next gen Radeon RDNA", so I don't think that means anything. They're both just custom, RDNA-based architectures.
 

onesvenus

Member
Show me where I was writing/speak "hardware vendors not adding new things..."
I didn't say that you said hardware vendors do not add new things.
I said that you claimed that new architectures are not thrown out there for developers to learn, and I gave two examples of exactly that.
Maybe I didn't understand what you said here, but that's what you seemed to imply:
It would be a bit strange to create architecture and throw it for programmers to learn to write code on it

Sony make CPU for hi-end cameras (8K 120p; uhc-8300 from 2017 can handle that), Tv's (first in world CPU that can handle 8K, 10.000 cd/m2) etc. soo, they make CPUs and semiconductors. But thats not the point.
I thought we were talking about the CPUs and GPUs used in consoles 🤦‍♂️
 

SlimySnake

Flashless at the Golden Globes
Watch Dogs is 4K 30 fps on both PS5 and XSX. Source: their producer on Reddit during an AMA.



I think this is proof that the difference between the two consoles is simply not significant enough for any kind of framerate boost or even resolution boost. An 18% teraflops advantage would likely buy you about 5 fps, so if the PS5 version is 30 fps, the XSX would be 35 fps. But since both are likely over 30 fps, it doesn't really matter; they will be locked to 30 fps anyway.

So all this warring, 12 vs 10 tflops, 12 vs 8 or 9 tflops, was for nothing. I have yet to see a single game with any kind of advantage on the Xbox Series X. Everyone seems to be going for literally identical framerates and resolutions. We aren't even getting an 18% difference in resolution like I believed we would, because the PS5 seems capable enough to do native 4K at 30 fps with ray tracing. I expected 1800p vs native 4K, which is a 44% boost in pixel count, but we are getting literally identical versions of games.

We will see if the PS5 versions have framedrops or not, but right now things are not looking good for the XSX. They put so much effort into marketing having the most powerful console ever, and it seems to have zero upgrades over the PS5. For now anyway. Maybe things will change next holiday season when next-gen Battlefield, COD and Ass Creed arrive. But I think that will be too late. DF comparisons at the start of the gen have far more impact than a year or two down the line, when everyone who gives a shit about this stuff has already made their purchases.
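A quick sketch of the paper math behind this (12.15 and 10.28 TF are the commonly quoted spec-sheet figures, and the scaling assumes, generously, that performance is linear in compute):

```python
xsx_tf, ps5_tf = 12.15, 10.28
gap = xsx_tf / ps5_tf - 1
print(f"compute gap: {gap:.0%}")                     # ~18%
print(f"30 fps scaled by the gap: {30 * (1 + gap):.1f} fps")

# Pixel counts: 1800p -> 2160p is a ~44% increase in pixels pushed,
# even though it's only +20% on each axis.
px_1800p, px_2160p = 3200 * 1800, 3840 * 2160
print(f"2160p vs 1800p pixels: {px_2160p / px_1800p:.2f}x")
```

In other words, an 18% compute gap is smaller than one full resolution step, which is consistent with both versions landing on the same target.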
 

01011001

Banned
If it has full RDNA 2 efficiency, why is it not using its party trick? Why is it not tuned higher, like the leaked full RDNA 2 cards for PC?

Have you seen Sony's cooling solution? If you did, and you use your brain for a second, you would know why they didn't. It's not rocket science.
The heat sink of the PS5 is gigantic and it uses liquid metal.
The Series X's heatsink isn't half as big and it doesn't use liquid metal.

There's your reason.

They reached their goal of 12 TF without the need to make their system the biggest console ever produced, and without using half of the world's copper reserves to cool it.

By the same logic, why do both systems not run their CPUs at 4.7 GHz? Zen 2 can do that... why aren't they? ARE THEY NOT RUNNING ZEN 2 AFTER ALL?
 
Last edited:

Elog

Member
Both consoles have RDNA 2 GPUs from a power-efficiency / fab-node point of view.

What we are discussing are specific feature sets, i.e. transistor layouts, for the various parts. There are some question marks surrounding the XSX CUs. The October 28th presentation should address that, since it is very likely we get the RDNA 2 CU block overview for the PC GPUs then.

It does not answer the question of which CUs the PS5 has, though. Please note that Sony did CU-level customisations for the PS4 and PS4 Pro that have (AFAIK) never been revealed in detail (the CUs are bigger in mm² than they should be).
 

Bryank75

Banned
In that video she says the PS5 is "next gen Radeon" and says Xbox is "next gen Radeon RDNA" so I don't think that means anything. They're both just custom, RDNA-based, but custom architectures.
Oh yes, I know... my main questions are: why did Xbox go with 56 CUs at a lower frequency when they could have gone with fewer CUs at a higher frequency, or just increased the frequency on their CUs to provide a bigger delta in performance?

The other thing is the NDAs: why are AMD and Sony so quiet on Navi 22 and the PS5 die?

Also Infinity Cache and primitive shaders.... there seem to be some outstanding items without clarification.
 
Watch Dogs is 4K 30 fps on both PS5 and XSX. Source: their producer on Reddit during an AMA.

I think this is proof that the difference between the two consoles is simply not significant enough for any kind of framerate boost or even resolution boost. An 18% teraflops advantage would likely buy you about 5 fps, so if the PS5 version is 30 fps, the XSX would be 35 fps. But since both are likely over 30 fps, it doesn't really matter; they will be locked to 30 fps anyway.

So all this warring, 12 vs 10 tflops, 12 vs 8 or 9 tflops, was for nothing. I have yet to see a single game with any kind of advantage on the Xbox Series X. Everyone seems to be going for literally identical framerates and resolutions. We aren't even getting an 18% difference in resolution like I believed we would, because the PS5 seems capable enough to do native 4K at 30 fps with ray tracing. I expected 1800p vs native 4K, which is a 44% boost in pixel count, but we are getting literally identical versions of games.

We will see if the PS5 versions have framedrops or not, but right now things are not looking good for the XSX. They put so much effort into marketing having the most powerful console ever, and it seems to have zero upgrades over the PS5. For now anyway. Maybe things will change next holiday season when next-gen Battlefield, COD and Ass Creed arrive. But I think that will be too late. DF comparisons at the start of the gen have far more impact than a year or two down the line, when everyone who gives a shit about this stuff has already made their purchases.

To be fair, even if the system is just 5% more powerful, they can still market it as being the most powerful and not be wrong.

🤷‍♂️

Oh yes, I know... my main questions are: why did Xbox go with 56 CUs at a lower frequency when they could have gone with fewer CUs at a higher frequency, or just increased the frequency on their CUs to provide a bigger delta in performance?

Maybe that's just the limit of their cooling solution. It's the only thing I can think of.
 
Last edited:
Listen bud, you stood by while 52 points of fucking FUD were spread by your friends and comrades in the Xbox community on this forum. Not once did you try to set them straight, not once did you ask them to fucking apologize for their bullshit in TURNING THE COMMUNITY TOXIC.

Don't play the fucking victim card with me, or give me your holier-than-thou bullshit. You're upset because other people's technical views are not lining up with yours.

This is a speculation thread, 10 days to go TICK TOCK.
 