
UL releases 3DMark Mesh Shaders Feature test, first results of NVIDIA Ampere and AMD RDNA2 GPUs

Ah yes, let's belittle and downplay the underdog for succeeding and introducing competition into the industry. And that Intel statement just reeks of cringe. "X would be better than Y if only they were better." Like, um, of-fucking-course?

These necrotic fanboys I swear...
There’s also the opposite: there are people who hype AMD, expecting them to crush Nvidia generation after generation.
 
Last edited:

GHG

Member
The more I read this thread the more convinced I become that the extreme Xbox fanboys are the console gaming equivalent of flat earthers.

First it was no hardware RT in the PS5, then it was no RDNA 2 and now it's mesh shading?

All this hullabaloo and the consoles are pretty much performing like for like. Just accept reality, enjoy your console and move on man.
 

MonarchJT

Banned
The more I read this thread the more convinced I become that the extreme Xbox fanboys are the console gaming equivalent of flat earthers.

First it was no hardware RT in the PS5, then it was no RDNA 2 and now it's mesh shading?

All this hullabaloo and the consoles are pretty much performing like for like. Just accept reality, enjoy your console and move on man.
And finally you arrived (insulting, as usual). We're just missing james sawyer, Elios, geordie and thelastsword, and then the full warrior team is complete lol
 
Last edited:

FireFly

Member
MS even proved it when they said the increase in CU perf per cycle from the Xbox One X is 25% (that is the GCN to RDNA 1 increase), while the GCN to RDNA 2 CU perf-per-cycle increase is around 40%.
AMD attributed the IPC increase on RDNA2 PC parts to the infinity cache.
 

Genx3

Member
The more I read this thread the more convinced I become that the extreme Xbox fanboys are the console gaming equivalent of flat earthers.

First it was no hardware RT in the PS5, then it was no RDNA 2 and now it's mesh shading?

All this hullabaloo and the consoles are pretty much performing like for like. Just accept reality, enjoy your console and move on man.
Games aren't taking advantage of a lot of these features yet.
I think there is one game that uses VRS on XSX and none with mesh shaders.
 

GHG

Member
And finally you arrived (insulting, as usual). We're just missing james sawyer, Elios, geordie and thelastsword, and then the full warrior team is complete lol

b27966140db68d0621628f2309f8a443.gif
 

assurdum

Banned
Where can I read about this 25% per CU?
36 CUs on PS5 = 58 CUs of the previous generation. From Eurogamer (source: Road to PS5):
In fact, the transistor density of an RDNA 2 compute unit is 62 per cent higher than a PS4 CU, meaning that in terms of transistor count at least, PlayStation 5's array of 36 CUs is equivalent to 58 PlayStation 4 CUs. And remember, on top of that, those new CUs are running at well over twice the frequency.
 
Last edited:

assurdum

Banned
Did you read his beautiful entry into the thread? ...indeed, I was kind.
I mean, we've been talking for days about mesh shaders not being on PS5, not full RDNA 2 and so on, when there are patents about it available for everyone, and as always the Xbox fans are fully engaged in this matter when it just takes a quick search on the net to understand it. The other guy, Christ, is literally annoying as hell: he pushes the argument to extremes, calls personal conjecture facts, facts and facts, and everyone who tries to argue otherwise is too emotionally invested and a fanboy. Come on.
 
Last edited:

Fredrik

Member
Sony PS5 engineer: the PS5 GPU is like an RDNA 1.5 feature set. More than RDNA 1, but not all the RDNA 2 features.
MS engineer: Xbox Series consoles are the only consoles with the whole RDNA 2 feature set.

PlayStation fanboys:
The PS5 has RDNA 3 features and everything MS added to the Xbox Series, because MS engineers give Sony all their secrets, plus Cerny's genius is beyond AMD's, which is why Cerny developed RDNA 1 to be more powerful than RDNA 2.

Back to the topic at hand.
Mesh shaders definitely improve performance on all GPUs, so we can at least expect Xbox Series consoles to provide some type of boost once games start implementing mesh shaders.
Lol thank you for joking a bit in this apparently deadly serious thread 😅

On topic, are mesh shaders a new thing devs need to rewrite their engines to take advantage of? Or could they be applied to UE4, Snowdrop, Frostbite, Red Engine, etc.?
 

assurdum

Banned
Lol thank you for joking a bit in this apparently deadly serious thread 😅

On topic, are mesh shaders a new thing devs need to rewrite their engines to take advantage of? Or could they be applied to UE4, Snowdrop, Frostbite, Red Engine, etc.?
Joking? He's practically trolling: PS5 engineers supposedly claimed it's just RDNA 1.5, all fans said it's RDNA 3 when that's absolutely false, and wait, mesh shaders are the new Jesus coming to Series X thanks to MS's magical engineers. That's fanboy argumentation. Mesh shaders were already on Nvidia before AMD's RDNA 2; it's crazy to argue that MS invented the wheel while claiming the PS5 is off this track because of "RDNA 1.5".
 
Last edited:

MonarchJT

Banned
36 CUs on ps5 = 52 CUs of previous generation.
And 52 RDNA 2 CUs? lol
I mean, we've been talking for days about mesh shaders not being on PS5, not full RDNA 2 and so on, when there are patents about it available for everyone, and as always the Xbox fans are fully engaged in this matter when it just takes a quick search on the net to understand it. The other guy, Christ, is literally annoying as hell, pushing the argument to extremes, when he pushed his personal conjecture as facts, facts and facts, calling everyone who says otherwise too emotionally invested and a fanboy. Really? Only the others are emotionally invested fanboys??
My god, what's wrong with you... where the fuck did I say that there's nothing like the mesh shader in the PS5?? I said they modified an RDNA 1 GPU, putting in their own version of the exact same things you can find in RDNA 2 GPUs... and we have to wait to see which version performs better.
Find me the post where I said there's no equivalent of the mesh shader in the PS5 GPU.
 
Last edited:

assurdum

Banned
And 52 RDNA 2 CUs? lol
My god, what's wrong with you... where the fuck did I say that there's nothing like the mesh shader in the PS5?? I said they modified an RDNA 1 GPU, putting in their own version of the exact same things you can find in RDNA 2 GPUs... and we have to wait to see which version performs better.
Find me the post where I said there's no equivalent of the mesh shader in the PS5 GPU.
Do you have even a minimal idea of what RDNA 1 means? There are RDNA 1 instruction sets on Series X too. If the PS5 were RDNA 1.5 (whatever you think that is), ray tracing would be literally impossible, as would many other things.
 
Last edited:

longdi

Banned
I would say it's the other way around. A GPU can contain many HW features waiting to be exposed via the DirectX API. In the case of the PS5 GPU, where there are no DX limits or MS patronage, you can and certainly will go far beyond the DX12 options. SFS is a fancy MS name. A bidirectional sparse virtual texture streaming engine could be another. BSVTSE by SONE. Wonderful.

No need to make them up as trump cards/saviours hiding behind fancy names/code words either ;).

Yes, but there is always a cutoff for which HW features get used in every product design.

It seems the changes are noticeable enough for AMD/MS to assign new names to identify the newer HW features in RDNA 2. So that's why I find it awkward to handwave those features away as fancy names. 🤷‍♀️
 
Last edited:

MonarchJT

Banned
Do you have even a minimal idea of what RDNA 1 means? There are RDNA 1 instruction sets on Series X too. If the PS5 were RDNA 1.5 (whatever you think that is), ray tracing would be literally impossible, as would many other things.
In fact, to have RT they did their own flavour of it, called the "intersection engine"; it isn't the one present in RDNA 2 GPUs... even if it works in basically the same manner.
 
Last edited:

llien

Member
Here we go:

"Unreal Engine 5 PS5 demo runs happily on current-gen graphics cards and SSDs

An engineer from Epic China has confirmed performance on a current-gen laptop that seems higher than the PS5's 30fps"


Also, subsequent discussions of what the engineer said shed light on how misleading Sweeney was.
A laptop with a 2080 in it is a "current gen laptop", I guess.
Did everyone reading this comment realize how misleading "current gen laptop" is?
 

Panajev2001a

GAF's Pleasant Genius
Can you show me where those claims are so I can read about them?


XOX (GCN) to XSX (RDNA2)
obInsmu.jpg


AMD GCN to RDNA1
TGEViz8.jpg


AMD RDNA1 to RDNA2:

Performance per watt from RDNA 1 to RDNA 2 increases by about ~54% (the previous jump was closer to ~50%); the clock rate also increased, and the pure IPC increase has, I think, been estimated at ~7-10%.
 

MonarchJT

Banned

No he didn't. It's custom RDNA 2, like Series X.
He did (two times, in different topics), and at this point you are trolling. There are entire articles about it.

grafika-v-ps5-je-niekde-medzi-244522-9186518-640.jpg


I read it for you:
XOX (GCN) to XSX (RDNA2)
obInsmu.jpg


AMD GCN to RDNA1
TGEViz8.jpg


AMD RDNA1 to RDNA2:

Performance per watt from RDNA 1 to RDNA 2 increases by about ~54% (the previous jump was closer to ~50%); the clock rate also increased, and the pure IPC increase has, I think, been estimated at ~7-10%.
Probably related to the missing L3 cache, and something like SmartShift?
 

assurdum

Banned
He did (two times, in different topics), and at this point you are trolling. There are entire articles about it.

grafika-v-ps5-je-niekde-medzi-244522-9186518-640.jpg


I read it for you:

Probably related to the missing L3 cache, and something like SmartShift?
Good Lord. Take a vacation. Seriously. Why do you persist in posting only what fits your narrative, ignoring the rest? Why?
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
He did (two times, in different topics), and at this point you are trolling. There are entire articles about it.

grafika-v-ps5-je-niekde-medzi-244522-9186518-640.jpg


I read it for you:

Probably related to the missing L3 cache, and something like SmartShift?
L3 tweaks for IPC would be about the Ryzen 2 to 3 updates, or do you mean the Infinity Cache? The Infinity Cache did help sustain the IPC boost for the CUs, but I do not think it is the only factor. SmartShift does not factor into IPC calculations: it is a way to balance the power consumption budget between the CPU and GPU, to allow the latter to keep higher clocks for longer periods of time.

So far the data we have is matching the leaker’s tweet. Both MS and Sony played a bit loose with the RDNA2 based definition :).
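To make the SmartShift point concrete, here is a toy sketch of the shared power-budget idea: unused CPU power headroom is handed to the GPU so it can hold higher clocks. The numbers and the policy are purely illustrative assumptions, not AMD's actual algorithm.

```python
# Toy illustration of a SmartShift-style shared power budget.
# All figures here are made up for illustration; this is not AMD's algorithm.

TOTAL_BUDGET_W = 200.0  # hypothetical combined CPU+GPU power budget

def split_budget(cpu_demand_w: float, cpu_floor_w: float = 30.0) -> tuple[float, float]:
    """Give the CPU what it asks for (above a floor) and shift the rest to the GPU."""
    cpu_w = min(max(cpu_demand_w, cpu_floor_w), TOTAL_BUDGET_W)
    gpu_w = TOTAL_BUDGET_W - cpu_w
    return cpu_w, gpu_w

# Light CPU load: the GPU sustains higher clocks on the spare power.
print(split_budget(40.0))   # (40.0, 160.0)
# Heavy CPU load: the GPU budget shrinks instead of the whole chip throttling.
print(split_budget(90.0))   # (90.0, 110.0)
```

Note this only shifts the power envelope around; it says nothing about IPC, which is exactly why SmartShift drops out of any IPC comparison.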
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
I read it for you... "inevitably ended in the midst of a fierce controversy"

What did you expect him to do? To fix it, he simply said that it cannot be classified as RDNA 1, 2, 3 or 4. It was inevitable, given what was already happening on the web and seeing how you react.

I am seeing how you react and how you are faced with two reasonable options: believe him both times or neither... and you are choosing your own third interpretation/conspiracy theory because... reasons ;).
 
Last edited:

MonarchJT

Banned
L3 would be about Ryzen, or do you mean the Infinity Cache? The Infinity Cache did help sustain the IPC boost for the CUs, but I do not think it is the only factor. SmartShift does not factor into IPC calculations: it is a way to balance the power consumption budget between the CPU and GPU, to allow the latter to keep higher clocks for longer periods of time.

So far the data we have is matching the leaker’s tweet. Both MS and Sony played a bit loose with the RDNA2 based definition :).
Oooh, we are already reaching a common point of agreement. You see?
 

FireFly

Member
36 CUs on PS5 = 58 CUs of the previous generation. From Eurogamer (source: Road to PS5):
In fact, the transistor density of an RDNA 2 compute unit is 62 per cent higher than a PS4 CU, meaning that in terms of transistor count at least, PlayStation 5's array of 36 CUs is equivalent to 58 PlayStation 4 CUs. And remember, on top of that, those new CUs are running at well over twice the frequency.
That's the transistor density per CU, not the performance per CU.
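For what it's worth, the Eurogamer figure is pure transistor arithmetic, and it does check out as a transistor-count equivalence (not a performance one), using the quoted 62% increase:

```python
# Sanity-check the Eurogamer claim: PS5's 36 CUs expressed as PS4-CU transistor budgets.
# The 1.62 factor is the quoted "62 per cent higher" transistor count per CU.
ps5_cus = 36
transistors_per_cu_ratio = 1.62

ps4_equivalent_cus = ps5_cus * transistors_per_cu_ratio
print(round(ps4_equivalent_cus))  # 58
```

So the 58 figure says nothing about per-CU performance, only about how many transistors are being spent.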
 

Panajev2001a

GAF's Pleasant Genius
Oooh, we are already reaching a common point of agreement. You see?

I always stated as much; you are the one who called BS on the AMD leaker that said MS's solution is not exactly the pure RDNA 2 you imagine. So, by your definition, apparently the XSX is also between RDNA 1 and RDNA 2 ;).

Also, each design could have included features that were contributed to desktop cards released at around the same time as the consoles, such as the Big Navi designs (Cerny directly stated as much in his presentation, not speaking for other consoles), as well as the cache scrubbers, which were PS5-only.
 

MonarchJT

Banned
I always stated as much; you are the one who called BS on the AMD leaker that said MS's solution is not exactly the pure RDNA 2 you imagine. So, by your definition, apparently the XSX is also between RDNA 1 and RDNA 2 ;).

Also, each design could have included features that were contributed to desktop cards released at around the same time as the consoles, such as the Big Navi designs (Cerny directly stated as much in his presentation, not speaking for other consoles), as well as the cache scrubbers, which were PS5-only.
Uhm no, what I'm thinking and saying is that MS waited for an RDNA 2 GPU and took the L3 cache (Infinity something) out of the CUs, probably to cut its cost, and I think there also isn't any form of SmartShift, because of the sustained performance they were looking for. This explains the waiting, the late devkit timing, the features present in the GPU that also correspond in nomenclature to those of the new AMD GPUs, and the claims of the MS marketing team.
Meanwhile, on the opposite side, I think Sony, in order to speed up R&D, heavily modified an RDNA 1 GPU by adding their own version of pretty much everything that was missing to make it a "full RDNA 2". This explains the AMD Reddit leak, devkits released a year before MS's, the different nomenclature of the features, the different patents that reproduce the same features present in RDNA 2 (such as the geometry engine and the intersection engine), the lack of support for ML, and the confusion of the PS5 engineer Leonardi in trying to explain it without creating panic among the fanboys.

Is the name RDNA 1 or 2 important? No; in highly customized GPUs like these, absolutely not. Exactly as Leonardi said, the PS5 GPU is customized to a level where it isn't worth calling it 1, 2, 3 or 4. So yes, if we want to put it that way, both MS and Sony have been loose with their definitions of RDNA 2. My only interest is to see whether Cerny's versions of the RDNA 2 architecture enhancements will perform better or worse than those present in the RDNA 2 GPUs.
 
Last edited:

FireFly

Member
A 50% performance-per-watt boost on PS5, eh. Not claimed on Series X.
Not sure what relevance that has to what I said. (Performance per watt, transistor density and IPC are all different things.)

With that said, RDNA had a claimed performance-per-watt advantage of 50% over GCN, while the XSX delivers 2x the performance per watt of the One X. So whether or not it is as power efficient as the PS5 (at equivalent clocks), huge strides have been made over RDNA 1.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Uhm no, what I'm thinking and saying is that MS waited for an RDNA 2 GPU and took the L3 cache (Infinity something) out of the CUs, probably to cut its cost, and I think there also isn't any form of SmartShift, because of the sustained performance they were looking for. This explains the waiting, the late devkit timing, the features present in the GPU that also correspond in nomenclature to those of the new AMD GPUs, and the claims of the MS marketing team.
Meanwhile, on the opposite side, I think Sony, in order to speed up R&D, heavily modified an RDNA 1 GPU by adding their own version of pretty much everything that was missing to make it a "full RDNA 2". This explains the AMD Reddit leak, devkits released a year before MS's, the different nomenclature of the features, the different patents that reproduce the same features present in RDNA 2 (such as the geometry engine and the intersection engine), the lack of support for ML, and the confusion of the PS5's Leonardi in trying to explain it without creating panic among the fanboys.

Is the name RDNA 1 or 2 important? No, absolutely not, and exactly as Leonardi said, both GPUs are highly customized. So yes, if we want to put it that way, both MS and Sony have been loose with their definitions of RDNA 2. My only interest is to see whether Cerny's versions of the RDNA 2 architecture enhancements will perform better or worse than those present in the RDNA 2 GPUs.

Why? What is the point of comparing PS5 enhancements to PC GPUs ;)?

About the time difference between the devkit releases (which meant that the software side was a bit more stable and battle-ready on one side)... you seem to forget that:

1.) Sony likely started a bit earlier (think PS4 Pro launch date vs XOX launch date, a year of difference)

2.) MS imposed on themselves the transition from XDK to GDK

You are looking at that and dreaming up a scenario that makes one console look best in your eyes ("they had to wait to get the full RDNA2+ GPU"). You are succeeding in convincing me that the XSX is also in flux between RDNA 1/2/3 ;). Again, you're dodging how the IPC improvement MS confirmed matches the one AMD cited as the GCN to RDNA 1 IPC improvement (and how it matches the AMD leaker tweets you criticised)...

Seriously, MS has historically chosen features that most closely aligned with the DX roadmap, customised what they needed to fit their OS structure (a virtualised design) and expose lower-level features to devs, and invested their budget in other aspects, such as way more standard CUs than the competition. PS4, PS4 Pro, and likely PS5 bet on a different approach that balanced familiarity with potential to be exploited as developers familiarise themselves with the HW.

This can have its side effects too, as a naive/quickly developed PS4 Pro title will fare a lot worse on it than on the XOX (given the same dev budget on both).
 
Last edited:
What did you expect him to do? To fix it, he simply said that it cannot be classified as RDNA 1, 2, 3 or 4. It was inevitable, given what was already happening on the web and seeing how you react.

Then you showed up and claimed that the PS5 GPU is RDNA 1. Even Mark Cerny doesn't know that. He should hang himself


Not until you stop saying that one of the PS5's principal engineers didn't say the GPU is between RDNA 1 and 2.

You'll have a rough gen, then
 
Last edited:

3liteDragon

Member
Want to see me destroy your entire argument that the Principal Software Engineer on PS5 confirmed it has VRS?



And then here is that same Principal Software Engineer on PS5 agreeing with *gasp* me...




awkward makeup GIF


So remove that tweet from your arsenal. He already confirmed back in August of last year that his tweet was not at all a confirmation of VRS on PS5.

You didn't disprove shit or "destroy" any of my arguments. The console has VRS no matter how hard you try to spin it; keep whining about it.
 
Last edited:

ethomaz

Banned
AMD attributed the IPC increase on RDNA2 PC parts to the infinity cache.
I don't remember that... the Infinity Cache is an external cache; it is not inside the CU.
IPC is related to all GPU parts... AMD and MS were talking about CU perf. per clock cycle.

But if you have a source for that, please share it with us.

BTW, if that is true, then RDNA 2 CUs are just RDNA 1 CUs at the arch level without any improvement... that makes sense too.
 
Last edited:

ethomaz

Banned
Not sure what relevance that has to what I said. (Performance per watt, transistor density and IPC are all different things.)

With that said, RDNA had a claimed performance-per-watt advantage of 50% over GCN, while the XSX delivers 2x the performance per watt of the One X. So whether or not it is as power efficient as the PS5 (at equivalent clocks), huge strides have been made over RDNA 1.
It is +50% from GCN to RDNA and another +50% from RDNA to RDNA 2.

GCN to RDNA 2 is +125%, so MS's claim of 2x fits with the 125% increase.

What doesn't fit is MS's claim of only a 25% perf-per-clock increase from GCN, which is basically what AMD claims for GCN to RDNA.
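The compounding above can be checked directly, taking the two ~50% per-generation figures at face value:

```python
# Compound the two claimed per-generation perf-per-watt gains.
gcn_to_rdna = 1.50    # +50% (AMD's claimed GCN -> RDNA gain)
rdna_to_rdna2 = 1.50  # +50% (AMD's claimed RDNA -> RDNA 2 gain)

gcn_to_rdna2 = gcn_to_rdna * rdna_to_rdna2
print(gcn_to_rdna2)         # 2.25, i.e. +125% over GCN
print(gcn_to_rdna2 >= 2.0)  # True: consistent with MS's "2x the One X" claim
```

Note the gains multiply rather than add: two +50% steps give +125%, not +100%, which is why the 2x claim fits comfortably.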
 
Last edited:

Md Ray

Member
In fact, to have RT they did their own flavour of it, called the "intersection engine"; it isn't the one present in RDNA 2 GPUs... even if it works in basically the same manner.
"Intersection engine" is basically the "Ray Accelerator" or "RA". AMD/Sony engineers (maybe even MS engineers) were probably using the term "intersection engine" internally during the design process, hence why Mark Cerny referred to it as the "intersection engine" in his talk back in March 2020.

amd-ray-accelerator.png


In late Oct 2020 it became "Ray Accelerator", which is quite simply AMD's marketing term for the "intersection engine" inside the RDNA 2 CU. Similar to their marketing term "GameCache" (remember that?) for what was basically just a large L3 cache at the introduction of the Zen 2 architecture.
 
Last edited: