
UL releases 3DMark Mesh Shaders Feature test, first results of NVIDIA Ampere and AMD RDNA2 GPUs

martino

Member
From here on I will simply not read anything you post. There are too many dangling modifiers and so much misused terminology that it wastes everyone's time trying to decipher what point you are actually trying to get across.
To be fair, there is no proof that any third-party game uses mesh shaders, SFS, or the PS5 equivalent, and even if one did, it doesn't bring anything that meaningful to the table versus PC hardware without them.
So "no" is the more probable answer.
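For what it's worth, on PC a game can at least detect whether mesh shaders and Sampler Feedback are exposed at all before choosing a code path. A minimal D3D12 sketch only (the helper name is mine; it assumes an already-created ID3D12Device):

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: query the optional-feature tiers that mesh shaders and Sampler Feedback
// (the PC cousin of SFS) are reported under, so an engine can pick a fallback path.
void QueryGeometryFeatures(ID3D12Device* device,
                           bool& meshShadersSupported,
                           bool& samplerFeedbackSupported)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    meshShadersSupported = samplerFeedbackSupported = false;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7))))
    {
        meshShadersSupported     = options7.MeshShaderTier      >= D3D12_MESH_SHADER_TIER_1;
        samplerFeedbackSupported = options7.SamplerFeedbackTier >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
    }
}
```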
 

MonarchJT

Banned
From here on I will simply not read anything you post. There are too many dangling modifiers and so much misused terminology that it wastes everyone's time trying to decipher what point you are actually trying to get across.
The point was that the complete feature set of the XSX hardware still isn't even being used. It's okay, skit, do what is best for you.
 

MonarchJT

Banned
He spread FUD and, after being called out, tried to move the goalposts and change what he said... it's basically a circle.
That is a known tactic used by Xbox fans on Twitter and forums since the beginning of the generation (e.g. no RT > software RT > weaker RT... they keep moving the goalposts).
It is basically useless to reply... time will deliver the needed reality check.
This is just an unfair accusation, and honestly, if we're being frank, all your posts about the PS5 hardware are speculation, since we have nothing official. You clearly have nothing in hand proving how the Geometry Engine works or that any of the phantom customizations exist. You just guess, blindly championing your favourite hardware in every thread. And you talk about FUD.
 

skit_data

Member
To be fair, there is no proof that any third-party game uses mesh shaders, SFS, or the PS5 equivalent, and even if one did, it doesn't bring anything that meaningful to the table versus PC hardware without them.
So "no" is the more probable answer.
What I meant was that monarch lumped features together and said they weren't used in games, only to immediately backtrack on VRS being used. It's inconsistent, and therefore inaccurate enough to be considered word salad.
 

MonarchJT

Banned
What I meant was that monarch lumped features together and said they weren't used in games, only to immediately backtrack on VRS being used. It's inconsistent, and therefore inaccurate enough to be considered word salad.
lol... I assure you that knowing VRS alone managed to give up to a 14% boost would hurt ethomaz more than not knowing whether it was used at all.
 

Allandor

Member
There is nothing wrong with these results.



Source: https://steamcommunity.com/app/223850/discussions/0/5671690972168798553/

My test:

Before
[screenshot: result before the driver update]


After
[screenshot: result after the driver update]





Your results fixed:
  • AMD RDNA2: 1762%
  • NVIDIA Ampere: 702%
  • NVIDIA Turing (RTX): 409%
  • NVIDIA Turing: 244%
Ampere is in the middle of RDNA2 and Turing.

:messenger_kissing_smiling:
The percentages are quite distracting from the fact that, with the new driver, RDNA2 is almost on par with Ampere ;)
AMD is just worse on the Mesh Shader "off" results and doesn't seem to optimize that path either.

But what we can see is that Mesh Shaders can save a lot of resources. The classic path seems to be plainly CPU-bottlenecked, while the mesh shader path allows much "more" geometry to get onto the screen (actually less, because the invisible geometry is no longer processed).

Yes, this is just a feature test, and like so many feature tests in the past, it only shows part of the story. In this case the scene would never have been built like that, but far fewer resources are now used to render it.

This just demonstrates how much more efficient the new consoles can be in the future (even without "mesh shaders", as Sony has something similar). But since those features are also usable on PC, consoles will still become "outdated" and have to scale back details (though at a much more reasonable price point). And because those features are brand new, I guess it will still take a few years until games really use them. If we just look at the state of DX12/Vulkan in games today... there are still many games that run better in DX11, or at least have problems with DX12/Vulkan. And we've only got a handful of RT titles... by the way, where is Minecraft RT for the Xbox?
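To make the "classic path vs. mesh shader path" distinction concrete, here is a minimal C++/D3D12 sketch (not from the benchmark; the pipeline-state objects, counts and function name are illustrative assumptions). The point is simply that the mesh-shader path dispatches one thread group per meshlet, which can reject invisible geometry before any per-vertex work, whereas the classic indexed draw pushes every triangle through the front end:

```cpp
#include <d3d12.h>
#include <cstdint>

// Sketch only: contrasts the classic indexed-draw path with the mesh-shader path.
// "classicPso", "meshPso" and the counts are hypothetical placeholders; state such
// as the root signature, index buffer and topology is assumed to be set already.
void DrawSceneGeometry(ID3D12GraphicsCommandList6* cmdList,
                       ID3D12PipelineState* classicPso,
                       ID3D12PipelineState* meshPso,
                       uint32_t indexCount,
                       uint32_t meshletCount,
                       bool useMeshShaders)
{
    if (!useMeshShaders)
    {
        // Classic path: every index goes through the fixed-function front end, so
        // back-facing and off-screen triangles still cost vertex-shader work.
        cmdList->SetPipelineState(classicPso);
        cmdList->DrawIndexedInstanced(indexCount, 1, 0, 0, 0);
    }
    else
    {
        // Mesh-shader path: one thread group per meshlet; the (amplification/)mesh
        // shader can reject whole meshlets before any raster work is issued.
        cmdList->SetPipelineState(meshPso);
        cmdList->DispatchMesh(meshletCount, 1, 1);
    }
}
```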
 
Someone who actually knows what they're talking about: LGamer2 is a tech ENTHUSIAST to say the least, and he's followed by DF's Alex and NX Gamer as well.




Principal Software Engineer on PS5.




Wired article from October 2019.


@geordiemp made a thread based on this, showing patents filed by Cerny and a few engineers from Naughty Dog talking about their own custom primitive shading and VRS techniques for PS5. It was clear back then from your Twitter posts, and it's clear as day now, that you have no clue what you're talking about or you're deliberately trolling. Just because Sony doesn't use the same marketing name as MS for a feature they have in their console, or because they went with their own custom solutions for certain things unlike MS, doesn't mean they're "missing" those features and suddenly become "inferior" to Xbox's PC RDNA 2 implementations.


Seems you're getting it wrong and misinterpreting the whole meaning. Some of it even directly contradicts what you're trying to say and instead supports my argument that Primitive Shaders doesn't go as far as Mesh Shaders does. Even the slide you use clearly mentions RDNA 2, which we already know supports Mesh Shaders on PC. That isn't a slide about primitive shaders.

So I'll just address the tweet comparing VRS to the Geometry Engine. First, why are they even being compared? That is just a cop-out to excuse the PS5's lack of the feature.

What VRS and the Geometry Engine are supposed to do are two entirely different things that should never be compared; they aren't in competition. They are meant to complement one another because they do two entirely different jobs. VRS is just one more tool for optimizing performance and visual quality.

Yes, the Geometry Engine can give you bigger gains with proper planning, but none of that makes VRS pointless because, again, it does something else entirely. If your hardware supports both VRS as well as primitive shaders you could have gotten even bigger performance wins than if you only supported one. As such, if you support both VRS Tier 2 (as Series X does) along with Mesh Shaders your potential performance benefit outweighs that of a design that supports Primitive shaders but lacks VRS.

This argument is like saying RX 5700XT has zero need for VRS because it already has the Geometry Engine which supports Primitive Shaders.

So I will just end by saying that VRS and Mesh Shaders can be used simultaneously to complement one another. The use of one does not somehow render the other pointless, as their purposes are separate. Nothing in that post changes the fact that Primitive Shaders and Mesh Shaders are not the same. Primitive Shaders are simply not quite as powerful/flexible as Mesh Shaders, which is why AMD moved to Mesh Shaders, and also why Nvidia uses Mesh Shaders.

People have to stop treating that single Geometry Engine as a magic box that is the solution to everything, because it isn't. From the start Mark Cerny told everyone what it was capable of. Everything he stated could be done with Primitive Shaders on PS5 matches precisely what AMD said could be done with Primitive Shaders when they were originally introduced in Vega, with slightly different terminology, but it amounts to the same thing.
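As an aside, the "complementary, not competing" point maps directly onto the D3D12 API: variable rate shading and mesh shaders are set up independently on the same command list. A minimal sketch, assuming Tier 2 VRS plus Tier 1 mesh-shader hardware and an already-recorded ID3D12GraphicsCommandList6 (the function name and counts are mine):

```cpp
#include <d3d12.h>

// Sketch: VRS controls how densely pixels are shaded, DispatchMesh controls how
// geometry is expanded/culled -- enabling one does not preclude the other.
void RecordPass(ID3D12GraphicsCommandList6* cmdList, UINT meshletGroups)
{
    // VRS: request 2x2 coarse shading for this pass, letting a screen-space
    // shading-rate image (bound elsewhere via RSSetShadingRateImage) override it.
    const D3D12_SHADING_RATE_COMBINER combiners[2] =
        { D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, D3D12_SHADING_RATE_COMBINER_OVERRIDE };
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);

    // Mesh shaders: one thread group per meshlet, independent of the shading rate.
    cmdList->DispatchMesh(meshletGroups, 1, 1);
}
```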
 

John Wick

Member
It's based on RDNA1 and 2, in the words of the PS5 graphics lead engineer, before he went into full damage control because he understood that what he said could be used against the PS5... RDNA2 wasn't even out, and that is why the PS5 lacks ML, hardware VRS, and SFS and has its own versions of the SAME things.

...The RDNA3 talk is just enormous wet-dream BS from the internet, on the same level as misterxmedia.
What nonsense are you spewing? No one claims that PS5 has RDNA3, so I don't understand why you've even mentioned it except to further your warrior nonsense.
AMD claims the PS5 GPU is RDNA2-based, but again, you know better. The Sony engineer stated the PS5 GPU isn't RDNA1, 2 or 3; it's based on RDNA2 with more features, minus one. RDNA2 was finalised years ago, or do you think it was done in June 2020? The silicon bring-up shows this.
 

John Wick

Member
Seems you're getting it wrong and misinterpreting the whole meaning. Some of it even directly contradicts what you're trying to say and instead supports my argument that Primitive Shaders doesn't go as far as Mesh Shaders does. Even the slide you use clearly mentions RDNA 2, which we already know supports Mesh Shaders on PC. That isn't a slide about primitive shaders.

So I'll just address the tweet comparing VRS to the Geometry Engine. First, why are they even being compared? That is just a cop-out to excuse the PS5's lack of the feature.

What VRS and the Geometry Engine are supposed to do are two entirely different things that should never be compared; they aren't in competition. They are meant to complement one another because they do two entirely different jobs. VRS is just one more tool for optimizing performance and visual quality.

Yes, the Geometry Engine can give you bigger gains with proper planning, but none of that makes VRS pointless because, again, it does something else entirely. If your hardware supports both VRS as well as primitive shaders you could have gotten even bigger performance wins than if you only supported one. As such, if you support both VRS Tier 2 (as Series X does) along with Mesh Shaders your potential performance benefit outweighs that of a design that supports Primitive shaders but lacks VRS.

This argument is like saying RX 5700XT has zero need for VRS because it already has the Geometry Engine which supports Primitive Shaders.

So I will just end by saying that VRS and Mesh Shaders can be used simultaneously to complement one another. The use of one does not somehow render the other pointless, as their purposes are separate. Nothing in that post changes the fact that Primitive Shaders and Mesh Shaders are not the same. Primitive Shaders are simply not quite as powerful/flexible as Mesh Shaders, which is why AMD moved to Mesh Shaders, and also why Nvidia uses Mesh Shaders.

People have to stop treating that single Geometry Engine as a magic box that is the solution to everything, because it isn't. From the start Mark Cerny told everyone what it was capable of. Everything he stated could be done with Primitive Shaders on PS5 matches precisely what AMD said could be done with Primitive Shaders when they were originally introduced in Vega, with slightly different terminology, but it amounts to the same thing.
Oh look, another expert? How do you know Sony doesn't have their own version of VRS? Patents filed by Cerny and Sony point to this. How do you know Sony's solution isn't better? I mean, it could be worse, but I don't know. So come on, Mr. I-know-better-than-actual-engineers-and-devs, explain how you know better.
 

Fredrik

Member
Why did this turn into a console war???

I’m just here to cry in a corner because I once had a 3080 Gaming X Trio preordered for 8990SEK and now they’re not available for 10990SEK
😭
 

John Wick

Member
This is just an unfair accusation, and honestly, if we're being frank, all your posts about the PS5 hardware are speculation, since we have nothing official. You clearly have nothing in hand proving how the Geometry Engine works or that any of the phantom customizations exist. You just guess, blindly championing your favourite hardware in every thread. And you talk about FUD.
Pot, kettle, black? Post the evidence that the PS5 GPU is RDNA1-based.
 

John Wick

Member
Why did this turn into a console war???

I’m just here to cry in a corner because I once had a 3080 Gaming X Trio preordered for 8990SEK and now they’re not available for 10990SEK
😭
It's quite easy if you start at the beginning..................
longdi:
"where is our resident amd boy?
RTX > Rdna2.

But mesh shaders in SX, wow at that improvement.....it may be a bigger win than the TFLOPS difference....."
 

SlimySnake

Flashless at the Golden Globes
After looking at the poor VRS implementation in Dirt 5 and Halo Infinite, I don't want VRS anywhere near Sony exclusives.

Looks like the Primitive Shaders in the PS5 Geometry engine can do what mesh shaders are supposed to do.

 

Fredrik

Member
It's quite easy if you start at the beginning..................
longdi:
"where is our resident amd boy?
RTX > Rdna2.

But mesh shaders in SX, wow at that improvement.....it may be a bigger win than the TFLOPS difference....."
Lol okay I see now. And I’m going to firmly press X for doubt that MS/AMD can simply release a driver update and XSX will get a 500% fps boost.
 

John Wick

Member
Nah, God Cerny saw how shitty AMD's implementation was and pulled his own perfect mesh shader out of his ass and told AMD to put it in their APU.


edit:


See, Cerny knows better than AMD's engineers.
You mean those genius engineers that still don't have an answer to Nvidia's Tensor cores? Nvidia is on a larger process node and still spanks AMD.
You do realise a lot of the features in GPUs come from devs? Or do you think these engineers work in isolation without any feedback or suggestions?
 

John Wick

Member
After looking at the poor VRS implementation in Dirt 5 and Halo Infinite, I don't want VRS anywhere near Sony exclusives.

Looks like the Primitive Shaders in the PS5 Geometry engine can do what mesh shaders are supposed to do.


If you read what RTG was saying, the PS5 GE has the ability to cull a lot earlier in the pipeline.
 
Oh look, another expert? How do you know Sony doesn't have their own version of VRS? Patents filed by Cerny and Sony point to this. How do you know Sony's solution isn't better? I mean, it could be worse, but I don't know. So come on, Mr. I-know-better-than-actual-engineers-and-devs, explain how you know better.
Because Sony has said no such thing, and we already had the PS5 tech deep dive showing every major new GPU hardware feature. DF has also directly asked Sony, and they've yet to simply say yes. It isn't an NDA issue at this stage; it's just not there. Now, VRS can of course be done in software, it's just much more performant with proper hardware support.
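For reference, the hardware-versus-software distinction is something an engine can query at startup on D3D12 hardware. A minimal sketch, assuming an already-created ID3D12Device (the function name is mine):

```cpp
#include <windows.h>
#include <d3d12.h>

// Sketch: ask the driver which VRS tier (if any) the GPU exposes. Engines with no
// hardware tier fall back to software approaches, e.g. rendering selected passes
// at reduced resolution, which is typically less efficient.
bool HasHardwareVrsTier2(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                           &options6, sizeof(options6))))
        return false;

    // Tier 1 = per-draw shading rates only; Tier 2 adds per-primitive rates and
    // screen-space shading-rate images (the tier the Series X exposes).
    return options6.VariableShadingRateTier >= D3D12_VARIABLE_SHADING_RATE_TIER_2;
}
```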
 

MonarchJT

Banned
He's an expert. Even with Sony and AMD stating it is RDNA2, to him it's still RDNA1 with customisations. Let him carry on; he'll probably sleep better..
I remain convinced by what the principal (edit: thanks Fredrik) graphics engineer said before the forced damage control started, and by what the timing of the devkits and the facts show.

[screenshots of the engineer's tweets]


So no AMD mesh shaders, no RDNA2 Tier 2 VRS, no ML, no SFS, but look, Sony really wanted to spend money on R&D to develop its own variants of the exact same things (such as the Geometry Engine that mimics mesh shaders, or the intersection engine that mimics AMD's ray tracing) that it would have had by simply opting for a full RDNA2 GPU. So, as you're saying, they took a full RDNA2-based GPU, removed the ray tracing engine, removed the ML support, and changed the shader engine to make their own... same with the RT engine. Are you serious? How can you believe such a thing? If you can't connect such simple dots, please don't go around accusing people of making up stories. As I said before, it's just an Occam's razor situation.
And the one who said that the PS5 is a mix between RDNA1 and 2 was the console's own principal graphics engineer, not me.
I would accept these conclusions if the customizations made by modifying an RDNA2 GPU had been revolutionary enough to be worth it (and it's clear there is nothing revolutionary, or Sony and Cerny would be shouting it from the rooftops; instead all the marketing and the presentation centred on the SSD)... not simply dismantling an RDNA2 GPU only to reassemble it by copying (as demonstrated by several patents) the same architecture later.

Not to mention that you are saying the PS5's principal graphics engineer was lying.
 

Fredrik

Member
And the one who said that the PS5 is a mix between RDNA1 and 2 was the lead graphics engineer of the console, not me.
I don't know shit, but "principal graphics engineer at Sony Interactive Entertainment Europe" seems like a different role than "lead graphics engineer of the console".
 

MonarchJT

Banned
I don’t know shit but principal graphics engineer at Sony Interactive Entertainment Europe seems like a different role than lead graphics engineer of the console.
You are right, he is a principal, not the lead... but I'm sure he knows what he is talking about.
 

Calverz

Member
Why did this turn into a console war???

I’m just here to cry in a corner because I once had a 3080 Gaming X Trio preordered for 8990SEK and now they’re not available for 10990SEK
😭
The usual suspects.
I literally settled on a 3070, which I should get in a couple of days, so this sounds like great news. Can't wait to get started with it.
 
I remain convinced by what the principal (edit: thanks Fredrik) graphics engineer said before the forced damage control started, and by what the timing of the devkits and the facts show.

[screenshots of the engineer's tweets]


So no AMD mesh shaders, no RDNA2 Tier 2 VRS, no ML, no SFS, but look, Sony really wanted to spend money on R&D to develop its own variants of the exact same things (such as the Geometry Engine that mimics mesh shaders, or the intersection engine that mimics AMD's ray tracing) that it would have had by simply opting for a full RDNA2 GPU. So, as you're saying, they took a full RDNA2-based GPU, removed the ray tracing engine, removed the ML support, and changed the shader engine to make their own... same with the RT engine. Are you serious? How can you believe such a thing? If you can't connect such simple dots, please don't go around accusing people of making up stories. As I said before, it's just an Occam's razor situation.
And the one who said that the PS5 is a mix between RDNA1 and 2 was the console's own principal graphics engineer, not me.
I would accept these conclusions if the customizations made by modifying an RDNA2 GPU had been revolutionary enough to be worth it (and it's clear there is nothing revolutionary, or Sony and Cerny would be shouting it from the rooftops; instead all the marketing and the presentation centred on the SSD)... not simply dismantling an RDNA2 GPU only to reassemble it by copying (as demonstrated by several patents) the same architecture later.

Not to mention that you are saying the PS5's principal graphics engineer was lying.

You are right, he is a principal, not the lead... but I'm sure he knows what he is talking about.

So, he said the PS5 is missing one thing from RDNA 2, but you counted... let's see... four features.

This is one of the best examples of how Xbox fans here and on Blue think, and of how engineers later proved them wrong. Poor Xbox fans. They will never learn.

[screenshot of the Twitter exchange]
 

Clear

CliffyB's Cock Holster
Computer components are not Lego. They don't just snap together uniformly; how they are connected is a big deal on both the software and the hardware level. Most, if not all, PC benchmarking and the "common truths" gleaned from it rely upon the underlying conceit that all else is equal outside of the parts that are the focus of the test. And for the narrow purpose of judging which part "gives the most bang for the buck", that works great.

Unfortunately though, when everything else gets switched up -like with consoles- you can't just assume the same holds true.
I remain convinced by what the lead graphics engineer said before the forced damage control started, and by what the timing and the facts show.

[screenshots of the engineer's tweets]


So no AMD mesh shaders, no RDNA2 Tier 2 VRS, no ML, no SFS, but look, Sony really wanted to spend money on R&D to develop its own variants of the exact same things (such as the Geometry Engine that mimics mesh shaders, or the intersection engine that mimics AMD's ray tracing) that it would have had by simply opting for a full RDNA2 GPU. So, as you're saying, they took a full RDNA2-based GPU, removed the ray tracing engine, removed the ML support, and changed the shader engine to make their own. Are you serious? How can you believe such a thing? If you can't connect such simple dots, please don't go around accusing people of making up stories. As I said before, it's just an Occam's razor situation.
And the one who said that the PS5 is a mix between RDNA1 and 2 was the lead graphics engineer of the console, not me.

No. For a start, he's a principal engineer at SIE in Europe, basically a senior coder at the London studio, so not even close to the senior graphics engineer on the console, and as he states, he's simply presenting information that was publicly available, not trying to spin it for any particular effect. It's also very telling that it's just these two tweets that you choose to show and not the ones that followed, where the penny dropped about how his words were going to get twisted and he backpedalled rapidly!

In his Road to PS5 talk, Cerny laid out how they were developing their own tech and presenting it for potential use in future AMD devices, an assertion backed up by the numerous patents filed by him over the course of the project, many of which have been proven to have been implemented. I mean, it's kind of obvious that the PS5 is far closer to Big Navi in terms of functionality than to Navi, based on the evidence of stuff running on it.

And then of course there's the far bigger issue of the overall system architecture, coherency engines and the like, which are designed to take some of the load off the SoC, providing an explanation for why you get those periodic chugs on the SX version of Control but not on PS5, despite it offering a lower level of raw raster performance.

I'm sorry, but it never ceases to amaze me how PC tech enthusiasts treat components like Lego and talk like system performance is just a matter of sticking the right bricks together. I get that it's a perspective born of years of benchmarking PC components, a scenario where the crucial caveat "all else being equal" is not just relied upon but desired... However, when it comes to relatively exotic, bespoke designs like consoles, you cannot rely on that being the case. The connective material, in both software and hardware, between the components becomes crucially important.

Cerny's statements indicate a focus on optimization across the system, not just slapping in the biggest, latest APU they could procure from their partners at AMD. Right now we are seeing the results of that: on paper the SX should outperform the PS5 handily, and yet in every case outside of situations where rasterization and fill rate are the primary metric, the PS5 is matching and/or surpassing it.

Where this matters in the conversation is that all the RDNA2 technologies you mention serve to improve rasterization, not data usage! You can't cull data that the GPU can't see! And if the pipeline gets bottlenecked trying to feed in those huge chunks of data, it's still going to choke and stall regardless of how good it is at tessellating, discarding and varying shader quality in the output!
 

MonarchJT

Banned
So, he said the PS5 is missing one thing from RDNA 2, but you counted... let's see... four features.

This is one of the best examples of how Xbox fans here and on Blue think, and of how engineers later proved them wrong. Poor Xbox fans. They will never learn.

[screenshot of the Twitter exchange]
I didn't say that four features are missing; you're trying to bend reality. I said that they had to develop their own versions of the same identical, exact things, for I don't know what obscure reason. Which, as you will understand yourself, is a highly stupid thing to do, especially if you are not bringing any major improvements when copying those features to justify spending the necessary money on R&D. See, everyone is free to believe what they want. PS5 performance is great, and I'm sure Cerny added the main features that make a GPU RDNA2 (definitely in agreement with AMD, because otherwise it would mean Cerny has a crystal ball), so there is nothing to worry about. It remains to be seen whether the performance of these customizations will match the versions made by AMD.
 
I didn't say that four features are missing; you're trying to bend reality. I said that they had to develop their own versions of the same identical, exact things, for I don't know what obscure reason. Which, as you will understand yourself, is a highly stupid thing to do, especially if you are not bringing any major improvements when copying those features to justify spending the necessary money on R&D. See, everyone is free to believe what they want. PS5 performance is great, and I'm sure Cerny added the main features that make a GPU RDNA2 (definitely in agreement with AMD, because otherwise it would mean Cerny has a crystal ball), so there is nothing to worry about. It remains to be seen whether the performance of these customizations will match the versions made by AMD.

Oh, yes you did
So no AMD mesh shaders, no RDNA2 Tier 2 VRS, no ML, no SFS

And later you asked who would believe that Sony would do such a thing:

So, as you're saying, they took a full RDNA2-based GPU, removed the ray tracing engine, removed the ML support, and changed the shader engine to make their own... same with the RT engine. Are you serious? How can you believe such a thing? If you can't connect such simple dots, please don't go around accusing people of making up stories. As I said before, it's just an Occam's razor situation.
And the one who said that the PS5 is a mix between RDNA1 and 2 was the console's own principal graphics engineer, not me.
I would accept these conclusions if the customizations made by modifying an RDNA2 GPU had been revolutionary enough to be worth it (and it's clear there is nothing revolutionary, or Sony and Cerny would be shouting it from the rooftops; instead all the marketing and the presentation centred on the SSD)... not simply dismantling an RDNA2 GPU only to reassemble it by copying (as demonstrated by several patents) the same architecture later.

So, you said that I'm clouded by console war, but hey... look who's talking.
 

MonarchJT

Banned
Oh, yes you did


And later you said who would believe that Sony would do such a thing :



So, you said to that i'm clouded by console war, but hey......Looks who's talking.
I meant that the versions found on PS5 are not the AMD versions of those features... and I explained that later in the post. RDNA2-based GPUs have all those things; you don't reinvent the wheel just to copy it and change its name. How that alone doesn't ring some alarm bells for you, I don't know. If you add the words of that engineer, the timing of the devkits strangely released a year earlier than the Xbox ones when AMD still hadn't even released the architecture, MS saying "we waited for the tech", the AMD PS5 APU prototype leak, Microsoft crying "we are the only ones with an RDNA2 GPU", the lack of the ML support present in every RDNA2 GPU... well, I don't know what else to add.
I understand that we are arguing over nothing, since in any case many of the RDNA2 upgrades are found on the PS5, but to me what I wrote above is evidence that the base of the GPU in the PS5 is not RDNA2, and for that reason I am curious to see how these PS5 customizations will compete with those made by AMD in RDNA2 GPUs.

Having said that, everyone can think what they want.
 
How that alone doesn't ring some alarm bells for you, I don't know. If you add the words of that engineer, the timing of the devkits strangely released a year earlier than the Xbox ones when AMD still hadn't even released the architecture, MS saying "we waited for the tech", the AMD PS5 APU prototype leak, Microsoft crying "we are the only ones with an RDNA2 GPU", the lack of the ML support present in every RDNA2 GPU... well, I don't know what else to add.

You are using the engineer's words just to give your posts some weight.

Well, imagine that it's you instead of Blue in this tweet, and how the (back then PS software) engineer answered you.

[screenshot of the Twitter exchange]

 

MonarchJT

Banned
You are using the engineer's words just to give your posts some weight.

Well, imagine that it's you instead of Blue in this tweet, and how the (back then PS software) engineer answered you.

[screenshot of the Twitter exchange]

His words are understandable to everyone; he says it clearly, in a way everyone can understand and that certainly cannot be confused: "a mix between rdna1 and rdna2 with it's unique bits". Your post has little to do with what I posted, and among other things, I never said that the PS5 is RDNA1... so the engineer in your screenshot is right to respond like that. It is not RDNA1: it is a mix between RDNA1 and RDNA2, exactly as the other engineer said. And you keep accusing me of making up facts when I have laid out for you ten times a series of events that support my thesis at least more than your "it's RDNA2 because otherwise I cry!!"

I'll stop here. Trust whatever makes you feel better.
 

Elog

Member
I meant that the versions found on PS5 are not the AMD versions of those features... and I explained that later in the post. RDNA2-based GPUs have all those things; you don't reinvent the wheel just to copy it and change its name. How that alone doesn't ring some alarm bells for you, I don't know. If you add the words of that engineer, the timing of the devkits strangely released a year earlier than the Xbox ones when AMD still hadn't even released the architecture, MS saying "we waited for the tech", the AMD PS5 APU prototype leak, Microsoft crying "we are the only ones with an RDNA2 GPU", the lack of the ML support present in every RDNA2 GPU... well, I don't know what else to add.
I understand that we are arguing over nothing, since in any case many of the RDNA2 upgrades are found on the PS5, but to me what I wrote above is evidence that the base of the GPU in the PS5 is not RDNA2, and for that reason I am curious to see how these PS5 customizations will compete with those made by AMD in RDNA2 GPUs.

Having said that, everyone can think what they want.
I am not sure I should respond but here I go.

You are missing some key points about the differences between the PS5 GPU architecture and the 'standard' RDNA2 architecture. The differences are substantial - it is not just a reworded variant but a substantial difference in how geometry is processed. And as stated the PS5 version has - on paper - some fairly interesting advantages.

The question then becomes: Why did AMD not implement this in RDNA2? Why did Nvidia not go for a similar solution? Cerny cannot be that smart?

While Cerny has delivered in spades over the years, Nvidia and AMD engineers are very smart as well. The reason why Nvidia and AMD have not implemented similar solutions is spelled VEGA and RTX 20XX. For both of those card generations, AMD and Nvidia respectively implemented significant hardware-level changes that added functionality and advantages to the rendering pipeline, provided the graphics engines went through some major updates to utilize them. Given the installed base of PC GPUs compared to the initial sales of VEGA and RTX 20XX cards, there was close to zero incentive for engine developers to spend significant time rewriting and optimizing code for the new features. The result? Both VEGA and RTX 20XX cards were hammered on cost/performance by every site and magazine that ran its standard test suite of roughly 30 games from the last decade, since all that new, expensive silicon went unused.

Fast forward to today. Hell will freeze over before Nvidia and AMD release hardware changes that cannot be accommodated under the current DirectX umbrella and that don't deliver significant performance advantages, via driver updates and minor patches, in that roughly 30-game test suite everyone runs. This means that any hardware-based solution that is more optimal but requires significant engine changes will end up on the back burner. This is where the current implementations of mesh shaders and VRS come in: neither solution is optimal from a silicon point of view, but both allow the hardware to work with minimal effort in existing games. It is a win for AMD and Nvidia.

Sony is probably the only player that is big enough to force upgrades to existing engines over a short period of time due to its dominant position in the 3rd party market. It is a win-win for AMD. AMD gets direct access and insightful knowledge from top 3d engine developers and artists at Sony and can incorporate a silicon change in the Playstation 5 that AMD wants to incorporate in future AMD GPUs but that will require substantial updates to how engines are coded. The Playstation 5 will drive graphical engine changes over a period of time due to the enormous installed base so when those AMD GPUs actually are released for the PC market there are games and engines to show off the performance. That is why I believe in the RDNA3 rumour - it makes an awful lot of sense.

Using pure RDNA2 in the XSX/S also makes sense for MS, since they want a more or less unified environment with the PC ecosystem as well as hardware-based backwards compatibility; the latter point more or less requires them to operate under the current DirectX umbrella so as not to waste die area on duplicated features (which equals cost).

If you listen to the grapevine, you also know that Sony started to work with 3D studios very early in the PS5 development cycle, and they had to, since they forced significant coding changes on the developers, and those needed to be ready when the PS5 released. The next 18 months will be very interesting in terms of tests and comparisons: is the updated graphics pipeline in the PS5 a win or a loss? However, to talk about this significant hardware change on the GPU side as RDNA1 is to completely misrepresent what is going on. It is frankly bullshit.
 
His words are understandable to everyone, and he says it clearly, in a way that everyone can understand. Your post has little to do with what I posted, and among other things, I never said that the PS5 is RDNA1... so the engineer in your screenshot is right to respond like that. It is not RDNA1: it is a mix between RDNA1 and RDNA2, exactly as the other engineer said. And you keep accusing me of making up facts when I have laid out for you ten times a series of events that support my thesis at least more than your "it's RDNA2 because otherwise I cry!!"

I'll stop here. Trust whatever makes you feel better.

That dumbass said that (to this day the PS5 is still RDNA1 to him), not you, but I've just provided an example of how the engineer would answer you in the same way, since you are using engineers' words to add some weight to your posts here, just like that dumbass did on Twitter. You also trust whatever makes you feel better.
 

MonarchJT

Banned
I am not sure I should respond but here I go.

You are missing some key points about the differences between the PS5 GPU architecture and the 'standard' RDNA2 architecture. The differences are substantial - it is not just a reworded variant but a substantial difference in how geometry is processed. And as stated the PS5 version has - on paper - some fairly interesting advantages.

The question then becomes: Why did AMD not implement this in RDNA2? Why did Nvidia not go for a similar solution? Cerny cannot be that smart?

While Cerny has delivered in spades over the years, Nvidia and AMD engineers are very smart as well. The reason why Nvidia and AMD have not implemented similar solutions is spelled VEGA and RTX 20XX. For both of those card generations, AMD and Nvidia respectively implemented significant hardware-level changes that added functionality and advantages to the rendering pipeline, provided the graphics engines went through some major updates to utilize them. Given the installed base of PC GPUs compared to the initial sales of VEGA and RTX 20XX cards, there was close to zero incentive for engine developers to spend significant time rewriting and optimizing code for the new features. The result? Both VEGA and RTX 20XX cards were hammered on cost/performance by every site and magazine that ran its standard test suite of roughly 30 games from the last decade, since all that new, expensive silicon went unused.

Fast forward to today. Hell will freeze over before Nvidia and AMD release hardware changes that cannot be accommodated under the current DirectX umbrella and that don't deliver significant performance advantages, via driver updates and minor patches, in that roughly 30-game test suite everyone runs. This means that any hardware-based solution that is more optimal but requires significant engine changes will end up on the back burner. This is where the current implementations of mesh shaders and VRS come in: neither solution is optimal from a silicon point of view, but both allow the hardware to work with minimal effort in existing games. It is a win for AMD and Nvidia.

Sony is probably the only player that is big enough to force upgrades to existing engines over a short period of time due to its dominant position in the 3rd party market. It is a win-win for AMD. AMD gets direct access and insightful knowledge from top 3d engine developers and artists at Sony and can incorporate a silicon change in the Playstation 5 that AMD wants to incorporate in future AMD GPUs but that will require substantial updates to how engines are coded. The Playstation 5 will drive graphical engine changes over a period of time due to the enormous installed base so when those AMD GPUs actually are released for the PC market there are games and engines to show off the performance. That is why I believe in the RDNA3 rumour - it makes an awful lot of sense.

Using pure RDNA2 in the XSX/S also makes sense for MS, since they want a more or less unified environment with the PC ecosystem as well as hardware-based backwards compatibility; the latter point more or less requires them to operate under the current DirectX umbrella so as not to waste die area on duplicated features (which equals cost).

If you listen to the grapevine, you also know that Sony started to work with 3D studios very early in the PS5 development cycle, and they had to, since they forced significant coding changes on the developers, and those needed to be ready when the PS5 released. The next 18 months will be very interesting in terms of tests and comparisons: is the updated graphics pipeline in the PS5 a win or a loss? However, to talk about this significant hardware change on the GPU side as RDNA1 is to completely misrepresent what is going on. It is frankly bullshit.
OK 👌 I can go along with this, but... now tell us what future GPU-architecture enhancements we are seeing in the PS5, and why they were hidden and silently glossed over in the presentation by its creator, Cerny? (Please don't tell me the Geometry Engine is the big deal, eh.)
P.S. We are still talking about GPU architecture, right?
 

Elog

Member

OK 👌 now tell us what future architecture enhancements we are seeing in the PS5, and why they were hidden and silently glossed over in the presentation by its creator, Cerny? (Please don't tell me the Geometry Engine is the big deal, eh.)
He did not bypass them at all. He stated the change upfront in the presentation - that was one of the main parts of it.

It is the fact that the pipeline is driven by the geometry engine and not by the CUs. I summarized the changes in this post and it is very similar to the changes that the Epic team detailed when describing key changes in UE5, i.e. that the geometry processing step is completely changed (many many advantages from that): https://www.neogaf.com/threads/next...-analysis-leaks-thread.1480978/post-262329077

So you will not like the answer since it is the geometry processing on the hardware level that is the main difference :)
 

onesvenus

Member
both solutions are not optimal from a silicon point of view
Based on what metrics?
AMD gets direct access and insightful knowledge from top 3d engine developers and artists at Sony
Why do you think Sony studios' feedback is more important than feedback from, let's say, Epic, Unity and Crytek, whose engines are used in a very high number of games? Do you think those engineers would not be asking for something like what you suppose Sony asked for, if it were so much better?
However, to talk about this significant hardware change on the GPU side as RDNA1 is to completely misrepresent what is going on. It is frankly bullshit.
It's true, talking about it being RDNA1 is bullshit, but buying all those rumors about secret tech in the PS5 without any proof of it existing, and claiming that Sony has somehow outsmarted all the computer graphics engineers at both AMD and Nvidia, is no less delusional.
 
I came into this thread thinking it was an Nvidia vs AMD comparison... but somehow it became a PS5 vs XSX thread for some :messenger_dizzy: Console bros, relax and put the guns and e-penis away; give it a rest for a thread :messenger_tears_of_joy: ... or two.
 

MonarchJT

Banned
He did not bypass them at all. He stated the change upfront in the presentation - that was one of the main parts of it.

It is the fact that the pipeline is driven by the geometry engine and not by the CUs. I summarized the changes in this post and it is very similar to the changes that the Epic team detailed when describing key changes in UE5, i.e. that the geometry processing step is completely changed (many many advantages from that): https://www.neogaf.com/threads/next...-analysis-leaks-thread.1480978/post-262329077

So you will not like the answer since it is the geometry processing on the hardware level that is the main difference :)
I did read it, and I also went out of curiosity to read RedGamingTech. My personal take is that the only real addition over the base architecture is the cache scrubbers. It is clear that the Geometry Engine copies what mesh shaders do, but I don't know how the pipeline work will differ between the two. I hope for one thing: that both can process meshlets, because otherwise, beyond some tweaking, studios would have to build two geometry renderers for every future multiplatform game. Although it absolutely proves nothing, my personal take is that once mesh shaders / VRS are used, the advantage of having more CUs working in parallel will be much, much more pronounced than right now, where pure fillrate is still king (and the PS5, given its clock, has a clear advantage there).
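For context on the "both can process meshlets" point: a meshlet is just a small, pre-built cluster of a mesh, and the offline data layout is independent of whether a mesh shader or a primitive shader ends up consuming it. A rough illustration of the kind of structure commonly used (the field names are my own, loosely modelled on public mesh-shader samples and tools like meshoptimizer):

```cpp
#include <cstdint>
#include <vector>

// Illustrative meshlet layout: one offline-built cluster of a larger mesh.
struct Meshlet
{
    uint32_t vertexOffset;    // first entry in the meshlet-local vertex index list
    uint32_t vertexCount;     // unique vertices referenced by this meshlet
    uint32_t triangleOffset;  // first entry in the packed triangle index list
    uint32_t triangleCount;   // triangles in this meshlet (kept small, e.g. <= 124)
};

// The same buffers could, in principle, feed either a mesh-shader pipeline or a
// primitive-shader path -- the portability concern raised in the post above.
struct MeshletGeometry
{
    std::vector<Meshlet>  meshlets;
    std::vector<uint32_t> vertexIndices;   // meshlet-local -> mesh-global vertex ids
    std::vector<uint8_t>  triangleIndices; // 3 bytes per triangle, local to the meshlet
};
```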
 
I remain convinced by what the lead graphics engineer said before the forced damage control started, and by what the timing of the devkits and the facts show.

[screenshots of the engineer's tweets]


So no AMD mesh shaders, no RDNA2 Tier 2 VRS, no ML, no SFS, but look, Sony really wanted to spend money on R&D to develop its own variants of the exact same things (such as the Geometry Engine that mimics mesh shaders, or the intersection engine that mimics AMD's ray tracing) that it would have had by simply opting for a full RDNA2 GPU. So, as you're saying, they took a full RDNA2-based GPU, removed the ray tracing engine, removed the ML support, and changed the shader engine to make their own. Are you serious? How can you believe such a thing? If you can't connect such simple dots, please don't go around accusing people of making up stories. As I said before, it's just an Occam's razor situation.
And the one who said that the PS5 is a mix between RDNA1 and 2 was the lead graphics engineer of the console himself, not me.
I would accept these conclusions if the customizations made by modifying an RDNA2 GPU had been revolutionary enough to be worth it (and it's clear there is nothing revolutionary, or Sony and Cerny would be shouting it from the rooftops; instead all the marketing and the presentation centred on the SSD)... not simply dismantling an RDNA2 GPU only to reassemble it by copying (as demonstrated by several patents) the same architecture later.

Not to mention that you are saying the PS5's lead graphics engineer was lying.

I've stressed this point forever. Mark Cerny, from the moment he became more publicly prominent amongst playstation gamers and gamers in general, has always shared a wealth of information about any and every major Playstation hardware feature, right down to the ACEs and important communication lanes inside PS4 if he found it important. It's improbable to think the PS5 would possess such major headline next-gen graphics features in its GPU, yet get zero mention from its lead architect.

The SONY MADE THEIR OWN EVERYTHING stuff is ridiculous. We know what Sony made themselves; Mark Cerny told us. He said it was Sony's idea to implement the GPU cache scrubbers, and that if we saw them in a future GPU it would be a sign that their collaboration with AMD proved useful for the PC space as well. They built their own custom SSD/flash controller and I/O complex, with additional task-specific hardware inside, plus their custom hardware decompression unit.

When it comes to GPU features, Mark Cerny actually prepares people watching his deep dive by stressing that new features cost transistors, and that they wanted to strike a balance between adding new features and ensuring developers would not be forced to use them, keeping things optional.

One of the first things Mark Cerny stresses as an advantage of the Geometry Engine is the following:

abort processing of a vertex if all geometry that uses it is off-screen.

Brings handling of triangles and other primitives under full programmatic control

remove back faced or off-screen vertices and triangles,

More complex usage involving primitive shaders allows synthesizing geometry on the fly as it's being rendered


smoothly varying level of detail, addition of procedural detail to close up objects and improvements to particle effects and other visual special effects




Check page six of the Vega whitepaper, as it brings up the next-generation geometry engine.

To meet the needs of both professional graphics and gaming applications, the geometry engines in “Vega” have been tuned for higher polygon throughput by adding new fast paths through the hardware and by avoiding unnecessary processing. This next-generation geometry (NGG) path is much more flexible and programmable than before.

To highlight one of the innovations in the new geometry engine, primitive shaders are a key element in its ability to achieve much higher polygon throughput per transistor. Previous hardware mapped quite closely to the standard Direct3D rendering pipeline, with several stages including input assembly, vertex shading, hull shading, tessellation, domain shading, and geometry shading. Given the wide variety of rendering technologies now being implemented by developers, however, including all of these stages isn't always the most efficient way of doing things. Each stage has various restrictions on inputs and outputs that may have been necessary for earlier GPU designs, but such restrictions aren't always needed on today's more flexible hardware. [Figure 4: Geometry processing via the traditional DX path (left) and primitive shaders (right)] "Vega's" new primitive shader support allows some parts of the geometry processing pipeline to be combined and replaced with a new, highly efficient shader type. These flexible, general-purpose shaders can be launched very quickly, enabling more than four times the peak primitive cull rate per clock cycle.



Important: I am not saying the PS5 has the exact same geometry performance as Vega, because the RDNA whitepaper makes clear that changes were made that significantly improve geometry performance, which means the Geometry Engine in the PS5 will not perform exactly the same way it does in Vega. In 1st-gen RDNA, for example, the triangle cull rate was more than doubled over Vega.

In a typical scene, around half of the geometry will be discarded through various techniques such as frustum culling, back-face culling, and small-primitive culling. The faster these primitives are discarded, the faster the GPU can start rendering the visible geometry. Furthermore, traditional geometry pipelines discard primitives after vertex processing is completed, which can waste computing resources and create bottlenecks when storing a large batch of unnecessary attributes. Primitive shaders enable early culling to save those resources.

The “Vega” 10 GPU includes four geometry engines which would normally be limited to a maximum throughput of four primitives per clock, but this limit increases to more than 17 primitives per clock when primitive shaders are employed.⁷ Primitive shaders can operate on a variety of different geometric primitives, including individual vertices, polygons, and patch surfaces. When tessellation is enabled, a surface shader is generated to process patches and control points before the surface is tessellated, and the resulting polygons are sent to the primitive shader. In this case, the surface shader combines the vertex shading and hull shading stages of the Direct3D graphics pipeline, while the primitive shader replaces the domain shading and geometry shading stages. Primitive shaders have many potential uses beyond high-performance geometry culling. Shadow-map rendering is another ubiquitous process in modern engines that could benefit greatly from the reduced processing overhead of primitive shaders. We can envision even more uses for this technology in the future, including deferred vertex attribute computation, multi-view/multi-resolution rendering, depth pre-passes, particle systems, and full-scene graph processing and traversal on the GPU.

Features can, and often do, undergo improvements or enhancements, which I'm positive are present in PS5, so ignore the more-than-17-primitives-per-clock Vega number, as it's sure to be notably larger on PS5. But the core Geometry Engine / primitive shader functionality that was finally fixed in 1st-gen RDNA (it didn't work in Vega) is very likely one and the same in PS5, with the obvious benefit of the additional architectural enhancements that come with the PS5.

What Mark Cerny described the PS5's Geometry Engine as being capable of is more or less all present and accounted for in the AMD Vega whitepaper's description of the Geometry Engine and primitive shaders. Just because the feature was first introduced (though never used) in Vega shouldn't be taken as me having a go at the PS5; that doesn't make it "OLD" by any stretch of the imagination. No console or PC title that I'm aware of has ever utilized primitive shaders. And obviously I know of none that utilize the seemingly more advanced Mesh Shaders, which carry the potential to completely replace the fixed-function tessellation unit in GPUs.
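Since both the Cerny quotes and the Vega whitepaper excerpt above revolve around early culling (frustum, back-face, small-primitive), here is a toy CPU-side illustration of the per-triangle tests that a primitive or mesh shader runs before rasterization. This is only a sketch of the idea: the types and function names are placeholders, the z test uses symmetric clip planes for brevity, and real implementations run this per-wave on the GPU.

```cpp
#include <array>

struct Vec4 { float x, y, z, w; };  // clip-space position

// Trivial frustum rejection: the triangle is off-screen if all three vertices
// fall outside the same clip plane (x, y or z against +/- w).
bool OutsideFrustum(const std::array<Vec4, 3>& v)
{
    for (int axis = 0; axis < 3; ++axis)
    {
        bool allBelow = true, allAbove = true;
        for (const Vec4& p : v)
        {
            const float c = (axis == 0) ? p.x : (axis == 1) ? p.y : p.z;
            allBelow = allBelow && (c < -p.w);
            allAbove = allAbove && (c >  p.w);
        }
        if (allBelow || allAbove)
            return true;
    }
    return false;
}

// Back-face rejection: signed area of the projected triangle; which sign counts
// as "front facing" depends on the API's winding convention.
bool BackFacing(const std::array<Vec4, 3>& v)
{
    const float ax = v[1].x / v[1].w - v[0].x / v[0].w;
    const float ay = v[1].y / v[1].w - v[0].y / v[0].w;
    const float bx = v[2].x / v[2].w - v[0].x / v[0].w;
    const float by = v[2].y / v[2].w - v[0].y / v[0].w;
    return (ax * by - ay * bx) <= 0.0f;  // <= 0 also drops degenerate triangles
}

// A triangle that fails either test never has to be assembled or rasterized --
// this is the work primitive/mesh shaders discard before the rest of the pipeline runs.
bool CullTriangle(const std::array<Vec4, 3>& clipSpaceVerts)
{
    return OutsideFrustum(clipSpaceVerts) || BackFacing(clipSpaceVerts);
}
```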
 