
Xbox Dev Demonstrates NVIDIA GeForce RTX 2080 Ti & Xbox Series X Mesh Shaders Performance With DirectX 12 Ultimate API

Neo_game

Member
I love this, the PS5 fans saying it's not that bad. Well, it is; Sony "ducked" you fans.
PS5 is a 5700 XT at its base.
Yes, the CUs are RDNA 2. The 256-bit bus is a part of that old 5700 XT. Like the OG XB1, it loses on every point in every way.
You can have the fastest SSD and the GPU still has a 256-bit-wide bus.

Mark Cerny is the same man who told you the PS4 Pro was 8.2 TF. Very misleading. And that tech talk was a bad infomercial. Don't do it; you can't lose in almost every category and then say it's better, or close, or even.
It's not... The next thing you guys will be saying is that there are two GPUs...

Xbox has a 17% advantage over PS5, just like PS4 had a 40% advantage last gen. Note that this is on paper; the real-world difference is going to be far smaller, just like last gen. Some guys have already mentioned it. I think bandwidth is the key factor. Those 10 GB at 560 GB/s in the Xbox are its biggest advantage, I think.
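For reference, here is a back-of-the-envelope sketch of where that ~17-18% figure comes from, using the publicly quoted CU counts and clocks (treat these as assumptions; PS5 is shown at its maximum boost clock):

```cpp
#include <cstdio>

int main() {
    // Peak FP32 throughput = CUs * 64 ALUs per CU * 2 ops per FMA * clock (GHz)
    const double xsx_tf = 52 * 64 * 2 * 1.825 / 1000.0; // ~12.15 TF at a fixed clock
    const double ps5_tf = 36 * 64 * 2 * 2.23  / 1000.0; // ~10.28 TF at max boost clock
    std::printf("XSX ~%.2f TF, PS5 ~%.2f TF, advantage ~%.0f%%\n",
                xsx_tf, ps5_tf, (xsx_tf / ps5_tf - 1.0) * 100.0); // ~18%
    return 0;
}
```

That ratio only holds if the PS5 GPU sustains its top clock; any downclocking widens it, which is what the posts below are arguing about.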
 

Goliathy

Banned
Xbox has a 17% advantage over PS5, just like PS4 had a 40% advantage last gen. Note that this is on paper; the real-world difference is going to be far smaller, just like last gen. Some guys have already mentioned it. I think bandwidth is the key factor. Those 10 GB at 560 GB/s in the Xbox are its biggest advantage, I think.

To be fair, PS5 clocks are variable. We don't know yet how far they can drop or how often they will, but we know they are NOT locked, so the difference could end up being greater than 17%.
 
He says he thinks it would be a couple of percentage points of drop. It's an entirely different story once developers have actual games running on the hardware.

Of course, last gen this same guy also said he thought native 4K gaming would require 8 teraflops, so he can clearly either be extremely wrong at times, like anyone else, or really fudge numbers for PR purposes.

Either way, expect the performance to drop more than 2%.
 

martino

Member
Once you realize that everyone has biases to some degree and you can never be sure how much they are actually trying to be objective or rational, you can start to use your intelligence and rationality to see past the bias, look at the facts if there are any, and compare them, challenge them, or learn from them. Simply hand-waving away anyone's opinion just because you think they are "biased" is a good way to live in your own comfortable narrative and leave your intelligence unchallenged, thereby strengthening your own bias... Like trying to make a snowball in a freezer one snowflake at a time... Gotta keep it safe!

And Cerny sure provided a lot of data and demos.
 
OK, do you have a better source to go against what Cerny said?

That is the point: we only have what Cerny said... so using made-up numbers in a discussion is what a fanboy should avoid.

You can be anything if you stay accurate, but your opinion has to rely on accurate data.

BTW, I'm not a native English speaker, but "thinks" is certain... "guess" is not sure?
Cerny is hyping his own design. That is all.
Xbox has a 17% advantage over PS5, just like PS4 had a 40% advantage last gen. Note that this is on paper; the real-world difference is going to be far smaller, just like last gen. Some guys have already mentioned it. I think bandwidth is the key factor. Those 10 GB at 560 GB/s in the Xbox are its biggest advantage, I think.
That's not accounting for the PS5's variable clocks. The difference could be up to 25% depending on the game.
 

Goliathy

Banned

And that's exactly why Sony couldn't show us a single advantage of the SSD - except shorter loading times.
 

quest

Not Banned from OT
He does state 'a couple' percent performance drop. It's in the GPU section of the video.
Exactly, he avoided giving 2% or a chart of the exact downclocking. "Couple" is a generic term that could mean a lot of things to a lot of people. He never committed to hard numbers or charts. Like a certain company in 2013. If it was only 2%, then lock the clock 2% lower and be done with it; it must vary more than that, or they would have locked it. It's common sense. Like the garbage it displayed at 1080p in 2013. Worst new feature since ESRAM and DDR3.
 

Dural

Member
Exactly, he avoided giving 2% or a chart of the exact downclocking. "Couple" is a generic term that could mean a lot of things to a lot of people. He never committed to hard numbers or charts. Like a certain company in 2013. If it was only 2%, then lock the clock 2% lower and be done with it; it must vary more than that, or they would have locked it. It's common sense. Like the garbage it displayed at 1080p in 2013. Worst new feature since ESRAM and DDR3.

Exactly, if it only dropped 2% you wouldn't make it variable. If anything, that statement about a 2% clock drop for a 10% power reduction is more of an indictment of the design and shows how it was a last-minute overclock. Common sense is something the PlayStation fans have lacked from the beginning of the leaks; the GitHub leak was so obviously the PS5, yet so many didn't want to believe it.
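For context, here is a rough sketch of the generic frequency/power relationship people usually cite (a textbook approximation, not Sony's actual power model): dynamic power scales roughly with frequency times voltage squared, and voltage tends to track frequency near the top of the curve.

```cpp
#include <cmath>
#include <cstdio>

int main() {
    // Naive approximation: P ~ f * V^2 with V roughly proportional to f, so P ~ f^3.
    // Purely illustrative; real silicon voltage/frequency curves differ.
    const double clock_drop  = 0.02; // a "couple percent" drop in frequency
    const double power_ratio = std::pow(1.0 - clock_drop, 3.0);
    std::printf("~%.1f%% power saved for a %.0f%% clock drop\n",
                (1.0 - power_ratio) * 100.0, clock_drop * 100.0); // ~5.9%
    return 0;
}
```

Under that naive cubic rule a 2% clock drop buys roughly 6%; the quoted 10% figure would imply a steeper voltage curve at the very top of the frequency range.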
 
If the DirectX 12 Ultimate API makes game development easier by simultaneously building games for both PC and XSX, does that mean 'Play Anywhere' can take effect? So if I were to buy Forza 8 or Forza Horizon 5 for XSX, could I also play it on PC? The thing is, PCs do not have 4K Ultra HD Blu-ray disc drives. They still use shitty DVD drives, not even Blu-ray drives.
 
If the DirectX 12 Ultimate API makes game development easier by simultaneously building games for both PC and XSX, does that mean 'Play Anywhere' can take effect? So if I were to buy Forza 8 or Forza Horizon 5 for XSX, could I also play it on PC? The thing is, PCs do not have 4K Ultra HD Blu-ray disc drives. They still use shitty DVD drives, not even Blu-ray drives.
PCs haven't even come with disc drives for many, many years now... Blu-ray is quite old and antiquated at this point, except for watching movies.
 
PCs haven't even come with disc drives for many, many years now... Blu-ray is quite old and antiquated at this point, except for watching movies.

Some still come with DVD drives, which is baffling. At least make it Blu-ray. In the OT forum, DVD movie sales still beat Blu-ray and Ultra HD Blu-ray. That is pathetic.
 

lynux3

Member
And that's exactly why Sony couldn't show us a single advantage of the SSD - except shorter loading times.
Sony hasn't really shown us anything regarding their SSD except for what it's capable of. Technically, we weren't even supposed to see that loading time demo. What does Alex know about open-world game development and design that the multiple other excited developers don't?
 
Sony hasn't really shown us anything regarding their SSD except for what it's capable of. Technically, we weren't even supposed to see that loading time demo. What does Alex know about open-world game development and design that the multiple other excited developers don't?
Maybe those devs were referring to other things like the 3D audio? And what were the devs praising? What were they comparing it to?
 

lynux3

Member
Maybe those devs were referring to other things like the 3D audio? And what were the devs praising? What were they comparing it to?
I guess you haven't been following the recent discussion, but I'll give you a hint: it was the most requested feature from developers.
 
Some still come with DVD drives, which is baffling. At least make it Blu-ray. In the OT forum, DVD movie sales still beat Blu-ray and Ultra HD Blu-ray. That is pathetic.
I have not seen a desktop computer with a physical drive in almost 10 years or so. Games aren't distributed in disc format on the PC side of things. When you buy a game from Steam, GOG, Amazon, or whatever storefront you choose, you don't get sent a disc in the mail. You just download your games, and that's it.

If you need a standalone disc drive, no one is stopping you though. That's the beauty of having a choice. 🧠
 
OK, do you have a better source to go against what Cerny said?

That is the point: we only have what Cerny said... so using made-up numbers in a discussion is what a fanboy should avoid.

You can be anything if you stay accurate, but your opinion has to rely on accurate data.

BTW, I'm not a native English speaker, but "thinks" is certain... "guess" is not sure?
Mark Cerny is a salesman... I won't go as far as to say he's a liar, but that said, he will say anything to make a sale.
1. The 8.4 TF statement
2. The Dolby Atmos comment
How can you not know what you're putting in the hardware you designed?
 

Neo_game

Member
Cerny is hyping his own design. That is all.

That's not accounting for the PS5's variable clocks. The difference could be up to 25% depending on the game.

4K is about 8 million pixels. PS5 games will be running at something like 6 million, which is roughly 30% fewer pixels. These are diminishing returns when you are gaming at such a high resolution. That is the easiest way devs are going to scale away the small difference between the two.
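A quick worked version of that pixel math (the ~6 million pixel figure is the poster's assumption, not an official spec):

```cpp
#include <cmath>
#include <cstdio>

int main() {
    const double native_4k = 3840.0 * 2160.0; // ~8.29 million pixels
    const double reduced   = 6.0e6;           // the assumed ~6 million pixel target
    const double scale     = std::sqrt(reduced / native_4k); // per-axis scale factor
    std::printf("%.0f%% fewer pixels, roughly %.0fx%.0f\n",
                (1.0 - reduced / native_4k) * 100.0,  // ~28% fewer
                3840.0 * scale, 2160.0 * scale);      // roughly 3266x1837
    return 0;
}
```

In other words, a gap that looks large in TFLOPS terms translates to rendering at roughly 1830-1840p instead of native 2160p, which is the kind of difference that is hard to spot at normal viewing distances.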
 

MikeM

Member
Bringing this thread back to life.
Question: are there any games out there for Series X that are confirmed to be using mesh shaders? I want to see what difference this tech makes in the real world.
 
Bringing this thread back to life.
Question: are there any games out there for Series X that are confirmed to be using mesh shaders? I want to see what difference this tech makes in the real world.

No way any would be ready this soon. This is the first time any console has ever had this capability, and the only other place it could be found was in Nvidia RTX cards, which of course also weren't being fully taken advantage of yet on the PC side. Some of these features have only just this year shown up in 3DMark feature tests. It will take time.
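For what it's worth, on the PC side you can at least query whether a GPU and driver expose the DirectX 12 Ultimate mesh shader and sampler feedback features. A minimal sketch (device creation and error handling omitted):

```cpp
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Assumes 'device' is an already-created ID3D12Device on a recent Windows 10 SDK.
void ReportDX12UltimateFeatures(ID3D12Device* device) {
    D3D12_FEATURE_DATA_D3D12_OPTIONS7 options7 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                                              &options7, sizeof(options7)))) {
        std::printf("Mesh shaders:     %s\n",
                    options7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1
                        ? "supported (Tier 1)" : "not supported");
        std::printf("Sampler feedback: %s\n",
                    options7.SamplerFeedbackTier != D3D12_SAMPLER_FEEDBACK_TIER_NOT_SUPPORTED
                        ? "supported" : "not supported");
    }
}
```

Support showing up in a caps check is one thing; engines actually rebuilding their geometry pipelines around it is the part that takes years.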
 

Lethal01

Member
No way any would be ready this soon. This is the first time any console has ever had this capability, and the only other place it could be found was in Nvidia RTX cards, which of course also weren't being fully taken advantage of yet on the PC side. Some of these features have only just this year shown up in 3DMark feature tests. It will take time.

Raytracing-capable games came pretty quickly when the RTX cards launched.
 

Schmick

Member
I'm pretty sure I read somewhere that games that use mesh shaders/meshlet culling will enjoy a 2x performance increase.

That's very significant.

I can't wait until graphics engines start taking advantage of these features.
If this is true, then why am I hearing GAF talking about the XSS not being able to survive the whole duration of this generation?
 

M1chl

Currently Gif and Meme Champion
Bringing this thread back to life.
Question: are there any games out there for Series X that are confirmed to be using mesh shaders? I want to see what difference this tech makes in the real world.
Probably not yet; this sort of technology takes a few months or years to reach a stable version (for a whole game, not a tech demo). That's true for basically anything that works with the very basics of the engine...
 

Sosokrates

Report me if I continue to console war
Bringing this thread back to life.
Question: are there any games out there for Series X that are confirmed to be using mesh shaders? I want to see what difference this tech makes in the real world.

No, but Epic has software-based shaders in UE5 which are similar to mesh shaders and offer similar performance.
 

Genx3

Member
If this is true, then why am I hearing GAF talking about the XSS not being able to survive the whole duration of this generation?

People won't be discussing mesh shader performance improvements until they see them with their own eyes.

Right now there aren't any released games taking advantage of this.

Tests have been done on this, and in those tests a 2x performance increase has been consistently observed.

Also, as long as games have to be released on XB1 and PS4, devs have little reason to use mesh shaders, because those consoles don't support them.

I guess last gen really is holding back this gen.
 
If this is true, then why am I hearing GAF talking about the XSS not being able to survive the whole duration of this generation?

That's only if they're expecting it to do what Series X and PS5 do. It doesn't have to. It only needs to do what it can do at lower resolutions, which will be a whole hell of a lot more just with Sampler Feedback Streaming coming into play before mesh shaders.

Series S, along with Series X, is due at some point this gen to get a pretty nice boost in RAM efficiency thanks to SFS. If a hypothetical game on Series S right now is using 4 GB of RAM for textures, that would become roughly 1.6 GB with Sampler Feedback Streaming. That single change in how games are developed would give Series S 12.4 GB of effective RAM as opposed to its current maximum of 10 GB. Suddenly Series S would be just over 1 GB of effective RAM behind Series X's total available RAM for games, not accounting for SFS being used on Series X.

Series S has 10 GB of RAM total: 8 GB @ 224 GB/s and 2 GB @ 56 GB/s. It could become a hypothetical 12.4 GB of effective RAM with SFS if the game needs 4 GB for its texture budget.
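Here is that arithmetic spelled out (the 4 GB texture budget and the resulting ~2.5x reduction are the hypothetical figures from the post above, not measured numbers):

```cpp
#include <cstdio>

int main() {
    // Hypothetical Series S scenario: a 4 GB texture budget shrinking to ~1.6 GB with SFS,
    // i.e. the ~2.5x effective reduction implied by those figures.
    const double total_ram      = 10.0;                      // GB, Series S total
    const double texture_budget = 4.0;                       // GB without SFS
    const double with_sfs       = texture_budget / 2.5;      // ~1.6 GB
    const double freed          = texture_budget - with_sfs; // 2.4 GB freed up
    std::printf("effective RAM: %.1f GB (was %.1f GB)\n",
                total_ram + freed, total_ram);               // 12.4 GB vs 10 GB
    return 0;
}
```

The "effective RAM" framing just means the freed 2.4 GB can be spent on other assets; the physical pool is still 10 GB.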
 

Sosokrates

Report me if I continue to console war
Being that the mesh shaders in DX12U are a hardware-supported feature, I find this claim dubious at best.



Starts talking about it at 32.20

That video is well above my knowledge level, but at around 32.29 he says "primitive and mesh shaders can be faster but are still bottlenecked and not designed for this".

But he's talking about rasterization, so maybe primitive and mesh shaders will have other good uses.


From watching that video it seems UE5 benefits from compute; the more compute the better.
 
No, but Epic has software-based shaders in UE5 which are similar to mesh shaders and offer similar performance.

Epic claimed they were using what they call "hyper optimised compute shaders" for the UE5 demo on PS5, as well as Primitive Shaders, which the engine also supports. Makes sense considering the stupid amounts of polygons they were pushing in the scene.

Primitive Shaders are also similar to Mesh Shaders in terms of the raw performance gains they offer; the only difference is that Primitive Shaders modify the traditional graphics pipeline whilst Mesh Shaders introduce a new pipeline.

 

Riky

$MSFT
I think we'll start seeing Sampler Feedback Streaming first; from what people have said, Mesh Shaders in engines are further out. Jason Ronald said he expected to see SFS in use by 2022, which makes sense, as Microsoft first-party studios seem to be going next-gen only after Halo Infinite.
 

Sosokrates

Report me if I continue to console war
Epic claimed they were using what they call "hyper optimised compute shaders" for the UE5 demo on PS5, as well as Primitive Shaders, which the engine also supports. Makes sense considering the stupid amounts of polygons they were pushing in the scene.

Primitive Shaders are also similar to Mesh Shaders in terms of the raw performance gains they offer; the only difference is that Primitive Shaders modify the traditional graphics pipeline whilst Mesh Shaders introduce a new pipeline.

Primitive shaders are similar, but mesh shaders are slightly better. That's what Alex from DF said, anyway.
But with UE5 being so custom, compute performance seems to be the most important thing in determining performance.
 

Lethal01

Member
People won't be discussing mesh shader performance improvements until they see them with their own eyes.

Indeed, there is a big gap between "I heard that some tests show a consistent 2x performance increase" and a game that actually runs 2x better due to mesh shaders.
 

Lethal01

Member
That's only if they're expecting it to do what Series X and PS5 do. It doesn't have to. It only needs to do what it can do at lower resolutions, which will be a whole hell of a lot more just with Sampler Feedback Streaming coming into play before mesh shaders.

Series S, along with Series X, is due at some point this gen to get a pretty nice boost in RAM efficiency thanks to SFS. If a hypothetical game on Series S right now is using 4 GB of RAM for textures, that would become roughly 1.6 GB with Sampler Feedback Streaming. That single change in how games are developed would give Series S 12.4 GB of effective RAM as opposed to its current maximum of 10 GB. Suddenly Series S would be just over 1 GB of effective RAM behind Series X's total available RAM for games, not accounting for SFS being used on Series X.

Series S has 10 GB of RAM total: 8 GB @ 224 GB/s and 2 GB @ 56 GB/s. It could become a hypothetical 12.4 GB of effective RAM with SFS if the game needs 4 GB for its texture budget.

If the XSX, XSS, and PS5 all start using more advanced streaming techniques like people are claiming they will, then the Series S will still be a bottleneck.
 

ZehDon

Member


Starts talking about it at 32.20

That video is well above my knowledge level, but at around 32.29 he says "primitive and mesh shaders can be faster but are still bottlenecked and not designed for this".

But he's talking about rasterization, so maybe primitive and mesh shaders will have other good uses.


From watching that video it seems UE5 benefits from compute; the more compute the better.

Comparing Nanite geometry to mesh shaders isn't a 1:1 comparison, because they're really not the same thing.

Sony and Microsoft have implemented hardware-level features for culling geometry - Sony's Primitive Shaders and Microsoft's Mesh Shaders. These are designed to cull triangles from the rendering pipeline based on whatever factors the developer programs, as a component of the rendering pipeline itself. They're designed to work for everything from animated models to world meshes; every triangle that is drawn on the screen can be passed through these systems. This should provide strong performance gains across the board for every game that implements them.
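To illustrate the idea, here's a rough CPU-side sketch of the kind of per-meshlet culling that mesh/amplification shaders let you run on the GPU (purely illustrative types and names, not real engine or D3D code):

```cpp
#include <vector>

// Illustrative structures: a meshlet is a small cluster of triangles with a
// precomputed bounding sphere, so the whole cluster can be rejected at once.
struct Meshlet { float cx, cy, cz, radius; /* plus vertex/index data in practice */ };
struct Plane   { float nx, ny, nz, d; };   // n.x*x + n.y*y + n.z*z + d >= 0 means "inside"

// Keep only meshlets whose bounding sphere intersects the view frustum.
// On real hardware this test runs per meshlet in an amplification/task shader,
// so culled clusters never reach rasterization at all.
std::vector<Meshlet> CullMeshlets(const std::vector<Meshlet>& meshlets,
                                  const Plane frustum[6]) {
    std::vector<Meshlet> visible;
    for (const Meshlet& m : meshlets) {
        bool inside = true;
        for (int i = 0; i < 6; ++i) {
            const float dist = frustum[i].nx * m.cx + frustum[i].ny * m.cy +
                               frustum[i].nz * m.cz + frustum[i].d;
            if (dist < -m.radius) { inside = false; break; } // fully outside this plane
        }
        if (inside) visible.push_back(m);
    }
    return visible;
}
```

The same pattern extends to per-cluster backface and occlusion culling, which is where most of the claimed gains come from.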

Nanite is an implementation of REYES and is limited to static meshes. It's designed to cull static world geometry without a perceptible loss of detail, which is a different goal from either Sony's or Microsoft's shaders. Epic's method for achieving real-time performance is prohibitive for things like animated models, and Epic have commented in the past that they're still working on this, but it's all R&D at this point, so we're unlikely to see it in this version of Unreal Engine. As is true for virtually everything hardware-supported, the hardware-backed features will be reliably faster in more scenarios than Epic's current software solution.

The idea that Epic have surpassed Primitive or Mesh Shaders with nanite, when none of these have even been shipped and tested in a real game scenario, is a little hyperbolic.
 

Lethal01

Member
The idea that Epic have surpassed Primitive or Mesh Shaders with nanite, when none of these have even been shipped and tested in a real game scenario, is a little hyperbolic.

Yeah, Nanite is a known quantity that anyone can get their hands on and play around with; it's got its limitations, but its usefulness is also clear.
Mesh shaders, on the other hand, are much harder to test. The good thing is that not everything in a scene needs to be Nanite, so even if Nanite is faster for static objects you could still get the benefit of mesh shaders for other stuff.
 
Raytracing-capable games came pretty quickly when the RTX cards launched.

I don't think RT requires nearly as much of a revamp of a game's graphics pipeline as mesh shaders will, so by comparison it's easier to implement.

Also, Nvidia has been doing a lot of partnerships with devs to help them implement RT (as well as DLSS) via CUDA support, so they have a financial interest right there. I'm sure Microsoft and Sony are assisting select developers (especially the bigger 3P ones) with API and middleware support to get RT working as well as possible on their platforms, but that's mainly something teams will have to figure out for themselves over time.

The same goes for mesh shaders, provided they can rewrite parts of their graphics pipeline to fully take advantage of them (or at worst, write new engines to do so).

Didn't he say PS5's Primitive Shaders feature is based on RDNA 1, which is completely false?

That was so long ago even he's probably forgotten. However, while PS5 does use Primitive Shaders, which aren't 100% similar to Mesh Shaders, it's using the updated version (I forgot what it was called, maybe Primitive Shader 1.2 or 2.1 or something like that), which AMD themselves didn't implement for their Vega GPUs (those used an earlier version).

Whatever specific differences there are between Primitive Shaders (specifically the ones Sony are utilizing) and Mesh Shaders, regardless how many or few, how big or small, we'll have to wait until a later time to see. I can't imagine the differences being too big though at the API level or featureset level, and probably not too much at the hardware level, either.

Would be best for not just their 1P devs but especially 3P devs if the two are more or less similar. Makes code portability much more manageable.
 