
Digital Foundry: Alan Wake 2 Xbox Tech Review - Excellent On Series X, But What About Series S?

Zathalus

Member
tflops are tflops if you are just brute forcing your games on these consoles as if they were low-budget PCs.
Maybe if they utilized some smarter techniques and the special optimizations this hardware offers, the results would be better.
But why do that when they can just slap FSR on it and change the preset to low.
What 'special optimizations'? The PS5 is performing like a 2080, while the XSX is almost on the level of a 2080 Ti. This is better performance from the consoles than almost every other game out there. It's also one of the first multiplatform games to use Mesh/Primitive Shaders. These consoles have nothing left to give.
 

Gaiff

SBI’s Resident Gaslighter
UE5 is doing Mesh Shaders, and considering it's probably going to be the most used game engine of the generation, a ton of games will use Mesh Shaders.
And the 5700 XT runs the game poorly because this game engine has no support for Primitive Shaders, unlike UE5. So it has to emulate Mesh Shaders through compute.

But it does have support for Primitive Shaders, which is what the PS5 uses. DX12 presumably has no support for Primitive Shaders, which is why it runs like ass on older cards.

And I still don't understand wtf the difference is between Primitive and Mesh Shaders. All the documentation I've read only speaks of Mesh Shaders.
 

winjer

Gold Member
But it does have support for Primitive Shaders, which is what the PS5 uses. DX12 presumably has no support for Primitive Shaders, which is why it runs like ass on older cards.

And I still don't understand wtf the difference is between Primitive and Mesh Shaders. All the documentation I've read only speaks of Mesh Shaders.


Mr. Wang
Certainly, Mesh Shaders were adopted as the standard in DirectX 12. However, the new geometry pipeline concept originally started from the idea of tidying up the complicated geometry pipeline, making it easier for game developers to use and easier to extract performance from. In other words, both AMD and NVIDIA started from the same goal. To put it bluntly, Primitive Shaders and Mesh Shaders have many similarities in terms of functionality, although there are differences in implementation.
So did AMD abandon the Primitive Shader? As for the hardware, the Primitive Shader still exists, and Mesh Shader functionality is realized on top of the Primitive Shader. That is roughly the picture.

Mr. Wang
Primitive Shaders as hardware exist in everything from the Radeon RX Vega to the latest RDNA 3-based GPUs. When viewed from DirectX 12, the Radeon GPU's Primitive Shaders are designed to work as Mesh Shaders.
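To make the "viewed from DirectX 12" part concrete: on PC, an engine never asks for Primitive Shaders at all; it queries DX12 for Mesh Shader support and the driver answers for whatever hardware sits underneath. A minimal sketch of that query (assuming only an already-created ID3D12Device named device):

```cpp
#include <d3d12.h>

// Hedged sketch: how an engine discovers Mesh Shader support on D3D12.
// 'device' is assumed to be an initialized ID3D12Device*.
D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
if (SUCCEEDED(device->CheckFeatureSupport(
        D3D12_FEATURE_D3D12_OPTIONS7, &opts7, sizeof(opts7))) &&
    opts7.MeshShaderTier >= D3D12_MESH_SHADER_TIER_1)
{
    // Turing+ and RDNA2+ drivers report TIER_1 (on Radeon, the driver
    // maps the work onto Primitive Shader hardware, per the quote above);
    // Vega/RDNA1 drivers reportedly return NOT_SUPPORTED, hence the
    // unsupported-GPU warning in AW2.
}
```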
 

Gaiff

SBI’s Resident Gaslighter
Mr. Wang
Primitive Shaders as hardware exist in everything from the Radeon RX Vega to the latest RDNA 3-based GPUs. When viewed from DirectX 12, the Radeon GPU's Primitive Shaders are designed to work as Mesh Shaders.
And this is what confuses me. If this were true, then the 5700 XT wouldn't perform so poorly. Remedy said verbatim that it's an unsupported GPU, but if Primitive Shaders, viewed from DX12, are designed to work as Mesh Shaders, why isn't that the case with RDNA1 cards?
 
This thread is an example of why I don't really fuck wit the green rats

When it performs better on ps5, they all pop in talking bout it's some BS like tools, marketing and shit.

When xsx does better - best believe it's time to retire the playstation, all game from now on - future technology - will run better on XSX and it's alright, just accept it, xsx the goat, ps weaker like we was saying back in 2020....

I say shite
 

DaGwaphics

Member
What 'special optimizations'? The PS5 is performing like a 2080, while the XSX is almost on the level of a 2080 Ti. This is better performance from the consoles than almost every other game out there. It's also one of the first multiplatform games to use Mesh/Primitive Shaders. These consoles have nothing left to give.

Yeah, I don't really see how you can complain about the console performance. When you see what others are getting when they run PCs at the same or approximate settings, the consoles seem to be punching above their weight; it's just a really heavy game. The "low" in this game isn't particularly easy to run, and I think that's what throws people off.

@SkyHighTrees, I'm sure PS5 will get a cleanup patch before too long and they will make adjustments as needed. Let's not act like the same doesn't happen in reverse, or that it isn't ignored when the fixes come in if Xbox is down. LOL
 

Bojji

Member
And this is what confuses me. If this were true, then the 5700 XT wouldn't perform so poorly. Remedy said verbatim that it's an unsupported GPU, but if Primitive Shaders, viewed from DX12, are designed to work as Mesh Shaders, why isn't that the case with RDNA1 cards?

I think RDNA2 hardware and/or AMD drivers are doing the MS -> PS conversion automatically. Developers would have to specifically write code to use PS natively just for RDNA1 cards, and they won't bother.
 

winjer

Gold Member
And this is what confuses me. If this were true, then the 5700 XT wouldn't perform so poorly. Remedy said verbatim that it's an unsupported GPU, but if Primitive Shaders, viewed from DX12, are designed to work as Mesh Shaders, why isn't that the case with RDNA1 cards?

That's what is confusing me as well.
From what I understand, Mesh Shaders replace the traditional geometry stages with one single stage. So it removes VS+HS+TS+DS+GS.
Basically, MS is like Compute Shaders, but for geometry.
Primitive Shaders also replace most of the traditional stages, with two stages: Surface Shader + Primitive Shader. And it keeps the Tessellation stage in between.

Traditional:
[traditional geometry pipeline diagram]

Primitive Shader:
[Primitive Shader pipeline diagram]

Source: AMD patent
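In API terms, the two diagrams boil down to this: instead of binding a VS/HS/DS/GS chain and letting the input assembler walk an index buffer, the new pipeline compiles one mesh shader and dispatches meshlets like a compute grid. A rough D3D12 sketch (meshShaderPso and meshletCount are hypothetical names, and cmdList6 is assumed to be an ID3D12GraphicsCommandList6):

```cpp
// Hedged sketch: geometry submitted as a compute-style dispatch. A PSO
// compiled with a mesh shader (plus an optional amplification shader)
// stands in for the whole VS+HS+TS+DS+GS front-end.
cmdList6->SetPipelineState(meshShaderPso.Get()); // mesh-shader pipeline state
const UINT meshletCount = 128;                   // e.g. one threadgroup per meshlet
cmdList6->DispatchMesh(meshletCount, 1, 1);      // no input assembler, no VS
```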
 

Gaiff

SBI’s Resident Gaslighter
That's what is confusing me as well.
From what I understand, Mesh Shaders replace the traditional geometry stages with one single stage. So it removes VS+HS+TS+DS+GS.
Basically, MS is like Compute Shaders, but for geometry.
Primitive Shaders also replace most of the traditional stages, with two stages: Surface Shader + Primitive Shader. And it keeps the Tessellation stage in between.

Traditional:
[traditional geometry pipeline diagram]

Primitive Shader:
[Primitive Shader pipeline diagram]

Source: AMD patent
I get that part. The disconnect I don't understand is why the Primitive Shaders on RDNA1 cards aren't exposed as Mesh Shaders, like the link you posted suggested they would be.

I think RDNA2 hardware and/or AMD drivers are doing the MS -> PS conversion automatically. Developers would have to specifically write code to use PS natively just for RDNA1 cards, and they won't bother.
You mean Primitive to Mesh Shaders conversion automatically and not the other way around, correct?

And also, I've found this. This could explain the performance deficit of RDNA1 cards.

https://www.techpowerup.com/240879/amd-cancels-implicit-primitive-shader-driver-support

At one of its 2018 International CES interactions with the press, AMD reportedly announced that it had cancelled the implicit driver path for primitive shaders. Game developers will still be able to implement primitive shaders on AMD hardware, using a (yet to be released) explicit API path. The implicit driver path was the more interesting technology though, since it could have provided meaningful performance gains to existing games and help cut down a lot of developer effort for games in development. AMD didn't state the reasons behind the move.
 

skneogaf

Member
I actually thought when I was watching that maybe the Xbox Series X version had been held back to match the PS5 version.

Both modes are about as good as it gets for locked frame rates on Series X.

Does this game support unlocked frame rates on consoles with a VRR display?

Hmmm, interesting, as I feel like the Xbox fans were saying mesh shaders were going to be a big difference between the two consoles.

I remember seeing a demo of a mesh shader thing that had loads of columns or something being displayed.

Is there a tangible difference between the two consoles in mesh shading abilities, or was it hyperbolic nonsense and this game is just a coincidence?
 

Bojji

Member
I get that part. The disconnect I don't understand is why the Primitive Shaders on RDNA1 cards aren't exposed as Mesh Shaders, like the link you posted suggested they would be.


You mean Primitive to Mesh Shaders conversion automatically and not the other way around, correct?

And also, I've found this. This could explain the performance deficit of RDNA1 cards.

https://www.techpowerup.com/240879/amd-cancels-implicit-primitive-shader-driver-support

Calculations are done on Primitive Shaders, but devs see them as Mesh Shaders thanks to the DX12 layer - no different than on Nvidia hardware. At least that's how I understand it.

I actually thought when I was watching that maybe the Xbox Series X version had been held back to match the PS5 version.

Both modes are about as good as it gets for locked frame rates on Series X.

Does this game support unlocked frame rates on consoles with a VRR display?

Hmmm, interesting, as I feel like the Xbox fans were saying mesh shaders were going to be a big difference between the two consoles.

I remember seeing a demo of a mesh shader thing that had loads of columns or something being displayed.

Is there a tangible difference between the two consoles in mesh shading abilities, or was it hyperbolic nonsense and this game is just a coincidence?

Mesh shaders are a non-issue; the hardware for this is actually mostly the same on both consoles, just the implementation is different.

The biggest differences between PS5 and Xbox are hardware VRS (so far shit in 99% of games using it) and Sampler Feedback Streaming (so far no known usage).
 

winjer

Gold Member
I get that part. The disconnect I don't understand is why the Primitive Shaders on RDNA1 cards aren't exposed as Mesh Shaders, like the link you posted suggested they would be.

That is not the issue I have.
According to Mr. Wang from AMD, Primitive Shaders work as Mesh Shaders. And he says Primitive Shaders have been on Radeon since Vega.
But then why does Alan Wake 2 work perfectly well on RDNA2 and 3, yet not run well on RDNA1 and Vega?
Is something missing in the Primitive Shaders on these older GPUs? Or is something not enabled in AMD's drivers for those cards?


That thing was about Vega, the first to use Primitive Shaders. At the time it was touted as being able to improve performance a lot.
But AMD said it would require driver support to be enabled in all games. That never happened. So it only works when devs implement support in the game.
Some people speculated that the Primitive Shaders on Vega were not complete.
And that this was also the reason why Raja left AMD.
 

Fafalada

Fafracer forever
And the image quality is one of the worst I've seen this gen. Mega SSR grain, mega FSR breakup, mega specular shimmer...
To be fair - that's endemic to Remedy's tech - their last two releases were extra grainy as well (without any RT use too, and more so with RT on).
And even with DLSS, Control is - well - breaking up a lot in motion.
 

Gaiff

SBI’s Resident Gaslighter
That is not the issue I have.
According to Mr. Wang from AMD, Primitive Shaders work as Mesh Shaders. And he says Primitive Shaders have been on Radeon since Vega.
But then why does Alan Wake 2 work perfectly well on RDNA2 and 3, yet not run well on RDNA1 and Vega?
Is something missing in the Primitive Shaders on these older GPUs? Or is something not enabled in AMD's drivers for those cards?
That's exactly what I asked a few posts above lol. Been looking everywhere and no one really has an answer. Lots of speculation and proposed explanations but nothing real and verifiable.
That thing was about Vega, the first to use Primitive Shaders. At the time it was touted as being able to improve performance a lot.
But AMD said it would require driver support to be enabled in all games. That never happened. So it only works when devs implement support in the game.
Some people speculated that the Primitive Shaders on Vega were not complete.
And that this was also the reason why Raja left AMD.
Oh, you're right. RDNA1 didn't come out until mid-2019. I thought the 5700 XT was already out back then. Turns out not, and this was strictly about Vega. But at the same time, RDNA1 also supports Primitive Shaders, so I would assume that implicit driver path was also meant to cover RDNA1, since Mesh Shaders weren't implemented until RDNA2.
 

adamsapple

Or is it just one of Phil's balls in my throat?
I actually thought when I was watching that maybe the Xbox Series X version had been held back to match the PS5 version.

Both modes are about as good as it gets for locked frame rates on Series X.

Does this game support unlocked frame rates on consoles with a VRR display?

Hmmm, interesting, as I feel like the Xbox fans were saying mesh shaders were going to be a big difference between the two consoles.

I remember seeing a demo of a mesh shader thing that had loads of columns or something being displayed.

Is there a tangible difference between the two consoles in mesh shading abilities, or was it hyperbolic nonsense and this game is just a coincidence?

Sadly, no unlocked mode.

But the Xbox version looks like it *could* have overhead for a DRS-supported 40Hz mode. PS5 will probably be better locked with another patch or two, too.

In general, I hope they try to do a 40Hz mode in future updates.

This thread is an example of why I don't really fuck wit the green rats

When it performs better on ps5, they all pop in talking bout it's some BS like tools, marketing and shit.

When xsx does better - best believe it's time to retire the playstation, all game from now on - future technology - will run better on XSX and it's alright, just accept it, xsx the goat, ps weaker like we was saying back in 2020....

I say shite

Can you point to which posts are saying this here?
 

lh032

I cry about Xbox and hate PlayStation.
Damn, looks like I'll hold off for now; the PS5 version needs additional polishing.
 

M1chl

Currently Gif and Meme Champion
Nanite leverages Primitive Shaders, but we haven't seen UE5 games completely crush Pascal and old AMD GPUs the way this game does. Maybe the RDNA 1.0 GPUs can get Primitive Shader support in the future, but Pascal, Vega and Polaris GPUs are shit out of luck with this game. Whereas UE5 games run more or less according to their TFLOPS specs on those older GPUs.

This is basically the first game to offer such a drastic performance increase by leveraging mesh shaders.
And this is why having multiple (or rather a lot of) game engines is important. I've lost a lot of hope for UE5 due to the archaic CPU architecture of that engine (which was already troublesome in UE4), so it would be sad if it were the same story for the whole gen. I don't give a fuck how many effects you can cram on the screen; what's important is that you keep pace with the HW. When I see stuff like this with Northlight (which frankly has amazing physics as well, though it's not really utilized in AW2), my heart beats fast. Otherwise we are still stuck with polished PS4/XB1 engines, which were quickly reworked from X360 ones. Just a few games really leverage the feature set of the HW available (to the masses, like $500 consoles).
 
When the PS5 Pro releases, will Remedy have the decency to update it again? It doesn't have dynamic res scaling as far as I'm aware. It blows my mind that devs still aren't future-proofing these games, and I don't have faith they'll patch them when better hardware arrives.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Does anyone know who created Mesh Shaders? No, right? It was not Sony, Microsoft or AMD.

It was Nvidia who created them, just like FXAA.




I don't think anyone is debating who created it. Both AMD and Nvidia have supported it for a few years now. We just haven't really seen games use it. AW2 is probably the first big retail game, other than a demo or two, to have used it.
 

DenchDeckard

Moderated wildly
This thread is an example of why I don't really fuck wit the green rats

When it performs better on ps5, they all pop in talking bout it's some BS like tools, marketing and shit.

When xsx does better - best believe it's time to retire the playstation, all game from now on - future technology - will run better on XSX and it's alright, just accept it, xsx the goat, ps weaker like we was saying back in 2020....

I say shite

Can you show me the posts that back up what you are claiming about green rats here?

I'm seeing level-headed posts with people talking about why the Xbox GPU is flexing here: a wider CU array, more advanced than the PS5's. It's only natural that a game using this many next-gen techniques is going to see a performance difference, which is in line with the power gap we've known about for 3 years.

The issue this gen is that everything is cross-gen and based on older techniques. The great news is the PS5 Pro will come out next year, perfectly in time for more next-gen games, and it will take the performance crown on every game. So you can relax, everything will be OK.
 

rofif

Can’t Git Gud
Good to see these next gen features finally coming to fruition, great performance on Series X at launch, vindicated again.

If you've got a PS5 with a VRR TV you're going to be fine.
What next gen features?

VRR helps only in performance mode, and even then it goes below 48fps into stutter land. And VRR does nothing for 30fps. They didn't even implement low fps compensation here. Lazy bastards.
 

sinnergy

Member
Can you show me the posts that back up what you are claiming about green rats here?

I'm seeing level-headed posts with people talking about why the Xbox GPU is flexing here: a wider CU array, more advanced than the PS5's. It's only natural that a game using this many next-gen techniques is going to see a performance difference, which is in line with the power gap we've known about for 3 years.

The issue this gen is that everything is cross-gen and based on older techniques. The great news is the PS5 Pro will come out next year, perfectly in time for more next-gen games, and it will take the performance crown on every game. So you can relax, everything will be OK.
Personally, I always looked at the designs and saw their strengths, and I stated this multiple times, instead of cheering for a team. But it's hard, it seems, to look objectively. PS5 shines when the engine is built more like a traditional engine; those like clock speeds. And there are enough of those as we move away from cross-gen.

Series is more about compute: more CUs, mesh shaders, VRS, SFS, a higher TF figure, more bandwidth, a wider bus. But these all need to be implemented in this- or next-generation engines to make use of greater parallel computing.

It is not for nothing that, if the PS5 Pro comes out with the rumored 60 (66?) CUs, they are doing exactly this…

And it is also visible at Epic, who want to leverage parallelism on the CPU side, as their tech is mostly single-threaded (a couple of threads). And we see how performance is in the titles released.
 
So mesh shaders are far more intensive than ray tracing?

It was always the case that as graphics engines progress, developers would want to push more and more triangles; the problem is the old graphics pipeline pretty much sucked at doing this and had so many constraints, such as the lack of fine-grained culling.

You can render the same number of polygons in AW2 with the old pipeline, but the performance hit would be significantly higher.

UE5 is doing Mesh Shaders, and considering it's probably going to be the most used game engine of the generation, a ton of games will use Mesh Shaders.
And the 5700 XT runs the game poorly because this game engine has no support for Primitive Shaders, unlike UE5. So it has to emulate Mesh Shaders through compute.

UE5 does support Mesh and Primitive Shaders, but Nanite mostly leverages what Epic calls "hyper-optimized compute shaders". It's able to render the triangles faster, but not in all scenarios, in which case it falls back onto Mesh and Primitive Shaders.

And this is what confuses me. If this were true, then the 5700 XT wouldn't perform so poorly. Remedy said verbatim that it's an unsupported GPU, but if Primitive Shaders, viewed from DX12, are designed to work as Mesh Shaders, why isn't that the case with RDNA1 cards?

If we look at the RDNA ISA and whitepapers, there were no changes made to the hardware to accommodate Mesh Shaders, so it's very likely a driver problem, which is not rare with AMD... I know the current discourse around this topic among tech enthusiasts is that RDNA 1 lacks a lot of driver support compared to RDNA 2 and 3, but yes, like you said, a lot of this is just speculation and we don't have much official information on it.
 

Filben

Member
They didn’t even implement low fps compensation here.
This is partly on Sony. Because Sony doesn't allow running all games in a 120Hz container at a system level like (IIRC) the XSX does, or like you can do on PC, devs have to manually implement a 120Hz mode for LFC to work. It would have been easy, like on PC, to just set your console (and whatever output/game) to 120Hz and have enough Hz headroom for LFC. But the PS5 only outputs the Hz value the game gives it. A bit more control at the PS5's system level would be amazing. For example, on PC I can set my monitor to 60, 100, 120 or 165Hz; the game's support has nothing to do with it. And 120Hz is actually a pretty amazing sweet spot, because even static fps targets of 30, 40, 60 run very well, and for titles supporting above 60fps you have enough Hz headroom for that extra +60fps smoothness. But Sony being Sony, and devs being devs, don't implement 120Hz modes because... I don't know, they equate Hz with fps, is my guess, and don't see any reason for it on performance-heavy titles. As if 120fps were the only reason for a 120Hz mode. Which it clearly is not.
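For anyone wondering what LFC actually does: when the framerate drops below the display's VRR minimum, the source presents each frame multiple times so the effective refresh lands back inside the VRR window, which is exactly why Hz headroom matters. A toy sketch of the idea, not any console's actual implementation (the 48-120Hz window is an assumption, typical of HDMI 2.1 TVs):

```cpp
#include <cmath>

// Toy LFC sketch: how many times must each frame be presented so the
// effective refresh rate falls inside the VRR window?
// Returns 0 if the window has no room to compensate.
int lfcMultiplier(double fps, double vrrMin = 48.0, double vrrMax = 120.0) {
    if (fps >= vrrMin) return 1;                       // already in range
    int n = static_cast<int>(std::ceil(vrrMin / fps)); // repeats needed
    return (fps * n <= vrrMax) ? n : 0;
}
// e.g. 30fps -> x2 = 60Hz and 25fps -> x2 = 50Hz, both inside 48-120Hz.
// With only a 60Hz output there is almost no headroom, which is the
// whole point of running everything in a 120Hz container.
```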
 

rofif

Can’t Git Gud
This is partly on Sony. Because Sony doesn't allow running all games in a 120Hz container at a system level like (IIRC) the XSX does, or like you can do on PC, devs have to manually implement a 120Hz mode for LFC to work. It would have been easy, like on PC, to just set your console (and whatever output/game) to 120Hz and have enough Hz headroom for LFC. But the PS5 only outputs the Hz value the game gives it. A bit more control at the PS5's system level would be amazing. For example, on PC I can set my monitor to 60, 100, 120 or 165Hz; the game's support has nothing to do with it. And 120Hz is actually a pretty amazing sweet spot, because even static fps targets of 30, 40, 60 run very well, and for titles supporting above 60fps you have enough Hz headroom for that extra +60fps smoothness. But Sony being Sony, and devs being devs, don't implement 120Hz modes because... I don't know, they equate Hz with fps, is my guess, and don't see any reason for it on performance-heavy titles. As if 120fps were the only reason for a 120Hz mode. Which it clearly is not.
Partially on Sony, but there is no reason for them not to implement LFC themselves. A lot of games have it.
 

Riky

$MSFT
Can you show me the posts that back up what you are claiming about green rats here?

I'm seeing level-headed posts with people talking about why the Xbox GPU is flexing here: a wider CU array, more advanced than the PS5's. It's only natural that a game using this many next-gen techniques is going to see a performance difference, which is in line with the power gap we've known about for 3 years.

The issue this gen is that everything is cross-gen and based on older techniques. The great news is the PS5 Pro will come out next year, perfectly in time for more next-gen games, and it will take the performance crown on every game. So you can relax, everything will be OK.

It's interesting that we've now seen a deep Tier 2 VRS implementation with Forza, and now Mesh Shaders finally being utilised. Since AW2 is pretty dark it could have also benefited from Tier 2 VRS. It would be interesting to see how an unlocked mode would perform: how much headroom over 60fps is there on Series X?
We just need to see the first games using Sampler Feedback Streaming now, which we know is in the GDK; maybe the first-party releases next year will feature it.
All these technologies coming to fruition is probably why MS feels they don't need a Pro console.
 

winjer

Gold Member
That's exactly what I asked a few posts above lol. Been looking everywhere and no one really has an answer. Lots of speculation and proposed explanations but nothing real and verifiable.

Or maybe what that Mr. Wang said was just wrong, and RDNA2 and 3 do have Mesh Shaders and not Primitive Shaders.
 
Or maybe what that Mr. Wang said was just wrong, and RDNA2 and 3 do have Mesh Shaders and not Primitive Shaders.

Or what the VP of engineering at AMD said was correct... but again, it's not like we have driver-level code showing Mesh Shaders being converted into Primitive Shaders on RDNA 2 desktop GPUs. /s

So do RDNA 2 and 3, but AMD's drivers still convert Mesh Shaders into Primitive Shaders at the driver level.

In fact, AMD's RDNA ISA and whitepapers don't even mention Mesh Shaders.

We already knew this about three years ago.

 

Radical_3d

Member
MS feels they don't need a Pro console
MS can feel however it wants. If the guys looking for the cutting edge of the affordable market have a 12TF Xbox vs a 20+TF option, they'll choose the latter. With the 75 billion spent buying Skyfield and Candy Crush, they could have a state-of-the-art mid-gen refresh. Personally, I'm waiting for the physical release and the Pro console to play this next year.
 

Bojji

Member
In short: next-gen engines like UE5, the next Capcom engine, and Remedy's games will perform better on Xbox Series X than on PS5, since it's clear the Series X GPU is more advanced than the PS5's and software hacks cannot save you all the time.

Hahaha, if you really think that way you will be surprised a few times in the future.

People are throwing around words like "next-gen engine" and "old-gen engine" like they actually mean something; we have had compute-heavy engines for years now.

PS5 has faster hardware in some GPU aspects (thanks to higher clocks mostly) but slower in others.

Xbox has faster hardware in some GPU aspects but slower in others, plus higher memory bandwidth.

Some games/engines will be bottlenecked by things that are slower on Xbox hardware, and some will be bottlenecked by things that are slower on PlayStation hardware.

This is how this generation is going so far, and it most likely won't change. So yeah, it's pure hypocrisy when a game performs better on Xbox and people scream "finally, a next-gen engine!", but when a game performs better on PS5 there are only excuses for the lower Xbox performance, with no mention of the hardware aspects that could lead to it.
 
MS can feel however it wants. If the guys looking for the cutting edge of the affordable market have a 12TF Xbox vs a 20+TF option, they'll choose the latter. With the 75 billion spent buying Skyfield and Candy Crush, they could have a state-of-the-art mid-gen refresh. Personally, I'm waiting for the physical release and the Pro console to play this next year.
I don't think there will be Pro consoles, just because game releases that fully utilize the current non-Pro consoles have been very slow coming.
I don't think a Pro console sits right at the moment, as much as I'd like to see them personally.
 

Roxkis_ii

Member
Devs need to stop bending over to these people. Most games, if not all, have been 30fps. All the classics, all the GOTYs in the past 20-25 years have been 30fps games. No one cares. Make the game you want to make. I played FF16, Zelda and now Spider-Man 2 at 30fps. They were smooth games. These silly compromised 60fps modes are hurting the fidelity and artistic intent of these games and causing more drama, thanks to DF, than it's worth.

I'm sorry, how does a performance mode stop you from enjoying games at 30fps? Should devs also remove options from PC users too?
 

sinnergy

Member
Hahaha, if you really think that way you will be surprised a few times in the future.

People are throwing around words like "next-gen engine" and "old-gen engine" like they actually mean something; we have had compute-heavy engines for years now.

PS5 has faster hardware in some GPU aspects (thanks to higher clocks mostly) but slower in others.

Xbox has faster hardware in some GPU aspects but slower in others, plus higher memory bandwidth.

Some games/engines will be bottlenecked by things that are slower on Xbox hardware, and some will be bottlenecked by things that are slower on PlayStation hardware.

This is how this generation is going so far, and it most likely won't change. So yeah, it's pure hypocrisy when a game performs better on Xbox and people scream "finally, a next-gen engine!", but when a game performs better on PS5 there are only excuses for the lower Xbox performance, with no mention of the hardware aspects that could lead to it.
Not true, there is always talk about games that perform better on PS5 favoring clock speeds.. but the fact is, more CUs are a benefit in engines that take this into account. If Sony could, they would have had these high clock speeds and equal CUs like Series X; they couldn't, for reasons (probably staying in budget). But the fact is engines are going for more parallel instructions, compute-heavy as you call it, and they will become even more compute-heavy. Couple this with what is available in hardware on Series X (hardware VRS, SFS, full mesh shader support): it is a pretty smart design, and if this is all leveraged you can go a lot further.

The problem is, you need to implement all of this, and that costs money and time. And most develop for all systems.. so we will see how this sticks.
 

adamsapple

Or is it just one of Phil's balls in my throat?
In short: next-gen engines like UE5, the next Capcom engine, and Remedy's games will perform better on Xbox Series X than on PS5, since it's clear the Series X GPU is more advanced than the PS5's and software hacks cannot save you all the time.

Depends on the engine and which features of the engine developers are utilizing. There's no catch-all.
 
Not true, there is always talk about games that perform better on PS5 favoring clock speeds.. but the fact is, more CUs are a benefit in engines that take this into account. If Sony could, they would have had these high clock speeds and equal CUs like Series X; they couldn't, for reasons (probably staying in budget). But the fact is engines are going for more parallel instructions, compute-heavy as you call it, and they will become even more compute-heavy.

You have to look beyond graphics engines as a whole. Not all graphics workloads are compute-bound; some favor higher clocks, and this is why we've been seeing the PS5 trade blows with the Series X.

SFS is another example. People think it's going to bring some sort of magic performance boost to Series X titles over PS5, but it's not a compute-bound feature and scales more with SSD bandwidth and latency. In fact, we've already been seeing systems similar to SFS, like Virtual Texturing in UE4 & 5 and in several other engines.

I wouldn't start making conclusions based on one game either.
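As for the SFS point above: on the API side it's exposed as just another capability bit, right next to mesh shaders. A rough sketch of the query (device is again an assumed, already-initialized ID3D12Device*):

```cpp
// Hedged sketch: Sampler Feedback support is reported via the same
// OPTIONS7 query used for Mesh Shaders.
D3D12_FEATURE_DATA_D3D12_OPTIONS7 opts7 = {};
device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS7,
                            &opts7, sizeof(opts7));
bool hasSamplerFeedback =
    opts7.SamplerFeedbackTier >= D3D12_SAMPLER_FEEDBACK_TIER_0_9;
// The feedback map only records WHICH texture tiles were actually
// sampled; streaming those tiles in is still an I/O job, which is why
// any win scales with SSD bandwidth/latency rather than with compute.
```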
 

Bojji

Member
Not true, there is always talk about games that perform better on PS5 favoring clock speeds.. but the fact is, more CUs are a benefit in engines that take this into account. If Sony could, they would have had these high clock speeds and equal CUs like Series X; they couldn't, for reasons (probably staying in budget). But the fact is engines are going for more parallel instructions, compute-heavy as you call it, and they will become even more compute-heavy. Couple this with what is available in hardware on Series X (hardware VRS, SFS, full mesh shader support): it is a pretty smart design, and if this is all leveraged you can go a lot further.

The problem is, you need to implement all of this, and that costs money and time. And most develop for all systems.. so we will see how this sticks.

When Microsoft doesn't give a crap about most of the RDNA2 features, don't expect third-party developers to either (and so far they don't). Right now we know that:

VRS is crap; even if the Xbox version performs 5% better thanks to it, it will most likely look noticeably worse.

The lack of Mesh Shaders is a non-issue.

SFS is still a pure mystery.

So far Xbox wins in some games thanks to better aspects of some hardware parts, not any of those RDNA2-exclusive features. Even in Doom Eternal, where you get higher resolution thanks to VRS, you could say the game actually has worse IQ because this technique makes some things look like crap.
 

Darsxx82

Member
This is how this generation is going so far, and it most likely won't change. So yeah, it's pure hypocrisy when a game performs better on Xbox and people scream "finally, a next-gen engine!", but when a game performs better on PS5 there are only excuses for the lower Xbox performance, with no mention of the hardware aspects that could lead to it.

That on many occasions the studios prioritize the PS5 version, and that it is also usually the base development platform, is not an excuse, it is a reality. On such evenly matched hardware, this is always going to have an effect.

Especially when you see how in many cases performance improves within weeks or less than a month on Xbox, with patches more and more frequently arriving later on Xbox compared to PS5.

Depends on the engine and which features of the engine developers are utilizing. There's no catch-all.

And the optimization time dedicated to each platform, and which platform is the base development one. On hardware (XSX and PS5) that is so on par and also shares technology, this is always key. At least for the results on launch day, because we've already seen how these change with successive patches. For example, RE4 Remake was quite significant.
 

adamsapple

Or is it just one of Phil's balls in my throat?
VRS is crap; even if the Xbox version performs 5% better thanks to it, it will most likely look noticeably worse.

So far Xbox wins in some games thanks to better aspects of some hardware parts, not any of those RDNA2-exclusive features. Even in Doom Eternal, where you get higher resolution thanks to VRS, you could say the game actually has worse IQ because this technique makes some things look like crap.

Bit of a generalization here; the difference in quality isn't noticeable outside of 400% zoom, and decent usage of it can result in double-digit performance improvements. Also, CDPR uses it for Cyberpunk and it's hardly noticeable there as well.



Considering much of Xbox's first-party output is on Unreal Engine 5 in the coming year(s), they'll probably support VRS, Unreal's version of mesh shading and all that jazz.
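For context on what Tier 2 buys you: Tier 1 VRS can only set one shading rate per draw, while Tier 2 adds per-primitive rates and a screen-space rate image, blended through combiners. A rough D3D12 sketch of a per-draw coarse rate (cmdList5 is an assumed ID3D12GraphicsCommandList5, on a device that reported D3D12_VARIABLE_SHADING_RATE_TIER_2):

```cpp
// Hedged sketch: shade subsequent draws at 2x2 coarse rate and let a
// screen-space rate image coarsen things further where it says so.
D3D12_SHADING_RATE_COMBINER combiners[2] = {
    D3D12_SHADING_RATE_COMBINER_PASSTHROUGH, // ignore per-primitive rates
    D3D12_SHADING_RATE_COMBINER_MAX          // take coarser of base vs. image
};
cmdList5->RSSetShadingRate(D3D12_SHADING_RATE_2X2, combiners);
// Dark, low-contrast regions (and AW2 is mostly dark) are where 2x2
// shading is hardest to notice, which is the argument being made above.
```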
 

Bojji

Member
That on many occasions the studios prioritize the PS5 version, and that it is also usually the base development platform, is not an excuse, it is a reality. On such evenly matched hardware, this is always going to have an effect.

Especially when you see how in many cases performance improves within weeks or less than a month on Xbox, with patches more and more frequently arriving later on Xbox compared to PS5.

And the optimization time dedicated to each platform, and which platform is the base development one. On hardware (XSX and PS5) that is so on par and also shares technology, this is always key. At least for the results on launch day, because we've already seen how these change with successive patches. For example, RE4 Remake was quite significant.

This is reality and I know it. But what do you expect? Xbox as a whole is outsold by PS5 at a 2:1 ratio, and then you have the Xbox Series S dominating Xbox sales. I suspect developers sometimes pay more attention to the Series S version and Series X "just works". That's the reality. I also doubt it produces any big differences; if a game is unoptimized, it is unoptimized in the core code, and both consoles, outside of the APIs, are essentially the same hardware.

Bit of a generalization here; the difference in quality isn't noticeable outside of 400% zoom, and decent usage of it can result in double-digit performance improvements. Also, CDPR uses it for Cyberpunk and it's hardly noticeable there as well.

Considering much of Xbox's first-party output is on Unreal Engine 5 in the coming year(s), they'll probably support VRS, Unreal's version of mesh shading and all that jazz.


I wonder what performance difference it produces and if it is even worth using; most people don't see this at all as they play with DLSS. The Xbox version of Cyberpunk had noticeable image deficiencies with VRS and ran worse than the PS5 version.
 

Riky

$MSFT
Bit of a generalization here; the difference in quality isn't noticeable outside of 400% zoom, and decent usage of it can result in double-digit performance improvements. Also, CDPR uses it for Cyberpunk and it's hardly noticeable there as well.

Considering much of Xbox's first-party output is on Unreal Engine 5 in the coming year(s), they'll probably support VRS, Unreal's version of mesh shading and all that jazz.


id carried on developing it after Doom Eternal; there was a presentation on how much progress they had made. We've seen the fruits of this with Starfield and Forza, where it hasn't even been noticed by some analysis outlets. When you're talking about millisecond frame budgets, it's important.
 