
DF: The Matrix Awakens Tech Analysis + PS5 vs Xbox Series S/X Performance Analysis

Darsxx82

Member
This is false. They had already worked on XSX when they made the UE5 Valley of the Ancient demo for Xbox Series X.


What?? It's clear you don't know the difference between showing a few cuts of footage and shipping a complete demo for users to play. To show a few seconds of footage you don't need any experience or optimization time; just pick a few cuts and say it works.
A public demo requires offering something that runs decently on specific hardware, and without experience on that hardware it's much harder (even more so for a small team)... That's where The Coalition comes in, to make up for that lack of Xbox Series experience, and from what has been published, we can say they have done an excellent job, one that even the PS5 has benefited from.
 
Last edited:

RoadHazard

Gold Member
It's in gameplay: RT reflections and shadows, together with SSR; it's in the DF video. Even an extra bounce in the calculations.

No, pretty sure everything I've read about it says RT shadows are not in gameplay. Where does DF say that?

What IS in gameplay is RT GI and reflections.
 
Last edited:

assurdum

Banned
If we really want to tell it all, the one who has no idea what he's talking about, and who in those famous threads was among those shouting that the demos made for Sony couldn't possibly run on other devices at the same quality, was you (along with others). I'm wasting time with you and I'll put you on ignore, also because I don't like how you make it personal... I never said you don't have the basis to talk about certain things (even if it's evident), so I don't understand how you can point the finger at others.

For your information, the demo ran on a laptop... The idea of an engine engineer talking about the resolution and frames per second of a video running on a laptop... is as stupid as your answers and an insult to human intelligence. Bye
I definitely have more of a technical grounding than you, that's for sure. The fact that you keep spreading misinformation about what Epic/Sweeney actually said about the PS5 hardware "advantages" shows you're just a cheerleading fanboy like Riky, who doesn't really care about the technical argument; it's all about raising the MS flag whenever there's an opportunity and starting a console war if someone dares to praise anything related to the PS5 hardware. Poor child. Yeah, better put me on the ignore list; it's easier than facing someone who confronts you with your hypocrisy and asinine attitude.
 
Last edited:

MonarchJT

Banned
I definitely have more of a technical grounding than you, that's for sure. The fact that you keep spreading misinformation about what Epic/Sweeney actually said about the PS5 hardware "advantages" shows you're just a cheerleading fanboy like Riky, who doesn't really care about the technical argument; it's all about raising the MS flag whenever there's an opportunity and starting a console war if someone dares to praise anything related to the PS5 hardware. Poor child. Yeah, better put me on the ignore list; it's easier than facing someone who confronts you with your hypocrisy and asinine attitude.
You have shown, every time you've had the opportunity, that you only say biased things. This one about UE5 is just the latest... As for your technical knowledge, I don't care about your opinion, nor about your skills; please don't judge mine, since on this, like everything else, you're flying completely blind.
The important thing is that, as usual, what we'd been discussing for months turned out exactly as I said: I/O beyond what the engine requests gives no advantage, so claiming that certain things were only possible with the PS5's I/O was wrong... Past that level, in UE5 the GPU is the true limit. Data and facts show this. The rest is just talk.
 
Last edited:
I just take what the developers who are actually using it say; id said they wish every platform supported it. I don't think it gets any clearer than that.
It's kind of strange to focus so much on this specific thing in every single conversation, though. It's a feature, and it helps with performance (always at the cost of visual quality)... Sure, you'd want it on everything so that everyone who wants to use it has access to it... Nvidia cards have had it since 2019, some Intel laptop GPUs have had it for a while as well, AMD has had it on their GPUs too (they call it Variable Shading), and it has been on PlayStation since the PS4 Pro was released (2016, I believe).

So it's neither free performance, nor is it a unique feature... So why would developers act like it's not on "other" platforms?

Here is what Tom's Hardware has to say about it on PC:

Variable rate shading (VRS) is a type of rendering technique used by Nvidia graphics cards based on the Turing (RTX 20-series and GTX 16-series cards) and Ampere (RTX 30-series) architectures, as well as Intel's Gen11 graphics architecture, which arrived in laptops in 2019 via Intel's 10nm Ice Lake CPUs.

A little further on AMD:
AMD's alternative is called FidelityFX Variable Shading, also called VS. It differs by being open source, which AMD claims will allow for easier implementation in games.

Then for the PS5 (you must have been part of the thread about this tweet because you pay so much attention to this technology):


The opposite would have been surprising since the ps4pro had variable shading.

Qualcomm has support for a similar feature on their phone chips:

Apple too:

So where are those poor developers who don't have access to some sort of variable shading? The Switch? Then why do you even bring it up here?
 

adamsapple

Or is it just one of Phil's balls in my throat?
Lol, this is hilarious coming from you. When it comes to PS5 vs XSX comparison, you say higher resolution is the way forward. :messenger_tears_of_joy:

Not to assume what Riky meant, but he's talking about Unreal's super resolution in general.

If two platforms are running the same super resolution and one of them runs at a higher resolution, that'd be a separate topic and worth comparing.
 

Riky

$MSFT
It's kind of strange to focus so much on this specific thing in every single conversation, though. It's a feature, and it helps with performance (always at the cost of visual quality)... Sure, you'd want it on everything so that everyone who wants to use it has access to it... Nvidia cards have had it since 2019, some Intel laptop GPUs have had it for a while as well, AMD has had it on their GPUs too (they call it Variable Shading), and it has been on PlayStation since the PS4 Pro was released (2016, I believe).

So it's neither free performance, nor is it a unique feature... So why would developers act like it's not on "other" platforms?

Here is a word tomshardware has to say about it on PC:



A little further on AMD:


Then for the PS5 (you must have been part of the thread about this tweet because you pay so much attention to this technology):


The opposite would have been surprising since the ps4pro had variable shading.

Qualcomm has support for a similar feature on their phone chips:

Apple too:

So where are those poor developers who don't have access to some sort of variable shading? The Switch? Then why do you even bring it up here?


Because they are talking about software VRS, which comes at a cost in resources and quality.
Hardware-assisted Tier 2 VRS is a different beast.


Enjoy.
 

Nankatsu

Member
While slightly pretty, I wasn't impressed at all by its performance.

Most games I've been playing lately consistently target 60 fps, and the sudden shift back to 30 fps feels extremely bad and choppy.

And this is only a very scripted, limited tech demo; imagine if it were more open in terms of things to do.

Sure, the demo isn't optimized for a performance mode, but still... personally, this is not what I want from next-gen, performance-wise.
 
Last edited:
Because they are talking about software VRS, which comes at a cost in resources and quality.
Hardware-assisted Tier 2 VRS is a different beast.


Enjoy.
They all show artefacts.

And it's not like the 10-15% extra performance figure was based on a comparison with existing variable shading implementations; it was compared to nothing... Which means the actual benefit may be close to nil, much like regular upscaling vs. what Sony made for the PS4 Pro.

So who are the poor devs who can't do any kind of VRS?

And again, it's not like the MS dev blog was going to compare against what's in the PS5 with its geometry engine, or Qualcomm phone chips, etc. Their documentation is part of their PR strategy.
 
Last edited:

Riky

$MSFT
They all show artefacts.

And it's not like the 10-15% extra performance figure was based on a comparison with existing variable shading implementations; it was compared to nothing... Which means the actual benefit may be close to nil, much like regular upscaling vs. what Sony made for the PS4 Pro.

So who are the poor devs who can't do any kind of VRS?

And again, this is not like the

You obviously didn't read the article,

"While the use of Tier 1 VRS in Gears Tactics offered some great performance gains, it had some small compromises to visual quality and didn’t work well with Dynamic Resolution Scaling. As a result, we investigated the extra flexibility allowed in Tier 2 to see if we could solve the Tier 1 shortcomings.

The primary difference between Tier 1 and Tier 2 VRS is granularity. Tier 1 allows you to specify a shading rate per draw. Tier 2 allows you to instead specify the shading rate in a screen space texture."

Then from DF in the Doom Eternal review:

"It's also interesting to note that Xbox Series consoles use the hardware-based tier two VRS feature of the RDNA2 hardware, which is not present on PlayStation 5. VRS stands for variable rate shading, adjusting the precision of pixel shading based on factors such as contrast and motion. Pre-launch there was plenty of discussion about whether PS5 had the feature or not and the truth is, it doesn't have any hardware-based VRS support at all."

What you believe doesn't really matter, we've got evidence from actual game developers saying different.
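The per-draw vs. per-tile distinction in those quotes can be sketched with a toy model. To be clear, this is an illustration only, not the D3D12 VRS API: the tile size, the screen region, and the rates are invented, and real hardware picks rates from a shading-rate image rather than a Python callback.

```python
# Toy model of VRS granularity (illustration only, not the real API).
# Tier 1: one shading rate for the whole draw.
# Tier 2: a shading rate chosen per screen-space tile, e.g. from a
# contrast/motion map, so only low-detail tiles get coarser shading.

TILE = 16  # hypothetical tile size in pixels

def shader_invocations(width, height, rate_for_tile):
    """Count pixel-shader invocations given a per-tile shading rate.

    rate_for_tile(tx, ty) returns (rx, ry): 1x1 shades every pixel,
    2x2 shades one pixel per 2x2 block, and so on.
    """
    total = 0
    for ty in range(height // TILE):
        for tx in range(width // TILE):
            rx, ry = rate_for_tile(tx, ty)
            total += (TILE // rx) * (TILE // ry)
    return total

W, H = 1920, 1080
full = shader_invocations(W, H, lambda tx, ty: (1, 1))

# Tier 1 style: the whole draw at a coarse 2x2 rate.
# Big saving (75%), but the quality cost applies everywhere.
tier1 = shader_invocations(W, H, lambda tx, ty: (2, 2))

# Tier 2 style: only the bottom third of the screen (say, a blurry
# road surface) drops to 2x2; everything else stays at full rate.
def tier2_rate(tx, ty):
    return (2, 2) if ty >= (H // TILE) * 2 // 3 else (1, 1)

tier2 = shader_invocations(W, H, tier2_rate)

print(full, tier1, tier2)  # tier1 saves 75% of invocations, tier2 ~26%
```

The point the quotes make falls out of the model: Tier 2 trades away less performance than a blanket coarse rate, but it only degrades the tiles a rate map says it can afford to.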
 
Last edited:

Riky

$MSFT
Just to be clear, I will not take you seriously, because it has been shown time and again that no matter what, VRS shows visible artefacts. I don't care how much better MS claims it is than Tier 1 trolling... It's not.

It's not about taking me seriously; it's about denying what MS, The Coalition, AMD, and id are telling you. Believe what you want, but most reasonable people will accept what the people who make the hardware, and who are actually using it, say as true. Nobody is saying it has no effect on visual quality; rather, in motion, since it's a frame-by-frame implementation, the effect is negligible compared to a much lower resolution, as Doom Eternal shows in the DF breakdown, and it comes with a nice performance bump, as also shown.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Just to be clear, I will not take you seriously, because it has been shown time and again that no matter what, VRS shows visible artefacts. I don't care how much better MS claims it is than Tier 1 trolling... It's not.

?


The whole point of VRS is to reduce fidelity in select areas of the frame to gain performance. If you don't notice any difference outside of 400% zooms, VRS is doing its job as expected.

Tier 2 VRS is obviously an improvement over Tier 1 VRS. Reportedly, moving from T1 to T2 allowed the devs to gain 14% more performance in selected scenarios in Gears Tactics without any perceptible visual changes between them... that's exactly what the feature is designed to do.

VRS is not meant to be a silver bullet, but when used adequately, it gains you some crucial performance without perceptible degradation of the visuals. I call that an absolute win.
 
Last edited:
So Sweeney lied, but the Reddit video showing a media player in a badly translated Chinese AMA is a better source? Sounds legit.
Yes, Sweeney lied by implying that one of his lead engineers doesn't know the difference between a video and a natively running application. I'm not sure what's more comical: Sweeney's damage control or people desperately believing it.
 
True. I'm not complaining at all, I'm actually happy they're trying to push these systems to their max even if it meant choosing 24fps for the cinematics with further dips into low 20s during those intense collisions during gameplay. I really wanted to see what the Zen 2 CPUs in these systems were capable of when pushed harder, I know it's still not fully optimized and I'm sure there's still a ton of room for improvement on the CPU side. Very curious to see how it all shakes out in the coming years.

Actually, most of the work they need to do to improve speeds is on the GPU, because this new tech is all light on the CPU.
The CPU is not being stressed in these demos.
 

RoadHazard

Gold Member
So at 10tf rtx 3060m should run this at 1080p 60 fps?

Big boy consoles will definitely come down to 1080p for 60 fps.

With Series S chugging along at 720p 30.

Not likely. With all the heavy Lumen and Nanite stuff going on here, I doubt just lowering the resolution would give you double framerate. Also, 1440p is just 78% more pixels than 1080p, so even if it WAS as simple as pixels rendered -> framerate (it's not) you'd have to go even lower. We also don't know what the CPU utilization is at 30fps. They say there's headroom left for game logic, but how much? If utilization is already over 50% you're gonna have issues achieving 60fps (changing resolution etc has no effect on CPU load).
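For what it's worth, the pixel arithmetic in that post checks out. A quick sanity-check script (nothing more):

```python
# Check the claim: 1440p has ~78% more pixels than 1080p, so dropping
# from 1440p to 1080p cuts pixel work by well under half.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400

extra = pixels_1440p / pixels_1080p - 1
print(f"1440p over 1080p: {extra:.0%} more pixels")   # -> 78%

# Going the other way: how much pixel work does the drop actually save?
saved = 1 - pixels_1080p / pixels_1440p
print(f"1440p -> 1080p saves {saved:.0%} of pixels")  # -> 44%
```

So even under the (unrealistic) assumption that frame time scales linearly with pixel count, a 44% cut in pixels is well short of the 50% cut needed to double the framerate, which is the post's point.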
 
Last edited:

FrankWza

Member
Please name a cheaper, higher-performing option. This demo demonstrates the same performance as the higher-end consoles. It has lower resolution, and that is by design. The console is $300; no one but non-Xbox customers complains about unrealistic expectations placed on the budget device. Paying more and getting higher resolution isn't remarkable.
Nope, not at all. The PlayStation platform is objectively more expensive to purchase and play games on overall. First-party titles cost more, upgrading games in many cases costs extra, and cloud saves are behind a paywall. Sony also barely makes the digital version of the PS5 available, whereas Microsoft has kept plenty of XSS units in the retail pipeline. Sony also does not allow you to purchase digital games outside of their storefront, limiting ways to save money if you don't like Sony's price. I was at Target recently and there were several digital Xbox games for sale, and zero digital titles for the PlayStation 5. For the same price you get far more games and options on XSS that you can't get on PlayStation at all. The PS5 certainly has a more powerful GPU, but it isn't a better value.

On topic: this demo is proof that the XSS is quite capable of playing current-generation games at performance levels comparable to more expensive devices. It is quite impressive what $300 can get you. I'm curious what games that take full advantage of the full feature set will look like.
 

elliot5

Member
It's kind of strange to focus so much on this specific thing in every single conversation, though. It's a feature, and it helps with performance (always at the cost of visual quality)... Sure, you'd want it on everything so that everyone who wants to use it has access to it... Nvidia cards have had it since 2019, some Intel laptop GPUs have had it for a while as well, AMD has had it on their GPUs too (they call it Variable Shading), and it has been on PlayStation since the PS4 Pro was released (2016, I believe).

So it's neither free performance, nor is it a unique feature... So why would developers act like it's not on "other" platforms?

Here is a word tomshardware has to say about it on PC:



A little further on AMD:


Then for the PS5 (you must have been part of the thread about this tweet because you pay so much attention to this technology):


The opposite would have been surprising since the ps4pro had variable shading.

Qualcomm has support for a similar feature on their phone chips:

Apple too:

So where are those poor developers who don't have access to some sort of variable shading? The Switch? Then why do you even bring it up here?

I don't understand the purpose of the tweet. I think he's saying that people pre-launch were trying to claim VRS is better for optimization than the Geometry Engine (aka mesh shaders and primitive shaders), because nobody knows what they're talking about.

Obviously reducing poly counts and simplifying your scene assets has much more benefit in terms of optimization than more granular pixel shader optimization (vrs tier 2). That's what he's pointing out here.

You can do both: optimize meshes and optimize pixel shaders. One will have better gains than the other in most scenarios, but they're not mutually exclusive techniques.
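A back-of-the-envelope sketch of that point, with invented frame-time numbers (nothing here comes from the actual demo), just to show that savings in different pipeline stages stack rather than compete:

```python
# Hypothetical frame budget in milliseconds; all numbers are made up
# purely to illustrate that geometry and shading savings combine.
geometry_ms = 6.0   # vertex / mesh-shader work
shading_ms = 10.0   # pixel shading
other_ms = 4.0      # everything else (post, UI, etc.)

def frame_time(geo_scale=1.0, shade_scale=1.0):
    # Each optimization scales only the stage it targets.
    return geometry_ms * geo_scale + shading_ms * shade_scale + other_ms

base = frame_time()                        # 20.0 ms
mesh_only = frame_time(geo_scale=0.5)      # halve geometry work only
vrs_only = frame_time(shade_scale=0.86)    # ~14% shading cut only
both = frame_time(geo_scale=0.5, shade_scale=0.86)

print(base, mesh_only, vrs_only, both)
```

Under these assumed numbers the mesh-side win is bigger, matching the post's claim that one usually gains more than the other, but applying both is strictly better than either alone.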
 
Last edited:

ethomaz

Banned

Maybe I'm misinterpreting it, but to me that says they have the same version everywhere, and the engine just scales automatically depending on how much power is available on the hardware it's currently running on.

I’m not sure what he is trying to say here.
Because the project needs to be compiled with the PS5 SDK or the MS GDK, which applies platform-specific optimizations.

Maybe he is saying the base code was optimized by The Coalition in Unreal Engine before being exported to the SDKs, where it gets further platform-specific optimization to generate each platform binary.

So all the optimizations made by The Coalition are indeed exported to all platforms, whether or not every platform benefits.
 
Last edited:
VRS is not meant to be a silver bullet, but when used adequately, it gains you some crucial performance without perceptible degradation of the visuals. I call that an absolute win.
The problem is that the artefacts are perceptible... otherwise we would not be having this discussion. The thing is, the feature is presented as a silver bullet by some.

I won't argue against it; I just think it's oversold and underwhelming.
 

Lethal01

Member
Very, very distant from what people were thinking and screaming about regarding GB/s of streaming, ahahah

They literally said they had to include some prerendered scenes specifically because they didn't have enough time to stream data in for this demo.
So again, this demo shows that we still need faster SSDs.
 
I don't understand the "Coalition helped optimize the engine" narrative as a feather in the cap of the Green Rats. If an XSX dev helped with optimization, and the advantages in I/O don't result in better performance, and it still runs better on PS5, don't you have to conclude that the PS5 is the more capable machine? From a logical standpoint, that is.
 
Last edited:

Greggy

Member
I don't understand the "Coalition helped optimize the engine" narrative as a feather in the cap of the Green Rats. If an XSX dev helped with optimization, and the advantages in I/O don't result in better performance, and it still runs better on PS5, don't you have to conclude that the PS5 is the more capable machine? From a logical standpoint, that is.
I didn't know it had a better performance on PS5. Source?
This article proves that there is an Xbox studio at the very forefront of UE5 development, which is probably what the green ratpack is claiming as a win.
 

elliot5

Member
I dont understand the "Coalition helped make optimize the engine narrative" as a fether in the cap of the Green Rats. If a XSX dev helped with optimization and the advantages in I/O dont result in better performance and it still runs better on PS5 dont you have to conclude that the PS5 is the more capable machine? Form a logical stand point that is
Nobody is pushing a narrative that The Coalition helped optimize the engine as some sort of XSX win, good grief. The Coalition are just talented Unreal developers who work closely with Epic all the time, including on UE5 early access. Some of the work they did for Epic's demo was engine-level and improved performance across all target platforms. That's it. That's the tweet. There are no "takeaways", just an interesting tidbit that they weren't solely there to make memory optimizations to let it run on XSS.
 

Klik

Member
I just played it on PS5. It looks OK, but honestly nothing impressive. I guess, graphically speaking, I expected too much from this gen.
 

sinnergy

Member
I just played it on PS5. It looks OK, but honestly nothing impressive. I guess, graphically speaking, I expected too much from this gen.
On a CRT TV? Is this a joke post? It's almost indistinguishable from offline-rendered movie CGI. It's almost on par with Final Fantasy: The Spirits Within, but in real time on 4-12 TF consoles...
 

Ironbunny

Member
I just played it on PS5. It looks OK, but honestly nothing impressive. I guess, graphically speaking, I expected too much from this gen.

Totally the opposite for me. Knowing the limits a console sets for hardware, this went way beyond my expectations of what is possible this gen. And when you think of a story-driven single-player game fully using the engine to its limits... I don't think we've seen anything yet.
 

Mr Moose

Member
On a CRT TV? Is this a joke post? It's almost indistinguishable from offline-rendered movie CGI. It's almost on par with Final Fantasy: The Spirits Within, but in real time on 4-12 TF consoles...
I think you need to re-watch The Spirits Within. It did not age well.
 

DForce

NaughtyDog Defense Force
On a CRT TV? Is this a joke post? It's almost indistinguishable from offline-rendered movie CGI. It's almost on par with Final Fantasy: The Spirits Within, but in real time on 4-12 TF consoles...

Can't believe you said Spirits Within. lol

There are cutscenes from last generation that look better than that movie. :messenger_tears_of_joy:
 