
[Digital Foundry] Immortals of Aveum PS5/Xbox Series X/S: Unreal Engine 5 is Pushed Hard - And Image Quality Suffers

sinnergy

Member
Dude, multiple developers have openly given Digital Foundry exact PS5 and XSX settings for their games before. There is no NDA here when it comes to this stuff.

And they trashed UE5 throughout that interview.
They trashed nothing. They even went from UE4 to UE5.1 in steps, and they use most UE5 features; they liked that they could even use ZBrush models and Nanite. They did not like Hardware Lumen that much, but are looking into implementing it in the future. With all new tech it takes time. Who knows how clean their own code is.
 
Last edited:
They trashed nothing. They even went from UE4 to UE5.1 in steps, and they use most UE5 features; they liked that they could even use ZBrush models and Nanite. They did not like Hardware Lumen that much, but are looking into implementing it in the future. With all new tech it takes time. Who knows how clean their own code is.

Dirty devs.

LOL

First time I've heard that.
 

Darsxx82

Member
Dude, multiple developers have openly given Digital Foundry exact PS5 and XSX settings for their games before. There is no NDA here when it comes to this stuff.

And they trashed UE5 throughout that interview.

No, it's not like that at all. When they have published console settings, it has been in general terms, and only when said settings were 1:1.

That is very different from going out and explaining specifics: particular optimizations and differences (API limitations and hardware problems) between versions. The part where he talks about a lack of RAM due to the OS reserving more than on PS5, says they have discussed with MS how to solve that problem, and points out specific capability deficiencies of the hardware hits exactly the kind of details that NDA contracts always cover.

There is nothing commonplace or previously seen about any of this. In fact, you yourself were the first to be surprised by how extensive the posts were.

He is clearly overstepping the bounds of that NDA and will back off the moment the call from above reaches him.
He has replied; here are the .ini configs he posted.

This is mostly Greek to me, but just eyeballing it, the parameters look almost entirely identical between the two, barring some console-specific .ini lines that are in one but not the other.

For example, the Xbox section has view distance and anti-aliasing .ini lines; of course the PS5 would use those too, but they're not written in the PS5 section, and vice versa for some settings.


+CVars=r.FidelityFX.FSR2.Sharpness=0

Supposedly this is XSX-specific... FSR CAS (the sharpening filter) is disabled on XSX.

The rest of the settings he lists match the PS5 settings he lists. He does not mention the others.
 
Last edited:

adamsapple

Or is it just one of Phil's balls in my throat?
+CVars=r.FidelityFX.FSR2.Sharpness=0

Supposedly this is XSX-specific... FSR CAS (the sharpening filter) is disabled on XSX.

The rest of the settings he lists match the PS5 settings he lists. He does not mention the others.

This kinda reminds me of how the RE4 Remake update cycle went. At launch the PS5 version was softer and Xbox was sharper even though they had the same pixel counts; then Capcom updated the game and it actually made the Xbox version softer as well, but then in a third patch they restored the Xbox version's sharpness and also improved the PS5 version.

Let's see if the Immortals team also makes tweaks like that with their updates. The developer did say the ini settings might not be valid anymore as the game gets patched.
 
Last edited:
No, it's not like that at all. When they have published console settings, it has been in general terms, and only when said settings were 1:1.

That is very different from going out and explaining specifics: particular optimizations and differences (API limitations and hardware problems) between versions. The part where he talks about a lack of RAM due to the OS reserving more than on PS5, says they have discussed with MS how to solve that problem, and points out specific capability deficiencies of the hardware hits exactly the kind of details that NDA contracts always cover.

There is nothing commonplace or previously seen about any of this. In fact, you yourself were the first to be surprised by how extensive the posts were.

He is clearly overstepping the bounds of that NDA and will back off the moment the call from above reaches him.



+CVars=r.FidelityFX.FSR2.Sharpness=0

Supposedly this is XSX-specific... FSR CAS (the sharpening filter) is disabled on XSX.

The rest of the settings he lists match the PS5 settings he lists. He does not mention the others.

Well, if he's genuine, then what he said is even more interesting. Although some won't like it, of course.
 

Darsxx82

Member
This kinda reminds me of how the RE4 Remake update cycle went. At launch the PS5 version was softer and Xbox was sharper even though they had the same pixel counts; then Capcom updated the game and it actually made the Xbox version softer as well, but then in a third patch they restored the Xbox version's sharpness and also improved the PS5 version.

Let's see if the Immortals team also makes tweaks like that with their updates. The developer did say the ini settings might not be valid anymore as the game gets patched.

He says the IQ difference will most likely be resolved in future patches... which is a contradiction when, according to him, the reasons are the smaller amount of available RAM vs PS5 and slightly lower async compute performance, which are supposedly not remediable. If they are remediable, then what he is describing is a lack of optimization.

Also, the curious thing in that .ini is that among the XSX-specific settings is the deactivation of the sharpening filter. If it is active on PS5, that will show up in the IQ differences.
 

Tripolygon

Banned
And? Go google it, async likes more CUs, it is just a fact,
I feel like you're just grasping at straws, so I googled "async likes more cu's" and nothing popped up.

that is why in Control, with the unlocked cam mode, RT is faster on Series X. Nothing to do with this topic and the amount of CUs, that is just how it works.
Or maybe it's simply because AMD's ray tracing units map 1:1 to CUs: XSX has more, slower CUs (52) vs PS5's fewer, faster CUs (36).

Ray-triangle intersection rate:

PS5: 4 × 36 CUs × 2.23 GHz = 321.12 billion RTI/s
XSX: 4 × 52 CUs × 1.825 GHz = 379.6 billion RTI/s (~16% difference in XSX's favor)
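Spelled out, the peak figure is just intersection tests per CU per clock, times CU count, times clock; the factor of 4 is the commonly cited RDNA 2 intersection-test rate per ray accelerator per clock, and the CU counts and clocks are the consoles' published specs:

\[
R_{\text{peak}} = 4 \times N_{\text{CU}} \times f_{\text{GPU}}
\]
\[
\text{PS5: } 4 \times 36 \times 2.23\,\text{GHz} = 321.12\ \text{G/s}, \qquad
\text{XSX: } 4 \times 52 \times 1.825\,\text{GHz} = 379.6\ \text{G/s}
\]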

In Control, which you mentioned: "Xbox Series X has 16% higher performance than PS5."



Has fuck all to do with async compute. Stop grasping at straws. Admit when you are wrong about something.
 
Last edited:

Mr.Phoenix

Member
I think how it's written out just makes it more confusing for people. Gonna try and clean it up a bit.

For the PS5 here are some of the changes we have in the device profile.

; Baseline

+CVars=r.SecondaryScreenPercentage.GameViewport=50
+CVars=r.PostProcessing.QuarterResolutionDownsample=1
+CVars=r.RenderTargetPoolMin=600
+CVars=r.LumenScene.SurfaceCache.CardMaxResolution=256
+CVars=r.Bloom.ScreenPercentage=25
+CVars=r.Lumen.Reflections.DownsampleFactor=3
+CVars=r.FreeSkeletalMeshBuffers=1
+CVars=r.SupportReversedIndexBuffers=0
+CVars=r.SupportDepthOnlyIndexBuffers=0

; Scalability Groups
+CVars=sg.GlobalIlluminationQuality=2
+CVars=sg.AtmosphereQuality=2 ; CVars here used to be part of GI quality
+CVars=r.Lumen.Reflections.TraceMeshSDFs=0 ; Trace the global SDF instead
+CVars=r.LumenScene.FastCameraMode=1 ; use less texel density, but update more frequently
+CVars=r.Bloom.AsyncCompute=1
+CVars=r.LocalExposure=1
+CVars=r.LightShaftQuality=1

; Newly added skylight resolution CVar
+CVars=ASC.SkylightMaxCubemapResolution=512

For the X we have

; Baseline

+CVars=r.SecondaryScreenPercentage.GameViewport=50
+CVars=r.PostProcessing.QuarterResolutionDownsample=1
+CVars=r.LumenScene.SurfaceCache.CardMaxResolution=256
+CVars=r.Bloom.ScreenPercentage=25
+CVars=r.Lumen.Reflections.DownsampleFactor=3
+CVars=r.FreeSkeletalMeshBuffers=1
+CVars=r.SupportReversedIndexBuffers=0
+CVars=r.SupportDepthOnlyIndexBuffers=0
+CVars=r.SSR.HalfResSceneColor=0 ; This can cause a crash if set to 1
+CVars=r.Lumen.Reflections.TraceMeshSDFs=0 ; Trace the global SDF instead
+CVars=r.LumenScene.FastCameraMode=1 ; use less texel density, but update more frequently
+CVars=r.Bloom.AsyncCompute=1
+CVars=r.FidelityFX.FSR2.Sharpness=0 ; To avoid artifacts

; Newly added skylight resolution CVars
+CVars=ASC.SkylightMaxCubemapResolution=512

as well as the X specific overrides

[XSX_Anaconda DeviceProfile]

DeviceType=XSX
BaseProfileName=XSX
+CVars=r.SecondaryScreenPercentage.GameViewport=50 ; We want a 1080p secondary screen viewport
+CVars=sg.ViewDistanceQuality=3
+CVars=sg.AntiAliasingQuality=3
+CVars=sg.PostProcessQuality=3
+CVars=sg.TextureQuality=3
+CVars=sg.EffectsQuality=3
+CVars=sg.FoliageQuality=3
+CVars=sg.ShadowQuality=2
+CVars=sg.ShadowResolutionQuality=2
+CVars=sg.ShadowMeshQuality=2
+CVars=sg.ShadingQuality=3
+CVars=sg.GlobalIlluminationQuality=2
+CVars=sg.AtmosphereQuality=2
+CVars=r.Streaming.MaxEffectiveScreenSize=1080
+CVars=r.Streaming.PoolSize=4096 ; Overkill but necessary to avoid pool overflow (perf hit) and has not been known to cause emergency memory pressure
; Virtual texture pools in XSXEngine.ini are scaled to XSS
; Here we make sure to configure the pool scalability to better fit the XSX
+CVars=r.VT.PoolSizeScale.Group0=4 ; x4 because she's worth it
+CVars=r.VT.PoolSizeScale.Group1=4
+CVars=r.VT.PoolSizeScale.Group2=4 ; There are only 3 groups that we can scale
+CVars=r.LocalExposure=1
+CVars=r.LightShaftQuality=1

A couple of interesting notes in there. Including the reason why CAS is disabled in the Xbox versions.
 
Last edited:

Lysandros

Member
I feel like you're just grasping at straws, so I googled "async likes more cu's" and nothing popped up.


Or maybe it's simply because AMD's ray tracing units map 1:1 to CUs: XSX has more, slower CUs (52) vs PS5's fewer, faster CUs (36).

Ray-triangle intersection rate:

PS5: 4 × 36 CUs × 2.23 GHz = 321.12 billion RTI/s
XSX: 4 × 52 CUs × 1.825 GHz = 379.6 billion RTI/s (~16% difference in XSX's favor)

In Control, which you mentioned: "Xbox Series X has 16% higher performance than PS5."



Has fuck all to do with async compute. Stop grasping at straws. Admit when you are wrong about something.
Besides, there has been absolutely no evidence of XSX commanding such an advantage across third-party games featuring RT since this infamous "perfect benchmark". Quite the contrary, in fact: we have examples like The Callisto Protocol, where XSX features much lower-quality RT reflections. We had a relatively early example in Ghostrunner, with RT reflections, where the performance advantage in favor of PS5 was quite big at the same resolution. The newly introduced mode with RT shadows in Elden Ring shows a PS5 advantage in both resolution and FPS as per VGTech's analysis. The theoretical max RT intersections per second is just one facet of overall RT performance; there are others, like ray bounces (which actually scale with clock speed), GPU cache performance, the CPU, VRAM amount/bandwidth, and of course the APIs.

Edit: Thinking about it again, PS5 being moderately better at async should also result in RT processes eating away fewer GPU resources from general compute and shading, meaning that PS5 could be a little 'freer' to run those RT processes concurrently.
 
Last edited:

Tripolygon

Banned
Besides, there has been absolutely no evidence of XSX commanding such an advantage across third-party games featuring RT since this infamous "perfect benchmark". Quite the contrary, in fact: we have examples like The Callisto Protocol, where XSX features much lower-quality RT reflections. We had a relatively early example in Ghostrunner, with RT reflections, where the performance advantage in favor of PS5 was quite big at the same resolution. The newly introduced mode with RT shadows in Elden Ring shows a PS5 advantage in both resolution and FPS as per VGTech's analysis. The theoretical max RT intersections per second is just one facet of overall RT performance; there are others, like ray bounces (which actually scale with clock speed), GPU cache performance, the CPU, VRAM amount/bandwidth, and of course the APIs.
Oh definitely, it's a game-by-game thing, and certain game engines favor certain hardware. I just wanted to address the conjecture about async compute and Control.

On a different note, I enjoy reading your post. Back to lurking i go now. 🤣🤣🤣
 

Zathalus

Member
+CVars=r.FidelityFX.FSR2.Sharpness=0

Well that rather neatly explains the sharper image on PS5. CAS is not enabled on Xbox.

I imagine it's an oversight. It's hardly the worst issue at the moment; frame generation is completely broken on PC.
 
+CVars=r.FidelityFX.FSR2.Sharpness=0

Well that rather neatly explains the sharper image on PS5. CAS is not enabled on Xbox.

I imagine it's an oversight. It's hardly the worst issue at the moment; frame generation is completely broken on PC.
Yeah you called it early on
 

Mr.Phoenix

Member
+CVars=r.FidelityFX.FSR2.Sharpness=0

Well that rather neatly explains the sharper image on PS5. CAS is not enabled on Xbox.

I imagine it's an oversight. It's hardly the worst issue at the moment; frame generation is completely broken on PC.
No. It's not an oversight. I reposted that list and cleaned it up so it's easier to read.

The dev mentioned why it was disabled.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
No, it's not like that at all. When they have published console settings, it has been in general terms, and only when said settings were 1:1.

That is very different from going out and explaining specifics: particular optimizations and differences (API limitations and hardware problems) between versions. The part where he talks about a lack of RAM due to the OS reserving more than on PS5, says they have discussed with MS how to solve that problem, and points out specific capability deficiencies of the hardware hits exactly the kind of details that NDA contracts always cover.

There is nothing commonplace or previously seen about any of this. In fact, you yourself were the first to be surprised by how extensive the posts were.

He is clearly overstepping the bounds of that NDA and will back off the moment the call from above reaches him.



+CVars=r.FidelityFX.FSR2.Sharpness=0

Supposedly this is XSX-specific... FSR CAS (the sharpening filter) is disabled on XSX.

The rest of the settings he lists match the PS5 settings he lists. He does not mention the others.
Alex has done deep breakdowns of settings he received from developers, comparing them to the PC versions. Just off the top of my head: Remedy's Control, Watch Dogs Legion, and Doom Eternal.
 

adamsapple

Or is it just one of Phil's balls in my throat?
+CVars=r.FidelityFX.FSR2.Sharpness=0

Well that rather neatly explains the sharper image on PS5. CAS is not enabled on Xbox.

I imagine it's an oversight. It's hardly the worst issue at the moment; frame generation is completely broken on PC.

Is CAS a heavy feature that eats performance cycles?

Wondering if that being disabled is the reason for the general performance difference, along with the IQ difference.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
They trashed nothing. They even went from UE4 to UE5.1 in steps, and they use most UE5 features; they liked that they could even use ZBrush models and Nanite. They did not like Hardware Lumen that much, but are looking into implementing it in the future. With all new tech it takes time. Who knows how clean their own code is.
The entire interview is them pointing out how UE5 is/was so inefficient that they had to come up with their own techniques. Don't make me pull quotes.
 

Kataploom

Gold Member
No. It's not an oversight. I reposted that list and cleaned it up so it's easier to read.

The dev mentioned why it was disabled.
Yeah, but then why enable it on PS5 at the cost of performance? The PS5 version clearly performs worse, so why not do the same on Xbox if they were OK with the performance as delivered anyway? Seems like the two versions were made by different people not fully aligned with each other.
 

SomeGit

Member
Is CAS a heavy feature that eats performance cycles?

Wondering if that being disabled is the reason for the general performance difference.

No, it's not heavy, at least on PC, but if they are starved for memory it's one less thing to worry about. But they mention it's to avoid artifacts, so either something is messed up in the CAS implementation or some other change introduces them. I don't think it's performance related.
 
Last edited:

Darsxx82

Member
Alex has done deep breakdowns of settings he received from developers, comparing them to the PC versions. Just off the top of my head: Remedy's Control, Watch Dogs Legion, and Doom Eternal.
Alex/DF has done it from the files of PC versions of some games that include console-specific settings... which is very different from a developer doing it himself while also going fully into details and development specifics of the kind typically protected by NDAs.
 

sinnergy

Member
The entire interview is them pointing out how UE5 is/was so inefficient that they had to come up with their own techniques. Don't make me pull quotes.
Only if you want to believe UE5 is a negative for the industry. That article was not that negative at all: they wanted all the features at 60 frames, so they needed to do some coding. Not that weird; the Matrix demo, for example, runs at around 30 fps.
 
Last edited:

Darsxx82

Member
Is CAS a heavy feature that eats performance cycles?

Wondering if that being disabled is the reason for the general performance difference, along with the IQ difference.
No, it is a setting whose implementation is practically "free".

The reason he gives is that its implementation on XSX produces excessive artifacts. Presumably that is not the case on PS5.

PS: Yes, the use of CAS definitely has a significant effect on the IQ difference. In fact, it was the most reasonable bet once we knew the base resolution was the same.

Are there more reasons behind it? There are the developer's words but, ironically, those are supported neither by the .ini he published nor by the results of the analysis.
 
Last edited:

Lysandros

Member
Yeah, but then why enable it on PS5 at the cost of performance? The PS5 version clearly performs worse, so why not do the same on Xbox if they were OK with the performance as delivered anyway? Seems like the two versions were made by different people not fully aligned with each other.
Maybe they simply overestimated the additional headroom offered by PS5 and overshot their budget to some degree. It's also completely plausible that the PS5 settings would result in an even (slightly) higher performance penalty on XSX. Or there is just something else going on with PS5 on the optimization front.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
The issue is that Nanite and Lumen are incredibly compute heavy and the consoles simply can't keep up.

Using all the latest features and hardware Lumen will force the consoles to run at 720-900p, even at 30fps.

UE5 feels like it was developed for the PS6 and next Xbox in mind. PC is the only platform that can do it justice at the moment.

Are we sure this is true? Or could it be this developer that's having an issue with UE5? Why are we acting as if dev talent doesn't matter anymore?
 

SomeGit

Member
Only if you want to believe UE5 is a negative for the industry. That article was not that negative at all: they wanted all the features at 60 frames, so they needed to do some coding. Not that weird; the Matrix demo, for example, runs at around 30 fps.

Devs seem to like it, but it seems like it just shoots for the moon at the moment. I think it will only become really good in the next gen.
 
Last edited:

Mr.Phoenix

Member
Yeah, but then why enable it on PS5 at the cost of performance? The PS5 version clearly performs worse, so why not do the same on Xbox if they were OK with the performance as delivered anyway? Seems like the two versions were made by different people not fully aligned with each other.
First off, this narrative of the PS5 version clearly performing worse is an exaggeration. Obviously they felt the PS5's performance levels were acceptable. It has lower lows, but that was evidently something they deemed acceptable given that it consistently looks better in motion.

Secondly, they didn't say CAS affected performance; they said it was disabled because on the XS consoles it introduced artifacts.

I am literally watching a new false narrative being formed...
 
Last edited:

mrcroket

Member
Are we sure this is true? Or could it be this developer that's having an issue with UE5? Why are we acting as if dev talent doesn't matter anymore?
People buy the more convenient narrative; the source (or its quality) doesn't matter.

The only truth is that some games run better on PS5 and others on Series X, or just run differently (e.g. higher resolution vs better framerate).
 

Zathalus

Member
Are we sure this is true? Or could it be this developer that's having an issue with UE5? Why are we acting as if dev talent doesn't matter anymore?
Remnant 2 doesn't even have Lumen enabled and still drops to 720p with frame drops. Fortnite drops below 900p, and that is first party and still using software Lumen. It's also not even pushing the hardware that hard.

It's not a question of dev talent, but just that the major features of UE5 hammer the GPU.

No. It's not an oversight. I reposted that list and cleaned it up so it's easier to read.

The dev mentioned why it was disabled.
Can't imagine what artifacts it could cause that would force it to be disabled. CAS looks fine on the PC build.
 
Last edited:

Kataploom

Gold Member
Are we sure this is true? Or could it be this developer that's having an issue with UE5? Why are we acting as if dev talent doesn't matter anymore?
Well, for one, it's the norm among devs using UE5... If those devs didn't have any problem working with UE4 or other engines last gen but suddenly can't make their games look and perform decently in UE5, then in my opinion:

1. The features (or other underlying engine systems) are too heavy for current consoles, or...

2. Such features are hard to optimize for these consoles, so your average dev team won't be able to make them work properly more often than not, or won't for a long time (probably the remainder of this gen, anyway).

If the dev on Reddit is not fake, this team did a lot of R&D, so the problem probably doesn't lie in a lack of developer expertise.
 
Last edited:

JimboJones

Member
This comparison between DLSS and FSR2 shows a somewhat similar difference to the one between the Xbox and PS5 images, as CAS is enabled on the FSR side.
 
Harsh statements; no wonder the thread has so many pages xDD

- Implementations for async compute are obviously platform dependent (i.e. we are calling console OS libraries), and that's generally been faster on PS5.

- We aren't using the same tunings on the PS5 and Xbox Series X - despite being close in hardware specs, the PS5 is a little better, so we pushed fidelity a little more. That is most of the performance delta.
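Reading the dev's point against the profiles posted earlier in the thread, the "different tunings" are visible directly in the .ini lines that appear in one console's profile but not the other's; for example:

; In the posted PS5 baseline but not the XSX one
+CVars=r.RenderTargetPoolMin=600
+CVars=sg.GlobalIlluminationQuality=2 ; set in the XSX device-profile override instead

; In the posted XSX baseline but not the PS5 one
+CVars=r.SSR.HalfResSceneColor=0 ; This can cause a crash if set to 1
+CVars=r.FidelityFX.FSR2.Sharpness=0 ; To avoid artifacts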
 

SlimySnake

Flashless at the Golden Globes
I feel like you're just grasping at straws, so I googled "async likes more cu's" and nothing popped up.


Or maybe it's simply because AMD's ray tracing units map 1:1 to CUs: XSX has more, slower CUs (52) vs PS5's fewer, faster CUs (36).

Ray-triangle intersection rate:

PS5: 4 × 36 CUs × 2.23 GHz = 321.12 billion RTI/s
XSX: 4 × 52 CUs × 1.825 GHz = 379.6 billion RTI/s (~16% difference in XSX's favor)

In Control, which you mentioned: "Xbox Series X has 16% higher performance than PS5."



Has fuck all to do with async compute. Stop grasping at straws. Admit when you are wrong about something.
The funny thing about Control was that it was only faster in photo mode with nothing going on. As soon as they went into gameplay, the Xbox was regularly dropping below 30 fps.

Basically the same thing we have seen all gen: XSX has that 18% TFLOPS advantage whenever you are standing still or walking around, but as soon as you start actually playing the game and shooting, the framerate dips, sometimes below PS5 performance.
 

Zathalus

Member
The funny thing about Control was that it was only faster in photo mode with nothing going on. As soon as they went into gameplay, the Xbox was regularly dropping below 30 fps.

Basically the same thing we have seen all gen: XSX has that 18% TFLOPS advantage whenever you are standing still or walking around, but as soon as you start actually playing the game and shooting, the framerate dips, sometimes below PS5 performance.
Control's FPS drops were all fixed in a patch, though.
 

Mr.Phoenix

Member
Remnant 2 doesn't even have Lumen enabled and still drops to 720p with frame drops. Fortnite drops below 900p, and that is first party and still using software Lumen. It's also not even pushing the hardware that hard.

It's not a question of dev talent, but just that the major features of UE5 hammer the GPU.


Can't imagine what artifacts it could cause that would force it to be disabled. CAS looks fine on the PC build.
Neither can I, but whatever it was, it must have been serious enough to warrant it.

I just don't want this new false narrative forming that stuff like CAS is why the PS5 version performs worse in some outlier cases, or that CAS is responsible for why the PS5 looks better in general.

It's cool being a keyboard dev and all, but we shouldn't just start making shit up to suit narratives in light of actual documentation. We just don't know enough, even from what has been disclosed.
 

Riky

$MSFT
Control's FPS drops were all fixed in a patch, though.
It wasn't even a game patch; a firmware update fixed it all, according to DF.
Also, when talking about bandwidth comparisons, this guy forgot to mention the 96 MB of Infinity Cache.
 

Zathalus

Member
According to the new DF video, FSR is sharper than even DLSS in this game, indicating the sharpening filter is indeed working (sharpening for DLSS appears to be disabled). DLSS is superior in everything else, though.
 

JimboJones

Member
Can't imagine what artifacts it could cause that would force it to be disabled. CAS looks fine on the PC build.
I wonder if decisions about settings were made by different teams of people? Some people like sharpening filters and feel they're worth the perceived recovery in detail despite the sharpening artifacts; other people hate them.
 

winjer

Gold Member
CAS has no performance penalty on modern GPUs. I'm sure that even on the PS5 and Series S/X it won't cause a drop of even one frame per second.
But CAS as an effect looks stronger at lower resolutions, so at higher resolutions it's usually necessary to increase its strength value.
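As a minimal sketch of that kind of tuning, in the same CVar style as the configs posted above (the profile names and values here are purely illustrative assumptions, not from the game; 0 disables the sharpening pass entirely, as in the XSX profile):

[SomePlatform_1080pOutput DeviceProfile]
+CVars=r.FidelityFX.FSR2.Sharpness=0.2 ; sharpening already reads strongly at 1080p

[SomePlatform_4KOutput DeviceProfile]
+CVars=r.FidelityFX.FSR2.Sharpness=0.5 ; stronger value to compensate at 4K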
 

Lysandros

Member
Basically the same thing we have seen all gen: XSX has that 18% TFLOPS advantage whenever you are standing still or walking around, but as soon as you start actually playing the game and shooting, the framerate dips, sometimes below PS5 performance.
Not sure I understand; what do you mean by this?
 

Darsxx82

Member
The funny thing about Control was that it was only faster in photo mode with nothing going on. As soon as they went into gameplay, the Xbox was regularly dropping below 30 fps.

Basically the same thing we have seen all gen: XSX has that 18% TFLOPS advantage whenever you are standing still or walking around, but as soon as you start actually playing the game and shooting, the framerate dips, sometimes below PS5 performance.
No, I think you don't know the real situation.

From the beginning, XSX always performed better in dense moments. The problem was traversal moments and streaming zones. This made many prefer PS5's situation vs XSX's.

Then it became known that in photo mode the framerate was unlocked, revealing the real differences. The "nothing going on" claim isn't accurate, because you can pause at any time, frame the most demanding scene possible, and the framerate is still affected.

Finally, a patch or an update to the XSX API resolved the repeated drops in traversal and streaming, and XSX now clearly offers the better experience in terms of performance.
 

SlimySnake

Flashless at the Golden Globes
Not sure to understand, what do you mean by this?
This was something they saw in Metro. It had a 20% advantage in pixel counts when you were out and about just exploring. But the moment action sequences started, DRS would kick in and bring the resolution down to PS5 levels.

For some reason, the XSX hardware struggles to keep up when dynamic elements are added to the screen. We mostly saw this in the 120 fps modes of several games, where the PS5 surprisingly kept pace with the XSX and we didn't see an 18% advantage in line with the TFLOPS difference. When we did, it was wildly inconsistent.
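For context on the mechanism being described: UE's dynamic resolution scaler lowers the screen percentage whenever GPU frame time exceeds a budget, which is why resolution only drops once heavy action starts. A minimal sketch using the engine's stock dynamic-resolution CVars (the values are illustrative assumptions, not pulled from Metro or any of these games):

+CVars=r.DynamicRes.OperationMode=2 ; force dynamic resolution on
+CVars=r.DynamicRes.FrameTimeBudget=16.6 ; GPU time budget in ms (60 fps target)
+CVars=r.DynamicRes.MinScreenPercentage=60 ; floor the scaler may drop to under load
+CVars=r.DynamicRes.MaxScreenPercentage=100 ; ceiling when the scene is light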
 

Gaiff

SBI’s Resident Gaslighter
Lol, so this game runs terribly on PC too, and performance on equivalent hardware seems roughly on par with consoles, as the 2080 offers performance similar to the PS5. The problem is that settings barely make a difference aside from Global Illumination, Shadow Rendering Pool Size, and Shadow Resolution Quality.

The 3600 seems to have major traversal stutters not seen on consoles. Also, according to benchmarks, AMD GPUs perform noticeably better in this game than their NVIDIA counterparts: the 6700 XT, for instance, beats the 2080 Ti/3070, and the 6800 XT beats the 3080 by 15-20%.
 

SomeGit

Member
Lol, so this game runs terribly on PC too, and performance on equivalent hardware seems roughly on par with consoles, as the 2080 offers performance similar to the PS5. The problem is that settings barely make a difference aside from Global Illumination, Shadow Rendering Pool Size, and Shadow Resolution Quality.

The 3600 seems to have major traversal stutters not seen on consoles. Also, according to benchmarks, AMD GPUs perform noticeably better in this game than their NVIDIA counterparts: the 6700 XT, for instance, beats the 2080 Ti/3070, and the 6800 XT beats the 3080 by 15-20%.

I might be wrong, but I think that's consistent with other prior UE5 games like Fortnite, where AMD was also stronger (without RT acceleration).

Edit: Yep, I'm wrong; the comparisons I remembered were 6800 XT vs 3070, so it's abnormally good on AMD then.
 
Last edited:

Lysandros

Member
This was something they saw in Metro. It had a 20% advantage in pixel counts when you were out and about just exploring. But the moment action sequences started, DRS would kick in and bring the resolution down to PS5 levels.

For some reason, the XSX hardware struggles to keep up when dynamic elements are added to the screen. We mostly saw this in the 120 fps modes of several games, where the PS5 surprisingly kept pace with the XSX and we didn't see an 18% advantage in line with the TFLOPS difference. When we did, it was wildly inconsistent.
I am not agreeing that this is "a general rule of the generation". It can be highly specific behaviour tied to a particular game or situation, going both ways. How do you then explain cases like real-time cutscenes performing better on PS5 in some games, without any input from the player? Shouldn't those be "purely GPU bound"? By the way, you seem to be persisting in your wrong assumption that an 18% theoretical TF advantage equals an 18% performance advantage; that is far from being the case.
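For reference, the 18% figure is just paper-spec FP32 throughput from the consoles' published CU counts and clocks (128 FLOPs per CU per clock being standard RDNA 2 FMA throughput), i.e. an upper bound on compute, not a prediction of delivered frame rates:

\[
\text{XSX: } 52 \times 128 \times 1.825\,\text{GHz} \approx 12.15\ \text{TFLOPS}, \qquad
\text{PS5: } 36 \times 128 \times 2.23\,\text{GHz} \approx 10.28\ \text{TFLOPS}
\]
\[
\frac{12.15}{10.28} \approx 1.18
\]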
 
Last edited: