
[VGTECH] Crash 4 PS5 / XSX / XSS

Heisenberg007

Gold Journalism
Not really, for different reasons:

1) For the tenth time, Sony's first-party studios prefer their own engines and mostly don't want to use UE5. It only served Sony to promote the PS5 thanks to the demo (which, as confirmed by the engineer, could run on a laptop with a 2080 and a Samsung 870 SSD). Also, Sony bought shares of Epic for $250m during that period.

2) There is no question that, with the same number of CUs, the higher the frequency the higher the performance. But everyone knows that the benefits of parallelization are greater and scale better than GHz. Not for nothing do the biggest GPU makers do little else but add more CUs; the same could be said of CPU cores. I don't want to sit here reiterating the theories that circulated at the consoles' launch about why the PS5 has so few CUs, theories that go against every recent evolution in the GPU world. In a couple of years we'll know whether it was Cerny magic or some poor cheap sauce.

3) That game was developed to take advantage of the PS5. You should post other benchmarks to be taken seriously; look around a bit and you will find that it doesn't go quite the way you say.
Everyone knows how Sony invests. They shared their PS5 tech for building the UE5 engine. In exchange, Epic sold them a certain % of ownership. This is how Sony has always operated.

I have responses to your other points, but first I'd wait for a credible link from you proving that an Epic engineer said a laptop could run that demo.
 

geordiemp

Member
Not really, for different reasons:

1) For the tenth time, Sony's first-party studios prefer their own engines and mostly don't want to use UE5. It only served Sony to promote the PS5 thanks to the demo (which, as confirmed by the engineer, could run on a laptop with a 2080 and a Samsung 870 SSD). Also, Sony bought shares of Epic for $250m during that period.

2) There is no question that, with the same number of CUs, the higher the frequency the higher the performance. But everyone knows that the benefits of parallelization are greater and scale better than GHz. Not for nothing do the biggest GPU makers do little else but add more CUs; the same could be said of CPU cores. I don't want to sit here reiterating the theories that circulated at the consoles' launch about why the PS5 has so few CUs, theories that go against every recent evolution in the GPU world. In a couple of years we'll know whether it was Cerny magic or some poor cheap sauce.

3) That game was developed to take advantage of the PS5. You should post other benchmarks to be taken seriously; look around a bit and you will find that it doesn't go quite the way you say.

Going from a 6700 to a 6800 is more parallel: it adds whole SE blocks and shader arrays, and the CUs stay in groups of 10 per array.

The XSX goes up to 14 CUs per array, but adds no other blocks such as parameter cache, LDS, L1, etc.

The question is how parallel the XSX really is in GPU gaming workloads, since they just added CUs and nothing else. A bit like the Vega compute designs, which also added CUs and nothing else. That worked out well.
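To make that comparison concrete, here is a rough sketch of the layouts as they are described in this thread (shader engine and per-array CU counts are the figures posters cite here, so treat them as approximate rather than official):

```python
# Shader-array layouts as described in this thread (approximate, not official).
# Format: (shader_engines, shader_arrays, active_CUs)
gpus = {
    "RX 6700 XT": (2, 4, 40),  # 10 CUs per shader array
    "RX 6800":    (3, 6, 60),  # 10 CUs per shader array (a 3 SE design, per this thread)
    "PS5":        (2, 4, 36),  # 9 active CUs per shader array
    "XSX":        (2, 4, 52),  # arrays of 12/14 active CUs, per a later post here
}

for name, (se, sa, cus) in gpus.items():
    print(f"{name:10s}: {se} SE, {sa} shader arrays, {cus} CUs "
          f"-> {cus / sa:.1f} CUs per array on average")
```

The argument being made above is that the wider each individual array gets without extra front-end blocks, the harder it is to keep all those CUs fed.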
 
Last edited:

MonarchJT

Banned
The only thing embarrassing is the constant "it's nothing" when the PS5 beats the tower of power at anything. It's certainly not "nothing"; it's pretty significant that the PS5 is not only making up a 2 TF / 16 CU deficit but actually performing better.

This particular game has stuttering, though, which doesn't show up as a framerate drop and is far more annoying than dropped frames.

If everything else were identical, which it isn't by the way, that would be enough to call the win for PS5.
There's another embarrassing thread around where at least some can brag about 70% more resolution instead of 0.25% more frames.
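For reference, the "2 TF / 16 CU deficit" in the quoted post works out roughly as follows, using the publicly quoted console specs (peak FP32 only, which by itself says nothing about real-game performance):

```python
def fp32_tflops(cus: int, clock_ghz: float) -> float:
    # peak FP32 = CUs x 64 shaders per CU x 2 ops per clock (FMA) x clock
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = fp32_tflops(36, 2.23)   # ~10.3 TF at the quoted max boost clock
xsx = fp32_tflops(52, 1.825)  # ~12.1 TF at the fixed clock
print(f"PS5 {ps5:.2f} TF vs XSX {xsx:.2f} TF "
      f"-> gap of {xsx - ps5:.2f} TF and {52 - 36} CUs")
```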
 

Heisenberg007

Gold Journalism
Sony will never tell Naughty Dog (or Guerrilla with the Decima engine, etc.) to give up the engines they have spent years improving, e.g. for animation management, just to move to UE. That could only happen if Sony is preparing to release everything on PC.
I'm not sure why first-party engines should even be a part of this conversation. Those first-party games won't be releasing on Xbox anyway, so the comparison point is moot.

UE5 is in the discussion because (1) you brought it up, and (2) it will be the most popular game engine this generation for third-party devs and multi-platform publishers.
 
I actually just looked at a load of 6700 XT reviews and the card gets trounced by the 6800 in nearly all games at 1440p and above, so I think Godfall at that resolution must just be a standout. I was trying to locate the benchmark on the website the image was shared from, but I can't seem to find it; if anyone could share it, that would be great. I'm really interested in how the engines scale per GPU at different resolutions. The 6700 XT is deffo (as advertised by AMD) a 1440p card, and the 6800 does a great job of outperforming it at higher resolutions and in ray tracing.
Here's a different benchmark, with Raytracing (from PCGH).

[image: PCGH ray tracing benchmark chart]

Clear win for the 6800.
 

jroc74

Phone reception is more important to me than human rights
I'm saying that Epic was very careful not to mention the Xbox name during and after the presentation for a specific reason: probably an economic one, and absolutely not a technical one, as some would have you believe.
To believe that they designed UE5, the most famous cross-platform graphics engine, around the specifications of a console whose major first-party studios have never used that engine is naive at the very least. UE is mainly used on PC, and in the console world we know who uses it the most: Microsoft. If Naughty Dog, Guerrilla and SSM abandon their rendering engines in favor of UE5, thus losing control over the engine, that's another matter, and it would only mean one thing: that Sony is preparing to release its games on PC.

Everyone will get better NVMe SSDs next cycle, undoubtedly Sony too.

Seems you missed the parts where I said:

Epic said UE5 will be scalable all the way to smartphones. Pretty sure the XSX falls somewhere in there between smartphones and consoles;
In-house teams will be redesigning their engines to take advantage of the NVMe drives.

For all the tools talk... to think first-party devs won't redesign their engines to take advantage of the NVMe drives and I/O systems is also a wild position to take.


Because apparently Avengers is using the same PS4 Pro resolution settings in the PS5 version. Crash isn't a "lazy port" either but (according to you) it's using the same shadow settings as the Xbox One X version -- which makes it a "lazy port"?

I wonder where you drew the line. So it's a lazy port if the XSX version uses the old-gen settings. But it's not a lazy port if the PS5 version uses the old-gen settings? That's very fair.


I'm completely okay believing this theory that the devs half-assed it. My point is that then we should be consistent.

I'm okay believing that the devs didn't change the higher shadow settings in Crash even though the XSX could easily render them. But then the same principle should apply to Avengers as well with the resolution settings (both are next-gen ports, not running in BC).

My disagreement is that he is saying the Crash devs used old-gen shadow settings on XSX because they ran out of time, while the Avengers devs used old-gen resolution settings on PS5 because that's all the console could manage, not because they ran out of time.

Why the inconsistency?
Exactly. This is what I wanna know.
Poor upgrade on PS5 = bad port

Poor upgrade on XSX = bad console

Pretty sure this can be interchanged all kinds of ways to cancel each other out.
 
Last edited:

Topher

Gold Member
Still, it's seriously embarrassing.

PS5 29250 (99.09%), XSX 28946 (98.57%)

A 0.52% advantage, and most of this nothingness is during the cutscenes, not the gameplay. lol

A "nothing" difference between XSX and PS5 is a significant win for PS5. I mean.....XSX was supposed to wipe the floor with PS5. That's all that was said leading up to launch. Even in this thread you are maintaining XSX should have "double digit" advantage.....and yet it doesn't?

Tell me again, how is that "nothing"?
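As a quick sanity check on the numbers being argued over, the quoted frame counts can be compared two ways (a sketch; it assumes both captures cover the same content, as these comparisons normally do):

```python
ps5_frames, xsx_frames = 29250, 28946
ps5_pct, xsx_pct = 99.09, 98.57  # percentages as quoted in the post above

# Difference between the two quoted percentages (percentage points):
print(f"{ps5_pct - xsx_pct:.2f} percentage points")

# Relative difference in rendered frames, which is the more direct comparison:
print(f"{(ps5_frames / xsx_frames - 1) * 100:.2f}% more frames on PS5")
```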

 
Exactly. This is what I wanna know.
Maybe my English is bad, but I already explained it. It's not an inconsistency at all. Resolution in Avengers stayed the same because PS5 would choke on native resolution. It drops to half of 1440p at times already. And most other settings are different, which is proof that this wasn't a lazy port at all. Meanwhile in Crash, nothing but the resolution changed on the XSX version. They couldn't even be bothered to change the One S shadow setting. The definition of a lazy port.
 

DenchDeckard

Moderated wildly
I think all that really matters is these consoles are close. Both are powerful, games will swing one way or another and from looking at these threads it will be fun for us all to observe the madness that ensues! :D
 

Topher

Gold Member
Maybe my English is bad, but I already explained it. It's not an inconsistency at all. Resolution in Avengers stayed the same because PS5 would choke on native resolution. It drops to half of 1440p at times already. And most other settings are different, which is proof that this wasn't a lazy port at all. Meanwhile in Crash, nothing but the resolution changed on the XSX version. They couldn't even be bothered to change the One S shadow setting. The definition of a lazy port.

What do you mean "it drops to half of 1440p at times already"? 720p?
 

Heisenberg007

Gold Journalism
Maybe my English is bad, but I already explained it. It's not an inconsistency at all. Resolution in Avengers stayed the same because PS5 would choke on native resolution. It drops to half of 1440p at times already. And most other settings are different, which is proof that this wasn't a lazy port at all. Meanwhile in Crash, nothing but the resolution changed on the XSX version. They couldn't even be bothered to change the One S shadow setting. The definition of a lazy port.
"Shadows stayed the same because XSX would choke on higher quality shadows. It already drops more frames than PS5 with lower shadow settings." (your logic, not mine).
 
"Shadows stayed the same because XSX would choke on higher quality shadows. It already drops more frames than PS5 with lower shadow settings." (your logic, not mine).
The difference is that we have concrete data and evidence about PS5 not being able to sustain half of 2160p. You'd have a point if Avengers was 100% capped at CBR 4K60.
 

Mr Moose

Member
The difference is that we have concrete data and evidence about PS5 not being able to sustain half of 2160p. You'd have a point if Avengers was 100% capped at CBR 4K60.
What about the quality mode? How do they compare between Series X and PS5? Both 4k DRS, right?
 

Md Ray

Member
Very good post. 👍 Isn't the RX 6800 a 3-shader-engine / 10-CUs-per-shader-array design? In that case I would expect XSX to scale even worse with its 2 SE / 12-14 CUs per SA setup compared to PS5.
Exactly. 6800 is a 3 SE design.

It's becoming more and more clear that one of RDNA 2's biggest strengths is its high clock speed, which gets you very high performance (sometimes on par with a GPU that's wider but slower). Cerny saw the potential (and benefits) in the high clocks RDNA 2 brings to the table, so the team(s) at Sony seem to have stopped at nothing to bring this to their console. I get the impression that Cerny & co. internally felt that missing out on high GPU clocks and settling for a lower frequency would be a big missed opportunity.

It becomes even more apparent when you look at the giant cooling solution they went with at the expense of making the console bigger (even bigger than the fat PS3), while on top of that going as far as taking the risk of using liquid metal in a mass-produced consumer electronics device, haha.
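A hypothetical illustration of the trade-off being described (the 44 CU part is made up purely for comparison; only the PS5 figures are the publicly quoted ones):

```python
# Two configurations with essentially the same peak FP32 compute: anything that
# scales per clock rather than per CU (command processor, rasterizers, geometry
# front end, cache clocks) runs faster on the higher-clocked part.
def fp32_tflops(cus, ghz):
    return cus * 64 * 2 * ghz / 1000

narrow_fast = (36, 2.23)   # PS5-like: fewer CUs, higher clock
wide_slow   = (44, 1.825)  # hypothetical wider part at a lower clock

print(f"narrow: {fp32_tflops(*narrow_fast):.2f} TF, "
      f"wide: {fp32_tflops(*wide_slow):.2f} TF")
print(f"per-clock (front-end) advantage of the narrow part: "
      f"{(narrow_fast[1] / wide_slow[1] - 1) * 100:.0f}%")
```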
 

Jaysen

Banned
A "nothing" difference between XSX and PS5 is a significant win for PS5. I mean.....XSX was supposed to wipe the floor with PS5. That's all that was said leading up to launch. Even in this thread you are maintaining XSX should have "double digit" advantage.....and yet it doesn't?

Tell me again, how is that "nothing"?

And prior to release everyone said the PS5 SSD would wipe the floor with the XSX SSD, and instead I'm seeing XSX games loading faster. It's a dev by dev and engine by engine scenario. People using these differences to justify one console over the other are just fanboy idiots. The proper thing to do is not reward developers/publishers who can't be bothered to get both versions of their games running smoothly. Eventually they'll learn or go out of business.
 

Topher

Gold Member
The difference is that we have concrete data and evidence about PS5 not being able to sustain half of 2160p.


What evidence?

"Essentially, the more mayhem you unleash, the lower typical resolution is likely to be - expect values in the 1800p range in the heat of the action, dropping to 1440p when all hell truly breaks loose."

Marvel's Avengers on PS5: every upgrade tested • Eurogamer.net

Predominantly 1800p throughout. 1440p is the lower bound in "very rare" cases.
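For scale, the raw pixel counts behind those resolution figures (a quick calculation; 1800p is taken as 3200x1800, the usual interpretation):

```python
resolutions = {"2160p": (3840, 2160), "1800p": (3200, 1800), "1440p": (2560, 1440)}
native = 3840 * 2160

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / native:.0%} of native 4K)")
```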

And prior to release everyone said the PS5 SSD would wipe the floor with the XSX SSD, and instead I'm seeing XSX games loading faster. It's a dev by dev and engine by engine scenario. People using these differences to justify one console over the other are just fanboy idiots. The proper thing to do is not reward developers/publishers who can't be bothered to get both versions of their games running smoothly. Eventually they'll learn or go out of business.

And so perhaps the differences between these two consoles have been entirely overstated in several aspects. I have no problem acknowledging that at all.
 
Last edited:
Man, I wish I had a PS5 so I could play a version of Crash 4 that wasn't super blurry lol. Guess I could finally pick up the Crash trilogy for PS4 while I wait. But man, this game has some really blurry TAA.
 

Md Ray

Member
This is really interesting! Thanks for posting. Great piece of evidence of how certain games will perform on wider vs narrower architectures.

Are there benchmarks at 4K? It would be interesting to see how the 6700 XT performs vs the 6800 at higher resolutions. I believe that's where we may see some benefits from wider-and-slower over faster-and-narrower?
Thanks. The data is from this review: AMD Radeon RX 6700 XT Review | TechSpot

Unfortunately, they haven't made 4K benchmarks available; apparently those are only available to Floatplane and Patreon members.
 
What evidence?

"Essentially, the more mayhem you unleash, the lower typical resolution is likely to be - expect values in the 1800p range in the heat of the action, dropping to 1440p when all hell truly breaks loose."

Marvel's Avengers on PS5: every upgrade tested • Eurogamer.net

Predominantly 1800p throughout. 1440p is the lower bound in "very rare" cases.
That's the evidence. They kept the resolution because PS5 can't max it out. Changing it to native resolution would be idiotic.
 

Jaysen

Banned
What evidence?

"Essentially, the more mayhem you unleash, the lower typical resolution is likely to be - expect values in the 1800p range in the heat of the action, dropping to 1440p when all hell truly breaks loose."

Marvel's Avengers on PS5: every upgrade tested • Eurogamer.net

Predominantly 1800p throughout. 1440p is the lower bound in "very rare" cases.



And so perhaps the differences between these two consoles have been entirely overstated in several aspects. I have no problem acknowledging that at all.
They've obviously been overstated. I own both, and besides the obvious UI/OS and features differences, they play my games almost identically. Destiny 2 and GTA Online are identical on both. One major difference for GTA Online fans is on XSX you can force a single player lobby and make all the money you want without other players griefing you. The one feature difference that keeps me playing on my XSX most of the time is Quick Resume. It literally takes me 3 seconds to get right back to where I was in my Dead Cells run. I hope Sony copies this feature.
 

Elog

Member
Can I get an official statement about any form of VRS (tier 1 or 2 or custom) in PS5?
From the number of things hidden inside the GE at this point, I wouldn't be shocked if it also made coffee.
Both of these features were mentioned in the Road to PS5 talk; people were just not paying attention. As for manuals etc. for the PS5 dev kit, there are no public sources AFAIK.

As to your coffee comment: these two features, together with the culling done through the GE, have been discussed ad nauseam on these boards since last spring, but the Xbox crowd just refuses to listen and keeps coming back to these marketing terms instead of functionalities. The PS5 can do shader prioritization and culling through the GE rather than under the umbrella of the two industry standards, mesh shaders and VRS. It is really as simple as that.

I am not sure why we discuss these minor differences between the two consoles when the key question is one that no one can answer right now: "How easy is it for developers to perform culling and shader prioritization/optimization close to the metal in the two development environments?" That is the real question, since the hardware differences here are not major.

The two major hardware differences between these two consoles are hardware-based BC for the Xbox, which is awesome, and the I/O for the PS5, where we have to wait and see what it is capable of (and no, I am not talking about loading times, which are not that different between the two).
 
Last edited:

Mr Moose

Member
Looks great, runs great, it's not a lazy port, it's a free upgrade.
That's the evidence. They kept the resolution because PS5 can't max it out. Changing it to native resolution would be idiotic.
Or they used the Pro/One X versions and upgraded it just like they did with this game.
Let's see how the native res versions compare (quality mode).
 

ethomaz

Banned
What evidence?

"Essentially, the more mayhem you unleash, the lower typical resolution is likely to be - expect values in the 1800p range in the heat of the action, dropping to 1440p when all hell truly breaks loose."

Marvel's Avengers on PS5: every upgrade tested • Eurogamer.net

Predominantly 1800p throughout. 1440p is the lower bound in "very rare" cases.
Just a correction: it is 1440p to 2160p in CBR... because that is what the PS4 Pro used.

2160p in CBR uses a 1920x2160 base res.
1440p in CBR uses a 1280x1440 base res.

So that is the DRS range being used in Avengers for the PS4 Pro and PS5 versions, except that in the PS4 Pro's case the DRS can drop to 1080p in CBR (960x1080 base res).
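For anyone unfamiliar with the numbers, checkerboard base resolutions follow directly from halving the horizontal pixel count of the output resolution; a quick sketch reproducing the figures above:

```python
def cbr_base_res(width, height):
    # Checkerboard rendering shades half the pixels each frame, commonly
    # expressed as a half-width internal render target.
    return width // 2, height

targets = {"2160p": (3840, 2160), "1440p": (2560, 1440), "1080p": (1920, 1080)}
for label, (w, h) in targets.items():
    bw, bh = cbr_base_res(w, h)
    print(f"{label} CBR -> {bw}x{bh} base res")
```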
 
Last edited:
That's where I got the link, yes. So what? There's a lot of familiarity between Riky's posts and yours. Not sure what you're trying to prove.

Backseat moderation is bannable if I remember correctly, so beware.
Bro, this is embarrassing. I know for a fact that you're Krisprolls. Don't worry, I won't tell the mods. Just stop clowning around lmao

Should be directly comparable though since both are using native resolutions with DRS, unless they have different settings.
Yes, but the comparison is not very interesting because both consoles sustain 4K30 almost all of the time.
 

jroc74

Phone reception is more important to me than human rights
And so perhaps the differences between these two consoles have been entirely overstated in several aspects. I have no problem acknowledging that at all.

This doesn't sit well in the fanboy wars; you need to see yourself out. Like, ASAP.

While I agree with this, we are still waiting to see what a Series exclusive can do with 12 TF (and full RDNA 2!!!). We have already seen what a PS5 exclusive can do with the SSD and I/O in several games.

And some ppl need to stop getting hung up on game and save loading. Those aren't the only things that can load.
 
Last edited:

MonarchJT

Banned
Going from a 6700 to a 6800 is more parallel: it adds whole SE blocks and shader arrays, and the CUs stay in groups of 10 per array.

The XSX goes up to 14 CUs per array, but adds no other blocks such as parameter cache, LDS, L1, etc.

The question is how parallel the XSX really is in GPU gaming workloads, since they just added CUs and nothing else. A bit like the Vega compute designs, which also added CUs and nothing else. That worked out well.
wut?
The main difference is that the 6800 has 128 MB of Infinity Cache, but it also has 4 MB of L2 (across 60 CUs) for the entire GPU vs the 5 MB (across 52 CUs) of the XSX...
The XSX has 5 MB of L2 cache for the whole GPU, L1 cache per shader array, and LDS per dual CU, just like every other RDNA 2 GPU.
The shader arrays of the XSX are 2x12 and 2x14.
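Putting the cache figures from this post side by side (these are the numbers as quoted here, not official documentation):

```python
# L2 capacity spread across active CUs, using the figures quoted above.
configs = {
    "RX 6800": {"l2_mb": 4, "cus": 60, "infinity_cache_mb": 128},
    "XSX":     {"l2_mb": 5, "cus": 52, "infinity_cache_mb": 0},
}

for name, c in configs.items():
    per_cu_kb = c["l2_mb"] * 1024 / c["cus"]
    print(f"{name}: {c['l2_mb']} MB L2 / {c['cus']} CUs -> ~{per_cu_kb:.0f} KB per CU, "
          f"plus {c['infinity_cache_mb']} MB Infinity Cache")
```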
 
Last edited:

dcmk7

Banned
Bro, this is embarrassing. I know for a fact that you're Krisprolls. Don't worry, I won't tell the mods. Just stop clowning around lmao


Yes, but the comparison is not very interesting because both consoles sustain 4K30 almost all of the time.
Know for a fact? Receipts.

You assume it's him. Which is totally, totally different.
 