
VGTech: Stranger of Paradise PS5 & Xbox Series X|S Frame Rate Test

Arioco

Member



PS5, Xbox Series X and Xbox Series S all have a Favor Performance Mode and Favor Resolution Mode.

PS5 also has a 1080p mode which can be accessed by changing the PS5 system output resolution to 1080p.

PS5 and Xbox Series X in Performance Mode use a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being 2688x1512.

Xbox Series X in Resolution Mode uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being 2688x1512. The dynamic resolution system doesn't seem to work as expected in Resolution Mode on Xbox Series X as the resolution doesn't seem to be improved compared to Performance Mode.

PS5 in Resolution Mode uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being approximately 3456x1944. Pixel counts below 3840x2160 seem to be very rare in this mode.

During gameplay PS5 and Xbox Series X in Performance Mode and Resolution Mode both seem to use a form of checkerboard rendering to reach their stated resolutions and both appear to use nearest neighbour upscaling to upscale to 3840x2160. The artifacts produced by checkerboard rendering seem to differ in some areas between PS5 and Xbox Series X.

The resolution appears to switch to a native 1920x2160 during cutscenes on Xbox Series X and PS5 in the Performance and Resolution Modes and this doesn't appear to be upscaled using nearest neighbour scaling.

PS5 in the 1080p mode uses a dynamic resolution with the highest resolution found being 1920x1080 and the lowest resolution found being approximately 1728x972. Pixel counts below 1920x1080 seem to be very rare in this mode.

Xbox Series S in both Performance Mode and Resolution Mode uses a dynamic resolution with the highest resolution found being 1920x1080 and the lowest resolution found being 1344x756. The dynamic resolution system doesn't seem to work as expected in Resolution Mode on Xbox Series S as the resolution doesn't seem to be improved compared to Performance Mode.

Cutscenes on Xbox Series S and PS5 in the 1080p mode appear to be locked to 1920x1080.

The Xbox Series consoles are missing Ambient Occlusion which is present on PS5. Anti-aliasing appears to be more effective on Xbox Series X than PS5.

Xbox Series S has graphical reductions compared to the other two consoles, such as reduced texture quality, reduced texture filtering, reduced draw distance and reduced effects quality. Hair and fur are also simplified on Xbox Series S, and as a consequence the Series S doesn't experience the same severe frame rate issues that can occur on PS5 and Series X during some parts of cutscenes.

Because of how demanding the hair and fur are on PS5 and Series X, performance can vary noticeably during cutscenes based on character equipment.

Stats: https://docs.google.com/spreadsheets/u/0/d/1-4drh2qkNSSNZcNfby6I8CrcGddh_f7aPce4apKBF7U/htmlview#
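A quick note on the terminology above: nearest neighbour upscaling simply copies the closest source pixel for every output pixel, with no filtering, which is why sub-native frames keep hard, blocky edges rather than looking soft. Checkerboard reconstruction itself (filling in the half of the pixels that aren't rendered each frame) is far more involved; the sketch below only illustrates that final upscale step, using one of the resolutions VGTech reports. It's a minimal Python/NumPy illustration, not how the game or VGTech actually implement anything, and the function name and zero-filled frame are made up for the example.

```python
import numpy as np

def nearest_neighbour_upscale(frame: np.ndarray, out_w: int, out_h: int) -> np.ndarray:
    """Scale an (h, w, 3) frame up by copying, for every output pixel,
    the nearest source pixel; no filtering, so edges stay hard and blocky."""
    in_h, in_w = frame.shape[:2]
    ys = np.arange(out_h) * in_h // out_h  # nearest source row for each output row
    xs = np.arange(out_w) * in_w // out_w  # nearest source column for each output column
    return frame[ys[:, None], xs[None, :]]

# Example: the lowest Performance Mode resolution VGTech found (2688x1512),
# stretched to the 3840x2160 output.
low_res = np.zeros((1512, 2688, 3), dtype=np.uint8)
output = nearest_neighbour_upscale(low_res, 3840, 2160)
print(output.shape)  # (2160, 3840, 3)
```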
 

Arioco

Member
Bizarre, this difference between Resolution Mode on PS5 vs XSX.


Not to mention the Series consoles are capped at 30 fps (with severe frame pacing issues) in Resolution Mode, whereas PS5 is uncapped and hovers between 40 and 60 fps. Very strange.
 

Arioco

Member
Yes. This time resolution mode seems to be 60 fps on PS5 vs 30 fps on XSX?

That would be strange but that's what their stats say.


Not only the stats, the video shows exactly that. And yes, it's very weird.
 
What a complete mess of a game lmao

[Attached image: hair-fur-1pxj25.png]
 

azertydu91

Hard to Kill
Yeah, this has to be a bug or a patch that screwed up the resolution (edit: meant framerate). Usually when people say it's a lazy port or that the devs barely worked on this or that version it's bullshit, but if it stays that way maybe in this case it will be true.
 
Yeah, this has to be a bug or a patch that screwed up the resolution (edit: meant framerate). Usually when people say it's a lazy port or that the devs barely worked on this or that version it's bullshit, but if it stays that way maybe in this case it will be true.
The game is a complete turd, at least with regard to its tech. It drops both frames and resolution even in the PS5 1080p mode, while looking like an early PS4 game at best. Meanwhile Death Stranding runs at a native 4K60 (mostly).
 

azertydu91

Hard to Kill
The game is a complete turd, at least with regard to its tech. It drops both frames and resolution even in the PS5 1080p mode, while looking like an early PS4 game at best. Meanwhile Death Stranding runs at a native 4K60 (mostly).
This has never bothered the people using it as ammo for the console war, so it is not surprising, yet it hasn't happened as much as other times. I'll just let the warriors fight and talk to the people that simply want to have a reasonable discussion.
 

Lysandros

Member
This game is poorly optimized. The amount of geometry in each scene is ridiculous compared to most games.


Could well be. But maybe it can also serve as a semi-benchmark to give an idea of the geometry performance of the machines, especially in cutscenes.
 

Clear

CliffyB's Cock Holster
Why don't they just use the Nioh 2 engine? It runs pretty well on PS4.

Seems like it is; it's just that the staff working on this one haven't observed the same asset guidelines as they did for Nioh/Nioh 2. Basically it looks like they either decided, or were made, to build the armour assets and such to Mr. Zipper's specifications and ended up with some huge and unoptimized models.

Or maybe for whatever reason it's always using high-LOD versions. People forget how good-looking Nioh's character generator is - it was always way better than FROM's, for instance. The problem is the system uses uniform assets; Nioh is a game where you can basically play as any character in the game. So going over budget on one means you have to go over on everything.

The other odd thing worthy of note is that the engine never scaled particularly well with power. It always ran very well on PS4, but gains on PC were never anywhere near what was expected.
 

SEGAvangelist

Gold Member
Seems like an all-around technical mess. Looking forward to playing this when it's 50% off in a few months. The demo was really fun.
 

BbMajor7th

Member
Why don't they just use the Nioh 2 engine? It runs pretty well on PS4.
It's odd - the Nioh 1 remaster does a near-locked 1080p/120 on PS5. Stranger of Paradise has some of the worst art and technical direction I've seen from a major publisher in recent years.
 
Bizarre, this difference between Resolution Mode on PS5 vs XSX.

I think it makes perfect sense.
This game seems like a big pile of technical garbage. Weren't people showing how laughably NOT optimized it is? With models with zillions of unnecessary polygons? At least on PC. If these same models are being used on the consoles, then it makes sense that the PS5 performs better with its higher culling rate.
 

assurdum

Banned
Are we sure this game doesn't run via BC on PS5? It's not the first time a game which uses BC shows such a huge gap in favour of the XSX.
 

assurdum

Banned
I think it makes perfect sense.
This game seems like a big pile of technical garbage. Weren't people showing how laughably NOT optimized it is? With models with zillions of unnecessary polygons? At least on PC. If these same models are being used on the consoles, then it makes sense that the PS5 performs better with its higher culling rate.
Thought exactly the same. In theory the opposite should happen.
 

MonarchJT

Banned
I think it makes perfect sense.
This game seems like a big pile of technical garbage. Weren't people showing how laughably NOT optimized it is? With models with zillions of unnecessary polygons? At least on PC. If these same models are being used on the consoles, then it makes sense that the PS5 performs better with its higher culling rate.
Depends on how the engine and rendering pipeline work... if all processes are highly parallelized the differences should be evident, favoring the Xbox. Having said that, although the Xbox is objectively more powerful, it is certainly not by +200%.
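For context on the "+200%" point: using the commonly cited specs (36 CUs at up to 2.23 GHz for PS5, 52 CUs at 1.825 GHz for Series X, 64 shader lanes per CU, 2 FMA ops per clock), the theoretical compute gap works out to roughly 18%. A rough back-of-the-envelope sketch, purely illustrative:

```python
def tflops(cus: int, clock_ghz: float) -> float:
    """Theoretical FP32 throughput: CUs * 64 lanes * 2 ops per clock (FMA), in TFLOPS."""
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)   # ~10.3 TFLOPS (PS5's clock is variable, so this is the peak)
xsx = tflops(52, 1.825)  # ~12.1 TFLOPS
print(f"PS5 ~{ps5:.1f} TFLOPS, Series X ~{xsx:.1f} TFLOPS, gap ~{(xsx / ps5 - 1) * 100:.0f}%")
```

Of course that says nothing about how well a given workload keeps those CUs occupied, which is the parallelization argument being made here; it only bounds how big a "fair" gap could plausibly be.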
 
Depends on how the engine and rendering pipeline work... if all processes are highly parallelized the differences should be evident, favoring the Xbox. Having said that, although the Xbox is objectively more powerful, it is certainly not by +200%.

Objectively the PS5 will perform certain tasks better given the hardware differences, which is why we are seeing this back and forth between the two systems. It's not like either manufacturer can guarantee having the best versions 100% of the time. It really depends on how the games are built. Definitely different from the Pro vs X days, where you knew the Pro never had a chance at doing anything better.
 

MonarchJT

Banned
Objectively the PS5 will perform certain tasks better given the hardware differences, which is why we are seeing this back and forth between the two systems. It's not like either manufacturer can guarantee having the best versions 100% of the time. It really depends on how the games are built. Definitely different from the Pro vs X days, where you knew the Pro never had a chance at doing anything better.
No, objectively no. The PS5 GPU would objectively do better if both had the same number of CUs with the PS5 at a higher clock, but this isn't the case. It depends on how optimized the parallelization is; for this reason, for decades manufacturers have snubbed the clock in favor of the CU count.
With this I am not saying that this game performs better because it is highly parallelized. Given how far the results diverge from the actual hardware, the bulk of the difference is probably down to a lack of optimization.
 
No, objectively no. It would objectively do better if they had the same number of CUs at a higher clock, but this isn't the case. It depends on how optimized the parallelization is; for this reason, for decades manufacturers have snubbed the clock in favor of the CU count.
With this I am not saying that this game performs better because it is highly parallelized. Given how far the results diverge from the actual hardware, the bulk of the difference is probably down to a lack of optimization.

That's not 100% guaranteed by either, if you haven't noticed. It's why it's false to believe that every game should be better on one compared to the other. It really depends on the workloads and what the engines require. CUs are not the only thing inside these consoles; other parts of the hardware also contribute to performance.
 

MonarchJT

Banned
That's not 100% guaranteed by either, if you haven't noticed. It's why it's false to believe that every game should be better on one compared to the other. It really depends on the workloads and what the engines require. CUs are not the only thing inside these consoles; other parts of the hardware also contribute to performance.
Most of the time when we do not see better results, it is simply because dividing the work (parallelizing) over all the CUs is sometimes complicated and takes a lot of time (therefore money), so many developers simply decide to take the shortest route. But given correct optimization, leaving aside development times and issues, yes, 99.9% of the time the number of CUs would win over brute-forcing the clock speed (I'm generalizing here to get the concept across... the clock difference relative to the number of CUs should be taken into account). Obviously we see and know that optimization in the world of video games varies from really poor to merely sufficient most of the time, hence these fluctuating results.
 
Most of the time when we do not see better results, it is simply because dividing the work (parallelizing) over all the CUs is sometimes complicated and takes a lot of time (therefore money), so many developers simply decide to take the shortest route. But given correct optimization, leaving aside development times and issues, yes, 99.9% of the time the number of CUs would win over brute-forcing the clock speed (I'm generalizing here to get the concept across... the clock difference relative to the number of CUs should be taken into account). Obviously we see and know that optimization in the world of video games varies from really poor to merely sufficient most of the time, hence these fluctuating results.

I think the issue here is that you're too focused on CUs and you don't see the bigger overall picture. The One X beat the Pro in almost every single aspect of its hardware, which is why a vastly superior version was practically guaranteed on the One X.

The PS5 and the Series X have their own advantages when it comes to the hardware, which is why multiplatform games don't always favor one platform or the other. And in many games one system will have advantages in certain spots while struggling in others. Further adding support to the idea that CUs are not the only things that matter. We have all these comparisons that help support that. If CUs were the only thing that guaranteed superior performance, it would be reflected in every single multiplatform comparison.

I know that optimization and lazy developers plus broken tools can skew the results, but that's not going to be the case with every single game. At this point it's just a fantasy to assume that every time the PS5 has some sort of advantage it's down to those reasons.

Remember, this gen isn't a repeat of Pro vs One X, and not everything depends on CUs, as these comparisons have shown.
 

assurdum

Banned
Most of the time when we do not see better results, it is simply because dividing the work (parallelizing) over all the CUs is sometimes complicated and takes a lot of time (therefore money), so many developers simply decide to take the shortest route. But given correct optimization, leaving aside development times and issues, yes, 99.9% of the time the number of CUs would win over brute-forcing the clock speed (I'm generalizing here to get the concept across... the clock difference relative to the number of CUs should be taken into account). Obviously we see and know that optimization in the world of video games varies from really poor to merely sufficient most of the time, hence these fluctuating results.
I think you continue to ignore that having more CUs is almost useless if you don't feed them with enough data, and the XSX peaks at just 25% more bandwidth than the PS5. Let's not even talk about the array configuration. And let's not forget the PS5 has cache scrubbers and a geometry engine to better feed the GPU with data; by the time developers start to make better use of the extra CUs in the XSX, they could be making better use of the PS5 hardware too, I guess. You won't see bigger differences in multiplat performance than what you have seen until now. Imo.
 

isual

Member
I played it and beat it on PS5 using a PS4 game disc. It works OK, but the pro tip is to raise the brightness to maximum so that it isn't too dark. I definitely think this game was rushed, and probably midway or close to completion it was changed to be something else, ergo the end product we see now.
 

Fafalada

Fafracer forever
dividing the work (parallelizing) over all the CUs is sometimes complicated and takes a lot of time (therefore money)
Can we please not try to reinvent the last 20 years of GPU history in service of creating a narrative that suits someone's world-views? Also because it's not very nice to be putting blame on developers for things that they aren't really responsible for.
GPU's primary job is literally what you described - and a rather large part of GPU silicon budget is dedicated to logic that ensures optimal utilization of their CUs - at least in normal rendering workloads. And it's hard to be more normal than what looks more like a 4k port of a PS3 game, there's nothing exotic going on here.

To be clear I'm not saying there's no manual occupancy tweaks that can improve things in some cases - but that's nothing on the scale of differences people typically discuss (or demand) in here. Also it's more an issue of having many targets to optimize for, than actual costs. That's the reason most of the time only 1st party devs bother, and even that's getting harder thanks to inflation of HW SKUs.
 
Can we please not try to reinvent the last 20 years of GPU history in service of creating a narrative that suits someone's world-views? Also because it's not very nice to be putting blame on developers for things that they aren't really responsible for.
GPU's primary job is literally what you described - and a rather large part of GPU silicon budget is dedicated to logic that ensures optimal utilization of their CUs - at least in normal rendering workloads. And it's hard to be more normal than what looks more like a 4k port of a PS3 game, there's nothing exotic going on here.

To be clear I'm not saying there's no manual occupancy tweaks that can improve things in some cases - but that's nothing on the scale of differences people typically discuss (or demand) in here. Also it's more an issue of having many targets to optimize for, than actual costs. That's the reason most of the time only 1st party devs bother, and even that's getting harder thanks to inflation of HW SKUs.

I know for a fact that developers have access to the additional CUs of the Series X. If they make a game that uses all of the PS5's CUs, they are still going to use the additional CUs that the Series X has. It's not like developers are only using 36 CUs on the Series X. They are using the full range of 52 CUs that the console has.
 