
DF: The Touryst PS5 - The First 8K 60fps Console Game

Just a lot of butthurt right here...

p.s. What happened with the 12 TF GPU and 18 extra CUs for ML and SFS?

Not sure what you're attempting to say here but I get it, you just wanted a driveby moment, right? :messenger_tears_of_joy:

I literally talk about SFS nonstop, just about every chance I get on this forum, about how important it will be for Series X. Go look at the threads on the very subject if you think I've stopped bringing it up. Hint: I never stopped. I'm just more gung-ho about SFS than all other features because I think it's THE thing with the most potential of Microsoft's hardware capabilities: it does the most valuable thing imaginable, saving a crap ton of RAM for games. What do devs request from the hardware makers more than anything? More memory.

More memory efficiency through Sampler Feedback Streaming matters that much more because it is my firm belief that it was the very solution Microsoft and AMD's engineers arrived at to guarantee the 10GB @ 560GB/s & 6GB @ 336GB/s memory setup can more consistently feed the larger GPU (larger GPUs have more memory bandwidth because they need it) at the maximum 560GB/s, ensuring the 52 CU RDNA 2 GPU performs like a 52 CU RDNA 2 GPU, because it then runs into far fewer scenarios where it is starved for bandwidth.
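To make the memory point concrete, here's a toy sketch of the idea (my own illustration, not Microsoft's actual API; the texture size and tile count are made-up examples): sampler feedback records which texture tiles the GPU actually touched, so only those need to stay resident instead of whole mip chains.

```python
# Toy model of sampler-feedback-driven texture streaming (conceptual only,
# not the DirectX 12 API): keep resident only the tiles the GPU sampled.

TILE_BYTES = 64 * 1024  # DX12 tiled resources use 64 KiB tiles

def full_mip_chain_bytes(width, height, bytes_per_pixel=4):
    """Memory needed if the entire mip chain is resident."""
    total = 0
    while width >= 1 and height >= 1:
        total += width * height * bytes_per_pixel
        width //= 2
        height //= 2
    return total

def streamed_bytes(sampled_tile_count):
    """Memory needed if only the tiles feedback flagged are resident."""
    return sampled_tile_count * TILE_BYTES

full = full_mip_chain_bytes(4096, 4096)  # one 4096x4096 texture
lean = streamed_bytes(120)               # say feedback flags ~120 visible tiles
print(f"full mip chain: {full / 2**20:.0f} MiB, streamed: {lean / 2**20:.1f} MiB")
# -> full mip chain: 85 MiB, streamed: 7.5 MiB
```

Multiply a saving like that across every texture in a scene and you can see why I keep banging on about it.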

Some titles doing a better or worse job of managing that 10GB of faster memory is the sole reason you would ever see a scenario where a PS5 title performs objectively better than a similar Series X game. Key word being objectively - some seem to think 0.4% to 2% better fps on PlayStation while running at a 10-15% lower resolution = a PS5 performance win. I can assure you it doesn't. In games that strike this balance well, you see at minimum a resolution advantage with matching or better performance. Some games may just opt for parity and not see the need to adjust memory management to squeeze out greater performance from Series X, because make no mistake, it can always outperform PS5 in matching scenarios with equal attention paid to hardware intricacies. But one thing I'm certain we now all definitely agree on after The Touryst is that better resolution does in fact equal better performance, unless the fps is terrible.

On the ML front, I've acknowledged it is more of a wait-and-see as to what it can actually accomplish, because what we do know for certain is that it's nothing like Nvidia's Tensor cores. It's a less dedicated hardware approach to machine learning. Its applications for games are still to be determined, but Microsoft seems fairly confident it can be used for something, so we will see.

The GPU being 12TF and having all those extra CUs, for a total of 1024 extra ALUs over the PS5, is every bit as big a deal as it always was. Why would you think otherwise? Because of games 1 year in? Series X having 1024 more ALUs than PS5 isn't something that has suddenly disappeared. It will become more of a factor as the gen progresses and developers start targeting specific capabilities in ever more granular ways. More and more techniques, particularly those that branch off and emerge from the more advanced feature set Series X is designed around, such as Mesh Shaders, Sampler Feedback Streaming, Tier 2 VRS, Machine Learning, adaptive ray tracing scenarios, more developer-driven inline ray tracing, etc., will benefit from the type of hardware in Series X. Unreal Engine 5's Lumen lighting? That benefits from MORE ALUs, more compute units. The more advanced geometry processing capabilities introduced by PS5's Geometry Engine with its primitive shaders and Series X's Mesh Shaders benefit from more, not fewer, ALUs or Compute Units. Everywhere games development is headed leans more heavily on increased programmability and compute.

More specifically, the newer types of game engines that will, as their baseline, be designed to directly exploit the greatest strengths of the CPU, memory, SSD/decompression hardware and the GPU's capabilities will paint a very different picture from the still largely transitional phase the new consoles are in. So hold your horses, soldier. We've got people storming the castles over The Touryst (and bizarrely The Medium, the same game many of you poked fun at when it was only on Xbox) believing some grand victory has been won, genuinely talking as if barely a year into the gen you've seen a legitimate snapshot of what either the PS5 or Series X can do. Every game designed and released up to this point was/is still very much designed around old hardware paradigms/baselines. We haven't seen anything yet.

This generation of consoles will represent the largest multi-year capabilities glow-up we have ever experienced on consoles, because the tech has reached that point. On the ray tracing front there's not nearly enough performance grunt just yet to do what the best Nvidia GPUs are doing right now on PC (that capability comes with the next gen of consoles), but even if these machines can still surprise us there, I see everything else they will be capable of doing this gen with all their other hardware capabilities as more significant.

So there's my response. No driveby stuff. A full response. Short version: Not a thing has changed. We are too early into the gen to say "See? See? It's all nothing. Totally useless. Look the other way". Every advantage that Xbox Series X has over PS5 will almost certainly come into play during this gen, because the game engines and techniques that most take advantage of what Series X does best are coming, and then we will be having a very different discussion. :)
 
Watch the video: he talks about how the Xbox Series X version still looks like a native 4K game. Maybe supersampling from 6K isn't that noticeable over 4K to him.

around the 7:30 mark



Higher levels of supersampling look better than lower levels of supersampling. 6K looked totally fine until people saw 8K. All of it is clean; PS5 is just cleaner because 8K is many more pixels than 6K.
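For the curious, the supersampling density works out like this (my own arithmetic, assuming the reported internal resolutions of 7680x4320 on PS5 and 6144x3456 on Series X, both downsampled to a 4K output):

```python
# Samples of internal resolution per output pixel when downsampling to 4K.
output_pixels = 3840 * 2160

for name, w, h in [("PS5 '8K'", 7680, 4320), ("XSX '6K'", 6144, 3456)]:
    internal = w * h
    print(f"{name}: {internal / 1e6:.1f} MP internal, "
          f"{internal / output_pixels:.2f} samples per 4K output pixel")
# -> PS5 '8K': 33.2 MP internal, 4.00 samples per 4K output pixel
# -> XSX '6K': 21.2 MP internal, 2.56 samples per 4K output pixel
```

Both are well past one sample per pixel, which is why both look clean and the difference only shows under close scrutiny.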
 

DeepEnigma

Gold Member
sci-fi matrix GIF
 

onQ123

Member
Higher levels of supersampling look better than lower levels of supersampling. 6K looked totally fine until people saw 8K. All of it is clean; PS5 is just cleaner because 8K is many more pixels than 6K.
Yeah I know but he didn't seem to notice much of a difference with the Xbox Series X 6K SS
 
Yeah I know but he didn't seem to notice much of a difference with the Xbox Series X 6K SS

Probably down to the art style and the improved depth of field effect, which is likely also improved due to the resolution, I think. The game is a beautiful thing, though.
 

onQ123

Member
Who would have guessed a $20 indie based on voxels would be one of the biggest DF threads here.


Probably because of the situation with Xbox Series X being 6K while PS5 is 8K

Not sure what you're attempting to say here but I get it, you just wanted a driveby moment, right? :messenger_tears_of_joy:

[...]

Short version: Not a thing has changed. We are too early into the gen to say "See? See? It's all nothing. Totally useless. Look the other way". Every advantage that Xbox Series X has over PS5 will almost certainly come into play during this gen, because the game engines and techniques that most take advantage of what Series X does best are coming, and then we will be having a very different discussion. :)



I think things will go the other way around, in PS5's favor, as devs need more memory & start streaming more data from the SSD


 

Snake29

RSI Employee of the Year
Not sure what you're attempting to say here but I get it, you just wanted a driveby moment, right? :messenger_tears_of_joy:

[...]

Short version: Not a thing has changed. We are too early into the gen to say "See? See? It's all nothing. Totally useless. Look the other way". Every advantage that Xbox Series X has over PS5 will almost certainly come into play during this gen, because the game engines and techniques that most take advantage of what Series X does best are coming, and then we will be having a very different discussion. :)

You contradict yourself...nice move...

We are too early into the gen to say "See? See? It's all nothing

Ok...but..

Every advantage that Xbox Series X has over PS5 will almost certainly come into play during this gen, because the game engines and techniques that most take advantage of what Series X does best are coming, and then we will be having a very different discussion. :)

What makes this different? You are basically claiming victory already....
 

FrankWza

Member
The topic at hand is the 1st 8K 60fps console game. For now, all we know is that it wasn't 8K on Xbox Series X & the reason given by the devs is that PS5's higher clock rate & memory setup allowed them to release the game at 8K 60fps.
This was a good note to end the conversation on. It was probably posted about 5 times already and the thread could have ended at any time.
Who would have guessed a $20 indie based on voxels would be one of the biggest DF threads here.
It was the first trip to the moon. The first trip is a big deal. Nobody remembers the second trip to the moon.
 

onQ123

Member
It wouldn't be a bad marketing move for Sony to get more Switch ports on PS5 that run at 8K. Hell, they could get some PS3 games & resell them as 8K remasters lol
 

Hoddi

Member
You should have a few drinks then. Lord knows I have. Which is also why I have a mustache.

One of the worst insults of the Viking Age was to call someone beardless. There's a solution to that and I think you may want to use it. You need to drink up, Bender, because you're the weakest.
 

bender

What time is it?
You should have a few drinks then. Lord knows I have. Which is also why I have a mustache.

One of the worst insults of the Viking Age was to call someone beardless. There's a solution to that and I think you may want to use it. You need to drink up, Bender, because you're the weakest.

Like most humans you appear to be clueless.
 

bender

What time is it?
Well, you appear to be beardless.

There will be stories written about you that everyone will promptly forget because you couldn't even grow a teenager's mustache.

Along with a malfunctioning brain, your eyes also appear to be broken.
 

Lognor

Banned
It's the only 8K console game and that makes it the resolution benchmark for now , if you don't understand that I'm done talking to you.
A benchmark is something to use as a baseline. Given that this is a very simple indie game that runs well on the Switch (lol) and 8K is NOT the norm, no, this is not a benchmark. You can claim it as such to try and check some box in your made-up console war. But again, no, it's not a benchmark.
 

onQ123

Member
A benchmark is something to use as a baseline. Given that this is a very simple indie game that runs well on the Switch (lol) and 8K is NOT the norm, no, this is not a benchmark. You can claim it as such to try and check some box in your made-up console war. But again, no, it's not a benchmark.
Right now it's the benchmark for what resolution these consoles can run it at. Xbox Series X: 6K 60fps with lower DoF & shadows; PS5: 8K 60fps with better DoF & shadows.
 

FranXico

Member
On reflection, I'm wondering if they have tapped into the whole geometry engine aspect for more efficient object culling and polygon decimation? The visual style of the game I suspect might be very friendly to some sort of low-level technique to omit hidden faces - saving a lot of draw time in the process
I would think the shader code this game uses doesn't need as many cores as other games', making GPU frequency more relevant for performance. And the fact that the engine actually got properly ported (instead of just bridging a DirectX engine with GNMX like most teams do) means there was little CPU overhead too.
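That narrow-vs-wide point can be put into a toy model (entirely my own sketch, using only the public CU counts and clocks; the "parallel work" numbers are made-up stand-ins for how wide a shader actually scales): if a workload can't fill all 52 CUs, the extra width idles and the higher clock decides the race.

```python
# Toy throughput model: work that can't spread across all CUs leaves the rest idle.
def rate(cus, clock_ghz, parallel_cus_of_work):
    return min(cus, parallel_cus_of_work) * clock_ghz

# PS5: 36 CUs @ up to 2.23 GHz; Series X: 52 CUs @ 1.825 GHz (public specs)
for label, work in [("wide workload (60 CUs' worth)", 60),
                    ("narrow workload (30 CUs' worth)", 30)]:
    ps5 = rate(36, 2.23, work)
    xsx = rate(52, 1.825, work)
    winner = "XSX" if xsx > ps5 else "PS5"
    print(f"{label}: PS5 {ps5:.1f} vs XSX {xsx:.1f} -> {winner} ahead")
# -> wide workload (60 CUs' worth):   PS5 80.3 vs XSX 94.9 -> XSX ahead
# -> narrow workload (30 CUs' worth): PS5 66.9 vs XSX 54.8 -> PS5 ahead
```

Real GPUs are obviously far messier than this, but it shows why a simple voxel renderer might favor clocks over width.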
 

Heisenberg007

Gold Journalism
How's that? Xsx beats ps5 on resolution almost every time. On games that seemingly matter more, or at least sell more. Call off duty and such.
So like 9/10 times? When did that happen?

And Call of Duty? Receipts, please.

Edit: Lognor, would you stop leaving 'triggered' emojis like a 12-year-old kid and instead share the receipts to back up your claim?
 

DenchDeckard

Moderated wildly
This thread has delivered.

Anyone bought the game?

Props to Shin'en for using the extra time to go back and increase the quality of the shadows and depth of field. They used this opportunity to really get down to the nitty-gritty of the PS5's hardware. I hope more devs do that on games with timed exclusivity.
 

ZywyPL

Banned
Resolution matters, until it doesn't. :)

Now it's a benchmark 🤣 If it were a triple-A game with Crysis-level graphics running at 8K/60 it would have been...


Only here on NeoGAF can a 2-year-old Switch port be a new next-gen console benchmark, while just yesterday those same people said they can't see a difference between 1440p and 4K anyway (plus, notice how nobody mentioned the SSD even once). This thread is already twice as long as the infamous Marvel's Avengers one; it's really a sight to behold. And in the same fashion, I wonder how many folks actually gave the game a try and checked it out for themselves?


8K aside, I've been meaning to check this game out. It looks cute, almost like an isometric Nintendo game.

It really does feel like a Nintendo game, sort of a next-gen Mario, I highly recommend it.
 

onQ123

Member
Only here on NeoGAF can a 2-year-old Switch port be a new next-gen console benchmark, while just yesterday those same people said they can't see a difference between 1440p and 4K anyway (plus, notice how nobody mentioned the SSD even once). This thread is already twice as long as the infamous Marvel's Avengers one; it's really a sight to behold. And in the same fashion, I wonder how many folks actually gave the game a try and checked it out for themselves?




It really does feel like a Nintendo game, sort of a next-gen Mario, I highly recommend it.
In the land of 1440p, 8K is king
 
You contradict yourself...nice move...



Ok...but..



What makes this different? You are basically claiming victory already....

Nothing about it is contradictory if you read carefully what I actually said. What I am speaking on are hardware advantages that are well documented in favor of Xbox Series X. They aren't matters of feeling or opinion. Saying those hardware advantages will more readily come into play further into the gen, when game engines and techniques designed to get the most from them actually arrive, isn't the same as claiming victory based on results one year in, or from a single title, when we know the consoles have much left in the tank. And let's not pretend Series X isn't already demonstrating it's the more capable hardware by the expected percentages. It kinda is.

Below is what we're talking about

12TF 52 Compute Unit Full RDNA 2 GPU (or 26 Dual Compute Units)
3,328 Stream Processors
10GB @ 560GB/s, 6GB @ 336GB/s
1825MHz clock speed (locked)
CPU @ 3.8GHz (without SMT), 3.6GHz (with SMT), both locked
Mesh Shaders, VRS Tier 2, Sampler Feedback Streaming, Ray Tracing, Hardware Machine Learning

The contention some are making is that the 12 TFLOPS GPU of the Series X, with all the above features, doesn't command as much of a performance edge over the PS5 as many suggested it would. The problem with any such assumption is that it's largely premature, since much of what's listed above from an advanced graphics performance feature standpoint hasn't been sniffed at, let alone appeared, in a game this early into the gen. Ray tracing we know harms performance, and we knew that entering the gen, so I won't label that a performance enhancer. Every other feature supported by the Series X GPU is designed to offer enhanced performance.

We have exactly one released game on both systems where one of these performance-enhancing features is in use: Doom Eternal. And it used not the inferior version of VRS, the one that can be software emulated, but the more advanced hardware Tier 2 VRS. What was the performance outcome in what we know to be one of the most advanced and graphically impressive game engines out there?



PS5 and Xbox Series X in Balanced Mode use a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being approximately 2720x1530. Pixel counts below 3840x2160 were found more often on PS5 than Xbox Series X. As an example, in one scene the PS5 dropped to approximately 3456x1944 and the Xbox Series X rendered that scene at 3840x2160.

That's a 23% performance advantage for Series X in Balanced mode.

PS5 and Xbox Series X in Raytracing Mode use a dynamic resolution with the highest resolution found being 3200x1800 and the lowest resolution found being approximately 2266x1275. Pixel counts below 3200x1800 were found more often on PS5 than Xbox Series X. As an example, in one scene the PS5 dropped to approximately 2986x1680 and the Xbox Series X rendered that scene at 3200x1800.

That's a 14% performance advantage for Series X in RT mode.

PS5 in the 120fps Mode uses a dynamic resolution with the highest resolution found being 2816x1584 and the lowest resolution found being approximately 1992x1120.
Xbox Series X in the 120fps Mode uses a dynamic resolution with the highest resolution found being 3200x1800 and the lowest resolution found being approximately 2266x1275.

That's a 29% performance edge for Series X in 120fps mode.
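If anyone wants to check where those percentages come from, it's just the ratio of the pixel counts (my own arithmetic, using the VGTech figures quoted above: per-scene drops for Balanced and RT, each console's ceiling for the 120fps mode):

```python
# Pixel-count ratios behind the Doom Eternal percentages above.
def pixels(res):
    w, h = map(int, res.split("x"))
    return w * h

modes = {
    "Balanced":   ("3840x2160", "3456x1944"),  # XSX vs PS5 in the same scene
    "Raytracing": ("3200x1800", "2986x1680"),
    "120fps":     ("3200x1800", "2816x1584"),  # each console's max resolution
}

for mode, (xsx, ps5) in modes.items():
    print(f"{mode}: Series X renders {pixels(xsx) / pixels(ps5) - 1:+.1%} more pixels")
# -> Balanced: +23.5%, Raytracing: +14.8%, 120fps: +29.1%
```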




Then there's Avengers, another visually advanced looking game.

In the Quality mode where both consoles used dynamic res with native resolutions, Series X maintained a 13% performance advantage.

In the performance mode it's an even more insane 62% advantage for the Series X, because the PS5 is literally rendering far fewer pixels. If we go by Digital Foundry's pixel count, it's an even higher 74% edge. I'll go with VGTech simply because they provide way more data than Digital Foundry.

PS5 in quality mode uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being 3200x1800. Xbox Series X in quality mode uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being approximately 3413x1920.
PS5 in performance mode uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being 2560x1440. PS5 in performance mode uses a form of checkerboard rendering to reach the stated resolutions. Checkerboard 3840x2160 seems to be a common rendering resolution on PS5 in performance mode. PS5 in performance mode can sometimes exhibit half horizontal resolution artifacts and the UI elements can sometimes show visible stippling artifacts, both seem to be related to the use of checkerboard rendering.
Xbox Series X in performance mode uses a dynamic resolution with the highest resolution found being 3840x2160 and the lowest resolution found being approximately 2304x1296. Edit: Xbox Series X in performance mode drops below its maximum resolution more often than PS5 does in performance mode. This is due to Xbox Series X using native rendering instead of checkerboard rendering. The original common resolution range that I posted here for Xbox Series X was removed due to it potentially being inaccurate

Now, something I've always been curious about is this. We know both consoles turn in excellent performance figures in Avengers; neither console is performing badly, which the stats back up. But we also know the PS5 is rendering, at any given moment, far fewer pixels than the Series X. At minimum the gap appears to be 62%. And even then we get these kinds of stats:

[image: framerate comparison stats (ZvDlTem.jpg)]



You would think that if the PS5 in this particular game is rendering so many fewer pixels, how is it possible that its performance is THIS close to the Series X and not pretty much flawless? It suggests that if the PS5 were anywhere close to the native resolutions of the Series X, it would have performed quite a bit worse. Or, simply put, the resolution difference would have needed to be even lower. The PS5's maximum checkerboard resolution of 3840x2160 works out to a native 1920x2160; that's 12.5% more pixels than 1440p. Series X's maximum native 4K is literally 100% more pixels by comparison. Yes, the Series X drops from that native 4K more often than the PS5 drops from its half-checkerboard 4K, but at worst the Series X advantage is in the realm of 62% more resolution performance. Or, to be more fair, 50% more resolution if you leave PS5 at its version of 4K and instead use a lower native pixel count for Series X under its native 4K.
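To spell out that checkerboard arithmetic (my own sketch; a checkerboarded frame natively shades half the pixels of its output grid):

```python
# Effective native pixel counts behind the checkerboard comparison above.
def megapixels(w, h):
    return w * h / 1e6

cb_4k  = megapixels(1920, 2160)  # checkerboard 4K shades half of 3840x2160
p1440  = megapixels(2560, 1440)
nat_4k = megapixels(3840, 2160)

print(f"CB 4K vs native 1440p: {cb_4k / p1440 - 1:+.1%}")  # -> +12.5%
print(f"native 4K vs CB 4K:   {nat_4k / cb_4k - 1:+.0%}")  # -> +100%
```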

This makes what Crystal Dynamics told NXGamer even more valid. Had they not gone with checkerboarding, they would have ended up with potentially worse image quality, or something softer in appearance overall. Checkerboarding, even with its imperfections, guaranteed a better visual outcome than if they hadn't gone with it. Now, we know CD takes their performance seriously, so clearly the game wouldn't have performed like shit at native, because they wouldn't have allowed it to. They would have simply run at a lower overall resolution to maintain the performance, and it would have looked worse next to the Series X version. So they made the smartest decision. And just in case someone tries to claim PS5 has better overall performance because Series X had the lowest minimum framerate, which appears less than a quarter of 1% of the time:

[image: framerate comparison stats (4L1QSIk.jpg)]




Finally, another highly visually advanced title on a game engine that's no joke.

Metro Exodus, yet another demanding game pushing visuals, and stop me if you've heard this one before, but Series X maintains roughly 21-23% more resolution. And no, PS5 holding 60fps slightly more often (by less than 1%) than Series X does not constitute a performance win for the PS5 when it's already running at a lower resolution and dropping below 1080p in more demanding scenes.

PS5 uses a dynamic resolution with the highest native resolution found being approximately 2844x1600 and the lowest resolution found being approximately 1792x1008. Pixel counts at 1792x1008 seem to be rare on PS5. PS5 uses temporal upsampling to reconstruct a 3840x2160 resolution. Xbox Series X uses a dynamic resolution with the highest native resolution found being approximately 2844x1600 and the lowest resolution found being approximately 1920x1080. Pixel counts at 1920x1080 seem to be rare on Xbox Series X. Xbox Series X uses temporal upsampling to reconstruct a 3840x2160 resolution.
Volga Level Opening Train Scene - PS5: 2560x1440, Series X: 2844x1600
Caspian Level Driving - PS5: 2560x1440, Series X: 2844x1600
Taiga Level Exploration - PS5: 2176x1224, Series X: 2400x1350
Taiga Level Forest Demanding Scene - PS5: 1792x1008, Series X: 1920x1080

And finally, though I won't waste time posting it (you can go look at the stats if you don't believe me), Resident Evil Village actually performs better on Series X more than 2% of the time in RT mode. Both consoles are flawless in non-RT mode, not a single drop according to the stats. And because Capcom's checkerboarded solution is so good, it's damn near impossible to get a proper resolution count on the game, so that's inconclusive and we'll assume they're 100% identical. There are some titles where, despite having a resolution advantage, Series X's performance is just unacceptable; that's a PS5 win. The 4 games I just ticked through are not among them.

Another game, the next-gen version of Star Wars Jedi: Fallen Order, is also better on Series X, but that one is much closer: a roughly 8% resolution edge, with less than 1% (not even a tenth of 1%) of a framerate edge going to PS5 in both modes. But admittedly there's far less, practically nothing, to bark about with that one, because they are just that damn close.

Series X has more convincing cases, in more visually demanding games on very advanced engines. And guess what? That advantage is often precisely the GPU performance advantage it's been suggested to have. There are times it seems to almost exceed it. Feel free to look on the PC side at nearly matching GPU configurations and you will notice generally similar or smaller percentage advantages in framerate at identical resolutions. Translation: in console terms, the stronger GPU maintaining higher fps at a particular resolution would be the one running at the higher resolution, with the weaker GPU running at a lesser resolution for performance reasons; in so doing they would get roughly equal framerates, exactly what you're seeing between PS5 and Series X. A familiar trend in these PC benchmarks is that the weaker GPU running at the lower resolution will often see generally better framerates than the stronger GPU gets at its higher resolution, forcing the stronger chip to come down from its perch a little to get closer to performing like the weaker card, just at a higher resolution, because that's what it should be doing.

Just for fun, people can use the 2080 Super to represent Series X and the 2070 Super or 5700 XT to represent PS5 (the differences in hardware and core makeup are roughly close to those between PS5 and Series X). What you'll often see is that the lower-resolution framerate performance of the weaker cards tends to be better than that of their more capable counterparts running at higher resolutions. This would mean that the two cards, through dynamic resolution, would need to draw closer in resolution to meet their targets.


So if people want to label The Touryst a benchmark, then we can show benchmarks for much more graphically demanding titles, developed under similar circumstances and timeframes on more advanced game engines, that demonstrate which platform is more capable. I'm unsure how a game that released nearly an entire year after the Series X launch, with an entirely rewritten game engine and the added benefit of a lot more time under the developers' belt (things the Xbox version couldn't benefit from), can be considered a fair benchmark of the capabilities of the two systems, but that won't stop people from trying to hold it up as evidence of platform capability. Then again, we certainly know people on the Xbox side would have done the same if the roles were reversed, so I guess all is fair. But there's a shiny asterisk all over this one.

You Know It GIF by MOODMAN
 

Lysandros

Member
On reflection, I'm wondering if they have tapped into the whole geometry engine aspect for more efficient object culling and polygon decimation? The visual style of the game I suspect might be very friendly to some sort of low-level technique to omit hidden faces - saving a lot of draw time in the process.
Wouldn't the faster fixed-function discard (due to the Geometry Engine running at a 20% higher frequency) also be beneficial in this case, even without low-level access?
 

Snake29

RSI Employee of the Year
Nothing about it is contradictory if you read carefully what I actually said. What I am speaking on are hardware advantages that are well documented in favor of Xbox Series X.

[...]

So if people want to label The Touryst a benchmark, then we can show benchmarks for much more graphically demanding titles, developed under similar circumstances and timeframes on more advanced game engines, that demonstrate which platform is more capable.


Sorry to interrupt you (nice post), but you are wasting your time posting, trying to defend, claiming victories and being so obsessive about the XSX hardware; the only thing you do is talk positively about the XSX, as if it will beat the PS5 from now on.

No one asked for all that, and still you contradict yourself in the other post.....sorry.

Xbox fans are so deeply invested in and obsessed with the XSX hardware that they are blind to the next disappointment.

Sorry, but this whole post is "I will convince you guys that the XSX is the almighty console". Going back to your previous post:

We are too early into the gen to say "See? See? It's all nothing.

You are actually doing the very thing you warned against here.
 

DenchDeckard

Moderated wildly
We do all have to admit that it's funny that resolution matters now, right?

Literally 16 pages over a resolution that no one's TV can actually display, and that the PS5 can't even output.....but VRR is irrelevant?

This is a classic!

I do think it's a really cool video and you can clearly see the benefit the supersampling provides, but surely it's not worth 16 pages on a 2-year-old Switch game?

News is that dry?
 

DForce

NaughtyDog Defense Force
SenjutsuSage pretends to be neutral, but he showed up again after a long hiatus when more games on the Xbox Series X started winning comparisons.

Somehow he can quote developers stating how an engine benefits from Xbox's hardware, but when a developer says their game takes advantage of a narrow approach with faster clocks, it must be disputed no matter what. :messenger_tears_of_joy:
 

Snake29

RSI Employee of the Year
We do all have to admit that it's funny that resolution matters now, right?

Literally 16 pages over a resolution that no one's TV can actually display, and that the PS5 can't even output.....but VRR is irrelevant?

This is a classic!

I do think it's a really cool video and you can clearly see the benefit the supersampling provides, but surely it's not worth 16 pages on a 2-year-old Switch game?

News is that dry?

The res is getting downsampled, so it looks very clean. It's not a commonly used resolution for gaming, and it's the first game with internal 8K rendering. No, it doesn't mean it's important, but that small res difference between Hitman 3 on XSX and PS5 was more "important" to the Xbox fanboys, who still use it as the "big" hardware difference between the 2 consoles to this day, when it's just an engine thing and nothing more.

This thread wouldn't have so many pages if some butthurt, XSX-brainwashed/obsessed Xbox fanboys hadn't come in to convince PS fans yet again that there is nothing better than the XSX.
 