
DF: The Touryst PS5 - The First 8K 60fps Console Game

Lysandros

Member
The 1060 has 48 ROPs but only 32 rasterizers on the front-end. The ROPs on Pascal were tied to the number of memory channels and so they were 48 because the 1060 was a 192-bit card. It could still only rasterize 32 pixels/clock.

You're right about the GPU utilization, though I'll add that 8K30 is twice the pixel throughput of 4K60. 50% utilization at 4K60 tracks closely with that.
So you are saying TechPowerUp's figure of 82 GPixel/s is wrong?

Yes, 8K/30 FPS is twice as many pixels as 4K/60 FPS, but only half the geometry throughput.
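To put rough numbers on that exchange (a quick sketch, raw pixel counts only):

```python
# Pixel throughput: 8K/30 vs 4K/60 (sketch, raw counts only).
px_4k = 3840 * 2160           # ~8.3 MPix per frame
px_8k = 7680 * 4320           # ~33.2 MPix per frame (4x 4K)

rate_4k60 = px_4k * 60        # pixels per second at 4K/60
rate_8k30 = px_8k * 30        # pixels per second at 8K/30
print(rate_8k30 / rate_4k60)  # 2.0 -> 8K/30 pushes twice the pixels of 4K/60

# Geometry is submitted per frame, so a 30 fps target runs the
# draw/triangle-setup work half as often as a 60 fps target.
```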
 

Dodkrake

Banned
The Geometry Engine is the exact same thing as the Geometry Engine found in any RDNA card; there is absolutely no evidence of it being customised any further by Sony. Even the presentation by Mark Cerny does not allude to this; he discusses the exact feature set that the new Geometry Engine introduced in RDNA. Refer to pages 6-9 here.

Cache Scrubbers are handled by the GPU itself and game developers do not need to do anything to take advantage of it, as Cerny himself noted on The Road to PS5 from 19:10 on.

The I/O complex is not going to assist with GPU demanding tasks, it is there to facilitate streaming of data from the SSD directly into RAM, it can certainly help with faster asset loading and this is a solid advantage for the PS5.

Tempest engine is just 3D audio, and the XSX has custom dedicated 3D audio processors of its own.

The features I was referring to are directly responsible for assisting with GPU loads, such as VRS, which is used to great effect in Doom Eternal. Sampler Feedback Streaming can also have a positive performance impact, as can Mesh Shaders. In the case of Mesh Shaders, the PS5 has comparable technology via Primitive Shaders, so I expect minimal difference there.

Even with those advantages I expect games to range from running better on the PS5 to running better on the XSX, depending on the game engine and various other factors, with the overall advantage going to the XSX. Once again, the differences should not be anything major.

1. Absence of evidence is not evidence of absence

2. Correct

3. Correct, but not having data get stuck helps with efficiency, which in turn helps with GPU tasks

4. Cool

5. Yeah, the Xbox has all those things, but Sony was incompetent enough to not have any performance-saving measures in their SDKs and APIs.

6. Source needed. The clear advantage on the Xbox has mostly been a gain in resolution (expected, due to the TF difference), but the PS5 has shown more stable framerates and even better textures/texture filtering (the latter in a limited number of games).

As the gen moves on, the PS5 will distance itself from the Xbox, especially due to its lower-level API access. Devs cannot be arsed to code for the Series X specifically when there's such an easy port from PC to XBS consoles. Additionally, DirectX is a hog.
 
Ori is also 6K on Xbox Series X so it might be the sweet spot

That was one of the first things I considered, but then I realized how different Ori and the Touryst truly are. Ori as a whole is a significantly more demanding title, all things considered.



Those boss fights, those natural disasters and creatures you have to engage with, the effects, the combat, it's all on a whole other level.
 

RoadHazard

Gold Member
I have always felt that the Jaguar CPUs in the PS4 held the GPU back in many ways. The GPU had to do more heavy lifting in 60 FPS games than a similar GPU would on PCs with better CPUs. Those Uncharted 4 hacks that got the game running at 60 fps only managed it by reducing the resolution down to 560p: a 1/4 resolution drop, instead of the 1/2 it would take on PCs to double the FPS.

The PS5 GPU also has 1.5x IPC gains compared to the GCN 1.0 PS4 GPU, but that still puts the PS5 at around 15 GCN 1.0 TFLOPS. I'd expect an 8x resolution boost. 16x is mind-boggling and is probably mostly due to the 8x more powerful CPU.
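As a sanity check on the quoted estimate (a sketch; the TF figures are public specs, the 1.5x IPC factor is the post's own assumption):

```python
# "GCN-equivalent TFLOPS" estimate from the quoted post (sketch).
ps4_tf = 1.84        # PS4 GPU, GCN 1.0
ps5_tf = 10.28       # PS5 GPU, RDNA 2 (peak, variable clock)
ipc_gain = 1.5       # per-TF effectiveness gain assumed in the post

gcn_equiv = ps5_tf * ipc_gain         # ~15.4 "GCN 1.0" TFLOPS
print(gcn_equiv, gcn_equiv / ps4_tf)  # ~15.4, ~8.4x -> the "8x resolution boost" expectation
```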

There's no way this game is CPU-bound on anything. It's 60fps on Switch. This is all GPU.
 

Zathalus

Member
1. Absence of evidence is not evidence of absence

2. Correct

3. Correct, but not having data get stuck helps with efficiency, which in turn helps with GPU tasks

4. Cool

5. Yeah, the Xbox has all those things, but Sony was incompetent enough to not have any performance-saving measures in their SDKs and APIs.

6. Source needed. The clear advantage on the Xbox has mostly been a gain in resolution (expected, due to the TF difference), but the PS5 has shown more stable framerates and even better textures/texture filtering (the latter in a limited number of games).

As the gen moves on, the PS5 will distance itself from the Xbox, especially due to its lower-level API access. Devs cannot be arsed to code for the Series X specifically when there's such an easy port from PC to XBS consoles. Additionally, DirectX is a hog.
Mark Cerny called it the new Geometry Engine. RDNA had a new Geometry Engine compared to GCN. All the features he talked about are possible and detailed in the RDNA whitepaper. There is no reason to assume it is anything but that; heck, try to Google any source for it being anything special and all you get is information from "insiders" who were wrong on everything else they speculated about.

Also, what source are you referring to? That some games run better on XSX? I listed them in my previous post. Sure, some of them have slightly better FPS than the XSX version, but the resolution difference can be in excess of 20%. We also have Doom Eternal, which has the exact same FPS on both but can have a 30% resolution advantage on the XSX. Games that run at the same resolution on both can perform in excess of 10% better on the XSX, such as Control and RE: Village.

As for the API, both have low-level APIs. DX12 is literally that (it was actually designed to counter the PS4's low-level API) and is more efficient on the XSX than it is on PC, as I have detailed quite a bit in this thread.
 
Last edited:
1. Absence of evidence is not evidence of absence

2. Correct

3. Correct, but not having data get stuck helps with efficiency, which in turn helps with GPU tasks

4. Cool

5. Yeah, the Xbox has all those things, but Sony was incompetent enough to not have any performance-saving measures in their SDKs and APIs.

6. Source needed. The clear advantage on the Xbox has mostly been a gain in resolution (expected, due to the TF difference), but the PS5 has shown more stable framerates and even better textures/texture filtering (the latter in a limited number of games).

As the gen moves on, the PS5 will distance itself from the Xbox, especially due to its lower-level API access. Devs cannot be arsed to code for the Series X specifically when there's such an easy port from PC to XBS consoles. Additionally, DirectX is a hog.


Bold statement indeed. I think the opposite will end up proving true. The Series X, as the gen moves on, will start stretching its legs and leave no doubt, particularly once things like Sampler Feedback Streaming become more common. And if the machine learning hardware ends up delivering what Microsoft claims it will, the difference will be even more significant.
 

DenchDeckard

Moderated wildly
Yeah.



Well there you have it :) thanks.
 

Mr Moose

Member
That was one of the first things I considered, but then I realized how different Ori and the Touryst truly are. Ori as a whole is a significantly more demanding title, all things considered.



Those boss fights, those natural disasters and creatures you have to engage with, the effects, the combat, it's all on a whole other level.

This game has completely locked fps (so there's some headroom there).
 

SlimySnake

Flashless at the Golden Globes
There's no way this game is CPU-bound on anything. It's 60fps on Switch. This is all GPU.
What's the resolution of the Switch version?

There were 60 fps games on the PS360 which were far more graphics-intensive. The Jaguar CPUs are roughly on par with those PS360 CPUs.
 
My position is that a single game is a piss poor representation of anything.

Some games that run better on Series X:

Control
RE: Village
Tales of Arise
Subnautica: Below Zero
Metro Exodus
Outriders
Watch Dogs: Legion
Hitman 3
Marvel's Avengers
Doom Eternal

Some games that run better on PS5:

The Touryst
Assassin's Creed: Valhalla
Little Nightmares II
Call of Duty Black Ops: Cold War
Dirt 5
Scarlet Nexus

Then there are a ton of games that are basically identical on both (most having less than a 0.05 FPS average difference between them).

Thus, I think it should be pretty clear that the consoles are very close to each other, each with respective weaknesses and strengths. Some game engines will work better on one, while others will work better on the other. I'd wager this tit-for-tat will go on for most of the generation, with the XSX taking a slight lead in most titles on average, especially once game engines are optimised to take advantage of features the XSX has that the PS5 does not. Even then, the difference will be way smaller than the XONE/PS4 or XOX/Pro difference.
Deathloop, made by Microsoft, plays better on PS5.
 
Oh, so it's using Velocity Architecture? SFS? All the other DX12U features and all the Xbox-specific features? No, it's not.

Have you seen the game? It appears you don't understand what Velocity Architecture and SFS do.

Can you tell me where all those huge textures are that need their mipmaps partially streamed? Or all those very big and varied models that need to be streamed in and out very quickly as the character runs so fast?
 
Last edited:

RoadHazard

Gold Member
What's the resolution of the Switch version?

There were 60 fps games on the PS360 which were far more graphics-intensive. The Jaguar CPUs are roughly on par with those PS360 CPUs.

"In docked mode, resolution can vary from a maximum of 1080p to slightly less than 50 per cent on both axes. Typically, outdoor areas average around 810p to around 900p while indoor areas stick closer to full 1080p in most cases. Portable mode uses the same technique, with a maximum resolution of 720p and 50 per cent of that on each axis for the lower bounds. It typically jumps between 612p and 720p in this mode."

(DF)

Anyway, resolution doesn't affect CPU utilization, but framerate does. I understand you're saying that because the Jaguar is so weak the GPU often has to be used for more compute tasks, limiting what else it can be used for, but I REALLY doubt that's the case here. It's a very simple game.
 

Darius87

Member
The Geometry Engine is the exact same thing as the Geometry Engine found in any RDNA card; there is absolutely no evidence of it being customised any further by Sony. Even the presentation by Mark Cerny does not allude to this; he discusses the exact feature set that the new Geometry Engine introduced in RDNA. Refer to pages 6-9 here.
Isn't the patent from Cerny for the GE evidence?
https://www.neogaf.com/threads/cerny-patent-on-geometry-engine-details.1556724/
The GE for the PS5 is definitely custom.

The dev mentioning the PS5's memory setup could mean many things. I don't think it's related to GPU BW versus the XSX; it could be caches or something else. One thing is for sure: pixel fill rate is a major contributor to high resolutions.
 

Lysandros

Member
Can someone please tell the developers of this game that the unshakable state of truth according to the 'Holy Guards of Teraflop' cult of GAF is '0.1 more TF = 1% more resolution', and that not only are they clueless about their own engine but they are also on a very dangerous path that can create a rift in spacetime which will doom us all?
 
Last edited:

Zathalus

Member
Isn't the patent from Cerny for the GE evidence?
https://www.neogaf.com/threads/cerny-patent-on-geometry-engine-details.1556724/
The GE for the PS5 is definitely custom.

The dev mentioning the PS5's memory setup could mean many things. I don't think it's related to GPU BW versus the XSX; it could be caches or something else. One thing is for sure: pixel fill rate is a major contributor to high resolutions.
No, it is not. There is a patent detailing shared L3 cache on the CPU, and that is clearly not the case.
 

ethomaz

Banned
Remember that game called The Falconeer?

It was supposed to be running (and releasing) at 8K60 on early Series X devkits.

But it ended up at 4K60 on the final release devkit.

I wonder if the same thing happened there and here.
 

Allandor

Member
Ori is also 6K on Xbox Series X so it might be the sweet spot
The Touryst has almost no textures to speak of that could eat up memory, so I guess memory size is really not a problem.
The game on Xbox Series is just a quick port of the Xbox One version, which was itself ported from the Switch version to just run OK on Xbox One and Xbox One X (already 4K). Then it got a Series X patch that just increases the resolution to 6K. A complete engine rewrite was necessary for a native PS5 app.

The game is really a good game. But from a technical perspective it is really not a demanding one. Why did they stop at 6K on Series X? Well, I guess it was just to get it released without further improvements over the Xbox One X version (other than the res bump and the 120Hz mode). Now, almost one year and much more work later, they have rewritten the engine to work with the PS5 API. Much more current knowledge etc. went into the new code.

We can speculate about this game for a long time, but what we are comparing here is more or less one game in two very different engine states, even with small differences in visual features.
 

Hoddi

Member
So you are saying TechPowerUp's figure of 82 GPixel/s is wrong?

Yes, 8K/30 FPS is twice as many pixels as 4K/60 FPS, but only half the geometry throughput.
TPU is just multiplying the number of ROPs by the official boost clock. This mismatch between the rasterizers and ROPs is addressed in more detail in AnandTech's review.
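That works out exactly (a quick sketch; 1.708 GHz is the card's reference boost clock):

```python
# GTX 1060: ROP-based vs raster-limited pixel rate (sketch).
boost_clock_ghz = 1.708

rop_rate = 48 * boost_clock_ghz      # ~82 GPix/s: ROPs x clock, TPU's method
raster_rate = 32 * boost_clock_ghz   # ~54.7 GPix/s: front-end rasterizes 32 px/clk

print(rop_rate, raster_rate)  # achievable fill rate is bounded by the smaller number
```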
 

Rea

Member
The Geometry Engine is the exact same thing as the Geometry Engine found in any RDNA card; there is absolutely no evidence of it being customised any further by Sony. Even the presentation by Mark Cerny does not allude to this; he discusses the exact feature set that the new Geometry Engine introduced in RDNA. Refer to pages 6-9 here.
Read the whole paper up to the conclusion; there's absolutely no evidence suggesting that the PS5 GE is the same as RDNA1's. Am I missing something? Do I need special glasses or something?
 

Snake29

RSI Employee of the Year
My position is that a single game is a piss poor representation of anything.

Some games that run better on Series X:

Control
RE: Village
Tales of Arise
Subnautica: Below Zero
Metro Exodus
Outriders
Watch Dogs: Legion
Hitman 3
Marvel's Avengers
Doom Eternal

Some games that run better on PS5:

The Touryst
Assassin's Creed: Valhalla
Little Nightmares II
Call of Duty Black Ops: Cold War
Dirt 5
Scarlet Nexus

Then there are a ton of games that are basically identical on both (most having less than a 0.05 FPS average difference between them).

Thus, I think it should be pretty clear that the consoles are very close to each other, each with respective weaknesses and strengths. Some game engines will work better on one, while others will work better on the other. I'd wager this tit-for-tat will go on for most of the generation, with the XSX taking a slight lead in most titles on average, especially once game engines are optimised to take advantage of features the XSX has that the PS5 does not. Even then, the difference will be way smaller than the XONE/PS4 or XOX/Pro difference.

Run better? Then they run better on PS5 performance-wise, not XSX.
 
Last edited:
Ya, I've edited my post to include my old GTX 1060. There's something very strange about this game not hitting 8K60 on the XSX, as it should be at least 3x faster than that card.

My 2080 Ti runs it at 100fps+.
Remember we actually don't know the average framerate on PS5. It could be 100fps for all we know. What we know is that it never ever drops under 60fps. What's the minimum framerate on your 2080 Ti in the most demanding scene?
 
Last edited:
My position is that a single game is a piss poor representation of anything.

Some games that run better on Series X:

Control
RE: Village
Tales of Arise
Subnautica: Below Zero
Metro Exodus
Outriders
Watch Dogs: Legion
Hitman 3
Marvel's Avengers
Doom Eternal

Some games that run better on PS5:

The Touryst
Assassin's Creed: Valhalla
Little Nightmares II
Call of Duty Black Ops: Cold War
Dirt 5
Scarlet Nexus

Then there are a ton of games that are basically identical on both (most having less than a 0.05 FPS average difference between them).

Thus, I think it should be pretty clear that the consoles are very close to each other, each with respective weaknesses and strengths. Some game engines will work better on one, while others will work better on the other. I'd wager this tit-for-tat will go on for most of the generation, with the XSX taking a slight lead in most titles on average, especially once game engines are optimised to take advantage of features the XSX has that the PS5 does not. Even then, the difference will be way smaller than the XONE/PS4 or XOX/Pro difference.
But all the games you have listed there are either cross-gen games or early next-gen exclusives, which means they are a piss-poor representation of next gen's capabilities, if I stay with your wording here.
And in that regard The Touryst stands out, because the team used PS5's low-level API.
And as far as I remember there's no differentiation on Xbox between a low- and high-level API. They use DX12, which is to some extent a low-level API, but it cannot hope to ever be as close to the metal as what Sony uses with GNM (at least that was the name of their low-level API for PS4). I understood MS's effort with BC, for example, as a specialised interpreter (a gearbox, if you will) between old-gen architecture code and newer-architecture hardware. That is now built into DX12. That's the main advantage MS has over Sony in that regard: they opted for built-in BC functionality in their DX12.

They created DX12 as a solution for everything: a little high level, a little low level, some BC functionality. So there is no better or more efficient usage of DX12 for The Touryst on Xbox X/S; you use DX12 and that's it. On PS5, and earlier on PS4, it is different. There is the high-level GNMX, which is comparable to DX11 in its (fixed) solutions for certain problems, and then of course GNM, the low-level API.
Back in the day you could easily tell which one was mainly used for developing a game in the PS4 era.
Games like Elex looked like arse while running like shit, seemingly maxing out the PS4/Pro. And then there were games like God of War, RDR2, Horizon Zero Dawn and others (almost all exclusives) that looked a generation ahead while even running better.
And all this talk about DX12 not being used well for The Touryst on Xbox: they achieved 6K/60 on it. That is not something you can do by programming it like shit or being a "lazy dev".

The Touryst uses a small, efficient custom engine which therefore makes good use of both systems' APIs and hardware. The result was 14.5 million more pixels rendered by PS5 over Xbox Series X; almost double the pixel count on PS5.
So what we see here is either the performance difference of the different hardware approaches, of the APIs, or a combination of both. Since The Touryst is such a small project, they could alter their engine and adapt it for next gen fairly easily, so I would count it as an early next-gen demo of sorts, even if there are versions of the game for PS4/Xbox One and Switch. Actually, the fact that they were able to push out so many different versions of their game while being a small indie developer suggests that their engine is clean and easy to adapt to any system.

But like you said, in the end both systems will have games which are just better suited to either a high-clock or wide architecture.
 
Last edited:

DenchDeckard

Moderated wildly
It being a native Series X app doesn't mean the engine was rewritten specifically for the console. It's still a port of the Xbox One version with enhancements. An engine rewrite and using the Series X GDK are not the same exact things.

It might be true, but there's no point in us arguing it. It is just the way it is. Yes, the game launched months ago on Xbox, so maybe they didn't have a chance to get the most out of the system, but it is what it is.

Respect to Shin'en for really pushing the native PS5 version here; hopefully they go back and give the Xbox Series X some extra attention, but that boat has probably sailed.
 
Last edited:
"In docked mode, resolution can vary from a maximum of 1080p to slightly less than 50 per cent on both axes. Typically, outdoor areas average around 810p to around 900p while indoor areas stick closer to full 1080p in most cases. Portable mode uses the same technique, with a maximum resolution of 720p and 50 per cent of that on each axis for the lower bounds. It typically jumps between 612p and 720p in this mode."

(DF)

Anyway, resolution doesn't affect CPU utilization, but framerate does. I understand you're saying that because the Jaguar is so weak the GPU often has to be used for more compute tasks, limiting what else it can be used for, but I REALLY doubt that's the case here. It's a very simple game.
Less than 50% on both axes is a polite way to say a quarter of the resolution. So min resolutions are <540p docked and <360p in mobile. Contrary to what many people are claiming, this game is actually very demanding even for Switch. To put it into context, that's even more demanding than The Witcher 3 on the same console.
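The arithmetic, for anyone checking (a trivial sketch):

```python
# "50% on both axes" means a quarter of the pixels (sketch).
docked_min_ratio = (960 * 540) / (1920 * 1080)     # <540p vs 1080p
portable_min_ratio = (640 * 360) / (1280 * 720)    # <360p vs 720p
print(docked_min_ratio, portable_min_ratio)        # 0.25, 0.25
```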
 

SlimySnake

Flashless at the Golden Globes
"In docked mode, resolution can vary from a maximum of 1080p to slightly less than 50 per cent on both axes. Typically, outdoor areas average around 810p to around 900p while indoor areas stick closer to full 1080p in most cases. Portable mode uses the same technique, with a maximum resolution of 720p and 50 per cent of that on each axis for the lower bounds. It typically jumps between 612p and 720p in this mode."

(DF)

Anyway, resolution doesn't affect CPU utilization. But framerate does. I understand you're saying that because the Jaguar is so weak the GPU often has to used for more compute tasks, limiting what else it can be used for, but I REALLY doubt that's the case here. It's a very simple game.
Thanks. 900p 60 fps for a 0.39 TFLOPS Switch in docked mode is pretty good, but then it doesn't explain why the base PS4 is only 1080p 60 fps despite 4x the TFLOPS, or why the Pro also tops out at 1512p despite 10x the TFLOPS of the Switch.

Now, 33 million pixels vs the 1.4 million pixels of 900p is a 23x increase. That actually lines up with the increase in TFLOPS going from 0.39 to 10.2 (26x).

So I would say that the engine rewrite they did is probably the reason why they are able to take full advantage of the PS5, because the base PS4 should be able to do 4K 60 fps if the Switch can do 900p 60 fps.
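The quoted ratios check out (a quick sketch using the post's own figures):

```python
# 23x pixels vs ~26x TFLOPS (sketch, figures from the post).
px_8k = 7680 * 4320        # ~33.2 MPix
px_900p = 1600 * 900       # ~1.44 MPix (docked Switch average)

print(px_8k / px_900p)     # ~23x more pixels
print(10.2 / 0.39)         # ~26x the TFLOPS, so the scaling roughly lines up
```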
 
Last edited:
Thanks. 900p 60 fps for a 0.39 TFLOPS Switch in docked mode is pretty good, but then it doesn't explain why the base PS4 is only 1080p 60 fps despite 4x the TFLOPS, or why the Pro also tops out at 1512p despite 10x the TFLOPS of the Switch.

Now, 33 million pixels vs the 1.4 million pixels of 900p is a 23x increase. That actually lines up with the increase in TFLOPS going from 0.39 to 10.2 (26x).

So I would say that the engine rewrite they did is probably the reason why they are able to take full advantage of the PS5, because the base PS4 should be able to do 4K 60 fps if the Switch can do 900p 60 fps.
You are comparing different things.

Switch docked mode: min resolution less than 540p (maybe 500p? DF didn't specify), max 1080p
Pro: min 1440p max 2160p
PS4: min 1080p (PS4 can't go higher than this).
XSX: min 6K
PS5: min 8K

You need to compare the minimum resolutions in the most demanding scenes.
 
Last edited:

SlimySnake

Flashless at the Golden Globes
You are comparing different things.

Switch docked mode: min resolution less than 540p (maybe 500p? DF didn't specify), max 1080p
Pro: min 1440p max 2160p
PS4: min 1080p (PS4 can't go higher than this).
XSX: min 6K
PS5: min 8K

You need to compare the minimum resolutions in the most demanding scenes.
I am looking at the average provided by DF.

The DRS situation is more obvious to spot on PS4 Pro, where the game operates mostly in the 1440p-1512p range.

For Switch.
Typically, outdoor areas average around 810p to around 900p while indoor areas stick closer to full 1080p in most cases.

Technically the PS5 can't do 8K either; that didn't stop them from downsampling from a higher base. There should be nothing stopping the PS4 from rendering internally at a higher resolution and then downsampling to 1080p. But it doesn't do that, and DF thinks there might be DRS in play despite the low 1080p 60 fps target.

The base last-gen console hits 1080p at 60 frames as you might expect, but while unconfirmed, there may be the inclusion of dynamic resolution scaling in the odd situation.
 

Clear

CliffyB's Cock Holster
Never forget that the rendering cost per pixel isn't a constant. It depends on how many operations and combinations are involved in shading each one. So just being able to process more at a time because you have a "wider" GPU isn't necessarily going to give better results than a "narrower" but faster one where the shading process per pixel is accelerated.
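A toy model of that idea (illustrative numbers only, not a real workload; the ALU/clock figures are just the two consoles' public shapes):

```python
# Toy model: two frame-rate bounds for a "wide" vs a "narrow but fast" GPU.
def alu_bound_fps(pixels, ops_per_pixel, alus, clock_ghz):
    """FPS if purely shader-bound (FMA counted as 2 ops)."""
    return (alus * 2 * clock_ghz * 1e9) / (pixels * ops_per_pixel)

def fill_bound_fps(pixels, pixels_per_clock, clock_ghz):
    """FPS if purely bound by fixed-function raster/fill."""
    return (pixels_per_clock * clock_ghz * 1e9) / pixels

px = 3840 * 2160
print(alu_bound_fps(px, 5000, 3328, 1.825), fill_bound_fps(px, 64, 1.825))  # wider, slower
print(alu_bound_fps(px, 5000, 2304, 2.230), fill_bound_fps(px, 64, 2.230))  # narrower, faster

# Heavy shaders favour the wider part; fixed-function stages scale with clock,
# so light per-pixel work at very high resolutions favours the faster-clocked one.
```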
 

Connxtion

Member
The only facts we know are that it is 8K 60 on PS5 and it isn't on Xbox. Maybe they patch it and the whole conversation goes out of the window and changes, but ultimately this is where we are at right now.

Now saying that, if a PC with a 2080 can run it at 8K 60 fps... I really can't get my head around how these consoles can't, if coded for them. So it does make me lean towards wondering what is going on with the Series version. I also wonder, does the game have the Optimized for Series X|S badge? Because if it does, don't they have to be using the Series consoles' API?
Nup, what determines the badge is the SDK it's built on, shown under the info menu for a game (Gen10 for this gen?).

COD: MW was built with the newer SDK and all that did was force it on the internal storage 🙈 a week or two later they removed it and it went back to being just the standard Xbox version 🤷‍♂️

So it seems they can just update the SDK, or hit a toggle in a config menu to say use Gen10 or whatever gen we are on, and it will apply the XS logo and force it onto internal storage 🤷‍♂️ on the Xbox.

What I suspect happened here is they just ran the X1X version on the XSX and saw how high they could push the res before it dipped the FPS 🤷‍♂️ and I assume they need to use the newer SDK to do the res bumps.

Either way it's a moot point; the game looks fine on both, and unless you're sitting 2mm away from the screen you won't ever notice 🤷‍♂️😂
 
The Touryst has almost no textures to speak of that could eat up memory, so I guess memory size is really not a problem.
The game on Xbox Series is just a quick port of the Xbox One version, which was itself ported from the Switch version to just run OK on Xbox One and Xbox One X (already 4K). Then it got a Series X patch that just increases the resolution to 6K. A complete engine rewrite was necessary for a native PS5 app.

The game is really a good game. But from a technical perspective it is really not a demanding one. Why did they stop at 6K on Series X? Well, I guess it was just to get it released without further improvements over the Xbox One X version (other than the res bump and the 120Hz mode). Now, almost one year and much more work later, they have rewritten the engine to work with the PS5 API. Much more current knowledge etc. went into the new code.

We can speculate about this game for a long time, but what we are comparing here is more or less one game in two very different engine states, even with small differences in visual features.

I guarantee the issue is memory.

Weaker PC GPUs, ones with less RAM and even less pixel fillrate, run it at 8K. But on PCs, VRAM isn't the only major pool of RAM in the system. With Series X, developers need to control more specifically where in RAM data is being accessed to guarantee maximum performance out of the GPU. This is why 10GB is labeled by Microsoft as GPU-optimal and the rest is labeled standard memory.

Let's use Doom Eternal as an example. Xbox Series S in that game has zero ray tracing. Why? Is it because it can't do ray tracing? Not powerful enough to ray trace? Of course not. It has ray tracing in plenty of other games, including Metro Exodus. But Doom Eternal on Series S has no ray tracing likely because, with their current engine design and optimizations, ray tracing simply didn't work within the memory constraints of Series S. This could change in the future, for example with Sampler Feedback Streaming, as textures would then take up much less space in memory, freeing up much-needed RAM to ray trace on Series S. Or they may just optimize for it in other ways.
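A back-of-the-envelope version of that argument (illustrative numbers; the ~2.5x figure is Microsoft's own ballpark claim for SFS, the budget split is made up):

```python
# Sketch: freeing texture memory could make room for RT buffers (numbers illustrative).
game_budget_gb = 8.0       # roughly what Series S leaves for games
textures_gb = 4.0          # hypothetical resident texture footprint today
sfs_multiplier = 2.5       # Microsoft's ballpark "effective memory" claim for SFS

textures_with_sfs = textures_gb / sfs_multiplier   # only the sampled tiles stay resident
print(f"freed: {textures_gb - textures_with_sfs:.1f} GB")  # ~2.4 GB for BVH/RT data
```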

Series X, however, did get ray tracing. So what does this basic example tell us? A piece of hardware not doing something isn't always evidence that it can't do it. It could potentially be a sign that it can't, gotta be fair there, but more often than not it is a matter of optimization.

How did they achieve 6K in Ori, for example? It was due in large part to all the optimizations and changes they made to get the game working on Switch, which they then went and applied intelligently to Series X. He makes clear that had they just taken the Xbox version of the game and put it on Series X, they wouldn't have achieved the performance targets they aimed for.

"If we were to just take the version that we shipped for the original Xbox, and we would have tried to hit 120hz that would have been quite a bigger leap than taking the switch version that we knew would have been that much more optimized." They had to break the game apart and put it back together again in order to achieve what they did on Series X.

They had to reinvent the rendering pipeline, reinvent the simulation, reinvent the streaming, the memory management, all of those things

Optimizing for the least performant Switch allowed them to push things even further on Series X and even enhanced the quality. They created a bunch of sliders that they used to optimize different parts of the game for Switch, and they were able to utilize those same sliders, though not all, to modify things like depth of field and the resolutions of specific things in the background without anybody being able to tell the difference on Series X.

But even after those Switch optimizations it still wasn't enough in some cases; they had to go and do more on Series X. Some optimizations couldn't be brought over, and he talks about making Ori very specific to the Series X. Things they did specific to the engine path on Switch "you couldn't just get for free" on Series X. They needed to make engine-specific changes for Series X.

When going from 4K to 6K with their motion blur, they keep the other slices the same because it's a high enough precision to still look great once brought up to 6K, and most can't tell with the motion blur used. When they go to 6K they really only go to 6K with one specific center slice and the rest are secretly lower, and that's how they supersample the character and the artwork in the center, and that's how they get the crisp look of native 6K.

The developer explains much of this to DF from the point where the video starts to about the 25-minute mark. An excellent video where they go into a whole lot of detail on the kind of work it took to get a game like Ori working on Series X at 6K. At 27:57 John asks him what the biggest bottlenecks were on the way to 120fps. He said a huge challenge was giving themselves enough room performance-wise on the GPU to ensure they always nailed their 8.3ms window. He said a challenge was preventing spikes in GPU performance (hello, that sounds like the kind of spikes that might occur if you run into that split-memory situation); again he refers to all the tweaks and optimizations made to animation systems, particle systems and the like. They didn't want to rely on VRR, he said.

The Switch-based optimizations ultimately weren't enough, so it'd be hovering above 100fps on Series X, and in some cases hitting 120fps, but it wasn't perfect; they had to balance work across cores and that kind of stuff. So Ori at 4K 120 and 6K 60fps was a challenge. He stresses it wasn't the hardest thing in the world, but it took some work.

 
Last edited:

yewles1

Member
But all the games you have listed there are either cross-gen games or early next-gen exclusives, which means they are a piss-poor representation of next gen's capabilities, if I stay with your wording here.
And in that regard The Touryst stands out, because the team used PS5's low-level API.
And as far as I remember there's no differentiation on Xbox between a low- and high-level API. They use DX12, which is to some extent a low-level API, but it cannot hope to ever be as close to the metal as what Sony uses with GNM (at least that was the name of their low-level API for PS4). I understood MS's effort with BC, for example, as a specialised interpreter (a gearbox, if you will) between old-gen architecture code and newer-architecture hardware. That is now built into DX12. That's the main advantage MS has over Sony in that regard: they opted for built-in BC functionality in their DX12.

They created DX12 as a solution for everything: a little high level, a little low level, some BC functionality. So there is no better or more efficient usage of DX12 for The Touryst on Xbox X/S; you use DX12 and that's it. On PS5, and earlier on PS4, it is different. There is the high-level GNMX, which is comparable to DX11 in its (fixed) solutions for certain problems, and then of course GNM, the low-level API.
Back in the day you could easily tell which one was mainly used for developing a game in the PS4 era.
Games like Elex looked like arse while running like shit, seemingly maxing out the PS4/Pro. And then there were games like God of War, RDR2, Horizon Zero Dawn and others (almost all exclusives) that looked a generation ahead while even running better.
And all this talk about DX12 not being used well for The Touryst on Xbox: they achieved 6K/60 on it. That is not something you can do by programming it like shit or being a "lazy dev".

The Touryst uses a small, efficient custom engine which therefore makes good use of both systems' APIs and hardware. The result was 14.5 million more pixels rendered by PS5 over Xbox Series X; almost double the pixel count on PS5.
So what we see here is either the performance difference of the different hardware approaches, of the APIs, or a combination of both. Since The Touryst is such a small project, they could alter their engine and adapt it for next gen fairly easily, so I would count it as an early next-gen demo of sorts, even if there are versions of the game for PS4/Xbox One and Switch. Actually, the fact that they were able to push out so many different versions of their game while being a small indie developer suggests that their engine is clean and easy to adapt to any system.

But like you said, in the end both systems will have games which are just better suited to either a high-clock or wide architecture.
This is reminding me of PS2 vs DC in the beginning...
 

MrFunSocks

Banned
It might be true, but there's no point in us arguing it. It is just the way it is. Yes, the game launched months ago on Xbox, so maybe they didn't have a chance to get the most out of the system, but it is what it is.

Respect to Shin'en for really pushing the native PS5 version here; hopefully they go back and give the Xbox Series X some extra attention, but that boat has probably sailed.
I think this is the first time in history that anyone has suggested that there’s a possibility that a dev, any dev, completely used the entire maximum amount of power possible in a console on console launch day lol

Of course he didn't lol. The entire game is 500MB, yet you think the memory and CPU clock speed of a 12TF machine isn't enough to get probably the most basic 3D game running at 8K?
 
It might be true, but there's no point in us arguing it. It is just the way it is. Yes, the game launched months ago on Xbox, so maybe they didn't have a chance to get the most out of the system, but it is what it is.

Respect to Shin'en for really pushing the native PS5 version here; hopefully they go back and give the Xbox Series X some extra attention, but that boat has probably sailed.

I can't stress this enough. Maximum respect to the developer on this. This game is legitimately a classic. They don't make games like this anymore and amongst all the other stuff, the one fact that people should come away with is that more people need to try this fantastic game. It takes me back to those days when I would come across an old PC title that was like this, but it was never THIS friggin beautiful. Excellent idea, great puzzles, just gaming at its finest.
 
I think this is the first time in history that anyone has suggested that there’s a possibility that a dev, any dev, completely used the entire maximum amount of power possible in a console on console launch day lol
Of course he didn't lol. The entire game is 500MB, yet you think the memory and CPU clock speed of a 12TF machine isn't enough to get probably the most basic 3D game running at 8K?

It doesn't work like that. You can have small code and still bring a system to its knees; it's not the amount of assets, it's what it does on the CPU/GPU. In the case of this game it's more GPU-intensive, as it doesn't look like much is needed to run the logic. Being so small is good: if you can spare 500 MB and keep all assets in RAM, you can have no loading time without requiring fast I/O storage, PRT or other things :)

The developer mentioned speed and memory setup, not TF. TFLOPS is a maximum number of operations per second, but this is a theoretical value and is not the only important metric on a GPU. In a game you are not just doing floating-point operations one after another for a second to see how much you can do; you are going to do lots of things and of course render to screen. All of that takes time and requires access to memory too. If you cannot accommodate a certain buffer, and/or access it as fast as you want, you do it another way or reduce it, just like any other game. If they use less complex shaders or later find a way to improve how fast they render to the buffers, then they can improve it.
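For context, the headline TF number itself is just ALU count x 2 ops x clock (a sketch with the public spec figures):

```python
# Where the consoles' TFLOPS figures come from (sketch).
def tflops(shader_alus, clock_ghz):
    return shader_alus * 2 * clock_ghz / 1000   # 2 ops/clock per ALU (FMA)

print(tflops(2304, 2.230))   # PS5: ~10.3 TF (36 CUs x 64 ALUs, variable clock)
print(tflops(3328, 1.825))   # XSX: ~12.1 TF (52 CUs x 64 ALUs, fixed clock)
# A peak rate only: real frames also hit bandwidth, raster and cache limits.
```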
 
Last edited:

DForce

NaughtyDog Defense Force
Bold statement indeed. I think the opposite will end up proving true. Series X as the gen moves on will start stretching its legs and leave no doubt, particularly once things like Sampler Feedback Streaming become more common. And if the machine learning hardware ends up producing like Microsoft claims it will, the difference will be even more significant.
You keep mentioning SFS when it's common in a lot of engines. It's not going to be anything like you hope.
 

MrFunSocks

Banned
It doesn't work like that. You can have small code and still bring a system to its knees; it's not the amount of assets, it's what it does on the CPU/GPU. In the case of this game it's more GPU-intensive, as it doesn't look like much is needed to run the logic. Being so small is good: if you can spare 500 MB and keep all assets in RAM, you can have no loading time without requiring fast I/O storage, PRT or other things :)

The developer mentioned speed and memory setup, not TF. TFLOPS is a maximum number of operations per second, but this is a theoretical value and is not the only important metric on a GPU. In a game you are not just doing floating-point operations one after another for a second to see how much you can do; you are going to do lots of things and of course render to screen. All of that takes time and requires access to memory too. If you cannot accommodate a certain buffer, and/or access it as fast as you want, you do it another way or reduce it, just like any other game. If they use less complex shaders or later find a way to improve how fast they render to the buffers, then they can improve it.
And this game isn't some insane super game, it's one of the most basic games that will release this generation. There's no way that the PS5 can render the game at 4x the resolution of the Series X because of higher clock speeds and the RAM differences. This game does not need >10GB of RAM to render 8K lol.
 

Loxus

Member
And this game isn't some insane super game, it's one of the most basic games that will release this generation. There's no way that the PS5 can render the game at 4x the resolution of the Series X because of higher clock speeds and the RAM differences. This game does not need >10GB of RAM to render 8K lol.
What about Minecraft?
 

onQ123

Member
And this game isn't some insane super game, it's one of the most basic games that will release this generation. There's no way that the PS5 can render the game at 4x the resolution of the Series X because of higher clock speeds and the RAM differences. This game does not need >10GB of RAM to render 8K lol.
It's not 4x the resolution, it's just under 2x the resolution. Series X probably came close to 8K but had framerate drops, so they went with 6K to supersample from.
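The pixel math backs that up (a one-liner sketch):

```python
# 8K vs 6K pixel counts (sketch).
px_8k = 7680 * 4320    # 33,177,600
px_6k = 5760 * 3240    # 18,662,400
print(px_8k / px_6k)   # ~1.78x -> "just under 2x", and ~14.5M more pixels
```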
 