
DF: Cyberpunk 2077 Next-Gen Patch: The Digital Foundry Verdict

Indeed, there is also the 10% of one core dedicated to I/O processing on the XSX side to consider, which leaves the 'difference' at something like ~2% (negligible, to say the least), and that's if we completely ignore the interconnect/cache latency differences between the CPUs, which I don't think is a wise thing to do if we really want to understand what's going on under the hood...
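As a rough back-of-the-envelope for that '~2%' figure (a hypothetical sketch only: it assumes the public 3.6 GHz / 3.5 GHz clock figures and the 10%-of-one-core reservation, and ignores SMT modes, OS reservations and cache latency entirely):

```python
# Hypothetical back-of-the-envelope, not a real benchmark: naive
# "cores x clock" throughput, with ~10% of one XSX core reserved for I/O.
CORES = 8
XSX_CLOCK_GHZ = 3.6   # Series X CPU clock with SMT enabled
PS5_CLOCK_GHZ = 3.5   # PS5 peak CPU clock

xsx_throughput = (CORES - 0.1) * XSX_CLOCK_GHZ   # 7.9 effective cores
ps5_throughput = CORES * PS5_CLOCK_GHZ           # assumes I/O fully offloaded

clock_only = XSX_CLOCK_GHZ / PS5_CLOCK_GHZ - 1   # advantage on clocks alone
after_io = xsx_throughput / ps5_throughput - 1   # advantage after reservation
print(f"clock-only: {clock_only:.1%}, after I/O reservation: {after_io:.1%}")
```

On these naive assumptions the gap shrinks from roughly 2.9% to under 2%, which is about the magnitude the post is gesturing at.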
But in this game the CPU is also used for I/O on PS5.
 
Both Sony and Microsoft have used hypervisors (or similar) since the Xbox 360 and PS3 eras; you don't get games running on the metal anymore.

AFAIK only some I/O and memory security features use a hypervisor on Sony's consoles. If the whole system was running under a hypervisor then they could implement more elaborate backward compatibility systems like Microsoft does. Instead, Sony had to go with an ISA-compatible CPU and an ISA-compatible GPU with the same number of compute units as the PS4 Pro.
The PS5's BC can only take advantage of its higher clocks and memory bandwidth. It can't make use of the higher IPC from RDNA2 and Zen2 in BC.


And from what we learned with the Xbox One... a bloated OS and hypervisor make the machine perform worse on the CPU side even with a higher clock.
I don't know about bloated OS. I'd say Microsoft has learned how to avoid bloating on their OSes by now.
Running everything under a hypervisor probably introduces larger differences than whatever bloat both OSes are carrying at the moment.


How can we be sure that XSX's split 560/336 GB/s bandwidth pools (with a probable latency penalty) are actually superior to PS5's unified 448 GB/s pool, especially in light of The Touryst developers' comments about how they were able to reach 8K on PS5, citing the memory setup as one of the reasons (besides clocks)?
I thought memory contention issues were a thing of the past on the Series consoles...
 

Lysandros

Member
But in this game the CPU is also used for I/O on PS5.
Can we be sure about this, and to what degree? The same as XSX? Are you saying the PS5 I/O hardware stands 'completely' idle in this game? I also don't think that this game is using it to a decent/competent degree, but I personally wouldn't go as far as to say that it puts the same degree of strain for I/O operations on the PS5 CPU as on the XSX one.
 

Lysandros

Member
AFAIK only some I/O and memory security features use a hypervisor on Sony's consoles. If the whole system was running under a hypervisor then they could implement more elaborate backward compatibility systems like Microsoft does. Instead, Sony had to go with an ISA-compatible CPU and an ISA-compatible GPU with the same number of compute units as the PS4 Pro.
The PS5's BC can only take advantage of its higher clocks and memory bandwidth. It can't make use of the higher IPC from RDNA2 and Zen2 in BC.



I thought memory contention issues were a thing of the past on the Series consoles...
Based on? Both APUs should still be prone to some contention from CPU usage, to varying degrees, being unified-RAM APU designs. Are you saying that Microsoft completely solved the issue and Sony did not? And what is your take on the explanation given by The Touryst's developers? What kind of advantage could PS5's RAM setup have given them to reach 8K compared to XSX, if not the higher bandwidth required to reach that resolution? Why even bother to mention it if that is not the case? Doesn't this also align with earlier comments made by some developers, like Ali Salehi from Crytek? Why should we ignore these?
 
Can we be sure about this, and to what degree? The same as XSX? Are you saying the PS5 I/O hardware stands 'completely' idle in this game? I also don't think that this game is using it to a decent/competent degree, but I personally wouldn't go as far as to say that it puts the same degree of strain for I/O operations on the PS5 CPU as on the XSX one.
Yes, loading times are about the same as on XSX. Similar to the game on PC. Case closed.
 

SomeGit

Member
AFAIK only some I/O and memory security features use a hypervisor on Sony's consoles. If the whole system was running under a hypervisor then they could implement more elaborate backward compatibility systems like Microsoft does. Instead, Sony had to go with an ISA-compatible CPU and an ISA-compatible GPU with the same number of compute units as the PS4 Pro.
The PS5's BC can only take advantage of its higher clocks and memory bandwidth. It can't make use of the higher IPC from RDNA2 and Zen2 in BC.

If that was the case then the PS4 would have to have the same CUs as the Pro and PS5; besides, the Series consoles also had to be ISA-compatible with the One, so I'm not really sure what your point is. If the Series consoles were ARM-based, or used an Nvidia GPU, they wouldn't have the same level of compatibility.

There is a more elaborate solution on the Series consoles, but if it caused significant I/O overhead we would have seen a bigger performance delta on the Xbox One X and PS4, with their weak Jaguar cores and more software-driven I/O. I don't remember anything like that happening in CPU- or I/O-bound games, though.

Comparing Hyper-V on Azure to Hyper-V on Xbox isn't comparing the same solution.
 
Based on? Both APUs should still be prone to some contention from CPU usage, to varying degrees, being unified-RAM APU designs. Are you saying that Microsoft completely solved the issue and Sony did not? And what is your take on the explanation given by The Touryst's developers? What kind of advantage could PS5's RAM setup have given them to reach 8K compared to XSX, if not the higher bandwidth required to reach that resolution? Why even bother to mention it if that is not the case? Doesn't this also align with earlier comments made by some developers, like Ali Salehi from Crytek? Why should we ignore these?

I meant the memory contention issues that relate to the Series consoles' asymmetric memory channels, resulting in 'slow' and 'fast' memory pools. The PS5 doesn't need to solve that, because it has the same amount of memory on each channel.

Word on the street/twittersphere was that the first Series SDKs weren't great at specifying memory allocation between the two pools, and a bunch of data that was supposed to be in the fast pool ended up in the slow pool.
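As a toy illustration (hypothetical: it assumes accesses spread uniformly across the whole 16 GB, which real workloads never do), an allocation-weighted average shows why it matters which pool data lands in:

```python
# Toy model of the Series X's asymmetric pools vs PS5's unified pool.
# Purely illustrative: real effective bandwidth depends on access patterns
# and CPU/GPU contention, not on a static weighted average.
def weighted_bandwidth(pools):
    """pools: list of (size_gb, bandwidth_gbs) tuples."""
    total = sum(size for size, _ in pools)
    return sum(size / total * bw for size, bw in pools)

xsx = weighted_bandwidth([(10, 560), (6, 336)])  # "fast" + "slow" pools
ps5 = weighted_bandwidth([(16, 448)])            # single unified pool
print(xsx, ps5)  # 476.0 448.0
```

The uniform-access average (476 GB/s) only beats the unified pool if the GPU-heavy data actually sits in the fast 10 GB; misplace it into the slow pool, as the early-SDK anecdote suggests, and the effective bandwidth for GPU work drops below 448 GB/s.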



If that was the case then the PS4 would have to have the same CUs as the Pro and PS5
Using only 18 CUs with higher clocks is exactly what happens when either the Pro or the PS5 run a PS4 binary.
 

SomeGit

Member
Using only 18 CUs with higher clocks is exactly what happens when either the Pro or the PS5 run a PS4 binary.

Sure, but your post implied that Sony couldn't go beyond the number of CUs on the PS4 Pro. If that was the case, then they would have been locked to the number of CUs in the OG PS4.
 
Last edited:
Sure, but your post implied that Sony couldn't go beyond the number of CUs on the PS4 Pro. If that was the case, then they would have been locked to the number of CUs in the OG PS4.
It's what I implied and that's exactly the case.

The PS4 Pro can't make use of the extra 18 CUs when running PS4 binaries, just like the PS5 wouldn't be able to make use of more than 18 CUs when running PS4 binaries or more than 36 CUs when running PS4 Pro binaries. And it's indeed because games run close to metal and not on a virtual machine like the Xbox One and later.
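A quick sketch of what that CU masking means for raw compute (hypothetical numbers: public CU counts and clocks, the standard 64-ALU, 2-ops-per-cycle peak-FLOPS formula, and the assumption that BC mode actually sustains the full boosted clock):

```python
# Peak-FP32 formula for GCN/RDNA-class GPUs:
# CUs * 64 ALUs * 2 ops (FMA) per cycle * clock.
def gpu_tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

ps4_native   = gpu_tflops(18, 0.80)   # PS4: 18 CUs @ 800 MHz  -> ~1.84 TF
ps5_ps4_mode = gpu_tflops(18, 2.23)   # PS5 masked to 18 CUs   -> ~5.14 TF
ps5_full     = gpu_tflops(36, 2.23)   # PS5 native: 36 CUs     -> ~10.28 TF
print(ps4_native, ps5_ps4_mode, ps5_full)
```

In other words, the masked BC mode still gains a lot from the clock alone, but the other half of the GPU sits unused; that's the 'clocks and bandwidth only' point being made here.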
 

SomeGit

Member
It's what I implied and that's exactly the case.

The PS4 Pro can't make use of the extra 18 CUs when running PS4 binaries, just like the PS5 wouldn't be able to make use of more than 18 CUs when running PS4 binaries or more than 36 CUs when running PS4 Pro binaries. And it's indeed because games run close to metal and not on a virtual machine like the Xbox One and later.

But you were implying that they HAD to be designed that way. As in, the PS5 couldn't ship with more than 36 CUs because it had to match the CUs of the PS4 Pro.
It could, it just couldn't use them all in PS4 Pro mode.

And it's not simply because they run in virtual machines. If that was the case then the One X running One S games would be able to use the entire GPU, which it doesn't, and the Series consoles running One X games would be able to use the entire GPU, which again they don't; they have to use the GCN/Polaris compatibility mode of the RDNA GPUs. It's not that linear.

An exclusive console OS isn't the same thing as Windows 10 or Linux running virtually on a Hyper-V Azure server; the overhead isn't the same.
 
But you were implying that they HAD to be designed that way. As in, the PS5 couldn't ship with more than 36 CUs because it had to match the CUs of the PS4 Pro.
The PS5 could ship with more CUs but it wouldn't use more than 36 when running PS4 Pro binaries.


And it's not simply because they run in virtual machines. If that was the case then the One X running One S games would be able to use the entire GPU, which it doesn't, and the Series consoles running One X games would be able to use the entire GPU, which again they don't; they have to use the GCN/Polaris compatibility mode of the RDNA GPUs. It's not that linear.
The One X is capable of using all its CUs when running One S binaries, and it's the same with the Series consoles running One S games.
I'm talking ALU-per-ALU of course, i.e. the full GPU width. I'm not talking about specific architectural features like running raytracing or VRS in a One S game that obviously wouldn't support them.
 

SomeGit

Member
The One X is capable of using all its CUs when running One S binaries, and it's the same with the Series consoles running One S games.
I'm talking ALU-per-ALU of course, i.e. the full GPU width. I'm not talking about specific architectural features like running raytracing or VRS in a One S game that obviously wouldn't support them.

It goes beyond that: it isn't able to run at the full IPC of RDNA 2, it falls back to a Polaris compatibility mode.
Same with the One X running One S games: it falls back to a compatibility mode at around 3 TFLOPS.
 

ChiefDada

Member
Besides, there is also the cache bandwidth side of things to consider, which should contribute to the whole system's bandwidth/performance

I agree. I could be wrong, but I have a feeling that the cache coherency/GPU cache scrubber interconnection is more important than people realize, and probably even superior to AMD's Infinity Cache approach, which supposedly gives up to 1.7x effective bandwidth.

This is the main reason why I tell people here not to be so pessimistic about the consoles' capabilities (although I'm not too familiar with Microsoft's solution, if they have one). Game engines have to be adjusted to take full advantage of this.
 

Cherrypepsi

Member
VRS gives the least performance increase, and VRR has nothing to do with RDNA2, as it was on last-gen Xbox consoles.
SFS and the hardware-assisted version of Mesh Shaders will give huge performance increases, although the PS5 does feature primitive shaders, which should give similar results to Mesh Shaders.
It's a game that has absolute parity between PS5 and XSX, yet people bring up the most delusional stuff again and again and you talk about how one version is a clear winner over the other; it's just mental.

I don't know what your mission is, but some people here see your posts as some sort of entertainment, do you realise that? Because I don't know if I should feel sorry for you.
I even agree with a lot of the things you say, and your posts are mostly written in a way that doesn't attack or hurt anyone, which is good.
But it's clearly 100% fanboyism, and all these GPU marketing terms that you throw around are just absolutely irrelevant.
 

Riky

My little VRR pleasure pearl goes vrrrooommm.
It's a game that has absolute parity between PS5 and XSX, yet people bring up the most delusional stuff again and again and you talk about how one version is a clear winner over the other; it's just mental.

I don't know what your mission is, but some people here see your posts as some sort of entertainment, do you realise that? Because I don't know if I should feel sorry for you.
I even agree with a lot of the things you say, and your posts are mostly written in a way that doesn't attack or hurt anyone, which is good.
But it's clearly 100% fanboyism, and all these GPU marketing terms that you throw around are just absolutely irrelevant.

You're hurt, I get it, and you feel the need to strike out; you'll be ok. If you think VRR and VRS are "marketing terms" and not real then it's best to stay out of tech threads, as you're just embarrassing yourself with your lack of knowledge.

I haven't said one version is a clear winner; in fact I said that once PS5 gets VRR they would be practically identical. Get your facts right.

Somebody asked what RDNA2 features haven't been used and I replied; no need for your personal attack.
 

Cherrypepsi

Member
You're hurt, I get it, and you feel the need to strike out; you'll be ok. If you think VRR and VRS are "marketing terms" and not real then it's best to stay out of tech threads, as you're just embarrassing yourself with your lack of knowledge.

XSX is literally the only console that I own, I'm a Game Pass subscriber, and I'm coming to you as a fellow Xbox player. Trust me, one of us is embarrassing himself and it's not me <3
 

Riky

My little VRR pleasure pearl goes vrrrooommm.
XSX is literally the only console that I own, I'm a Game Pass subscriber, and I'm coming to you as a fellow Xbox player. Trust me, one of us is embarrassing himself and it's not me <3
I'd go and do your research; these are features that work across PC graphics cards too, nothing to do with just one console, even if you're an "Xbox player".
 

Snake29

Member
You're hurt, I get it, and you feel the need to strike out; you'll be ok. If you think VRR and VRS are "marketing terms" and not real then it's best to stay out of tech threads, as you're just embarrassing yourself with your lack of knowledge.

I haven't said one version is a clear winner; in fact I said that once PS5 gets VRR they would be practically identical. Get your facts right.

Somebody asked what RDNA2 features haven't been used and I replied; no need for your personal attack.

The thing is that you love to hype yourself up with all these buzzwords MS was throwing around before launch, only to make excuses and to make it look like the PS5 is not powerful enough. Well, it looks like the XSX constantly needs patches to hide these imperfections, stutters, frame pacing issues and performance drops, while the resolution difference is clearly not that big.
 

Riky

My little VRR pleasure pearl goes vrrrooommm.
The thing is that you love to hype yourself up with all these buzzwords MS was throwing around before launch, only to make excuses and to make it look like the PS5 is not powerful enough. Well, it looks like the XSX constantly needs patches to hide these imperfections, stutters, frame pacing issues and performance drops, while the resolution difference is clearly not that big.

I'm not hyped, I just look at what the companies say, no more than people who talk about the PS5 SSD or the Geometry Engine. The PS5 being "not powerful enough" is simply in your head.
If you think only one console suffers from "imperfections, stutters, frame pacing issues and performance drops" then I suggest you go look at the VGTech stats sheets; if you struggle I'll DM you some. As for patches, all consoles have them.
 

Lysandros

Member
I agree. I could be wrong, but I have a feeling that the cache coherency/GPU cache scrubber interconnection is more important than people realize, and probably even superior to AMD's Infinity Cache approach, which supposedly gives up to 1.7x effective bandwidth.

This is the main reason why I tell people here not to be so pessimistic about the consoles' capabilities (although I'm not too familiar with Microsoft's solution, if they have one). Game engines have to be adjusted to take full advantage of this.
Indeed. I feel that presenting the XSX as the 'bandwidth champion' based solely on the faster part of its RAM, in isolation, without a deeper look at the whole system level (following the data path from storage, where SSD bandwidth is involved, all the way to the processors/CUs/fixed-function units, and taking the cache hierarchy/bandwidth/architecture into account instead of pretending the data magically bypasses them) is exactly the same take as looking at the XSX's GPU compute ceiling compared to the PS5's and declaring it the most powerful machine based uniquely on that, ignoring all the other metrics/parameters. I don't think this kind of selective mentality helps us understand the current performance profile of these machines across games.
 

RoadHazard

Member
That's not what I'm talking about. The console warring between power this and power that is immature and I don't pay attention to that. I'm talking about what the hardware *should* do in relation to graphics features. I always hear the Sony crowd talking about how a game underperforms because it's not written specifically for a next-gen console, assuming that a developer would get enormous gains had that been the case. I don't see Xbox gamers declaring "unoptimized, this game would run like CG if the graphics engine was rewritten for the next-gen console." Xbox gamers wouldn't say that because all of their exclusives are developed on PC first. If the PC can't do it, then surely the XSX won't.

No, Xbox gamers say "tools" instead.

But it's an easily understandable fact that most games so far are not "optimized" for the new consoles (both of them), because the engines are made for last gen. The new consoles can do things you have to specifically develop for, you won't reach their full potential by just tweaking some settings in a last-gen engine.

And if you don't believe in consoles being capable of punching above their weight, try running GoW on PS4-equivalent PC hardware.
 

VFXVeteran

Banned
No, Xbox gamers say "tools" instead.

But it's an easily understandable fact that most games so far are not "optimized" for the new consoles (both of them), because the engines are made for last gen. The new consoles can do things you have to specifically develop for, you won't reach their full potential by just tweaking some settings in a last-gen engine.
How can you speak so confidently without being a developer yourself? Yet you are trying to tell a developer how graphics engines work. I don't understand this. Engines are neutral. They don't cater to any hardware platform. What you are talking about is the compile-time game code that's written for the platform. The way to calculate a shadow map, for example, will always be the same regardless of platform. Computing a BRDF material for physically based accuracy is another example of something completely independent of platform. Physics calculations are also independent. Collision detection is also independent. I could go on and on. Memory management would be something that's dependent on the platform.

And if you don't believe in consoles being capable of punching above their weight, try running GoW on PS4-equivalent PC hardware.
That's a horrible example. PC hardware is totally pragmatic and exactly what they use for initial development passes of a new concept. You want a hardware platform that's totally independent of custom hardware to get the original algorithm written.

We've all seen what "optimization" means for these consoles - lowering resolution and using reconstruction techniques. This won't change going forward.
 

RoadHazard

Member
How can you speak so confidently without being a developer yourself? Yet you are trying to tell a developer how graphics engines work. I don't understand this. Engines are neutral. They don't cater to any hardware platform. What you are talking about is the compile-time game code that's written for the platform. The way to calculate a shadow map, for example, will always be the same regardless of platform. Computing a BRDF material for physically based accuracy is another example of something completely independent of platform. Physics calculations are also independent. Collision detection is also independent. I could go on and on. Memory management would be something that's dependent on the platform.


That's a horrible example. PC hardware is totally pragmatic and exactly what they use for initial development passes of a new concept. You want a hardware platform that's totally independent of custom hardware to get the original algorithm written.

We've all seen what "optimization" means for these consoles - lowering resolution and using reconstruction techniques. This won't change going forward.

A few examples are the geometry engine and the fast I/O. And there's more. The new consoles are NOT just faster versions of the previous ones, they have new hardware features that you need to explicitly support. Cross-gen games generally won't do that very well.

Is it a horrible example because it proves you wrong? You're the one who doesn't seem to understand what console optimization means. You seem to think it's just making a PC game and ticking the "build for console X" option, but I can promise you that's not how Naughty Dog got TLoU2 looking better than the vast majority of PC games on the ancient PS4 hardware. Their engine is absolutely not "neutral". Targeting a set hardware configuration lets you do much more than making something that needs to run on EVERYTHING. If you don't understand that, I don't know how else to explain it to you.

But we all know you absolutely live for coming into these threads and downplaying the consoles (while pretending to be a game development expert even though you're from a completely different field). It's pretty much the only thing I ever see you do here on GAF. So have fun with that I guess!
 

ChiefDada

Member
I always hear the Sony crowd talking about how a game underperforms because it's not written specifically for a next-gen console, assuming that a developer would get enormous gains had that been the case.

Engines are neutral. They don't cater to any hardware platform.


Nick Penwarden, VP of Engineering at Epic Games:
"The ability to stream in content at extreme speeds enables developers to create denser and more detailed environments, changing how we think about streaming content. It’s so impactful that we’ve rewritten our core I/O subsystems for Unreal Engine with the PlayStation 5 in mind,"
 

VFXVeteran

Banned
A few examples are the geometry engine and the fast I/O.
I hear this come up a lot from people who haven't implemented any rendering. Fast I/O isn't going to make a difference at all in this game. I've said this a dozen times: the PS5's SSD is too fast for its GPU. If you want to understand why, take a look at all the UE5 demo threads. The geometry engine introduces more geometry into the pipeline, and that will impact performance, not optimize it.

And there's more. The new consoles are NOT just faster versions of the previous ones, they have new hardware features that you need to explicitly support.
These features were introduced on Nvidia boards years ago. These engines have already implemented the features you speak of.

And you're the one who doesn't seem to understand what console optimization means.
I know full well what that entails because I work on realtime engines. You just don't want to accept the truth of what I have to say.

You seem to think it's just making a PC game and ticking the "build for console X" option, but I can promise you that's not how Naughty Dog got TLoU2 looking better than the vast majority of PC games on the ancient PS4 hardware.
"Looking better" is subjective. I have worked with a friend who was the tech artist for all of the characters in TLOU2. Trust me, I know what they did. Again, I've told this to people and they ignore it. I'm the one who told people about the PC ports of exclusives to begin with.

But we all know you absolutely live for coming into these threads and downplaying the consoles (while pretending to be a game development expert even though you're from a completely different field). It's pretty much the only thing I ever see you do here on GAF. So have fun with that I guess!
So that's the root of it right there. You ignore my professional employment because it doesn't adhere to your glorified stance on exclusive games, and you somehow "hope" that there is more to the PS5 than what's happening now. Good luck.
 

VFXVeteran

Banned
It’s so impactful that we’ve rewritten our core I/O subsystems for Unreal Engine with the PlayStation 5 in mind,
Sorry, but that has nothing to do with rewriting AN ENTIRE GRAPHICS ENGINE. How many times do I have to keep saying this? Get over it already. The PS5's GPU is the limit here and always will be. What do you expect? For games at the end of the cycle to have full-on RT features at native 4K/60FPS? How is the I/O going to help the GPU render pixels?
 
Does VRR pretty much come as standard with new TVs these days?

Yeah with a lot of the 2021 models it does…so it’s becoming more of a factor.

And yeah, higher resolution and stable frame rate on the XSX with no screen tearing. There’s no rational way this is a win for the PS5.
 

RoadHazard

Member
I hear this come up a lot from people who haven't implemented any rendering. Fast I/O isn't going to make a difference at all in this game. I've said this a dozen times: the PS5's SSD is too fast for its GPU. If you want to understand why, take a look at all the UE5 demo threads. The geometry engine introduces more geometry into the pipeline, and that will impact performance, not optimize it.


These features were introduced on Nvidia boards years ago. These engines have already implemented the features you speak of.


I know full well what that entails because I work on realtime engines. You just don't want to accept the truth of what I have to say.


"Looking better" is subjective. I have worked with a friend who was the tech artist for all of the characters in TLOU2. Trust me, I know what they did. Again, I've told this to people and they ignore it. I'm the one who told people about the PC ports of exclusives to begin with.


So that's the root of it right there. You ignore my professional employment because it doesn't adhere to your glorified stance on exclusive games, and you somehow "hope" that there is more to the PS5 than what's happening now. Good luck.

You're not a game developer. You're not a game engine developer. You're not a game graphics developer. You work in a tangentially related field, and therefore think you're an expert. Meanwhile, you keep covering your ears and shouting whenever someone points out obvious truths that you refuse to accept.

And yes, I do think we'll see much more impressive things on the PS5 than we have so far. It has happened every single console generation so far (including on PS4, which was even more "off the shelf" with fewer customizations than the PS5), so why would you expect it not to this time? Or do you deny that this is even a thing that happens, that games tend to look better toward the end of a console generation than at the start? If so, I guess you're simply delusional.
 

RoadHazard

Member
Yeah with a lot of the 2021 models it does…so it’s becoming more of a factor.

And yeah, higher resolution and stable frame rate on the XSX with no screen tearing. There’s no rational way this is a win for the PS5.

If you prefer higher frame rates in heavy situations it clearly is. VRR doesn't improve frame rates, it just makes the effects of dropped frames less noticeable (which is great, don't get me wrong).

If PS5 supported VRR it would be a clear win for it. 10fps is a lot more noticeable than 100 lines of resolution. As it stands it depends on whether or not you have a VRR-capable TV. If you do, the experience will probably feel smoother overall on XSX. If you don't (and most don't yet), PS5 is the better version.
 

metaverse

Gold Member
That's not what I'm talking about. The console warring between power this and power that is immature and I don't pay attention to that. I'm talking about what the hardware *should* do in relation to graphics features. I always hear the Sony crowd talking about how a game underperforms because it's not written specifically for a next-gen console, assuming that a developer would get enormous gains had that been the case. I don't see Xbox gamers declaring "unoptimized, this game would run like CG if the graphics engine was rewritten for the next-gen console." Xbox gamers wouldn't say that because all of their exclusives are developed on PC first. If the PC can't do it, then surely the XSX won't.

There haven't been many Xbox Series game releases, or game releases from Microsoft in general. However, have you not seen the complaints about the Xbox One/X holding back Halo Infinite?
 

ChiefDada

Member
You just can't beat seeing VFXVeteran rustling jimmies lol

Well sure, but it's frustrating when someone who claims to be well versed in the tech we're discussing constantly spreads falsehoods. I've been fascinated with the current-gen tech ever since the original UE5 demo, and I came to this forum primarily to learn from others with a relevant tech background. VFXVeteran is the only active member that fits the bill as far as I am aware, but unfortunately he has chosen the dark side and uses his powers for evil. I remember asking him a general question about the long-term future of console performance, and he responded by insinuating I was some kind of console warrior. It's been downhill ever since. I soon realized his takes are often bizarre and easily disproven. He wants to turn every conversation into a console vs PC/offline rendering comparison. To this day, I haven't figured out what he could possibly be gaining from these shenanigans.
 

ChiefDada

Member
The PS5's SSD is too fast for its GPU

How is the I/O going to help the GPU render pixels?


Eurogamer/Digital Foundry interview with developer Shin'en on how the PS5 was able to render 8K for The Touryst:

Typically, in the PC space, to get a faster GPU, manufacturers produce 'wider' designs that run at the same clocks as less capable parts - or even slower. Xbox Series X follows the same pattern. Its GPU runs at a slower clock, but should be more capable overall as it has many more compute units. Shin'en tells us that in the case of its engine, the increase to clock frequencies and the difference in memory set-up makes the difference. Beyond this, rather than just porting the PS4 version to PS5, Shin'en rewrote the engine to take advantage of PS5's low-level graphics APIs.
 

Riky

My little VRR pleasure pearl goes vrrrooommm.
If you prefer higher frame rates in heavy situations it clearly is. VRR doesn't improve frame rates, it just makes the effects of dropped frames less noticeable (which is great, don't get me wrong).

If PS5 supported VRR it would be a clear win for it. 10fps is a lot more noticeable than 100 lines of resolution. As it stands it depends on whether or not you have a VRR-capable TV. If you do, the experience will probably feel smoother overall on XSX. If you don't (and most don't yet), PS5 is the better version.

It's very hard to notice a difference of a few frames for a brief period with VRR, in fact almost impossible; it's not like the game constantly runs at a lower framerate, as 97% of the time it's a solid 60fps, so you are talking about 3% of the action.
What you notice during framerate drops is your display's reaction to them, with screen tearing or stutter or both; take that away and you just won't know.
So in the end the higher resolution will be more noticeable, as nothing can bridge that gap; see Hitman 3.
 

SlimySnake

Member
It stands to reason to push the hardware for a comparison; otherwise we may as well run Pac-Man on both and conclude the systems are 100% identical because the game is locked at 60.
But comparison of what? Two hardwares? Or how the game actually runs? What's more important to the viewer who is making a decision on which version to get?

I have always found these worst-case-scenario comparisons completely misleading. You need to do average framerate tests, not find the worst 1% scenario and base the entire opinion on something that will only happen 1% of the time. Even PC benchmarks use average framerates; they include the 1% minimum, but only as a reference.

I could understand the first few initial tests they did back in 2020 to determine just how powerful these machines are, but in 2022? What purpose does it serve other than console wars? How does the game perform in average playthroughs? What's the average framerate during combat and traversal? That's all that should matter. I brought up the Control comparison earlier: the game runs smoother on the PS5, and yet the photo-mode tests showed a 16% XSX advantage. Who cares? People playing the actual game don't see that 16% advantage.

Back when GOW came out on the Pro, John from DF found the worst case scenario and claimed the 60 fps mode is unplayable and drops too often. NX Gamer simply ran a benchmark of the first 20 minutes complete with the boss fights. His average FPS was over 56. And yet John told everyone to use the 30 fps option. Whose test is more valuable to the final consumer?
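The disagreement above really comes down to two different numbers computed from the same frame-time log: the headline average versus the "1% low". A minimal sketch of both metrics, using made-up frame times rather than data from any real capture:

```python
# Sketch of the two framerate metrics debated above, using invented
# frame-time data (milliseconds per frame). All numbers are illustrative.

def average_fps(frame_times_ms):
    """Average framerate over the whole run: total frames / total time."""
    total_seconds = sum(frame_times_ms) / 1000.0
    return len(frame_times_ms) / total_seconds

def one_percent_low_fps(frame_times_ms):
    """'1% low': the framerate implied by the worst 1% of frame times."""
    worst = sorted(frame_times_ms, reverse=True)
    count = max(1, len(worst) // 100)          # worst 1% of all frames
    worst_slice = worst[:count]
    avg_worst_ms = sum(worst_slice) / len(worst_slice)
    return 1000.0 / avg_worst_ms

# 97% of frames at a locked 60fps (~16.7ms), 3% dipping to 50fps (20ms)
frames = [16.7] * 970 + [20.0] * 30
print(round(average_fps(frames), 1))       # headline number
print(round(one_percent_low_fps(frames), 1))  # worst-case number
```

With that split, the average lands around 59.5fps while the 1% low reports 50fps: the same run reads as "basically locked" or as "dropping to 50" depending entirely on which metric the analysis quotes.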
 

VFXVeteran

Banned
You're not a game developer. You're not a game engine developer. You're not a game graphics developer. You work in a tangentially related field and therefore think you're an expert. Meanwhile you keep covering your ears and shouting whenever someone points out obvious truths that you refuse to accept.

You must not have gotten the memo. VFX has been working in the realtime sector for five years now. Not games, but realtime graphics. What else?
 

VFXVeteran

Banned
Eurogamer/Digital Foundry interview with developer Shin'en on how the PS5 was able to render 8K for The Touryst:
I give up with you. I'll bookmark this comment, wait three years, and then come back to see if we have somehow squeezed full-on RT games with Nanite geometry and 8K textures at native 4K/60fps. Until then, enjoy your dreams of low-level, code-to-the-metal APIs on a 2080-like GPU.
 

Shmunter

Gold Member
But comparison of what? Two hardwares? Or how the game actually runs? What's more important to the viewer who is making a decision on which version to get?

The way I see it, there is no one-size-fits-all. What's acceptable to one may not be to another, and what they get out of it differs too.

Averages can be just as misleading: the slice of analysis can be too narrow and fail to represent having a horrible or frustrating time in certain parts, as a general example.

I personally don't game on both, so it's not a consumer-choice video to me, it's a pure tech benchmark; others don't even care because VRR will carry it, etc.

We need to see it all, and take from it what matters to each.
 

Kenpachii

Member
Is it just me, or is this game running worse now on max settings with a 3080?

The wonky NPCs seem to be fixed, though, which is a huge upgrade.
 

ChiefDada

Member
I'll bookmark this comment and wait for 3yrs and then come back to you to see if we have somehow squeeze full on RT games with Nanite geometry and 8k textures @ native 4k/60FPS.

Once again, no one has ever claimed this.

 

RoadHazard

Member
I give up with you. I'll bookmark this comment, wait three years, and then come back to see if we have somehow squeezed full-on RT games with Nanite geometry and 8K textures at native 4K/60fps. Until then, enjoy your dreams of low-level, code-to-the-metal APIs on a 2080-like GPU.

We sure enjoyed GoW, Spider-Man, TLoU2, etc, on the GTX 750/Radeon 7850-like GPU of the PS4.

And no one has ever claimed the silly things you're talking about, which not even the best gaming PCs (where the GPU alone costs several times more than a PS5) can do.
 