
DF: Playstation 5 Back Compat vs Xbox Series X, Ps4/Pro (Skyrim)

John Wick

Member
Stop. There is no evidence that the X doesn't use the full resources for BC; in fact, that was one of its strengths: since the BC was not HW-based, it could accelerate and boost old titles. Only PS5 has the concept of "PS4" and "PS4 Pro" HW modes for BC; it was in the Cerny slides. Don't invent excuses unless you have something to back it up.
I was speculating.
 
The engine is irrelevant. It's legacy code being pushed through BC with brute force. It's not like the Skyrim devs were coding for the XSX or PS5.
It is relevant in that hardware can't give an engine things the engine isn't asking for and expect it to use them. If it worked that way, enhanced remakes/ports would make themselves.

This is useless if the intent is comparing both consoles.

The interesting part would be figuring out why it is happening, but it's probably not a hard hardware limit for either. Just a case of an unoptimized game asking for a lot of things in some absurd manner, so it hits a ceiling. That ceiling might be artificial, or a real bottleneck; but a real bottleneck that might not appear if you code your game properly. (Still interesting.)

But most likely, the game really likes more MHz and less parallelization. And the way it does things is inefficient; you can't change that with a software layer.
 
Last edited:

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
It is relevant in that hardware can't give an engine things the engine isn't asking for and expect it to use them. If it worked that way, enhanced remakes/ports would make themselves.
True, but MS uses APIs for BC like on PC, which has DirectX. A new RTX 3090 PC can run a game at higher res and higher frame rate even though it was written in 2016 and had no knowledge of said GPU. You can't push RT or new features in without a patch, but there is no reason you cannot harness the raw horsepower to brute-force performance.
 
I'm surprised how badly it ran on the Pro. I guess native 4K was too much for it even in Skyrim, but Bethesda was like "whatever, who cares". Feel bad for anybody that bought it back then on their shiny new Pro only to find fps in the mid 20s in freaking Skyrim. Jesus.
The VR version was pretty buggy too; my neighbor got sick because "he got stuck in a wall and everything started spinning really fast". Apparently he threw off the headset and had to lie down for almost an hour before he could move.

Bugs can be pretty bad in VR.
 

Shmunter

Member
As far as I know, only PlayStation ever created a butterfly GPU where half could be disabled for PS4 compatibility. Outside of that, there is no such thing as only using X amount of CUs. The hardware is what it is and does its best to render the given instruction sets.

Seemingly we do not yet know what PS5 Boost Mode does. Does it run all 36 CUs at full clock for base PS4 games, or just 18 at a higher clock? Pro games would logically run all 36.
 
True, but MS uses APIs for BC like on PC, which has DirectX. A new RTX 3090 PC can run a game at higher res and higher frame rate even though it was written in 2016 and had no knowledge of said GPU. You can't push RT or new features in without a patch, but there is no reason you cannot harness the raw horsepower to brute-force performance.
What you describe is linearity: if all you give your hardware is more power in a linear fashion, it will be easier for it to scale up nicely, because what you're doing is taking bottlenecks out of the equation.

By "easier" I should note that it doesn't necessarily go hand in hand with any notion of efficiency. Rendering at much higher resolutions than initially intended can multiply the cost several times, making whatever the game is doing even more inefficient. It's all a matter of how things have evolved. The best example: take a single-threaded full HD software renderer from 2013 (the year last-gen consoles launched); if you tried to pull 4K with it now, on a modern CPU it would choke harder than it did all those years ago. It's four times the work, and while your PC on paper is surely 4 times more powerful, it isn't in a linear way: IPC has not improved 400%, and neither has frequency, so you're stuck. This was the reason we offloaded that work to GPUs in the first place.
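Quick back-of-the-envelope version of that point; the 2013 and modern single-thread figures below are illustrative assumptions on my part, not measurements:

```cpp
// Illustrative arithmetic only: 4x the pixels, but nowhere near 4x the
// single-thread throughput. The clock/IPC figures are assumptions.
#include <iostream>

int main() {
    const double px_1080p = 1920.0 * 1080.0;   // ~2.07 Mpixels
    const double px_4k    = 3840.0 * 2160.0;   // ~8.29 Mpixels
    std::cout << "pixel workload: " << px_4k / px_1080p << "x\n";      // 4x

    // Hypothetical single-thread throughput ~ clock * relative IPC.
    const double st_2013 = 3.5e9 * 1.0;        // assumed 2013-era core
    const double st_now  = 4.8e9 * 1.6;        // assumed modern core
    std::cout << "single-thread gain: " << st_now / st_2013 << "x\n";  // ~2.2x
}
```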

Now, if you look at a game like this, one that really wasn't updated to take advantage of the hardware side and offload a lot of things to it, there are quite a few bottlenecks an API can't fix. You can't move CPU work to the GPU, or turn things that were previously expensive into "free" ones. Imagine this game had a bokeh effect that was taxing for its time: it would cost more to do at a high resolution than a modern bokeh with low-level hardware support. A case in point is The Witcher 2's Cinematic Depth of Field, which continues to be a nightmare on top-range GPUs to this day.

The way this was designed, parallelized and optimized means the game code is not "wide" enough to take advantage of hardware wider than what existed then. For wide to work, you have to make everything independent (asynchronous). With Skyrim you can't do that, because it elected to do things with a lot of back-and-forth dependencies, as was typical of the era.

Even if the game core were properly multithreaded, the way it was written means the gains, while real and better than before, would be relative (against the gold standard of brute force solving everything), because you'd still be processing synchronous events across multiple cores, with the overhead that implies: some things would finish really fast, yeah, then have to wait for the ones that didn't before they can move on.
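A toy sketch of that dependency point, nothing resembling Skyrim's actual code; it just simulates independent jobs versus a chain of dependent ones:

```cpp
// Toy illustration, not Bethesda's code: independent ("wide") jobs scale
// with core count, while a chain of dependent jobs serialises no matter
// how many cores you have -- only clock speed helps there.
#include <chrono>
#include <future>
#include <iostream>
#include <thread>
#include <vector>

using namespace std::chrono;

void fake_work(milliseconds cost) { std::this_thread::sleep_for(cost); }

int main() {
    const int jobs = 8;
    const milliseconds cost(20);

    // Independent jobs: all eight can run at once on a wide machine.
    auto t0 = steady_clock::now();
    std::vector<std::future<void>> wide;
    for (int i = 0; i < jobs; ++i)
        wide.push_back(std::async(std::launch::async, fake_work, cost));
    for (auto& f : wide) f.wait();
    const auto wide_ms = duration_cast<milliseconds>(steady_clock::now() - t0).count();

    // Dependent chain: each step needs the previous result, so the work
    // serialises and extra cores sit idle.
    t0 = steady_clock::now();
    for (int i = 0; i < jobs; ++i) fake_work(cost);
    const auto chain_ms = duration_cast<milliseconds>(steady_clock::now() - t0).count();

    std::cout << "independent jobs: " << wide_ms << " ms\n"
              << "dependent chain:  " << chain_ms << " ms\n";
}
```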

EDIT: I looked into Xbox One vs PS4 videos for this game, as the Xbox One had less GPU but slightly more CPU frequency (1.6 GHz on PS4 vs 1.75 GHz on the Xbox One), but I wasn't able to see any Xbox One advantage that would suggest the game was CPU-limited on PS4 versus the Xbox One; quite the opposite (PS4 had the better framerate, which is pretty typical given how severely bandwidth-bottlenecked the Xbox One was). So again, any guess is a good guess.
 
Last edited:

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
The way this was designed, parallelized and optimized means the game code is not "wide" enough to take advantage of hardware wider than what existed then. For wide to work, you have to make everything independent (asynchronous). With Skyrim you can't do that, because it elected to do things with a lot of back-and-forth dependencies, as was typical of the era.

Stop, you are inventing concepts to make reality match your expectations. This is fantasy.
 
Stop, you are inventing concepts to make reality match your expectations. This is fantasy.
I didn't invent anything. I hypothesised in detail, because this is meant to be a constructive discussion of sorts and because I didn't agree with your "API and brute force trump all" mantra. Yet I didn't say something along the lines of "you're imagining things to match your expectations", which would make responding to anything a breeze.

What are my expectations here, exactly? (you don't have to answer)


You're the one saying that an unoptimized game should run fine and use the full stack of hardware by sheer brute force because of the magic of APIs. Now, I wrote that I don't know where the fault is, but it might be, and probably is, in a few weak links, because of the way a game as old as this, one that on PC can only take advantage of a Core 2 Duo, is bound to be structured. You mention DirectX; this game is DirectX 9 era, and you can look up how inefficient DirectX 9 is on modern computers. Yes, it runs better than it did back then, but you're leaving a lot on the table, and thus some games become performance hogs.

I'll surmise regardless: games of this era had problems with multithreading, and everything went through the CPU, which acted as a gatekeeper, stalling everything else. And that doesn't start scaling linearly just because you have an API to take care of it.
 
Last edited:
I didn't invent anything. I hypothesised in detail, because this is meant to be a constructive discussion of sorts.

What are my expectations here, exactly? (you don't have to answer)


You're the one saying that an unoptimized game should run fine and use the full stack of hardware by sheer brute force because of the magic of APIs. Now, I wrote that I don't know where the fault is, but it might be, and probably is, in a few weak links, because of the way a game as old as this, one that on PC can only take advantage of a Core 2 Duo, is bound to be structured.

I'll surmise regardless: games of this era had problems with multithreading, and everything went through the CPU, which acted as a gatekeeper, stalling everything else. And that doesn't start scaling linearly just because you have an API to take care of it.
Would a power draw test clear anything up? PC-wise it would. Xbox and PS should just make an app that shows GPU/CPU usage.
 
Would a power draw test clear anything up? PC-wise it would. Xbox and PS should just make an app that shows GPU/CPU usage.
CPU/GPU usage and frequency would probably clear everything up, yes, or at least give us a ton of information. It would also tell us in depth how backwards compatibility works on both consoles, and how differentiated the strategies are.

The closest we can get to studying Skyrim SE's behaviour is on a hacked Nintendo Switch via RivaTuner (GPU/CPU usage statistics), but that's too much work just to form a hypothesis about the game's state on another platform running via BC.
 
Last edited:

FranXico

Member
As far as I know, only PlayStation ever created a butterfly GPU where half could be disabled for PS4 compatibility. Outside of that, there is no such thing as only using X amount of CUs. The hardware is what it is and does its best to render the given instruction sets.

Seemingly we do not yet know what PS5 Boost Mode does. Does it run all 36 CUs at full clock for base PS4 games, or just 18 at a higher clock? Pro games would logically run all 36.
Developers never "code for X CUs"; the GPU itself is what spreads out and manages the workload. The behavior of certain workloads is what dictates whether more CUs or higher frequency performs better. But that's entirely the hardware, not the devs.
 
Last edited:
Developers never "code for X CUs"; the GPU itself is what spreads out and manages the workload. The behavior of certain workloads is what dictates whether more CUs or higher frequency performs better. But that's entirely the hardware, not the devs.

That makes the 12 TF GCN claim make a lot more sense and helps explain the performance boost in many BC games.
 
Developers never "code for X CUs"; the GPU itself is what spreads out and manages the workload. The behavior of certain workloads is what dictates whether more CUs or higher frequency performs better. But that's entirely the hardware, not the devs.
So the entire Skyrim engine could just perform better with higher clocks than with more CUs?
 

muteZX

Banned
So the entire Skyrim engine could just perform better with higher clocks than with more CUs?

Yes. Let's say the common render pipeline has 20 graphics processing steps and Skyrim can't effectively saturate more than 5-10 /whatever/ teraflops of compute power; then, for example, the "underperforming" raster ops units /one of those gfx steps/ become the brake/bottleneck. So it doesn't matter how many CUs you have there, but it does matter how fast the ROPs are running .. and thus the better PS5 perf. in Skyrim.

PS5 - 64 ROPs at 2230 MHz .. 142.72 Gpix/s of fillrate.
XSX - 64 ROPs at 1825 MHz .. 116.80 Gpix/s of fillrate.

That's a theory /not counting main memory parameters etc. etc./
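For what it's worth, the arithmetic behind those figures is just ROPs times clock; the sketch below only reproduces the quoted theoretical peaks and says nothing about real-world throughput:

```cpp
// Reproduces the quoted theoretical peaks: pixel fillrate = ROPs * clock.
#include <iostream>

int main() {
    auto gpix = [](int rops, double mhz) { return rops * mhz / 1000.0; };
    std::cout << "PS5: " << gpix(64, 2230.0) << " Gpix/s\n";  // 142.72
    std::cout << "XSX: " << gpix(64, 1825.0) << " Gpix/s\n";  // 116.80
}
```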
 

FranXico

Member
So the entire Skyrim engine could just perform better with higher clocks than with more CUs?
It could in theory, but what we already see in practice is that it runs perfectly fine in both configurations, save for that one scenario where higher frequency helps a bit (not that much, though).
 
Last edited:
It could in theory, but what we already see in practice is that it runs perfectly fine in both configurations, save for that one scenario where higher frequency helps a bit (not that much, though).
So engines, and what they are trying to do, will just work better on PS or Xbox, and it's less about power and more about which console is better suited to that particular engine/game? If it's anything like CPUs, where it took forever for loads to be distributed well across multiple cores, Xbox could be behind for a bit. Or is it not like CPUs?
 

muteZX

Banned
So engines, and what they are trying to do, will just work better on PS or Xbox, and it's less about power and more about which console is better suited to that particular engine/game? If it's anything like CPUs, where it took forever for loads to be distributed well across multiple cores, Xbox could be behind for a bit. Or is it not like CPUs?

For any game engine/HW config there are many potential roadblocks. It can be CPU limited or GPU limited /geometry setup, ROPs, main compute, async compute/, or starved for mem/IOP bandwidth, or held back by a "slow" API. Each one of them can limit/lower your framerate severely.
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
So the entire Skyrim engine could just perform better with higher clocks than with more CUs?
No one knows why they perform differently, but not knowing is better than making stuff up.

Like Richard suggests in the video (as a theory), faster is better than wide; this could be true in general, hence Sony went this route in the design. In any distributed HW design, the more you rely on spreading the work around (more CUs), the more overhead you get and the more likely it is that the distributed effort doesn't reach peak FLOPS. Hence Cerny said don't use TF as the sole metric. We know MS was designing the APU for more than just the XSX, so they had other considerations.

A similar concept shows up in PC CPUs: after a minimum number of cores (3-4), it's usually more advantageous to increase the clock rate. Developers have a hard time fully utilizing all cores for all but very specific tasks.
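A rough Amdahl's-law style sketch of that cores-vs-clock point; the 70% parallel fraction is an assumed figure for illustration, not measured from any real game:

```cpp
// Amdahl's-law sketch: with a fixed serial portion, extra cores flatten
// out quickly, while a clock bump speeds up the whole frame.
// The 70% parallel fraction is an assumption for illustration.
#include <iostream>

int main() {
    const double parallel = 0.70;  // assumed parallelisable fraction
    for (int cores : {1, 2, 4, 8, 16}) {
        const double speedup = 1.0 / ((1.0 - parallel) + parallel / cores);
        std::cout << cores << " cores -> " << speedup << "x\n";
    }
    // 16 cores only reaches ~2.9x here; a 20% clock increase gives ~1.2x
    // across everything, serial parts included.
}
```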

Or it's the "tools" ;)
 
Would a power draw test clear anything up? PC wise it would. Xbox and PS should just make an app that shows GPU/cpu usage.

No, because power draw would not tell the full story. You can have applications that draw a lot of power or in some cases even saturate the CPU/GPU with work but that doesn't mean they are leveraging all the features of the design optimally.

And I think that's something to keep in mind with the Skyrim stuff. Though I'll say, the fact that there are frame drops on Series X with alpha transparency effects (the dragon flame here, the foliage in Hitman 3 and Valhalla (at least pre-patch), etc.) might just be suggesting that the real pixel fillrate gap between it and PS5 is manifesting itself (Series X has a lower pixel fillrate, but a higher texture fillrate ceiling).

One other thing to keep in mind: if this Skyrim port is using the PS4 Pro and One X versions, then while on PS5 the full GPU would be saturated, on Series X only 40 CUs would be saturated with work. Curious if the resolution targets and texture resolution are the same in both versions too; typically One X versions of games used higher-resolution textures. In both PS5 and Series X's cases, this mod is simply treating the systems as 10 TF GCN and 12 TF GCN designs, respectively.
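For reference, the "10 TF GCN / 12 TF GCN" framing falls out of the public CU counts and peak clocks; PS5's clock is variable in practice, so treat these as theoretical ceilings:

```cpp
// Theoretical FP32 throughput = CUs * 64 lanes * 2 ops per clock * clock.
// Clocks are the advertised peaks; PS5's clock is variable in practice.
#include <iostream>

int main() {
    auto tflops = [](int cus, double ghz) { return cus * 64 * 2 * ghz / 1000.0; };
    std::cout << "PS5 (36 CU @ 2.23 GHz):  " << tflops(36, 2.23)  << " TF\n";  // ~10.3
    std::cout << "XSX (52 CU @ 1.825 GHz): " << tflops(52, 1.825) << " TF\n";  // ~12.1
}
```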
 
No one knows why they perform differently, but not knowing is better than making stuff up.

Like Richard suggests in the video (as a theory), faster is better than wide; this could be true in general, hence Sony went this route in the design. In any distributed HW design, the more you rely on spreading the work around (more CUs), the more overhead you get and the more likely it is that the distributed effort doesn't reach peak FLOPS. Hence Cerny said don't use TF as the sole metric. We know MS was designing the APU for more than just the XSX, so they had other considerations.

A similar concept shows up in PC CPUs: after a minimum number of cores (3-4), it's usually more advantageous to increase the clock rate. Developers have a hard time fully utilizing all cores for all but very specific tasks.

Or it's the "tools" ;)

By that logic, PS5 should've used a 4-core or 6-core CPU, then. You should also keep in mind that "wider" changes meaning as architectures change; for GCN, a 52 CU or 60 CU design may've been considered "wide", but with 80 CU cards like the 6900 XT available, that type of design would be considered more mid-sized or small. By this logic, PS4 Pro was a "wide" design, because the GCN CU limit was 40 CUs (outside of Vega, which was specifically GCN 5.0), which is basically what the Pro used.

Even Cerny's talk of "narrow vs. wide" is predicated on the specific architecture; RDNA 2 is built for scalability, so going wider does not impact it as much as it would've with, say, an 80 CU RDNA 1 or Vega design. The entire reason AMD wanted to push beyond 40 CUs for RDNA 2 was because they most likely resolved the frontend issues in managing wider designs, hence why we're seeing rasterization performance on the 80 CU cards at or better than the equivalent 3080 and 3090 GPUs.

You are taking Cerny's "narrow vs. wide" points too on-the-nose, same with Richard's points you bring up. They mean these things in VERY general terms, but there is so much nuance to embedded system designs that you can't use such a general idea as a catch-all to explain application performance. So maybe it's just a case of people who are speculating on this stuff having more knowledge of how these technologies function than you do; that doesn't mean they're just making things up out of panic.
 
Last edited:
No, because power draw would not tell the full story. You can have applications that draw a lot of power or in some cases even saturate the CPU/GPU with work but that doesn't mean they are leveraging all the features of the design optimally.

And I think that's something to keep in mind with the Skyrim stuff. Though I'll say, the fact that there are frame drops on Series X with alpha transparency effects (the dragon flame here, the foliage in Hitman 3 and Valhalla (at least pre-patch), etc.) might just be suggesting that the real pixel fillrate gap between it and PS5 is manifesting itself (Series X has a lower pixel fillrate, but a higher texture fillrate ceiling).

One other thing to keep in mind: if this Skyrim port is using the PS4 Pro and One X versions, then while on PS5 the full GPU would be saturated, on Series X only 40 CUs would be saturated with work. Curious if the resolution targets and texture resolution are the same in both versions too; typically One X versions of games used higher-resolution textures. In both PS5 and Series X's cases, this mod is simply treating the systems as 10 TF GCN and 12 TF GCN designs, respectively.
I didn't think power draw would tell the full story, especially not with "next gen" games. It could be an interesting metric to add to these back-compat tests. If the numbers are always basically the same it wouldn't tell us anything, but if they are low on some titles and higher on others it could help narrow down reasons instead of just speculation. I can't find any power draw comparisons for this game specifically.
 
No one knows why they perform differently, but not knowing is better than making stuff up. Like Richard suggests in the video (as a theory), faster is better than wide (...)
Funny that you are still taking swipes while repeating what I previously said in the post you just took a swipe at.

(...) The way this was designed, parallelized and optimized means the game code is not "wide" enough to take advantage of hardware wider than what existed then. For wide to work, you have to make everything independent (asynchronous). With Skyrim you can't do that, because it elected to do things with a lot of back-and-forth dependencies, as was typical of the era.
The term "wide" hadn't been mentioned before in this thread, in the entirety of its 4 pages (nor was that term specifically mentioned in Richard's hypothesis). If it's that much speculative bullshit, I don't know why you almost quote it, then slam it, again.
No, because power draw would not tell the full story. You can have applications that draw a lot of power or in some cases even saturate the CPU/GPU with work but that doesn't mean they are leveraging all the features of the design optimally.
You're quite right. I hadn't considered it to that extent.
I didn't think power draw would tell the full story, especially not with "next gen" games. It could be an interesting metric to add to these back-compat tests. If the numbers are always basically the same it wouldn't tell us anything, but if they are low on some titles and higher on others it could help narrow down reasons instead of just speculation. I can't find any power draw comparisons for this game specifically.
Yes, it would be interesting to further study both PS5 and Xbox Series S/X BC and that's certainly a place where you could start.
 
Last edited:

Jokerevo

Banned
It's the more popular brand on this forum. Any good news for them is a victory and any perceived bad news is a defeat or an attack.
Victory? Lol. This is a victory? I mean, the XSX is the more powerful machine. Wtf is going on here? In multiplats it should be the leader, but it's consistently being shafted. You guys should be raging, because BC games are all you've got right now.
 
Victory? Lol. This is a victory? I mean, the XSX is the more powerful machine. Wtf is going on here? In multiplats it should be the leader, but it's consistently being shafted. You guys should be raging, because BC games are all you've got right now.
Would you really consider this one title as something an Xbox fan should be upset about? XSX got the 60 fps update first, and most of the BC game updates only apply to the Xbox; see Rocket League, Warzone, and Fallout 4.

I'll ignore the fanboyish 'BC is all they have' statement, but this is hardly something to be up in arms about. I do hope you head to that Hitman 3 thread and explain to the Sony fans there that that game too is not something they should flip out about, especially if they think, like you do, that BC is all Xbox has.

It's the more popular brand on essentially every forum to be fair
It is the bigger worldwide brand, no doubt. You'd figure there would be a real comfort in that, yet they seem to be so 'concerned' about what MS is doing. Why would they even care?
 
Last edited:
Would you really consider this one title as something an Xbox fan should be upset about? XSX got the 60 fps update first, and most of the BC game updates only apply to the Xbox; see Rocket League, Warzone, and Fallout 4.

I'll ignore the fanboyish 'BC is all they have' statement, but this is hardly something to be up in arms about. I do hope you head to that Hitman 3 thread and explain to the Sony fans there that that game too is not something they should flip out about, especially if they think, like you do, that BC is all Xbox has.


It is the bigger worldwide brand, no doubt. You'd figure there would be a real comfort in that, yet they seem to be so 'concerned' about what MS is doing. Why would they even care?
The Xbox vs PlayStation thing is really weird, especially since Nintendo is cruising past both of them and a gaming PC crushes these consoles. Fun games are Nintendo, pretty games are PC.
 

Trogdor1123

Gold Member
This mod is pretty darn stable, I have to say. No crashes on my PS5 yet. Still crossing my fingers for a Fallout 4 one on PS5.
 

Shmunter

Member
The mod is transformative, but it should be the default. A basic ini patch and we don't get it officially. Bad devs!
 
Last edited:

Trogdor1123

Gold Member
The mod is transformative, but it should be the default. A basic ini patch and we don't get it officially. Bad devs!
They probably don't see the value? Maybe? It allows them to resell it on the next-gen systems with "upgrades", who knows.

We need this for Fallout 4 and those supposed Fallout 3 and New Vegas remasters (which I doubt exist).
 