
DF - Assassin's Creed Valhalla: PS5 vs Xbox Series X/ Series S Next-Gen Comparison!

FritzJ92

Member
Yep. However, it's still clocked faster (with or without SMT).

I can only think of two things here:

Tooling:
If it does come down to immature Microsoft tooling, then they royally fucked up. (Given how much better their BC support is, I would like to think this isn't the case)

TDP:
The SOC is being thermally throttled.

Can't be thermally throttled, the XSX kept a locked 60FPS on Gears 5 which is technically more graphically impressive when DF was testing the wattage of the system. There isn't a thermal issue, as that scene was pushing XSX to its highest possible temp at the time... do you think AC V is pushing wattage above Gears 5 and hitting a thermal limit? I doubt it TBH...
 

Yoboman

Member
Regardless of tooling. Logically this shouldn't be happening (for any comparison).

The PS5 is a 10 TF machine with a 3.5 ghz CPU
The XBSX is a 12 TF machine with a 3.8 ghz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now. It doesn't matter how minimal the difference is based on specs. The XSX should technically be offering better performance here.
Guess that means MS were not being entirely truthful. Clearly an Xbox TF is worth less than a PS5 TF
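
For reference, here is a rough sketch (in Python, using the publicly quoted CU counts and peak GPU clocks, 36 CUs at 2.23 GHz for PS5 and 52 CUs at 1.825 GHz for XSX) of where those teraflop figures come from; these are paper peaks, not measured performance:

# Peak FP32 TFLOPS = CUs x 64 shaders per CU x 2 ops per clock x clock (GHz) / 1000
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000.0

print(f"PS5: {tflops(36, 2.230):.2f} TF")   # ~10.28 TF (variable clock, peak)
print(f"XSX: {tflops(52, 1.825):.2f} TF")   # ~12.15 TF (fixed clock)

On paper that is roughly an 18% GPU compute advantage for the XSX, which is exactly why these results surprise people.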
 

FritzJ92

Member
100 MHz, but the Xbox OS has way more overhead with the hypervisor than the PS5 OS.
The same happened with PS4 vs XB1... the XB1 had a higher-clocked CPU, but the PS4 had more CPU available for games because it doesn't have a hypervisor working in the background.

Plus the PS5 CPU is not used for a lot of render tasks because they are offloaded to other co-processors.

False: AC Unity, a CPU-intensive game, performed better on Xbox One because of the CPU, even though it was the weaker hardware...

Source:
"Up until recently, both Xbox One and PlayStation 4 have reserved two entire CPU cores (out of eight available) in order to run the background operating system in parallel with games. Since October, Microsoft has allowed developers access to 50 to 80 per cent of a seventh processing core - which may partly explain why a small amount of multi-platform titles released during Q4 2014 may have possessed performance advantages over their PS4 counterparts in certain scenarios."

 
I've watched the video plenty of times. This doesn't apply here. This is Assassin's Creed we are talking about (a last-gen game), not something that would take advantage of the PlayStation 5's custom solution.

If the PS5 has a unified cache and Xbox does not, then I would put the difference down to this. But I hadn't heard this stated as fact until the post above, and there is nothing concrete about it as far as I'm aware. If we ever get an architecture shot of the PS5 we could see it there.

As for the technical reason this is happening, I'd almost put it down to Sony's cache scrubbers without knowing any other hardware specifics. It seems like the kind of thing that this sort of custom hardware does for devs automatically, and the area where it's happening in the game looks CPU-heavy, with shifting data requirements in memory (inside and outside, and lots of enemy and allied AI). But I really don't know what I'm talking about; this is just my semi-tech-literate shot in the dark and I could be way off base.
 

sendit

Member
Can't be thermally throttled, the XSX kept a locked 60FPS on Gears 5 which is technically more graphically impressive when DF was testing the wattage of the system. There isn't a thermal issue, as that scene was pushing XSX to its highest possible temp at the time... do you think AC V is pushing wattage above Gears 5 and hitting a thermal limit? I doubt it TBH...

Gears is barely touching on what AC V is doing.
 

Lysandros

Member
Regardless of tooling. Logically this shouldn't be happening (for any comparison).

The PS5 is a 10 TF machine with a 3.5 ghz CPU
The XBSX is a 12 TF machine with a 3.8 ghz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now. It doesn't matter how minimal the difference is based on specs. The XSX should technically be offering better performance here.
Your level of technical knowledge of the respective systems is astonishing. I can only congratulate, that's too deep for me to understand.
 

sendit

Member
Your level of technical knowledge of the respective systems is astonishing. I can only congratulate, that's too deep for me to understand.

Give a high-level overview. Please enlighten me. Or are you just going to spit out buzzwords without really knowing what they mean (Cache Scrubbers, I/O Complex, etc...)? Nothing in AC:V is meant to take advantage of anything custom in the PS5.

Also, I'm not disputing the results. I'm just surprised by them (which I'm sure everyone here is).
 
Last edited:

Coolwhhip

Neophyte
Your level of technical knowledge of the respective systems is astonishing. I can only congratulate, that's too deep for me to understand.

This whole thing is just proof of how full of shit a lot of people on the internet are. All claiming to be tech experts and declaring how much more powerful the Xbox is, based on a few numbers they probably don't even understand.

Kinda makes sense that Cerny, with decades of experience in games tech, made them all look like fools.
 
Last edited:

Lysandros

Member
Given a high level overview. Please elignten me. Or are you just going to spit out buzz words without really knowing what they mean (Cache Scrubbers, I/O Complex, etc...)? Nothing in AC:V is meant to take advantage of anything custom in the PS5.

Also, I'm not disputing the results. I'm just surprised by them (which i'm sure everyone here is).
No, of course not. Like I said, I am not up to the task. Never mind. 👍
 

ethomaz

Banned
False: AC Unity, a CPU-intensive game, performed better on Xbox One because of the CPU, even though it was the weaker hardware...

Source:
"Up until recently, both Xbox One and PlayStation 4 have reserved two entire CPU cores (out of eight available) in order to run the background operating system in parallel with games. Since October, Microsoft has allowed developers access to 50 to 80 per cent of a seventh processing core - which may partly explain why a small amount of multi-platform titles released during Q4 2014 may have possessed performance advantages over their PS4 counterparts in certain scenarios."



BTW, Sony unlocked the 7th core on PS4 too.


The CPU advantage on Xbox One lasted only about a year, until Sony unlocked the 7th core for developers as well.
 
Last edited:
The main question I had wasn't answered.

They both use dynamic resolution AND reach those resolutions using a reconstruction technique. But they don't go into which reconstruction technique OR what the native resolutions being rendered actually are.

Regardless the results look good. You can be less than native 4K and still produce crisp graphics.
 
Last edited:

onQ123

Member
Regardless of tooling. Logically this shouldn't be happening (for any comparison).

The PS5 is a 10 TF machine with a 3.5 ghz CPU
The XBSX is a 12 TF machine with a 3.8 ghz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now. It doesn't matter how minimal the difference is based on specs. The XSX should technically be offering better performance here.

https://www.neogaf.com/threads/if-m...e-else-using-it-to-compare-it-to-ps5.1562061/

 

sendit

Member
https://www.neogaf.com/threads/if-m...e-else-using-it-to-compare-it-to-ps5.1562061/


Having read what you posted, I think the mods should reopen that thread for further discussion.
 

GHG

Member
Imagine thinking that results from games nullify the value of a machine that has a 15% deficit against the other one.

I get it, this is a dick-measuring thread. But 15% isn't that bad; BOTH machines are good.

Pick which one you like.

Imagine that.

The real tools in the console wars are the warriors.

So you decide to post this in response to me and not the people who are posting nonsense like the very post I replied to?

Carry on, as you will.

 

Romulus

Member
The other thing is the slightly less hardcore gamers who are waiting to choose between XSX and PS5 based on multiplatform games; for them, PS5 has already won at this point. Those people see that PS5 is winning the majority of big releases and they likely won't look again. You only get one chance at a first impression, especially with people who don't check forums daily.
 
Last edited:

VFXVeteran

Banned
I want to address that torch.

It takes away a good 10FPS from a 3090 on 4k/Ultra. The torch is an expensive render.. why? Because having more than 1 shadow casting light source wreaks havoc on ALL GPUs. It's a shame that that is the case but here we are where creating shadows continues to be very expensive. PS5 drivers are just really good at the dynamic res and probably drops resolution down enough to keep the FPS high.
 
I want to address that torch.

It takes away a good 10FPS from a 3090 on 4k/Ultra. The torch is an expensive render.. why? Because having more than 1 shadow casting light source wreaks havoc on ALL GPUs. It's a shame that that is the case but here we are where creating shadows continues to be very expensive. PS5 drivers are just really good at the dynamic res and probably drops resolution down enough to keep the FPS high.
False. Both run at a minimum of 1440p. Buddy, the case is closed. As of now, PS5 is the better-performing console, period. But why do you care? A 3090 still beats everything. I don't understand your constant downplaying of everything PS-related. Just move on.
 

GHG

Member
I want to address that torch.

It takes away a good 10FPS from a 3090 on 4k/Ultra. The torch is an expensive render.. why? Because having more than 1 shadow casting light source wreaks havoc on ALL GPUs. It's a shame that that is the case but here we are where creating shadows continues to be very expensive. PS5 drivers are just really good at the dynamic res and probably drops resolution down enough to keep the FPS high.

The PS5 isn't running at a lower resolution in comparison to the Series X in that scene.

There's more to it than that. I'd be interested to hear some developer insights into the two machines now that the NDAs should be over.

This is well and truly where much of gaming journalism fails: it would be nice to get some investigative and informative pieces on the state of the respective machines, but instead we get the usual trash from most outlets.
 

SlimySnake

Flashless at the Golden Globes
Just a quick question.

How much of a performance hit is V-sync?
Ran a few tests. A bit of a pain because some of the latest games cap the framerate at 60 fps if you turn on vsync.

Dirt Rally 2.0 - 1440p: 83 fps with vsync on, 113 fps with vsync off. This game has some of the worst performance ever and it dips all the time, but in like-for-like cases that's the result I got for the best-case scenarios.
Mad Max - 1440p: 155-164 fps with vsync on (the max on my screen, with GPU utilization still around 90%) and 185 fps with vsync off.
Mafia remake - native 4K: 45 fps with vsync on. It remembered my LG CX settings and was downsampling from native 4K. With vsync off it went up to 53 fps. Couldn't test at 1440p because it locks the framerate to 60 fps with vsync on.

Overall, it seems to be a 15-20% hit. Fixing tearing definitely hits the GPU hard.
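
A quick sanity check of those percentages from the figures above (a rough sketch in Python; note the Mad Max vsync-on number is capped by the screen's refresh rate, so its computed hit understates the real cost):

# Approximate vsync cost per title from the framerates reported above:
# hit = 1 - (fps with vsync on / fps with vsync off)
results = {
    "Dirt Rally 2.0": (83, 113),
    "Mad Max": (164, 185),      # vsync-on value is capped by the screen refresh
    "Mafia remake": (45, 53),
}

for game, (fps_on, fps_off) in results.items():
    hit = 1 - fps_on / fps_off
    print(f"{game}: ~{hit:.0%} hit")   # roughly 27%, 11% and 15% respectively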
 

Shmunter

Member
Ran a few tests. A bit of a pain because some of the latest games cap the framerate at 60 fps if you turn on vsync.

Dirt Rally 2.0 - 1440p: 83 fps with vsync on, 113 fps with vsync off. This game has some of the worst performance ever and it dips all the time, but in like-for-like cases that's the result I got for the best-case scenarios.
Mad Max - 1440p: 155-164 fps with vsync on (the max on my screen, with GPU utilization still around 90%) and 185 fps with vsync off.
Mafia remake - native 4K: 45 fps with vsync on. It remembered my LG CX settings and was downsampling from native 4K. With vsync off it went up to 53 fps. Couldn't test at 1440p because it locks the framerate to 60 fps with vsync on.

Overall, it seems to be a 15-20% hit. Fixing tearing definitely hits the GPU hard.
Nice work.

In the context of the Valhalla comparison, it's also important to note that when the framerate is locked at 60, there is likely a lot more frame potential beyond that vsync lock.

An even larger discrepancy between the two systems is possibly being obfuscated.
 

Caio

Member
Already happened: there are games with 120 Hz modes on Series S and not on PS5.

Back on topic: there is a scene where the door opens (just to mention one, because the video is full of them) where both the PS5 and XSX versions drop in res, but PS5 keeps a steady 60 fps with no tearing, while XSX drops to 50 fps with tearing. PS5 really does perform much better. Let's focus on the main topic here :D
 
Last edited:

ethomaz

Banned
I want to address that torch.

It takes away a good 10FPS from a 3090 on 4k/Ultra. The torch is an expensive render.. why? Because having more than 1 shadow casting light source wreaks havoc on ALL GPUs. It's a shame that that is the case but here we are where creating shadows continues to be very expensive. PS5 drivers are just really good at the dynamic res and probably drops resolution down enough to keep the FPS high.
It drops the res less than Xbox to maintain 60 fps... there is headroom on PS5 (it barely touches the minimum of 1440p), while Xbox really needs to go below 1440p in that torch scene but can't, due to how the DRS is set.
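
To make the point concrete, here is a minimal, hypothetical sketch of a dynamic resolution controller with a hard floor (the 1440p floor and the proportional scaling step are assumptions for illustration, not Ubisoft's actual implementation). Once the floor is reached, a heavy frame can no longer be absorbed by dropping resolution, so it shows up as a framerate drop or tearing instead:

# Hypothetical DRS controller: scale output resolution to hit a 16.7 ms (60 fps) budget,
# but never below a fixed floor. When clamped at the floor, overruns cost framerate.
TARGET_MS = 1000.0 / 60.0   # frame budget for 60 fps
MAX_RES = 2160              # native 4K rows
MIN_RES = 1440              # assumed DRS floor

def next_resolution(current_res, last_frame_ms):
    # Scale proportionally to how far the last frame missed (or beat) the budget.
    wanted = int(current_res * TARGET_MS / last_frame_ms)
    return max(MIN_RES, min(MAX_RES, wanted))

# Example: a heavy ~20 ms frame (like the torch scene) while already near the floor.
print(next_resolution(1512, 20.0))   # wants ~1260 rows, but is clamped to 1440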
 
Last edited:

geordiemp

Member
Give a high-level overview. Please enlighten me. Or are you just going to spit out buzzwords without really knowing what they mean (Cache Scrubbers, I/O Complex, etc...)? Nothing in AC:V is meant to take advantage of anything custom in the PS5.

Also, I'm not disputing the results. I'm just surprised by them (which I'm sure everyone here is).

1. Geometry engine is different

2. Higher clocks

3. Cache scrubbers and coherency

4. CU shaders are different and do different work (Cerny and ND patent)



5. 10 CUs per shader array (same as the 6800); the 14 CUs per shader array in XSX is odd.

6. The CPU on XSX is split across the die at a distance; the PS5's will not be, it could be unified.

I could go on, but why. People see what they want to see.
 
Last edited:
VRR shouldn't be considered a panacea. I've been using G-Sync for years, and you can still tell when a game is running at a lower frame rate; you just don't get that annoying judder you get with v-sync where frames are repeated. 45 fps still looks like 45 fps.

VRR is obviously way better than either no v-sync with tearing, or a v-synced 60 fps that doesn't stay at 60 fps, but locked framerates are still better.

Maybe next gen, when everyone has a VRR-capable TV, devs can rely on it. It would make optimising much easier, as the odd 10-15 fps drop from 60 fps is way less obvious than with v-sync.
 

Bitmap Frogs

Mr. Community
I want to address that torch.

It takes away a good 10FPS from a 3090 on 4k/Ultra. The torch is an expensive render.. why? Because having more than 1 shadow casting light source wreaks havoc on ALL GPUs. It's a shame that that is the case but here we are where creating shadows continues to be very expensive. PS5 drivers are just really good at the dynamic res and probably drops resolution down enough to keep the FPS high.

Oh no, good Sony news! Who’s going to save us?
 

Caio

Member
It drops the res less than Xbox to maintain 60 fps... there is headroom on PS5 (it barely touches the minimum of 1440p), while Xbox really needs to go below 1440p in that torch scene but can't, due to how the DRS is set.

Exactly. DF clearly said PS5 performs BETTER, with XSX showing screen tearing much more often, and it can't even maintain 60 fps or close to it when the res drops to 1440p. In the scene where people move around with torches in the small village, PS5 keeps a perfect 60 fps throughout with just a small drop to 58 fps, then back to a steady 60 fps. XSX stays constantly below 50 fps, down to 46 fps, then 48 fps, and stays there. This is a massive ~15% performance difference, plus XSX is tearing much, MUCH more.

Edit: also, PS5 stays at a higher res in many scenarios.
 
Last edited:

Yoboman

Member
I want to address that torch.

It takes away a good 10FPS from a 3090 on 4k/Ultra. The torch is an expensive render.. why? Because having more than 1 shadow casting light source wreaks havoc on ALL GPUs. It's a shame that that is the case but here we are where creating shadows continues to be very expensive. PS5 drivers are just really good at the dynamic res and probably drops resolution down enough to keep the FPS high.
Thank God you're here, we needed an amateur armchair analysis
 
Not gonna lie. This thread just gets better and better.

I enjoy reading all the back-and-forth comments on these DF comparisons. It's been quite a while since DF comparisons got this much attention. So I'm going to enjoy it while I can.
 

DForce

NaughtyDog Defense Force
He deleted it. What was the tweet?

lol. I need to find a screenshot of it, but he said the Xbox Series X would receive a 20-50% performance advantage over the PS5.

Edit:

The full quote

"Why PS5 is outperforming Xbox Series X. It won't last. A major upgrade is coming to Xbox Series X that will put a 20-50% performance gap between the two consoles"
 
Last edited:

gmoran

Member
Regardless of tooling. Logically this shouldn't be happening (for any comparison).

The PS5 is a 10 TF machine with a 3.5 ghz CPU
The XBSX is a 12 TF machine with a 3.8 ghz CPU

They're directly comparable as they're both RDNA2+Zen2 based. Scratching my head right now. It doesn't matter how minimal the difference is based on specs. The XSX should technically be offering better performance here.

There are loads of possible reasons; the issue is that we don't know which is correct.

Firstly, the GDK is not as close to the metal as the PS API; the hardware model is abstracted more. And what if there are inefficiencies in that abstracted model?

Perhaps it is just an immature GDK, but that doesn't fit so well, as the case we are seeing has specific performance characteristics.

Although both consoles use the same CPU cores, the surrounding architecture is different in each case. The XSX being clocked faster means that, all things being equal, it should be faster! But all things are not equal. It may be that more efficient caching, or offloading more tasks, makes the PS5 less CPU-bound, and it may only apply depending on SMT mode.

The mistake people have made from the start is confusing paper-spec "models" with the actual machines; they are not the same. The test of every model is how it compares to the empirical data of reality.

Xbox fans were far too sure of XSX superiority.

And to be clear, I expect XSX to regain the performance lead in the future; it will probably take some months though.
 

Caio

Member
lol. I need to find a screenshot of it, but he said the Xbox Series X would receive a 20-50% performance advantage over the PS5.

Edit:

The full quote

"Why PS5 is outperforming Xbox Series X. It won't last. A major upgrade is coming to Xbox Series X that will put a 20-50% performance gap between the two consoles"

...and the elephants will begin to fly ...yeah yeah.
 

ethomaz

Banned
What makes it likely to be "A LOT more frames beyond the vsync lock"? It might be one or two frames at most. We won't be able to know unless devs implement an uncapped mode or talk about it.
It is because VSync carries a performance penalty... so to use VSync at 60 fps you need to be running considerably above 60 fps.

With only one or two frames over 60 fps you can't use VSync.

Let's say VSync causes a 15% drop in performance... you would need to be running at over 70 fps to lock the game at 60 fps with VSync.

That is why even Alex said the PS5 could be running considerably higher than 60 fps to allow for VSync.
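
As a quick check of that arithmetic (the 15% penalty is the hypothetical figure from the post above, not a measured number):

# Uncapped framerate needed to hold a vsynced 60 fps, given an assumed vsync penalty.
vsync_penalty = 0.15                       # hypothetical 15% cost
required_fps = 60 / (1 - vsync_penalty)    # ~70.6 fps of headroom needed
print(f"{required_fps:.1f} fps")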
 
Last edited: