
DF - Assassin's Creed Valhalla: PS5 vs Xbox Series X/ Series S Next-Gen Comparison!

onesvenus

Member
Processing is never so precise that it hits the target bang on with consistency. For a steady 30 or 60 to be a flatline, there needs to be a wide ceiling of headroom.
That depends entirely on the difference between workloads. Do you know anything about them? If not, you can't claim you need a wide ceiling of headroom.
15% was just an example based on the other poster's PC tests... it could be 5%, which would make it need to be just over 64fps.
I wouldn't consider 4fps "a lot" of frames; that's the claim I was disputing. As you say, we can't be sure it would go to 80fps or anything like that.
 

mitchman

Gold Member
I have never seen "stuttering" with a good vsync solution. I've seen frame-pacing issues, but that is totally different.
Frame pacing issues are experienced as stutters when a frame interval is skipped. It doesn't matter how "good" a vsync implementation is: if it can't hit the target frame rate, it either has to output the frame immediately, causing tearing, or wait for the next sync interval, meaning one frame takes 33ms to display, which is experienced as a stutter.
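As a rough illustration of that arithmetic (a minimal sketch with made-up frame times, not any engine's actual code): at 60 Hz, a frame that misses the ~16.7 ms budget even by a millisecond gets pushed to the next refresh boundary, so its interval jumps to ~33.3 ms.

```cpp
// Minimal sketch: why a frame that misses the 16.7 ms vsync deadline turns into
// a 33.3 ms frame interval. Frame times below are made-up example values.
#include <cmath>
#include <cstdio>

int main() {
    const double refresh_interval_ms = 1000.0 / 60.0;  // ~16.7 ms at 60 Hz

    // Simulated render times: most frames fit the budget, one misses it.
    const double frame_times_ms[] = {15.0, 16.0, 18.0, 15.5};

    for (double render_ms : frame_times_ms) {
        // With vsync on, a finished frame can only be presented at a refresh
        // boundary, so the frame interval is the render time rounded up to a
        // whole number of refresh periods.
        double intervals = std::ceil(render_ms / refresh_interval_ms);
        double interval_ms = intervals * refresh_interval_ms;
        std::printf("render %.1f ms -> frame interval %.1f ms%s\n",
                    render_ms, interval_ms,
                    intervals > 1.0 ? "  <-- skipped a vsync, felt as a stutter" : "");
    }
    return 0;
}
```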
 
The XSX version has better textures

texturequalityuakz2.png
Why do you guys need to get so desperate and lie? They are identical settings as per the Ubisoft file.

FnKYQml.jpg
 
Last edited:
If PS5 has a unified cache and Xbox does not, then I would put the difference down to this. But I hadn't heard this stated as fact until the post above, and there is nothing concrete about it as far as I'm aware. If we ever get an architecture shot of the PS5 we could see it there.

As for the technical reason this is happening, I'd almost put it down to Sony's cache scrubbers without knowing any other hardware specifics. It seems like the kind of thing that this sort of custom hardware handles for devs automatically, and the area of the game where it's happening looks CPU-heavy, with shifting data requirements in memory (inside and outside areas, and lots of enemy and allied AI). But I really don't know what I'm talking about; this is just my semi-tech-literate shot in the dark, and I could be way off base.
The cache scrubbers are not custom; AMD is using cache scrubbers on their cards by now.
 
Last edited:

spyshagg

Should not be allowed to breed
I want to address that torch.

It takes a good 10 FPS away from a 3090 at 4K/Ultra. The torch is an expensive render... why? Because having more than one shadow-casting light source wreaks havoc on ALL GPUs. It's a shame that's the case, but here we are: rendering shadows continues to be very expensive. The PS5 is just really good at dynamic res and probably drops the resolution down enough to keep the FPS high.
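For context, here's a minimal sketch of the kind of feedback loop a dynamic-resolution scaler runs (hypothetical thresholds and step sizes, not Ubisoft's actual implementation): when the measured GPU frame time blows past the 16.6 ms budget, the render scale drops; when there's headroom again, it climbs back.

```cpp
// Hypothetical dynamic-resolution controller: an expensive scene (e.g. an extra
// shadow-casting light) trades resolution for frame rate instead of dropping
// below the 60 fps target. Thresholds and step size are made-up values.
#include <algorithm>
#include <cstdio>

struct DynResController {
    double target_ms = 16.6;  // 60 fps budget
    double scale     = 1.0;   // 1.0 == native resolution
    double min_scale = 0.6;   // don't shrink below 60% per axis
    double step      = 0.05;

    void update(double gpu_ms) {
        if (gpu_ms > target_ms)            scale -= step;  // over budget: drop res
        else if (gpu_ms < target_ms * 0.9) scale += step;  // headroom: recover res
        scale = std::clamp(scale, min_scale, 1.0);
    }
};

int main() {
    DynResController ctrl;
    // Simulated GPU frame times: the torch section pushes cost past the budget.
    const double gpu_times_ms[] = {15.0, 15.5, 19.0, 18.5, 17.0, 15.0, 14.0};
    for (double t : gpu_times_ms) {
        ctrl.update(t);
        std::printf("gpu %.1f ms -> render scale %.0f%%\n", t, ctrl.scale * 100.0);
    }
    return 0;
}
```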

Are you spreading misinformation again? What do you do in the "industry"? Post your credentials or past projects.

Serving up coffee or being assigned backup tasks doesn't make anyone an industry member. If you actually are in it, it's quite bad behavior.

Just keep quiet when you don't know what you're talking about.
 

Andodalf

Banned
Are you spreading misinformation again? What do you do in the "industry"? Post your credentials or past projects.

Serving up coffee or being assigned backup tasks doesn't make anyone an industry member. If you actually are in it, it's quite bad behavior.

Just keep quiet when you don't know what you're talking about.

Lol, I suppose him getting vetted isn't enough for you? Shut up kid. Take a break
 

DeepEnigma

Gold Member
Frame pacing issues are experienced as stutters when a frame interval is skipped. It doesn't matter how "good" a vsync implementation is: if it can't hit the target frame rate, it either has to output the frame immediately, causing tearing, or wait for the next sync interval, meaning one frame takes 33ms to display, which is experienced as a stutter.

I still don't know what you're talking about. I see zero stuttering in synced 60 FPS+ games like DOOM, etc.
 

DeepEnigma

Gold Member
Don't appeal to authority just because one provides confirmation bias. If they act a fool like the best of the fanatic trolls, they get what they give.
 
Last edited:

Md Ray

Member
The Xbox One CPU runs at 1.75 GHz and the PS4's runs at 1.6 GHz; I don't know how much that will affect performance, but it is a bit higher... In this case the XSX is running a higher CPU clock and its GPU has 2 TF more...
Xbox One's 1.75 GHz over PS4's 1.6 GHz is a ~9% increase.
Xbox Series X's 3.6 GHz over PS5's 3.5 GHz is a ~2.9% increase.

Meanwhile, PS5 has a ~22% clock advantage (2.23 GHz vs 1.825 GHz) in the other units of its GPU.
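For reference, the arithmetic behind those percentages (the clock figures are the publicly stated peak clocks; this is just a worked check, not new data):

```cpp
// Worked check of the clock-speed percentages quoted above.
#include <cstdio>

static double pct_gain(double from, double to) {
    return (to / from - 1.0) * 100.0;  // percentage increase going from -> to
}

int main() {
    std::printf("Xbox One 1.75 GHz vs PS4 1.6 GHz CPU:  +%.1f%%\n", pct_gain(1.6, 1.75));    // ~9.4%
    std::printf("XSX 3.6 GHz vs PS5 3.5 GHz CPU:        +%.1f%%\n", pct_gain(3.5, 3.6));     // ~2.9%
    std::printf("PS5 2.23 GHz vs XSX 1.825 GHz GPU:     +%.1f%%\n", pct_gain(1.825, 2.23));  // ~22.2%
    return 0;
}
```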
 

Fake

Member
UPDATE: The 15 per cent performance advantage mentioned here is averaged across a specific cross-section of play. As the graphs show, 'in the moment' differences can be as high as 25 per cent.



Thx. I'll update the OP.

Yes?
 
Last edited:
The Xbox One CPU runs at 1.75 GHz and the PS4's runs at 1.6 GHz; I don't know how much that will affect performance, but it is a bit higher... In this case the XSX is running a higher CPU clock and its GPU has 2 TF more...
The CPUs are virtually the same: 3.5 GHz vs 3.6 GHz, and we know there is some CPU overhead on XSX.
 
I don’t doubt the PS5 will be able to handle this especially with that custom SSD. It’s not going to happen until they stop with cross gen though.
Unless one team makes the PS5 version, let's say the new Horizon Zero Dawn, focusing only on the PS5 hardware, and another team (Sony has more than one developer team) makes the PS4 version afterwards. And that is exactly what they are going to do 😉
 

DForce

NaughtyDog Defense Force
The XSX version has better textures

texturequalityuakz2.png

You're trying way too hard. If you look at the scene, the same texture becomes blurrier during certain parts of it.


It looks sharper at that point.

caob9GG.png



But then it looks blurry.

RPRBg2I.png


It's the way the textures are filtered, not that it's using higher-res textures.

This video shows in-game texture comparisons and it's the same.
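To illustrate why the very same texture can look sharper or blurrier at different points in a scene, here's a minimal sketch of standard mip-level selection (the texture size and pixel footprints are made-up values): the GPU picks a mip from how many texels one screen pixel covers, so distance and viewing angle change perceived sharpness even though the asset is identical.

```cpp
// Sketch of standard mip selection: the same texture asset is sampled at
// different mip levels depending on how many texels one screen pixel covers,
// which is why it looks sharper up close and blurrier at distance or at an angle.
#include <algorithm>
#include <cmath>
#include <cstdio>

int main() {
    const int base_size = 2048;  // hypothetical 2048x2048 texture
    const int max_mip   = 11;    // log2(2048)

    // Made-up texels-per-pixel footprints: close-up, mid-distance, far/oblique.
    const double texels_per_pixel[] = {1.0, 3.0, 9.0};

    for (double footprint : texels_per_pixel) {
        // The hardware picks mip ~= log2(texels covered per pixel), clamped.
        double lod = std::clamp(std::log2(footprint), 0.0, double(max_mip));
        int mip = static_cast<int>(std::round(lod));
        std::printf("footprint %.1f texels/pixel -> mip %d (%dx%d)\n",
                    footprint, mip, base_size >> mip, base_size >> mip);
    }
    return 0;
}
```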

 

sircaw

Banned
He is a fool but at least he seems to be reaching some weird Acceptance phase.





The good thing is, even though he's being an arse, that initial statement will make many in his army of zealots think!

A lot of them will wake up and link that phrase to the results of the last couple of weeks; he doesn't even know it, but he is helping others finally see the light.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I was going to wait until my XSX came in to play this, but I can't. Just bought it on my PS5 and (for the moment) it's the better version. I do hope both versions unlock the framerate completely for TVs with VRR. I'd love to know how high the framerate can get.

I LOVE the 60 fps gameplay. It's nice to play a smooth 60 fps experience of an AC game.
 
Last edited:

assurdum

Banned
Talk about generic nonsense. If both operate at full output, it is weaker. In the 10GB of RAM the bandwidth advantage is significant. The raw output of operations is 20% bigger in this theoretical scenario, and in the end that is what you need to render. There are no two ways around it. Tools aside (right now the XSX is performing 40% below its theoretical differential with the PS5; this is going to change over time), the logical explanation is that the PS5 is way more efficient, and that is its own advantage: an architecture built to get the most out of those 10.3 TF.
The difference in bandwidth is not as significant as you think; it's the same obsessive conjecture again because the spec sheet said otherwise. The bandwidth is split into two speeds, and when the CPU operates, everything runs slower even on the GPU side, because on the same bus you can't use two speeds without affecting the faster peak. So no, you have no idea what you're talking about. And what the hell does "operate at full output" even mean?
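A rough back-of-the-envelope sketch of that point (the 560 GB/s and 336 GB/s figures are the published speeds of the two XSX memory pools; the traffic-split percentages are made-up assumptions): bus time spent serving the slower pool, including CPU traffic, is time not spent at the fast pool's peak, so the blended bandwidth the GPU actually sees sits below 560 GB/s.

```cpp
// Back-of-the-envelope: blended bandwidth when one bus is time-shared between
// the fast 10 GB pool (560 GB/s) and the slow 6 GB pool / CPU traffic (336 GB/s).
// The traffic-share percentages below are made-up assumptions.
#include <cstdio>

int main() {
    const double fast_gbps = 560.0;  // XSX "GPU optimal" 10 GB pool
    const double slow_gbps = 336.0;  // remaining 6 GB pool (also used by the CPU)

    for (int pct = 0; pct <= 40; pct += 10) {
        double slow_share = pct / 100.0;  // fraction of bus time on the slow pool
        double blended = (1.0 - slow_share) * fast_gbps + slow_share * slow_gbps;
        std::printf("%2d%% of bus time on slow pool -> ~%.0f GB/s effective\n",
                    pct, blended);
    }
    return 0;
}
```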
 
Last edited:

frogger

Member
From what I heard, the PS5 was originally planned for 2019, and developers had PS5 dev kits a year earlier than XSX kits. But this could just be hearsay. PS5 and XSX are very similar in terms of architecture; there's no way a 10TF machine should outperform a 12TF machine. But there could be a design flaw in the XSX; we'll have to wait and see.
 

Bo_Hazem

Banned
From what I heard, the PS5 was originally planned for 2019, and developers had PS5 dev kits a year earlier than XSX kits. But this could just be hearsay. PS5 and XSX are very similar in terms of architecture; there's no way a 10TF machine should outperform a 12TF machine. But there could be a design flaw in the XSX; we'll have to wait and see.

Yeah, devs don't agree with you. Just acknowledge the PS5 as the superior console, buy one, and feel peace inside you.
 

DeepEnigma

Gold Member
From what I heard, the PS5 was originally planned for 2019, and developers had PS5 dev kits a year earlier than XSX kits. But this could just be hearsay. PS5 and XSX are very similar in terms of architecture; there's no way a 10TF machine should outperform a 12TF machine. But there could be a design flaw in the XSX; we'll have to wait and see.

They are not that similar, with the customization done under the hood. And the 2019 claim was bogus. A rising tide lifts all boats, that cache tho.
 
Last edited:

ethomaz

Banned
From what I heard, the PS5 was originally planned for 2019, and developers had PS5 dev kits a year earlier than XSX kits. But this could just be hearsay. PS5 and XSX are very similar in terms of architecture; there's no way a 10TF machine should outperform a 12TF machine. But there could be a design flaw in the XSX; we'll have to wait and see.
They are way less similar than PS4 and Xbox One were.
There are a lot more custom things in PS5 now.
 