
[NX Gamer] AC: Valhalla - XsX vs PS5 - The first REAL head to head battle.

What is XSX being bound by here?



He speaks about 120fps here but it applies to 60fps too.

Come on now. The PS5 drops just as low just after that scene. But there is actually a scene that I find puzzling, where PS5 has a definitive advantage: in the city with plenty of NPCs. PS5 never drops here and it's not about GPU. It's about CPU and both CPUs are virtually identical.



DeepEnigma is faster than me!
 

Thirty7ven

Banned
Apparently Xbox switched to a new API called 'gamecore' and the dev tools are not that mature yet. The Series X likely has some teething issues here at the beginning. The devs will get a better handle on the Series as time goes by and there will be advantages over the PS5.

Seriously, why would people bother investing in higher-quality GPUs on PC if the slightly weaker one gave the same resolution and frame rate and had the same texture assets? Next year is when the real benchmarking begins, as the devs will have had the consoles for some time!

I said we would be told to wait, I said devs would either be called lazy or something else would be responsible for their failure to deliver on MS’s PR.

I guess “early tools” is the new goalpost.
 

Old Empire.

Member
I can't comprehend this attitude.
It's always the same with Xbox: "just wait and see!"
In 18 years I haven't seen anything worthwhile from Xbox exclusivity yet. (This is obviously just my subjective opinion, and I have always had a gaming PC and a PlayStation, and sometimes an additional Nintendo console.)

What attitude? The Series only went on sale last week and this gen has barely begun. It's well known Sony gave dev kits to third-party studios long before Microsoft did. These tests are not a sign of what's coming up next, sorry. PS5 is doing slightly better now; I don't think that will continue next year. Even here the advantages are very minor, small frame rate drops. This is not comparable to last gen. I don't see how the PS5 can have the same resolution and frame rate as the Series X when the true next-gen titles start coming out. The better GPU in the Series should, tech-wise, be performing better. If it doesn't, we'll talk next year!
 

azertydu91

Hard to Kill


You're back, Riky. What have you got for us this time?

You're like the new Terminator model: you failed, but you're coming back better and improved every time.


Ps5 is JOHN CONNER
John Conner? Like the autistic half-brother of John Connor?

Edit: we do agree that the last Terminator movie didn't exist, right? Like Indiana Jones or Ocean's 8 and so many others.
 

Godfavor

Member
Ok, that makes sense to me. I've seen some people argue it's a problem, and I really can't see the negative side of it. It essentially allows the GPU to be more efficient and gain some performance because of it.

It's not a bad thing by any means. When you design a closed box like a console you need the best TDP/performance ratio possible.

With a higher TDP limit, a variable frequency wouldn't be needed.
But it could mean a bigger PS5 for cooling. Not worth the trade-off for such a low percentage of time at 100% utilization of the APU. Sony's engineers bet on that.

The "negative" side that people argue about is that a constant clock is better. Of course it is, but that's not the point, because without something like this you don't get the best out of your GPU.

It's quite an oxymoron, really.
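To illustrate the trade-off with a toy model (not actual PS5 power curves; the cubic relation and numbers below are just the textbook dynamic-power approximation, where power scales with voltage squared times frequency and voltage tracks frequency near the top of the curve):

```python
# Toy model of boosting under a fixed power budget: dynamic power ~ C * V^2 * f,
# and near the top of the curve voltage rises roughly with frequency,
# so power scales roughly with f^3. Constants are illustrative, not PS5 data.

def relative_power(clock_fraction: float) -> float:
    """Power draw relative to max, for a clock at clock_fraction of the max clock."""
    return clock_fraction ** 3

for drop in (0, 2, 5, 10):                      # percent reduction in clock
    frac = 1 - drop / 100
    saved = (1 - relative_power(frac)) * 100    # percent power saved
    print(f"{drop:>2}% lower clock -> roughly {saved:.0f}% less power")
```

Roughly the same idea as Cerny's Road to PS5 remark that dropping the clock by a couple of percent buys back on the order of 10% of the power.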
 
It's not a bad thing by any means. When you design a closed box like a console you need the best TDP/performance ratio possible.

With a higher TDP limit, a variable frequency wouldn't be needed.
But it could mean a bigger PS5 for cooling. Not worth the trade-off for such a low percentage of time at 100% utilization of the APU. Sony's engineers bet on that.

The "negative" side that people argue about is that a constant clock is better. Of course it is, but that's not the point, because without something like this you don't get the best out of your GPU.

It's quite an oxymoron, really.

From my understanding, Sony could have hit 10.23 TF with fixed clocks, but doing that would have meant more CUs and lower clocks. For whatever reason they decided against that and chose to go with a narrow-and-fast design.
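For reference, the teraflop number falls straight out of CU count and clock (CUs × 64 shaders × 2 FLOPs per clock), so the wide-and-slow vs narrow-and-fast trade is easy to sanity-check. A quick sketch; the 48 CU clock below is invented purely for illustration, while the 36 CU and 52 CU lines use the published console clocks:

```python
# FP32 throughput for an RDNA-style GPU: CUs * 64 shaders * 2 FLOPs per clock * clock.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"36 CUs @ 2.23  GHz -> {tflops(36, 2.23):.2f} TF   (PS5, narrow and fast)")
print(f"48 CUs @ 1.67  GHz -> {tflops(48, 1.67):.2f} TF   (hypothetical wider, slower design)")
print(f"52 CUs @ 1.825 GHz -> {tflops(52, 1.825):.2f} TF  (Series X)")
```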
 

Edgelord79

Gold Member
Christ, what an overly reactionary thread about consoles, in which the game in question is known to be poorly optimized. Not only that, it's one of the first titles on next-gen, so it stands to reason development will not be efficient yet.

Meanwhile, a 3090 can't even run Valhalla on PC at 4K 60 fps consistently, yet here we are making asinine comparisons between two much inferior pieces of plastic and silicon running the same game and wondering why there are dips in performance.
 

ABnormal

Member
I'm amazed at how quiet Sony and AMD have been about PS5/RDNA 2 tech and/or custom tech. I think Mark Cerny might do his tech talks next year like he did after PS4 launched, but at least a basic overview would be nice. We still don't even know the actual die size of PS5's APU, even though Austin Evans and iFixit were directly asked to measure it!

To be fair, Cerny already did plenty of that during Road to PS5. The custom parts are the I/O architecture and the geometry engine, and he explained many things about them.

Sure, it's always possible to go into more detail, but usually that's reserved for developers. They have never used details like those as a marketing thing, probably because it appeals to only a fraction of gamers.
 

Godfavor

Member
From my understanding, Sony could have hit 10.23 TF with fixed clocks, but doing that would have meant more CUs and lower clocks. For whatever reason they decided against that and chose to go with a narrow-and-fast design.

Adding more CUs is very expensive; APU size is the No. 1 place to cut costs.

Sony bet on that, and it allowed them to release a cheaper model at $400 without breaking the bank.
 
The amount of excuses coming from Xbots is embarrassing:

It's a last-gen game!
Dev tools are bad!
Ubisoft sucks!
Game is not optimised!
Wait for DF!

Everyone told you there is more to it than just TF numbers lol.

We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.
 

sircaw

Banned
The amount of excuses coming from Xbots is embarrassing:

It's a last-gen game!
Dev tools are bad!
Ubisoft sucks!
Game is not optimised!
Wait for DF!

Everyone told you there is more to it than just TF numbers lol.

We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.

Yer but their packaging is awesome!
 
The amount of excuses coming from Xbots is embarrassing:

It's a last-gen game!
Dev tools are bad!
Ubisoft sucks!
Game is not optimised!
Wait for DF!

Everyone told you there is more to it than just TF numbers lol.

We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.

Well, so far nobody has been able to prove the 9.2 TF claim, which I find funny. Just happy they didn't latch on to that 5 TF claim that was made a while ago.
 
Yes, the price is awesome for such tech!

That's why I like consoles more than any other gadget. They always push for the best performance while having the lowest cost possible. I am such a nerd 😩

And it's the best-looking PS5 IMO, although I don't know if I can live without a disc drive.
 

M_A_C

Member
The amount of excuses coming from Xbots is embarrassing:

It's a last-gen game!
Dev tools are bad!
Ubisoft sucks!
Game is not optimised!
Wait for DF!

Everyone told you there is more to it than just TF numbers lol.

We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.
Don't forget VRR, 1440p, and I've seen a few Game Pass mentions thrown in too.
 

inflation

Member
Come on now. The PS5 drops just as low just after that scene. But there is actually a scene that I find puzzling, where PS5 has a definitive advantage: in the city with plenty of NPCs. PS5 never drops here and it's not about GPU. It's about CPU and both CPUs are virtually identical.



DeepEnigma is faster than me!
Flashbacks of AC:U incoming. Ubisoft has shown how much the CPU matters for the last two generations. 🤣
 

mejin

Member
There is no doubt devs will make better use of the X hardware as time passes, but what amazes me is that in the bubble this apparently won't be the case with PS5 hardware. Weeks ago people were expecting the X to show the difference in horsepower right out of the gate; today this difference is just a fart in the wind...

I'd heard of moving goalposts, but mountains... damn.
 

BlueHawk357

Member
I'm expecting them to be very similar, but not necessarily with PS5 being /better/.

Sorry, NXGamer has a bias. So I'll be waiting on Digital Foundry, which won't be too long now.

If there is a difference between the consoles, I might just sub to NXGamer, but for now, the bias is too off-putting.
 

Sejanus

Member
The amount of excuses coming from Xbots is embarrassing:

It's a last-gen game!
Dev tools are bad!
Ubisoft sucks!
Game is not optimised!
Wait for DF!

Everyone told you there is more to it than just TF numbers lol.

We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.
Screams that third parties don't care; they have a lot of platforms to work on.
They have cross-gen, streaming platforms, PCs with DDR4 (APUs), PCs with RT, PCs without RT, HDR, VRS, mesh shaders, DX11, DX12, Vulkan, plus work on new in-house engines.
You think they really care about 1.8 TF?
Xbox is stronger, no one doubts it.
Only first-party studios optimize the shit out of console hardware.
And here is the real advantage of PS5: its studios really don't give a shit about Nvidia hardware, AMD hardware (PC), Switch, Stadia, or Xbox; only PS4 and PS4 Pro for the first two years.
The Xbox studios' strategy is problematic with all these platforms: Xbox One S/X, Series S/X, PC (because of Game Pass).
 
I'm expecting them to be very similar, but not necessarily with PS5 being /better/.

Sorry, NXGamer has a bias. So I'll be waiting on Digital Foundry, which won't be too long now.

If there is a difference between the consoles, I might just sub to NXGamer, but for now, the bias is too off-putting.

Yes, but Dictator, who is doing the DF comparison, has a heavy bias in favour of the other side, so you shouldn't put all your faith in that either, by your own logic. Just saying.
 
Differences that are easily overcome by Xbox VRR support. Best way to play on console.

Problem is, not everyone's screen supports VRR; in fact, most don't. And gamers shouldn't need to rely on a feature outside of the game's own optimizations to make it playable in a tolerable way (i.e., no screen tearing).

It's a bit weird seeing how some other cross-gen multiplats like NBA 2K21 aren't showing these kinds of differences between the PS5 and Series X versions, so it does bring into question Ubisoft's dev environment and pipeline surrounding both platforms. Do they have up-to-date Series devkits with Gamecore or not? When did they get them? Do they have different teams handling the two versions? If so, is one team prioritizing the PS5 edition (i.e., is that team being given more budget and manpower to work with)?

The latter I would expect not to be the case, since MS has marketing rights for Valhalla IIRC, but if that's the case then why did MS not provide better technical support to Ubisoft (assuming Ubisoft may've had outdated devkits)? Is the fact that several of their 1P studios are cleaning up 343i's mess something of a reason?

In any case, there are clearly some things around the SDK that MS need to fix and iron out pronto. If this persists, a pretty bad narrative surrounding their hardware may start to form, and it'll be extremely difficult for them to shake it off. They might need a demonstration of Hellblade II (as one example) running in real time on Series X sooner rather than later just to shut these kinds of things down, regardless of how that makes Ubisoft look here (because if such a demo were to pop up, then the consensus would just shift to Ubisoft being lazy and not having the means to optimize for Series X, which would kind of reflect badly on them. This is all complicated stuff :S).

Dev tools are bad!

Game is not optimised!


We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.

Dev tools: Even insiders have been saying for months that Xbox SDKs were lagging behind. More information has come out more or less confirming this, as the Gamecore SDK (which is what MS have been retooling the devkit package into) has been running late.

Comparatively, Sony's PS5 SDK has basically been described by several devs as a "supercharged PS4 devkit", as in the APIs are very similar and the dev environment is much the same. Combined with them finalizing their specs earlier, that's given devs more time to focus on optimizations, which leads into...

Optimizations: Well if the above is true, then it's a bit hard to argue that the game is not optimized. I would say it's not fully optimized on either platform tbh, but it's probably even less optimized on Series X since there are circumstances surrounding its SDK the PS5 hasn't had to deal with.

It's not an excuse to refer to these two things as factors when they go hand-in-hand and are seemingly true based on the information we've seen. Also, it's worth noting that these are just two cross-gen multiplat games. DMC5 in particular has odd framerate drops, but in terms of visual detail it's at least on par with the PS5 version if not better, going by the DF analysis. Other multiplat games like NBA 2K21 and DiRT 5 are running on par between the two systems, with the expected bits of additional small detail on Series X (and, at least in NBA 2K21's case, virtually identical load times).

From my understanding, Sony could have hit 10.23 TF with fixed clocks, but doing that would have meant more CUs and lower clocks. For whatever reason they decided against that and chose to go with a narrow-and-fast design.

The main reasons were costs and ensuring BC for PS4 and PS4 Pro games. The next-smallest design they could've chosen was 48 CUs. But the costs for the APU would've gone up a lot as a result, and they wanted to distribute those costs in other areas and try making up any performance losses through developing the variable frequency implementation...

...which btw AMD are leveraging for their Ryzen & RDNA2 GPUs in a very similar setup, and which is almost precisely the feature Cerny hinted at during Road to PS5 as being brought into the PC space as a "successful collaboration". For a bit I thought this would also include cache scrubbers, but I was rewatching part of Road to PS5 today and Cerny mentions the Cache Coherency Engines are present in order to inform the GPU of stale data in the I/O caches and then have the GPU selectively purge those bits and replace them with new data.

That pretty much rules out the cache scrubbers having any role in the Infinity Cache. The way he describes it, the cache scrubbers exist in multiples split up among parts of the GPU (likely per-CU), I would assume either at the L0$ or more likely the L1$. IC deals with the L2$ level; it's still theoretically possible cache scrubbers might be there, but then that wouldn't really fall on the GPU "hardware" itself anymore, but rather on some HBCC block on the GPU accessing data off the SSD through the DirectStorage implementation AMD talked about at their presentation.

Also while rewatching the presentation (or a part of it) today I caught this line:

If it takes about half a second to load, that's about 4 GB of compressed data to load. That sounds about right for next-gen

This is what Mark Cerny mentioned when describing just-in-time streaming of asset data, and it sticks out because this is what he figures is good enough for "next-gen" gaming. Which would mean that a drive offering raw bandwidth of, say, 1 GB/s to 2 GB/s (the former paired with the highest lossy compression ratio of 3.99:1, the latter with a typical lossless compression ratio of 2:1) should be capable of the same thing so many people have been claiming only a drive as fast as the PS5's, or even the Series X's/S's (half the speed of Sony's), would be able to do.

And I bring up the latter in particular because, at least for data loading, with the recent firmware patch on PS4 we've been seeing some of those games on that system (with its SATA II interface) loading pretty much on par with BC title load times on PS5 and Series X, which was supposedly impossible. Welp 🤷‍♂️ ...
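Working the numbers on that (the drive speeds below are just the commonly quoted console figures, the ratios are the ones mentioned above, and all of it is illustrative rather than measured):

```python
# Effective streaming rate = raw drive bandwidth * compression ratio;
# time to load the ~4 GB Cerny mentions = 4 / effective rate.
def load_time_s(gigabytes: float, raw_gbps: float, ratio: float) -> float:
    return gigabytes / (raw_gbps * ratio)

cases = [
    ("PS5-class drive, 5.5 GB/s raw, ~1.6:1 typical", 5.5, 1.6),
    ("Series X-class drive, 2.4 GB/s raw, ~2:1 typical", 2.4, 2.0),
    ("1 GB/s drive, best-case 3.99:1 lossy ratio", 1.0, 3.99),
    ("2 GB/s drive, typical 2:1 lossless ratio", 2.0, 2.0),
]
for label, raw, ratio in cases:
    print(f"{label}: ~{load_time_s(4.0, raw, ratio):.2f} s to load 4 GB")
```

So the half-second figure lines up with a PS5-class drive, while the 1-2 GB/s examples land closer to a second for the same 4 GB; whether that counts as "the same thing" is the argument above.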
 

Lysandros

Member
Come on now. The PS5 drops just as low just after that scene. But there is actually a scene that I find puzzling, where PS5 has a definitive advantage: in the city with plenty of NPCs. PS5 never drops here and it's not about GPU. It's about CPU and both CPUs are virtually identical.



DeepEnigma is faster than me!
There are strong indications that PS5's CPU has a unified L3 cache cluster like Zen 3, so they may not be so identical.
 
The amount of excuses coming from Xbots is embarrassing:

It's a last-gen game!
Dev tools are bad!
Ubisoft sucks!
Game is not optimised!
Wait for DF!

Everyone told you there is more to it than just TF numbers lol.

We've had 2 games so far and it's not looking good for the 12 TF RDNA 2 console. The 9 TF RDNA 1.5 console is beating it; screams bottleneck if you ask me.

If there’s a bottleneck in either console it’s PS5’s memory bandwidth 😝
 
Kinda funny to post that "Ubisoft sucks and they can't be used to measure how these consoles perform" fallacy.

When The Division released on PS4 and XBO, the base console was the XBO and they also had marketing rights for the title. It was a huuuge downgrade after the reveal, but still, the results showed exactly what prevailed until the One X and PS4 Pro came: that the PS4 had more power to run games at 1080p. It was noticeable. There was no reconstruction technique, nothing. Just base resolution. It's actually by Ubi/Massive, but still. It worked back then but now it doesn't, right? lol
 

Sejanus

Member
There are strong indications that PS5's CPU has a unified L3 cache cluster like Zen 3, so they may not be so identical.
Do you want to bet that the PS5 APU will be almost identical? The ~50 mm² difference would be roughly 16 mm² for the extra 64 bits of memory controllers, 30 mm² for 12 more CUs, and 2 mm² for L2 cache.
 

THE:MILKMAN

Member
To be fair, Cerny already did plenty of that during Road to PS5. The custom parts are the I/O architecture and the geometry engine, and he explained many things about them.

Sure, it's always possible to go into more detail, but usually that's reserved for developers. They have never used details like those as a marketing thing, probably because it appeals to only a fraction of gamers.

True, but I/we always like to know the full nerdy details. Given how Sony have actually taken a step back in disclosures recently, I think you'll be right that we'll get very little directly from Sony about the details of PS5.

We'll probably have to rely on info from devs instead, but even they now know when to keep quiet about things...
 

ABnormal

Member
Screams that third parties don't care; they have a lot of platforms to work on.
They have cross-gen, streaming platforms, PCs with DDR4 (APUs), PCs with RT, PCs without RT, HDR, VRS, mesh shaders, DX11, DX12, Vulkan, plus work on new in-house engines.
You think they really care about 1.8 TF?
Xbox is stronger, no one doubts it.
Only first-party studios optimize the shit out of console hardware.
And here is the real advantage of PS5: its studios really don't give a shit about Nvidia hardware, AMD hardware (PC), Switch, Stadia, or Xbox; only PS4 and PS4 Pro for the first two years.
The Xbox studios' strategy is problematic with all these platforms: Xbox One S/X, Series S/X, PC (because of Game Pass).

Xbox has some more muscle, but it seems its digestion is not so good, so a slimmer athlete with good digestion is able to feed his lesser muscles more efficiently.

The only thing that matters to gamers is the final result.

Needing more muscle to do the same thing is NOT a good thing.
 

ABnormal

Member
True, but I/we always like to know the full nerdy details. Given how Sony have actually taken a step back in disclosures recently, I think you'll be right that we'll get very little directly from Sony about the details of PS5.

We'll probably have to rely on info from devs instead, but even they now know when to keep quiet about things...

I'm curious about more details too, so I hope some detailed data will leak from someone, especially about the features of the geometry engine and the caching system.
 

Sejanus

Member
Xbox has some more muscle, but it seems its digestion is not so good, so a slimmer athlete with good digestion is able to feed his lesser muscles more efficiently.

The only thing that matters to gamers is the final result.

Needing more muscle to do the same thing is NOT a good thing.
Brilliant tech analysis.
Muscles, athletes... OK.
Did you say slimmer? xaxaxaxa
A ~300 mm² chip that needs ridiculous cooling tech, that's really efficient digestion.
 

Redlight

Member
I can't comprehend this attitude.
It's always the same with Xbox: "just wait and see!"
In 18 years I haven't seen anything worthwhile from Xbox exclusivity yet. (This is obviously just my subjective opinion, and I have always had a gaming PC and a PlayStation, and sometimes an additional Nintendo console.)

Oh no, I remembered: Fable 1 was interesting! Played it later on PC though.
You do realise that if you play an MS game on PC you're not wounding them, right? Or anyone here for that matter? Play the games on PC all you like...and btw, welcome to the Xbox ecosystem.
 