FatKingBallman
Banned
> I remember one guy who constantly claimed that the Xbox GPU was in fact a badly designed server blade.
Maybe it is, when the XSX GPU doesn't have a clear win in gameplay comparisons.
> I remember one guy who constantly claimed that the Xbox GPU was in fact a badly designed server blade.
Care to explain to us all how the XSX can display the same static scene at a higher frame rate than the PS5?
> Maybe it is, when the XSX GPU doesn't have a clear win in gameplay comparisons.
> Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.

No need. On paper the TF difference is 18%; the averages show around 16%. But I appreciate a gameplay benchmark where the whole system is used, not just the GPU in one very specific situation.
> Oh I think he has a better understanding of what a benchmark is than you.

You don't understand what a benchmark is, do you?
Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.
So you don't want an actual performance benchmark comparison at all.
Amazing that another comparison where the PS5 loses handily ends in another 1000+ reply thread.
> And you can rarely determine performance from capped framerates.

The PS5 already lost the on-paper GPU comparison a long time ago. LOL.
But you can't play a game in photomode.
Jesus titty fucking fanboying christ... Anyway, as I've said in this thread a couple of pages back:
From now on, for the sake of Xbox fans and their happiness, and for the sake of proving the XSX's power, I hope that every game will only have photo mode and ray tracing. No joy of gameplay, no destruction, no NPCs, no shooting, no jumping, no changeable time of day... just pure ray tracing, reflections and photo mode, and the gaming community will be happy. I just want the future of gaming to be like that. Imagine games with over 100GB of ray tracing and photo mode. And of course, future benchmarks will be provided by Alex. Long live Dictator.
> Ah yes, gameplay where the frame rate is capped and poor optimization causes issues.
> So you don't want an actual performance benchmark comparison at all.
> Amazing that another comparison where the PS5 loses handily ends in another 1000+ reply thread.
> And you can rarely determine performance from capped framerates.
> Jesus titty fucking fanboying christ...

Ok, so if the XSX does not win as much as expected, or loses, it is poor optimisation or poor use of its potential by the devs; but if the situation is reversed, it is just how it was expected to be: the PS5 was used to its full potential and is simply inferior.
Imagine being this upset that the console that was known to be more powerful showed said power.
> In short, it seems like it's better in some scenarios because of a higher GPU clock. I would say there is nothing surprising about that at all, and I laugh every time Richard says he's "fascinated" that the PS5 pushes ahead. I've overclocked my PC enough to know it matters; they should have done that too. Some games will simply benefit from a faster clock. This battle will continue throughout the whole generation and the results will vary just as much. Don't trust that tools will change anything; Sony will improve their tools as well.

That's the thing under discussion, though. 'Superior' hardware should trump inferior hardware on a consistent basis. What we see instead is 'inferior' hardware outsmarting the 'superior' one on a consistent basis.
This may change in the future... or it may not.
> Seeing that the GPU isn't the problem (because in photo mode it performs a lot better than on PS5), do you think it's the same CPU, clocked higher on the XSX, that's causing problems?
> I keep thinking it's a problem of immature tools and, above all (the stuttering), a dev bug.

These are assumptions about the CPU. In photo mode you literally have a static world: static meshes/geometry. Even as a GPU ray-tracing benchmark you have static buffers in photo mode, so it's not benchmarking what the GPU would actually do in game; there is minimal I/O at the bottom level (mesh/geometry), and you literally have no compute shaders doing work to write out deformed geometry for ray tracing. You are just looking at the rendering of a static scene, not a game. So while it's easy to assume a CPU bottleneck, immature tools and bugs on the XSX because photo mode performs better, it's a lot more complicated than that. Other things happen outside photo mode: more reads/writes to buffers, more compute even on the GPU, more data streaming. The causes of in-game stutters and frame-rate drops are a lot more than just "well, if it's not the GPU it must be the CPU".
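The photo-mode vs gameplay distinction being argued here can be sketched with a toy frame-time model; the millisecond numbers below are invented for illustration, not measurements from either console:

```python
# Toy model: each frame waits for whichever of the CPU or GPU takes longer,
# so the frame rate only reflects GPU speed when the GPU is the slower part.
def fps(cpu_ms: float, gpu_ms: float) -> float:
    return 1000.0 / max(cpu_ms, gpu_ms)

# Photo mode: static scene, almost no simulation/streaming work on the CPU.
print(fps(cpu_ms=4.0, gpu_ms=12.0))   # ~83 fps: GPU-bound, GPU speed shows
# Gameplay: AI, physics, streaming and draw-call submission load the CPU.
print(fps(cpu_ms=15.0, gpu_ms=12.0))  # ~67 fps: CPU-bound, GPU advantage hidden
```

In this model a faster GPU moves the first number but not the second, which is the point being made about photo mode isolating the GPU.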
Even at capped framerates, the whole system can struggle in some scenarios.
Well, I'm not upset. Just poking fun at the XSX GPU's supposed superiority.
> Fighting over fkn console BS. Join PC, it's very peaceful over here.

With AMD vs Intel vs Nvidia?
> “King thrash” had no credibility to begin with.

Do I need credibility when the facts are straight in your face?
"Win" what exactly?Jesus Christ they are making analysis videos of the analysis videos to try get the win
> It was DF who said they were "identical".

I like Kingthrash's energy, but he really shouldn't have used the lower-settings argument in his video. It's usual to have differences like this due to bugs or dynamic lighting (and even if that were the case, those slight differences couldn't explain the performance difference). And in photo mode (or in the cutscenes in Hitman 3) it's normal if the XSX renders 15% better, as it has a 15% more powerful GPU.

We know the XSX has worse performance during gameplay (even in Control and Hitman 3) when the CPU enters the equation; this is where he should have focused.
I can easily check your post history too.
Awful lot of REEEEEEING going on from someone who isn't upset
Yes yes, the XSX is such a gimped system it struggles to display text on a screen.
> You still don't get it, eh?

Yeah, in photo mode, where everything is static. But gameplay is another thing. So much for superiority.
> That is a benchmark of the GPUs... it isn't a specific scenario, man... stop acting deliberately obtuse.

What DF proved is that there is a scenario where the XSX's GPU pulls ahead, which is a statement nobody ever disputed.
You still don't get it, eh?
That is a benchmark of the GPUs... it isn't a specific scenario, man... stop acting deliberately obtuse.
> You still don't get it, eh?
> That is a benchmark of the GPUs... it isn't a specific scenario, man... stop acting deliberately obtuse.
Look who is talking about being obtuse. These are particular scenes (static), a specific engine, with particular optimisations per platform, etc.
I might be wrong, but it seems you want wins badly (and project that onto PS fans). Actually, more than wins, you seem to want obliterations, and while PS fans accepted the consoles being very close, with the XSX pulling ahead GPU-wise in several scenarios, some people are still stuck in a “monster console obliterates the competition” mode.
Your other account got banned for console warring, eh? I wonder how long this one will last.
> I don't think it's about winning as such, but as proved, there is much more overhead with the Series X.

If you were becoming afraid, there must be something seriously wrong, yes; I can feel how liberating seeing this is. But that is weird coming from XSX fans, as they had no reason to fear their console was crap or had no headroom.
> It was DF who said they were "identical".
> It was they who should have considered the differences due to bugs or dynamic lighting... no?
> In my video I prove they are not identical, contrary to what they said... The burden of proof is on them.

Thanks for your answer. But it is my understanding that they said it because the devs told them this. They probably didn't check for those differences, as they were looking for the worst-performing scenes on PS5, as usual proving to the world that the PlayStation is worse than their Xbox.
It was DF who said they were "identical".
It was they who should have considered the differences due to bugs or dynamic lighting... no?
In my video I prove they are not identical, contrary to what they said... The burden of proof is on them.
> Leviathan is reliable. Anyway, IIRC last year NXGamer talked about the XSX's CPU and possible bottlenecks; I think it was during the Valhalla analysis.

He hypothesised that something is causing some issues with the Series X; I believe developers are finding it harder to work with early on than the PS5, nothing more. To insinuate some sort of hardware design flaw is insulting, in my opinion.
> It was DF who said they were "identical".
> It was they who should have considered the differences due to bugs or dynamic lighting... no?
> In my video I prove they are not identical, contrary to what they said... The burden of proof is on them.

It was the developer who said the settings were identical, but I mean, you could have played the game yourself to notice how light, shadows, smoke, particles, reflective materials, randomizers, etc. work dynamically in this game; it would answer most of your questions. Just try starting the game yourself and try to replicate a single one of the DF screens with identical details.
> I emphasized the Hz difference between the CPUs to make you understand that in the test carried out by DF the only things missing are the AI and the game logic... so why should an identical but higher-clocked CPU have any bottlenecks? They are exactly the same CPU. And in this test the GPU proved not to be the problem. Let's say it is almost certainly about optimization... tool immaturity or something like that.

I do not know, but rumour is an improved caching system (unified L3 cache) to increase efficiency and put less pressure on the memory system. It would go along with the work they have done to reduce pressure on the RAM from the GPU with the cache scrubbers... it seems they wanted a fully unified but not too expensive RAM solution, and this would help achieve it.
09:35PM EDT - Q: Are you happy with DX12 as a low-level hardware API? A: DX12 is very versatile - we have some Xbox-specific enhancements that power developers can use. But we try to have consistency between Xbox and PC; divergence isn't that good. But we work with developers when designing these chips so that their needs are met. Not heard many complaints so far (as a silicon person!). We have a SMASH driver model: the game binaries implement the hardware-laid-out data that the GPU eats directly - it's not a HAL-layer abstraction. MS also re-writes the driver and smashes it together; we replace that and the firmware in the GPU. It's significantly more efficient than on PC.
> Sort of makes sense. But with the previous generation things didn't really change that much as the generation went on. The PS4 was consistently ahead of the X1 by more or less the same amount. I doubt that either current-gen system is suddenly going to create a huge delta between the two. Things will most likely remain similar towards the end.

According to a piece by John from DF, the gap actually started to grow relative to the base Xbox One, and after a while relative to the One S too, as developers started to push the PS4 more and the PS4 Pro and Xbox One X entered the picture.
> I like Kingthrash's energy, but he really shouldn't have used the lower-settings argument in his video. It's usual to have differences like this due to bugs or dynamic lighting (and even if that were the case, those slight differences couldn't explain the performance difference). And in photo mode (or in the cutscenes in Hitman 3) it's normal if the XSX renders 15% better, as it has a 15% more powerful GPU.
> We know the XSX has worse performance during gameplay (even in Control and Hitman 3) when the CPU enters the equation; this is where he should have focused.

18.17% more powerful (TF).
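For context on the TF percentages being thrown around: the figure is just the ratio of the headline TFLOPS numbers. A quick check using the commonly cited specs (the CU counts and clocks below are the publicly quoted ones, assumed here rather than taken from any post in this thread):

```python
# Peak FP32 TFLOPS for an RDNA 2 GPU: CUs * 64 lanes * 2 ops/clock * clock (GHz) / 1000.
def tflops(cus: int, ghz: float) -> float:
    return cus * 64 * 2 * ghz / 1000

xsx = tflops(52, 1.825)  # Series X: 52 CUs at a fixed 1.825 GHz -> ~12.15 TF
ps5 = tflops(36, 2.23)   # PS5: 36 CUs at up to 2.23 GHz (variable) -> ~10.28 TF
advantage = (xsx / ps5 - 1) * 100
print(round(advantage, 1))  # -> 18.2 (on paper, at PS5's peak clock)
```

That lands within rounding distance of the 18.17% quoted above, and since the PS5's clock is variable, the paper gap only widens whenever it runs below its peak.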
> You didn't mention the part where the actual developers told them they are identical, you know, the people who made the game.

He's right: just because the settings are identical doesn't mean they will produce identical results visually, due to bugs or whatever. We've seen it in Watch_Dogs: Legion with RT on PS5 (puddles) and AF on Series X.
I emphasized the Hz difference between the CPUs to make you understand that in the test carried out by DF the only things missing are the AI and the game logic... so why should an identical but higher-clocked CPU have any bottlenecks? They are exactly the same CPU. And in this test the GPU proved not to be the problem. Let's say it is almost certainly about optimization... tool immaturity or something like that.
> 18.17% more powerful (TF).
He's right: just because the settings are identical doesn't mean they will produce identical results visually, due to bugs or whatever. We've seen it in Watch_Dogs: Legion with RT on PS5 (puddles) and AF on Series X.
Digital Foundry either didn't notice it or didn't feel like mentioning it.
> If you were becoming afraid, there must be something seriously wrong, yes; I can feel how liberating seeing this is. But that is weird coming from XSX fans, as they had no reason to fear their console was crap or had no headroom.

So where do you get fear from? Where in my post did I say I was fearful?
Perhaps. Xbox fans were sold a monster, and six months and more of gloating and calling the other system a rushed, last-minute overclocked solution, without seeing the massive power advantage reflected in games, did raise tensions; but you have MS's PR to blame for that.
System design and overall usable performance is a complex can of worms... you could possibly produce synthetic demos that showed headroom in the PS3's RSX in isolation, and in the CELL BE too (more easily), versus the competition, but what would it prove?
As I was pointing out, it isn't about winning; but what the photo mode showed was that there was more overhead on the Xbox than the PS5? Is that right?
> Look who is talking about being obtuse. These are particular scenes (static), a specific engine, with particular optimisations per platform, etc.
> I might be wrong, but it seems you want wins badly (and project that onto PS fans). Actually, more than wins, you seem to want obliterations, and while PS fans accepted the consoles being very close, with the XSX pulling ahead GPU-wise in several scenarios, some people are still stuck in a “monster console obliterates the competition” mode.

This post will age very, very badly... saved for future crow eating.
> Yes, you are correct, but that overhead only appears to manifest itself in certain scenarios. Specifically, based on this result, and in other games where the SX creeps ahead during real-time cinematics, it appears that the gap only shows itself when the CPU is at its most idle.
> This could be a fault with the PS5's SmartShift implementation, or it could indicate that the fill-rate advantage of the SX GPU gets stymied by bus bandwidth or some other system bottleneck when the whole APU is under load. To be honest, it could just come down to graphics API differences, as regardless of the code being run there's a significant layer between that and the actual hardware.

Yeah, I didn't say it was down to hardware; I just think it's a poorly optimised game, tbh.
Oh its "They" again. Yeah all PlayStation fans across the globe clubbed together to make this video.Jesus Christ they are making analysis videos of the analysis videos to try get the win
> As I pointed out previously, in the section of the video where DF mentioned bottlenecks, the Series X managed a few FPS better performance, yet Sony fans presumed it was pointed at the X and not at both consoles, which it was.

Damn, you'd have to be a new level of daft not to see why it would refer to the Xbox, given its lead in prior scenes. What sort of logic says the PS5 was behind and is now level, so maybe the PS5 is bottlenecked in this scene, when it's the SX that's now lower than before in % lead? LOL. If it was aimed at both, you'd expect the PS5 to drop by a similar % as in the other scenes if the CPUs were equal.
I do not know, but rumour is an improved caching system (unified L3 cache) to increase efficiency and put less pressure on the memory system. It would go along with the work they have done to reduce pressure on the RAM from the GPU with the cache scrubbers... it seems they wanted a fully unified but not too expensive RAM solution, and this would help achieve it.
While it helps greatly with BC and with allowing general OS updates independently of game OS ones, the virtualised approach does have a non-zero impact... and when using multi-threading/SMT the clock-speed difference is even lower, and actually negative in the case of the XSS, which you need to take into account as your minimum target for the game logic.
As titles stress disk I/O more and more, the impact on the CPU grows: it could be that in this case the I/O processor complex Sony built around the SSD keeps the CPU overhead small.
It could also be that, these being consoles, the low-level graphics libraries still have a smaller overhead on PS5 (i.e., a lower CPU cost); maybe not as big a gap as DX11 on Xbox One vs GNM on PS4, but while optimised and allowing more direct hardware access, the Xbox libraries still try to keep in line with the DX12U used on desktop.
Because the 16% advantage isn't about one specific scene where the small differences would matter; it's over 20 samples, where that would be taken into account, as we see the difference go from nothing to over 30%. The average is 16%.
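On the averaging point above: per-scene percentage gaps can swing widely while a single mean summarizes them. A sketch with made-up frame rates (placeholders for illustration, not DF's published samples):

```python
# Hypothetical per-scene frame rates -- illustrative only, not DF's data.
xsx = [60, 55, 70, 48, 58]
ps5 = [60, 54, 52, 47, 50]

# XSX advantage per scene, in percent, then the headline average.
diffs = [(x / p - 1) * 100 for x, p in zip(xsx, ps5)]
avg = sum(diffs) / len(diffs)
print([round(d, 1) for d in diffs])  # -> [0.0, 1.9, 34.6, 2.1, 16.0]
print(round(avg, 1))                 # -> 10.9
```

The spread (parity in one scene, 30%+ in another) is exactly why a single capped or lightly loaded scene can't stand in for the average.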
> According to a piece by John from DF, the gap actually started to grow relative to the base Xbox One and then after a while to the One S too as developers started to push PS4 more and PS4 Pro and Xbox One X entered the picture.
> This is actually kinda interesting.

Lots of bullshit numbers. With Hitman 3, it was the "44% more pixels" that "scales with compute units". Now it's "16% more frames" that "scales with TF".
A 16% lead on average in a purely GPU-related task. If you'd told us this six months ago, there would have been nothing but revolt.
I’m not surprised that Richard goes on a tangent about silicon investment return and “early days” and “parity”. He clearly wanted more ahahahaha so pathetic. And people say he’s neutral!
Complete disregard for draw calls doesn’t surprise me either!
> As I was pointing out, it isn't about winning; but what the photo mode showed was that there was more overhead on the Xbox than the PS5? Is that right?
> Oh, it's "they" again. Yeah, all PlayStation fans across the globe clubbed together to make this video.

Daft is looking at the video and seeing only what you want to see, which you have done.
Let's ignore the fact kingthrash has been calling them out for years.
How about addressing the differences spotted and coming up with your own explanation?
Damn, you'd have to be a new level of daft not to see why it would refer to the Xbox, given its lead in prior scenes. What sort of logic says the PS5 was behind and is now level, so maybe the PS5 is bottlenecked in this scene, when it's the SX that's now lower than before in % lead? LOL. If it was aimed at both, you'd expect the PS5 to drop by a similar % as in the other scenes if the CPUs were equal.
> It was the developer who said the settings were identical, but I mean, you could have played the game yourself to notice how light, shadows, smoke, particles, reflective materials, randomizers, etc. work dynamically in this game; it would answer most of your questions. Just try starting the game yourself and try to replicate a single one of the DF screens with identical details.

Sooooo... why isn't it on DF, though? They are the only ones echoing the devs. Lol... I mean, even so, it still doesn't explain the missing textures and lower reflective effects throughout. The dev said identical, just like DF said... both of them lied.
When the developer says it's settings parity, I think we need to research a bit better before we start claiming they're lying.