Nothing about it is contradictory if you read carefully what I actually said. What I'm speaking on are hardware advantages that are well documented in favor of Xbox Series X. They aren't up to feelings or opinion on whether they're actual hardware advantages or not. Saying those hardware advantages will more readily come into play further into the gen, when game engines and techniques designed to get the most from the hardware actually arrive, isn't the same as claiming victory based on results one year in, or from a single title, when we know the consoles have much left in the tank. And let's not pretend Series X isn't already demonstrating it's the more capable hardware by the expected percentages. It kinda is.
Below is what we're talking about:
12TF 52 Compute Unit Full RDNA 2 GPU (or 26 Dual Compute Units)
3,328 Stream Processors
10GB @ 560GB/s, 6GB @ 336GB/s
1825MHz clock speed (locked)
CPU @ 3.8GHz (without SMT) / 3.6GHz (with SMT), both locked
Mesh Shaders, VRS Tier 2, Sampler Feedback Streaming, Ray Tracing, Hardware Machine Learning
The contention some are making is that the 12TFLOP GPU of the Series X, with all the above features, doesn't command as much of a performance edge over the PS5 as many suggested it would. The problem with any such assumption is that it's largely premature, since much of what's listed above, from an advanced graphics performance feature standpoint, has barely appeared in a game this early into the gen, let alone been fully exploited. Ray Tracing we know harms performance, and we knew that entering the gen, so I won't label that a performance enhancer. Every other feature supported by the Series X GPU is designed to offer enhanced performance.
We have one instance total of a released game on both systems where one of these performance-enhancing features is in use, just one: Doom Eternal. It used not the inferior version of VRS, the one that can be software emulated, but the more advanced hardware Tier 2 VRS. What was the performance outcome in what we know to be one of the most advanced and graphically impressive game engines out there?
That's a 23% performance advantage for Series X in balanced mode
That's a 14% performance advantage for Series X in RT mode.
That's a 29% performance edge for Series X in 120fps mode.
Then there's Avengers, another visually advanced game.
In the Quality mode where both consoles used dynamic res with native resolutions, Series X maintained a 13% performance advantage.
In the performance mode it's an even more insane 62% advantage for the Series X, because the PS5 is rendering far fewer pixels. If we go by Digital Foundry's pixel count, it's an even higher 74% edge. I'll go with VGTech simply because they provide far more data than Digital Foundry.
Now, something I've always been curious about is this. We know both consoles turn in excellent performance figures in Avengers. Neither console is performing badly, which the stats back up. But we also know the PS5 is rendering, at any given moment, far fewer pixels than the Series X. At minimum the Series X appears to be rendering 62% more. And even then we get these kinds of stats.
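One thing worth keeping straight is the direction of these percentages: one GPU rendering 62% more pixels means the other is rendering roughly 38% fewer, not 62% fewer. A quick sketch of the conversion, using the 62% and 74% figures from the pixel-count comparisons above:

```python
# Convert "one GPU renders X% more pixels" into "the other renders Y% fewer".
def pct_fewer_from_pct_more(more_pct: float) -> float:
    # If GPU A renders (1 + more) times GPU B's pixels,
    # then B renders 1/(1 + more) of A's, i.e. (1 - 1/(1 + more)) fewer.
    more = more_pct / 100
    return (1 - 1 / (1 + more)) * 100

print(round(pct_fewer_from_pct_more(62), 1))  # 38.3 -> PS5 at ~38% fewer pixels
print(round(pct_fewer_from_pct_more(74), 1))  # 42.5 -> ~43% fewer by DF's count
```

Either way you phrase it, the underlying ratio is the same; the "more" number just looks bigger than the "fewer" number because the bases differ.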
You would think that if the PS5 in this particular game is rendering so many fewer pixels, how is it possible that its performance can be THIS close to the Series X and not be pretty much flawless? It suggests that if the PS5 were anywhere close to the native resolutions of the Series X, it would have performed quite a bit worse. Or, simply put, its resolution would have needed to drop even lower. The PS5's maximum checkerboard resolution of 3840x2160 works out to 1920x2160 rendered pixels, which is 12.5% more pixels than 1440p. Series X's maximum native 4K is literally 100% more pixels by comparison. Yes, the Series X drops from that native 4K more often than the PS5 drops from its half checkerboard 4K, but at worst the Series X advantage is in the realm of 62% more resolution. To be more fair, call it 50% more if you leave PS5 at its version of 4K and instead use a lower native pixel count for Series X under its native 4K.
This makes what Crystal Dynamics told NXGamer even more valid. Had they not gone with checkerboarding, they would have ended up with potentially worse image quality or something softer in appearance overall. Checkerboarding, even with its imperfections, guaranteed a better visual outcome than if they hadn't gone with it. Now, we know CD takes their performance seriously, so clearly the game wouldn't have performed like shit at native, because they wouldn't have allowed it to. They would have simply run at a lower overall resolution to maintain the performance, and it would have looked worse next to the Series X version. So they made the smartest decision. And just in case someone tries to claim PS5 has better overall performance because Series X had the lowest minimum framerate: that minimum appears less than a quarter of 1% of the time, so it hardly constitutes a win.
Next up, another highly visually advanced title on a game engine that's no joke.
Metro Exodus, yet another demanding game pushing visuals. Stop me if you've heard this one before, but Series X maintains a roughly 21-23% resolution advantage. And no, PS5 holding 60fps for less than 1% more of the time than Series X does not constitute a performance win for the PS5 when it's already running at a lower resolution and dropping below 1080p in more demanding scenes.
And finally, though I won't waste time posting it (you can go look at the stats if you don't believe me), Resident Evil Village actually performs better on Series X more than 2% of the time in RT mode. Both consoles are flawless in non-RT mode, not a single drop according to the stats. And because Capcom's checkerboarded solution is so fantastic, it's damn near impossible to get a proper resolution count on the game, so it's inconclusive and we'll assume the resolutions were identical. There are some titles where, despite having a resolution advantage, Series X's performance is just unacceptable; those are PS5 wins. The four games I just ticked through are not among them.
Another game, the next-gen version of Star Wars Jedi: Fallen Order, is also better on Series X, but that one is much closer: a roughly 8% resolution edge, with a framerate edge of less than a tenth of 1% going to PS5 in both modes. But admittedly that game leaves far less, practically nothing, to bark about, because they are just that damn close.
Series X has more convincing cases, in more visually demanding games on very advanced engines. And guess what? That advantage is often precisely the GPU performance advantage it's been suggested to have. There are times it seems to almost exceed it. Feel free to look at the PC side of things for nearly matching GPU configurations and you will notice generally similar or smaller percentage advantages in framerate at identical resolutions. Translation into console terms: the stronger GPU, which maintains higher fps at a given resolution, would be the one running at the higher resolution, while the weaker GPU runs at a lesser resolution for performance reasons, and in doing so they end up with roughly equal framerates. That's exactly what you're seeing between PS5 and Series X. A familiar trend in these PC benchmarks is that the weaker GPU running at the lower resolution will often see better framerates than the stronger GPU gets at its higher resolution, forcing the stronger chip to come down from its perch just a little to perform like the weaker card, just at a higher resolution, because that's what it should be doing.
Just for fun, people can use the 2080 Super to represent Series X and the 2070 Super or 5700 XT to represent PS5 (the differences in hardware and core makeup are roughly analogous to PS5 versus Series X). What you'll often see is that the lower-resolution framerate performance of the weaker cards tends to be better than that of their more capable counterparts running at higher resolutions. This means the two cards, through dynamic resolution, would need to draw closer in resolution to meet their targets.
www.anandtech.com
So if people want to label The Touryst a benchmark, then we can show benchmarks for much more graphically demanding titles developed under similar circumstances and timeframes, on more advanced game engines, that demonstrate which platform is more capable. I'm unsure how a game that released nearly an entire year after the Series X launch, with an entirely rewritten game engine and the added benefit of a lot more development time (things the Xbox version couldn't benefit from), can be considered a fair benchmark of the capabilities of the two systems, but that won't stop people from trying to hold it up as evidence of platform capability. Then again, we certainly know people on the Xbox side would have done the same if the roles were reversed, so I guess all is fair. But there's a shiny asterisk all over this one.