
Notebookcheck.net: No, the PS5 won’t offer anywhere near the graphics performance of Xbox Series X: Navi benchmarks prove it


Dory16

Banned
Downclock if old.

Sony’s claim that high clockspeeds offset the PlayStation 5’s meagre shader allocation doesn’t hold water when Navi overclocking results are factored in. Due to non-linear performance/clock scaling, the likely performance deficit between the two consoles is in the 25-30 percent range, which may have a major impact on 4K performance.

During its PlayStation 5 spec unveiling, Sony made much of the fact that the PS5’s GPU was more “agile” than the competition. The logic offered was that, because GPU clockspeeds govern more than just the shader cores, higher clocks mean higher throughput across the entire chip, which can offset a lack of shader hardware.

This was an arrow shot right across Microsoft’s bow: Redmond, days earlier, had revealed the Xbox Series X’s immensely powerful 12 TFLOPS GPU, the fastest GPU AMD has ever made. The 52 CU part features 3328 shaders and runs at 1825 MHz, broadly in line with what we’ve seen in other Navi parts like the RX 5700. It’s backed by GDDR6 memory that delivers 560 GB/s of bandwidth. Sony, in contrast, unveiled a much more conservative GPU for the PlayStation 5, with 36 CUs (the same shader count as the RX 5700) tied to 448 GB/s memory.

Sony’s part, however, operates at a much higher 2.23 GHz maximum clock speed, which allowed the company to claim that it delivers over 10 TFLOPS of compute, just fifteen percent behind the Xbox Series X. Reading between the lines, Sony’s claim that high clock speeds matter more than raw hardware resources implies that the performance gap between the PS5 and Xbox Series X might be even narrower than 15 percent.
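The arithmetic behind those headline numbers is easy to verify. Here is a minimal sketch in Python, using the standard RDNA figures of 64 shaders per CU and two FLOPs (one fused multiply-add) per shader per clock:

```python
# Theoretical FP32 throughput for each GPU, from the figures quoted above.
SHADERS_PER_CU = 64
FLOPS_PER_SHADER_PER_CLOCK = 2  # one fused multiply-add = 2 floating-point ops

def tflops(cus: int, clock_ghz: float) -> float:
    """Peak FP32 TFLOPS = CUs x 64 shaders x 2 FLOPs x clock (GHz) / 1000."""
    return cus * SHADERS_PER_CU * FLOPS_PER_SHADER_PER_CLOCK * clock_ghz / 1000

xsx = tflops(52, 1.825)  # Xbox Series X: 52 CUs at a fixed 1825 MHz
ps5 = tflops(36, 2.23)   # PS5: 36 CUs at its 2230 MHz maximum boost clock

print(f"Series X: {xsx:.2f} TFLOPS")        # ~12.15
print(f"PS5:      {ps5:.2f} TFLOPS")        # ~10.28
print(f"PS5 deficit: {1 - ps5 / xsx:.1%}")  # ~15.4%
```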

This is misleading for two key reasons. For starters, the PS5 only delivers 10 TFLOPS of notional compute power when it’s running at its maximum boost frequency. Sony themselves have asserted that clockspeeds will be pulled back depending on power draw, meaning that there will be scenarios where the PS5 delivers less. The Xbox Series X, in contrast, runs at a rock-solid 1825 MHz, subject neither to thermals nor to power draw.
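Since Sony has published a ceiling but no floor, the PS5’s compute figure is best read as an envelope rather than a constant. A quick sketch; the sustained clocks below are hypothetical, chosen purely for illustration:

```python
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000  # peak FP32; 2 FLOPs/shader/clock

print(f"Series X, fixed 1.825 GHz: {tflops(52, 1.825):.2f} TFLOPS")
for clock_ghz in (2.23, 2.10, 2.00):  # max boost plus two hypothetical pullbacks
    print(f"PS5 at {clock_ghz:.2f} GHz: {tflops(36, clock_ghz):.2f} TFLOPS")
# At a sustained 2.00 GHz the PS5 would deliver ~9.2 TFLOPS, widening the
# paper gap from ~15% to ~24%.
```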

The second reason has to do with what we already know about Navi clockspeed scaling. RDNA parts do not scale well at higher clockspeeds. Overclocking tests on the RX 5700 XT (a close analogue for the PS5’s GPU) indicate that a massive 18 percent overclock from stock up to 2.1 GHz resulted in just a 5-7 percent improvement to frame rates. This is the exact opposite of Sony’s claim, which implies better-than-linear performance scaling with clockspeeds. RDNA 2 is an iterative update to the first-gen RDNA architecture found in Navi 10 parts, which makes it very likely that the PS5 will behave similarly: upping the clocks to 2.2 GHz won’t magically offset the substantial difference in hardware allocation between the Series X and PlayStation 5 GPUs.
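Working backwards from that overclocking result gives a rough clock-scaling efficiency for Navi 10, assuming frame rate is a fair proxy for GPU throughput:

```python
# Back out a clock-scaling efficiency from the RX 5700 XT result above:
# an ~18% overclock yielded only a 5-7% frame-rate improvement.
oc_clock_gain = 0.18
for fps_gain in (0.05, 0.07):
    print(f"{fps_gain:.0%} fps gain -> efficiency {fps_gain / oc_clock_gain:.2f}")
# Roughly 0.28-0.39: each extra percent of clock bought well under half
# a percent of real performance on Navi 10.
```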

This leads to the sobering conclusion that in real-world workloads, the PS5 might be 30 percent or more slower than the Xbox Series X. We don’t expect the world’s fastest SSD or individual raindrop audio rendering to offset that.
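For what it’s worth, the 25-30 percent figure can be roughly reconstructed from the numbers above, under the untested assumption that RDNA 2 scales with clock about as poorly as Navi 10:

```python
# Rough reconstruction of the article's estimate. Assumption: RDNA 2 clock
# scaling is as weak as Navi 10's (efficiency 0.28-0.39, derived above).
shader_ratio = 3328 / 2304   # Series X has ~44% more shaders
clock_ratio = 2.23 / 1.825   # PS5 has a ~22% clock advantage

for eff in (0.28, 0.39):
    effective_clock_gain = 1 + (clock_ratio - 1) * eff
    gap = 1 - effective_clock_gain / shader_ratio
    print(f"efficiency {eff:.2f}: PS5 ~{gap:.0%} slower")
# Prints roughly 25-26%, at the bottom of the article's 25-30% range.
```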

Link: https://www.notebookcheck.net/No-th...ries-X-Navi-benchmarks-prove-it.458625.0.html
 

TGO

Hype Train conductor. Works harder than it steams.
 
The second reason has to do with what we already know about Navi clockspeed scaling. RDNA parts do not scale well at higher clockspeeds. Overclocking tests on the RX 5700 XT (a close analogue for the PS5’s GPU) indicate that a massive 18 percent overclock from stock up to 2.1 GHz resulted in just a 5-7 percent improvement to frame rates.
Using RDNA 1 to hypothesize about RDNA 2 when it isn't even out on the market is disingenuous, to say the least.

Again, both companies can claim whatever they want, but at the end of the day, Digital Foundry and the rest will prove the difference, whether it's 15% or 30%.

Downclock if old.
FranXico, OP started the thread with this quip, indicating the intent behind such a blatant clickbait thread.
 

Dory16

Banned
Don’t speak for me, my bro. I pasted the entire article in so nobody needs to click on anything. No need to put on the war paint.
 
I think the bottom line is the XSX will almost always have a slight edge in multi-plats, whether it be in resolution (native 4K vs CB 4K, or 1800p-2000p) or frame rate (locked 60 vs variable).

It probably won't be a real leap in many cases, but it will be noticeable, IMO.

What this article says about overclocking the 5700 XT, and the diminishing returns the higher the OC goes, was stated by DF when the PS5 was first covered. As far as how this will translate into actual PS5 performance, I guess that remains to be seen.

Both consoles will excel with their first-party visuals.
 

ethomaz

Banned
I think the bottom line is the XSX will almost always have a slight edge in multi-plats, whether it be in resolution (native 4K vs CB 4K, or 1800p-2000p) or frame rate (locked 60 vs variable).

It probably won't be a real leap in many cases, but it will be noticeable, IMO.

What this article says about overclocking the 5700 XT, and the diminishing returns the higher the OC goes, was stated by DF when the PS5 was first covered. As far as how this will translate into actual PS5 performance, I guess that remains to be seen.

Both consoles will excel with their first-party visuals.
It can be shocking to some, but the PS5 is not based on the same hardware as the 5700 XT.
The 5700 XT overclocking examples tell us nothing about the PS5, because 2.2 GHz could simply be the normalized clock for RDNA 2 with 40 CUs.
 
Get a life, OP.

I'm not going to even comment until Digital Foundry gets their hands on the consoles this upcoming holiday season and test the shit out of them and see how big of a difference there really is.
Are you serious right now?
You are taking fanboyism way too far. Don't shoot the messenger.
We have tweet threads (no one gives a shit); you even posted a controller thread.
 
Yeap, RDNA benchmarks are proof of what RDNA 2 can do.
People need to stop the useless comparisons.
People can compare RDNA, RDNA 2, GCN, etc. The point is to show that each of these architectures exhibits smaller gains when overclocking a chip with fewer CUs to the high end, versus running a chip with more CUs a little slower. Unless physics have been altered or there is some secret sauce involved, this trend will continue in GPU technology.
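To put rough numbers on that trade-off, here is a toy model; the sublinear exponent is an assumption loosely derived from the 5700 XT overclocking data, not a measured RDNA 2 property:

```python
# Toy model of wide-and-slow vs narrow-and-fast: performance is assumed to
# scale as CUs * clock^ALPHA, with ALPHA < 1 capturing poor clock scaling.
ALPHA = 0.35  # assumed clock-scaling exponent; 1.0 would mean linear scaling

def relative_perf(cus: int, clock_ghz: float) -> float:
    """Unitless performance score: shader width times a sublinear clock term."""
    return cus * clock_ghz ** ALPHA

wide = relative_perf(52, 1.825)    # more CUs, lower clock (Series X shape)
narrow = relative_perf(36, 2.23)   # fewer CUs, higher clock (PS5 shape)

print(f"wide / narrow: {wide / narrow:.2f}")  # ~1.35: width wins under this model
```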
 