
Is the PS5 GPU Really Underpowered? What Can We Really Expect?

Kumomeme

Member
I don't think the PS5 GPU is underpowered just because its rival is stronger. Remember, it's still only around a 16% difference.

I believe that, like last gen, where the PS4 and the Pro model (same as the Xbox One and One X) used midrange GPUs, the PS5 GPU is basically that. From Cerny's presentation, a 52 CU GPU was never an option for them; based on the comparisons he gave, I'd guess 48 was the highest CU count they pursued, but they chose a high-frequency 36 CU design instead, for various reasons.

It's just that, compared to Sony, this time MS went to greater lengths, surprising even us with a stronger GPU, that's all (which is awesome).

Or maybe they simply had to choose how to allocate the console budget: invest in a stronger GPU, or in a faster SSD?
...if this is true, we already know their choice.

Sony has tons of capable first-party studios (remember the ICE team?), so I'm not too worried. They surely took input from various devs into account, including third parties, and I especially doubt the console they built is going to hold back their reputable first-party studios' vision.

Maybe the question is the 'performance' difference: how many fps the PS5 can output vs the XSX. I guess that's the thing that's going to differentiate the consoles in the end...

I believe the goal and philosophy behind each console's design is different. We've heard loud and clear from Phil that the XSX's target is to be a 4K/60fps machine, while from Cerny's presentation, with stuff like the super-fast SSD, the I/O, the Tempest engine, etc., we can see that their 'philosophy' is different. Maybe their goal is not framerate performance like the XSX; maybe they aimed to make sure devs can achieve their artistic and gameplay vision, reduce as many bottlenecks as they can, and give them more resources like faster storage streaming. I wouldn't be surprised if tons of their first-party games end up running at 30fps like on PS4, but we can expect those games to be something special. It's the choice I speculated about above: a stronger GPU for better performance and resolution, or a faster SSD and custom audio chip that let devs 'dream further'?

And this is the baseline console; if the lowest common denominator already has this kind of spice, we can expect better resolution and framerates from the next hardware revision.
 

geordiemp

Member
Does performance scale linearly with frequency, given that 35% improvement?

We don't know for RDNA2, do we?

Posters showing overclocks on GCN / RDNA1 or whatever aren't relevant when you have a 50% perf/watt improvement and a revised architecture.

Nobody knows... YET.

Maybe PC RDNA2 will hit even higher clocks than the PS5. Let's wait and see; we have nothing to reference yet.
 

Tamy

Banned
Does performance scale linearly with frequency, given that 35% improvement?

Well, DF did a test with RDNA 1.0 cards where they showed that 10 TF from 36 compute units leads to way less performance than 10 TF from 40 compute units, so overclocking doesn't scale very well - for RDNA 1.0 cards.

We do not know how it works with RDNA 2.0 cards. We will see at the end of the year when the new cards launch; then DF will run a new test.
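To put numbers on that: paper TFLOPs are just CU count times clock, so two different configurations can land on the same figure while behaving differently in practice (a minimal sketch; the 36 CU vs 40 CU split is from DF's test, the clock values are illustrative):

Code:
# Theoretical FP32 throughput for an RDNA GPU:
# TFLOPs = CUs * 64 shaders/CU * 2 ops/clock (FMA) * clock (GHz) / 1000
def tflops(cus, clock_ghz):
    return cus * 64 * 2 * clock_ghz / 1000

# Two routes to ~10 TF, as in the DF RDNA 1.0 test:
print(tflops(36, 2.17))  # ~10.0 TF from 36 CUs at a high clock
print(tflops(40, 1.95))  # ~10.0 TF from 40 CUs at a lower clock
# The paper figures match, so any real-world gap between the two cards
# comes from things TFLOPs don't capture (e.g. memory bandwidth,
# which doesn't rise with core clock).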
 

Gamernyc78

Banned
You've only seen a picture of it :pie_tears_joy:
We don't know anything about the launch lineup, let alone the lineup for the rest of the generation.
Based on the specs, 3rd-party games will definitely look better on XSX; for 1st party we'll have to see, since Sony has amazing studios.

No need to wait and see. This is said every gen, but we know what sequels we are getting from Sony and what studios are working on them, and we know they'll look better than the competitor's first parties. It has happened that way ever since the PS3 days. God of War will be coming and it'll blow shit out the water, another Naughty Dog game will do the same no matter what it's called, the Horizon Zero Dawn sequel will melt retinas, etc. This isn't blind faith but faith built on games we've already experienced from the same developers across various generations, developers who've proven they are just on a different level. Sony first parties will look the best.
 

Tamy

Banned
No need to wait and see. This is said every gen, but we know what sequels we are getting from Sony and what studios are working on them, and we know they'll look better than the competitor's first parties. It has happened that way ever since the PS3 days. God of War will be coming and it'll blow shit out the water, another Naughty Dog game will do the same no matter what it's called, the Horizon Zero Dawn sequel will melt retinas, etc. This isn't blind faith but faith built on games we've already experienced from the same developers across various generations, developers who've proven they are just on a different level. Sony first parties will look the best.

This isn't how it works though. Things might change. We don't know yet; just look at The Last of Us Part II, for example: indefinitely delayed. Who knows what happens?!

Also, did you see the PS4 launch lineup back then?

These are the PS4 exclusive games from Sony back then:

Sony Computer Entertainment Titles:

Killzone Shadow Fall: 73% at Metacritic
Knack: 54% at Metacritic


at launch.


Later on, so not even at launch, we had more exclusive games like:

Driveclub 71% at Metacritic
The Order: 1886: 63% at Metacritic


So, it will take years until we see all those games you are talking about! I don't think we will see Horizon 2 or God of War at launch! At least, Sony did not announce anything like that for the launch.

So, let's wait and see.
 

Gamernyc78

Banned
This isn't how it works though. Things might change. We don't know yet; just look at The Last of Us Part II, for example: indefinitely delayed. Who knows what happens?!

Also, did you see the PS4 launch lineup back then?

These are the PS4 exclusive games from Sony back then:

Sony Computer Entertainment Titles:

Killzone Shadow Fall: 73% at Metacritic
Knack: 54% at Metacritic


at launch.


Later on, so not even at launch, we had more exclusive games like:

Driveclub 71% at Metacritic
The Order: 1886: 63% at Metacritic


So, it will take years until we see all those games you are talking about! I don't think we will see Horizon 2 or God of War at launch! At least, Sony did not announce anything like that for the launch.

So, let's wait and see.

Launch lineups weren't on fire on any of the consoles. Did you just mention Driveclub? The racing game that had issues with online but stillllll looked better than other first parties for several years after? Not every game will be great, but overall they will be, that we know, and they will outdo their direct competitor. My point still stands.

This is how it works. Sony already has specific first-party devs that overall, continuously, each gen shit gold and GOTY contenders. Doesn't matter if it's year one or year four, we all know how gaming goes. Moving from generation to generation, that doesn't change. Naughty Dog is still there, Sony Santa Monica, Guerrilla, etc... Microsoft doesn't have this and hasn't had several first-party studios churning out GOTY contenders year after year since idk when.

My comment still stands. PS exclusives will still look better given history, will be better overall meta-wise, and will mature and get better as the generation continues. Data from past generations makes this argument cogent. And I don't know if you remember, but Second Son and Killzone Shadow Fall still looked better than almost every exclusive on Xbox One. Shit, Second Son still looks better than games released recently like Crackdown 3 and State of Decay.

When all is said and done, Sony will have the better portfolio of games consensus-wise, quantity-wise, and new-IP-wise, and the first parties with the better graphics.
 

Mobilemofo

Member
I have been told that God of War on PS4 looks like a blurry mess, so maybe the next one will be a bit sharper because of the extra TFs.

Time for the true cinematic frame rate: 24fps locked!

Seriously, I expect most games to run at 60fps on these machines; the graphical benefits of going 30 will not be as obvious this time around.
Whoever told you God of War on the PS4 is a blurry mess is an idiot. And you're an idiot for believing it.
 

Gamernyc78

Banned
Whoever told you God of War on the PS4 is a blurry mess is an idiot. And you're an idiot for believing it.

God of War is a blurry mess 🤦‍♂️??? Lol, I had already put that person on block because I knew what frivolous lies were coming. That's a new one, God of War is blurry, huh.

People can't just deal with facts but have to put out fanboy-tainted responses. Sony games having bigger metas, several GOTY winners and contenders every gen, pushing graphical benchmarks, etc. are all facts, but some people would rather deal with made-up fanboy metrics.

Leave him, he won't listen.
 

rnlval

Member
  1. I think it's a foregone conclusion that the RDNA 2 PC GPUs coming this year will indeed be clocked above the PS5's clocks (rumblings suggest well above 2.5GHz). AMD has suggested as much, and the PS5 just shows what's possible even in a closed box with limited power.
  2. People still talk about the PS5's "variable clocks" like it's something so foreign. In the PC space, both the CPU and GPU operate with varying clock speeds during execution. This is just how it works, though it is a new concept in a console. However, based on what Cerny has said, the PS5's clocks will likely be less variable than their PC counterparts. The typical operating frequency will be at the caps of 3.5GHz and 2.23GHz for the CPU and GPU. This has the benefit of saving a ton of power when the workload does not demand full frequency, while keeping the consistent experience expected from a console.
  3. I pointed out that the diminishing returns on clock speed increases on PC are largely due to external factors like power gating and board design. Since a console is designed from the ground up, they can design around such limitations when building the box. There is of course still a curve when it comes to clock speed gains that will hit a wall at some point. But if Sony designs the PS5 around the 2.23GHz value, it can ensure that the box has everything around it to maximize performance at that frequency.
It's interesting that nobody talks about how the RTX 2080 Ti is not really a 13 TFLOP card, because it rarely ever runs at its max frequency :messenger_smirking:. In fact, most PC GPUs never come close to reaching their theoretical max performance when running workloads. That's one of the fundamental differences between the PC platform and a console. Developers can control every aspect of execution on a console, so you can maximize the efficiency of the hardware. It's all about efficiency, guys. That 13 TFLOP 2080 Ti may only be reaching a max throughput of 6 TFLOPs even in the best case (i.e. the most demanding games). This is especially true when you realize that the games it is running are largely designed for MUCH lower hardware specs (the 1.8 TFLOP PS4, for example).

This is also why we consistently see console games that seem to punch above their weight and do things thought not possible on such low-end hardware. We are implicitly comparing that to PC standards (i.e. what a 1.8 TFLOP GPU can do on PC), which is wrong, since it's entirely dependent on the software it runs, and software designed for a console is constructed differently than software on a PC in many ways. A game like God of War is able to maximize those 8 Jaguar cores and that 1.8 TFLOP GPU to a degree that PC software generally doesn't come close to.

[image: clock vs. voltage curve]


The RTX 2080 Ti has a 1824 MHz average clock speed, hence 15.876 TFLOPS on average.

The RTX 2080 Ti is not a 13 TFLOPS card. LOL
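For anyone checking that math (a quick sketch; the 4352 CUDA-core count is the 2080 Ti's public spec, and the 1824 MHz average is taken from the chart above):

Code:
# FP32 TFLOPS = cores * 2 ops/clock (FMA) * clock (GHz) / 1000
cores = 4352            # RTX 2080 Ti CUDA cores (public spec)
avg_clock_ghz = 1.824   # average observed clock, per the chart above
print(cores * 2 * avg_clock_ghz / 1000)  # -> 15.876 TFLOPS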
 
Great write-up.

I know it would end up badly, but OP should do the same analysis for the Series X, to give a broader understanding. The gap between the two is definitely there, and it will show in some third-party titles, but these machines are going to be really, really capable.

That is, once devs stop building for last-gen hardware.
 

pawel86ck

Banned
This isn't how it works though. Things might change. We don't know yet; just look at The Last of Us Part II, for example: indefinitely delayed. Who knows what happens?!

Also, did you see the PS4 launch lineup back then?

These are the PS4 exclusive games from Sony back then:

Sony Computer Entertainment Titles:

Killzone Shadow Fall: 73% at Metacritic
Knack: 54% at Metacritic


at launch.


Later on, so not even at launch, we had more exclusive games like:

Driveclub 71% at Metacritic
The Order: 1886: 63% at Metacritic


So, it will take years until we see all those games you are talking about! I don't think we will see Horizon 2 or God of War at launch! At least, Sony did not announce anything like that for the launch.

So, let's wait and see.
The Order only got 63% at Metacritic? Why? Superb graphics, great atmosphere, and fun shooting mechanics. I hope we see a sequel to The Order on PS5.
 

Neo_game

Member
The difference between the two consoles:
- Lower Resolution
- Less FPS
- Less Ray Tracing
- Less details
That's about it, most won't even care, just buy what you like.

Let's not forget pricing: more RAM means a higher price. I never believed that 24GB RAM speculation.

Not sure if serious? The only difference I see is some 20% fewer pixels for the PS5.
 

Journey

Banned
The difference will be negligible
PS4 and Xbox One - 40%
PS5 and XSX - 15-20%

I don't understand where the 15% is coming from when the XSX teraflop number is constant. Even in the best-case scenario, the PS5 will never shrink the difference down to just 15%, because the XSX will never have to be clocked down to 11.8 teraflops (10.28 + 15% = 11.8).

Also

-Xbox Series X has 25% more VRAM bandwidth (bandwidth is a huge factor)
-Xbox Series X has 45% more CUs (RDNA 1 tests show that more CUs beat frequency with just 4 more added; imagine 16 more)
-Xbox Series X has Variable Rate Shading, which has shown significant performance improvements on PC; on console it will do even more
-Xbox Series X has DirectML (Direct Machine Learning), which will improve visual fidelity like DLSS (Deep Learning Super Sampling) does
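For what it's worth, the 15% and 18% figures fall out of the same two public TF numbers; they just use different baselines (simple arithmetic, nothing more):

Code:
xsx_tf = 12.155  # Xbox Series X, public figure (52 CUs @ 1.825 GHz)
ps5_tf = 10.28   # PS5 peak, public figure (36 CUs @ 2.23 GHz)

print((xsx_tf / ps5_tf - 1) * 100)  # ~18.2 -> XSX is ~18% faster than PS5
print((1 - ps5_tf / xsx_tf) * 100)  # ~15.4 -> PS5 is ~15% slower than XSX
# "15%" treats the XSX as the baseline; "18%" treats the PS5 as it.
# Same gap, two framings.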
 
I don't understand where the 15% is coming from when the XSX teraflop number is constant. Even in the best-case scenario, the PS5 will never shrink the difference down to just 15%, because the XSX will never have to be clocked down to 11.8 teraflops (10.28 + 15% = 11.8).

Also

-Xbox Series X has 25% more VRAM bandwidth (bandwidth is a huge factor)
-Xbox Series X has 45% more CUs (RDNA 1 tests show that more CUs beat frequency with just 4 more added; imagine 16 more)
-Xbox Series X has Variable Rate Shading, which has shown significant performance improvements on PC; on console it will do even more
-Xbox Series X has DirectML (Direct Machine Learning), which will improve visual fidelity like DLSS (Deep Learning Super Sampling) does
While that is all true, we don't know yet whether the PS5 will have VRS and ML or not.
 

Caio

Member
PS5 is a very powerful and competitive piece of hardware, and in the hands of great developers like the ones Sony has, it will do miracles. It's not that the PS5 is underpowered as a next-gen console; it's that the XSX has exceeded all possible expectations, and everybody tends to compare PS5 vs XSX as an obvious consequence when consoles are revealed and next gen is coming.

From what I have understood, and give me a break if I'm wrong, PS5 should be much, much more powerful than the base PS4: the PS5 CPU runs circles around the Jaguar CPU, and the GPU, at 10.28 TF peak RDNA2, also easily outperforms the PS4 GPU, by a factor of 8x in performance. Add the ultra-fast SSD, Tempest audio, a very efficient architecture, and a much more powerful CPU which is even freed from audio tasks, as the Tempest engine will handle those in full glory.

So, is PS5 a disappointment? Hell no; some people were just expecting too much, and the hype and gossip suggesting 13.3 TF... that was really too much, but many people believed it. IMO PS5 is a very good console; it's simply that MS has created a GOD console, and I still can't believe what Microsoft did with all the hardware, specs, brutal power, optimization, the Velocity Architecture, and software optimization. It will be an amazing next gen, and I cannot wait ;D
 

Later on, so not even at launch, we had more exclusive games like:

Gee I wonder why?

Infamous Second Son - 80% Metacritic (launch window)
Resogun - 84% Metacritic (An actual launch game)

But leave those two out and mention titles 11-15 months after launch.

If you're gonna play list wars, try not to be so transparent.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
I don't understand where the 15% is coming from when the XSX teraflop number is constant. Even in the best-case scenario, the PS5 will never shrink the difference down to just 15%, because the XSX will never have to be clocked down to 11.8 teraflops (10.28 + 15% = 11.8).

Also

-Xbox Series X has 25% more VRAM bandwidth (bandwidth is a huge factor)
-Xbox Series X has 45% more CUs (RDNA 1 tests show that more CUs beat frequency with just 4 more added; imagine 16 more)
-Xbox Series X has Variable Rate Shading, which has shown significant performance improvements on PC; on console it will do even more
-Xbox Series X has DirectML (Direct Machine Learning), which will improve visual fidelity like DLSS (Deep Learning Super Sampling) does
XSX has a split RAM pool, and PS5's SSD cache streaming does the same thing VRS is doing, but likely better.
 

jaytyla

Neo Member
How many people here believe that Sony underdelivered on the PS5's graphical capabilities, based on the 10.3 TFLOP metric? Do you subscribe to the notion that the PS5's GPU is nothing more than an RX 5700/XT card, or that it's only on par with today's midrange GPUs?

This post is about providing real data and educated estimations to dispel the notion that the PS5 GPU is only a "midrange" GPU that is not on par with today's top-tier commercial GPUs. Looking at the TFLOP number in isolation is indeed very misleading, and the truth about the actual performance of the GPU paints a very different picture. I know many of you don't know me, but I can say that I am not just pulling this info from nowhere. I have over 15 years of experience working in gaming and have spent nearly 5 years of my career doing critical analysis of GPU performance. Take it for what it is.

Before I begin, a full disclaimer: this post is not a comparison with or commentary on the Xbox Series X. No console fanboyism here please. The fact is, the Xbox Series X has a bigger GPU with more theoretical horsepower. Period. Nobody is refuting that, so no Xbox defense force, please.

Like many, I too was initially somewhat disappointed when I first heard the PS5 specs, mainly because there was so much information beforehand that pointed to more performance being a possibility. We'd all been hearing about the 12-14 TFLOP monster that Sony was building, though honestly it's not the raw numbers that matter. I was more excited about the idea that both consoles would come out really close in power, which benefits gamers by establishing a high baseline where neither machine gets subpar 3rd-party releases. But after taking some time to process the specs and information Sony released, as well as doing some in-depth analysis, I am pretty happy with what Sony ended up with from a GPU standpoint.

Let me be clear: the goal of what I'm presenting here is not to define an absolute performance metric for the PS5 with a given piece of content. In other words, I am not trying to predict that the PS5 can run game X at Y fps specifically. That is impossible, since there are so many variables affecting overall performance that we do not know about: CPU, memory, driver, other console-specific optimizations, etc. Instead, what I am doing is establishing a realistic expectation of a performance baseline for the GPU specifically, by looking at known real-world performance data from comparable hardware.

How am I doing this? Let me break it down:
  1. Let's establish a comparison mapping to other known GPUs, based on their architectures and theoretical compute power:
    • We know that AMD's RDNA architecture delivers a general 25% increase in performance per clock compared to GCN -> 1 TFLOP (RDNA) = 1.25 TFLOP (GCN)
    • We know that RDNA 2 will be even more efficient than RDNA (i.e. perf per clock and per watt will be better). Now we can guess how much more efficient based on some actual hints from Sony and AMD:
      • Mark Cerny himself, during the PS5 tech dive, revealed that each CU in the PS5 GPU is roughly 62% larger than a PS4 CU. Thus, there is the equivalent of 58 PS4 CUs in the PS5: 36 CU (PS5) = 58 CU (PS4). Now, 58 CUs running at the PS5's 2.23 GHz frequency => ~16.55 TFLOP (GCN). So what is the conversion factor to get from 10.28 TFLOP (RDNA 2) to 16.55 TFLOP (GCN)? Working backwards, the additional perf per clock implied by that ratio is roughly 17%. So by this data: 1 TFLOP (RDNA 2) = 1.17 TFLOP (RDNA 1)
      • AMD has already said that they are pushing to deliver a similar improvement with RDNA 2 over RDNA 1 as we saw from GCN to RDNA 1. They have also confirmed that RDNA 2 will see a 50% improvement in perf/watt over RDNA 1. GCN to RDNA 1 saw a 50% perf/watt and a 25% perf/clock increase. A further 25% perf/clock increase in RDNA 2 sounds pretty ambitious, so I will be more conservative, but we can use it as an upper bound.
      • AMD has talked about mirroring their GPU progression on their CPU progression. They have specifically talked about increasing CPU IPC by roughly 15% every 12-18 months, and the 10-15% range is typical of past GPU generational transitions.
    • Using the 25% ratio of RDNA to GCN and a 15% ratio of RDNA 2 to RDNA 1, we can calculate the equivalent amount of theoretical performance (i.e. TFLOPs) for the PS5 GPU in terms of both RDNA performance and GCN performance:
PS5 TFLOP = 10.28
PS5 TFLOP (RDNA 1) = 12.09 (used to compare against RX 5700 and RX 5700 XT)
PS5 TFLOP (GCN) = 16.13 (used to compare against Radeon VII, PS4)
2. We can also note that it is actually easier to guesstimate the PS5 GPU performance since there is a GPU on the market very similar to it: the RX 5700. The GPU config in terms of CU count, number of shader cores, memory bus size, memory bandwidth, etc. is an exact match for the PS5. At a high level, the PS5 is simply an extremely overclocked RX 5700 in terms of hardware. Now, typically on PC, overclocking a GPU gives limited returns due to power issues and system design limitations that will not exist in a console. So if we note that the PS5's typical GPU clock of 2.23 GHz is indeed ~34% higher than the RX 5700's typical GPU clock of 1.67 GHz, we can extrapolate PS5 performance as being roughly 35% higher than that of an RX 5700. However, that raw translation does not account for RDNA 2's additional efficiencies, so if we add the 15% uplift in efficiency, we get a pretty good idea of the PS5 GPU performance. It turns out that this projected value is pretty much identical to the TFLOP conversion factors I computed above :messenger_winking:
3. Now that we have a quantitative comparison point, we can calculate a projected PS5 performance target based on theoretical performance from comparable GPUs. For example, RX 5700 XT = 9.7 TFLOPs (RDNA 1) and PS5 = 12.09 TFLOP (RDNA 1). That puts the PS5's projected performance at ~25% higher than an RX 5700 XT. Using these calculations for other GPUs as reference points, we get the following (a short script reproducing these conversions appears after the results table below):

PS5 vs Xbox Series X = -15% (PS5 is 15% slower)
PS5 vs RX 5700 = 153% (PS5 is 53% faster)
PS5 vs RX 5700 XT = 125% (PS5 is 25% faster)
PS5 vs Radeon VII = 120% (PS5 is 20% faster)
PS5 vs PS4 = 8.76x (PS5 is nearly 9x faster)
4. Finally, now that we have a performance factor for some common GPUs across various AMD architectures, we can see where projected PS5 performance ranks against the fastest cards on the market, including Nvidia cards. I've looked at several industry aggregate sites such as Eurogamer, TechPowerUp, and GPUCheck (numerous games tested) as well as a couple of high-profile games such as DOOM Eternal, Call of Duty: Modern Warfare, and Red Dead Redemption 2 to see where PS5 performance will fall. I've done this analysis across numerous performance metrics, resolutions, and the different GPU references defined above to check that the data was consistent. The goal here was to identify which GPU currently on the market comes closest in performance to a projected PS5. I've highlighted the 4K rows since 4K is the target resolution for the PS5. The summary table shows which GPUs came closest to the projected PS5 performance at different resolutions. The raw results are below:

[image: raw benchmark results and summary table]

**Note: Game performance was captured from TechPowerUp benchmark analysis using max settings at all resolutions.
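As promised above, here is a short script reproducing the conversions from points 1-3 (a sketch of the post's own arithmetic; note that the stated equivalents correspond to dividing by (1 - uplift) rather than multiplying by (1 + uplift), which is why 10.28 becomes 12.09 rather than 11.82):

Code:
ps5_tf_rdna2 = 10.28                # 36 CUs * 128 ops/clock * 2.23 GHz

# Assumed per-clock uplifts: RDNA 2 ~15% over RDNA 1, RDNA 1 25% over GCN.
ps5_tf_rdna1 = ps5_tf_rdna2 / 0.85  # -> 12.09, as stated above
ps5_tf_gcn   = ps5_tf_rdna1 / 0.75  # -> 16.13, as stated above

# Reference cards, theoretical FP32 from public specs:
rx_5700    = 7.95   # TF (RDNA 1)
rx_5700_xt = 9.7    # TF (RDNA 1), the figure used in the post
radeon_vii = 13.44  # TF (GCN)
ps4        = 1.84   # TF (GCN)

print(ps5_tf_rdna1 / rx_5700)     # ~1.52  -> "53% faster"
print(ps5_tf_rdna1 / rx_5700_xt)  # ~1.25  -> "25% faster"
print(ps5_tf_gcn / radeon_vii)    # ~1.20  -> "20% faster"
print(ps5_tf_gcn / ps4)           # ~8.76x -> "nearly 9x faster"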

Key Takeaways:
  1. The general takeaway is that in most cases at higher resolutions, PS5 performance is actually slightly higher than that of the RTX 2080 Super.
  2. Note that the 1080p values are a bit misleading, since some games are CPU-bound at that resolution. Thus, most GPUs exhibit lower perf, which is why the RTX 2080 Ti was the closest at 1080p.
  3. These numbers do not take into account other factors that can improve PS5 GPU performance even further, such as GPU-specific optimizations, console-specific optimizations, a lower-level driver compared to PC, I/O throughput improvements in the PS5, the memory subsystem, etc.
  4. This analysis is just a rough estimate and, again, is not to be taken literally in terms of actual in-game performance. There are still a ton of variables and unknown factors. But it does account for known information to give a good relative performance baseline and set expectations for how much performance the PS5 GPU may possess. The answer is that it is definitely not "just an RX 5700 XT" and will likely have more performance than a 2070 Super.
  5. My analysis went well beyond these websites, game titles, and reference GPUs. I presented the highlights, but the overall takeaway from the additional data is the same: performance is most in line with an RTX 2080 Super.
So, is the PS5 GPU underpowered? The data shows that actual game performance is roughly around an RTX 2080 Super at a minimum in most cases, which is currently the 2nd fastest commercially available GPU on the market! Anyone who calls that underpowered or "midrange" is... not very knowledgeable on this topic. Yes, by this same analysis the Xbox Series X would be matching or exceeding an RTX 2080 Ti, which is amazing! The point here is that both consoles will have plenty of graphical horsepower, and the PS5 is still a significant step up from anything AMD has released to date and a generational leap over the PS4!

Everyone should be excited, but please stop spreading FUD about PS5 performance :messenger_winking:

Can you do an analysis of the Xbox Series X, to see if it's on par with a 2080 Ti?
 

MH3M3D

Member
Just because the Xbox Series X is powerful doesn't mean the PS5 is underpowered. Considering what it will probably cost, it's a steal compared to similar specs on a PC.
Console gamers need to count their blessings and choose whichever machine runs their favorite games.
 

Journey

Banned
XSX has a split RAM pool, and PS5's SSD cache streaming does the same thing VRS is doing, but likely better.


10GB dedicated for VRAM at 560GB/s bandwidth.

Why would it matter that it has split memory? It only matters if 10GB is not enough for VRAM and it has to dig into the slower pool, but that slower pool will be for the OS (2.5GB reserved) and 3.5GB for sound and CPU functions.

The most demanding games running at 4K on high-end PCs today (2080 Ti) are taking up around 3-4GB of VRAM with all settings on ULTRA, so you can see that even years down the road, when this number triples, there's still room before they fill these 10GB of full-speed bandwidth. The PS5 is limited to 448GB/s from day ONE.

And NO, the SSD is for fast loading and instant level loading; it will NOT replace GDDR6 functions, that's just crazy talk.
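To illustrate why the split pool only bites when a game's hot GPU data exceeds 10GB (a rough sketch; the 336GB/s figure for the slower 6GB pool is from Microsoft's published XSX specs, and this naive model ignores real-world contention):

Code:
# XSX memory: 10GB @ 560 GB/s plus 6GB @ 336 GB/s (public specs).
# Naive effective bandwidth if some fraction of GPU traffic
# spills into the slow pool:
def effective_bw(slow_fraction):
    return (1 - slow_fraction) * 560 + slow_fraction * 336

print(effective_bw(0.0))   # 560.0 -> everything fits in the fast 10GB
print(effective_bw(0.25))  # 504.0 -> a quarter of accesses hit the slow pool
# The PS5's single pool is a flat 448 GB/s regardless of footprint.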
 

Hobbygaming

has been asked to post in 'Grounded' mode.
10GB dedicated for VRAM at 560GB/s bandwidth.

Why would it matter that it has split memory? It only matters if 10GB is not enough for VRAM and it has to dig into the slower pool, but that slower pool will be for the OS (2.5GB reserved) and 3.5GB for sound and CPU functions.

The most demanding games running at 4K on high-end PCs today (2080 Ti) are taking up around 3-4GB of VRAM with all settings on ULTRA, so you can see that even years down the road, when this number triples, there's still room before they fill these 10GB of full-speed bandwidth. The PS5 is limited to 448GB/s from day ONE.

And NO, the SSD is for fast loading and instant level loading; it will NOT replace GDDR6 functions, that's just crazy talk.
The 560GB/s is only in ideal situations, and if developers are already complaining about its setup, then it's not unified.

Also, like I said, it's doing the same thing VRS does.

 

Sosokrates

Report me if I continue to console war
The 560GB/s is only in ideal situations, and if developers are already complaining about its setup, then it's not unified.

Also, like I said, it's doing the same thing VRS does.


I question what the SSD can be used for besides textures. I mean, if the SSD can be used for loads of stuff, why bother with so much GDDR6? Why not use 8GB or even 4GB?
 

Sosokrates

Report me if I continue to console war
The difference will be negligible
PS4 and Xbox One - 40%
PS5 and XSX - 15-20%

Depends how much the RAM bandwidth of the XSX comes into play. While the PS4 had a sizable GPU advantage and did benefit from having a single fast pool of RAM, the effective speed of the X1's RAM was about 150GB/s.
 

TLZ

Banned
Well, DF did a test with RDNA 1.0 cards where they showed that 10 TF from 36 compute units leads to way less performance than 10 TF from 40 compute units, so overclocking doesn't scale very well - for RDNA 1.0 cards.

We do not know how it works with RDNA 2.0 cards. We will see at the end of the year when the new cards launch; then DF will run a new test.
I don't know why Richard did that. I know we don't have RDNA 2 cards, but RDNA 1 card testing is completely irrelevant, especially given the customizations in both consoles.
 

S0ULZB0URNE

Member
The PlayStation 5 GPU is not underpowered; the PlayStation 5 GPU is extremely overclocked. Sony should have sold the PlayStation 5 as an 8-9.2 TF console, which the PlayStation 5 probably was meant to be in the first place, and which would still be great. But now they probably have heat issues and a variable GPU clock instead, which can't be optimal ;)
Dumb fuks will be dumb fuks
 

Journey

Banned
The 560GB/s is only in ideal situations, and if developers are already complaining about its setup, then it's not unified.

Also, like I said, it's doing the same thing VRS does.


A console is, by nature, an ideal situation. When would a situation NOT be ideal? When VRAM use exceeds 10GB? With The Witcher 3 running at 4K with everything set to Ultra and not using more than 3GB of VRAM, it's clear that 10GB at 560GB/s is enough, even IF they decide to use just that pool and nothing else.
 

S0ULZB0URNE

Member
A console is, by nature, an ideal situation. When would a situation NOT be ideal? When VRAM use exceeds 10GB? With The Witcher 3 running at 4K with everything set to Ultra and not using more than 3GB of VRAM, it's clear that 10GB at 560GB/s is enough, even IF they decide to use just that pool and nothing else.
Consoles are different and shouldn't be compared.
 

DForce

NaughtyDog Defense Force
The same idiots that think the XSX SSD is "underpowered" compared to the PS5's.

Going by the numbers, it is in comparison. You're talking about a -56% / +129% gap. That's much greater than the CPU, GPU, and memory gaps. But people such as yourself think it's not that much better, while the numbers show otherwise.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Depends how much the RAM bandwidth of the XSX comes into play. While the PS4 had a sizable GPU advantage and did benefit from having a single fast pool of RAM, the effective speed of the X1's RAM was about 150GB/s.
The XB1's real bandwidth was 102GB/s. They tried to say it was over 200 at one point, but everyone figured out that you couldn't reach those speeds for reads and writes.
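For context, the "over 200" marketing number came from counting the ESRAM's read and write paths simultaneously (a sketch using the figure cited above):

Code:
esram_one_way = 102  # GB/s, the XB1 ESRAM figure cited above
# Marketing assumed perfectly overlapped reads and writes every cycle:
print(esram_one_way * 2)  # -> 204, the "over 200 GB/s" claim,
                          # which real workloads never sustained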
 

psorcerer

Banned
The RTX 2080 Ti has a 1824 MHz average clock speed, hence 15.876 TFLOPS on average.

Yep. It's not even 15TF; it's double that if you factor in the INT units, and even triple that if tensor cores and DLSS are enabled.
But its performance in actual games is pretty mediocre for that amount of processing power.
 

yurinka

Member
They made stuff like Uncharted 4, God of War, Dreams, Final Fantasy VII Remake, Street Fighter V, Bloodborne, The Last Guardian, Spider-Man, TLOU 2, Driveclub, Until Dawn, Detroit, Wipeout Omega Collection, and The Order on a PS4.

I don't care if PS5 will have more teraflops, pentaports, or fart bandwidth. They blew me away with these and other PS4 games; I loved them. So I'll trust them. I know the PS5 will be more powerful and immersive than the PS4, so they will make even more amazing games. Which is fine, I'll enjoy them.
 

dxdt

Member
XSX has a split RAM pool, and PS5's SSD cache streaming does the same thing VRS is doing, but likely better.
Microsoft has the same design intention: having the right LOD at the right time, and only when it's needed. That's the Xbox Velocity Architecture (fancy marketing name). The idea is to have 100GB of textures available just in time, when needed. No more wasting RAM resources. You can have a vast landscape with the highest-quality LOD you want, but the GPU still needs enough TF and/or bandwidth to render it.
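A toy sketch of that just-in-time idea (purely illustrative; the function names and numbers here are hypothetical, not the actual Velocity Architecture API):

Code:
# Keep only the mip level the camera can actually resolve in RAM,
# and page everything else from SSD on demand.
def required_mip(distance):
    # Farther objects get by with smaller (higher-index) mips.
    mip = 0
    while distance > 10.0 and mip < 12:
        distance /= 2.0
        mip += 1
    return mip

def stream_texture(texture_id, distance, resident):
    mip = required_mip(distance)
    if resident.get(texture_id) != mip:
        # A real system would issue an async SSD read here;
        # this sketch just records which mip is resident.
        resident[texture_id] = mip

resident = {}
stream_texture("cliff_albedo", 5.0, resident)    # nearby -> full-res mip 0
stream_texture("cliff_albedo", 400.0, resident)  # far away -> small mip only
print(resident)                                  # {'cliff_albedo': 5}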
 

VFXVeteran

Banned
It's interesting that nobody talks about how the RTX 2080 Ti is not really a 13 TFLOP card, because it rarely ever runs at its max frequency :messenger_smirking:. In fact, most PC GPUs never come close to reaching their theoretical max performance when running workloads. That's one of the fundamental differences between the PC platform and a console. Developers can control every aspect of execution on a console, so you can maximize the efficiency of the hardware. It's all about efficiency, guys. That 13 TFLOP 2080 Ti may only be reaching a max throughput of 6 TFLOPs even in the best case (i.e. the most demanding games). This is especially true when you realize that the games it is running are largely designed for MUCH lower hardware specs (the 1.8 TFLOP PS4, for example).

Can you give several examples of GPU stats showing a 2080 Ti only using 50% of its workload? And while you're at it, can you show a benchmark on a console that shows GPU workloads of 99%? Because that's the only way your crazy paragraph can be factually correct.

This is also why we consistently see console games that seem to punch above their weight and do things thought not possible on such low-end hardware. We are implicitly comparing that to PC standards (i.e. what a 1.8 TFLOP GPU can do on PC), which is wrong, since it's entirely dependent on the software it runs, and software designed for a console is constructed differently than software on a PC in many ways. A game like God of War is able to maximize those 8 Jaguar cores and that 1.8 TFLOP GPU to a degree that PC software generally doesn't come close to.

I would love to know what tech in GoW makes it excel over a 3rd-party game's tech on PC @ max details. Because it's impossible for any PS4 to run the majority of 3rd-party games at a PC's Ultra settings.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Microsoft has the same design intention of having the right LOD at the right time and only when it's needed. That's Xbox Velocity Architecture (fancy marketing name). The idea is to have 100GB of texture that's available just in the time it's needed. No more wasting of RAM resources. You can have a vast landscape of the highest quality LOD you want but the GPU still need enough TF and/or BW to render it.
We'll see whose 1st party gets the better results :)
 

Journey

Banned
Right, they're closed platforms
I don't know why Richard did that. I know we don't have RDNA 2 cards, but RDNA 1 card testing is completely irrelevant, especially given the customizations in both consoles.

Completely irrelevant? Hyperbole much? This isn't GCN vs RDNA, which have completely different setups. This is RDNA vs RDNA; it's the same architecture, with version 2 adding a few more features to the set.

If you're expecting the outcome to do a complete 180 in terms of scaling positively or negatively with CU count, then you'll be sorely disappointed. It might actually scale even more with CUs, given how each CU will be doing more in RDNA2.
 

TLZ

Banned
Right, they're closed platforms


Completely irrelevant? Hyperbole much? This isn't GCN vs RDNA, which have completely different setups. This is RDNA vs RDNA; it's the same architecture, with version 2 adding a few more features to the set.

If you're expecting the outcome to do a complete 180 in terms of scaling positively or negatively with CU count, then you'll be sorely disappointed. It might actually scale even more with CUs, given how each CU will be doing more in RDNA2.
It's still irrelevant given that much more efficiency, in addition to the customizations in the PS5 and XSX.
 

farmerboy

Member
We can all argue till the cows come home; it won't count for shit until we see a game.

After being told that the two consoles were the same (and thinking that this may be a good thing), it's refreshing to see some clear differences between the two.
 

Journey

Banned
It's still irrelevant given that much more efficiency, in addition to the customizations in the PS5 and XSX.


The point is that if RDNA 2 scales even more with CUs, then it stands to reason that the XSX's larger CU count will favor it. The customization is not related to the way the CUs perform, but rather to whether it has features like Variable Rate Shading, something that's exclusive to and patented by MS.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
The point is that if RDNA 2 scales even more with CUs, then it stands to reason that the XSX's larger CU count will favor it. The customization is not related to the way the CUs perform, but rather to whether it has features like Variable Rate Shading, something that's exclusive to and patented by MS.
AMD has a patent for VRS; it isn't exclusive to MS. Only the DX12 VRS extension is.
 