
Rumor: PS5 Pro Codenamed "Trinity" targeting Late 2024 (some alleged specs leaked)

Would you upgrade from your current PS5?

  • For sure

    Votes: 377 41.0%
  • Probably

    Votes: 131 14.2%
  • Maybe

    Votes: 127 13.8%
  • Unlikely

    Votes: 140 15.2%
  • Not a chance

    Votes: 145 15.8%

  • Total voters
    920

ChiefDada

Gold Member
I will do you one better. Let's take the 2080ti.

2080ti has more VRAM and compute than 2070S. That's not "doing one better", that's proving my point that 2070S could never run at base PS5 fidelity mode.


Can we agree that the GPU+ CPU is more powerful than the PS5? Ok good.

Powerful in terms of compute, sure. But at these settings, the 2080ti's 11GB VRAM buffer is still bottlenecked, no question. The main reason the PS5 tends to punch above its weight is its memory efficiency. I keep telling people memory is important and compute potential is useless without enough of it. And furthermore, these benchmarks are using higher-than-PS5 settings: PC RT includes RT shadows and RTAO as well, and all presets are at Ultra per the chart, as you can see.
 

Mr.Phoenix

Member
2080ti has more VRAM and compute than 2070S. That's not "doing one better", that's proving my point that 2070S could never run at base PS5 fidelity mode.
But then you are missing my point... My point was to show you what kind of hardware you need to just hit 1440p@60fps with RT, and to point out that a 2070 and even a 2080ti can't do it. And the reason I used the 2070 initially is that it's generally agreed to be around the PS5 spec.

But to emphasize that point, I have taken an even more powerful GPU and CPU to prove it... and yet you are making this about the chosen GPU and trying to avoid the actual point.


Powerful in terms of compute, sure. But at these settings, the 2080ti's 11GB VRAM buffer is still bottlenecked, no question. The main reason the PS5 tends to punch above its weight is its memory efficiency. I keep telling people memory is important and compute potential is useless without enough of it. And furthermore, these benchmarks are using higher-than-PS5 settings: PC RT includes RT shadows and RTAO as well, and all presets are at Ultra per the chart, as you can see.
Are you kidding me? Like, are you serious... So you are saying that an 11GB, 515GB/s GPU paired with 32GB of RAM is still bottlenecked, but suggesting that the PS5, with at best 13.5GB of 480GB/s RAM shared between CPU and GPU, is 'punching above its weight'?????

You do realize what kinda settings the PS5 is running right?


You know what nevermind... It just occurred to me that you are more interested in 'winning' an argument as opposed to actually learning something or admitting you are wrong.

It's all good, I bow out from this. Guess we will see what happens with the Pro.
 
Last edited:

S0ULZB0URNE

Member
But then you are missing my point... My point was to show you what kind of hardware you need to just hit 1440p@60fps with RT, and to point out that a 2070 and even a 2080ti can't do it. And the reason I used the 2070 initially is that it's generally agreed to be around the PS5 spec.

But to emphasize that point, I have taken an even more powerful GPU and CPU to prove it... and yet you are making this about the chosen GPU and trying to avoid the actual point.



Are you kidding me? Like, are you serious... So you are saying that an 11GB, 515GB/s GPU paired with 32GB of RAM is still bottlenecked, but suggesting that the PS5, with at best 13.5GB of 480GB/s RAM shared between CPU and GPU, is 'punching above its weight'?????

You do realize what kinda settings the PS5 is running right?


You know what nevermind... It just occurred to me that you are more interested in 'winning' an argument as opposed to actually learning something or admitting you are wrong.

It's all good, I bow out from this. Guess we will see what happens with the Pro.
PS5 punches above its weight because of its I/O, which streams data at up to 22GB/s with Oodle compression, and the industry's best developers.
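For context, that 22GB/s is a best-case figure: roughly the raw 5.5GB/s SSD rate multiplied by an assumed ~4:1 Kraken + Oodle Texture compression ratio. A minimal sketch of the arithmetic (the compression ratio is an assumption, not a typical in-game number):

```python
# Back-of-the-envelope: where a ~22 GB/s effective streaming figure can come from.
# 5.5 GB/s is the published raw PS5 SSD rate; the ~4:1 ratio is an assumed
# best case for Kraken + Oodle Texture, not a guaranteed sustained number.
raw_ssd_gbps = 5.5
assumed_compression_ratio = 4.0

effective_gbps = raw_ssd_gbps * assumed_compression_ratio
print(f"Effective streaming rate: {effective_gbps:.1f} GB/s")  # ~22.0 GB/s
```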
 

ChiefDada

Gold Member
Are you kidding me? Like, are you serious... So you are saying that an 11GB, 515GB/s GPU paired with 32GB of RAM is still bottlenecked, but suggesting that the PS5, with at best 13.5GB of 480GB/s RAM shared between CPU and GPU, is 'punching above its weight'?????
That's exactly what I'm saying. In fact, it's even worse than I thought:

Why do you think the 3080 is performing <10% better than 2080ti?

[benchmark chart]


And it's even more horrific than I imagined: These cards would buckle at 1440p and 4k WITHOUT RT

[benchmark chart]


It's like my wife at AMD said:


 

Mr.Phoenix

Member
PS5 punches above its weight because of its I/O, which streams data at up to 22GB/s with Oodle compression, and the industry's best developers.
damn.... smh.

I give up. There really is no point; even when faced with actual data and knowing how the PS5 runs the exact same game... you guys are still here talking about something punching above its weight. Like that's got anything to do with the argument.

You know what the specs of the PS5 are. You know what presets R&C on the PS5 runs at. Now compare that to the GPUs I listed out, all paired with a 13900K CPU and 32GB of system RAM. If you guys can't see it for yourselves then there is really no need for any of this.
 

S0ULZB0URNE

Member
damn.... smh.

I give up. There really is no point; even when faced with actual data and knowing how the PS5 runs the exact same game... you guys are still here talking about something punching above its weight. Like that's got anything to do with the argument.

You know what the specs of the PS5 are. You know what presets R&C on the PS5 runs at. Now compare that to the GPUs I listed out, all paired with a 13900K CPU and 32GB of system RAM. If you guys can't see it for yourselves then there is really no need for any of this.
So you don't understand what fast data streaming does.
 

Gaiff

SBI’s Resident Gaslighter
damn.... smh.

I give up. There really is no point; even when faced with actual data and knowing how the PS5 runs the exact same game... you guys are still here talking about something punching above its weight. Like that's got anything to do with the argument.

You know what the specs of the PS5 are. You know what presets R&C on the PS5 runs at. Now compare that to the GPUs I listed out, all paired with a 13900K CPU and 32GB of system RAM. If you guys can't see it for yourselves then there is really no need for any of this.
He isn't entirely wrong though. Rift Apart on PC is quite heavy on memory and can choke out 10GB cards even at 1440p. It's actually one of the rare games where decreasing the AF from 16x to 8x nets a significant reduction in the memory footprint, and I've even seen benchmarkers recommend it for 8GB cards, which is almost unheard of. AF has been basically free on PC for a few years now. Typically, you'd think bandwidth would be the biggest issue, but in this game sheer VRAM amount matters too. You can see the 3090 roflstomping the 3080 by 60% at 4K when it's typically like 10-15% faster. And lo and behold, it's also only 11% faster than the 12GB 4070, which is most of the time around 3080-tier.

I also firmly believe the PS5's memory management is much better optimized than on PC which leads it to being able to cruise at 4K/40Hz rather easily.

I do have a 2080 Ti and it has no trouble at PS5 settings, but exceeding that 11GB isn't particularly difficult at 4K with all those settings cranked up. An 8GB 2070S would be stutter city, even with Performance Mode settings. You need to drop the textures to High and the AF to 8x if I remember correctly.

The smartest thing this generation of consoles and the last one did was making sure they had enough memory. 16GB with 13.5GB usable should be plenty most of the time. From what developers said, it's also significantly easier to manage memory on consoles. They didn't repeat the mistake of allocating paltry amounts of memory even for the time like the PS360 had.

But yeah, 8K output is still silly, even from a significantly lower base resolution. I wouldn't even entertain it. We laughed at it when they mentioned it for a 3090 and that thing matches a 7900 XT in RT workload and can even beat it with a lot of RT.
 
Last edited:

FireFly

Member
I have one but you don’t need it to see how impressive it looks because it’s textures and not resolution, meaning it downsamples. If it were resolution as well it’d look even better.
Right, but the debate was about whether an 8K output resolution is worthwhile, not whether using 8K textures is worthwhile. No (resolution) downsampling is going on unless you are running the UE5 demo on PC at 8K and displaying it on a 4K or lower screen. If the existing 4K console demo looks good, it has nothing to do with the benefits of an 8K output.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
PS5 only titles are.
Maybe not using it to the fullest but they are taking advantage of the I/O.
The bottleneck moved to how the game makes use of the I/O capabilities: ok now you can stream GBs super quickly, but can your game engine consume it all?

This is where a Pro console could help with a bit more CPU and RAM bandwidth grunt (as well as a bit more RAM if the rumours of additional OS-dedicated RAM are true, like they did for PS4 Pro by freeing more RAM), making it easier on devs… but then again these improvements would not help base console performance, so we need devs to spend more time wrestling the APIs provided and upgrading their engines.
With XSX2 and PS6 only exclusive titles we may take advantage of the extra grunt these machines have without remorse… ah, generations ;).
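To put the "can your engine consume it all" point in rough numbers, here is a minimal per-frame streaming budget sketch; the throughput and framerate values are illustrative assumptions, not measured figures:

```python
# Illustrative per-frame streaming budget: even a very fast I/O pipe only hands
# the engine a slice of data each frame, which it then has to ingest
# (decompress into place, build GPU resources, etc.) without blowing the frame.
effective_io_gbps = 8.0   # assumed sustained compressed throughput, GB/s
fps = 60

budget_mb = effective_io_gbps * 1024 / fps
print(f"~{budget_mb:.0f} MB of new data available per frame at {fps} fps")  # ~137 MB
```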
 

bitbydeath

Member
Right, but the debate was about whether an 8K output resolution is worthwhile, not whether using 8K textures is worthwhile. No (resolution) downsampling is going on unless you are running the UE5 demo on PC at 8K and displaying it on a 4K or lower screen. If the existing 4K console demo looks good, it has nothing to do with the benefits of an 8K output.
If you can see the difference in textures then that would apply to resolution as well.
 

Gaiff

SBI’s Resident Gaslighter
PS5 only titles are.
Maybe not using it to the fullest but they are taking advantage of the I/O.
Sure they do but they don't stretch the I/O bandwidth even close to its limits. Hell, there are a million parts of the pipeline that would bottleneck before the I/O. If anything, it's overkill on PS5.
 

FireFly

Member
If you can see the difference in textures then that would apply to resolution as well.
If a demo or video is running at 4K, then it only contains 4K worth of information per frame. So any "difference" you see has to be related to the quality of the image and not any extra information. And if you are sitting at a distance where your eye can only resolve 4K worth of detail, a higher resolution will be wasted. (Unless downsampling down to 4K for SSAA)
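For anyone who wants to sanity-check the viewing-distance argument, here is a minimal sketch using the common ~1 arcminute (roughly 60 pixels per degree) acuity rule of thumb; the 65-inch panel is just an assumed example, and real perception varies between people:

```python
import math

# Distance beyond which individual pixels can no longer be resolved, assuming
# ~20/20 acuity (1 arcminute per pixel). Screen size is an illustrative choice.
def max_useful_distance_m(diag_inches, horizontal_pixels, aspect=(16, 9)):
    w, h = aspect
    width_in = diag_inches * w / math.hypot(w, h)   # screen width in inches
    pixel_in = width_in / horizontal_pixels         # pixel pitch in inches
    one_arcmin = math.radians(1 / 60)               # ~0.00029 rad
    return (pixel_in / one_arcmin) * 0.0254         # small-angle approx, metres

for label, px in (("4K", 3840), ("8K", 7680)):
    print(f'{label} on a 65" panel: pixels resolvable within ~{max_useful_distance_m(65, px):.1f} m')
# 4K on a 65" panel: pixels resolvable within ~1.3 m
# 8K on a 65" panel: pixels resolvable within ~0.6 m
```

Sit farther back than that and the extra resolution carries information the eye can't pick up (downsampling aside).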
 

Forth

Member
Is it possible this new upscaling method could apply to everything or will the games need patching?
I ask because the thought of RDR2 looking better is something I'd like to happen.
 
If you can see the difference in textures then that would apply to resolution as well.

Internal Render Resolution and Texture Resolution are two different things.

For example the 8K textures used in the UE5 demo were at an 8K resolution independent of the internal and output render resolution. Why? Because it allows them to be much more highly detailed, especially when you begin to zoom in as one of the posters here pointed out.
 
Is it possible this new upscaling method could apply to everything or will the games need patching?
I ask because the thought of RDR2 looking better is something I'd like to happen.

Rumour is that a new version is on its way for PS5 and Series consoles next year.

I'd think if the Pro is also out that year, Sony will push the same line about including Pro settings, as they did with the previous Pro and its release year.

I think it was rated in Korea, or something similar
 

Brucey

Member
Hey remember when this gen was announced and Xbox was all like “We got that MS Machine Learning algo” and everybody was writing fan fiction online about a DLSS like solution from them?

Been three years.
Remember when Xbox was going to run away with the premier cloud streaming experience because of the MS datacenter infrastructure advantage? Turns out hosting enterprise office apps and servers didn't really amount to a hill of beans when it came to delivering streaming games.
 

bitbydeath

Member
Internal Render Resolution and Texture Resolution are two different things.

For example the 8K textures used in the UE5 demo were at an 8K resolution independent of the internal and output render resolution. Why? Because it allows them to be much more highly detailed, especially when you begin to zoom in as one of the posters here pointed out.
But are you really expecting games to have 8K resolution and not 8K textures?
 

PeteBull

Member
As a big PlayStation fan (although I had the OG Xbox, it had amazing games, so of course I bought it :p), at the moment it looks like Microsoft is barely competing; especially outside of the US, no one has/plays Xbox anymore. The true competition for Sony is PC, and maybe even more likely the upcoming Switch successor; those at least have some serious exclusives of high quality.

So whatever Microsoft is gonna do, whether it's a streaming approach or not, it looks like it will be irrelevant from the get-go, just like the atrocities called Google Stadia or Amazon Luna (took me a while to remember/find the promotional video for this :p).
 
Last edited:

bitbydeath

Member
If a demo or video is running at 4K, then it only contains 4K worth of information per frame. So any "difference" you see has to be related to the quality of the image and not any extra information. And if you are sitting at a distance where your eye can only resolve 4K worth of detail, a higher resolution will be wasted. (Unless downsampling down to 4K for SSAA)
I feel like this is going off-track. Did you see a difference in the UE5 video texture wise or not?
 

FireFly

Member
I feel like this is going off-track. Did you see a difference in the UE5 video texture wise or not?
The discussion about texture differences is off track, because the topic was whether the PS5 Pro outputting at 8K would be worthwhile.

No one was claiming that 8K textures are pointless, and as I've pointed out it's perfectly consistent to claim that 8K textures make a difference, while an 8K output does not, at a given viewing distance – where your eye can only resolve 4K of detail. Again, you yourself seem to be highlighting demos that were only run in 4K! So, the response is, yes that 4K UE5 demo looks really good, and if it were higher resolution, I wouldn't notice unless I sat sufficiently close to the screen.
 

S0ULZB0URNE

Member
The bottleneck moved to how the game makes use of the I/O capabilities: ok now you can stream GBs super quickly, but can your game engine consume it all?

This is where a Pro console could help with a bit more CPU and RAM bandwidth grunt (as well as a bit more RAM if the rumours of additional OS-dedicated RAM are true, like they did for PS4 Pro by freeing more RAM), making it easier on devs… but then again these improvements would not help base console performance, so we need devs to spend more time wrestling the APIs provided and upgrading their engines.
With XSX2 and PS6 only exclusive titles we may take advantage of the extra grunt these machines have without remorse… ah, generations ;).
The Pro will help more with brute-forcing unoptimized 3rd-party games that can't be programmed to take advantage of the I/O, since the other platforms don't have it.
 

King Dazzar

Member
while an 8K output does not, at a given viewing distance – where your eye can only resolve 4K of detail
There's always been debate around 8K, 4K and even 1080p with regard to visible differences at various panel sizes. But I will say that a 4K upscale on an 8K panel, to my eyes, gives more refinement to detail, I would assume due to pixel density. And that's before you start sending the panel native 8K content. I notice it instantly when I go back to view 4K panels, having got used to it.

I didn't read all the conversation. But my understanding is that texture resolution has nothing at all to do with image output resolution.
 
The bottleneck moved to how the game makes use of the I/O capabilities: ok now you can stream GBs super quickly, but can your game engine consume it all?

This is where a Pro console could help with a bit more CPU and RAM bandwidth grunt (as well as a bit more RAM if the rumours of additional OS-dedicated RAM are true, like they did for PS4 Pro by freeing more RAM), making it easier on devs… but then again these improvements would not help base console performance, so we need devs to spend more time wrestling the APIs provided and upgrading their engines.
With XSX2 and PS6 only exclusive titles we may take advantage of the extra grunt these machines have without remorse… ah, generations ;).
On PC you need a CPU with double the power of the PS5 CPU to run at a similar framerate in I/O-heavy games like Spider-Man, Ratchet or even TLOU. When you see PC vs PS5 benchmarks, the CPU must be taken into the equation. It's very hard to directly compare GPUs; here a 2070 Super should in theory be enough to compete against the PS5 GPU. Possible in multiplatform games, but impossible in I/O-intensive first-party games, as expected.
 
Last edited:

bitbydeath

Member
The discussion about texture differences is off track, because the topic was whether the PS5 Pro outputting at 8K would be worthwhile.

No one was claiming that 8K textures are pointless, and as I've pointed out it's perfectly consistent to claim that 8K textures make a difference, while an 8K output does not, at a given viewing distance – where your eye can only resolve 4K of detail. Again, you yourself seem to be highlighting demos that were only run in 4K! So, the response is, yes that 4K UE5 demo looks really good, and if it were higher resolution, I wouldn't notice unless I sat sufficiently close to the screen.
You don’t get one without the other. 8K textures are part and parcel with 8K resolution.
 
Last edited:

Zathalus

Member
On PC you need a CPU with double the power of the PS5 CPU to run at a similar framerate in I/O-heavy games like Spider-Man, Ratchet or even TLOU. When you see PC vs PS5 benchmarks, the CPU must be taken into the equation. It's very hard to directly compare GPUs; here a 2070 Super should in theory be enough to compete against the PS5 GPU. Possible in multiplatform games, but impossible in I/O-intensive first-party games, as expected.
Not sure where you are getting those CPU numbers from, but a 3600x holds roughly 60fps in both TLOU and Ratchet and close to 100fps in Spider-Man. You certainly don't need double the CPU performance.
 
Last edited:

ChiefDada

Gold Member
Sure they do but they don't stretch the I/O bandwidth even close to its limits. Hell, there are a million parts of the pipeline that would bottleneck before the I/O. If anything, it's overkill on PS5.

With that logic, the 4090 (and most modern GPUs) with its 80TF of compute performance is overkill since other parts of the pipeline will bottleneck it as well. We'll hit the PS5's I/O limit way before we hit the 4090's limit.
 

Gaiff

SBI’s Resident Gaslighter
With that logic, the 4090 (and most modern GPUs) with its 80TF of compute performance is overkill since other parts of the pipeline will bottleneck it as well. We'll hit the PS5's I/O limit way before we hit the 4090's limit.
The insane compute is a byproduct of these cards being discarded ML chips where NVIDIA makes the real money. It just turns out that they're also really good at gaming so yes, you are correct.
 
Last edited:

leizzra

Member
Textures work differently. It's not like you need a 4K texture for a 4K game. It all comes down to how big it'll be on the screen. In theory a 4K texture should cover the whole screen (then again, 4096x4096 vs 3840x2160) so that it's pixel to pixel. Of course that mostly doesn't happen, and the whole texture isn't used equally (like on characters).

We still don't have 4K textures in most 4K games (especially on consoles). Many times they aren't needed for great results, especially when you are using tileable materials mixed with lower-res texture sets.
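As a rough illustration of that texel-to-pixel point (the object sizes are made-up examples, not from any real game):

```python
# How many texels of a texture land on each screen pixel it covers. Above ~1
# texel per pixel, extra texture resolution only pays off when you zoom closer.
def texels_per_pixel(texture_px, covered_screen_px):
    return texture_px / covered_screen_px

print(texels_per_pixel(4096, 3840 / 4))  # 4K texture on 1/4 of a 4K screen width: ~4.3
print(texels_per_pixel(4096, 3840))      # zoomed to fill the screen width: ~1.07, pixel to pixel
print(texels_per_pixel(8192, 3840))      # 8K texture at the same zoom: ~2.1, headroom for closer
```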
 

FireFly

Member
You don’t get one without the other. 8K textures are part and parcel with 8K resolution.
Of course you do. The UE5 demos are using 8K textures but outputting at a 4K resolution. And your own claim is that you can notice the difference in these demos.
 
Last edited:

bitbydeath

Member
I don't have a demo with 4K textures to compare. But this is completely irrelevant to the point that the UE5 demos are using 8K textures but outputting at a 4K resolution.

So the claim that "You don’t get one without the other" is wrong.
Gotta admit, I am lost.
Thought we were talking about graphics being more impressive in 8K, which includes both textures and resolution.

Your argument AFAIK, is that it’s not.
 

FireFly

Member
Gotta admit, I am lost.
Thought we were talking about graphics being more impressive in 8K, which includes both textures and resolution.

Your argument AFAIK, is that it’s not.
That's not my argument. 8K clearly provides more information, and at a given viewing distance you can perceive that extra information. An 8K texture can be displayed "natively" on an 8K screen, so you can see detail you wouldn't be able to see on a 4K screen.

My argument is simply that beyond the viewing distance at which your eye can resolve this "extra" information, 8K is not of benefit (barring SSAA via downsampling). If your eye can only "see" 4K worth of detail, it doesn't matter that you have a giant 8K texture blown up on the screen. I don't really see how you could disagree with this, unless you have a different understanding of how the human eye works.
 

bitbydeath

Member
That's not my argument. 8K clearly provides more information, and at a given viewing distance you can perceive that extra information. An 8K texture can be displayed "natively" on an 8K screen, so you can see detail you wouldn't be able to see on a 4K screen.

My argument is simply that beyond the viewing distance at which your eye can resolve this "extra" information, 8K is not of benefit (barring SSAA via downsampling). If your eye can only "see" 4K worth of detail, it doesn't matter that you have a giant 8K texture blown up on the screen. I don't really see how you could disagree with this, unless you have a different understanding of how the human eye works.
Well, the 8K textures on the UE5 demo show noticeable improvements, even at 4K. Maybe I'm just built differently. 🤷‍♂️
 

FireFly

Member
Well, the 8K textures on the UE5 demo show noticeable improvements, even at 4K. Maybe I'm just built differently. 🤷‍♂️
If your eyes can resolve more than 4K at your viewing distance from your screen, you would see the benefit of an 8K UE5 demo. If they can't, you won't.

Either way, you won't find out until you test with 8K content on an 8K screen.
 

ChiefDada

Gold Member
My argument is simply that beyond the viewing distance at which your eye can resolve this "extra" information, 8K is not of benefit (barring SSAA via downsampling). If your eye can only "see" 4K worth of detail, it doesn't matter that you have a giant 8K texture blown up on the screen. I don't really see how you could disagree with this, unless you have a different understanding of how the human eye works.

This is true in most cases for live-action content, but it's not the case for real-time video games. There is image break-up even at native 4K output on a 4K screen because of the limitations of real-time rendering. This is why the uplift from 4K to 8K in a video game will be far more apparent than the same resolution bump from a 4K to 8K Blu-ray movie.
 

King Dazzar

Member
Can someone explain to me why you guys think an 8K texture (which is used in a 3D space, i.e. my understanding is 8K means an 8192x8192px square and 4K means a 4096x4096px square), which can then be wrapped around a 3D mesh, somehow needs an 8K TV for its increase in detail to be perceived? This does not equate to 2D monitor/TV display resolutions, which aren't even squares, they are rectangular. When we talk about texture resolution, it's completely different from TV resolutions. At least that's always been my understanding.

Shouldn't the PS5 Pro debate be about whether the entire image as a whole benefits or not from the 8K output resolution, and nothing at all to do with the wrong interpretation of what "8K" actually means when it comes to textures?
 
Last edited:
If your eyes can resolve more than 4K at your viewing distance from your screen, you would see the benefit of an 8K UE5 demo. If they can't, you won't.

Either way, you won't find out until you test with 8K content on an 8K screen.
you can downsample 8k to your 4k display, and if you notice an improvement, you'll also notice an improvement moving to a native 8k display... even though the improvement from a native 8k display will be a lot better.
 

bitbydeath

Member
Can someone explain to me why you guys think an 8K texture (which is used in a 3D space, i.e. my understanding is 8K means an 8192x8192px square and 4K means a 4096x4096px square), which can then be wrapped around a 3D mesh, somehow needs an 8K TV for its increase in detail to be perceived? This does not equate to 2D monitor/TV display resolutions, which aren't even squares, they are rectangular. When we talk about texture resolution, it's completely different from TV resolutions. At least that's always been my understanding.

Shouldn't the PS5 Pro debate be about whether the entire image as a whole benefits or not from the 8K output resolution, and nothing at all to do with the wrong interpretation of what "8K" actually means when it comes to textures?
It’s more that 8K resolutions would be needed to drive the usage of 8K textures, carriage, horse and all that.
 
It’s more that 8K resolutions would be needed to drive the usage of 8K textures, carriage, horse and all that.
8k displays aren't needed for 8k textures.
tv/monitor/display resolution is completely independent of texture resolution.
this is why you can select "high res textures" in a game, but the game still lets you select any display resolution you want, including very low ones.

think of textures like stickers in real life. you slap them on objects.
in games, you can "scale" textures, which means you can make the sticker as big or small as you like.
high res textures start as big stickers.
low res textures start as small stickers.

if you stretch a small sticker into a big sticker, it'll get blurry. this is why you want to be able to use big stickers.

sometimes objects (with stickers on them) get close to your screen. big stickers have more detail, so they'll look better up close.
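To put the sticker analogy in slightly more concrete terms, here is a tiny sketch of simplified (isotropic) mip selection; the texture and coverage numbers are hypothetical:

```python
import math

# The mip level a renderer samples depends on how many screen pixels the
# texture covers, not on the display's native resolution.
def mip_level(texture_px, covered_screen_px):
    texels_per_pixel = texture_px / max(covered_screen_px, 1)
    return max(0.0, math.log2(texels_per_pixel))

print(f"8K texture covering 512 px:  mip {mip_level(8192, 512):.1f}")   # ~4.0, extra detail never sampled
print(f"8K texture covering 4096 px: mip {mip_level(8192, 4096):.1f}")  # ~1.0, the detail pays off
```

Same texture, same display; only how close the "sticker" gets to the camera changes which detail actually gets sampled.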
 

bitbydeath

Member
8k displays aren't needed for 8k textures.
tv/monitor/display resolution is completely independent of texture resolution.
this is why you can select "high res textures" in a game, but the game still lets you select any display resolution you want, including very low ones.

think of textures like stickers in real life. you slap them on objects.
in games, you can "scale" textures, which means you can make the sticker as big or small as you like.
high res textures start as big stickers.
low res textures start as small stickers.

if you stretch a small sticker into a big sticker, it'll get blurry. this is why you want to be able to use big stickers.

sometimes objects (with stickers on them) get close to your screen. big stickers have more detail, so they'll look better up close.
I agree, but more devs would feel obligated to use 8K textures if the resolution existed.
 

ChiefDada

Gold Member
MLiD is coming to identical conclusions to my view that the RDNA 3 refresh/PS5 Pro GPU will expose RDNA 3 as an underperforming architecture.






Also agreed with him on Zen 2 capability for PS5 Pro

 

ChiefDada

Gold Member
Look at that framerate in high fps mode; it dips below 80 very often not because of a GPU bottleneck, but because the CPU is far from enough to hold a stable 120. The GPU gap between base PS4 and base PS5 is close to a 6x difference (1.8TF vs 10.2TF, even if we don't account for architectural improvements, which obviously aren't small), but the CPU gap is at max 3x; otherwise we wouldn't get dips under 80 in high fps mode.

PS4 Pro runs GoW Ragnarok at 45fps average in performance mode and can go as high as 55fps. On shit Jaguar CPU. Believe me we are GPU bottlenecked with GoW Ragnarok and likely all cross-gen games that don't feature RT. Base PS5 could run the game at locked 120 if they reduced visuals or maybe even just resolution to 1080p. Zen 2 is sleeping in Ragnarok.

 

Audiophile

Member
One thing I'm concerned with is memory bandwidth. RT as I understand it is extremely intensive when it comes to bandwidth. If we're seeing something like ~2.25x RT capabilities in addition to other areas, I wonder how a bump to ~576GB/s (18gbps chips) from 448GB/s (14gbps chips) is gonna cut it. I expect some fat cache will be needed for starters..
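For reference, those bandwidth figures fall straight out of the GDDR6 pin speed if you assume a Pro keeps the base PS5's 256-bit bus (the Pro's bus width is an assumption here):

```python
# GDDR6 bandwidth = pin speed (Gbps) x bus width (bits) / 8 bits per byte.
def gddr6_bandwidth_gbs(pin_speed_gbps, bus_width_bits=256):
    return pin_speed_gbps * bus_width_bits / 8

print(gddr6_bandwidth_gbs(14))  # 448.0 GB/s -> base PS5 (14 Gbps chips)
print(gddr6_bandwidth_gbs(18))  # 576.0 GB/s -> rumoured 18 Gbps chips, ~29% more
```

A ~29% bandwidth bump against a much larger RT uplift is exactly why extra cache keeps coming up.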
 

Loxus

Member
MLiD is coming to identical conclusions to my view that the RDNA 3 refresh/PS5 Pro GPU will expose RDNA 3 as an underperforming architecture.






Also agreed with him on Zen 2 capability for PS5 Pro


It's just his speculation on whether that rumor is true.
He even says it himself.

If you watch enough of his videos, you'll notice he believes the PS5 Pro is using RDNA4.
 

ChiefDada

Gold Member
One thing I'm concerned with is memory bandwidth. RT as I understand it is extremely intensive when it comes to bandwidth. If we're seeing something like ~2.25x RT capabilities in addition to other areas, I wonder how a bump to ~576GB/s (18gbps chips) from 448GB/s (14gbps chips) is gonna cut it. I expect some fat cache will be needed for starters..

If the idea is to keep the same ray quality/ray count as the base PS5 fidelity mode but just increase frames, then I don't think they have a bandwidth problem, just a compute bottleneck that the PS5 Pro would be able to solve. Just my guess, I could be wrong.


It's just his speculation on whether that rumor is true.
He even says it himself.

If you watch enough of his videos, you'll notice he believes the PS5 Pro is using RDNA4.

Oh, for sure he is speculating, but I agree with his reasoning. I think he makes valid points similar to what I've said before about the GPU in comparison to RDNA 3.

The architectural benefits AMD assumed it would achieve with RDNA 3 by going chiplet didn't pan out, and the PS5 Pro will expose this. The PS5 Pro is monolithic, so it will be more power efficient.

and the effectiveness of the Zen 2 CPU

Yes, you become MORE CPU bottlenecked with more RT, but the base PS5 is GPU bottlenecked in like 95% of games' fidelity and performance modes. Look at Hardware Unboxed's CPU benchmarks for Spider-Man Remastered using a 3090 as the constant GPU. At 4K resolution with High RT, the Ryzen 3600 is averaging well over 70fps. Remember the PS5 CPU performs MUCH better primarily because of I/O and decompression offloading. We are GPU bottlenecked by FAR. Even in Performance RT modes we are still heavily GPU bottlenecked. And this is mainly caused by the limited RT HW in RDNA2. The CPU isn't the issue. Sony is right to focus on RT and upscaling. The more I think about it, I wonder what the hell they are going to use all of this extra compute for if RT traversal and AI upscaling are true.
 

PeteBull

Member
PS4 Pro runs GoW Ragnarok at 45fps average in performance mode and can go as high as 55fps. On shit Jaguar CPU. Believe me we are GPU bottlenecked with GoW Ragnarok and likely all cross-gen games that don't feature RT. Base PS5 could run the game at locked 120 if they reduced visuals or maybe even just resolution to 1080p. Zen 2 is sleeping in Ragnarok.


Oh, I know very well how the PS4 Pro runs Ragnarok, and its 45fps average, or as high as 55, doesn't matter, because it's still CPU bottlenecked; in CPU-bottlenecked scenarios it doesn't even hit 40 :p
Edit: AFAIK there isn't a single game that runs at 30 (or 30 with drops) on base PS4 that can hold a stable 120 in high fps mode on PS5; they only target 120, but with big dips :p

Pretty sure it's the same thing on the Xbox Series consoles; they can only get to stable 120 modes in games that ran 60 on last gen. A good example is Halo Infinite: multiplayer runs at 120 on Xbox Series (even Series S, on smaller maps), but the campaign... well, you can see it all in DF's vid. Keep in mind it's not testing at launch but Season 2, when devs had plenty of time for optimisation/patches xD
Big focus on the PC and Series S campaign there, but a few snippets of the XSX 120fps mode dipping in open-world sections when there isn't much going on; imagine how nasty it gets when you're driving vehicles or in big shootouts :p
We can see more of those nasty fps drops in Halo Infinite's 120fps mode on XSX there, again not even driving, when it's worst. Notice how John tries to damage control it all; I remember back in the day, when they discovered Mario Kart dips 1 frame every now and then, they made a huge deal out of it, for comparison =D

Edit:
Here is how the CPU bottleneck looks, even in Season 2 Halo Infinite. Of course DF didn't do this test because it would show reality, aka improvements were made but it is still far from a stable 120fps (dips under 100 in CPU-intensive scenarios).
But you can see it for yourself; btw in exactly the same mission the Xbox One X (so the weaksauce Jaguar CPU, last gen) was getting 50 to 60fps in performance mode.
All in all the proof is there: the CPUs in those next-gen consoles aren't even 3x stronger than last gen, from what I see only around 2.5x stronger.
 
Last edited:

DeepEnigma

Gold Member
Well, the 8K textures on the UE5 demo show noticeable improvements, even at 4K. Maybe I'm just built differently. 🤷‍♂️
If your eyes can resolve more than 4K at your viewing distance from your screen, you would see the benefit of an 8K UE5 demo. If they can't, you won't.

Either way, you won't find out until you test with 8K content on an 8K screen.
Downsampling is a real thing that improves fidelity and IQ, and PC gamers and select console games were doing it years and years before DLSS ML upscaling was a thing.

So yes, you can notice a fidelity bump from downsampled 8K assets over native 4K or lower ones on 4K sets. Artifacts (or the lack thereof) are a major improvement.
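A toy illustration of what downsampling does, with tiny arrays standing in for 8K and 4K frames (purely illustrative, not any engine's actual resolve step):

```python
import numpy as np

# SSAA-style downsample: render at a higher resolution, then average each 2x2
# block of pixels down to the display resolution, which smooths out aliasing.
hi = np.random.rand(8, 8, 3)                       # stand-in for an "8K" RGB render
lo = hi.reshape(4, 2, 4, 2, 3).mean(axis=(1, 3))   # average every 2x2 block
print(lo.shape)                                    # (4, 4, 3) -> the "4K" output
```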
 
Last edited: