
Rumor: PS5 Pro Codenamed "Trinity" targeting Late 2024 (some alleged specs leaked)

Would you upgrade from your current PS5?

  • For sure

    Votes: 377 41.0%
  • Probably

    Votes: 131 14.2%
  • Maybe

    Votes: 127 13.8%
  • Unlikely

    Votes: 140 15.2%
  • Not a chance

    Votes: 145 15.8%

  • Total voters
    920

FireFly

Member
Downsampling is a real thing that improves fidelity and IQ; PC gamers and select console games were doing it for years and years before ML upscaling like DLSS was a thing.

So yes, you can notice a fidelity bump from downsampled 8K assets over native 4K (or lower) ones on 4K sets. Artifacts (or lack thereof) are a major improvement.
I was talking about viewing the content natively at 8K rather than downsampling to 4K.

But I acknowledge that you will see fewer artifacts from running at the higher resolution, which I overlooked. So there is some image quality benefit, if not a detail benefit.
 
Resolution is pointless if assets are low resolution and/or everything becomes vaseline in motion. Currently we game at 1080p-1440p (native) on PS5, which is 4x / 2.25x fewer pixels than native 4K. There is a big margin for improvement before hitting native 4K. We probably won't even have native 4K games on PS5 Pro, especially at 60fps.
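The pixel ratios quoted above are easy to verify; a quick check (assuming standard 1920x1080, 2560x1440, and 3840x2160 resolutions):

```python
def pixel_ratio(w1: int, h1: int, w2: int, h2: int) -> float:
    # How many times more pixels the second resolution has than the first.
    return (w2 * h2) / (w1 * h1)

print(pixel_ratio(1920, 1080, 3840, 2160))  # 4.0: native 4K is 4x 1080p
print(pixel_ratio(2560, 1440, 3840, 2160))  # 2.25: native 4K is 2.25x 1440p
```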
One thing I'm concerned with is memory bandwidth. RT, as I understand it, is extremely bandwidth-intensive. If we're seeing something like ~2.25x the RT capability on top of gains in other areas, I wonder how a bump from 448GB/s (14Gbps chips) to ~576GB/s (18Gbps chips) is gonna cut it. I expect some fat cache will be needed, for starters.
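Those bandwidth figures follow from bus width times per-pin data rate; a quick sanity check, assuming the base PS5's 256-bit GDDR6 bus carries over to the Pro:

```python
def gddr6_bandwidth_gbs(bus_width_bits: int, gbps_per_pin: float) -> float:
    # Each bus pin transfers gbps_per_pin gigabits per second;
    # divide by 8 to convert gigabits to gigabytes.
    return bus_width_bits * gbps_per_pin / 8

print(gddr6_bandwidth_gbs(256, 14))  # 448.0 GB/s (base PS5, 14 Gbps chips)
print(gddr6_bandwidth_gbs(256, 18))  # 576.0 GB/s (18 Gbps chips)
```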
Yes, we hope they are going to restructure the GPU caches and Shader Array setup, which they will with RDNA3.5 tech. I don't think bandwidth will be a big concern.
 

Panajev2001a

GAF's Pleasant Genius
Resolution is pointless if assets are low resolution and/or everything becomes vaseline in motion. Currently we game at 1080p-1440p (native) on PS5, which is 4x / 2.25x fewer pixels than native 4K. There is a big margin for improvement before hitting native 4K. We probably won't even have native 4K games on PS5 Pro, especially at 60fps.

Yes, we hope they are going to restructure the GPU caches and Shader Array setup, which they will with RDNA3.5 tech. I don't think bandwidth will be a big concern.
I think that things such as ray reconstruction and ray coherency sorting (or BVH traversal in HW) at worst leave bandwidth flat while improving efficiency, and in ideal conditions actually reduce it.

Bandwidth will be improved, and so will on-chip caches, but I think the focus is more on balancing the chip better between rasterisation and ray tracing. Rather than aiming at peak usage of every single FLOPS they add, I think the focus is giving the chip enough power to push through the times the PS5 meets a bottleneck. It is possible that peak performance might outstrip bandwidth like it did on PS4 Pro… maybe they have found some clever way not to repeat the same design “flaw” of PS4 Pro (I am not sure it was a priority for them at the time), but I see PS6 as the place where they focus on performance and efficiency over the long term, even in ways that may require PS6-specific coding / engine changes (benefiting exclusive titles the most).
They would then analyse any remaining bottlenecks in PS6 during developers’ first year or two with the console, gathering their feedback / collecting performance profiles to help them design a PS6 Pro.

I think their Pro strategy is based on identifying real-world, dev-stressed issues/bottlenecks and experimenting with some ideas, or parts of ideas, they have for the next-generation console in order to battle-test them. On top of that, they probably have an agreement with AMD and TSMC that makes it cheaper to do a Pro design and a base-console SoC shrink together.
 

Mahavastu

Member
If your eyes can resolve more than 4K at your viewing distance from your screen, you would see the benefit of an 8K UE5 demo. If they can't, you won't.

Either way, you won't find out until you test with 8K content on an 8K screen.

I'll always love this video where two dudes look at the screens from like 2cm away and still can't say for sure which screen is 4K and which is 8K.

I am pretty sure that 8K is overrated for normal screen sizes. My main screen is 125 inches, and 4K looks great, even from 2m away...

 

King Dazzar

Member
I'll always love this video where two dudes look at the screens from like 2cm away and still can't say for sure which screen is 4K and which is 8K.

I am pretty sure that 8K is overrated for normal screen sizes. My main screen is 125 inches, and 4K looks great, even from 2m away...


I guarantee you that if you hook up a PC and run the desktop at 8K on a 125" screen from 2m away, you'll see extra crispness in text, as long as the 8K image is being displayed natively. Do I choose 8K 60fps at 4:2:0 over 4K 120Hz at 4:4:4 on my 85" 8K panel? Personally, I'd generally rather have the smoother 4K 120Hz image. But to say there is no perceivable difference is nonsense. And my eyes are ancient!
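Whether a 4K-to-8K difference is resolvable comes down to pixels per degree against the roughly 60 ppd often cited for 20/20 vision; a rough sketch under that assumption (16:9 panels, the 125"/2m case above, and an 85" 8K panel):

```python
import math

def pixels_per_degree(diag_in: float, h_pixels: int, distance_m: float) -> float:
    # Width of a 16:9 panel from its diagonal: 16 / sqrt(16^2 + 9^2).
    width_m = diag_in * 0.0254 * 16 / math.hypot(16, 9)
    # Horizontal field of view subtended by the screen, in degrees.
    fov_deg = 2 * math.degrees(math.atan(width_m / (2 * distance_m)))
    return h_pixels / fov_deg

print(pixels_per_degree(125, 3840, 2.0))  # ~55 ppd: 4K sits just under the ~60 ppd acuity limit
print(pixels_per_degree(85, 7680, 2.0))   # ~152 ppd: 8K far beyond what the eye resolves here
```

So at 2m from a 125" screen, 4K is right at the edge of 20/20 acuity, which is consistent with an 8K desktop looking slightly crisper there while 8K on smaller panels goes unnoticed.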
 

welshrat

Member
I'll always love this video where two dudes look at the screens from like 2cm away and still can't say for sure which screen is 4K and which is 8K.

I am pretty sure that 8K is overrated for normal screen sizes. My main screen is 125 inches, and 4K looks great, even from 2m away...



Yeah, that is a great video and basically how I feel about it all. The last four TVs I've had are 65-inch 4K LEDs (the latest is a Sony 95K mini-LED), and next year I will jump to OLED, or if Sony improves the mini-LED further I may stick with it (I'm still somewhat concerned about motion and screen burn with OLED). Anyhow, 8K is a waste unless screens get really big; ideally we need better upscaling or native 4K gaming rather than worrying about 8K nonsense.
 

King Dazzar

Member
Yeah, that is a great video and basically how I feel about it all. The last four TVs I've had are 65-inch 4K LEDs (the latest is a Sony 95K mini-LED), and next year I will jump to OLED, or if Sony improves the mini-LED further I may stick with it (I'm still somewhat concerned about motion and screen burn with OLED). Anyhow, 8K is a waste unless screens get really big; ideally we need better upscaling or native 4K gaming rather than worrying about 8K nonsense.
For what it's worth, the only reason I went with an 8K panel was the pixel density for 4K upscaling at 85". And the fact that high-end LCDs with the extra luminance are usually gated behind 8K panels. It wasn't for actual 8K content; there's so little of it, and when we get into gaming, the horrendous extra power required isn't ever going to be worth the trade-off. I truly believe 4K is the sweet spot resolution-wise. Having said that, I'd miss my 4K upscaled to 8K.
 

welshrat

Member
For what it's worth, the only reason I went with an 8K panel was the pixel density for 4K upscaling at 85". And the fact that high-end LCDs with the extra luminance are usually gated behind 8K panels. It wasn't for actual 8K content; there's so little of it, and when we get into gaming, the horrendous extra power required isn't ever going to be worth the trade-off. I truly believe 4K is the sweet spot resolution-wise. Having said that, I'd miss my 4K upscaled to 8K.
Fair enough. Largest I will go is 77 so I think 4k will be fine for me for the next 5 years.
 

PeteBull

Member

Once again, good info/confirmation of how strong base PS5 GPU performance is: roughly an RTX 2070S.
We will benefit a ton from a midgen upgrade.
Remember, that's how 2070S performance looks on average compared to other/stronger cards on the PC market; by now, 3 years into this gen, it's barely enough.

That downclocked/downvolted RX 6800 XT in the PS5 Pro can't come soon enough =D
 

Once again, good info/confirmation of how strong base PS5 GPU performance is: roughly an RTX 2070S.
We will benefit a ton from a midgen upgrade.
Remember, that's how 2070S performance looks on average compared to other/stronger cards on the PC market; by now, 3 years into this gen, it's barely enough.

That downclocked/downvolted RX 6800 XT in the PS5 Pro can't come soon enough =D

That's with RT. Back then, 2070S comparisons were using raster performance. At the beginning of the gen, DF were saying PS5 RT = 2060 RT.
 
Last edited:

PeteBull

Member
That's with RT. Back then, 2070S comparisons were using raster performance. At the beginning of the gen, DF were saying PS5 RT = 2060 RT.
Ofc, but I used this particular vid because it's a 3rd-party publisher and the game is technically solid on both machines, plus it's really a looker; hell, it looks better than most 1st-party games too, which was a very uncommon thing in previous gens.
This gen, 3rd parties are making their games on par with or better-looking than the 1st-party studios. Of course, big kudos to those devs, but on the other hand I can't help but wonder wtf happened to the talent in those first-party studios, that 3rd parties are matching/surpassing them in graphical fidelity right now.
 
It's just his speculation on whether that rumor is true.
He even says so himself.

If you watch enough of his videos, you'll notice he believes the PS5 Pro is using RDNA4.
These people who keep saying Zen 2 is fine are only looking at cross-gen games without RT; next-gen games with RT are way more CPU-demanding.
 
Oh, I know very well how the PS4 Pro runs Ragnarok, and whether it's 45 fps avg or as high as 55 doesn't matter, because it's still CPU-bottlenecked; in CPU-bottlenecked scenarios it doesn't even hit 40 :p
Edit: afaik there's not a single game that runs at 30 (or 30 with drops) on base PS4 that can hold a stable 120 in high-fps mode on PS5; they only target 120, but with big dips :p

Pretty sure it's the same thing on Xbox Series consoles: they can only hit stable 120 modes in games that ran 60 on last gen. A good example is Halo Infinite; multiplayer runs at 120 on Xbox Series (even Series S, on smaller maps), but the campaign, well, you can see it all in DF's vid. Keep in mind it wasn't tested at launch but at season 2, when the devs had had plenty of time for optimisation/patches xD
Big focus on PC and the Series S campaign there, but a few snippets of the XSX 120fps mode dipping in open-world sections when there isn't much going on; imagine how nasty it gets during vehicle driving or big shootouts :p
We can see more of those nasty fps drops in Halo Infinite's 120fps mode on XSX there, again not even while driving, when it's worst. Notice how John tries to damage-control it all; I remember back in the day when they discovered Mario Kart dips 1 frame every now and then, they made a huge deal out of it, for comparison =D

Edit:
Here is how a CPU bottleneck looks, even in S2 Halo Infinite. Of course DF didn't do this test, because it would show the reality: improvements were made, but it is still far from a stable 120fps (dips under 100 in CPU-intensive scenarios).
But you can see it for yourself. Btw, in exactly the same mission the Xbox One X (so the weaksauce Jaguar CPU, last gen) was getting 50 to 60fps in performance mode.
All in all the proof is there: the CPUs in these next-gen consoles aren't even 3x stronger than last gen; from what I see, only around 2.5x stronger.

Thank you for discussing this
 
Resolution is pointless if assets are low resolution and/or everything becomes vaseline in motion. Currently we game at 1080p-1440p (native) on PS5, which is 4x / 2.25x fewer pixels than native 4K. There is a big margin for improvement before hitting native 4K. We probably won't even have native 4K games on PS5 Pro, especially at 60fps.

Yes, we hope they are going to restructure the GPU caches and Shader Array setup, which they will with RDNA3.5 tech. I don't think bandwidth will be a big concern.
Games like Avatar won't be native 4K 60 on the Pro, but games like Rift Apart, Horizon, Far Cry, and racing games would be.
 

Gaiff

SBI’s Resident Gaslighter
That's with RT. Back then, 2070S comparisons were using raster performance. At the beginning of the gen, DF were saying PS5 RT = 2060 RT.
It's a pretty good showing. Consoles have specific optimizations not available on PC. That could explain why they manage to match the RTX 2070S in a game featuring quite a bit of ray tracing.
 

leizzra

Member
As I already wrote, it's not like you'll get any info the same day (or a few days after) the dev kits arrive. Dev kits exist in small numbers, mostly for rendering programmers to debug code. Those people probably won't leak anything, even if they know the details (as I stated earlier, they may not even care about the specs; nuts, right, but this was the case with a few programmers I know, and one of them is a top dog). The best we can hope for is a photo of the dev kit taken by someone else. Even then we won't know the specs, because it's not like they're written on the dev kit.

I bet it's more likely that a small indie dev would do that (because it's more likely that other co-workers know the studio has one), but they probably don't have one yet.

If I remember correctly, with PS5 we didn't have any good leaks until Mark Cerny's presentation. We had dev kit photos a few months before that, though.
 
It's a pretty good showing. Consoles have specific optimizations not available on PC. That could explain why they manage to match the RTX 2070S in a game featuring quite a bit of ray tracing.
Maybe there will come a day, with complete optimization, when the console can equal a 2080 in RT, but that may be getting hopes up a bit.
 

FingerBang

Member
I'll always love this video where two dudes look at the screens from like 2cm away and still can't say for sure which screen is 4K and which is 8K.

I am pretty sure that 8K is overrated for normal screen sizes. My main screen is 125 inches, and 4K looks great, even from 2m away...


What are you compensating for?
 

HeisenbergFX4

Gold Member

I can't take him seriously (and sadly I used to)

Watched like 30 seconds, then jumped ahead, and he claims his sources tell him Xbox is going to start next gen earlier than the 2028 shown on the slide

If dude was on GAF he would have known that literally months ago

NOW his sources tell him this?

 

MikeM

Member
Resolution is pointless if assets are low resolution or/and everything becomes vaseline in motion. Currently we game at 1080p - 1440p (native) on PS5 which is 4 times / 2.25 times less pixels than native 4K. There is some big margin of improvement before hitting native 4K. We probably won't even have native 4K games on PS5 Pro, notably at 60fps.

Yes we hope they are going to restructure GPU caches and Shader Array setup, which they will on RDNA3.5 tech. I don't think bandwidth will be a big concern.
I don’t think we need native 4K. Upscaling works well, but not with how low some of these games run their base resolution.
 
I can't take him seriously (and sadly I used to)

Watched like 30 seconds, then jumped ahead, and he claims his sources tell him Xbox is going to start next gen earlier than the 2028 shown on the slide

If dude was on GAF he would have known that literally months ago

NOW his sources tell him this?

His source is Kepler's tweets, which anybody can read. Not clicking on that.
 

Thirty7ven

Banned
Think people need to cool it with the custom ML processor, and dedicated RT cores.

I would look at the PS4 pro and extrapolate from that. Higher clocks, more compute, more bandwidth, and a tweak or two on the GPU side for devs who are willing to further optimize.

Talk about RT cores and ML processor does in fact sound like PS6 and not PS5 pro.
 

Mr.Phoenix

Member
Think people need to cool it with the custom ML processor, and dedicated RT cores.

I would look at the PS4 pro and extrapolate from that. Higher clocks, more compute, more bandwidth, and a tweak or two on the GPU side for devs who are willing to further optimize.

Talk about RT cores and ML processor does in fact sound like PS6 and not PS5 pro.
The ML processor? Yes, I will agree with you on that, simply because FSR doesn't need it. Would reconstruction be better with it? Absolutely, but FSR is good enough as is as long as the PS5pro keeps a base rez of 1440p in fidelity mode or 1080p in performance mode. The ML hardware, I can see that being a PS6 thing.

As for the RT stuff, that I believe is a shoo-in unless AMD is just that useless. I feel we are making it sound a lot more than it is. It's not dedicated RT cores, but more like 3rd-generation RT cores. RDNA2 had 1st gen, RDNA3 had 2nd gen, and RDNA4 will have 3rd gen. I believe that 3rd-gen RT from RDNA4 and 2nd-gen dual issue compute from RDNA4 are the things the RDNA3.5 PS5pro GPU will take from RDNA4.

As far as AMD GPUs go, the two stand-out areas that make them suck compared to Nvidia are RT and ML hardware. Not gonna hold my breath on the ML stuff as AMD seems to focus on their hardware-agnostic FSR, but as for RT, the simple reason AMD GPU RT is so bad is because they do not do BVH hardware acceleration. They let their shader cores do it. RDNA4 RT cores are expected to also accelerate the BVH tree which would make all the difference.

So a PS5 Pro having ~50% more RT cores, clocked higher and faster per core because they natively handle BVH traversal, would result in RT that is over 3-4x better than what is in the PS5 currently.
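For what it's worth, that 3-4x figure is just multiplicative scaling under assumed per-factor gains; none of these numbers are confirmed specs, and the BVH speedup in particular is a guess:

```python
# Hypothetical scaling factors for PS5 Pro RT throughput vs base PS5.
core_count_gain = 1.5   # ~50% more RT-capable CUs (rumored)
clock_gain      = 1.1   # modestly higher GPU clock (assumed)
bvh_hw_gain     = 2.0   # assumed per-core speedup from BVH traversal in hardware

total = core_count_gain * clock_gain * bvh_hw_gain
print(f"~{total:.1f}x PS5 RT throughput under these assumptions")
```

Nudge any one factor and the product lands anywhere in the 3-4x range, which is why the claim is so sensitive to the BVH assumption.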
 
Last edited:

ChiefDada

Gold Member
Think people need to cool it with the custom ML processor, and dedicated RT cores.

I would look at the PS4 pro and extrapolate from that. Higher clocks, more compute, more bandwidth, and a tweak or two on the GPU side for devs who are willing to further optimize.

Talk about RT cores and ML processor does in fact sound like PS6 and not PS5 pro.

Nah, ML hardware is a lock with talks of 8k from Tom Henderson. How far they push with additional RT is the million dollar question.
 

onQ123

Member

Neo_game

Member
We learned that whatever the PS5 Pro specs are, the next Xbox is somehow 2x that, so we shouldn't get excited about the PS5 Pro.

Maybe it is true, but I'm not sure Pro is the correct name for a console releasing 6 years later; even 2025 is late for a Pro, IMO. It is more like a next-gen console.
 

Panajev2001a

GAF's Pleasant Genius
as for RT, the simple reason AMD GPU RT is so bad is because they do not do BVH hardware acceleration. They let their shader cores do it. RDNA4 RT cores are expected to also accelerate the BVH tree which would make all the difference.
nVIDIA has implemented BVH traversal acceleration in HW, and devs seem to vastly prefer that as an option (PS5 titles that want control over it will probably retain it; in some cases it might be advantageous… then again, that is the beauty of consoles: allowing APIs to expose different facets of the HW more directly). On top of that, they are able to reorder workloads in order to batch work together more effectively (ray coherency sorting, ensuring you process batches of rays going in the “same” direction, is a way to minimise execution divergence, which is bad for HW efficiency). Ray reconstruction and HW-accelerated ML image denoising help them reduce how many rays they have to throw (hopefully this is something there is HW acceleration for in PS5 Pro too, but PS6 is where the big new AI dream will truly blossom IMHO).

PowerVR ray coherency tracking and sorting: https://blog.imaginationtech.com/co...racing-the-benefits-of-hardware-ray-tracking/

nVIDIA's solution (not sure if Apple implemented the whole PowerVR approach, but it is likely, although they just won't divulge those details) needs more software assistance, but it does a lot of the hard work in HW and yet is more general: https://developer.nvidia.com/blog/i...frame-rates-with-shader-execution-reordering/ . This is likely what AMD is working on too.

Ray reconstruction (DLSS feature): https://www.nvidia.com/en-gb/geforce/news/nvidia-dlss-3-5-ray-reconstruction/

Apple, with their PowerVR IP, caught up to this level of RT capability (PowerVR actually was there years ago, before nVIDIA), and I think AMD with Sony will try to have something similar before full RDNA4 is out on desktop; PS5 Pro is a good product to try it all on (getting Sony to finance your R&D too).
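The ray coherency sorting idea described above can be illustrated with a toy sketch: bin rays by the octant of their direction vector so rays heading the "same" way end up in contiguous batches. This is purely illustrative, not any vendor's actual implementation (real hardware sorts at much finer granularity):

```python
from collections import defaultdict

def sort_rays_by_direction(rays):
    """Toy ray-coherency sort: bucket rays by the octant of their
    direction vector (sign of dx, dy, dz) so rays with similar
    directions are processed together, reducing execution divergence
    in a SIMD/warp-style execution model."""
    buckets = defaultdict(list)
    for origin, (dx, dy, dz) in rays:
        # Pack the three sign bits into a 0..7 octant index.
        octant = (dx >= 0) << 2 | (dy >= 0) << 1 | (dz >= 0)
        buckets[octant].append((origin, (dx, dy, dz)))
    # Concatenate buckets: each contiguous run now shares an octant.
    return [ray for key in sorted(buckets) for ray in buckets[key]]

rays = [
    ((0, 0, 0), (1.0, 1.0, 1.0)),
    ((0, 0, 0), (-1.0, -0.5, -0.2)),
    ((0, 0, 0), (0.5, 0.9, 0.1)),
]
print(sort_rays_by_direction(rays))  # the two +x/+y/+z rays end up adjacent
```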
 
Last edited:

RavionUHD

Member
Thinking about 8K on a PS5 Pro is completely absurd (for high-fidelity games).
New games like Avatar run at 720p internally in Performance mode; a PS5 Pro could do the same at 1080p with the alleged specs.
 

onQ123

Member
Maybe it is true, but I'm not sure Pro is the correct name for a console releasing 6 years later; even 2025 is late for a Pro, IMO. It is more like a next-gen console.
The X is the "Pro" model. What we have is the smartphone/tablet rollout: they might not sell much with each device, but it disrupts the model where PlayStation is king. And with Xbox hiding console sales, soon it will look like PlayStation is falling off, because people will not buy consoles so fast if they know another one is right around the corner.

Fake Edit: I thought I posted this hours ago but I didn't finish the post & don't even remember what I was saying so I deleted the other half lol
 

Hunnybun

Member
The ML processor? Yes, I will agree with you on that, simply because FSR doesn't need it. Would reconstruction be better with it? Absolutely, but FSR is good enough as is as long as the PS5pro keeps a base rez of 1440p in fidelity mode or 1080p in performance mode. The ML hardware, I can see that being a PS6 thing.

As for the RT stuff, that I believe is a shoo-in unless AMD is just that useless. I feel we are making it sound a lot more than it is. It's not dedicated RT cores, but more like 3rd-generation RT cores. RDNA2 had 1st gen, RDNA3 had 2nd gen, and RDNA4 will have 3rd gen. I believe that 3rd-gen RT from RDNA4 and 2nd-gen dual issue compute from RDNA4 are the things the RDNA3.5 PS5pro GPU will take from RDNA4.

As far as AMD GPUs go, the two stand-out areas that make them suck compared to Nvidia are RT and ML hardware. Not gonna hold my breath on the ML stuff as AMD seems to focus on their hardware-agnostic FSR, but as for RT, the simple reason AMD GPU RT is so bad is because they do not do BVH hardware acceleration. They let their shader cores do it. RDNA4 RT cores are expected to also accelerate the BVH tree which would make all the difference.

So a PS5pro having ~50% more RT cores, that are also clocked higher and faster because they natively handle BVH compute, would result in having RT that is over 3-4x better than what is in the PS5 currently.

I wonder if that amount of RT improvement would mean that a CPU bound game (ie as GTA VI is predicted to be) that has lots of RT could see the load on the CPU reduced enough that some kind of performance mode could become a possibility.
 

HeisenbergFX4

Gold Member
I wonder if that amount of RT improvement would mean that a CPU bound game (ie as GTA VI is predicted to be) that has lots of RT could see the load on the CPU reduced enough that some kind of performance mode could become a possibility.
Imagine the sales for these next consoles (whatever their naming) if they were the only place to play GTA6 at 60 fps while the current consoles struggle to maintain 30

 

Bluntman

Member
Nvidia's full-featured RT hardware supports DXR Tier 1.1 inline raytracing.

RDNA 2 is missing early BVH subtree culling hardware. RDNA 2 RT hardware is the inferior implementation.

While DXR 1.1 is a lot better than DXR 1.0, it's nowhere near PS5's custom API, which (for example) brings RT traversal under full programmatic control. So as I said earlier, having dedicated RT hardware for stuff like traversal would actually be a step back.

Other improvements to the RT units are possible, obviously.
 