
Rumor: PS5 Pro Codenamed "Trinity" targeting Late 2024 (some alleged specs leaked)

Would you upgrade from your current PS5?

  • For sure: 377 votes (41.0%)
  • Probably: 131 votes (14.2%)
  • Maybe: 127 votes (13.8%)
  • Unlikely: 140 votes (15.2%)
  • Not a chance: 145 votes (15.8%)

  • Total voters: 920

ChiefDada

Gold Member
Well, I stand corrected then; I didn't know anyone here was saying it. And sorry for calling the then-hypothetical you... delusional. I'm going to address why I said that, though, since I am talking to an actual person who thinks it.

No worries! I like speculating and debating with people so close to the full spec leaks!
Yes, we have the quality and performance modes in games. Typically, the quality mode runs at 1440p-4K + DRS + reconstruction, usually has the complete suite of IQ features the game offers, and runs at 30fps. And then we have the performance mode, which, in addition to running at 720p-1080p before reconstruction to usually 1440p, also cuts back on some IQ presets: changes to draw distance, geometric detail, RT, shadows, etc.

So the question is, how do you think they use that extra power?

They use the power to deliver "uncompromised" (today's fidelity mode) ray tracing both in terms of image quality (native 4K minimum) and performance (60fps minimum). That is the one and only goal of the PS5 Pro. From my perspective, both HW AI upscaling and HW-accelerated ray tracing are a lock for PS5 Pro. I am basing this entirely on Tom Henderson, who has a perfect track record so far as it relates to PlayStation. Tom mentions accelerated ray tracing by name in his report, and his mention of an 8K performance mode all but confirms AI HW in my opinion.

To keep things simple I will focus on the base PS5 fidelity mode, where we both agree native resolution is typically in the 1440p-4K range, usually alongside RT. In my opinion, the base PS5 is GPU limited because of ray tracing. It's not far-fetched to anticipate an RT-focused mid-gen upgrade to have a 2x RT lift when RDNA 2 is the baseline. Therefore, we will already be at native 1440p-4K locked 60fps, even before any raster uplift from the larger GPU. When adding the rumored 50% raster uplift, we're now at a locked 4K60 for the majority of games. So why the need or desire to upscale from native 4K? Because even Sony's best TAA solution is no match for an AI-driven upscaler, and why not provide ultra-pristine anti-aliasing if it has minimal render cost for AI inferencing (~1 millisecond for a 3090 in Alex's video going from 1440p to 8K)? It's not redundant because, unlike CGI and movies, real-time games still struggle far worse with aliasing even at native 4K.
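
To put that ~1 millisecond figure in perspective against a console frame budget, here is a rough back-of-envelope sketch in Python; the 1 ms cost is the 3090 number from the DF video, and the Pro's actual inferencing cost is unknown, so the percentages are purely illustrative:

```python
# Rough frame-budget arithmetic: how much of a 30/60/120 fps budget a ~1 ms
# AI upscaling pass would eat. The 1 ms figure is the RTX 3090 number quoted
# from the DF video above; the PS5 Pro's actual cost is unknown (assumption).

def frame_budget_ms(fps: float) -> float:
    """Time available per frame at a given target framerate."""
    return 1000.0 / fps

UPSCALE_COST_MS = 1.0  # assumed, per the cited 3090 figure

for fps in (30, 60, 120):
    budget = frame_budget_ms(fps)
    share = UPSCALE_COST_MS / budget * 100
    print(f"{fps:>3} fps -> {budget:5.1f} ms budget, upscale pass ~{share:.0f}% of the frame")
```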

We keep saying image quality going from 4K to 8K isn't a big improvement anymore, but the GPU power needed to push 4x more pixels each frame is crazy big.
The PS5 Pro won't have a 4090 and a 7800X3D, and even if it did, it still wouldn't be able to push native 8K resolution in actual current-gen games.

It wouldn't be native 8k. I've been saying that this entire time. It would render the same number of pixels or slightly higher than base PS5.

I honestly cannot see them giving a toss about 8K upscaling at this point; maybe PS6, but they are not going to shift many 8K TVs in the next few years. I mean, PC gamers have barely transitioned to 4K (I am still on two QHD screens, as decent 4K panels with good refresh are too expensive, as is the GPU horsepower). When the PS4 Pro appeared, I and many of my friends had already transitioned to 4K; in fact, I was on my second 4K Sony TV. It's not the same this time round.

It's not about 8k TV adoption. It's about AI Upscaling being, by far, the best solution for reconstruction and TAA. See example below.

E1ETm5J.jpg
 

welshrat

Member
Can't see it being a good use of resources personally, but hopefully we will get more leaks next year, with possibly a reveal in February, so we will hopefully know more then.
 

THE:MILKMAN

Member
It would be really stupid for Sony to leave dev kits out that long, especially if it prevents them from using better tech. Imagine if just leaving these dev kits sitting for more than a year is the reason we can't get, say, Zen 5 over Zen 4.

Well, the solid Henderson rumours say the dev kits are now in third-party devs' hands (and logically even earlier for Sony devs) and the launch is November '24. That is more than a year.

The PS4 Pro dev kit didn't start to go out to third-party devs until April, and it also launched in November of '16.

Either the PS5 Pro is coming earlier (like Spring) or Sony thinks devs will need more time with the HW or some other reason. A year plus with the dev kit seems a long time either way.

Some expectations in this thread are getting a bit out of hand; it's a PS5 Pro, people, not a PS6. Don't expect a generational leap.

I'm still confident the specs will land close to what I said back in July: 2x GPU with new RDNA features/generation, everything else the same besides maybe SSD capacity.

Some surprise stuff from Cerny would be nice.
 

Mr.Phoenix

Member
No worries! I like speculating and debating with people so close to the full spec leaks!


They use the power to deliver "uncompromised" (today's fidelity mode) ray tracing both in terms of image quality (native 4k minimum) and performance (60fps minimum). That is the one and only goal of the PS5 Pro. From my perspective, both HW AI upscaling and HW accelerated ray tracing are a lock for PS5 Pro. I am basing this entirely on Tom Henderson who has a perfect track record so far as it relates to Playstation. Tom mentions accelerated ray tracing by name in his report, and his mention of 8k performance mode all but confirms AI hw in my opinion.
I can see a situation where fidelity mode is left as is and the extra power allows the DRS to stick at its 4K upper bound while staying at 30fps. Or even a 40fps mode. Or one where the PS5 Pro simply takes the OG PS5 fidelity mode and runs it at 60fps instead of 30fps.
To keep things simple I will focus on base PS5 fidelity mode where we both agree native resolution is in 1440-4k range typically alongside RT usage. In my opinion, base PS5 is GPU limited because of ray tracing. It's not farfetched to anticipate an RT focused mid-gen upgrade to have 2x RT lift when RDNA 2 is the baseline. Therefore, we will already be at native 1440p-4k locked 60fps, even before any raster uplift from larger GPU. When adding rumored 50% raster uplift, we're now at locked 4k60 for majority of games.
That's the thing though: going from 1440p to 4K isn't 50% more pixels, it's about 125% more pixels (2.25x). The PS5 Pro would either take a game from 1440p to native 4K and keep the fps at 30, or keep the rez at 1440p and take the fps up to 60. Not both. And this is where reconstruction comes in, and also why that whole 8K thing doesn't make sense. It just makes more sense to run the game natively at 1440p and reconstruct that up to 4K, while maintaining all the best IQ presets of a fidelity mode and running at 60fps.
So why the need or desire to upscale from native 4k. Because even Sony's best TAA solution is no match for an AI driven upscaler and why not provide ultra pristine anti aliasing if it has minimal render cost for AI inferencing (~1 millisecond for 3090 in Alex's video going from 1440p to 8k). It's not redundant because unlike CGI and movies, real-time games still struggle far worse with aliasing even at native 4k.
I hope I have made you understand why native 4K... even if used as some sort of TAA solution, is just not worth it. Like, not worth it at all. No one is looking at the games on these current-gen consoles and saying their biggest problem is AA. What we want is higher framerates and better rez/IQ while getting that.
It wouldn't be native 8k. I've been saying that this entire time. It would render the same number of pixels or slightly higher than base PS5.
Nope. I know what you are thinking here: you are looking at how, with reconstruction (AI to be exact), you can take a 1080p game and make it look like a 4K game, so with the same techniques you could also take a 4K game and make it look like an 8K game, then supersample that back down to 4K to get awesome AA.

That is a herculean waste of resources when all you have to do is render at 1440p and AI-reconstruct that to 4K, and you end up with a game that looks as good as, and at times better than, any native 4K image, while also running at 60fps and, most importantly, fitting within the memory and bandwidth confines of the hardware. Your solution doesn't just mean the game stays at 30fps; it will actually need more RAM to pull off than the alternative.
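
For reference, the raw pixel counts behind the 1440p vs 4K vs 8K argument work out as below; the bytes-per-pixel figure is just an illustrative assumption for a single render target, not a claim about any particular engine's memory layout:

```python
# Raw pixel counts behind the 1440p vs 4K vs 8K argument, plus a rough
# per-target framebuffer size. Bytes-per-pixel is an illustrative assumption
# (e.g. FP16 colour + depth), not how any particular engine lays out memory.

RESOLUTIONS = {
    "1440p": (2560, 1440),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
}
BYTES_PER_PIXEL = 8  # assumed

base = RESOLUTIONS["1440p"][0] * RESOLUTIONS["1440p"][1]

for name, (w, h) in RESOLUTIONS.items():
    pixels = w * h
    print(f"{name:>5}: {pixels / 1e6:5.2f} MP "
          f"({pixels / base:.2f}x vs 1440p), "
          f"~{pixels * BYTES_PER_PIXEL / 2**20:6.1f} MiB per render target")
```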
 
Those games will then have higher resolution, more impressive RT, and likely better upscaling.

I'm referring to the games that don't run at 60 on the base PS5 model. I doubt a Pro would get a stable 60 if the CPU is the reason for 30fps, which may happen with GTA6.

Take Alan Wake 2 for example. Quality mode on the consoles run at 1296p upscaled to 4k with FSR and no RT. The Pro can almost certainly run that at 1296p upscaled to 4k with whatever newer upscaling solution is developed with RT reflections and at a stable 60fps. It's not going to play it at native 4k/60fps with multiple RT effects, not even the 4090 can do that.

Just pointing out that expecting AMD or Sony to leapfrog Nvidia for a mid-gen upgrade is almost certainly not happening.
I think a bit higher; I think the Pro will do it at 1440p/60 with maybe two RT effects.
 
If it's pure apples to apples, sure, but the current PS5 is already punching above its weight; even a slight upgrade with some sort of DLSS or frame-gen clone would be a massive uptick in image quality and frames.

The consoles aren’t aiming for 4k120fps


4k upscaled from a fairly decent base and 60 is plenty enough. That base being something akin to Quality or balanced on DLSS.
I do actually think the Pro would be aiming for 4K/120 on remasters and simple-to-run games. I imagine all those cross-gen PS4 games with PS5 upgrades could get near it.
 
I think people are going to be very disappointed once the real specs of the PS5 Pro are announced. There's no way a £500-£600 console is going to perform like a 4070 Ti, and certainly not a 4080 ffs.
Rasterization is different from ray tracing. It also wouldn't apply to every game, only the most optimized ones like exclusives.
 
Well the solid Henderson rumours say the dev kits are now in third-party devs hands (and logically even earlier for Sony devs) and the launch being November '24. That is more than a year.

PS4 Pro dev kit didn't start to go out to third-party devs until April and it also launched in the Nov of '16.

Either the PS5 Pro is coming earlier (like Spring) or Sony thinks devs will need more time with the HW or some other reason. A year plus with the dev kit seems a long time either way.



I'm still confident the specs will land close to what I said back in July: 2xGPU with new RDNA features/generation everything else the same besides maybe SSD capacity.

Some surprise stuff from Cerny would be nice.
I really hope the rumor that the dev kits are already out is somehow wrong. We as consumers should want this leak to be wrong; it only means a worse Pro.
 
These same arguments come up all the time; it wasn't that long ago that 4K "wasn't needed" and was said to be not noticeable (diminishing returns) compared to 1080p. After 8K, people will complain that you can't see any better than 8K and that 16K isn't needed. History repeats.
Tbf, 8K is genuinely the max you would ever need on a TV; at least, it's what most movies are rendered at, and you need a TV that's a bare minimum of 75+ inches, but preferably 83+, to see any real difference from 4K. For 16K you would need a 120+ inch TV to see any difference from 8K, and no one owns that and I doubt anyone ever will. 16K makes more sense for VR.
 
I don't think anyone here (and if anyone here is saying that then they are delusional) expects the PS5pro to run anything at native 4k. Hell, even the PS4pro used CBR to do 4K. And the pro is no different.

Short of coming out and flat-out saying it, I think Sony has made it clear that when it comes to gaming, they see native 4K as a waste of resources. I don't even think game engines are designed with native 4K in mind anymore.

And what do you mean by leapfrog Nvidia? I am sure I (and quite a few others) have said that the PS5 Pro is going to be a fidelity mode at 1440p reconstructed to 4K@60fps console. Basically, every game (or type of game) that has a performance mode right now will be running at 60fps with settings slightly better than whatever that same game uses in its fidelity mode.
Some games actually would do 4K/60+ on the Pro, like Rift Apart, tbf; just not every game.
 

THE:MILKMAN

Member
I really hope the rumor that the dev kits are already out is somehow wrong. We as consumers should want this leak to be wrong; it only means a worse Pro.

I'm afraid that if you do believe the November '24 launch date, then it is almost certain the specs are locked today (quite a bit before today, even). I seem to remember that with the PS4 Pro the rumours started at GDC '16, and then the docs leaked in mid-March stating the dev kits were going out to third parties in April. But further snooping I did showed, on the Japanese MIC site (like the FCC), that the dev kit was tested in December 2015, and we know the APU at least was final spec per the leaked PDF.

I still think the Pro will put out some quality-looking games, and the paper specs shouldn't be seen as disappointing. The reality is it is ever harder and more expensive to get big boosts in performance.
 
Lol @ people saying 8K is useless... until they see 8K supersampling on their 4K TV and swear they can never go back to native 4K/TAA. Those DF comparisons will be something else. We've been down this road plenty of times before. Pro models are enthusiast-class consoles per Sony's own admission, so the people who shun excess and overindulgence should know there's a console available for them too, and that's the base PS5.

fkXBB0a.jpg




I'm saying it. There are plenty of games that run comfortably in the 1440p-4k/30 range on base PS5 that can benefit from AI upscaling to 8k. Avatar, Spider-Man 2, God of War Ragnarok, Forbidden West/Burning Shores, TLOU Pt 1, Death Stranding, the list goes on.

To my knowledge, Tom Henderson has a perfect track record when it comes to Sony insider info. The 8k performance mode is real.
Also, simpler-to-run games like Persona 5, which have no issue doing native 4K/60 on the base console, could likely do native 8K on a Pro, so there's that as well.
 
Well, I stand corrected then, didnt know anyone here was saying it, And sorry for calling the then hypothetical you.. delusional. Gonna address why I said that though since I am talking to an actual person who thinks it.

Yes, we have the quality and performance mode in games. Typically, that quality mode is running at 1440p-4K + DRS + reconstruction and usually has the complete suite of IQ features the game offers. While running at 30fps. And then we have the performance mode that would in addition to running at 720p-1080p before reconstruction to usually 1440p, also cut back on some IQ presets. Can be changes to draw distance, geometric detail, RT, shadows...etc.

So the question is, how do you think they use that extra power?

I think they use it to get the quality mode to 60fps (or at the very worst above 50fps at all times + VRR), while even adding a few things that are better than the base quality mode, and to get the performance mode as close to 120fps as possible with VRR while using a minimum rez of 1080p. I believe this is the route they go because it's something that the majority of people that own a TV today/2024/2025/etc. will benefit from.

I do not see them channeling that power towards 8K anything considering how few people have an 8K TV. And that whole AI reconstruction to 8K thing and then basically supersampling back down to 4K... is redundant if you are running on a 4K TV.
Tbf, one benefit of targeting 8K is that if they have the power to even get close on some games, it would guarantee they could do 4K/60 as a performance mode, or even 4K/120. As long as this 8K is a resolution mode and not the only mode, I actually think it will be beneficial to target.
 
Honestly cannot see them giving a toss about 8k upscaling at this point, maybe PS6 but they are not going to shift many 8k TVs in the next few years. I mean PC gamers have barely transitioned to 4k (I am still on 2 QHD screens as decent 4k panels with good refresh are too expensive as is the GPU horsepower.). When the PS4 Pro appeared I and many of my friends had already transitioned to 4K, in fact I was on my second 4K SONY TV. it’s not the same this time round.
I think the PS6 will support up to 120fps or higher for 8K, since HDMI 3.0 should be out by then.
 
We keep saying image quality going from 4K to 8K isn't a big improvement anymore, but the GPU power needed to push 4x more pixels each frame is crazy big.
The PS5 Pro won't have a 4090 and a 7800X3D, and even if it did, it still wouldn't be able to push native 8K resolution in actual current-gen games.
That's why the GTA6 trailer looks so good, much better than anything else, and it's not even 1440p native; same with the Matrix demo. Those two examples scream current/next gen like nothing else.

If you decide to use up GPU power pushing native 8K, then games won't be able to look next gen. Look how little complexity The Touryst has, which internally runs at 8K60 on PS5 (yes, it outputs at 4K60); take TLOU2 from the 1.8TF base PS4 and it devours it with ease in terms of which game actually looks better.

Quick example: PS4 at 1080p30. For the same game (same quality settings), to run it at 4K60 you need 8x the GPU and 2x the CPU power; to run it at 8K60 you need 32x the GPU power and 2x the CPU power, and the game won't look much better to your eyes than the 4K60 version.

Diminishing returns are real; you can check yourself. 720p (921.6k pixels) to 1080p (2,073.6k pixels) is a bigger jump to your eyes than 1080p to 4K, despite 1080p to 4K being exactly 4x the amount of pixels, while 720p to 1080p is only 2.25x the pixel count.

Above native 4K you really have to sit extremely close to the screen to notice even small differences, and the price you have to pay for the jump from native 4K to 8K is humongous, just as much as 1080p to 4K in terms of GPU power needed.

That's the reason people keep saying it's not worth it: that GPU oomph, especially in a small-form-factor console box, can be used much better for many other things, not for a tiny increase in image quality that's already superb at native 4K.
One benefit of targeting 8K is that even getting close guarantees you can also do 4K/60 minimum as a performance mode, so I actually think it's a good target.
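
As a quick sanity check on the arithmetic in the quoted post, here is the naive pixels-times-framerate math; it assumes GPU cost scales linearly with resolution and framerate, which real games don't quite follow, so treat the multipliers as ballpark figures:

```python
# Naive "pixels per second" multipliers versus a 1080p30 baseline, i.e.
# assuming GPU cost scales linearly with resolution and framerate. Real
# scaling is never this clean, so treat these as ballpark sanity checks.

PIXELS = {
    "720p":  1280 * 720,    #    921,600
    "1080p": 1920 * 1080,   #  2,073,600
    "1440p": 2560 * 1440,   #  3,686,400
    "4K":    3840 * 2160,   #  8,294,400
    "8K":    7680 * 4320,   # 33,177,600
}

BASELINE = PIXELS["1080p"] * 30  # 1080p at 30 fps

for name, px in PIXELS.items():
    for fps in (30, 60):
        mult = px * fps / BASELINE
        print(f"{name:>5}@{fps}: {mult:6.2f}x the pixel throughput of 1080p30")

# Key outputs: 4K60 -> 8.00x and 8K60 -> 32.00x, while the 720p -> 1080p step
# is only 2.25x the pixels versus 4.00x for 1080p -> 4K (the diminishing-returns point).
```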
 

ChiefDada

Gold Member
That's the thing though: going from 1440p to 4K isn't 50% more pixels, it's about 125% more pixels (2.25x).

It doesn't need to hit native 4K in order to be beneficial, as seen in the DF video I posted. 1440p AI-upscaled to 8K is noticeably better than native 4K with normal TAA, and even more so than native 1440p upscaled to 4K.

The PS5 Pro would either take a game from 1440p to native 4K and keep the fps at 30, or keep the rez at 1440p and take the fps up to 60. Not both. And this is where reconstruction comes in, and also why that whole 8K thing doesn't make sense. It just makes more sense to run the game natively at 1440p and reconstruct that up to 4K, while maintaining all the best IQ presets of a fidelity mode and running at 60fps.

And this is where we disagree: you think it's an "either/or" situation, and I believe the bigger/faster GPU coupled with additional RT hardware will yield a compounding effect. Like I said before, if a ray traversal unit were the only improvement over the base PS5 GPU, that would be enough to have native 1440p-4K locked at 60fps for existing well-optimized games like Ratchet and Spider-Man.

Nope. I know what you are thinking here, you are looking at how with reconstruction (AI to be exact), you can take a 1080p game and make it look like a 4K game, so with the same techniques you can also take a 4K game and make it look like an 8K game then supersample that back down to 4K to get awesome AA.

That is a herculean waste of resources when all you have to do is render at 1440p and AI-reconstruct that to 4K, and you end up with a game that looks as good as, and at times better than, any native 4K image, while also running at 60fps and, most importantly, fitting within the memory and bandwidth confines of the hardware. Your solution doesn't just mean the game stays at 30fps; it will actually need more RAM to pull off than the alternative.

I don't see how it's a waste of resources when it would be AI hardware and not the GPU pixel shaders doing the work.
 
I'm afraid if you do believe the November '24 launch date then it is almost certain the specs are locked today (quite a bit before today even). I seem to remember that with the PS4 Pro the rumours started at GDC '16 and then the docs leaked in mid March that stated the dev kits were going out to TP in April but further snooping I did showed on the Japanese MIC site (like the FCC) the dev kit was tested in December 2015 and we know the APU at least was final spec per the leaked PDF.

I still think the Pro will put out some quality looking games and the paper specs shouldn't be seen as disappointing. The reality is it is ever harder and expensive to get big boosts to performance.
Yeah, I do expect November-December 2024 (though a small delay to first half 2025 isn't impossible). It's weird to finalize the specs this early; even in your Pro example they weren't finalized this early. They just need to wait an extra month or two and Zen 5 could make it. Is Sony not aware of this?
 
Some simpler-to-run games actually would be native 8K, though, like Crash Bandicoot, Persona 5, potentially the upcoming Persona 3 Reload, etc.
 

Panajev2001a

GAF's Pleasant Genius
Like I said before, if ray traversal unit was the only improvement from base PS5 GPU that would be enough to have native 1440p-4k locked 60fps for existing well optimized games like Ratchet and Spiderman.
If, like the rumours say, they have AI for ray reconstruction, and the GPU supports ray coherency sorting on top of that, it should be a nice boost RT-capabilities-wise. Reflections in the 40Hz quality mode should be much, much improved, and they can go ballistic with crowds, details, and resolution improvements.
 

Zathalus

Member
I don't see how it's a waste of resources when it would be AI hardware and not the GPU pixel shaders doing the work.
Because upscaling is not free, even with dedicated AI cores. 1440p->4K is far easier to run than 1440p->8K.

Those resources are better spent elsewhere, like better RT or more advanced graphical options.
 

ChiefDada

Gold Member
Because upscaling is not free, even with dedicated AI cores. 1440p->4K is far easier to run than 1440p->8K.

Those resources are better spent elsewhere, like better RT or more advanced graphical options.

For the 3090, Alex says ~1 millisecond. It would be even less the closer we can render to native 4K. That is negligible for a PS5 Pro equivalent with a 30-60 fps budget. That is the benefit of the console arena, where 30fps and 60fps are tolerable. Also, the AI and RT hardware are presumably distinct.
 

Zathalus

Member
For 3090 Alex says ~1 millisecond. Would be even less the closer we can render to native 4k. That is negligible for a PS5 Pro equivalent with 30-60 fps budget. That is benefit of console arena where 30fps and 60fps is tolerable. Also the AI and RT hardware are presumably distinct.
8K Ultra Performance with a 3090 is around 38fps; 4K Quality (so the same internal res of 1440p) is around 54fps. So you lose a third of your performance for a slight image quality bump on a 4K display. Ultra Performance 8K does perform only slightly worse than straight-up native 4K, so perhaps Alex is referring to that. If your choice is between those two, then the 8K option can make sense, but these days most would just use DLAA.

AI and RT hardware have always been distinct on Nvidia, but upscaling will always have a cost, with or without AI, because the upscaling itself takes place on the shaders. With DLSS (and XeSS) the AI is just used as a pass to enhance the upscaled image, to solve issues like ghosting, image stability, and fine detail. Hence why DLAA has no performance cost vs native, as it is DLSS without the upscaling part.
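
Translating those quoted framerates into frame times makes the cost gap between the two output resolutions easier to see; a small sketch, assuming the 54 fps and 38 fps figures are directly comparable at the same 1440p internal resolution:

```python
# Converting the quoted RTX 3090 framerates into frame times shows roughly
# what the 1440p -> 8K output path costs over 1440p -> 4K. Both values are
# the benchmark numbers cited above, not PS5 Pro measurements.

FPS_4K_QUALITY = 54.0      # DLSS Quality, 1440p internal -> 4K output
FPS_8K_ULTRA_PERF = 38.0   # DLSS Ultra Performance, 1440p internal -> 8K output

t_4k = 1000.0 / FPS_4K_QUALITY
t_8k = 1000.0 / FPS_8K_ULTRA_PERF

print(f"4K output: {t_4k:.1f} ms/frame, 8K output: {t_8k:.1f} ms/frame")
print(f"Extra cost of the 8K path: ~{t_8k - t_4k:.1f} ms/frame "
      f"({(1 - FPS_8K_ULTRA_PERF / FPS_4K_QUALITY) * 100:.0f}% fewer frames per second)")
```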
 

bitbydeath

Member
Tbf 8k is genuinely the max you would ever need on a tv at least it’s what most movies are rendered at and you need a tv bare minimum 75+ inches but preferably 83+ to see any real difference from 4k. For 16k you would need a 120+ inch tv to see any difference from 8k and no one owns that and I doubt ever will. 16k makes more sense for vr
The ability of the human eye to resolve detail is fixed, so there is an upper limit at a given screen size and viewing distance. It looks like higher-than-4K resolutions are "worth it" with a 65" screen sitting less than 1.2m away. So if you use a huge TV like a regular PC monitor, 8K can deliver some benefits. I wonder what proportion of the audience that applies to.

We saw the benefits of 8K textures in the UE5 demo; I'm pretty sure everyone saw the benefits, no matter the screen type.
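
For anyone who wants to check the viewing-distance claim, a rough pixels-per-degree calculation is below; the 65-inch size, distances, and ~60 px/deg acuity threshold are approximate example values, not hard limits:

```python
import math

# Pixels-per-degree for a 16:9 panel versus the ~60 px/deg rule of thumb for
# 20/20 acuity. Screen size and distances are example values chosen to match
# the 65" / 1.2 m figures mentioned above; the acuity threshold is approximate.

ACUITY_PPD = 60  # rough 20/20 limit, in pixels per degree

def pixels_per_degree(diag_inches: float, horiz_pixels: int, distance_m: float) -> float:
    width_m = diag_inches * 0.0254 * 16 / math.hypot(16, 9)  # panel width for 16:9
    fov_deg = math.degrees(2 * math.atan(width_m / (2 * distance_m)))
    return horiz_pixels / fov_deg

for dist in (1.2, 2.0, 3.0):
    ppd_4k = pixels_per_degree(65, 3840, dist)
    ppd_8k = pixels_per_degree(65, 7680, dist)
    print(f'65" at {dist} m: 4K ~{ppd_4k:.0f} px/deg, 8K ~{ppd_8k:.0f} px/deg '
          f"(eye resolves ~{ACUITY_PPD})")

# At ~1.2 m a 65" 4K panel sits right around the acuity limit, so 8K can still
# add visible detail; at typical 2-3 m couch distances 4K already exceeds it.
```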

 
For 3090 Alex says ~1 millisecond. Would be even less the closer we can render to native 4k. That is negligible for a PS5 Pro equivalent with 30-60 fps budget. That is benefit of console arena where 30fps and 60fps is tolerable. Also the AI and RT hardware are presumably distinct.
I assume any game that does 4K/120 on the Pro could use this AI upscaling, right?
 
What?

35mm film reaches 4K tops

Digital VFX in Hollywood are still rendered at 2K resolution (2048x1080) to save money

8K is a joke that isn't funny....

Absolutely pointless format
No, in some animated movies the models are rendered in 8K, like some DreamWorks and Pixar movies.
 
No, in some animated movies the models are rendered in 8K, like some DreamWorks and Pixar movies.

Who cares??? I'm talking about real movies, live action

99% of the cinema/TV contents need to be UPSCALED to get to 8K....

Why do you think nobody is buying 8K TVs????

Because it doesn't make sense to watch everything upscaled on some GIANT TV that would be required anyway to tell the theoretical difference
 

ChiefDada

Gold Member
Who cares??? I'm talking about real movies, live action

99% of the cinema/TV contents need to be UPSCALED to get to 8K....

Why do you think nobody is buying 8K TVs????

Because it doesn't make sense to watch everything upscaled on some GIANT TV that would be required anyway to tell the theoretical difference

Movies/TV shows and real-time graphics are not the same. Which is why 8K supersampling can be a net positive and get us closer to the pre-rendered/photo-real look we all want in our real-time graphics.

iN5DMxq.jpg
 

Perrott

Gold Member
I do not see them channeling that power towards 8K anything considering how few people have an 8K TV. And that whole AI reconstruction to 8K thing and then basically supersampling back down to 4K... is redundant if you are running on a 4K TV.
Well, I wouldn't call it redundant if it kickstarts an arms race within Sony's first-party studios to come up with better upscaling solutions that would make the most of the enhanced AI capabilities that the PS5 Pro might offer, which I assume would also translate into benefits when having to use upscaling to get to 4K/60fps in performance modes.

Plus, PS6 is headed the 8K route anyway, so it doesn't hurt studios to start getting used to trying to reach such high-resolution targets.
 

splattered

Member
Ngl, I kinda hate these threads. It's fun when stuff actually leaks, but the speculation side of things goes off the rails so quickly and so often. I still have PTSD from the PS5/SX speculation thread... I remember saying stuff like Ratchet and Clank looked good but a lot of the games would probably be cross-gen at first, and that stuff like Ratchet could probably be done on a regular hard drive, unlike their SSD-needed claims. And I got absolutely dogpiled back then.
 

FireFly

Member
We saw the benefits of 8K textures in the UE5 demo, I’m pretty sure everyone saw the benefits, no matter the screen type.


The benefit of 8K textures is that they allow you to get closer to an object without a visible loss of detail, not that they magically allow your eyes to resolve more detail. If your viewing distance is sufficiently far, that 8K texture will not look more detailed on an 8K screen than it does on a 4K screen. But you will still see the "extra" detail as you get closer in-game.
 

MikeM

Member
Well, I wouldn't call it redundant if it kickstarts an arms race within Sony's first-party studios to come up with better upscaling solutions that would make the most of the enhanced AI capabilities that the PS5 Pro might offer, which I assume would also translate into benefits when having to use upscaling to get to 4K/60fps in performance modes.

Plus, PS6 is headed the 8K route anyway, so it doesn't hurt studios to start getting used to trying to reach such high-resolution targets.
We can't even master 4K (even a 4090 struggles, depending on the game). Why is 8K even remotely being considered?
 

bitbydeath

Member
The benefit of 8K textures is that they allow you to get closer to an object without a visible loss of detail, not that they magically allow your eyes to resolve more detail. If your viewing distance is sufficiently far, that 8K texture will not look more detailed on an 8K screen than it does on a 4K screen. But you will still see the "extra" detail as you get closer in-game.
Even the still image shows incredible texture detail. That’s the power of 8K.
 

Mr.Phoenix

Member
And this is where we disagree since you think it's an "either/or" situation and I believe the bigger/faster GPU coupled with additional RT hardware will yield a compounding effect. Like I said before, if ray traversal unit was the only improvement from base PS5 GPU that would be enough to have native 1440p-4k locked 60fps for existing well optimized games like Ratchet and Spiderman.
Nope, not true. If the PS5 had proper RT acceleration akin to an Nvidia GPU, then it would be equivalent in raster and RT to a 2070 Super, which runs R&C with RT at 1440p at around 39fps. Now, even if you doubled the power of the GPU (raster and RT), so you are basically in 4070 Ti territory, at that point you can only run R&C at native 4K at around 45fps, with the 2070 Super dropping to around 20fps.

There is an exponential growth in power required when running RT at higher rez. That's why, even at the lower native rez the PS5 runs at, they calculate RT at an even lower rez than that.

A 4070 Ti (mind you, the PS5 Pro should be more in line with a 4070, not the Ti) can run the game with RT at 1440p at 74fps. But AI reconstruction (AI-R) up to 4K is not free, you know, even if you have dedicated hardware for it. So if you factor in AI-R, that would probably bring it down to 65fps, and it would be even worse if you are pushing it up to 8K. And this is using what would be a properly optimized game; now imagine what third parties would do.

Mind you, these numbers are the absolute best-case scenarios using a GPU and CPU that would be better and more powerful than what we will get in the PS5pro in every single way.
I don't see how it's a waste of resources when it would be AI hardware and not the GPU pixel shaders doing the work.
As I said above, even AI-R is not free. On more capable hardware DLSS adds like 1.5 - 2ms per frame.
Well, I wouldn't call it redundant if it kickstarts an arms race within Sony's first-party studios to come up with better upscaling solutions that would make the most of the enhanced AI capabilities that the PS5 Pro might offer, which I assume would also translate into benefits when having to use upscaling to get to 4K/60fps in performance modes.

Plus, PS6 is headed the 8K route anyway, so it doesn't hurt studios to start getting used to trying to reach such high-resolution targets.
No. There is no race. If you are using AI hardware for reconstruction, then it means you are using machine-learning reconstruction. How that works is that there is a more or less fixed instruction set or algorithm that the AI runs to handle the reconstruction. It's not something devs can improve themselves or compete with each other on; Sony would be the one making updates to the tech at the SDK level.

This 8K talk is just nonsense and unreasonable. I don't even see it for the next gen. How many 8K TVs or monitors are you seeing around right now? 8K would have to be making the rounds in the high-end PC space right now to be something that a PS6 could try to do in 2030.
 

ChiefDada

Gold Member
Nope. Not true. If the PS5 had proper RT acceleration akin to an Nvidia GPU, then it would be equivalent in raster and RT to a 2070 super. Which runs R&C with RT at 1440p @ around 39fps. Now even if you doubled the power of the GPU (raster and RT) so basically you are now in 4070ti territory. You at that point can only run R&C in native 4K at around 45fps with the 2070 super dropping to around 20fps.

You mean the 2070S with 8gb VRAM???


episode 6 glow up GIF by BBC Three
 

Mr.Phoenix

Member
You mean the 2070S with 8gb VRAM???


episode 6 glow up GIF by BBC Three
Pls... let's not do this. You're going to make me start questioning how well you know what you are talking about. That 2070S with 8GB VRAM is on a PC that can have as much as 32GB of RAM. And besides, I am not making up the numbers. You can check a myriad of review sites and see how R&C runs on different GPUs.

And as if DF heard us...


They are talking about how unnecessary even 4K is, much less 8K. Especially in this world of AI upscaling, there are simply far better things to attempt with that power than the absolute most expensive AA solution possible, which is all this 8K nonsense you are talking about is: AA.
 

ChiefDada

Gold Member
Pls... let's not do this. Gonna make me start questioning how well you know what you are talking about. That 2070s with 8GB VRAM is on a PC that can have as much as 32GB or RAM. And besides, I am not making up the numbers. You can check a myriad of review sites and see how R&C runs on different GPUs.

Are you kidding me!??? Even as a console fanboy I am telling you it wouldn't be appropriate to benchmark a 2070S w/ 8gb VRAM against PS5 fidelity mode settings in Ratchet and Clank Rift Apart. If you can find any video of 2070S running RC at native 4k 30fps with RT Reflections a la PS5 I'll gift you gold.

They are talking about how unnecessary even 4K is much less 8K. Especially in this world of AI upscaling, there are simply far better things to attempt to do with that power than to be trying to do the absolute most expensive AA solution possible. Which is all this 8K nonsense you are talking about is. AA.

Well, they believe the PS5 Pro itself is unnecessary, so that should clue you in on how much I value their opinion on these sorts of topics.
 

Mr.Phoenix

Member
Are you kidding me!??? Even as a console fanboy I am telling you it wouldn't be appropriate to benchmark a 2070S w/ 8gb VRAM against PS5 fidelity mode settings in Ratchet and Clank Rift Apart. If you can find any video of 2070S running RC at native 4k 30fps with RT Reflections a la PS5 I'll gift you gold.
I will do you one better. Let's take the 2080 Ti.

11GB VRAM
13.4TF
616GB/s of bandwidth
CPU: i9-13900K @ 5GHz+ and 32GB of RAM

Can we agree that the GPU+ CPU is more powerful than the PS5? Ok good.

First, 1440p
performance-rt-2560-1440.png


and then 4K
performance-rt-3840-2160.png



So on this test system, the 2080 Ti manages 1440p at 44.8fps and native 4K at 22fps.

Now mind you, these are NATIVE resolutions with better IQ presets than the PS5... oh, and better RT too. So no DRS or temporal injection.

So, having seen what hardware that is significantly more powerful than the PS5 can do, still wanna continue this hubris?

Now look at the 4070 Ti in that chart... you would need the PS5 to have a GPU that powerful, paired with a CPU as powerful too (hint: the PS5 Pro CPU is not even half as powerful), to have those kinds of results. And what you are hoping for with the whole 8K thing would be even more demanding than that. Oh... and the GPU in the PS5 Pro is not even remotely as powerful as the 4070 Ti.
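
Turning those chart numbers into frame times shows how far off a locked 60 fps (or even a locked 30 fps at 4K) that hardware is; a minimal sketch using the quoted benchmark figures:

```python
# Frame-time view of the quoted 2080 Ti results: a locked 60 fps needs every
# frame under ~16.7 ms and a locked 30 fps under ~33.3 ms. The fps values are
# the benchmark figures cited above, used here only as an illustration.

TARGETS = {"60 fps": 1000 / 60, "30 fps": 1000 / 30}

BENCHMARKS = {
    "2080 Ti, native 1440p + RT": 44.8,
    "2080 Ti, native 4K + RT":    22.0,
}

for name, fps in BENCHMARKS.items():
    frame_ms = 1000 / fps
    verdict = ", ".join(
        f"{'meets' if frame_ms <= budget else 'misses'} {target}"
        for target, budget in TARGETS.items()
    )
    print(f"{name}: {frame_ms:.1f} ms/frame -> {verdict}")
```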
 