
Lately, I am a little worried about the processing power of the PS5/Series X. Am I alone?

Chronicle

Member
When these consoles were announced everyone was touting how much of a true generational leap they were from the previous generation. Now some are saying that's not the case. The leap from PS3/360 to PS4/Xbox One wasn't that big due to the Jaguar chips. Most stated then that the consoles were already limited. Well, they still gave us The Witcher 3, TLOU2, and RDR2. I'm really hoping we see some big advancements soon. I mean, we really need that big MP shooter that will define this gen and another great RPG that can hopefully blow our minds. These are big, powerful machines with big cooling fans. I bet we have some awesome stuff to come. Fingers crossed.
 
When these consoles were announced everyone was touting how much of a true generational leap they were from the previous generation. Now some are saying that's not the case. The leap from PS3/360 to PS4/Xbox One wasn't that big due to the Jaguar chips. Most stated then that the consoles were already limited. Well, they still gave us The Witcher 3, TLOU2, and RDR2. I'm really hoping we see some big advancements soon. I mean, we really need that big MP shooter that will define this gen and another great RPG that can hopefully blow our minds. These are big, powerful machines with big cooling fans. I bet we have some awesome stuff to come. Fingers crossed.
3 words: physically based rendering

I always thought right away that the leap was huge last gen (remember the first time you saw KZ Shadow Fall or Ryse?); it culminated in The Order: 1886 and UC4. Then 4K/HDR and the mid-gen refresh helped keep that magic alive with Horizon and GoW. However, as the gen went on, that feeling waned when seeing PCs blow the consoles away at 60fps vs 30fps.

Fast forward to this gen. The only thing I see that could be as transformative as PBR is Ray Tracing and well, these consoles can't really do the type of RT that is needed to really blow me away (sorry but reflections at low resolutions aren't it). Comparing Control on PC vs console makes that pretty apparent. Or Dying Light 2 with RT GI, shadows and reflections. They look almost a gen ahead to my eyes.

I wish we had more powerful systems though they are good value for now.
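For anyone wondering what "physically based" actually buys you: most PBR specular models are built on a microfacet distribution like GGX. A minimal sketch in Python; the function is the standard GGX/Trowbridge-Reitz formula, and the roughness values are just example inputs:

```python
import math

def ggx_ndf(n_dot_h: float, roughness: float) -> float:
    """GGX/Trowbridge-Reitz normal distribution function: the
    microfacet term at the heart of most PBR specular models."""
    a2 = (roughness * roughness) ** 2
    d = n_dot_h * n_dot_h * (a2 - 1.0) + 1.0
    return a2 / (math.pi * d * d)

# A rougher surface spreads the highlight's energy out, so its peak
# (at n_dot_h = 1.0) is far lower than a smooth surface's.
print(ggx_ndf(1.0, 0.2) > ggx_ndf(1.0, 0.8))  # True
```

The appeal is that one such model reacts plausibly under any lighting, instead of artists hand-tuning speculars per scene.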
 

TrebleShot

Member
These are mass consumer lowest common denominator boxes.
They are aimed at movie/MP lovers who mostly play on large TVs in their living rooms, about 5-10 feet away.

Therefore you will get 4K prioritised so that it looks nice and sharp on whatever cheap huge 4K TV someone has bought, and their MP games run smoothly, hence 60/120fps for such experiences.

A mid-gen refresh won't happen; you'll get PS Now streaming at higher fidelity.
 

Chronicle

Member
3 words: physically based rendering

I always thought right away that the leap was huge last gen (remember the first time you saw KZ Shadow Fall or Ryse?); it culminated in The Order: 1886 and UC4. Then 4K/HDR and the mid-gen refresh helped keep that magic alive with Horizon and GoW. However, as the gen went on, that feeling waned when seeing PCs blow the consoles away at 60fps vs 30fps.

Fast forward to this gen. The only thing I see that could be as transformative as PBR is Ray Tracing and well, these consoles can't really do the type of RT that is needed to really blow me away (sorry but reflections at low resolutions aren't it). Comparing Control on PC vs console makes that pretty apparent. Or Dying Light 2 with RT GI, shadows and reflections. They look almost a gen ahead to my eyes.

I wish we had more powerful systems though they are good value for now.
Yeah, that Killzone demo was such a pleasant sight at the time. We need that now. I remember when the 360 launched and I wasn't all that pumped until the next fall when Splinter Cell came out, and then Gears of War. Wow, what a leap in graphics and gameplay!

It's been a year now. We need something. Something huge.
 

bender

What time is it?
Until we have multiple generations in a row targeting the same resolutions and frame rates, consoles are always going to be well behind the curve. There is only so much you can do in a $400-$500 device that needs to work in a 200W power envelope. High-end graphics cards almost double that power consumption by themselves.
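Back-of-envelope on that power point; the wattages below are rough, assumed figures, not official specs:

```python
# A whole console (APU, RAM, SSD, cooling) vs. one high-end discrete GPU.
console_power_w = 200     # assumed total-system power envelope
high_end_gpu_w = 350      # assumed draw of a high-end graphics card alone
print(high_end_gpu_w / console_power_w)  # 1.75
```

So one enthusiast GPU alone can draw nearly twice what the entire box is budgeted for.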
 
Sony is probably going to release a PS5 Pro with full ray tracing abilities. It'll be the shape of the PS5 Digital with the disc slot in the middle of the console instead of the side, and it'll come in black. Don't worry.

PS5-R
 

VFXVeteran

Banned
All we've seen is last-gen games on last-gen engines so far. If we are still having this discussion in 2 years then we have a problem, but it will get better when last gen dies, and as far as I can tell at least Microsoft has moved on.
It will get better from here.
Last gen isn't the problem. In fact, this should be a tell-tale sign of weakness throughout the entire generation.

It's like trying to load up Quake 1 and struggling to render it at 60FPS with basic graphics features. How would you ever think something even more intensive would run at 60FPS?

These consoles *are* at their limits with regards to the GPU. Their power is quite known and the APIs for the graphics engines are already in place. There is no extra performance to be had with more complex stuff AND running 60FPS to boot. Something has to give.
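To put the "something has to give" point in numbers: every feature bills against a fixed per-frame millisecond budget. A toy illustration in Python, with invented pass costs:

```python
def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render one frame at a target fps."""
    return 1000.0 / fps

budget = frame_budget_ms(60)          # ~16.7 ms at 60fps
passes = {"geometry": 6.0, "lighting": 7.0, "post": 3.0}  # invented costs

print(sum(passes.values()) <= budget)   # True: 16 ms just fits
passes["rt_reflections"] = 8.0          # add a hypothetical RT pass
print(sum(passes.values()) <= budget)   # False: now something must give
```

Dropping to 30fps doubles the budget to ~33 ms, which is exactly why "quality" modes exist.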

I doubt a mid-gen refresh is happening. To solve this bandwidth problem, you'd need more than an overclock, while still maintaining backward compatibility with the initial hardware and no changes to the low-level API functionality.

New hardware is required therefore we'll have to wait for the next iteration OR buy a PC (which has the bandwidth at the moment or at least has a hardware upscaler solution in DLSS).
 

legacy24

Member
Say my name. Say it.
 

lh032

I cry about Xbox and hate PlayStation.
HFW, a cross-gen game, already looks like one of the best games across all console platforms and PC
 
Last gen isn't the problem.

These consoles *are* at their limits with regards to the GPU.
Huh?!!!!?

Devs having to work on 4 additional pieces of hardware (X1S, X1X, PS4, PS4 Pro) and limiting games so they can run on all 7 current console machines, 8 if there's a Switch version.

Oh yeah, that’s great for the optimization process. 😵‍💫 Clearly the consoles are tapped.

Yeah, they're not going to magically get games that are running at 1080p60 to 4K60 for the sequel, but they can eke out more geometry, use more advanced engines (Returnal is on old-ass UE4) that make better use of streaming, use better reconstruction methods/better AA, etc.

I'm not picking on Returnal, because it does look good, but I'd expect a big improvement for the sequel. Ditto any sequels for games that were cross-gen.

Like damn, don't tell me you didn't notice the graphics improvement from Gears 1 to 3 on Xbox 360, a console with a straightforward, easy-to-use GPU like current gen.
 

Sega Orphan

Banned
Diminishing returns. You had the gap between PS4 and Xbox One, and we noticed a decent difference. We have a gap between the XSX and PS5, but the difference is indistinguishable. Think about it: you have a whole PS4 extra in the XSX, and it ultimately means nothing. Kind of a good thing really, as buying a PS5 or XSX means you won't get gimped in comparison.
 

Hoddi

Member
It's the whole reason why I've kept my 1080p TV. The best looking 'next-gen' games won't be running at 4k60 but closer to 1080p30.
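For scale, "4k60 vs 1080p30" is a bigger gap than it sounds; in raw pixel throughput it works out to 8x:

```python
pixels_1080p = 1920 * 1080
pixels_4k = 3840 * 2160                     # exactly 4x the pixels
ratio = (pixels_4k * 60) / (pixels_1080p * 30)
print(ratio)  # 8.0, eight times the pixels shaded per second
```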
 
I am a huge fan of graphics technology in games, but it's undeniable to me how much more pleasurable it is to play games at 60fps instead of 30fps, especially games with a first-person view and games that require aiming.

Today I was playing the campaign of CrossfireX and it feels really terrible at 30fps, as if the game is almost in slow motion.
Even though I love the feel of 60fps, I had to play Ratchet & Clank, Miles Morales, Guardians of the Galaxy, and Forza Horizon 5 all in graphics mode, because honestly the 60fps versions always had big trade-offs in visuals.

Now, with even Digital Foundry thinking it's better to play Forbidden West in graphics mode because of the lacking image quality in the performance mode, I am really getting worried about the power of these new consoles.

Don't get me wrong, the games nowadays are beautiful to me, but honestly, apart from that Hellblade 2 demo (which clearly looks like a 30fps game to me), I am really not seeing some truly new rendering revolution, apart from the much-hyped ray tracing, that justifies so many restrictions on these games running at 60fps with the visuals of the graphics mode.

I am not a technical guy, so I honestly don't know what can really be achieved on these new consoles, but seeing amazingly talented developers like Guerrilla Games, Playground Games, Insomniac and some others having to make sacrifices to get their games running at 60fps is a little worrying in my opinion.

I think I would be totally OK if the games could have a graphics mode with ray tracing at 30fps and a performance mode at 60fps without ray tracing but with exactly the same other graphical features as the graphics mode, but that isn't happening.

I would love to read more from the more technical guys here on the forum about this.
Are these machines really already being pushed to their limits, such that 60fps is not achievable with the visuals of Horizon Forbidden West's or Forza Horizon 5's graphics modes, for example?
The Matrix Awakens…
 

Riky

Banned
Last gen isn't the problem. In fact, this should be a tell-tale sign of weakness throughout the entire generation.

It's like trying to load up Quake 1 and struggling to render it at 60FPS with basic graphics features. How would you ever think something even more intensive would run at 60FPS?

These consoles *are* at their limits with regards to the GPU. Their power is quite known and the APIs for the graphics engines are already in place. There is no extra performance to be had with more complex stuff AND running 60FPS to boot. Something has to give.

I doubt a mid-gen refresh is happening. To solve this bandwidth problem, you'd need more than an overclock, while still maintaining backward compatibility with the initial hardware and no changes to the low-level API functionality.

New hardware is required therefore we'll have to wait for the next iteration OR buy a PC (which has the bandwidth at the moment or at least has a hardware upscaler solution in DLSS).

Every console generation starts like this, even on Xbox One there is a huge difference between Forza Horizon 2 and 5 with the same hardware.

Yes, we haven't had the uptick in bandwidth or RAM this gen, but as Jason Ronald said, they see this generation as relying on efficiency. We've only had a handful of games with Tier 2 VRS for starters, and it performs very well in Gears 5 and Doom Eternal; we even have 120fps in Halo Infinite. So when engines incorporate SFS and Mesh Shaders etc., that will help to offset the bandwidth limitations.

Time will tell but every console generation has shown dramatic improvements over time and I don't see why this gen will be any different.
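A rough sketch of why Tier 2 VRS helps, for anyone curious: flat or low-contrast screen tiles get shaded once per 2x2 block instead of once per pixel. The 40% figure below is just an assumed fraction of the screen, not measured data:

```python
def vrs_shaded(total_pixels: int, coarse_fraction: float) -> float:
    """Shader invocations when `coarse_fraction` of the screen is
    shaded at a 2x2 rate (one invocation per 4 pixels)."""
    fine = total_pixels * (1.0 - coarse_fraction)
    coarse = total_pixels * coarse_fraction / 4.0
    return fine + coarse

full = 3840 * 2160
saving = 1.0 - vrs_shaded(full, 0.4) / full
print(round(saving, 2))  # 0.3, i.e. ~30% fewer pixel-shader invocations
```

That saved shading time is what gets reinvested in resolution or frame rate.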
 
Checkerboarding, FSR, TSR, or any new reconstruction technique should offset the GPU bottleneck. As for the CPU, both consoles are more than capable through the end of the cycle. There is no need to worry. Especially compared to last gen, this gen had much better specs to begin with.

What's really concerning is the Series S; 1440p is too much for the device, and we can expect 1080p to be the norm for it.
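For the curious, the core trick behind checkerboard rendering in toy Python form: shade half the pixels each frame in an alternating checker pattern and fill the holes from the previous frame. Real implementations reproject with motion vectors; this sketch just merges two static half-frames:

```python
def shade_half(scene, w, h, phase):
    """Shade only the pixels where (x + y) % 2 == phase."""
    return {(x, y): scene[(x, y)]
            for x in range(w) for y in range(h) if (x + y) % 2 == phase}

def reconstruct(current_half, previous_half):
    """Fill this frame's unshaded pixels from the previous frame."""
    merged = dict(previous_half)
    merged.update(current_half)
    return merged

w, h = 4, 4
scene = {(x, y): x * 10 + y for x in range(w) for y in range(h)}
even = shade_half(scene, w, h, 0)   # frame N shades 8 of 16 pixels
odd = shade_half(scene, w, h, 1)    # frame N+1 shades the other 8
print(len(even), len(reconstruct(odd, even)))  # 8 16
```

Half the shading work per frame, a full-resolution image out; the artifacts people notice come from the reprojection step this toy omits.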
 

VFXVeteran

Banned
Huh?!!!!?

Devs having to work on 4 additional pieces of hardware (X1S, X1X, PS4, PS4 Pro) and limiting games so they can run on all 7 current console machines, 8 if there's a Switch version.

Oh yeah, that’s great for the optimization process. 😵‍💫 Clearly the consoles are tapped.

Yeah, they're not going to magically get games that are running at 1080p60 to 4K60 for the sequel, but they can eke out more geometry, use more advanced engines (Returnal is on old-ass UE4) that make better use of streaming, use better reconstruction methods/better AA, etc.

I'm not picking on Returnal, because it does look good, but I'd expect a big improvement for the sequel. Ditto any sequels for games that were cross-gen.

Like damn, don't tell me you didn't notice the graphics improvement from Gears 1 to 3 on Xbox 360, a console with a straightforward, easy-to-use GPU like current gen.
The issue with these threads is that people think all the graphics engines have old quirky code that MUST be migrated to the new generation, thereby leaving them unoptimized with potential gains of 40% or more. We completely overlook that these GPUs are not much different in architecture from last gen's GPUs. We can all clearly see what these GPUs can do with games like DS and R&C. You simply aren't going to get much better than that. All the 3rd-party AAA games have graphics engines that target a variety of hardware. They are not limited to low-end hardware routines. We've seen this with the PC for years and years.

You will continue to see lowered rendering resolution, limited RT, and low samples for things like shadow maps, light loops, and pre-baked GI/AO. Without any of those significantly improving, we just aren't going to see "generational leaps" IMO.
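To make the "low samples" point concrete: shadow softness comes from averaging several shadow-map depth tests (percentage-closer filtering), and cost scales with the tap count, which is exactly where consoles economise. A toy sketch with an invented 4x4 shadow map:

```python
def pcf_visibility(shadow_map, u, v, depth, kernel=1):
    """Fraction of nearby shadow-map texels the point passes the
    depth test against; more taps = softer, more stable penumbras."""
    size = len(shadow_map)
    lit, taps = 0, 0
    for du in range(-kernel, kernel + 1):
        for dv in range(-kernel, kernel + 1):
            taps += 1
            if depth <= shadow_map[(u + du) % size][(v + dv) % size]:
                lit += 1
    return lit / taps

sm = [[0.5] * 4 for _ in range(4)]      # invented stored depths
print(pcf_visibility(sm, 1, 1, 0.4))    # 1.0: fully lit, at 9 taps' cost
```

Bumping `kernel` from 1 to 2 goes from 9 taps to 25 per pixel, which is the kind of cost console settings quietly avoid.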
 

Haggard

Member
You will continue to see lowered rendering resolution, limited RT, and low samples for things like shadow maps, light loops, and pre-baked GI/AO. Without any of those significantly improving, we just aren't going to see "generational leaps" IMO.
I think you are dismissing the possible impact of tech like Nanite and Lumen too easily. Lumen, as low-sampled and faked as it is, is still leagues better than the usual prebaked and light-probed lighting, and being able to pack an "infinite" amount of geometry in a scene with Nanite at a near-fixed render budget is nothing short of revolutionary, even though it's currently still very limited in the mesh types it supports.
I'd wait with the doom and gloom until we've seen how that tech actually performs in real games and whether other developers might even be able to match/top that tech.
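The "near fixed render budget" idea can be sketched in a few lines. This is purely illustrative and not Epic's actual cluster algorithm; the point is that triangle count tracks screen coverage, not source-asset density:

```python
def pick_lod_triangles(source_tris: int, pixels_covered: int) -> int:
    """Halve triangle count per LOD step until triangles are roughly
    pixel-sized, so denser source assets don't cost more to draw."""
    tris = source_tris
    while tris > pixels_covered and tris > 1:
        tris //= 2
    return tris

# A billion-triangle film asset covering 10,000 pixels on screen still
# draws only ~10,000 triangles.
print(pick_lod_triangles(1_000_000_000, 10_000) <= 10_000)  # True
```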
 

acm2000

Member
The Series and PS5 both have more than enough power when not abusing ray tracing, and this is what developers should focus on utilising; the hardware isn't powerful enough for decent ray tracing performance outside of the most basic stuff like RT shadows or reflections.
 

TonyK

Member
I'm with you. In fact, if high-end GPUs were affordable I would return to PC. I started the PS4 gen with high hopes, but after a couple of years of blurry 30fps games I bought a PC and only played Sony exclusives on console.

As I have a 1080 Ti, not capable of RT, I returned fully to console with the PS5. But now, after seeing the power of the new consoles, I'm starting to think about returning to PC. But there are no 3080s at a normal price, so... I'm condemned to play at 30fps to see something not achievable on my PC.
 

RJMacready73

Gold Member
I'll always err on the side of graphics over fps. When I tried playing these new releases in performance mode I always had this cloud of FOMO hanging over me as I played, i.e. how much better would this scene look in the "other mode", whereas when playing in resolution mode at a stable 30fps I never felt that I was missing that additional smoother response or extra frames, as the game was still very much playable. Now if you're visually used to playing at 60fps, yes, it's jarring as hell, but if you're used to 30fps it really is no big deal.

I mean, Ratchet at 40fps felt no real different to me than 60fps, but visually I could tell the difference. Honestly, I think offering a 40fps mode for those of us with 120Hz TVs is definitely the best middle ground, and besides, devs will always go hard on visuals first and foremost on the consoles.
 
It's fine. The problem stems from developers choosing to make the 30fps version the default, thereby making it the version that gets more effort put into it, while the 60fps version is an afterthought.
 

VFXVeteran

Banned
I think you are dismissing the possible impact of tech like Nanite and Lumen too easily. Lumen, as low-sampled and faked as it is, is still leagues better than the usual prebaked and light-probed lighting,
Yes, that is true, but that doesn't mean development stops there. Just because it's better doesn't make it ideal as a permanent solution for lighting.

and being able to pack an "infinite" amount of geometry in a scene with Nanite at a near-fixed render budget is nothing short of revolutionary, even though it's currently still very limited in the mesh types it supports.
Again, I don't think this is something that's good enough for the next decade. The number and quality of pixels will always be the limiting factor here.

I'd wait with the doom and gloom until we've seen how that tech actually performs in real games and whether other developers might even be able to match/top that tech.
If we go by last gen's UE4 demos, then we all know that several of those demos were never matched (Infiltrator comes to mind). I don't think a developer will match UE5 with their own custom engine this generation. But that's my opinion. Given the enormous time-sink required to revamp a graphics engine with new tech, and an average of 1-2 new IP releases, most studios only get 2 chances within the entire generation before the next one comes up.
 

Bitmap Frogs

Mr. Community
A lot of it is being squandered rendering everything at 4K... I'd be more than happy with 1440p or 1800p upscaled... and believe me, we'll get there at some point when companies wanna start pushing the graphics envelope.
 

Haggard

Member
Yes, that is true but that doesn't make developing stop there. Even though it's better doesn't make it ideal for a permanent solution for lighting.
Well, no one said anything about "stopping" there. This is just something that would represent an actual generational and clearly visible jump over the last console gen, where we've had 99% static systems.
Again, I don't think this is something that's good enough for the next decade. The number and quality of pixels will always be the limiting factor here.
I kind of don't understand the next-decade thing when we're just talking about generational jumps over last decade's console hardware :)
Yes, GPU throughput will always be the limiting factor, but the question is whether a system like Nanite can keep the requirements in check while providing a true jump in visually perceivable quality over last gen. And that looks promising so far.
If we go by last gen's UE4 demos, then we all know that several of those demos were never matched (Infiltrator comes to mind). I don't think a developer will match UE5 with their own custom engine this generation. But that's my opinion. Given the enormous time-sink required to revamp a graphics engine with new tech, and an average of 1-2 new IP releases, most studios only get 2 chances within the entire generation before the next one comes up.
This gen we've had 2 of the "jaw-dropping" demos actually playable on our own machines, though.
That may bode well for real-game implementations... at least for games using this specific engine. It's also too early to dismiss the sheer talent of a lot of other developers like Naughty Dog or Rockstar.
 

Hunnybun

Member
I think the consoles are powerful enough.

Games like Ratchet, Horizon FW, Plague Tale Requiem all look fantastic. Others like Hellblade 2 and Starfield show terrific promise for the future.

My one real doubt is the RT hardware. It seems pretty weak and generally an unacceptable demand on performance. The reflections in R&C, for example, provide about as much benefit as tweaking a couple of settings on a PC game: it's really minor, incidental stuff. And yet it reduces resolution by like 40%! It's clearly not worth it in those cases.
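Worth spelling out what "reduces resolution by like 40%" means in pixels, assuming a drop from 2160p to roughly 1620p (an illustrative pair of figures, not measured from the game):

```python
def pixel_count(height: int) -> int:
    """Pixels in a 16:9 frame of the given height, in integer math."""
    width = height * 16 // 9
    return width * height

fraction_lost = 1 - pixel_count(1620) / pixel_count(2160)
print(round(fraction_lost, 2))  # 0.44: nearly half the pixels spent on RT
```

Resolution scales with the square of the axis drop, which is why a modest-sounding height cut costs so many pixels.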

Otoh, as a 60fps devotee, I quite like there being some feature I can usually just turn straight off to double the frame rate.

Maybe the RTGI we saw in Metro Exodus will become more common as the generation goes on. That looks fantastic and IMO is worth the hit to clarity. But it's still so early in terms of actually seeing games designed for the consoles. It's going to be next year before we really get an idea of how this generation will end up looking.
 

Hunnybun

Member
RT was great on Ratchet and Spiderman though.

Does it?

I find it almost a total waste of time in Ratchet.

Spiderman it's a bit more important because you've got these huge flat glass walls everywhere, so having decent reflections does help tie the image together a bit. As long as you don't stop to actually look closely at them of course. Cos then it's clear how severe the compromises really are.
 

Pagusas

Elden Member
I fully expect the mid-gen updates to have a hardware DLSS equivalent + extra RT ability + faster clock rates. I don't think we'll see major changes to the actual memory architecture or CPUs though. I 100% feel it'll be all about shoehorning in ML upscaling to give 4K/60fps to current-gen titles (and we'll see 60fps performance modes disappear on the standard PS5/Xbox Series X hardware with mid/late-gen games).
 

Roufianos

Member
Are you talking about Spiderman's 60 fps RT mode? Because it looks so barebones compared to the non-rt 60 mode and even worse next to the 4k/30 fidelity mode.

Seriously, it's nice and clean from an image quality pov but the settings are drastically reduced.

I agree with you about Demon's Souls though. That is one of the best looking games on any platform at 60fps. Why? Because they didn't have to turn down graphics settings to get it running at 1440p, and it has great image quality thanks to smart dev choices in how they went about upscaling the resolution.
Oh I agree, Spider-Man on PS5 ain't much of a looker.
 

VFXVeteran

Banned
I kind of don't understand the next-decade thing when we're just talking about generational jumps over last decade's console hardware :)
Yes, GPU throughput will always be the limiting factor, but the question is whether a system like Nanite can keep the requirements in check while providing a true jump in visually perceivable quality over last gen. And that looks promising so far.
We can't just rely on enhanced geometry detail to be the sole feature for a generational jump.

This gen we`ve had 2 of the "jaw dropping" demos actually playable on our own machines though.
Aside from the geometry, there wasn't really anything else innovative in the way of shading and lighting, though. And the resolution is absolutely a concern from that Matrix demo. These demos weren't even games. Imagine putting AI code in the mix, where CPU demands would be higher.

That may bode well for real life implementations...at least for games using this specific engine. It`s also too early to dismiss the sheer talent of a lot of other developers like Naughty Dog or Rockstar.
I'm not dismissing them though. I'm being a realist.
 

VFXVeteran

Banned
Time will tell but every console generation has shown dramatic improvements over time and I don't see why this gen will be any different.
I disagree here. To me, there isn't much improvement from early gen to late gen. KZ: Shadow Fall, The Order: 1886, Infamous: Second Son, etc. all looked pretty comparable to TLOU2, UC4, HZD, GOW, etc., and certainly rendering parameters stayed the same, proving a limit was reached very early in the generation.
 

MikeM

Member
Last gen isn't the problem. In fact, this should be a tell-tale sign of weakness throughout the entire generation.

It's like trying to load up Quake 1 and struggling to render it at 60FPS with basic graphics features. How would you ever think something even more intensive would run at 60FPS?

These consoles *are* at their limits with regards to the GPU. Their power is quite known and the APIs for the graphics engines are already in place. There is no extra performance to be had with more complex stuff AND running 60FPS to boot. Something has to give.

I doubt a mid-gen refresh is happening. To solve this bandwidth problem, you'd need more than an overclock, while still maintaining backward compatibility with the initial hardware and no changes to the low-level API functionality.

New hardware is required therefore we'll have to wait for the next iteration OR buy a PC (which has the bandwidth at the moment or at least has a hardware upscaler solution in DLSS).
Until engines are built to leverage primitive/mesh shaders, I'd argue there is still lots of performance left on the table.
 

Haggard

Member
We can't just rely on enhanced geometry detail to be the sole feature for a generational jump.
That is subjective. The fact is that the geometric detail and picture stability the UE5 demos showed are without equal in games and absolutely impossible on the previous console gen at any meaningful scope.
Aside from the geometry, there wasn't really anything else innovative in the way of shading and lighting though.
This isn't offline CGI; this is about real-time rendering. We're playing catch-up, and UE5 is a big leap forward in terms of making features accessible that were simply too expensive or too time-consuming to fine-tune so far. Lumen combined with Nanite meshes is by FAR the cheapest possibility for an impactful RT lighting implementation we've had so far. That is a great accomplishment on an engineering level, and the package is probably the most innovative thing we've seen an engine do in at least a decade.
 

Hunnybun

Member
Can anybody explain to me how more polygons are handled by engines? I'd always assumed it was just a computational thing, but then people started talking about how the SSDs will allow much more detailed geometry in the assets, as if it's more of a memory issue.

How does it work? Is it one or the other or a bit of both?

Because so far that does seem to be a real hallmark of next-gen games: pretty much all the ones I've played have way more complex geometry, to the extent that I now really notice the difference if I go back to PS4 games. HZD, say, looks really blocky in its scenery, which is something I just never noticed at the time.

Long story short, this seems like something the new consoles really excel in.
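A rough answer to the question above: it's a bit of both. The GPU still has to draw whatever is on screen each frame (a compute limit), but the SSD determines how much unique, high-detail geometry can stay resident near the player without a huge RAM pool (a memory limit). A back-of-envelope sketch with assumed numbers, not official console specs:

```python
ssd_bytes_per_s = 5.0e9      # assumed sustained read after decompression
frame_time_s = 1.0 / 30      # streaming window per frame at 30fps
bytes_per_vertex = 32        # rough: position + normal + UVs

verts_per_frame = ssd_bytes_per_s * frame_time_s / bytes_per_vertex
print(verts_per_frame > 1_000_000)  # True: millions of fresh vertices per frame
```

So assets can carry far more unique detail than fits in RAM at once, while the polygon count actually drawn per frame is still bounded by GPU throughput.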
 