> Yeah, I know it's always what many want at the start of each gen before reality comes crashing in, but so far the 60fps dream might be reality this time.

It might just be. This time the CPUs are great, with power to spare; all developers really need to do is adjust the resolution target and give players a choice. I can't think of any reason not to, and 99% of the games released so far have proven it. Even WD Legion is adding a 60fps mode.
Every game I've played so far on PS5 runs at 60fps by default, except Miles.
There is, but I didn't use it.

> Do the system-level defaults influence that?

Sure, there is an OS-level default for performance vs. resolution.
This is great. But what is not so great, to me, are the sacrifices made to get there. Dynamic resolution scaling (DRS) is going to be standard this entire gen, and I hate that reality.
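For what it's worth, the core of DRS is just a feedback loop on GPU frame time; here is a minimal sketch of the idea, with all names and thresholds hypothetical rather than any engine's actual API:

```python
# Minimal dynamic-resolution-scaling loop (illustrative sketch, not real engine code).
# Assumes the engine reports the previous frame's GPU time in milliseconds.

TARGET_MS = 16.6                 # frame budget for 60fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # e.g. ~1296p to 2160p on a 4K output target

def update_render_scale(scale: float, gpu_ms: float) -> float:
    """Nudge the render scale down when near/over budget, up when there's headroom."""
    if gpu_ms > TARGET_MS * 0.95:    # about to miss vsync: back off quickly
        scale -= 0.05
    elif gpu_ms < TARGET_MS * 0.80:  # comfortable headroom: creep back up slowly
        scale += 0.01
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A 20 ms frame drops the scale from 1.00 to 0.95 for the next frame.
print(update_render_scale(1.0, 20.0))  # 0.95
```

The asymmetric step sizes are the usual trick: drop fast to save the frame, recover slowly to avoid oscillation.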
> DRS >>>> frame drops

Well, that's subjective. I want a clean render. I don't care if the game runs at 60FPS; I'd rather have the quality render @ 30FPS. The best of both worlds would simply be hardware powerful enough to do 4K @ 60FPS.
Agreed, unless there is a really good, non-graphical reason why it needs to be 30fps. And it had better have damn good motion blur then, lol. I want 60fps to be at least an option for players in every single game for the rest of this gen.
> I'm increasingly confident that the gleeful, doom-mongering "30fps is here FOREVER" crowd will be proven wrong this generation.

We will see how this gen goes. It looks promising, but for console players it's always been a carrot-on-a-stick situation when it comes to framerate.
The logic was never very sound anyway, given the improvement in frame rates: from around 20fps in the mid-90s, to dodgy 30fps on the PS3/360, to solid 30fps last gen.
> If we shoot for 60FPS with these GPUs, then we will sacrifice a lot of computation for the more complex rendering features. You won't get the SSD->VRAM pipeline like in UE5 at 60FPS with 8K texture maps. You'll forego RT completely. You'll probably have to stick with texture filtering lower than 16x, and you'll be stuck with even lower resolutions during DRS (depending on complexity, we're looking at 1080p levels). There would literally be no progress on the graphics front. We'd be stuck with what we have now for the entire generation.

VFXVeteran, are you laughing because you care a lot about resolution, or do you honestly think 30fps can still add more to the presentation than 60fps? Like, holy crap at all the graphical giants the 6th gen had at 60fps. And here we are three generations later!
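For scale, the tradeoff in the quoted post is just per-frame arithmetic; doubling the framerate halves the GPU time available for everything:

\[
t_{\text{frame}} = \frac{1000\,\text{ms}}{\text{fps}} \quad\Rightarrow\quad t_{30} \approx 33.3\,\text{ms}, \qquad t_{60} \approx 16.7\,\text{ms}
\]

So a hypothetical 10 ms RT lighting pass eats about 30% of a 30fps budget but about 60% of a 60fps one, which is the whole disagreement in one number.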
The loss of motion resolution alone makes 60fps worth shooting for; 60fps at a lower resolution is actually clearer in motion.
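To put a rough number on "motion resolution": on a sample-and-hold display each frame persists for the full frame time, so an object tracked by the eye at \(v\) pixels per second smears across approximately

\[
b = v \cdot t_{\text{hold}}, \qquad b_{30} \approx \frac{1000}{30} \approx 33\,\text{px}, \qquad b_{60} \approx \frac{1000}{60} \approx 17\,\text{px} \quad (v = 1000\,\text{px/s})
\]

So, all else equal, halving the hold time roughly doubles perceived sharpness in motion, which is why a lower-resolution 60fps image can read as cleaner than a higher-resolution 30fps one.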
I like native 4K, but only if it's not sacrificing anything else to get there.
> If we shoot for 60FPS with these GPUs, then we will sacrifice a lot of computation for the more complex rendering features. You won't get the SSD->VRAM pipeline like in UE5 at 60FPS with 8K texture maps. You'll forego RT completely. You'll probably have to stick with texture filtering lower than 16x, and you'll be stuck with even lower resolutions during DRS (depending on complexity, we're looking at 1080p levels). There would literally be no progress on the graphics front. We'd be stuck with what we have now for the entire generation.

We already have RT at 60fps in the early days. But honestly, not every game even needs RT (except for reflections, if that's the cheapest way to kill SSR usage, because SSR is an eyesore). The SSD is no less important at 60fps, unless I'm missing something (please tell me if I am); Ratchet and Clank at 60fps with RT exists and allows for that instant warping. I mean, yes, you can do more of everything at 30fps; that goes without saying. Not sure why we need 8K textures at 4K or lower - some of that detail would be hidden by the resolution, right? Texture filtering, I'll agree there: I'd like to see 16x AF standard, and 60fps will probably get in the way sometimes. But if developers wanted it, they could factor that in.
I'd much rather have a better approximation of the rendering equation @ 30FPS. That buys more pixels inside the triangles and all the other benefits of rendering the framebuffers at a higher resolution for all the screen-space FX.
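For reference, the rendering equation being approximated here, in its standard Kajiya form:

\[
L_o(\mathbf{x}, \omega_o) = L_e(\mathbf{x}, \omega_o) + \int_{\Omega} f_r(\mathbf{x}, \omega_i, \omega_o)\, L_i(\mathbf{x}, \omega_i)\, (\omega_i \cdot \mathbf{n})\, \mathrm{d}\omega_i
\]

Every real-time technique in this argument - SSR, RT reflections, probe GI - is a partial approximation of that integral, and each added term costs frame time.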
> We already have RT at 60fps in the early days.

Simple RT reflections with extremely low-resolution scene objects isn't what I call doing RT right; it looks worse than just having sharper SSR. Also, RT goes a lot deeper than what you guys are describing. Cyberpunk and Metro have the right idea, using RT in their lighting engines. That takes up enormous GPU bandwidth.
> The SSD is no less important at 60fps, unless I'm missing something (please tell me if I am).

The reason why you would need the SSD in the UE5 demo is that you are trying to store an enormous number of triangles and/or really high-resolution textures. You kill your rendering budget by aiming for 60FPS while trying to render large texture sizes. Why store 8K texture maps and render the final framebuffer at 1080p? It defeats the purpose of the SSD->VRAM pipeline if you don't have enough resolution to see any kind of detail. It's like loading 8K textures and then using a filter kernel so wide that you average many texels into one large screen pixel. It'll look like a blurry mess.
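A back-of-the-envelope illustration of the wide-filter-kernel point: if an 8K texture ends up spanning only half the width of a 1080p frame, the sampler effectively falls back to a much smaller mip anyway (numbers purely illustrative):

```python
import math

tex_width = 8192            # 8K source texture
screen_coverage = 1920 / 2  # texture spans half of a 1080p frame's width

texels_per_pixel = tex_width / screen_coverage     # ~8.5 texels per screen pixel
mip_level = math.log2(texels_per_pixel)            # ~3.1
effective_res = tex_width / 2 ** round(mip_level)  # 1024: effectively a 1K mip

print(f"{texels_per_pixel:.1f} texels/px -> mip {mip_level:.1f} -> {effective_res:.0f}px mip")
```

On this arithmetic, most of the 8K detail is filtered away at 1080p unless the camera gets very close, which is the blurry-mess point above.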
> Ratchet and Clank at 60fps with RT exists and allows for that instant warping.

That's just fast loading, and the RT is minimal (low-grade reflections). It offers nothing big in the way of enhanced graphics features (unlike last gen's PBR shaders).
> I mean, yes, you can do more of everything at 30fps; that goes without saying. Not sure why we need 8K textures at 4K or lower - some of that detail would be hidden by the resolution, right?

Yes, but at native 4K the loss isn't as dramatic as with a 1440p/1080p framebuffer - up close you'd still see really nice detail. That's the reason for using high-res textures.
> Texture filtering, I'll agree there: I'd like to see 16x AF standard, and 60fps will probably get in the way sometimes. But if developers wanted it, they could factor that in.

Evidently, from what we are seeing, it's too expensive to run.
> We can get shinier graphics than last gen while remaining at 60fps.

But you can't describe what that means. Anything you file under "shinier" is going to cost milliseconds of GPU time - which drops frames.
> Although a small leap, Demon's Souls is already a bit further along than last-gen stuff, at least overall, technically. It's got more polygons, and it…

Demon's Souls has complex texture layering with high-resolution normal/relief maps. It can't render those high-res textures at 60FPS, which is why they are significantly reduced in "Performance" mode. That is a clear indication that trying to stream 8K textures from the SSD at 60fps will be near impossible - especially if the textures are unique.
> Saying we'll have no improvement is head-scratching; games are barely using the SSDs yet in this cross-gen period. Reconstruction methods will get better and better, and developers will come to grips with the new features more and more. Games are still being made with the 2012 GCN architecture in mind.

Game developers already know what they are doing. They have plenty of papers to read and example code to implement. There is no magic in these consoles as far as development goes; they just have to make the content. Most games will be 3rd party, which means their engines have to be portable across multiple platforms. That leaves no room for custom optimizations for each platform. Besides, a lot of these multiplatform engines (i.e. UE, Frostbite, RE Engine, etc.) are already very well optimized for the graphics features we've seen so far.
> But you can't describe what that means. Anything you file under "shinier" is going to cost milliseconds of GPU time - which drops frames.

I mean anything that adds to the presentation: extra geometry, better AA methods, higher-resolution shadows, more particles (Ratchet on PS5), 4K textures where a previous game didn't have them, and so on. Compare Ratchet on PS5 to the PS4 game when it released. Bells and whistles don't have to mean RT.
> Game developers already know what they are doing. They have plenty of papers to read and example code to implement. There is no magic in these consoles as far as development goes; they just have to make the content. Most games will be 3rd party, which means their engines have to be portable across multiple platforms. That leaves no room for custom optimizations for each platform. Besides, a lot of these multiplatform engines (i.e. UE, Frostbite, RE Engine, etc.) are already very well optimized for the graphics features we've seen so far.

Oh come on, this is just you with blinders on. Rockstar "knew what they were doing" with GTA 3; that doesn't mean San Andreas wasn't leaps and bounds ahead. Epic knew what they were doing with Gears 1, but look at Gears 3. Exclusives still exist. Plus, I'm sure you know how hard the cross-gen period is on developers, and how much more optimization they'll manage once they're no longer shackled to the Xbox One, the PS4, and their Pro models. Cross-gen periods are brutal for developers. The lowest common denominator is still the base Xbox One!
The textures in Demon's Souls are the same resolution in performance mode.

> The reason why you would need the SSD in the UE5 demo is that you are trying to store an enormous number of triangles and/or really high-resolution textures.

No, the reason is to stream only the parts of the stored models' triangles, and the parts of the many high-resolution textures, that you actually need (what texture resolution you need depends on the asset). In the UE5 demo they stream what they need: as Epic mentioned, the source models have billions of triangles, but the engine crunches that down to around 20 million. The same goes for textures: they may be 8K, but they are virtualized, so the engine takes only what it needs based on distance from the camera and on screen resolution. The idea is that you don't have to author LODs, mip maps, etc.

> You kill your rendering budget by aiming for 60FPS while trying to render large texture sizes. Why store 8K texture maps and render the final framebuffer at 1080p?

Why would you need to render 8K textures at 1080p to begin with? You can store them at 8K, but you will be virtualizing them, ending up with the best texture resolution the object actually requires, without having to put an 8K texture on the triangles of a small rock. It's like a mip map, but automatic.
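A sketch of that "automatic mip map" idea, i.e. virtual texturing: compute the mip a surface needs from its screen coverage and request only those pages from storage. All structures here are hypothetical, not Epic's actual implementation:

```python
import math

PAGE = 128  # virtual-texture page size in texels (a common choice)

def required_mip(tex_size: int, screen_px: float) -> int:
    """Mip level at which one texel maps to roughly one screen pixel."""
    return max(0, round(math.log2(tex_size / max(screen_px, 1.0))))

def pages_needed(tex_size: int, screen_px: float) -> int:
    """Pages to stream for that mip if the whole surface is visible."""
    mip_size = tex_size >> required_mip(tex_size, screen_px)
    tiles = math.ceil(mip_size / PAGE)
    return tiles * tiles

# An 8K texture on an object covering ~500 px needs only the ~512-texel mip:
# 16 pages streamed, not the full 8K chain.
print(required_mip(8192, 500), pages_needed(8192, 500))  # 4 16
```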
> Good convo. Appreciate the discussion.

Appreciate your input.
> The textures in Demon's Souls are the same resolution in performance mode.

The texture size in Demon's Souls is not the same from the 30FPS mode to the 60FPS mode. I can prove it.
You are saying there will be 8K/4K/2K, etc., and you'll get automatic mip-mapping. The mip-mapping algorithm requires speeds much, much faster than SSD->VRAM, so I highly doubt that's how they are using it. If your statement were true, that simple demo should have been running at 4K@60FPS, with the proper LOD being loaded to sustain that kind of bandwidth. Instead we see a limit of 1440p@30FPS. We need to define why there is a limit when it's streaming, and what causes that limit.
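One way to sanity-check the speed claim: with virtualized textures, the texels needed per frame are bounded by the screen, not by the source texture sizes. A rough upper bound, assuming 4 bytes per texel and a generous fudge factor for borders and overdraw:

```python
# Back-of-envelope bound on the visible texture working set for one 4K frame.
pixels_4k = 3840 * 2160  # ~8.3M screen pixels
bytes_per_texel = 4      # uncompressed RGBA8; block compression is 4-8x smaller
slack = 4                # assumed factor for page borders, overdraw, mispredictions

working_set_mib = pixels_4k * bytes_per_texel * slack / 2**20
print(round(working_set_mib))  # ~127 MiB resident; only the per-frame *delta*
                               # has to cross the SSD->VRAM link
```

Whether the per-frame delta fits the link during fast camera cuts is the real open question, and plausibly part of why the demo was capped.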
Most games today render with 2-4K texture maps. The magic of the UE5 demo is indeed the high-res textures (8K) and the high-res geometry that completely replaces any sort of baked normal maps, etc. If we stick to 2-4K textures, then most games won't need SSD->VRAM capabilities in order to look good - especially on PC, which can just load them all in at the beginning of a level.
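The storage gap behind this point is easy to quantify. Sizes for a single square texture at 4 bytes per texel, with roughly a third extra for the mip chain (block compression divides all of these by 4-8x):

```python
def tex_mib(size: int, bytes_per_texel: float = 4.0) -> float:
    """Approximate size of one square texture in MiB, mip chain adding ~1/3."""
    return size * size * bytes_per_texel * (4 / 3) / 2**20

for s in (2048, 4096, 8192):
    print(f"{s}: ~{tex_mib(s):.0f} MiB")  # 2048: ~21, 4096: ~85, 8192: ~341
```

So a scene full of unique 8K maps blows past a 16 GB shared-memory budget almost immediately, which is exactly where either streaming or sticking to 2-4K comes in.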
In the quote you posted, I mentioned they were virtualized. Also, the demo includes many assets, and we don't know how many different textures are loaded (only that there are a lot and that they are cinematic-grade). I don't see how that translates to "the demo should have been running at 4K@60FPS", as if everything else were free just because you can fetch the texture and geometry you need, at the correct distance, very fast. They may improve resolution and framerate with some optimization; I get the impression the demo was more of a stress test. Devs will get more out of it by playing to its strengths, and I think more optimizations were made to the engine after the demo, with improvements continuing through the generation. There may also be other engines that exploit the system differently, with different results.
Current games already look very good. I think DF mentioned that Spider-Man on the regular PS4 looked fantastic even on a 4K screen, or something like that. They also mentioned the UE5 demo looked so good they had trouble establishing its resolution, which probably had to do with triangles approaching pixel size.
Loading very fast from the SSD doesn't just mean textures get better; there are a lot of advantages, and devs will find more. I get the impression (maybe I am wrong) that you are talking as if every game had the same requirements, which is not true. You can make games with 8K textures; or you may prefer a larger number of 2-4K textures; or you may settle for an insane number of small-resolution textures blended together to add detail; or you may prefer more geometry, or instant loading of assets and scenes so everything feels like one big world. You can balance all of that, or go wild with lots of high-resolution assets and 3D models in big scenes. The idea is freedom in your development.
The point of the argument is how limited the consoles are relative to what game companies want to achieve. People would rather have 60FPS games across the board, but that comes with a cost. I'm just pointing out the sacrifices that would need to be made to achieve 60FPS. A lot of those sacrifices will undercut any "superior graphical feature", like RT lighting for example. Bottom line: gamers can't expect everything, because the hardware isn't powerful enough.
> I get that people like 60FPS because of how smooth it is, but why is it that anytime I hear people talk about it, they refer to 30FPS as if it's terribly detrimental to gameplay? I've been playing games for years, and while smooth is nice, 30FPS has never hurt my ability to respond to what's happening.

I'd like to see you try to play F-Zero GX at 30fps. If that were possible.
Wouldn't it be far better if devs weren't focusing so hard on resolution and FPS and were instead allowed to target 1080p/30FPS? Think of how many system resources that would free up for the gameplay itself.
> I'd like to see you try to play F-Zero GX at 30fps. If that were possible.

Some genres, such as fighting or racing games, absolutely warrant high FPS. I'm talking about the industry as a whole, though. Most games don't demand such a high bar, so why is it being set there?
1080p/30fps on PS5 would be counterproductive; there were PS4 games whose detail was hidden by the 1080p resolution and only became apparent with Pro patches. Pushing that detail much further while staying at 1080p would mean rendering things that aren't even fully appreciable.
1440p or higher at 30fps is where the more-detail approach would start to make sense, though I'm not an advocate of it.
> Well, that's subjective. I want a clean render. I don't care if the game runs at 60FPS; I'd rather have the quality render @ 30FPS. The best of both worlds would simply be hardware powerful enough to do 4K @ 60FPS.
We've had incredible games that defined genres since long before 60 FPS or 1440p were a thing. I don't see why those should be a priority now (or ever, really).
> That's why your name is VFX veteran and not gaming veteran, I guess.

I would push this even if I was working on a game. That's just a personal preference.
> Fuck 30fps - hopefully this gen is the very last time we see it, and hopefully we won't see it that often either! An interactive medium needs to push the part that benefits its interactive nature, not the window dressing around it. Performance should always come first.

Game developers didn't need a PS5/XSX in order to hit 60FPS. They could have done that last gen too, at 720p/1080p.
> That's why your name is VFX veteran and not gaming veteran, I guess.

So for a brief moment in 3D console gaming, 60FPS became standard, only to drop again. My point is that 60FPS isn't necessary as a bar for games to meet, and bringing up 2D systems, or the fact that such games existed, doesn't change that. Yes, I played games on 2D systems. My enjoyment of them wasn't dependent on the frame rate hitting 60; it came from great gameplay with stable performance.
I can't even begin to describe how much this sentence made me cringe... have you played a single videogame on any 2D system, or during the 6th generation? wtf?
60fps WAS the standard. We abandoned that standard for a brief moment during the PS1 and N64 era, when 3D was a new thing, but then quickly made it the standard again during the PS2 era. It was only after that that developers primarily focused on graphics instead of performance. And on Xbox One/PS4 it was also the hardware's fault, tbh: with their dogshit CPUs and outdated GPUs, they were immediately prone to performance issues with any game that pushed beyond what launch titles looked like.