
Gollum publishing director says 60fps is a priority for next-gen - also expands on PS5 capabilities

cormack12

Gold Member
Source: https://respawnfirst.com/the-lord-of-the-rings-gollum-60-fps-ps5/

[screenshots of the RespawnFirst interview]


 

Bramble

Member
Man, I gotta admit I'm very pleased with the 60fps output so far. I do fear, however, that as games get designed more and more solely for the next-gen systems and graphics improve, we'll see more 30fps titles. I hope I'm wrong.
 

Skifi28

Member
Yeah, I know it's always what many want at the start of each gen before reality comes crashing in, but so far the 60fps dream might be reality this time.
It might just be. This time the CPUs are great, with power to spare; all they really need to do is adjust the resolution target and give players a choice. I can't think of any reason not to, and 99% of games released so far have proven it. Even WD Legion is adding a 60fps mode.
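To make "adjusting the resolution target" concrete, here's a minimal sketch of the kind of dynamic-resolution controller these 60fps modes rely on; the thresholds, step size and names are invented for illustration, not taken from any shipping engine:

```cpp
// Hypothetical DRS controller: shrink the internal render resolution when
// GPU frame time exceeds the 60fps budget, grow it back when there is
// headroom. Constants and names are invented for illustration.
#include <algorithm>
#include <cstdio>

struct DrsController {
    double targetMs = 16.6;   // 60fps frame budget
    double scale    = 1.0;    // fraction of native resolution per axis
    double minScale = 0.7;    // floor, e.g. ~1512p when native is 2160p
    double step     = 0.05;

    void update(double gpuFrameMs) {
        if (gpuFrameMs > targetMs * 0.95)        // over budget: shrink
            scale = std::max(minScale, scale - step);
        else if (gpuFrameMs < targetMs * 0.80)   // headroom: grow
            scale = std::min(1.0, scale + step);
    }
};

int main() {
    DrsController drs;
    drs.update(18.2);  // a heavy frame pushes the scale down one step
    std::printf("render width: %d\n", static_cast<int>(3840 * drs.scale));
    return 0;
}
```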
 

Hunnybun

Member
I'm increasingly confident that the gleeful, doom-mongering "30fps is here FOREVER" crowd will be proven wrong this generation.

The logic was never very sound anyway, given the improvement in frame rates from like 20fps in the mid-90s to dodgy 30fps on the PS360 to solid 30fps last time.
 

Kuranghi

Member
Great news, but they probably shouldn't have rendered their cutscenes out at 30fps if they're targeting 60fps; it'll be a bit jarring to transition between them.
 

fatmarco

Member
If a game is cross-gen there's really no reason it shouldn't be 60 fps, and judging by the way this game looks it's most definitely cross-gen.
 

VFXVeteran

Banned
This is great. But what's not so great, to me, are the sacrifices needed to get there. DRS is going to be standard this entire gen, and I hate that reality.
 
VFXVeteran, are you laughing because you care a lot about resolution, or do you honestly think 30fps can still add more to the presentation than 60fps? Like, holy crap at all the graphical giants the 6th gen had at 60fps. And here we are, three generations later!

The motion resolution alone makes 60fps worth shooting for. 60fps at a lower resolution actually looks clearer in motion.

I like native 4K, but only if it's not sacrificing anything else to get there.
 

JeremyEtcetera

Unconfirmed Member
I'm increasingly confident that the gleeful, doom-mongering "30fps is here FOREVER" crowd will be proven wrong this generation.

The logic was never very sound anyway, given the improvement in frame rates from like 20fps in the mid-90s to dodgy 30fps on the PS360 to solid 30fps last time.
We will see how this gen goes. It looks promising, but for console players it's always been a carrot-on-a-stick situation where framerate is concerned.
 

SilentUser

Member
Give me a 60fps option and I will most likely play that way. I really don't need native 4K for now; 1440p is enough with most of the upscaling tech we have right now.
 

VFXVeteran

Banned
VFXVeteran, are you laughing because you care a lot about resolution, or do you honestly think 30fps can still add more to the presentation than 60fps? Like, holy crap at all the graphical giants the 6th gen had at 60fps. And here we are, three generations later!

The motion resolution alone makes 60fps worth shooting for. 60fps at a lower resolution actually looks clearer in motion.

I like native 4K, but only if it's not sacrificing anything else to get there.
If we shoot for 60FPS with these GPUs then we will sacrifice a lot of compute for the more complex rendering features. You won't get the SSD->VRAM pipeline like in UE5 @ 60FPS with 8K texture maps. You'll forgo RT completely. You'll probably have to stick with texture filtering lower than 16x, and you'll be stuck with even lower res during DRS (depending on complexity, we are looking at 1080p levels). There would literally be no progress on the graphics front. We'll be stuck with what we have now for the entire generation.

I'd much rather have a better approximation of the rendering equation @ 30FPS. That means more pixels inside the triangles and all the other benefits of rendering framebuffers at a higher resolution for all the screen-space FX.
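To put rough numbers on the trade-off being argued here: a 30fps frame has ~33.3ms of GPU time, a 60fps frame only ~16.7ms, so every feature's cost has to fit in half the budget. A toy budget check, with made-up per-feature costs:

```cpp
// Toy frame-budget check: sums hypothetical per-feature GPU costs and
// reports whether they fit a 30fps (33.3 ms) or 60fps (16.7 ms) frame.
// The millisecond costs are invented for illustration only.
#include <cstdio>

int main() {
    struct Feature { const char* name; double ms; };
    const Feature features[] = {
        {"geometry + shadows",         6.0},
        {"lighting + RT reflections",  7.5},
        {"post-processing + upscale",  3.5},
    };
    double total = 0.0;
    for (const auto& f : features) total += f.ms;

    std::printf("total: %.1f ms\n", total);                       // 17.0 ms
    std::printf("fits 30fps (33.3 ms): %s\n", total <= 33.3 ? "yes" : "no");
    std::printf("fits 60fps (16.7 ms): %s\n", total <= 16.7 ? "yes" : "no");
    return 0;
}
```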
 
If we shoot for 60FPS with these GPUs then we will sacrifice a lot of compute for the more complex rendering features. You won't get the SSD->VRAM pipeline like in UE5 @ 60FPS with 8K texture maps. You'll forgo RT completely. You'll probably have to stick with texture filtering lower than 16x, and you'll be stuck with even lower res during DRS (depending on complexity, we are looking at 1080p levels). There would literally be no progress on the graphics front. We'll be stuck with what we have now for the entire generation.

I'd much rather have a better approximation of the rendering equation @ 30FPS. That means more pixels inside the triangles and all the other benefits of rendering framebuffers at a higher resolution for all the screen-space FX.
We already have RT at 60fps in the early days. But honestly, not every game even needs RT (except for reflections, if that's the cheapest way to kill SSR usage, because it's an eyesore). The SSD is no less important at 60fps unless I'm missing something (please tell me if I am); Ratchet & Clank at 60fps with RT exists and allows for that instant warping. I mean yeah, you can do more of everything at 30fps, that goes without saying. Not sure why we need 8K textures at 4K or lower. Some of that detail would be hidden by the resolution, right? Texture filtering, I'll agree there; I'd like to see 16x AF standard, and 60fps will probably get in the way sometimes. But if developers wanted it, they could factor that in.

These are video games, not CGI theme parks à la The Avengers or Pacific Rim. 60fps is important for motion clarity, responsiveness and the overall presentation. At 1440p60 with temporal injection you can have much more detail than 1440p30 on PS4 Pro. If a game is complex enough to drop the resolution all the way to 1080p, then it will damn well be more impressive than last gen, assuming it's optimized. In other words, there'd still be a leap. PS4 Pro has games which drop below 1080p with DRS; see Nioh, for example.
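For scale, some simple pixel-count arithmetic (mine, not from the post): native 4K is 3840 × 2160 ≈ 8.29M pixels, 1440p is 2560 × 1440 ≈ 3.69M, and 1080p is 1920 × 1080 ≈ 2.07M. A 1440p framebuffer therefore shades roughly 2.25× fewer pixels than native 4K, and 1080p roughly 4× fewer - which is the per-pixel budget a 60fps mode claws back.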

We can get shinier graphics than last gen while remaining at 60fps. Although a small leap, Demon's Souls is already a bit further than last-gen stuff, at least overall, technically. It's got more polygons, for one. Going further into the uncanny valley while sacrificing gameplay and motion clarity is pointless for games. Thank god they didn't have this mentality in the 6th generation on PS2. Unless a game needed a lower framerate to even work, like Shadow of the Colossus. Which, again, I'm fine with 30fps if that's the case. There has to be a good excuse.

Saying we'll have no improvement is head-scratching; games are barely using the SSDs yet in this cross-gen period. Reconstruction methods will get better and better, and developers will come to grips with the new features more and more. Games are still being made with the 2012 GCN architecture in mind.

I think you should wait and see what Ratchet looks like in its 60fps mode before writing off 60fps completely. If a game comes out at 30fps and just blows me away visually compared to 60fps stuff, I will happily admit it. But I kind of think such a game would be high resolution anyway, and dropping that resolution could result in mostly the same detail and mostly the same level of image quality. Demon's Souls is already a great example of that!

We'll see how it plays out, but it's clear that 60fps is much less limiting than last generation, and especially the PS3 and 360 generation.
 

VFXVeteran

Banned
We already have RT at 60fps in the early days.
Simple RT reflections with extremely low-resolution scene objects aren't what I call doing RT right. It looks worse than just having sharper SSR. Also, RT goes a lot deeper than what you guys are describing. Cyberpunk/Metro have the right idea, using RT in their lighting engines. That takes up enormous GPU bandwidth.

The SSD is no less important at 60fps unless I'm missing something (please tell me if I am);
The reason why you would need the SSD in the UE5 demo is that you are trying to store an enormous number of triangles and/or really high-resolution textures. You kill your rendering budget by trying to aim for 60FPS while trying to render large texture sizes. Why store 8K texture maps and render the final framebuffer at 1080p? It defeats the purpose of the SSD->VRAM pipeline if you don't have enough resolution to see any kind of detail. It's like loading in 8K textures but then using a filter kernel so wide on that texture that you start averaging many texels to fit inside of one large screen pixel. It'll look like a blurry mess.
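For reference, the mip-selection behaviour this describes: the hardware picks the mip level whose texel density matches the pixel density, so an 8K map seen through a 1080p framebuffer is mostly sampled several mips down. A rough sketch with illustrative numbers:

```cpp
// Rough sketch of the standard mip-selection rule being alluded to: pick
// the mip whose texel density matches the pixel density. Numbers invented.
#include <cmath>
#include <cstdio>

int main() {
    const double textureSize = 8192.0;   // 8K source texture
    // Suppose the textured surface covers ~960 pixels of a 1080p frame:
    const double pixelsCovered  = 960.0;
    const double texelsPerPixel = textureSize / pixelsCovered;  // ~8.5
    const double mipLevel       = std::log2(texelsPerPixel);    // ~3.1

    std::printf("texels per pixel: %.1f -> mip %.1f\n",
                texelsPerPixel, mipLevel);
    // Mip 3 of an 8K map is 1024x1024: most of the 8K detail never
    // reaches the screen at this framebuffer resolution.
    return 0;
}
```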

Ratchet & Clank at 60fps with RT exists and allows for that instant warping.
That's just fast loading, and the RT is minimal (low-grade reflections). It offers nothing big as far as enhanced graphics features go (unlike last gen's PBR shaders).

I mean yeah, you can do more of everything at 30fps, that goes without saying. Not sure why we need 8K textures at 4K or lower. Some of that detail would be hidden by the resolution, right?
Yes, but the detail you lose at native 4K isn't as dramatic as at a 1440p/1080p framebuffer resolution - you'd see really nice detail up close. That's the reason for using high-res textures.

Texture filtering, I'll agree there; I'd like to see 16x AF standard, and 60fps will probably get in the way sometimes. But if developers wanted it, they could factor that in.
Evidently, from what we are seeing, it's too expensive to run.

We can get shinier graphics than last gen while remaining at 60fps.
But you can't describe what that means. Anything you label "shinier" is going to cost milliseconds of GPU time - which drops frames.

Although a small leap, Demon's Souls is already a bit further than last-gen stuff, at least overall, technically. It's got more polygons, for one.
Demon's Souls has complex texturing layers with high-resolution normal/relief maps. It can't render those high-res textures at 60FPS, which is why they are significantly reduced in "Performance" mode. That is a clear indication that trying to stream 8K textures from the SSD @ 60fps will be near impossible - especially if the textures are unique.

Saying we'll have no improvement is head-scratching; games are barely using the SSDs yet in this cross-gen period. Reconstruction methods will get better and better, and developers will come to grips with the new features more and more. Games are still being made with the 2012 GCN architecture in mind.
Game developers already know what they are doing. They have plenty of papers to read and example code to implement. There is no magic in these consoles as far as development goes; they just have to make the content. Most games will be third-party, which means their engines will be portable across multiple platforms. That leaves no room for custom optimizations for each platform. Besides, a lot of these multiplatform engines (UE, Frostbite, RE Engine, etc.) are already very well optimized natively for the graphics features we've seen so far.
 
VFXVeteran, you don't care about motion resolution, smoothness and input response?

No, I get it: RT can be used for every part of the pipeline, but using it for GI and everything in between only makes sense if realism is the primary goal. Otherwise it's too expensive and wasteful, and it can be approximated through other methods. Nintendo's art style, for example, is going to look better at native 4K with approximated lighting than at 1080p with RT everything. Not every game, or indeed every part of a game, needs RT. Guerrilla Games stated when they made Shadow Fall that there is still room for baked lighting, as it gives more artistic control to developers, which makes a lot of sense.

4K textures are enough. I can live with that. Myself and many others can appreciate N64 games; we can live with 4K textures. We're not going to have photorealism this gen just because 8K textures are possible. Many PS4 games didn't have 4K textures! As for SSR, it's awful. Having something clip and warp out of existence is incredibly distracting and a clear step back from prior-generation methods, especially the examples on GameCube. Hell, Mario 64 has more appealing reflections, and that is no joke. I don't care how complex a technique is; it needs to look good.

The textures in Demon's Souls are the same resolution in Performance mode. There are tessellation and shadow differences. If that's what you meant, then yeah, but it's a small difference in the middle of a fight. I understand your point that it's impossible; I just don't see how that extra detail is worth it for this interactive medium.
But you can't describe what that means. Anything you label "shinier" is going to cost milliseconds of GPU time - which drops frames.
I mean anything that adds to the presentation: extra geometry, better AA methods, higher-res shadows, more particles (Ratchet PS5), 4K textures where a previous game didn't have them, etc. Compare Ratchet PS5 to PS4 when it releases. Bells and whistles don't have to mean RT.
Game developers already know what they are doing. They have plenty of papers to read and example code to implement. There is no magic in these consoles as far as development goes; they just have to make the content. Most games will be third-party, which means their engines will be portable across multiple platforms. That leaves no room for custom optimizations for each platform. Besides, a lot of these multiplatform engines (UE, Frostbite, RE Engine, etc.) are already very well optimized natively for the graphics features we've seen so far.
Oh come on, this is just you with blinders on. Rockstar "knew what they were doing" with GTA 3; that doesn't mean San Andreas wasn't leaps and bounds ahead. Epic knew what they were doing with Gears 1, but look at Gears 3. Exclusives still exist. Plus, I'm sure you know how hard the cross-gen period is on developers, and how much more optimization they'll be able to do once they're not shackled to the Xbox One, PS4 and their Pro models. Cross-gen periods are brutal for developers. The lowest common denominator is still the base Xbox One!

Anyway, at the end of the day I'm looking at this through the lens of what's best for games, not just the highest-fidelity graphics. Considering your work, though, I can understand your focus on that aspect. I'm sure some developers will highly optimize their 30fps modes, even if optional 60fps modes become standard.
 
VFXVeteran

The reason why you would need the SSD in the UE5 demo is that you are trying to store an enormous number of triangles and/or really high-resolution textures.

No, the reason is to stream just the needed part of the triangles from the stored models and the needed parts of lots of high-resolution textures (the texture resolution depends on what you need). In the UE5 demo they stream what they need; as they mentioned, they have billions of triangles in the source models but crunch that down to 20 million or so. The same goes for textures: they may be 8K, but they are virtualized, so the engine takes what it needs based on distance from the camera and screen resolution. The idea is that you don't have to make LODs, mip maps, etc.



You kill your rendering budget by trying to aim for 60FPS while trying to render large texture sizes. Why store 8K texture maps and render the final framebuffer at 1080p?

Why would you need to render 8K textures at 1080p to begin with? You can have them stored at 8K, but you'll be virtualizing them, ending up with the best texture resolution required for the object without having to put an 8K texture on the triangles of a small rock. It's like a mip map, but automatic.
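A hedged sketch of the virtual-texturing idea described above (the structure and names are hypothetical, not UE5's actual API): the large source texture is split into small tiles, and only the tiles at the mip level the camera actually needs get streamed from SSD into a resident cache:

```cpp
// Hypothetical virtual-texture cache: pages (tiles of one mip of one
// texture) touched by the renderer this frame are requested; anything not
// already resident is queued for an SSD -> VRAM load. Illustrative only.
#include <cstdint>
#include <set>
#include <tuple>

struct PageId {  // one tile of one mip level of one virtual texture
    uint32_t textureId, mip, x, y;
    bool operator<(const PageId& o) const {
        return std::tie(textureId, mip, x, y)
             < std::tie(o.textureId, o.mip, o.x, o.y);
    }
};

struct VirtualTextureCache {
    std::set<PageId> resident;

    // Called with pages the renderer touched this frame (e.g. from a
    // feedback pass); a newly seen page triggers a streaming request.
    void request(const PageId& page) {
        if (resident.insert(page).second) {
            // Issue an async SSD -> VRAM read for this small tile
            // (e.g. a 128x128 block) rather than the whole 8K map.
        }
    }
};
```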





But you can't describe what that means. Anything you label "shinier" is going to cost milliseconds of GPU time - which drops frames.

He is referring to overall better graphics than last gen at 60fps thanks to the better specs, which is totally true. Sure, you can't go crazy if you aim for 60fps, but I don't think he means using full RT everywhere and things like that; it depends on the game and how "shiny" you want to go.

Everything you put on screen will require milliseconds of GPU time, but that is what the GPU is for :p
 
Companies really need to unlock their older games. I nearly got Arse Creed Odyssey in the spring PS5 sale until I did my due diligence and found out it's locked at 30fps. I'll take that hit down to 30 in five years' time as games get bigger and more complex. But not on a fucking PS4 game.
 

VFXVeteran

Banned
No, the reason is to stream just the needed part of the triangles from the stored models and the needed parts of lots of high-resolution textures (the texture resolution depends on what you need). In the UE5 demo they stream what they need; as they mentioned, they have billions of triangles in the source models but crunch that down to 20 million or so. The same goes for textures: they may be 8K, but they are virtualized, so the engine takes what it needs based on distance from the camera and screen resolution. The idea is that you don't have to make LODs, mip maps, etc.
You are saying there will be 8K/4K/2K, etc., and you'll get automatic MIP-mapping. The MIP-mapping algorithm requires speeds much, much faster than SSD->VRAM. I highly doubt that's how they are using it. If your statement were true, that simple demo should have been running at 4K@60FPS with the proper LOD being loaded to sustain that kind of bandwidth. Instead we see a limit of 1440p@30FPS. We need to define why there is a limit when it's streaming, and what causes that limit.
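Some back-of-the-envelope numbers on that bandwidth question (my arithmetic, not Epic's or Sony's): an uncompressed 8192 × 8192 RGBA8 texture is 8192 × 8192 × 4 bytes = 256MB, or 64MB with 1-byte-per-texel block compression, while the PS5's ~5.5GB/s raw SSD rate delivers only about 92MB per 16.7ms frame. Streaming whole 8K maps per frame is indeed implausible; tile-based virtual texturing, which streams only the visible pages of the needed mip, is how engines sidestep that limit.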

Why would you need to render 8K textures at 1080p to begin with? You can have them stored at 8K, but you'll be virtualizing them, ending up with the best texture resolution required for the object without having to put an 8K texture on the triangles of a small rock. It's like a mip map, but automatic.
Most games today render out 2-4K texture maps. The magic of the UE5 demo is indeed the high-res textures (8K) and the high-res geometry that completely replaces any sort of baking of normal maps, etc. If we stick to 2-4K textures, then most games won't need SSD->VRAM capabilities in order to look good. Especially on PC, which can just load them all in at the beginning of a level at once.
 
Good convo. Appreciate the discussion.

The textures in Demon's Souls are not the same resolution from 30FPS to 60FPS. I can prove it.
Appreciate your input.

Admittedly, I just watched a video and to my eye it seemed the same, but dark10x also says it's the same. I don't doubt you, mate, but if you could show that, I'm sure more than just me would appreciate it.
 

VFXVeteran

Banned
Appreciate your input.

Admittedly, I just watched a video and to my eye it seemed the same, but dark10x also says it's the same. I don't doubt you, mate, but if you could show that, I'm sure more than just me would appreciate it.
[comparison screenshot: u2m2Ft2.jpg]


The best way to tell whether textures are the same resolution is to pay attention to the normal maps. Normal maps are full 32-bit float values, and because of the accuracy required to get the height values which compute the normals for the triangles, the lighting loop is especially sensitive to them. When you have a lower-res normal map combined with texture filtering, the detail starts to appear blurred, as you can see in the highlighted blue circles compared to the red circles from the high-res map. My eye picks this up immediately because I'm used to it. Others may not notice the difference while playing.
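A toy illustration of why a glossy lighting response is so sensitive to normal-map precision (all values invented): a small tilt in the surface normal, the kind of detail a blurred, lower-res normal map averages away, roughly halves a tight specular highlight:

```cpp
// Toy specular lobe: a Blinn-Phong style highlight with a glossy exponent.
// Shows how a few degrees of normal error visibly dims the highlight.
#include <algorithm>
#include <cmath>
#include <cstdio>

double specular(double normalTiltRadians) {
    const double cosAngle = std::cos(normalTiltRadians);
    return std::pow(std::max(cosAngle, 0.0), 128.0);  // high exponent = glossy
}

int main() {
    const double degToRad = 3.14159265358979 / 180.0;
    std::printf("aligned normal: %.3f\n", specular(0.0));            // 1.000
    std::printf("6 deg tilt    : %.3f\n", specular(6.0 * degToRad)); // ~0.495
    return 0;
}
```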
 
You are saying there will be 8K/4K/2K, etc., and you'll get automatic MIP-mapping. The MIP-mapping algorithm requires speeds much, much faster than SSD->VRAM. I highly doubt that's how they are using it. If your statement were true, that simple demo should have been running at 4K@60FPS with the proper LOD being loaded to sustain that kind of bandwidth. Instead we see a limit of 1440p@30FPS. We need to define why there is a limit when it's streaming, and what causes that limit.

In the quote you posted I mentioned they were virtualized. Also, the demo includes many assets; we don't know how many different textures are loaded (only that there are a lot and they are cinematic grade). I don't see how that translates to "the demo should have been running at 4K@60FPS", as if everything were free just because you can get the texture and geometry you need at the correct distance very fast. They may improve resolution and framerate with some optimization; I get the impression the demo is more of a stress test, and devs will get more out of the system by playing to its strengths. I also think more optimizations were made to the engine after the demo, and it will keep improving during the generation. There may also be other engines that exploit the system differently, with other results.

Most games today render out 2-4K texture maps. The magic of the UE5 demo is indeed the high-res textures (8K) and the high-res geometry that completely replaces any sort of baking of normal maps, etc. If we stick to 2-4K textures, then most games won't need SSD->VRAM capabilities in order to look good. Especially on PC, which can just load them all in at the beginning of a level at once.

Current games already look very good. I think DF mentioned Spider-Man on the regular PS4 looked fantastic even at 4K, or something like that. I think they also mentioned the UE5 demo looked so good they had trouble establishing its resolution, which probably had to do with using triangles that go down to pixel size.

Loading from the SSD very fast doesn't mean only textures can get better; there are a lot of advantages, and devs will find more. I get the impression (maybe I'm wrong) that you are talking about games as if every game had the same requirements, which is not true. You can have games with 8K textures, or you may prefer a bigger number of textures at 2-4K, or you may settle for an insane number of different small-resolution textures blended with other textures to add detail, or prefer more geometry, or instant loading of assets or scenes so everything feels like one big world. Or you can balance all of that, or go insane with lots of high-resolution assets and 3D models in big scenes. The idea is freedom in your development.
 

VFXVeteran

Banned
In the quote you posted I mentioned they were virtualized. Also, the demo includes many assets; we don't know how many different textures are loaded (only that there are a lot and they are cinematic grade). I don't see how that translates to "the demo should have been running at 4K@60FPS", as if everything were free just because you can get the texture and geometry you need at the correct distance very fast. They may improve resolution and framerate with some optimization; I get the impression the demo is more of a stress test, and devs will get more out of the system by playing to its strengths. I also think more optimizations were made to the engine after the demo, and it will keep improving during the generation. There may also be other engines that exploit the system differently, with other results.



Current games already look very good. I think DF mentioned Spider-Man on the regular PS4 looked fantastic even at 4K, or something like that. I think they also mentioned the UE5 demo looked so good they had trouble establishing its resolution, which probably had to do with using triangles that go down to pixel size.

Loading from the SSD very fast doesn't mean only textures can get better; there are a lot of advantages, and devs will find more. I get the impression (maybe I'm wrong) that you are talking about games as if every game had the same requirements, which is not true. You can have games with 8K textures, or you may prefer a bigger number of textures at 2-4K, or you may settle for an insane number of different small-resolution textures blended with other textures to add detail, or prefer more geometry, or instant loading of assets or scenes so everything feels like one big world. Or you can balance all of that, or go insane with lots of high-resolution assets and 3D models in big scenes. The idea is freedom in your development.

The point of the argument is how limited the consoles are relative to what game companies want to achieve. People would rather have 60FPS games across the board; that comes with a cost. I'm just pointing out the sacrifices that would need to be made to achieve 60FPS. A lot of those sacrifices will cancel out any "superior graphical feature", RT lighting for example. Bottom line: gamers can't expect everything, because the hardware isn't powerful enough.
 
The point of the argument is how limited the consoles are relative to what game companies want to achieve. People would rather have 60FPS games across the board; that comes with a cost. I'm just pointing out the sacrifices that would need to be made to achieve 60FPS. A lot of those sacrifices will cancel out any "superior graphical feature", RT lighting for example. Bottom line: gamers can't expect everything, because the hardware isn't powerful enough.

It's true the hardware is limited, but that's hardly something we don't know from past generations, especially on a forum like this; it's true of all hardware, not just consoles. I don't see where gamers are expecting "everything". The generation will run its course, and more effects will become available with more power in the future.
 

stickkidsam

Member
I get that people like 60FPS because of how smooth it is, but why is it that any time I hear people talk about it, they refer to 30FPS as if it's terribly detrimental to gameplay? I've been playing games for years, and while smooth is nice, 30FPS has never hurt my ability to respond to what's happening.

Wouldn't it be far better if devs weren't focusing so hard on resolution and FPS and instead were allowed to target 1080p 30FPS? Think of how many system resources that would free up for the gameplay itself.
 
I get that people like 60FPS because of how smooth it is, but why is it that any time I hear people talk about it, they refer to 30FPS as if it's terribly detrimental to gameplay? I've been playing games for years, and while smooth is nice, 30FPS has never hurt my ability to respond to what's happening.

Wouldn't it be far better if devs weren't focusing so hard on resolution and FPS and instead were allowed to target 1080p 30FPS? Think of how many system resources that would free up for the gameplay itself.
I'd like to see you try to play F-Zero GX at 30fps. If that were even possible.

1080p30fps on PS5 would be counterproductive; there were PS4 games that had detail hidden by the 1080p resolution which only became apparent with Pro patches. Taking that detail much further while still at 1080p would mean rendering things that aren't even fully appreciable.

1440p or higher at 30fps is where the more-detail approach would start to make sense, though I'm not an advocate of it.
 

stickkidsam

Member
I'd like to see you try to play F-Zero GX at 30fps. If that were even possible.

1080p30fps on PS5 would be counterproductive; there were PS4 games that had detail hidden by the 1080p resolution which only became apparent with Pro patches. Taking that detail much further while still at 1080p would mean rendering things that aren't even fully appreciable.

1440p or higher at 30fps is where the more-detail approach would start to make sense, though I'm not an advocate of it.
Some genres, such as fighting or racing games, absolutely warrant high FPS. I'm talking about the industry as a whole, though. Most games don't demand such a high bar, so why is it being set there?

We've had incredible games that defined genres for far longer than 60 FPS or 1440p have been a thing. I don't see why those need to be a priority now (or ever, really). Games could focus on advancing mechanics and interaction, with the benefit of modern tech allowing a stable frame rate and (what I consider to still be) a gorgeous level of detail. The industry needs to slow the fuck down and take a breath instead of continually cutting itself on the graphical edge.
 

01011001

Banned
Well, that's subjective. I want a clean render. I don't care if the game runs at 60FPS. I'd rather have the quality render @ 30FPS. Best of both worlds would simply be having hardware powerful enough to do 4K @ 60FPS.

that's why your name is VFX veteran and not gaming veteran, I guess.

fuck 30fps; hopefully this gen is the very last time we see it, and hopefully we won't see it that often either!

an interactive medium needs to push the part that benefits its interactive nature, not the window dressing around it. Performance should always come first.

We've had incredible games that defined genres for far longer than 60 FPS or 1440p have been a thing. I don't see why those need to be a priority now (or ever, really).

I can't even begin to describe how much this sentence made me cringe... have you played a single video game on any 2D system, or during the 6th generation? wtf?

60fps WAS the standard. We abandoned that standard for a brief moment during the PS1 and N64 era, when 3D was a new thing, but then quickly made it the standard again during the PS2 era. It was only after that that developers primarily focused on graphics instead of performance. And on Xbox One/PS4 it was also the hardware's fault, tbh: with their dogshit CPUs and outdated GPUs they were immediately prone to performance issues with any game that pushed beyond what launch titles looked like.
 

VFXVeteran

Banned
that's why your name is VFX veteran and not gaming veteran, I guess.
I would push this even if I were working on a game. That's just a personal preference.

fuck 30fps; hopefully this gen is the very last time we see it, and hopefully we won't see it that often either!

an interactive medium needs to push the part that benefits its interactive nature, not the window dressing around it. Performance should always come first.
Game developers didn't need a PS5/XSX in order to hit 60FPS. They could have done that last gen too @ 720/1080p.

If you want pixelated resolution with low-res texture maps, minimal anisotropic filtering, sucky unrealistic ambient GI light probes, lightmaps that don't cast shadows, and conventional 2D sprites for transparent objects, you can always play on a Nintendo console or an indie game on PC.
 

stickkidsam

Member
that's why your name is VFX veteran and not gaming veteran, I guess.

fuck 30fps; hopefully this gen is the very last time we see it, and hopefully we won't see it that often either!

an interactive medium needs to push the part that benefits its interactive nature, not the window dressing around it. Performance should always come first.



I can't even begin to describe how much this sentence made me cringe... have you played a single video game on any 2D system, or during the 6th generation? wtf?

60fps WAS the standard. We abandoned that standard for a brief moment during the PS1 and N64 era, when 3D was a new thing, but then quickly made it the standard again during the PS2 era. It was only after that that developers primarily focused on graphics instead of performance. And on Xbox One/PS4 it was also the hardware's fault, tbh: with their dogshit CPUs and outdated GPUs they were immediately prone to performance issues with any game that pushed beyond what launch titles looked like.
So for a brief moment in 3D console gaming, 60FPS became the standard, only to drop again. My point is that 60 FPS isn't a necessary bar for games to meet, and bringing up 2D systems, or the fact that games existed with it, doesn't change that. Yeah, I played games on 2D systems. My enjoyment of them wasn't dependent on the frame rate hitting 60; it was great gameplay with stable performance.
 

Alright

Banned
1080p@60fps with all of the physics bells and destructive whistles attached. Upscale using whatever magic technique you have to make it look better. I miss having the physics of HL or the destruction of BF:BC in games.
 

Daymos

Member
Better graphics?
4K resolution?
60fps+?

Pick two, or pay $2,000+ for an ultra-high-end PC. Slap a PS6 sticker on it, hook it up to your TV, and attach a PlayStation controller to Steam.
 

cromofo

Member
I might actually get a PS5 down the road if 60fps becomes the standard. Anything lower than that is unplayable for me; 60 is the bare minimum.
 