
36 Teraflops is still not enough for 4K 60 FPS minimum

Astral Dog

Member
I couldn't give a fuck about 4k when 1080p still hasn't got the levels of destruction that Bad Company 2 did on the Xbox 360, two fucking generations ago.

Give me physics, particle effects and all of the bells and whistles, before you chase 4k
Hopefully with the new CPUs, advancements in other areas can be made, but it's gonna take a while until the cross-gen/early-gen phase is done
 

SlimySnake

Flashless at the Golden Globes
I couldn't give a fuck about 4k when 1080p still hasn't got the levels of destruction that Bad Company 2 did on the Xbox 360, two fucking generations ago.

Give me physics, particle effects and all of the bells and whistles, before you chase 4k
It's coming later this year.




I expect most games to be 1440p next gen. We have already seen this with the 1440p 30 fps UE5 demo and 1440p 60 fps for Demon's Souls.
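The pixel math backs this up; here's a rough sketch (standard resolution figures, assuming GPU cost scales roughly linearly with pixel count, which is a simplification):

```python
# Pixel counts for common render resolutions.
res = {"1080p": 1920 * 1080, "1440p": 2560 * 1440, "4K": 3840 * 2160}

# Native 4K pushes 2.25x the pixels of 1440p, so a GPU-bound game claws
# back most of that cost by dropping to 1440p.
print(res["4K"] / res["1440p"])  # 2.25
print(res["4K"] / res["1080p"])  # 4.0
```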
 

Alright

Banned
Astral Dog said:
Hopefully with the new CPUs, advancements in other areas can be made, but it's gonna take a while until the cross-gen/early-gen phase is done

SlimySnake said:
It's coming later this year.




I expect most games to be 1440p next gen. We have already seen this with the 1440p 30 fps UE5 demo and 1440p 60 fps for Demon's Souls.
Call me a cynical old fart, but I've heard all the promises before. The PS2's "Toy Story graphics" took another two generations to get close.

I see promises dangled in front of us, with 4K and 1440p, ray tracing etc. tagged on. I don't want 4K, or RT, or 1440p, not until I can blow up buildings like I could in BC2, or hell, even Syndicate Wars for the PSX! Edit: rant not over. Kingdom Under Fire 1 and Crusaders for the Xbox, Ninety-Nine Nights on 360, and the crème de la crème of physics: Half-Life 1 and/or 2.

And as for advancements, how can we go backwards from the 360? Just gimme whatever was in that, beef it up so it can do it in 1080p, bosh, done. (Upscaling and backwards compatibility don't count, 'cause it's then still an old-arse game with a shiny coat of paint.)
 

Woo-Fu

Banned
No amount of hardware will ever be enough when a developer can choose to spend that performance on something other than framerate/resolution. Welcome to reality.
 

SlimySnake

Flashless at the Golden Globes
Alright said:
Call me a cynical old fart, but I've heard all the promises before. The PS2's "Toy Story graphics" took another two generations to get close.

I see promises dangled in front of us, with 4K and 1440p, ray tracing etc. tagged on. I don't want 4K, or RT, or 1440p, not until I can blow up buildings like I could in BC2, or hell, even Syndicate Wars for the PSX! Edit: rant not over. Kingdom Under Fire 1 and Crusaders for the Xbox, Ninety-Nine Nights on 360, and the crème de la crème of physics: Half-Life 1 and/or 2.

And as for advancements, how can we go backwards from the 360? Just gimme whatever was in that, beef it up so it can do it in 1080p, bosh, done. (Upscaling and backwards compatibility don't count, 'cause it's then still an old-arse game with a shiny coat of paint.)
If you look at the processing power of the Jaguar CPUs, they are basically on par with the Cell and the Xbox 360 CPUs. Around 100 GFLOPS. Someone ran a simulation to confirm this; I believe it was in a GDC conference. I mean, the PS4 CPU clock is 1.6 GHz, which is exactly half that of the Cell, which was 3.2 GHz.

So while they got a decent jump in the GPU (5x for Xbox, 8x for PS4), it clearly didn't happen for the CPUs. They went with larger environments and called it a day.
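For reference, here's a back-of-the-envelope peak-FLOPS sketch (cores × clock × FLOPs per cycle; the per-cycle figures are approximate public specs, and real games see far less than peak):

```python
# Rough peak FP32 throughput: cores * GHz * FLOPs per cycle.
def peak_gflops(cores, ghz, flops_per_cycle):
    return cores * ghz * flops_per_cycle

print(peak_gflops(8, 1.6, 8))   # PS4 Jaguar (128-bit SIMD, no FMA): ~102 GFLOPS
print(peak_gflops(6, 3.2, 8))   # Cell, 6 game-usable SPEs: ~154 GFLOPS (~205 with all 8)
print(peak_gflops(8, 3.5, 32))  # PS5 Zen 2 (dual 256-bit FMA): ~896 GFLOPS, roughly 8x Jaguar
```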

This time around, the CPUs are a generation ahead, with almost 7x-8x the performance of the last-gen base consoles. It will let them push more NPCs, more destruction, and higher framerates. The vision won't be compromised. If we don't get better destruction and more NPCs, it won't be because of weak hardware. DICE are making a next-gen-only game and have promised amazing destruction and a massive number of players on screen at once, and I think they will get there, if not this year then with their next release.

Of course, if they tack on RT or focus on hitting native 4K then we can forget about all that, but looking at everyone's willingness to drop to 1440p (AC Valhalla and Watch Dogs both spend most of the time at 1440p on next-gen consoles), we can safely say that they will not be wasting a lot of resources on native 4K rendering.

60 fps might limit some destruction though.
 

JMarcell

Member
That's because game developers focus on developing the console versions as a priority, leaving the PC version for a small and/or less talented crew. The reason for that is that the console version sells three times more, and optimizing a game for PC is more difficult (since the developer has to consider the various different CPU/GPU combinations and try to make it run on every one of them). To make things worse, PC has a piracy problem, and to keep the game from being pirated the developers use Denuvo, which tanks game performance even more.
 
You can't back up your choices with ideas from the Internet or the clown Jensen Huang, but you definitely can with a list of games and other stuff. I don't care about people who supposedly know what they are talking about and upload daily videos on this topic specifically, or whine on toxic PC sites; it doesn't change the fact that the final say goes to the games themselves. "My RTX blah-blah blows yours out of the water" doesn't change the fact that it performs modestly in game X. Deal with it.
 

Soodanim

Member
You essentially need a fast paced game played with a mouse to really see the difference. Any controller game would be too slow.
I'd say that's half right. Fast paced either in terms of camera movement in the two dimensions of the view, or fast in the third dimension, like a racing game. Imagine something like Wipeout or F-Zero at 360fps.
 

Paulxo87

Member
In, let's say, 20 years, when hardware is so powerful that you can make a game look completely photorealistic to your heart's desire and the sense of scale is infinite, you'll then have your always-60fps games lmao. I don't understand why people can never grasp this.
 
Paulxo87 said:
In, let's say, 20 years, when hardware is so powerful that you can make a game look completely photorealistic to your heart's desire and the sense of scale is infinite, you'll then have your always-60fps games lmao. I don't understand why people can never grasp this.
60 fps is already becoming the standard. We won't see many games this gen that don't have a 60 fps mode.
 

Paulxo87

Member
60 fps is already becoming the standard. We won't see many games this gen that don't have a 60 fps mode.

It's only standard for now because none of these games have been pushing the hardware AT ALL, and they are cross-gen to begin with. They are merely making you feel you're getting more for your money. Demon's Souls was built from the ground up as a 60fps game because it was a design choice. All the visuals, everything they wanted, had to be 60fps rock solid.

You think Naughty Dog's new game etc. in a few years' time won't target 30fps for max visuals? Think again, friend lol
 

TheMan

Member
I feel like 4K60 is basically a design choice more than anything. Developers CAN do this, and they could probably have done it last gen too. They just have to design the game around that performance spec.
 
SlimySnake said:
If you look at the processing power of the Jaguar CPUs, they are basically on par with the Cell and the Xbox 360 CPUs. Around 100 GFLOPS. Someone ran a simulation to confirm this; I believe it was in a GDC conference. I mean, the PS4 CPU clock is 1.6 GHz, which is exactly half that of the Cell, which was 3.2 GHz.

Actually, PS3 Cell was over 200 GFLOPS
 

RoboFu

One of the green rats
There is no hard line that will get you any resolution across the board. You can make a game run like total shit at 480p if you are not careful.
 

Alright

Banned
SlimySnake said:
If you look at the processing power of the Jaguar CPUs, they are basically on par with the Cell and the Xbox 360 CPUs. Around 100 GFLOPS. Someone ran a simulation to confirm this; I believe it was in a GDC conference. I mean, the PS4 CPU clock is 1.6 GHz, which is exactly half that of the Cell, which was 3.2 GHz.

So while they got a decent jump in the GPU (5x for Xbox, 8x for PS4), it clearly didn't happen for the CPUs. They went with larger environments and called it a day.

This time around, the CPUs are a generation ahead, with almost 7x-8x the performance of the last-gen base consoles. It will let them push more NPCs, more destruction, and higher framerates. The vision won't be compromised. If we don't get better destruction and more NPCs, it won't be because of weak hardware. DICE are making a next-gen-only game and have promised amazing destruction and a massive number of players on screen at once, and I think they will get there, if not this year then with their next release.

Of course, if they tack on RT or focus on hitting native 4K then we can forget about all that, but looking at everyone's willingness to drop to 1440p (AC Valhalla and Watch Dogs both spend most of the time at 1440p on next-gen consoles), we can safely say that they will not be wasting a lot of resources on native 4K rendering.

60 fps might limit some destruction though.
1440p with that magic upscale voodoo looks good enough to me.

The good thing is, you've given me my optimism back, because I was wondering why we saw such a lack of open-world and complex games last gen (PS4/One X), and the CPU thing makes sense!
 

Paulxo87

Member
It will, but it will also offer a 60 fps mode. I guarantee it.

Offering a 60fps mode implies all the assets, sense of scale etc. were designed for a 60fps game; built from the ground up as a 60fps game. They don't design a game around 30fps and then find themselves easily able to give you a 60fps mode. It's the other way around.

But it's whatever, time will tell.
 
Paulxo87 said:
Offering a 60fps mode implies all the assets, sense of scale etc. were designed for a 60fps game; built from the ground up as a 60fps game. They don't design a game around 30fps and then find themselves easily able to give you a 60fps mode. It's the other way around.

But it's whatever, time will tell.
Just cutting the resolution in half might be enough to push it from 30 to 60. Unless you think the game will be CPU limited to 30 fps, which is... possible, but very improbable.
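A toy frame-budget calc shows why (assuming the game is fully GPU-bound and frame time scales linearly with pixels shaded, which is optimistic):

```python
# Toy GPU-bound model: frame time scales ~linearly with pixels shaded.
# Fixed per-frame costs and CPU work make the real win smaller.
def scaled_frame_ms(old_ms, pixel_ratio):
    return old_ms * pixel_ratio

ms_at_30 = 1000 / 30                   # 33.3 ms per frame at 30 fps
print(scaled_frame_ms(ms_at_30, 0.5))  # half the pixels -> ~16.7 ms, i.e. ~60 fps
```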
 

Jokerevo

Banned
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
Lmao you think that raw power is the reason why your PC should squash 4k 60fps?

PC has had all this power for so long and never actually harnessed it, because anyone who develops for PC has to make their games scale due to thousands of different configs, and because of this you cannot optimise for shit.

That's the price of customisation and flexibility...not to mention Nvidia have zero interest in solving the problem, they want to keep you on the upgrade treadmill for as long as they can...
 
Jokerevo said:
Lmao you think that raw power is the reason why your PC should squash 4k 60fps?

PC has had all this power for so long and never actually harnessed it, because anyone who develops for PC has to make their games scale due to thousands of different configs, and because of this you cannot optimise for shit.

That's the price of customisation and flexibility...not to mention Nvidia have zero interest in solving the problem, they want to keep you on the upgrade treadmill for as long as they can...
I'm fine with that, as long as I'm getting much better performance than "next gen" consoles. Consoles are like Apple products: you gotta be in their garden and succumb to however they want things done. You have no say in the matter, and you end up paying much more for fewer options and less performance than PC gamers get.
 

Razvedka

Banned
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
Well for one, IIRC, Nvidia's new flop calculation is inflated by something like 25% in the 3000 series relative to the 2000 series.

For another, flops aren't everything, and if your game isn't well put together or engineered to use the resources available then you won't have great results. I've also been reading that depending on how the card is put together regarding stream processors/CUs, you could have diminishing returns (relative to that theoretical figure).
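For what it's worth, the "paper" Ampere figure comes from counting both FP32 datapaths per SM; a quick sketch using the 3080's public specs:

```python
# RTX 3080 paper TFLOPS: 68 SMs * 128 FP32 lanes * 2 ops (FMA) * 1.71 GHz boost.
sms, lanes, fma_ops, boost_ghz = 68, 128, 2, 1.71
print(sms * lanes * fma_ops * boost_ghz / 1000)  # ~29.8 TFLOPS

# One of the two datapaths also handles INT32 work, so sustained FP32 in
# real games lands well below that peak, hence the "inflated" feel.
```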

As for your 3090 specifically, I'd say give it time. I think as the next gen tech refresh hits for engines you'll see your mileage out of that card increase. Just need time for tech to mature.
 

longdi

Banned
1440p + VRR seems like a sweet spot.

I have a 200Hz G-Sync monitor and I prefer the variable refresh over high fps. I do feel a difference between 60Hz and 200Hz; it is smoother and all. But VRR is the real deal that makes gaming smoother.
 

Kumomeme

Member
Does it really all come down to the teraflops number alone?

I'm curious how much hardware resource is actually needed:
how much RAM/VRAM, what memory speed, CPU clock, and how many cores and threads are needed to sustain the fps, etc.
 

Agent_4Seven

Tears of Nintendo
What do you guys think?
I think consoles will always be better when it comes to developers squeezing all the juice from them and making the impossible possible, not to mention the advantage of optimizing their games for just one platform with a way better and faster unified memory layout and faster access times. In the meantime, PC has to deal with way, way slower DDR memory which needs to play catch-up, while the CPU and GPU do brute-force work to overcome the consoles' unified memory layout and make games run better on 19857918760981760879 PC configurations.
 
There is a reason you often see consoles using medium, and sometimes below-low, settings in multiplat games: ultra settings can look barely better than medium at times, but halve your frames. A 3090 easily doubles your frames at next-gen-console-equivalent settings in regular games, and will triple them in RT games.
 
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
Optimization, have you heard of it? By the time this generation is up, 4K 60 fps will be the norm.
 

ZehDon

Member
Well OP, console manufacturers aren't actually advertising "native 4K" as a blanket message for their hardware - they're just advertising "4K". Heck, the 3090 from the thread title specifically targets 8K re-constructed via DLSS. The word 'native' isn't really used for advertising purposes. So, your post isn't really off to a good start there, because you're not basing it on reality. In general, targeting a universal native 4K is a pretty terrible use of the hardware. Volumetric and emissive effects rendered natively at that pixel count alone will hurt, and you can underscore that sentence if you want higher accuracy via ray tracing. Add in all the expected bells and whistles of modern games, and that 4K native image gives you pretty lacking results. So, everyone will be using a combination of rendered elements at variable resolutions to build up to a 4K final output... which we knew before the consoles were even announced. Not sure why you thought differently?

As for the 3090 TFLOP comparison, you're using pretty terrible comparisons. As John Carmack said:


So, with the Xbox pushing 12 TFLOPs of "on paper" power, that would be around 24 TFLOPs of realised power. The RTX 3080 has 29 TFLOPs in comparable metrics. That should be more than enough grunt for re-constructed 4K as a general target, though I expect it to drop towards the end of the console generation.
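Spelling that comparison out under Carmack's claimed ~2x console-efficiency factor (the 2x is his estimate, not a measured constant):

```python
# Hedged "effective TFLOPS" comparison; the 2.0 factor is Carmack's claim.
CONSOLE_FACTOR = 2.0
xbox_paper_tf = 12.15   # Xbox Series X, on-paper figure
rtx3080_tf = 29.8       # Ampere on-paper figure, with its own caveats

print(xbox_paper_tf * CONSOLE_FACTOR)  # ~24.3 "PC-equivalent" TFLOPS
print(rtx3080_tf)                      # 29.8: ahead, but not by miles
```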
 

sunnysideup

Banned
Yea, reading all the bullshit from PC fanboys about how they run all games at 4K 60fps on their 5-year-old toaster while console peasants are stuck checkerboarding 30fps. And then you buy a powerful PC and there are not many games that you can run at highest settings 4K/60fps. How the fuck are these PC fandolts able to do it on their GTX 1060?

Also, I think the way games are benchmarked is fucking stupid. Always going for framerate averages. Framerate averages mean fucking nothing. They're useless.

Either you should test whether you can lock it at 30/60/120 or whatever, or, if you are playing on a FreeSync monitor, what fps you can lock it at without any drops, 99% stable. Playing games with an unlocked framerate that constantly drops 20-30 fps is like playing with seizures and vomiting all over your keyboard.
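This is easy to check yourself if you can dump frame times; a minimal sketch (the helper name and log values are made up for illustration, with the cut at the 99th percentile):

```python
import statistics

def frametime_report(frametimes_ms):
    """Averages hide stutter; the worst 1% of frames exposes it."""
    avg_fps = 1000 / statistics.mean(frametimes_ms)
    p99_ms = sorted(frametimes_ms)[int(len(frametimes_ms) * 0.99)]  # 99th-percentile frame time
    return avg_fps, 1000 / p99_ms

# Example: mostly 60 fps (16.7 ms) with occasional 33 ms hitches.
log = [16.7] * 95 + [33.3] * 5
avg, one_pct_low = frametime_report(log)
print(f"avg: {avg:.0f} fps, 1% low: {one_pct_low:.0f} fps")  # avg looks fine; 1% low shows the drops
```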

Also, all games should have a graphics demo like Tomb Raider's, where you can stress-test the settings and lock the framerate/resolution without playing the game. Something that's like 10% more demanding than the actual game, so you can ensure you get a stable locked framerate while playing, and don't ruin the experience by fucking with the settings mid-game.
 
Welcome to NeoGAF, a gaming forum where posters know more about computing hardware than John fucking Carmack.
Appeal to authority, so not an argument. As I said, the tweet is verifiably false. Just look at any current multiplatform game. On consoles, the games run pretty much exactly as you'd expect them to run on similar PC hardware. Twice the performance my ass.
 