
36 Teraflops is still not enough for 4K 60 FPS minimum

LordOfChaos

Member
Agreed, 4K is overrated; 1440p is better for stable framerates.

Thiiis, particularly when they add AMD Super Resolution for upscaling. Why spend twice the GPU power for marginally better results when we're still GPU-limited and could spend it elsewhere?

4K is 3840×2160 (over 8 million pixels), while 1440p is 2560×1440 (3.6 million). I think people sometimes forget how big that difference is. With DLSS 2.0-tier upscaling, or even close to it, a 1440p render upscaled to 4K would be nearly imperceptible from a couch.
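Quick math for anyone who wants the raw numbers (plain Python, nothing assumed beyond the two resolutions):

```python
# Pixel-count comparison: 4K vs 1440p
pixels_4k = 3840 * 2160      # 8,294,400
pixels_1440p = 2560 * 1440   # 3,686,400

print(f"4K:    {pixels_4k:,} pixels")
print(f"1440p: {pixels_1440p:,} pixels")
print(f"4K is {pixels_4k / pixels_1440p:.2f}x the pixels")  # 2.25x
```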

Really want the PS5 to add a native 1440p output mode too.
 

Shubh_C63

Member
I can't wait to ditch my laptop and get a proper PC for 1440p 60fps.
Destiny slumping down to 30 sometimes is like kick in the nut. "literally unplayable" but jokes aside it kills the mood.
 

Tqaulity

Member
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
[face palm] :messenger_unamused: When are people going to learn that you cannot make a blanket statement like that in a vacuum? Performance (resolution and fps) is a function of workload. So guess what: your RTX 3090 can run nearly all games at native 4K 60fps... if YOU TURN THE SETTINGS DOWN!

What your statement is really saying is that you can't run every game at native 4K 60fps with "maximum" settings. Yeah, that's fine for the PC elitist crowd that would never imagine playing a game at anything lower than MAX settings. But then you compare to the consoles as if it were ever an aspiration for any sane developer to ship a console game running at maximum PC settings (a total waste of resources). In fact, most do not; they run at some mix of settings that best balances IQ and performance for the given hardware. Yeah, the proper way to develop a game.

So, moral of the story: it's entirely plausible that the game struggling at native 4K ~50fps and max settings on your 3090 could run at native 4K 60fps with an optimized mix of lower settings on PS5/XSX... and you probably wouldn't even notice the difference unless viewing side-by-side comparisons.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
1440p144 till I can get a 4K144 monitor for pocket change, so I ain't that fussed... give us that update, Sony!

But in reality, 4K on PC and console should be using TAAU, checkerboarding, DLSS, Super Resolution, or whatever new techniques devs come up with. The native 4K dream isn't here yet, but be happy you have a good 4K screen today, cuz there's a chance Hopper will finally be able to deliver native 4K.
 

Valonquar

Member
I dunno, these bargain-bin 10-year-old games off Steam run like a dream. The even older MMO seems to look great too.
 

Buggy Loop

Member
I mean, at what settings?

Ultra-nightmare, "can it run Crysis" settings are dumb as fuck and most often useless, and that's coming from a long-time PC gamer.

I quite like when Digital Foundry makes an optimized settings list for games; they often nail a good balance of performance vs "can you really tell the difference?"
 

cireza

Banned
Who cares when the 68000 is enough for 320x224@60fps goodness. We would not be crying for 120fps if we hadn't made the move to LCD panels by the way.
 

Lethal01

Member
It's not poor optimization, it's just sheer bandwidth limitations. The entire shader pipeline is dependent on pixels requiring shading; all games are "shader" limited. The more pixels you need to shade, the slower the pipeline will be. That's why I told people that even the 3090 isn't good enough for ALL games. It totally depends on how you design your game and what role the graphics pipeline plays.

So they poorly optimized for the bandwidth limitations of the current device.

So yeah, it's poorly optimized.
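For what it's worth, the scaling argument in the quote is easy to sketch. A toy model, assuming a purely pixel-shading-bound game and a made-up 10 ms baseline:

```python
# Toy model of the "more pixels = slower" argument: assume frame time
# scales linearly with rendered pixels. The 10 ms baseline is invented
# purely for illustration, not taken from any real game.
frame_ms_1440p = 10.0                         # assumed: 10 ms/frame at 1440p (100 fps)
pixel_ratio = (3840 * 2160) / (2560 * 1440)   # 2.25x more pixels at 4K

frame_ms_4k = frame_ms_1440p * pixel_ratio
print(f"Estimated 4K frame time: {frame_ms_4k:.1f} ms (~{1000 / frame_ms_4k:.0f} fps)")
# ~22.5 ms (~44 fps): the same GPU that holds 1440p100 misses 4K60
```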
 

Jigga117

Member
[face palm] :messenger_unamused: When are people going to learn that you cannot make a blanket statement like that in a vacuum? Performance (resolution and fps) is a function of workload. So guess what: your RTX 3090 can run nearly all games at native 4K 60fps... if YOU TURN THE SETTINGS DOWN!

What your statement is really saying is that you can't run every game at native 4K 60fps with "maximum" settings. Yeah, that's fine for the PC elitist crowd that would never imagine playing a game at anything lower than MAX settings. But then you compare to the consoles as if it were ever an aspiration for any sane developer to ship a console game running at maximum PC settings (a total waste of resources). In fact, most do not; they run at some mix of settings that best balances IQ and performance for the given hardware. Yeah, the proper way to develop a game.

So, moral of the story: it's entirely plausible that the game struggling at native 4K ~50fps and max settings on your 3090 could run at native 4K 60fps with an optimized mix of lower settings on PS5/XSX... and you probably wouldn't even notice the difference unless viewing side-by-side comparisons.
To add to this, it brings up the point about GPUs and next-gen systems. None of the games are made for them. Every new GPU is running back-compatible games; no game is optimized for the latest GPU. Same reason existing games don't perform as well on next-gen even with added touches: it comes down to optimization and tools. Nothing is ever going to be perfect. Lower your expectations and make settings changes to reach 4K@60.
 

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
4K is heading to 8K; with these kinds of graphics, PC gamers will definitely get bang for their buck.
 

ZywyPL

Banned
Of those marketed 36TF, about a third is used for integer computations, so realistically it's more like 22-24 (rough math below). As for the consoles, bear in mind they often run games at 30FPS exactly to prioritize resolution, so marketing them as 4K machines is absolutely valid. Not to mention the games run at medium-high settings instead of ultra, which barely (if at all) boosts the visuals any further. So all in all, 4K60 is easily doable, starting with something like a 2080S, but ultimately it's up to you whether you're chasing the setting bars in the options menu or the actual on-screen visuals/performance. Granted, some titles have absolutely terrible optimization and will bring any hardware out there to its knees while looking barely better than PS360 titles, but those kinds of titles shouldn't be used to prove anything.
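Taking that one-third-integer premise at face value (it's the poster's estimate, not a measured figure), the arithmetic is:

```python
# Rough effective-throughput estimate, assuming ~1/3 of the marketed
# shader throughput goes to integer work (the post's premise, not a
# measured number).
marketed_tf = 36.0
integer_fraction = 1 / 3

effective_tf = marketed_tf * (1 - integer_fraction)
print(f"Effective FP32 throughput: ~{effective_tf:.0f} TF")  # ~24 TF
```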



1440p was the next logical step but Microsoft and Sony wanted to slap "4K" on those boxes so badly....


So "logical" that a 1440p TV doesn't even exist... But hey, there's XSS that does target 1440p, and it looks like shit on 4K display.
 

Onironauta

Member
Talking about performance in terms of resolution and fps only makes sense in relation to what you're trying to render.
You could probably run a PS1-level game at 8K and 1000+ fps.
 
I'm happy enough with 120Hz, thank you very much. 144Hz sounds nice, but when you consider the difference between 60 and 120Hz, going from 120 to 144 is well within the realm of diminishing returns.

As for 240Hz, good luck running that at 1440p in modern games. Maybe in three years' time.
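The frame-time arithmetic backs this up:

```python
# Frame-time budget per refresh rate: the savings shrink fast.
prev = None
for hz in (60, 120, 144, 240):
    ms = 1000 / hz
    saved = f" ({prev - ms:.2f} ms saved)" if prev is not None else ""
    print(f"{hz:3d} Hz -> {ms:5.2f} ms per frame{saved}")
    prev = ms
# 60->120 saves 8.33 ms per frame; 120->144 saves only another 1.39 ms
```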

I've got a rig in the kids' room that gets 240fps locked on Fortnite at comp settings with a 1660 Ti.
 
I think that the whole resolution war is retarded and completely ignores that there are two other corners on the triangle, one of which actually matters to how the game feels to play.

[image: the resolution / graphics / framerate triangle]


Graphics...it's clearly graphics...this is clearly a joke
 

sncvsrtoip

Member
Even 100TF could be not enough ;) It all depends on game settings and visuals (and there's no such thing as too much power for a AAA production). For me, dynamic res, or even better something like DLSS, is the way to go (rough sketch of the idea below).
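For anyone curious, dynamic res is conceptually something like this: a minimal sketch chasing a 60fps budget, purely illustrative and not any engine's actual implementation:

```python
# Minimal dynamic-resolution sketch: nudge the render scale each
# frame toward a target frame time. Purely illustrative; real engines
# are far more sophisticated about this.
TARGET_MS = 16.7                 # 60 fps budget
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp, e.g. never drop below 60% of output res

def update_render_scale(scale: float, last_frame_ms: float) -> float:
    error = TARGET_MS / last_frame_ms   # >1 means headroom, <1 means over budget
    scale *= error ** 0.5               # damped adjustment to avoid oscillation
    return max(MIN_SCALE, min(MAX_SCALE, scale))

scale = 1.0
for frame_ms in (20.0, 19.0, 17.5, 16.0, 15.5):  # fake frame times
    scale = update_render_scale(scale, frame_ms)
    print(f"frame took {frame_ms:.1f} ms -> render scale {scale:.2f}")
```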
 

IntentionalPun

Ask me about my wife's perfect butthole
Lower a couple settings. PC games often have settings that are just completely unoptimized, or potentially not even feasible until some future hardware (independent of teraflops).
 

Outrunner

Member
Your 3090 does not have 36 teraflops.

I'm so sorry you fell for Nvidia marketing; you should know more, especially considering you browse core forums like NeoGAF.

Seeing how many people here own Xboxes, I'd say this is a casual forum.
 

Dream-Knife

Banned
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
PS3 claimed 1080p 120fps. It's just what they do, I guess.

In two years the PS5 Pro will claim 8K60 or something. Probably won't be any more powerful than a 6700 XT either.

Keep in mind you're playing on ultra. Consoles are running low or medium.
 
You can run anything at 4K60 on a 3080 with tweaked settings.

And the 3090 IS 36 TF, but clock speed, ROPs, bandwidth, etc. didn't increase with that massive boost in compute, so it's much, much less than a 1:1 improvement in performance per flop compared to the 2080 Ti.

Basically, stop using flops as a performance metric.
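For reference, the headline figure is just spec-sheet arithmetic: the standard peak-FP32 formula applied to Nvidia's published 3090 numbers.

```python
# Where the marketing TFLOPS figure comes from: shader ALUs x 2 ops
# per clock (one fused multiply-add) x boost clock, using Nvidia's
# published RTX 3090 specs.
cuda_cores = 10496
boost_clock_ghz = 1.70

tflops = cuda_cores * 2 * boost_clock_ghz / 1000
print(f"RTX 3090 theoretical FP32: ~{tflops:.1f} TFLOPS")  # ~35.7
```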
 

RaySoft

Member
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
That's because 36TF is a silly metric to use. It's almost like saying you have the best car engine, but your car still handles badly.
If you were using your 3090 to mine bitcoin, it would be another story.
 

SlimySnake

Flashless at the Golden Globes
Nvidia tflops are not what they used to be. The 3080 has 3x the shaders and tflops of the 2080 (200% more) and offers only 80% more performance. If the performance had scaled with tflops, your card would have easily been able to run everything at 4k 60.

My RTX 2080 can run lots of games at 4K 60.
You just have to settle for medium settings. I just choose to run games at 1440p and 120fps instead.
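To put numbers on that scaling gap (the figures from the post above, not independent benchmarks):

```python
# Performance rarely scales 1:1 with TFLOPS across architectures.
# Ratios below come from the post above (3080 vs 2080), not from
# independently measured benchmarks.
tflops_gain = 3.0   # ~3x the shaders/TFLOPS
perf_gain = 1.8     # ~80% more real-world performance

efficiency = perf_gain / tflops_gain
print(f"Real-world gain per marketed TFLOP: {efficiency:.0%}")  # 60%
```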
 
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
Hahahaa, this is a stupid post, cause there's no specific teraflops figure for any resolution and frame rate, not even for a supercomputer. How much resolution and frame rate you can display depends on the workload and what you're trying to render. Yes, the next-gen consoles are 4K60 machines for whatever games developers make on those platforms.

I see these silly posts every time:
Your RTX 3090 is only good enough to hit those targets depending on the game and what the developer wanted to achieve. You can easily make a game that'll kill and crash any system, and in three years your RTX 3090 will just be a 1080p or 720p machine, cause it won't render future games at ultra presets in 4K.
 

itsArtie

Member
36 Teraflops is more than enough. Devs simply don't want to waste time optimizing games for your card specifically when they can optimize for consoles, port to PC, optimize for 3-4 cards, and hope it works well on all the others.
 