
36 Teraflops is still not enough for 4K 60 FPS minimum

ZehDon

Member
Appeal to authority, so not an argument...
Lol, do you even understand what that term means? I’m not making an argument, I’m deferring to an expert opinion.
Do you think deferring to scientists on matters in their field of expertise is “appeal to authority”?
What an embarrassing series of posts.
 
Lol, do you even understand what that term means? I’m not making an argument, I’m deferring to an expert opinion.
Do you think deferring to scientists on matters in their field of expertise is “appeal to authority”?
What an embarrassing series of posts.
Lmao, so you won't even try to make a counterargument? I mean it would be futile since you literally can't. Have a good one.
 

sunnysideup

Banned
1080p30 at medium settings. So basically like a PS4.
It's a 7850, sorry. No fucking way.

This is typical PC fanboy bollocks. A 7850 is not playable today. It runs everything like dogshit. I have a GTX 1050 Ti in my laptop. It also runs everything like dogshit, and it is vastly more powerful than a PS4.

This is a 7850:

 
Last edited:
It's a 7850, sorry. No fucking way.

This is typical PC fanboy bollocks. A 7850 is not playable today. It runs everything like dogshit. I have a GTX 1050 Ti in my laptop. It also runs everything like dogshit, and it is vastly more powerful than a PS4.

This is a 7850:




No, it's typical consoletard horseshit and misleading. Why are you comparing an 8 GB PS4 with a 2 GB card from 2012? The card doesn't do well in certain games because it hits the VRAM limit. Use a 3 GB card from 2013 and it's perfectly comparable. Why are people still trying to push the magical console optimization in this day and age, when we know for a fact that it's just lower resolution, with massive drops everywhere and low details? That's the console "optimization".
 
Did you watch the same vid I watched? That is unplayable. It looks like dogshit, choppy beyond recognition.

Now I start to understand PC fanboys. You are all fucking bonkers.
This post is funny because the game runs pretty much like it runs on a PS4 (slightly worse because of the VRAM limitation). If you think it looks like choppy dogshit, you kinda dissed the PS4 :messenger_beaming:
 
Like I said. You are blind and bonkers.
Sure Jan GIF
 

sunnysideup

Banned
[Benchmark screenshots: xkays66.png, Hm0nUkJ.png, k3QVGjs.png]




Behold the power of console shit - low resolution, with dropped frames everywhere, and settings at medium, low, and lower than the lowest possible on PC. To call oneself a gamer and use plastic turds to game when computers exist, and to even cheerlead for them.
I am a PC gamer only. I have an rx6800/r5800. You lot are just wrong and delusional. Low-end PC gaming is fucking dogshit and in no way comparable to a console. It just isn't.

You need a vastly more powerful PC to get a comparable experience this late in the gen.
 
Last edited:

sunnysideup

Banned
I literally showed you how it's comparable. Just use your eyes.
No, you did not. Are you blind? The frame rate and frame timings are all over the place. It hitches and stutters and the settings look like shit.

To run a game at a solid 30 FPS on a PC you need at least 45-50 FPS unlocked. Otherwise it's going to run like dogshit.
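The rule of thumb above is really about frame-time headroom. A minimal sketch of the arithmetic, using my own illustrative numbers rather than any benchmark:

```python
# Rough frame-time arithmetic behind the "45-50 fps unlocked for a solid 30" rule of thumb.
# The numbers are illustrative, not measurements.
def frame_time_ms(fps: float) -> float:
    """Convert a frame rate into a per-frame time budget in milliseconds."""
    return 1000.0 / fps

target = frame_time_ms(30)        # 33.3 ms budget for a locked 30 fps
avg_unlocked = frame_time_ms(45)  # ~22.2 ms average if the game runs ~45 fps unlocked

# Headroom left to absorb frame-time spikes (streaming, heavy scenes, etc.)
headroom = target - avg_unlocked
print(f"30 fps budget: {target:.1f} ms, 45 fps average: {avg_unlocked:.1f} ms, "
      f"headroom: {headroom:.1f} ms")
```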
 
No, you did not. Are you blind? The frame rate and frame timings are all over the place. It hitches and stutters and the settings look like shit.

To run a game at a solid 30 FPS on a PC you need at least 45-50 FPS unlocked. Otherwise it's going to run like dogshit.


Yes, you need a million frames on PC to run at 30, since PC is not using some of that magical sauce that consoles use. You keep comparing a console with 8 GB of shared memory to a GPU with only 2 GB of VRAM, then proceed to make these amazing claims. Why don't you use a 3 or 4 GB GPU? Why do you insist on these bullshit claims of yours that are demonstrably false?
 
So, with the Xbox pushing 12 TFLOPs of "on paper" power, that would be around 24 TFLOPs of realised power. The RTX 3080 has 29 TFLOPs in comparable metrics. That should be more than enough grunt for reconstructed 4K as a general target, though I expect it to drop towards the end of the console generation.
Imagine still thinking the PS5/XSX is about equal to an RTX 3080, when you already have examples where next-gen consoles struggle to better an RTX 2060S in WD: Legion, Control, etc.
 

ZywyPL

Banned
60 fps is already becoming the standard. We won't see many games this gen that don't have a 60 fps mode.

For now, yes, since we're in the cross-gen era with PS4/XB1 titles running on much more powerful hardware. But add RT to the mix and the framerate goes back down to 30. And each week more and more of those PS4/XB1 games already start to struggle with 4K. As always, it's just a matter of about 3 years for the devs to start maximizing the consoles, and my personal guess is that 1440p@30+RT will then become the standard, so we will go back to the usual sluggish gameplay and washed-out image, but with shiny graphics. So here's hoping that AMD's DLSS counterpart shows up soon enough.


As for the 3090 TFLOP comparison, you're using pretty terrible comparisons. As John Carmack said:

[embedded John Carmack tweet]


After the consoles switched to PC architecture, and at the same time PCs started using low-level APIs, this just doesn't hold true anymore; if a PC matches the console's specs, it runs the games identically, if not better thanks to custom settings.
 

Whitecrow

Banned
For now, yes, since we're in the cross-gen era with PS4/XB1 titles running on much more powerful hardware. But add RT to the mix and the framerate goes back down to 30. And each week more and more of those PS4/XB1 games already start to struggle with 4K. As always, it's just a matter of about 3 years for the devs to start maximizing the consoles, and my personal guess is that 1440p@30+RT will then become the standard, so we will go back to the usual sluggish gameplay and washed-out image, but with shiny graphics. So here's hoping that AMD's DLSS counterpart shows up soon enough.




After the consoles switched to PC architecture, and at the same time PCs started using low-level APIs, this just doesn't hold true anymore; if a PC matches the console's specs, it runs the games identically, if not better thanks to custom settings.
Wrong.

PCs still have Windows, which is a general-purpose operating system with lots of services running in the background, and DirectX layers, which have to be totally hardware-agnostic and work with every single hardware combination.

A dedicated machine will always perform better. Having the same architecture doesn't matter if the software environment is different.
 

ZehDon

Member
Imagine still thinking the PS5/XSX is about equal to an RTX 3080, when you already have examples where next-gen consoles struggle to better an RTX 2060S in WD: Legion, Control, etc.
Imagine quoting a post and cropping out the part where the poster stipulated “this is a terrible comparison”.

TFLOPs - as in the metric from the thread title - doesn't really hold up as a raw metric anymore when discussing resolution. Games are simply doing too much, and reconstruction has come too far for it to matter. Perhaps my post wasn't clear.

Taking Carmack's experienced opinion, the XSX could punch at about double its on-paper specs. So, giving the XSX the best-case scenario - double its grunt - this still isn't enough to push native 4K, which was the OP's complaint. But it would be more than enough to generally target a reconstructed 4K.
Using real-world figures, we see it land in the realm of the 2070/2080 Supers, depending on the game. That kind of power should still be enough to target a reconstructed 4K, but I'd expect it to drop as the gen rolls on.
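For clarity, here's that doubling arithmetic as a quick sketch. It only re-uses figures already quoted in this thread and takes the "twice the perf" claim at face value; nothing here is a benchmark:

```python
# The arithmetic from the post above, using only figures already quoted in this thread.
XSX_PAPER_TFLOPS = 12.0        # Series X on-paper number used in the post
CARMACK_MULTIPLIER = 2.0       # "twice the perf of a PC" claim, taken at face value
RTX_3080_TFLOPS = 29.0         # figure quoted earlier in the thread for comparison

xsx_best_case = XSX_PAPER_TFLOPS * CARMACK_MULTIPLIER  # ~24 "realised" TFLOPs
print(f"XSX best case: {xsx_best_case:.0f} TFLOPs vs RTX 3080: {RTX_3080_TFLOPS:.0f} TFLOPs")
# Even with the generous 2x multiplier, the console lands below a 3080,
# hence "reconstructed 4K rather than native 4K" as the conclusion.
```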
 
Imagine quoting a post and cropping out the part where the poster stipulated “this is a terrible comparison”.

TFLOPs - as in the metric from the thread title - doesn't really hold up as a raw metric anymore when discussing resolution. Games are simply doing too much, and reconstruction has come too far for it to matter. Perhaps my post wasn't clear.

Taking Carmack's experienced opinion, the XSX could punch at about double its on-paper specs. So, giving the XSX the best-case scenario - double its grunt - this still isn't enough to push native 4K, which was the OP's complaint. But it would be more than enough to generally target a reconstructed 4K.
Using real-world figures, we see it land in the realm of the 2070/2080 Supers, depending on the game. That kind of power should still be enough to target a reconstructed 4K, but I'd expect it to drop as the gen rolls on.
OP didn't realize that Nvidia TFLOPs and AMD TFLOPs are different. 36 Nvidia TFLOPs are about 23 AMD TFLOPs. If you want double the power of the XSX, you need the 6900 XT. And yes, the 6900 XT is roughly twice as powerful as the XSX. Carmack was wrong.
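Put as numbers, here's a sketch of that conversion claim. The 36-to-23 ratio is the poster's figure, not an official one; the per-card paper TFLOPs are the commonly quoted spec-sheet numbers:

```python
# Sketch of the conversion described above. The 36 -> 23 ratio is the poster's figure;
# the spec-sheet TFLOPs are the usual published ones.
NVIDIA_PAPER_TFLOPS = 36.0      # roughly an RTX 3090's FP32 paper figure (the thread title's number)
AMD_EQUIVALENT_TFLOPS = 23.0    # what the poster says that translates to in AMD terms
conversion = AMD_EQUIVALENT_TFLOPS / NVIDIA_PAPER_TFLOPS   # ~0.64

XSX_TFLOPS = 12.15              # Series X spec-sheet figure
RX_6900XT_TFLOPS = 23.0         # 6900 XT spec-sheet figure, roughly
print(f"Conversion factor: {conversion:.2f}")
print(f"6900 XT vs XSX: {RX_6900XT_TFLOPS / XSX_TFLOPS:.1f}x")  # ~1.9x, i.e. "roughly twice"
```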
 

Razvedka

Banned
Appeal to authority, so not an argument. As I said, the tweet is verifiably false. Just look at any current multiplatform game. On consoles, the games run pretty much exactly as you'd expect them to run on similar PC hardware. Twice the performance my ass.
I really hate it when people trot out logical fallacy assertions in lieu of actually rendering a reasonable argument which states why a given source is wrong.

Decrying "appeal to authority" is perhaps the most egregious lately. Reminds me of this guy I know who would accuse anyone who disagreed with him by citing a source as a "credentialist" who "couldn't think for themselves".

Yes, hiding behind the words of others in a formally structured debate is bad form. However, this is not a formally structured debate and as a heuristic we do look to the thoughts of others associated with, or leads of, the field we're discussing. That doesn't render them automatically right, but on balance it means they're probably more right on the topic than a layman. I would not trust open cranial surgery to some random armchair surgeon on the internet with zero medical experience vs the guy with the MD.

Do Carmacks words hold true in 2021 as they did in 2014? I honestly don't know, that's an interesting conversation to have. But just dismissing one of the godfathers of modern video gaming is a very weak stance.
 
Last edited:

buenoblue

Member
I feel terrible for the OP; a 3090 is garbage for them. I mean, a 3090 and no VRR display? You're doing it wrong, dude. Just turn down some settings and enjoy your games.
 
Do Carmack's words hold as true in 2021 as they did in 2014? I honestly don't know; that's an interesting conversation to have. But just dismissing one of the godfathers of modern video gaming is a very weak stance.
He never asked for numbers. But RDR2, the graphically heaviest game that runs on last-gen consoles, runs almost as well on an equivalent PC (the VRAM bottleneck is a bitch). That disproves what Carmack said.
 

OverHeat

« generous god »
I feel terrible for the OP; a 3090 is garbage for them. I mean, a 3090 and no VRR display? You're doing it wrong, dude. Just turn down some settings and enjoy your games.
I got an LG CX and an AW3821DW G-Sync Ultimate display, so I'm set for VRR...
 
It's enough; it's just that developers can be lazy when it comes to optimizing games, and they either do it after release or not at all.

Also, real gamers play at 1440p.
 
Last edited:

skneogaf

Member
Crysis Remastered and Cyberpunk, just to name two, hit the high 50s sometimes

Both of those games have DLSS, so I'd use the quality setting and get back to playing the games.

Job done.


I mean, I still have a 2080 Ti, and with my G-Sync OLED I set games to 4K@60fps and pretty much maximum settings; 9/10 times I don't notice FPS issues.
 

Md Ray

Member
You mean like this?

k3QVGjs.png
On PC, using a CPU comparable to those Jaguar CPUs would produce even worse results than this in this section. Before upgrading to a 3700X, I had an i5-3330 (a CPU that was roughly 2x faster than the PS4 CPU due to its higher clock speed and vastly superior IPC), which would at times suffer from long pauses, hitches, and stutters in Saint Denis. Those kinds of stutters and long pauses were simply nowhere to be found on PS4. Sure, there were frame-rate dips as seen above, but I never experienced hard pauses on the PS4 version of this game. And I have put countless hours into both the console and PC versions.

If you had a comparable GPU to PS4 from 2013/14 like the 7850/7870, then you'd have stutters related to VRAM limitation (on top of having CPU-related stutters) forcing you to use lower quality textures whereas on PS4/XB1 it's equivalent to PC's Ultra textures.

Consoles have a very thin API layer and less driver and OS overhead, thus requiring less CPU grunt than a comparable PC. That's what John Carmack meant when he said "a console will deliver twice the perf of a PC" back in 2014. These days the tools and APIs on PC have indeed gotten better and devs are able to extract a lot of perf out of PC hardware, as they do on consoles, but to some extent consoles still perform more efficiently.
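For anyone wondering where the "roughly 2x faster" estimate above comes from, here's a back-of-envelope version. The core counts and clocks are the published specs; the IPC multiplier is my own assumption for illustration only:

```python
# Back-of-envelope version of the "roughly 2x faster" CPU comparison above.
# Core counts and clocks are spec-sheet values; the IPC advantage is an assumption.
ps4_cores, ps4_clock_ghz = 8, 1.6     # PS4 Jaguar
i5_cores, i5_clock_ghz = 4, 3.0       # i5-3330 (Ivy Bridge, base clock)
ivy_vs_jaguar_ipc = 2.0               # assumed per-clock advantage, for illustration

ps4_throughput = ps4_cores * ps4_clock_ghz
i5_throughput = i5_cores * i5_clock_ghz * ivy_vs_jaguar_ipc
print(f"i5-3330 vs PS4 CPU, multi-threaded: {i5_throughput / ps4_throughput:.1f}x")  # ~1.9x
```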
 
Last edited:

Liar

Banned
Most people will agree that 60 FPS is enough for smooth, pleasant gameplay, but how much higher than that do you need to go before there's no point in adding more FPS?

I mean the point where you simply can't notice a difference anymore.

IMO, it sometimes seems like for most AAA games, one needs to wait several years for the technology to develop enough to run them at max settings above 60 FPS.
 

sunnysideup

Banned
On PC, using a CPU comparable to those Jaguar CPUs would produce even worse results than this in this section. Before upgrading to a 3700X, I had an i5-3330 (a CPU that was roughly 2x faster than the PS4 CPU due to its higher clock speed and vastly superior IPC), which would at times suffer from long pauses, hitches, and stutters in Saint Denis. Those kinds of stutters and long pauses were simply nowhere to be found on PS4. Sure, there were frame-rate dips as seen above, but I never experienced hard pauses on the PS4 version of this game. And I have put countless hours into both the console and PC versions.

If you had a comparable GPU to PS4 from 2013/14 like the 7850/7870, then you'd have stutters related to VRAM limitation (on top of having CPU-related stutters) forcing you to use lower quality textures whereas on PS4/XB1 it's equivalent to PC's Ultra textures.

Consoles have a very thin API layer and less driver and OS overhead, thus requiring less CPU grunt than a comparable PC. That's what John Carmack meant when he said "a console will deliver twice the perf of a PC" back in 2014. These days the tools and APIs on PC have indeed gotten better and devs are able to extract a lot of perf out of PC hardware, as they do on consoles, but to some extent consoles still perform more efficiently.

Yep. I tried to run a bunch of games on my i5/GTX 1050 Ti (before buying my high-end PC). It is just not comparable to a console experience. When you go low-end on PC you run into so many issues that the whole experience gets unbearable.

While I think my rx6800/r5800 outshines the PS5 today for multiplatform titles, I do not think it will be as good of an experience 5 years from now. The PS5 will still be playable.
 

Nvzman

Member
Agreed, 4K is overrated, 1440p is better for stable framerates
1440p/1080p 144fps > 4K 60fps
As someone who's tried 4K 60fps, it's an absolute meme for PC gaming. High frame-rate gaming is wayyyyyy nicer; imo for PC games 1440p 144fps is the way to go. 4K 60fps should just be for sitting back on the couch playing console games tbh, or a movie (which is really weirdly hard on PC; many services refuse to stream in 4K on PC because of piracy concerns). I WOULD say productivity too, but in order for 4K text to even be readable and not microscopic on a PC monitor, you need Windows 10 DPI scaling at 150%, which is literally the same exact thing as 1440p at 100%.
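That scaling point at the end checks out arithmetically; a quick sketch:

```python
# The 150% scaling point as arithmetic: a 4K desktop at 150% DPI scaling gives the
# same effective workspace as a 1440p desktop at 100%.
def effective_resolution(width: int, height: int, scale: float) -> tuple[int, int]:
    """Logical desktop resolution after Windows DPI scaling."""
    return round(width / scale), round(height / scale)

print(effective_resolution(3840, 2160, 1.5))   # (2560, 1440) -> same workspace as native 1440p
print(effective_resolution(2560, 1440, 1.0))   # (2560, 1440)
```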
 

Zeroing

Banned
I think one console maker overpromised because their maniac fans wanted the most powerful console in the world/galaxy/universe... JOKING.

Now seriously, consoles are optimized for their fixed hardware, so developers can optimize their games to reach that performance. But it depends on the time they've got, etc.

Off topic: I need to update the drivers for my graphics card...
 
On PC, using a CPU comparable to those Jaguar CPUs would produce even worse results than this in this section. Before upgrading to a 3700X, I had an i5-3330 (a CPU that was roughly 2x faster than the PS4 CPU due to its higher clock speed and vastly superior IPC), which would at times suffer from long pauses, hitches, and stutters in Saint Denis. Those kinds of stutters and long pauses were simply nowhere to be found on PS4. Sure, there were frame-rate dips as seen above, but I never experienced hard pauses on the PS4 version of this game. And I have put countless hours into both the console and PC versions.

If you had a comparable GPU to PS4 from 2013/14 like the 7850/7870, then you'd have stutters related to VRAM limitation (on top of having CPU-related stutters) forcing you to use lower quality textures whereas on PS4/XB1 it's equivalent to PC's Ultra textures.

Consoles have a very thin API layer and less driver and OS overhead, thus requiring less CPU grunt than a comparable PC. That's what John Carmack meant when he said "a console will deliver twice the perf of a PC" back in 2014. These days the tools and APIs on PC have indeed gotten better and devs are able to extract a lot of perf out of PC hardware, as they do on consoles, but to some extent consoles still perform more efficiently.


John Carmack was saying that 7 years ago, in relation to the more exotic hardware that was in the 360 and PS3. It's certainly not happening now. I don't even know how this is even a thing anymore. If consoles have this "thin" layer of API, why are we getting almost like-for-like performance with similar computer hardware? Most of the time, we can perfectly match the performance of a game on console with its equivalent PC hardware. Yes, the PS4/PS5 API is a bit lower-level, but it's certainly not doing any wonders worth talking about.
 
Last edited:

Kenpachii

Member
Wrong.

PCs still have Windows, which is a general-purpose operating system with lots of services running in the background, and DirectX layers, which have to be totally hardware-agnostic and work with every single hardware combination.

A dedicated machine will always perform better. Having the same architecture doesn't matter if the software environment is different.

Funnily enough, consoles reserve more than PCs, so that logic doesn't fly.
 

Kenpachii

Member
Yep. I tried to run a bunch of games on my i5/GTX 1050 Ti (before buying my high-end PC). It is just not comparable to a console experience. When you go low-end on PC you run into so many issues that the whole experience gets unbearable.

While I think my rx6800/r5800 outshines the PS5 today for multiplatform titles, I do not think it will be as good of an experience 5 years from now. The PS5 will still be playable.

Dumb comparison.

The 6800 is more in line with what the 290 was to the PS4, and the 290 (aka the 970) still steamrolls every game to this day. So will the 6800: if games are built for the PS5, the 6800 will handle them.

Hell, even a 970 in this day and age still plays every single game without effort, and that's a 7-year-old GPU.

The 6800 with its 16 GB VRAM pool will age like fine wine for the entire PS5 generation and the start of the PS6 generation.
 
Last edited:

Kenpachii

Member

Windows only eats about 1-5% of a CPU core; the PS5 reserves an entire core. Windows 10 only requires 1 GB of dedicated memory; the PS5 reserves 2.

It's not hard to understand. Now, if a guy on PC wants to run 300 Chrome tabs and watch 8K movies on 4 different screens while playing Red Dead Redemption, and slams 64 GB of memory into his PC, that's on him; he can do that if he wants. However, saying PC has more overhead than consoles in 2021 is just straight up wrong, especially with low-level APIs that are pretty much in line with what consoles have.

Also, I highly doubt anybody still codes to the metal on those consoles even remotely; it's probably also layered with API layers for BC solutions and forward BC. Consoles are more PC these days than any console before, and their services eat a bit more performance any day of the week. That's also why the PS4 had so much VRAM to boot with.
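A rough way to put those reservation numbers side by side. The per-platform reservations are the ones claimed in the post above, not official documentation; the 8-core / 16 GB totals are both machines' headline specs, and the Windows box is a hypothetical 8-core / 16 GB PC:

```python
# The reservation claims from the post above, expressed as a share of total resources.
# Per-platform reservations are the poster's claims, not official figures.
def reserved_share(reserved: float, total: float) -> float:
    return reserved / total

ps5_cpu = reserved_share(1.0, 8.0)     # one full core reserved, per the post
ps5_ram = reserved_share(2.0, 16.0)    # 2 GB reserved, per the post

pc_cpu = reserved_share(0.05, 8.0)     # high end of the "1-5% of a core" claim
pc_ram = reserved_share(1.0, 16.0)     # ~1 GB on a hypothetical 16 GB Windows box

print(f"PS5 reserved: {ps5_cpu:.1%} CPU, {ps5_ram:.1%} RAM")
print(f"PC reserved:  {pc_cpu:.1%} CPU, {pc_ram:.1%} RAM")
```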
 
Last edited: