
36 Teraflops is still not enough for 4K 60 FPS minimum

Kenpachii

Member
Which resolution and quality settings? Because I have a 1080ti and it's not enough for recent games.

Two 3090s in SLI are not enough; performance is never enough. You can mod even the ugliest games in the world down to single digits without effort if you crank the settings up far enough. I'm talking here about PS4 settings vs a 970/290, the same way the guy is comparing a 6800 to a PS5. It's comparable.
 

Kenpachii

Member
2GB of RAM is the minimum Windows 10 uses, and it's often above that.

It's above that because that memory is available to use, the same way COD: Modern Warfare uses 10GB of VRAM at 1080p because it's faster RAM. Does it need 10GB of VRAM for 1080p? Nope, not even remotely. 1GB is the minimum required to run the OS at the end of the day, and when more memory is needed and isn't available, it will fall back to the hard drive and page the data out there.

Consoles have also evolved past the stage of coding to the metal, because people expect all their services to be there in the background whenever they start playing. Consoles have an even harder time on top of that because of the extra security measures they have to take and the hardware they have to reserve for future features. With PC, they just tell you "you need more memory" and the game gets shipped. This is what crippled the PS4 so hard on memory allocation.

That's also why a lot of games, even from Sony themselves, require more hardware when they hit PC: the PC version simply has more in it. Or, with Horizon specifically, the devs are just utter dog shit at porting.
 

A.Romero

Member
It's never enough, really, you could render 4k60fps on a PS3 if you really wanted.
Maybe render it, but I don't think it could output it due to HDMI bandwidth limitations.


I have a 2080 Super and try to play at 4K. Some games do well, others not so much. RTX without DLSS is just not viable.

It's weird, because I like playing at 60 FPS and high res, but PS4 games that run at a stable 1080/30, like Uncharted or FF7R, don't bother me at all. I guess it depends on the kind of game.

I don't have a high refresh rate screen, but I did have an Oculus Rift that's supposed to be 90Hz, and I don't remember noticing a huge difference.
 

Shane89

Member
Meaningless post.
Your PC games are not optimized. That's the reason.

Set up a PC with PS3-like specs (256MB VRAM + 256MB DDR3 RAM) and try to run something that looks like Killzone 2. You can barely reach the main menu.
 

OverHeat

« generous god »
Meaningless post.
Your PC games are not optimized. That's the reason.

Set up a PC with PS3-like specs (256MB VRAM + 256MB DDR3 RAM) and try to run something that looks like Killzone 2. You can barely reach the main menu.
Report the post if it’s meaningless
 
lol I actually agree. My monitor can do up to 165Hz, but anything between 120-165Hz is not really noticeable. 120Hz is good. My TV is 120Hz and playing games on it is a great experience, definitely way better than 60Hz.

240Hz is more for esports games like CS:GO, LoL, Overwatch, etc. Those games aren't that demanding, so it's easy to hit 240Hz in them.
I don't notice fps issues anymore unless it's under 45-50 fps, although the only game that plays in that range for me today is Cyberpunk with full RTX and DLSS on. I own an EVGA RTX 2060 and am waiting for my queue step-up to a 3060 Ti (come on EVGA, send that email already, it's been since December 3rd wtf...). Gaming at 1080p (as that is what my 27" monitor and TVs are) is not a big deal for me. It looks fine and can go up to 144Hz. Sure, 144Hz is a nice thing to see in Afterburner, but I really don't notice the difference.
G-Sync has removed frame tearing and judder, so all this becomes a moot point unless your framerate gets too low (like under 40fps). Most games run over 100fps; if they're from the last 2 years and on all ultra settings, then they sit between 60-100.
 
Heck I'm stuck playing 1080p on my 3090 due to CPU limitation till AMD stock builds up enough for me to get a 5900X or something. Currently on a 2700X.
 

VFXVeteran

Banned
You are confusing bandwidth with workload.
With enough workload, you can even fuck up 1080p60fps.
Sorry about that. I always consider workload as a separate, isolated thing and bandwidth as the workload done over time. I can see the confusion. It's like specifying a distance to travel: workload would be adding more miles to the distance, but without knowing the time it takes to cover that distance, you don't really have a good picture of how a vehicle performs.
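As a rough sketch of that analogy (the numbers below are made up purely for illustration, not measured GPU data):

```python
# Toy model: frame_time = per-frame workload / sustained throughput.
# The figures below are invented for illustration, not real GPU measurements.

def frame_rate(workload_gflop_per_frame: float, throughput_gflops: float) -> float:
    """Return frames per second for a given per-frame workload and sustained throughput."""
    frame_time_s = workload_gflop_per_frame / throughput_gflops
    return 1.0 / frame_time_s

# Same "speed" (throughput), different "distances" (workload):
print(frame_rate(200.0, 12_000.0))  # ~60 fps
print(frame_rate(400.0, 12_000.0))  # doubling the workload halves the frame rate (~30 fps)
```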

Anyways, could you post those games that your GPU is struggling with?
RDR2, FS2020, Marvel's Avengers, and Cyberpunk are all games that cannot hold 4K/60FPS no matter what workload they're given. FS2020 is clearly an incredible piece of software, so that one I can completely understand. I sort of can understand Cyberpunk too.
 

RoadHazard

Gold Member
You realise this benchmark will change constantly, right? There is no magical number of teraflops that will run "all games" at 4k60

This. It doesn't take much to run a PS3 game at 4K60, but it does take a lot to run a PS5 game at that resolution and frame rate. And once the PS6 rolls around a 3090 won't even be enough for 1080p60.

So asking "how many TF do I need for 4K60" makes no sense, there's so much more to game rendering than that.
 

VFXVeteran

Banned
This. It doesn't take much to run a PS3 game at 4K60, but it does take a lot to run a PS5 game at that resolution and frame rate. And once the PS6 rolls around a 3090 won't even be enough for 1080p60.
I seriously doubt that. It's very rare for a game to require more than the latest technology such that running it at the lowest feasible resolution possible doesn't yield a locked 60FPS. A 3090 will be good for all games for another 7yrs easily. A new generation of consoles after this one still will lag behind the actual GPUs coming from Nvidia/AMD. We often forget that it takes years to develop a console after the initial decision to pick parts for that console. If PS6 comes out in 7yrs, Sony has at most 1 more year from now to decide on a piece of hardware. They don't decide based on the latest graphics cards 6yrs from now and then make the console in year 7.

Since the Ampere cards (specifically the 3090) were such a huge step above the 20-series boards, I don't see a PS6 having that kind of silicon for $500. I can see a 3060Ti however. The 1080Ti was roughly the power equivalent of the PS5, whose hardware was picked while the 2080Ti had already been on store shelves for some time. Consoles are that far behind.
 

RoadHazard

Gold Member
I seriously doubt that. It's very rare for a game to require more than the latest technology such that running it at the lowest feasible resolution possible doesn't yield a locked 60FPS. A 3090 will be good for all games for another 7yrs easily. A new generation of consoles after this one still will lag behind the actual GPUs coming from Nvidia/AMD. We often forget that it takes years to develop a console after the initial decision to pick parts for that console. If PS6 comes out in 7yrs, Sony has at most 1 more year from now to decide on a piece of hardware. They don't decide based on the latest graphics cards 6yrs from now and then make the console in year 7.

Since the Ampere cards (specifically the 3090) were such a huge step above the 20-series boards, I don't see a PS6 having that kind of silicon for $500. I can see a 3060Ti however. The 1080Ti was roughly the power equivalent of the PS5, whose hardware was picked while the 2080Ti had already been on store shelves for some time. Consoles are that far behind.

That's not how it works. The GPUs in the PS5 and XSX weren't available in 2014. The 1080Ti was released in 2017, and is not as powerful as the PS5 GPU anyway.
 

VFXVeteran

Banned
That's not how it works. The GPUs in the PS5 and XSX weren't available in 2014.
Their designs were though.

In any case, even if we can't predict the timeline for when they will choose the next console's hardware, we know for a fact that the next console won't outperform its PC equivalent. Irrefutable evidence is already there. I claim that games will not be so complicated that a 3090 wouldn't be able to render them at 1080/60. And I also claim that a PS6 won't be more powerful than a 3090 when it comes out.
 

VFXVeteran

Banned
Source? I've never heard anyone claim this before. RDNA2 existed in 2014?
I'm speaking more of Nvidia. How soon do you think Sony will move on a PS6 architecture? It's 2021 now and the console GPUs are nearly 2 generations behind a 3090. What type of games do you think will choke a 3090 by the time the PS6 comes out (which means that a PS5 wouldn't even be able to run it) and what company will make such a game?
 

RoadHazard

Gold Member
I'm speaking more of Nvidia. How soon do you think Sony will move on a PS6 architecture? It's 2021 now and the console GPUs are nearly 2 generations behind a 3090. What type of games do you think will choke a 3090 by the time the PS6 comes out (which means that a PS5 wouldn't even be able to run it) and what company will make such a game?

What GPU from 2013-2014 can run PS5-level graphics at 1080p60?

And the console GPUs aren't two generations behind a 3090, at most one (less in certain ways).
 

VFXVeteran

Banned
2013-2014?
What GPU from 2013-2014 can run PS5-level graphics at 1080p60?

And the console GPUs aren't two generations behind a 3090, at most one (less in certain ways).
Any of the 980Tis could run a PS5 game at 1080/60FPS if you lower the workload of the GPU (i.e. lower the texture resolution, the LOD, the shadow map quality, the texture filtering, etc.). Cyberpunk's minimum spec calls for a 780. Can a PS5 run FS2020 right now even though a 3090 is too slow to render it at max settings? Sure. Can it run at 3090 settings? No.
 

RoadHazard

Gold Member
2013-2014?

Any of the 980Tis could run a PS5 game at 1080/60FPS if you lower the workload of the GPU (i.e. lower the texture resolution, the LOD, the shadow map quality, the texture filtering, etc.). Cyberpunk's minimum spec calls for a 780. Can a PS5 run FS2020 right now even though a 3090 is too slow to render it at max settings? Sure. Can it run at 3090 settings? No.

I was obviously talking about running the game at similar settings, not lowering the settings a full generation. And I still doubt it for games built specifically for this generation of consoles. Cross-gen games I don't consider to be proper PS5/XSX games.
 

VFXVeteran

Banned
I was obviously talking about running the game at similar settings, not lowering the settings a full generation. And I still doubt it for games built specifically for this generation of consoles. Cross-gen games I don't consider to be proper PS5/XSX games.
Yeah, but even still, consoles are not in lockstep with PC GPUs. I think that's the mixup. My claim is that a 3090 will outperform a PS6. We'll see. Nvidia is still ahead of AMD by a generation. AMD completely wasn't prepared for full-on RT cores with ML cores for DLSS. Their next iteration needs to address this if there's to be any hope of a PS6 running games like Cyberpunk right now with all the rendering options that still bring a 3090 to its knees.

So I agree that *some* games will make the 3090 struggle with 1080/60 seven years from now, but if the 3090 struggles, the PS6 will also struggle, since the PS5/XSX are already behind the 3090 by nearly two generations.
 

Zathalus

Member
I seriously doubt that. It's very rare for a game to require more than the latest technology such that running it at the lowest feasible resolution possible doesn't yield a locked 60FPS. A 3090 will be good for all games for another 7yrs easily. A new generation of consoles after this one still will lag behind the actual GPUs coming from Nvidia/AMD. We often forget that it takes years to develop a console after the initial decision to pick parts for that console. If PS6 comes out in 7yrs, Sony has at most 1 more year from now to decide on a piece of hardware. They don't decide based on the latest graphics cards 6yrs from now and then make the console in year 7.

Since the Ampere cards (specifically the 3090) were such a huge step above the 20-series boards, I don't see a PS6 having that kind of silicon for $500. I can see a 3060Ti however. The 1080Ti was roughly the power equivalent of the PS5, whose hardware was picked while the 2080Ti had already been on store shelves for some time. Consoles are that far behind.
You are seriously underestimating the advancement of GPUs over a 7 year time frame. When the PS4 launched, the top-end GPUs were the GeForce GTX 780 Ti and the Radeon R9 290X, both around 2.5 times as fast as the PS4's rough equivalent, the Radeon 7850. The R9 290X especially was 5.6 TFLOPs, roughly a 3x increase in raw numbers over the PS4 GPU. Both metrics are larger than the difference between the XSX GPU and the 3090/6900XT in normal rasterization performance. The XSX GPU is roughly equal to a 2080, and the 3090 is only 70% faster than that GPU. The 3060 Ti is also less than 10% more powerful than the GPU in the XSX. That won't even cut it for a mid-gen refresh. Games that are going to stress the consoles in 5 or so years' time (aka 1440p/30FPS) will ensure the 3090 struggles with 1440p/60FPS on those titles, especially with the higher settings that PCs run at. I don't think the 3090 will ever need to drop down to 1080p this generation, however.
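Just to put rough numbers on that comparison (the PS4 and R9 290X TFLOP figures are the commonly cited specs; the XSX-to-3090 gap uses the ~70% figure above rather than raw Ampere TFLOPs, which aren't directly comparable):

```python
# Launch-window gap last gen vs. this gen, using the figures cited above.
ps4_tflops = 1.84        # PS4 GPU (commonly cited spec)
r9_290x_tflops = 5.6     # Radeon R9 290X, top AMD card around the PS4 launch

last_gen_gap = r9_290x_tflops / ps4_tflops
print(f"PS4 launch window: top PC GPU was ~{last_gen_gap:.1f}x the console")  # ~3.0x

# This gen, per the post: XSX is roughly a 2080 in practice, and a 3090 is ~70% faster
# than that, so the launch-window gap is roughly 1.7x in rasterization.
this_gen_gap = 1.7
print(f"XSX launch window: top PC GPU is ~{this_gen_gap:.1f}x the console")
```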

Assuming the next generation consoles are on 3nm, they are going to be quite a bit faster than the 3090 can ever hope to be. Even if RDNA 2 currently lags behind in RT performance, the AMD of today is not the AMD of 7-9 years ago. The Radeon division is not going to be starved of money and hemorrhage talent like it has since then. Over the next 7 years AMD will be putting far more money and resources into GPU research than it has previously, now that they are no longer on the point of bankruptcy (and are frankly dominating the CPU segment on all fronts). MCM GPUs are also going to directly address the issue with monolithic GPU dies that AMD and Nvidia are currently facing. Radeon is playing catch-up with DLSS and RT performance, but DLSS is a software solution that does not require Tensor cores to run (although it helps), and the latter is obviously going to improve dramatically over the next 7 years. They managed to catch Nvidia in normal rasterization, and with much better performance per watt to boot.

I can easily see the next Xbox and PS6 being 4-5 times as powerful as the current consoles at a minimum. For the PS6 especially, Sony can just double the current GPU size and increase clock speeds by another 50% (doable on 3nm; current RDNA 2 7nm cards reach 2.6GHz). That, combined with IPC increases, will mean a 4x increase over the current PS5 GPU. That would obliterate the 3090 in rasterization. Bandwidth can certainly scale up as well, with the advancements being made in GDDR and HBM.
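Back-of-the-envelope version of that projection, using only the assumptions above (2x the CU count, +50% clocks, plus some architectural gain; the 10.28 TFLOPs PS5 baseline is the published spec):

```python
# Hypothetical PS6 scaling per the assumptions above; not a prediction, just the arithmetic.
ps5_tflops = 10.28       # published PS5 GPU spec

cu_scale = 2.0           # assumption: double the GPU size (CU count)
clock_scale = 1.5        # assumption: +50% clocks on a future node
ipc_scale = 1.33         # assumption: architectural/IPC gains on top of raw throughput

raw_tflops = ps5_tflops * cu_scale * clock_scale            # ~31 TFLOPs
relative_performance = cu_scale * clock_scale * ipc_scale   # ~4x the PS5

print(f"~{raw_tflops:.0f} TFLOPs raw, ~{relative_performance:.1f}x PS5 performance")
```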
 

RoadHazard

Gold Member
Yeah, but even still, consoles are not in lockstep with PC GPUs. I think that's the mixup. My claim is that a 3090 will outperform a PS6. We'll see. Nvidia is still ahead of AMD by a generation. AMD completely wasn't prepared for full-on RT cores with ML cores for DLSS. Their next iteration needs to address this if there's to be any hope of a PS6 running games like Cyberpunk right now with all the rendering options that still bring a 3090 to its knees.

So I agree that *some* games will make the 3090 struggle with 1080/60 seven years from now, but if the 3090 struggles, the PS6 will also struggle, since the PS5/XSX are already behind the 3090 by nearly two generations.

A 3090 will not outperform a PS6 released 6-7 years from now. Absolutely not.
 

VFXVeteran

Banned
You are seriously underestimating the advancement of GPUs over a 7 year time frame. When the PS4 launched, the top-end GPUs were the GeForce GTX 780 Ti and the Radeon R9 290X, both around 2.5 times as fast as the PS4's rough equivalent, the Radeon 7850. The R9 290X especially was 5.6 TFLOPs, roughly a 3x increase in raw numbers over the PS4 GPU. Both metrics are larger than the difference between the XSX GPU and the 3090/6900XT in normal rasterization performance. The XSX GPU is roughly equal to a 2080, and the 3090 is only 70% faster than that GPU. The 3060 Ti is also less than 10% more powerful than the GPU in the XSX. That won't even cut it for a mid-gen refresh. Games that are going to stress the consoles in 5 or so years' time (aka 1440p/30FPS) will ensure the 3090 struggles with 1440p/60FPS on those titles, especially with the higher settings that PCs run at. I don't think the 3090 will ever need to drop down to 1080p this generation, however.

We can eat up bandwidth very easily on the consoles now. They are completely underpowered compared to the generation of Ampere cards. There are 3 main reasons for this as we all know. 1) RT 2) No DLSS and 3) not enough bandwidth to support 4k/60 with the max rendering features that gaming companies can design.

The consoles will go another generation having lower rendering settings, lower FPS, and lower resolution than the top-tier GPUs. A mid-gen refresh (if there is one) won't be so radically different from the PS5/XSX that it will put them on Ampere's footing. A mid-gen refresh was only implemented last gen because the original PS4/XB1 were so underpowered that MS and Sony realized they went too conservative on their designs. Even then, their overall design didn't change much. They still used the same GPU and CPU. Both of those companies are limited in what they can have in hardware because AMD is behind. That's just the way it is right now.

Assuming the next generation consoles are on 3nm, they are going to be quite a bit faster then the 3090 can ever hope to be.
We'll see.

One thing I feel you guys keep miscalculating is that you aren't taking cost into account. You simply can't get the kind of hardware the 3090 has in it, plus a fast SSD, more VRAM, a Blu-ray drive, etc., for literally 25% of the price. It's complete wishful thinking. This type of assumption ruled the console tech thread for literally two years, and pretty much everyone was way too high on their expectations. My sources were right, and there is no reason to think they'll be wrong the next go around.
 

VFXVeteran

Banned
A 3090 will not outperform a PS6 released 6-7 years from now. Absolutely not.
If Nvidia is five card generations further along by then, then yeah, you'll be correct. But I somehow doubt they will follow their normal two-year generational releases this time. I also don't think the gap between generations will be as large as it was between the 20-series and 30-series cards. Life doesn't always follow the exact circumstances of the past. If it did, we'd all be millionaires.
 
A mid-gen refresh (if there is one) won't be so radically different from the PS5/XSX that it will put them on Ampere's footing. A mid-gen refresh was only implemented last gen because the original PS4/XB1 were so underpowered that MS and Sony realized they went too conservative on their designs. Even then, their overall design didn't change much. They still used the same GPU and CPU. Both of those companies are limited in what they can have in hardware because AMD is behind. That's just the way it is right now.
Well, the X1X was a pretty big design change in that it got rid of DDR3 and ESRAM for lots of fast GDDR5.

Are you saying a PS5 Pro won't reach Ampere because of no Tensor and RT cores? Because even the PS4 Pro was ahead of the 7970, and that card was better than Nvidia's stuff at the time. AMD has pretty much closed the gap in rasterization performance, or is even faster in games that favor higher clocks over compute.

I would be pretty shocked if they bothered with a PS5 Pro that couldn't even beat the top cards of today.
 

kikii

Member
My 3090 is still not enough for 4K 60 in all games... what are the console makers thinking, trying to advertise native 4K? 1080p or 1440p should be standard on those machines, with better detail. What do you guys think?
this happens when u have an i3 CPU and 8GB of DDR3 :p
 

Zathalus

Member
We'll see.

One thing I feel you guys keep miscalculating is that you aren't taking cost into account. You simply can't get the kind of hardware the 3090 has in it, plus a fast SSD, more VRAM, a Blu-ray drive, etc., for literally 25% of the price. It's complete wishful thinking. This type of assumption ruled the console tech thread for literally two years, and pretty much everyone was way too high on their expectations. My sources were right, and there is no reason to think they'll be wrong the next go around.
You could have made this exact same argument 7 years ago, just using the original GTX Titan instead of the 3090. How could a console possibly have a GPU more than twice as powerful as my $999 GPU? Or a CPU more powerful than anything on the market? Or a state-of-the-art 1TB SSD?

Funnily enough, the GPU in the XSX is more powerful than any Titan-class GPU that came before the Titan V. It basically matches the Titan Xp, a $1,199 card that launched 4 years ago. It even has slightly more bandwidth as well.

It's puzzling that you think the leap for the next-gen consoles would not even amount to twice the GPU power, especially with the advantages of being on 3nm and the likely inclusion of MCM. GPU advances do not appear to be slowing down.
 

VFXVeteran

Banned
Well, the X1X was a pretty big design change in that it got rid of DDR3 and ESRAM for lots of fast GDDR5.

Are you saying a PS5 Pro won't reach Ampere because of no Tensor and RT cores? Because even the PS4 Pro was ahead of the 7970, and that card was better than Nvidia's stuff at the time. AMD has pretty much closed the gap in rasterization performance, or is even faster in games that favor higher clocks over compute.

I would be pretty shocked if they bothered with a PS5 Pro that couldn't even beat the top cards of today.
I mean, really think about that. The PS5/XSX still run multiplatform games at much lower settings than Ampere. What would they change if the GPU stays the same (the CPU is fine and doesn't need changing)? And more importantly, how do they put it out costing just $100 more than the PS5? I don't even think there will be a mid-gen refresh in two years. The PS5/XSX are very good starting points for this gen, unlike the PS4 back then. There really is no need or practicality in adding a higher-clocked GPU. I don't see it happening, especially with how ridiculous the shortages are now and probably will be this entire year.
 

VFXVeteran

Banned
You could have made this exact same argument 7 years ago, just using the original GTX Titan instead of the 3090. How could a console possibly have a GPU more than twice as powerful as my $999 GPU? Or a CPU more powerful than anything on the market? Or a state-of-the-art 1TB SSD?
No, because the Titan and the 3090 are radically different. This generation, the Nvidia cards took two steps up from the 20-series cards. It was a monumental jump. They aren't going to have another monumental jump next year in 2022. That's why I said that nothing stays the same. Everything changes... even the rate, power, and timing of these releases.

It's puzzling that you think the leap for the next-gen consoles would not even amount to twice the GPU power, especially with the advantages of being on 3nm and the likely inclusion of MCM. GPU advances do not appear to be slowing down.
You keep looking at numbers. I'm looking at game performance - which is what developers judge. The games out now (including exclusives) should tell you that consoles are struggling to maintain 4k FPS and advanced features that are pretty easy for Ampere cards to render. RT is completely a wash with the consoles without DLSS (and I mean hardware - you can't just add a software solution and think it's going to be free.)
 
I mean, really think about that. The PS5/XSX still run multiplatform games at much lower settings than Ampere. What would they change if the GPU stays the same (the CPU is fine and doesn't need changing)? And more importantly, how do they put it out costing just $100 more than the PS5? I don't even think there will be a mid-gen refresh in two years. The PS5/XSX are very good starting points for this gen, unlike the PS4 back then. There really is no need or practicality in adding a higher-clocked GPU. I don't see it happening, especially with how ridiculous the shortages are now and probably will be this entire year.
I'm a bit lost man, tbh. If the PS4 Pro came out three years after the PS4 launch, why would a PS5 Pro come out two years after the PS5 launch? Perhaps I jumped into another convo and we're off track, lol. I agree that the PS5 is much more future-proof than the PS4 was.

Presumably, a PS5 Pro would come no sooner than holiday 2023, probably 2024 looking at the shortages. By then, they can absolutely have a better GPU than Ampere. Even if it's not 36TF, I'm pretty sure clocks, ROPs and other aspects of the chip would surpass peak Ampere. The only wild card I can see is Tensor + RT cores potentially not being in a PS5 Pro.
 

Hezekiah

Banned
1440p is barely an upgrade from 1080p imo. On a TV it does not matter though. 1080p with great TAA/DLSS can look just as good. On a monitor I will argue that 4K is beneficial.
Nearly double the number of pixels is 'barely an upgrade'. Sure, OK.
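For reference, the pixel math behind "nearly double":

```python
# Pixel counts: 1440p vs 1080p.
pixels_1080p = 1920 * 1080   # 2,073,600
pixels_1440p = 2560 * 1440   # 3,686,400

print(pixels_1440p / pixels_1080p)  # ~1.78x, i.e. close to double
```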
 

yurinka

Member
The 3090 has more teraflops than users. These GPUs have a market too small for it to be profitable for devs to invest a lot of work optimizing for them. They optimize more for their main market: consoles, particularly the PS4 and, for the near future, the PS5. If you play on PC or on another console that's pretty similar, like the Xboxes, the games will be properly optimized. If your hardware is too distant from them, good luck.
 

Razvedka

Banned
We can eat up bandwidth very easily on the consoles now. They are completely underpowered compared to the generation of Ampere cards. There are 3 main reasons for this as we all know. 1) RT 2) No DLSS and 3) not enough bandwidth to support 4k/60 with the max rendering features that gaming companies can design.

The consoles will go another generation having lower rendering settings, lower FPS, and lower resolution than the top-tier GPUs. A mid-gen refresh (if there is one) won't be so radically different from the PS5/XSX that it will put them on Ampere's footing. A mid-gen refresh was only implemented last gen because the original PS4/XB1 were so underpowered that MS and Sony realized they went too conservative on their designs. Even then, their overall design didn't change much. They still used the same GPU and CPU. Both of those companies are limited in what they can have in hardware because AMD is behind. That's just the way it is right now.


We'll see.

One thing I feel you guys keep miscalculating is that you aren't taking cost into account. You simply can't get the kind of hardware the 3090 has in it, plus a fast SSD, more VRAM, a Blu-ray drive, etc., for literally 25% of the price. It's complete wishful thinking. This type of assumption ruled the console tech thread for literally two years, and pretty much everyone was way too high on their expectations. My sources were right, and there is no reason to think they'll be wrong the next go around.
Then how can you explain the current consoles? I mean, the PS5 has faster I/O than pretty much anything on PC to date; it's taken a while for PCs to catch up to that kind of raw throughput. It's already been noted that the XSX GPU smashes flagship cards that came out just 4 years ago. Both have competent Ryzen CPUs and dedicated silicon to offload other tasks.

I don't understand how you can make this statement given ample historical precedent. I mean we're literally seeing your argument dissolve in real time just by looking at the current machines that just released and gazing back not even half a decade.

And you can be sure that MS and Sony are getting sweetheart deals on the AMD silicon in their machines because of all the R&D $$$ they've been throwing at AMD and the collaboration. The costs of the PS5 and XSX are lower than they'd otherwise be thanks to that alone.

RDNA3 rumors are also pretty spicy, iirc. By the time the PS6 and whatever MS calls their next box hit, we'll be on a successor architecture beyond that, at the least.
 

VFXVeteran

Banned
I'll leave this thread to you guys to speculate. I hate getting into conversations about speculation because they never turn out how people wish they would. All I can say is look at the games and how the consoles are running them now. It's not pretty by a long shot. Yes, the SSD tech is something that PCs haven't gotten yet, but that's not the same as the GPU/CPU combo. That will always be where the buck stops. If the consoles were so up-to-date right now, they wouldn't be struggling with native 4K renders with graphics features like anisotropic filtering, higher HDAO, enhanced textures, and LOD geometry.

I'll just sit back and watch the news *if* a mid-gen refresh ever even comes to fruition, and then we can discuss at that time why it isn't as powerful as a $1,500 GPU.
 

Zathalus

Member
No, because the Titan and the 3090 are radically different. This generation, the Nvidia cards took two steps up from the 20-series cards. It was a monumental jump. They aren't going to have another monumental jump next year in 2022. That's why I said that nothing stays the same. Everything changes... even the rate, power, and timing of these releases.


You keep looking at numbers. I'm looking at game performance - which is what developers judge. The games out now (including exclusives) should tell you that consoles are struggling to maintain 4k FPS and advanced features that are pretty easy for Ampere cards to render. RT is completely a wash with the consoles without DLSS (and I mean hardware - you can't just add a software solution and think it's going to be free.)
The Turing-to-Ampere leap is nothing out of the ordinary; Maxwell-to-Pascal offered a greater performance uplift. So did RDNA-to-RDNA2, for that matter (not in IPC, but in the impressive 55% performance-per-watt increase).

I see no reason why Nvidia will not release another card next year that will once again be 30%-40% faster than their current highest-tier offering. AMD certainly will, with the jump to 5nm that they will be making.

There is no trend pointing towards GPUs making smaller leaps in performance over the next 7 years.

One final note: DLSS does not need dedicated hardware to run. DLSS 1.9 ran just fine on shaders, and that was without the INT8/INT4 capabilities of RDNA2.
 

tvdaXD

Member
4K is overrated. There's a reason many digital intermediates in movies are 2K: past 2K, you won't see a difference anymore at full size on a monitor at a normal viewing distance. It's pure marketing bullshit to sell you a bigger and more expensive TV and/or console. The focus should always be framerate.
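A rough sanity check of that claim (the screen size, viewing distance, and the ~60 pixels-per-degree acuity figure below are assumptions for illustration, not anything from this thread):

```python
import math

def pixels_per_degree(horizontal_pixels: int, screen_diag_in: float, distance_m: float) -> float:
    """Angular pixel density for a 16:9 screen viewed head-on."""
    width_m = screen_diag_in * 0.0254 * 16 / math.hypot(16, 9)   # physical screen width
    horizontal_fov_deg = 2 * math.degrees(math.atan(width_m / 2 / distance_m))
    return horizontal_pixels / horizontal_fov_deg

# Assumed setup: 65" TV viewed from 2.5 m. ~60 px/deg is the usual 20/20 acuity ballpark,
# so density beyond that is hard to distinguish at this distance.
for name, h in [("1080p", 1920), ("1440p", 2560), ("4K", 3840)]:
    print(f"{name}: {pixels_per_degree(h, 65, 2.5):.0f} px/deg")
```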
 

acm2000

Member
The 3090 isn't really designed for gaming.

PC games are designed to be scaled up for new hardware.

4K/60 is easily doable, and 4K/120 is easily doable; you're not meant to just smash everything to max and expect it to work.
 

tvdaXD

Member
I’m playing Valhalla on PS5 and I disagree.
You totally missed the point though. Don't get me wrong, I love playing Spider-Man on the PS5, and the PS5 version especially looks amazing. But you won't see any difference between a 2K and a 4K picture unless you walk up to the screen.
So again, they should not push resolution beyond the point where it's literally pointless; they should use that GPU horsepower for framerate instead.
 