
36 Teraflops is still not enough for 4K 60 FPS minimum

ZywyPL

Banned
Just turn off AA, you don't need it at 4K

Honestly, seeing how all post-processing AA techniques blur the image (some more, some less) and some create weird artifacts like ghosting, I'm keen not to use any AA at all. There is some visible shimmering here and there, especially on heavy foliage, but the overall picture is so damn sharp; this is EXACTLY what I bought my 4K TV for. Granted, AA nowadays barely makes a dent in performance, you lose like 2-3 FPS at the very worst.
 

Razvedka

Banned
I'll leave this thread to you guys to speculate. I hate getting into conversations about speculation because they never turn out how people wish they would. All I can say is look at the games and how the consoles are running them now; it's not pretty by a long shot. Yes, the SSD tech is something PCs haven't gotten yet, but that's not the same as the GPU/CPU combo, and those will always be where the buck stops. If the consoles were so up to date right now, they wouldn't be struggling with native 4K rendering and graphics features like anisotropic filtering, higher HDAO, enhanced textures, and LOD geometry.

I'll just sit back and watch the news *if* a mid-gen refresh ever comes to fruition, and then we can discuss why they aren't as powerful as a $1,500 GPU.
A $1,500 GPU from what year? What you're saying doesn't make any sense. If you take a game from 5 years ago and do a native port of it to these machines (not running in some back-compat mode), it will have maxed-out settings.

The XSX could assuredly play a title from 2015 with settings cranked to maximum, or otherwise be superior to the best GPU available at that time. I'd bet the same for the PS5.

The feature sets of the current consoles are pretty cutting edge. That doesn't mean they have the raw grunt of today's flagship GPUs, but it does mean they can face-punch much older GPUs, regardless of their launch cost.

And certain settings on PC gobble up resources as you crank them up. With hardware like the consoles, devs are aiming for the best visuals at the most stable performance; this isn't PC land. Games that shipped a year ago and can have their settings cranked out the wazoo on PC might not necessarily get that same treatment on the next-gen consoles that just released, though some might. I think one good example of this actually happening is Cold War, isn't it?

I need to go back and reread your argument, because my current understanding of it is "in 2028 the consoles will not have a GPU more powerful/sophisticated than the 3090 that just launched."
 

Damigos

Member
Honestly, seeing how all post-processing AA techniques blur the image (some more, some less) and some create weird artifacts like ghosting, I'm keen not to use any AA at all. There is some visible shimmering here and there, especially on heavy foliage, but the overall picture is so damn sharp; this is EXACTLY what I bought my 4K TV for. Granted, AA nowadays barely makes a dent in performance, you lose like 2-3 FPS at the very worst.
In WoW I get 4K at 60 FPS without AA and 4K at 40 FPS with 8x AA, on a 980 Ti.
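
For context, that difference is bigger than it looks in FPS terms. Here's a quick frame-time sketch in Python (treating the "8x AA" setting as 8x MSAA is an assumption; the FPS figures are the ones above):

Code:
# Convert the FPS figures above into milliseconds per frame, since FPS
# deltas understate the real cost at lower framerates.
def frame_ms(fps):
    return 1000.0 / fps

no_aa_fps, aa8x_fps = 60, 40
print(f"No AA: {frame_ms(no_aa_fps):.1f} ms/frame")   # 16.7 ms
print(f"8x AA: {frame_ms(aa8x_fps):.1f} ms/frame")    # 25.0 ms
print(f"AA cost: {frame_ms(aa8x_fps) - frame_ms(no_aa_fps):.1f} ms/frame")  # 8.3 ms

So that 8x AA is costing over 8 ms per frame here, a lot more than the "2-3 FPS" figure quoted above suggests.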
 

Armorian

Banned
Just turn off AA, you don't need it at 4K

What? Modern games look like shit without TAA/DLSS; there's lots of shader aliasing even at 3840x2160 (and probably beyond).

The only ways to somewhat fix it are TAA (which barely uses any resources, so turning it off won't gain you... anything), though it has a lot of issues (ghosting); DLSS (which has its own issues and isn't in a lot of games right now); or 4x DSR/downsampling with FXAA/SMAA, which produces the best results but of course needs the most resources. With a 2560x1080 monitor I can downsample a lot of games from 5120x2160 on a 3070, and the results are amazing.

MSAA has been dead for years...
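
For reference, here's the pixel math behind that 4x DSR setup as a quick sketch in Python:

Code:
# 4x DSR renders at 2x the width and 2x the height of the desktop
# resolution (4x the pixels), then filters the result back down to native.
native_w, native_h = 2560, 1080  # the ultrawide monitor mentioned above
scale = 2                        # 4x DSR = 2x per axis

render_w, render_h = native_w * scale, native_h * scale
print(f"Render resolution: {render_w}x{render_h}")      # 5120x2160
pixel_ratio = (render_w * render_h) / (native_w * native_h)
print(f"Pixels shaded vs native: {pixel_ratio:.0f}x")   # 4x

That 4x pixel load is why downsampling gives the best image but eats the most performance.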
 

Bo_Hazem

Banned
What? Modern games look like shit without TAA/DLSS; there's lots of shader aliasing even at 3840x2160 (and probably beyond).

The only ways to somewhat fix it are TAA (which barely uses any resources, so turning it off won't gain you... anything), though it has a lot of issues (ghosting); DLSS (which has its own issues and isn't in a lot of games right now); or 4x DSR/downsampling with FXAA/SMAA, which produces the best results but of course needs the most resources. With a 2560x1080 monitor I can downsample a lot of games from 5120x2160 on a 3070, and the results are amazing.

MSAA has been dead for years...

What was interesting is Insomniac using temporal injection for anti-aliasing in Spider-Man: Miles Morales; the final image looks stunning in native 4K. If it's patented by them it could be licensed, I think, and it should make the final image better. Also, the Performance RT mode looks pretty good vs the native 4K mode.
 

Armorian

Banned
What was interesting is Insomniac using temporal injection for anti-aliasing in Spider-Man: Miles Morales; the final image looks stunning in native 4K. If it's patented by them it could be licensed, I think, and it should make the final image better. Also, the Performance RT mode looks pretty good vs the native 4K mode.

This, UE4's temporal upsampling, and what AMD is planning to do are all interesting, but so far DLSS offers the best reconstruction:

5120x2160 (4x DSR) + SMAA: ~35 FPS
[screenshot: y4Ji5PXA_o.png]

2560x1080 + TAA: ~90 FPS
[screenshot: 8lYDtyGB_o.png]

5120x2160 (4x DSR) + DLSS (B): ~56 FPS
[screenshot: SqjWwsFo_o.png]

5120x2160 (4x DSR) + TAA: ~44 FPS
[screenshot: At1YU348_o.png]
 

Bo_Hazem

Banned
This, UE4's temporal upsampling, and what AMD is planning to do are all interesting, but so far DLSS offers the best reconstruction:

5120x2160 (4x DSR) + SMAA: ~35 FPS
[screenshot: y4Ji5PXA_o.png]

2560x1080 + TAA: ~90 FPS
[screenshot: 8lYDtyGB_o.png]

5120x2160 (4x DSR) + DLSS (B): ~56 FPS
[screenshot: SqjWwsFo_o.png]

5120x2160 (4x DSR) + TAA: ~44 FPS
[screenshot: At1YU348_o.png]

Yup, but DLSS's problems only appear in motion; standing still it looks stunning. You can see blurriness and artifacts in motion, on big screens at least. Still a great achievement.
 

yamaci17

Member
36 TFLOPS is fake.

It's more like 23 TFLOPS.

The math for comparing Ampere against RDNA2 and Turing is simple:

halve the TFLOPS, then multiply by 1.25-1.3x.

Ampere TFLOPS are an arbitrary theoretical maximum that will never be achieved unless Nvidia does some special driver magic or something.

RTX 3070 at 20 TFLOPS = RTX 2080 Ti at 13 TFLOPS:
20 / 2 = 10
10 * 1.3 = 13

RTX 3080 at 30 TFLOPS = RX 6800 XT at 20 TFLOPS (considering they have near-identical performance):
30 / 2 = 15
15 * 1.3 = 19.5
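
The same rule of thumb as a quick Python sketch (the 1.3x factor is the estimate above, not an official number):

Code:
# Halve Ampere's dual-issue FP32 TFLOPS, then scale by ~1.3 to get a
# rough Turing/RDNA2-equivalent figure.
def effective_tflops(ampere_tflops, factor=1.3):
    return ampere_tflops / 2 * factor

for name, tf in [("RTX 3070", 20), ("RTX 3080", 30), ("36 TF from the title", 36)]:
    print(f"{name}: {tf} TF -> ~{effective_tflops(tf):.1f} TF effective")
# RTX 3070: 20 TF -> ~13.0 TF effective (about a 2080 Ti)
# RTX 3080: 30 TF -> ~19.5 TF effective (about a 6800 XT)
# 36 TF from the title: 36 TF -> ~23.4 TF effective

Which is where the ~23 TFLOPS figure above comes from.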

Again, 4K 60 FPS on consoles is also fake, since they can't even push a locked 1440p 60 FPS in Valhalla and had to resort to cheap tricks, dropping below 1296p and such on the Series X and PS5 (yeah yeah, it will get better, sure...).
 

lh032

I cry about Xbox and hate PlayStation.
PS owner here. I think developers should aim for 1440p max, with ray tracing (min to medium settings) and 60 FPS; it's too early for 4K.

If AMD's FidelityFX really works, 1440p with ray tracing (high) and 60 FPS should be possible.
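
For scale, here's the pixel gap any reconstruction would have to cover, as a quick Python sketch:

Code:
# Rendering at 1440p and reconstructing to 4K shades well under half
# the pixels of native 4K.
qhd = 2560 * 1440   # 3,686,400 pixels
uhd = 3840 * 2160   # 8,294,400 pixels
print(f"Native 4K has {uhd / qhd:.2f}x the pixels of 1440p")  # 2.25x
print(f"1440p shades {qhd / uhd:.0%} of native 4K's pixels")  # 44%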
 