
AMD’s Big Navi GPU Codenamed ‘Sienna Cichlid’ Will Allegedly Feature 5120 Stream Processors.

thelastword

Banned
Komachi_ensaka recently posted a CU count for AMD's upcoming Navi 21 GPU, and it matches the count posted by Chiphell leakers and various other Twitter leakers, so it is definitely starting to look credible. At 80 CUs you are looking at 5120 stream processors, and this could be the Big Navi that gamers have been waiting on for almost 3 years now.

AMD's Navi 21 GPU to have 80 CUs according to yet another leak
Twitter user _rogame also created a compilation of the leaked specifications and this is a great table for reference purposes:

80 CUs translate to 5120 stream processors (assuming the same ratio as GCN), and if AMD is able to run these at a minimum of 1700 MHz, you are looking at an astounding 17.4 TFLOPs of compute. Considering this is 7nm, AMD should easily be able to hit that (unless they run into TBP constraints), although you are still looking at quite a steep power draw of around 300W (not that any gamer in the high-end segment cares about power draw).
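For reference, here is a minimal sketch of that arithmetic, assuming the usual 64 stream processors per CU and 2 FP32 operations per clock (FMA), the same ratio as GCN/RDNA1:

```python
# Rough FP32 throughput estimate for the rumored Navi 21 configuration.
# Assumes 64 stream processors per CU and 2 FP32 ops per clock (FMA).
cus = 80
sp_per_cu = 64
clock_ghz = 1.7  # the 1700 MHz floor assumed above

stream_processors = cus * sp_per_cu              # 5120
tflops = stream_processors * 2 * clock_ghz / 1000

print(f"{stream_processors} SPs at {clock_ghz} GHz ≈ {tflops:.1f} TFLOPs")
# -> 5120 SPs at 1.7 GHz ≈ 17.4 TFLOPs
```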

It is entirely possible that this is the Big Navi that was promised to us almost 2.5 years ago, the one that will deliver to AMD fans the high-end card they have been waiting for. On the other hand, it is also possible that this is a new die that the company prepared, and while these graphics cards will obviously be more powerful than the RX 5800 series, it is still not Big Navi.

Considering we are looking at at least four confirmed launches from AMD where graphics cards are concerned (thanks to EEC filings), it is very likely that the Open VR entrant we saw recently is one of these. Nomenclature would dictate that the RX 5950 XT (or 6950 XT or whatever AMD decides to call it) be significantly more powerful than any of its younger siblings (like the already-released RX 5700), and going by the steps involved, we can easily see that this is going to be quite a powerful card.


AMD's CEO, Dr. Lisa Su, has also promised high-end Radeon RX Navi GPUs. Lisa also mentioned that AMD is heavily investing in ray tracing for its 2020 discrete GPU lineup, which would feature the 2nd Generation RDNA architecture with hardware-level integration to support ray tracing. The features to expect from 2nd Generation RDNA Navi GPUs would be:

  • Optimized 7nm+ process node
  • Enthusiast-grade desktop graphics card options
  • Hardware-Level Ray Tracing Support
  • A mix of GDDR6 and HBM2 graphics cards
  • More power-efficient than First-Gen Navi GPUs
AMD's GPU side has been pretty much lackluster over the past couple of years, and while the CPU side has made one hell of a comeback, it's time for Radeon to shine once again. The last impressive launch that I recall was Hawaii, and the RX 5950 XT/6950 XT looks like it could restart AMD's competitive streak on the GPU side of things as well. Equipped with a process advantage over its competitors, and with NVIDIA backed into a corner with its expensive Turing chips, this is one of the biggest opportunities for AMD Radeon to shine again.


https://wccftech.com/amds-big-navi-...ill-allegedly-feature-5120-stream-processors/
 
Even with all the benefits (better image quality and performance), DLSS 2.0 still has problems though....



It's also a technology that's advancing rapidly. DLSS was laughed at in its first iteration as being a blurry mess; now it's freeing up a lot of power at the cost of some occasional artifacts. DLSS 3.0 may solve these problems while running even more efficiently.

It's definitely headed in the right direction. I'm a huge fan of the potential of running games at 120fps on the same hardware I currently run games at 60fps with, at the same settings, and not being able to see the difference without pausing the game and pulling out a magnifying glass.
 

kittoo

Cretinously credulous
I was almost sure that I would go AMD, as long as they didn't disappoint this time as well. But then DLSS (and maybe ray-tracing performance, but we don't know anything about AMD's ray tracing) threw a spanner in the works. By all accounts DLSS is amazing now. Damn. I really hope AMD has something up their sleeves.
 
AMD has done amazingly well in the CPU space. But as far as GPUs go, you should just assume they will be 18 months behind whatever Nvidia is putting out.

I'm expecting AMD to put out something that competes well with Nvidia's RTX 2000 lineup... while losing badly to Ampere 3000.

I think AMD will lose both in power and in features.

PLEASE prove me wrong AMD.
 

sinnergy

Member
So the 80 CU rumors from about a year ago are true; maybe Series X would even have 80. Would be fun reading here...
 

rofif

Banned
Wtf is a CU or a steam processor? I've had every make of GPU since Voodoo1 and now have an RTX 2070, and I have no idea.
 
Wtf is a CU or a steam processor? I've had every make of GPU since Voodoo1 and now have an RTX 2070, and I have no idea.
Stream, not Steam.

Kind of archievement if you have been able to live through so many generations of hardware and haven't had the interest to learn the basics of basics.

Let's just say they are kind of like CPU cores.

If you don't know what those are, then it is too long a road to explain
 

rofif

Banned
Stream, not Steam.

Kind of archievement if you have been able to live through so many generations of hardware and haven't had the interest to learn the basics of basics.

Let's just say they are kind of like CPU cores.

If you don't know what those are, then it is too long a road to explain
Achievement, not Archievement.
CU and stream processor are not "basics"; they're just marketing names.
No need to treat me like an idiot because I don't buy into all the marketing. PC gaming is my life... I've just been doing it for too long and kinda hate it, which is a shame because it's the only thing I know :p
All in all, when I buy a new GPU, I look up what new features it has (like RTX, DLSS) and the price/performance ratio.
It's not like I have an S3 ViRGE, buy a Voodoo, and the whole world changes anymore. Or like when I bought one of the first cards with pixel shader capability or DX9 support. The compute units or whatever don't mean much nowadays, and a card's architecture/specs do not always translate directly to framerate
 
I think that 80 CUs is the maximum possible, and the fastest RDNA2 card might have a little less than that; I've heard 72 CUs. Still, there's an IPC improvement going from the 36 CUs on the 5700XT to this, so it doesn't need to double the performance of the 5700XT to be 40% faster than the 2080 Ti.
 
I think that 80 CUs is the maximum possible, and the fastest RDNA2 card might have a little less than that; I've heard 72 CUs. Still, there's an IPC improvement going from the 36 CUs on the 5700XT to this, so it doesn't need to double the performance of the 5700XT to be 40% faster than the 2080 Ti.
The 5700XT has 40 CUs. The 5700 is the one with 36.
 

Dampf

Member
Even with all the benefits (better image quality and performance), DLSS 2.0 still has problems though....


Yeah, but in that specific game the problems are more severe, as the issue is apparently that the engine does not give DLSS the right motion vectors.

However, if we talk about overall image quality, it even improves that quite nicely in Death Stranding, with an overall cleaner image, more detail, and better temporal stability.

It's crazy that people bought a $1200 Turing card for 4K gaming, and a few years later they are telling us they will buy perhaps a $1400 RTX 3090, with even more power draw than Turing, for 1080p DLSS. You can't make this up....

Nothing new. People like to have the most recent stuff.
 

Bo_Hazem

Banned
Komachi_ensaka recently posted a CU count for AMD's upcoming Navi 21 GPU, and it matches the count posted by Chiphell leakers and various other Twitter leakers, so it is definitely starting to look credible. At 80 CUs you are looking at 5120 stream processors, and this could be the Big Navi that gamers have been waiting on for almost 3 years now.

AMD's Navi 21 GPU to have 80 CUs according to yet another leak
Twitter user _rogame also created a compilation of the leaked specifications and this is a great table for reference purposes:

80 CUs translate to 5120 stream processors (assuming the same ratio as GCN), and if AMD is able to run these at a minimum of 1700 MHz, you are looking at an astounding 17.4 TFLOPs of compute. Considering this is 7nm, AMD should easily be able to hit that (unless they run into TBP constraints), although you are still looking at quite a steep power draw of around 300W (not that any gamer in the high-end segment cares about power draw).

It is entirely possible that this is the Big Navi that was promised to us almost 2.5 years ago, the one that will deliver to AMD fans the high-end card they have been waiting for. On the other hand, it is also possible that this is a new die that the company prepared, and while these graphics cards will obviously be more powerful than the RX 5800 series, it is still not Big Navi.

Considering we are looking at at least four confirmed launches from AMD where graphics cards are concerned (thanks to EEC filings), it is very likely that the Open VR entrant we saw recently is one of these. Nomenclature would dictate that the RX 5950 XT (or 6950 XT or whatever AMD decides to call it) be significantly more powerful than any of its younger siblings (like the already-released RX 5700), and going by the steps involved, we can easily see that this is going to be quite a powerful card.


AMD's CEO, Dr. Lisa Su, has also promised high-end Radeon RX Navi GPUs. Lisa also mentioned that AMD is heavily investing in ray tracing for its 2020 discrete GPU lineup, which would feature the 2nd Generation RDNA architecture with hardware-level integration to support ray tracing. The features to expect from 2nd Generation RDNA Navi GPUs would be:

  • Optimized 7nm+ process node
  • Enthusiast-grade desktop graphics card options
  • Hardware-Level Ray Tracing Support
  • A mix of GDDR6 and HBM2 graphics cards
  • More power-efficient than First-Gen Navi GPUs
AMD's GPU side has been pretty much lackluster over the past couple of years, and while the CPU side has made one hell of a comeback, it's time for Radeon to shine once again. The last impressive launch that I recall was Hawaii, and the RX 5950 XT/6950 XT looks like it could restart AMD's competitive streak on the GPU side of things as well. Equipped with a process advantage over its competitors, and with NVIDIA backed into a corner with its expensive Turing chips, this is one of the biggest opportunities for AMD Radeon to shine again.


https://wccftech.com/amds-big-navi-...ill-allegedly-feature-5120-stream-processors/

Sweet! Meeting the same philosophy as the next PS5 Pro ;) 40+40 CUs.
 

ZywyPL

Banned
Sounds decent; the question will be the RT performance, because who knows, maybe their solution won't tank performance like it does on Turing cards, so no DLSS equivalent will be needed. Seeing how the upcoming consoles are already handling RT effects at 4K60, one thing is sure: the upcoming NV vs AMD battle will be very interesting.


It's crazy that people bought a $1200 Turing card for 4K gaming, and a few years later they are telling us they will buy perhaps a $1400 RTX 3090, with even more power draw than Turing, for 1080p DLSS. You can't make this up....

Believe it or not, many people are able to spend more than a mere $400-500 every 8 years on gaming equipment. And considering how well top-end parts hold their prices throughout the years, selling your old hardware makes the upgrade actually cheap as hell.
 

PerfectDark

Banned
The problem with AMD is their GPUs run too hot and the hardware is too buggy. Every time I go with an AMD GPU it ruins my PC: it runs too hot, or my games crash, or there's an issue with the Wattman software. Every time, I get sick of AMD and spend the extra for Nvidia and smooth sailing.

AMD CPUs rock. AMD GPUs are straight up broken.
 

thelastword

Banned
Sweet! Meeting the same philosophy as the next PS5 Pro ;) 40+40 CUs.
Yeah, or more precisely 36x2 = 72 CUs, which would be similar to their butterfly approach on the PS4 Pro. It lines up perfectly for an upgrade. Who knows if 40+40 might not make it in, since yields have been so good on PS5....
 

Dampf

Member
Sounds decent; the question will be the RT performance, because who knows, maybe their solution won't tank performance like it does on Turing cards, so no DLSS equivalent will be needed. Seeing how the upcoming consoles are already handling RT effects at 4K60, one thing is sure: the upcoming NV vs AMD battle will be very interesting.

RT will always tank performance depending on the implementation (so settings or use case); it doesn't have much to do with Turing's RT solution, but more with the general performance of the cards. When you activate screen space reflections or post-processing effects in any game, those tank performance too; same with RT. There are a lot of indicators which suggest the Xbox Series X performs similarly to a 2080, even in ray-tracing games.

A 3080 Ti with its 40% better performance compared to the 2080 Ti is already a nice increase, which can go toward either higher framerates or higher settings (RT too!), depending on your choice. Or, with DLSS, you can have both.

RDNA2 likely won't have a feature like DLSS, so you have to sacrifice either performance or visual quality. However, that's not an issue if AMD prices the series lower; they can still be great value cards.
 
I'm expecting AMD to put out something that competes well with Nvidia's RTX 2000 lineup... while losing badly to Ampere 3000.

I think AMD will lose both in power and in features.

Agreed.

Also, the thing holding me back with AMD (compared to Nvidia) is that their drivers don't seem to be as strong as Nvidia's. Kinda scares me off a little.

Then there's the magic that is DLSS 2.0 and of course I doubt AMD can match Nvidia's ray tracing features on Ampere.

Lastly, the latest rumors indicate the Ampere 3090 (or whatever it will be called) will hit 10,000 in TSE. BTW, a 2080 Ti scores around 6300 points in Time Spy Extreme.
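Taking those figures at face value (both are rumors/estimates, not confirmed results), the implied uplift works out to roughly 59%:

```python
# Implied Time Spy Extreme uplift from the rumored figures above.
# Both values are rumors/estimates, not confirmed benchmark results.
rumored_ampere_tse = 10_000
typical_2080ti_tse = 6_300

uplift = rumored_ampere_tse / typical_2080ti_tse - 1
print(f"Implied uplift: ~{uplift:.0%}")  # ~59%
```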
 

Arun1910

Member
Nice specs, but NVIDIA has DLSS 2.0. That greatly improves image quality AND performance at the same time.

This is the reason why I am sticking to NVIDIA. I've said it many times, but DLSS 2.0 is absolutely amazing tech and offers a much better gaming experience for those who can't splash out on the top card at any one time.
 
This is the reason why I am sticking to NVIDIA. I've said it many times, but DLSS 2.0 is absolutely amazing tech and offers a much better gaming experience for those who can't splash out on the top card at any one time.

Yes, and Cyberpunk will have DLSS on launch. :O

Should help with Ray Tracing and ULTRA settings enabled!
 

ZywyPL

Banned
RT will always tank performance depending on the implementation (so settings or use case); it doesn't have much to do with Turing's RT solution, but more with the general performance of the cards. When you activate screen space reflections or post-processing effects in any game, those tank performance too; same with RT. There are a lot of indicators which suggest the Xbox Series X performs similarly to a 2080, even in ray-tracing games.

Not true at all. You talk as if we are still using the same Voodoo cores from decades ago, just in greater quantity and at higher clocks, but architectures progress with every generation. Just check the various GCN iterations vs RDNA1: same number of CUs, same clocks, yet different results. The exact same will happen with RT cores. What NV introduced with Turing cards is not the one and only, ultimate solution; it's just the beginning, and it will only get better over time. Ampere RT cores are actually rumored to have 3-4x the performance of the RT cores in Turing cards, and AMD will have their own, different architecture for handling RT as well.
 

thelastword

Banned
Nothing new. People like to have the most recent stuff.
Believe it or not, many people are able to spend more than a mere $400-500 every 8 years on gaming equipment. And considering how well top-end parts hold their prices throughout the years, selling your old hardware makes the upgrade actually cheap as hell.
My argument is not even over price, but that people have been hollering about 4K native as the standard for years. The 1080 Ti, Nvidia touted it as its native 4K GPU; the 2080 Ti was the same, its native 4K GPU. Now people are telling me they are not excited for native 4K with ray tracing on a future $1400 GPU, but instead for an ML application that works on a 1080p image or even lower. Machine learning or not, DLSS is not going to match native, the same way VRS will not offer a better image if detail is not being culled out of view....

I think DLSS has improved tremendously, but like anything in the industry there will be competition; we are too quick to jump on the hype bandwagon even when there is little adoption. DLSS was supposed to be the big reason everybody bought a Turing card; when RTX support fell flat on its face, DLSS did not deliver either. It's better now, but it still has to be implemented by devs, and you can bet your bottom dollar that checkerboard rendering is being improved and that AMD's FidelityFX suite of tools has many IQ features launching with Navi 2, among other new features..... AMD has innovated on the software front more than any company over the last 10 years. Now that they have the hardware on the high end, their software suite will show up even more....
 

ZywyPL

Banned
My argument is not even over price, but that people have been hollering about 4K native as the standard for years.

Because no one knew at the time that it could've been done just as well, if not even better, than native 4K (other than downsampling 8K to 4K). But here we are today, with AI upscaling techniques providing even better image quality and twice the performance; it's a no-brainer, a win-win scenario. And potentially, the upcoming cards will be able to render games at 4K60 on their own, with DLSS bumping that up to 120.
 
Because no one knew at the time that it could've been done just as well, if not even better, than native 4K (other than downsampling 8K to 4K). But here we are today, with AI upscaling techniques providing even better image quality and twice the performance; it's a no-brainer, a win-win scenario. And potentially, the upcoming cards will be able to render games at 4K60 on their own, with DLSS bumping that up to 120.

Yeah, a lot of people don't realize (or believe...) that DLSS 2.0 not only looks just as good as native, it actually looks BETTER than native res. No reason not to use it, since the IQ will improve slightly and performance as well. Win-win.
 

Dampf

Member
Not true at all. You talk as if we are still using the same Voodoo cores from decades ago, just in greater quantity and at higher clocks, but architectures progress with every generation. Just check the various GCN iterations vs RDNA1: same number of CUs, same clocks, yet different results. The exact same will happen with RT cores. What NV introduced with Turing cards is not the one and only, ultimate solution; it's just the beginning, and it will only get better over time. Ampere RT cores are actually rumored to have 3-4x the performance of the RT cores in Turing cards, and AMD will have their own, different architecture for handling RT as well.

Yes, architectures will progress, that is true. However, you cannot compare it to the old 3D days; we are in an entirely different situation now. You won't suddenly get a 4x performance increase by tinkering with the RT cores; that is not how it works. Expect around 10-20% better RT performance on top of the already speculated 40% raster increase, not more, likely less. That rumor comes from a YouTuber called Moore's Law is Dead and is, frankly, BS; if you have any knowledge about the matter you will see why. Let me explain.

RT cores are meant to accelerate BVH intersection and traversal; after that, the result is processed by the shader cores (your general performance, so to speak), and that takes a lot more time to render than the RT-specific work happening on the RT cores. If you have more reflections, shadows and AO to render, of course that affects your performance drastically. See this frame of Metro Exodus with RTX on.

[Image: frame-time capture of a Metro Exodus frame with RTX on]


The green lines represent the time the RT cores need to compute the RT-specific work (on Pascal, it's that giant steady line in the middle). As you can see, it's just a fraction of the overall shading process. You can optimize it, of course, but it won't lead to a sudden 4x overall increase in FPS; you can see that for yourself, yes?
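A rough Amdahl's-law sketch shows why; the 20% share of frame time for RT-core work below is a made-up illustrative number, not something measured from that capture:

```python
# Why faster RT cores alone cannot 4x the framerate (Amdahl's law).
# ASSUMPTION: RT-core work takes 20% of total frame time; this is an
# illustrative figure only, not a measurement.
rt_fraction = 0.20        # share of frame time spent on RT-core work
rt_core_speedup = 4.0     # hypothetical "4x faster RT cores"

overall_speedup = 1 / ((1 - rt_fraction) + rt_fraction / rt_core_speedup)
print(f"Overall speedup: {overall_speedup:.2f}x")  # ~1.18x, nowhere near 4x
```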

AMD's solution is quite similar; the difference is that the RT hardware consists of specific sections inside the TMUs/CUs that accelerate BVH intersection tests. We already know the Series X does around 4x the intersection tests compared to a Turing GPU (Nvidia and Microsoft likely used different BVH depths for their calculations, so take that with a grain of salt). However, BVH traversal is not accelerated by dedicated cores in AMD's solution, unlike Turing. There is a lot more to RT performance than just intersection tests; there are a lot of factors like memory bandwidth, shader performance, API, implementation, and likely more variables. We can already see the Xbox Series X performing similarly to a 2070-2080 in Minecraft DXR, which is what we can expect.
 

Kerotan

Member
It's also a technology that's advancing rapidly. DLSS was laughed at in its first iteration as being a blurry mess; now it's freeing up a lot of power at the cost of some occasional artifacts. DLSS 3.0 may solve these problems while running even more efficiently.

It's definitely headed in the right direction. I'm a huge fan of the potential of running games at 120fps on the same hardware I currently run games at 60fps with, at the same settings, and not being able to see the difference without pausing the game and pulling out a magnifying glass.
Slightly different, but I remember the PC crowd laughing at Sony's checkerboarding technique for the Pro. They won't be laughing when Sony unveils their next-gen checkerboarding, as DLSS has shown them the light.
 
Slightly different, but I remember the PC crowd laughing at Sony's checkerboarding technique for the Pro. They won't be laughing when Sony unveils their next-gen checkerboarding, as DLSS has shown them the light.
DLSS 2.0 >>> next-gen checkerboarding, it seems, and DLSS 3.0 will be even better.
 

thelastword

Banned
Because no one knew at the time that it could've been done just as well, if not even better, than native 4K (other than downsampling 8K to 4K). But here we are today, with AI upscaling techniques providing even better image quality and twice the performance; it's a no-brainer, a win-win scenario. And potentially, the upcoming cards will be able to render games at 4K60 on their own, with DLSS bumping that up to 120.
Yeah, a lot of people don't realize (or believe...) that DLSS 2.0 not only looks just as good as native, it actually looks BETTER than native res. No reason not to use it, since the IQ will improve slightly and performance as well. Win-win.
Show me where DLSS is better than native. I was the biggest proponent of checkerboard rendering, but not once did I ever assume it was better than native, and the same applies for DLSS. It's pretty close, and CB was pretty close to native in many scenarios too, but it's not better....

I think we can conclude that the feature is good, it definitely is, just as CB was, but these algorithms are there to save frames and lighten the GPU load... which is a good thing too. The problem is when everybody keeps riding this DLSS hype train and exaggerating with lines like "it's better than native"; then Nvidia will diminish the performance you get on their GPUs and sell you cards at a high price, as long as they support DLSS at 1080p, instead of pushing boundaries where 4K 60/120fps becomes the standard natively. Nvidia is laughing at the DLSS smooching, because they realize that you are willing to buy an expensive $1400 card, and that card does not necessarily have to push the boundaries of 4K 60fps... or 8K... They will capitalize on that if DLSS gains support. Yet let's be honest: when AMD presents its own ML-based image reconstruction technique, everybody will quit the DLSS hype train and be right back on native 4K 120fps, 8K 60, my-card-is-a-bigger-beast-than-yours....



Just an FYI, I think reconstruction would be great for a future handheld or perhaps the next Nintendo system, since they won't go the bleeding-edge route...
 
Rumours are circling that the PS5 has tech on par with DLSS 3.0. Regardless, my point stands: you ain't laughing at checkerboarding now. "but but but but it ain't native 4k".
And that's just a rumour. If it were reality, it would definitely have been talked about in depth by Cerny. Something like that doesn't go unnoticed, and the PS5 doesn't seem to have an equivalent to the tensor cores in Nvidia's 2xxx series. Rumors are cool and all, but when they hold no weight, they become nothing more than wishful thinking.
 
Show me where DLSS is better than native. I was the biggest proponent of checkerboard rendering, but not once did I ever assume it was better than native, and the same applies for DLSS. It's pretty close, and CB was pretty close to native in many scenarios too, but it's not better....

I think we can conclude that the feature is good, it definitely is, just as CB was, but these algorithms are there to save frames and lighten the GPU load... which is a good thing too. The problem is when everybody keeps riding this DLSS hype train and exaggerating with lines like "it's better than native"; then Nvidia will diminish the performance you get on their GPUs and sell you cards at a high price, as long as they support DLSS at 1080p, instead of pushing boundaries where 4K 60/120fps becomes the standard natively. Nvidia is laughing at the DLSS smooching, because they realize that you are willing to buy an expensive $1400 card, and that card does not necessarily have to push the boundaries of 4K 60fps... or 8K... They will capitalize on that if DLSS gains support. Yet let's be honest: when AMD presents its own ML-based image reconstruction technique, everybody will quit the DLSS hype train and be right back on native 4K 120fps, 8K 60, my-card-is-a-bigger-beast-than-yours....



Just an FYI, I think reconstruction would be great for a future handheld or perhaps the next Nintendo system, since they won't go the bleeding-edge route...
Stop equating DLSS to checkerboarding... it's far superior... and yes, DLSS 2.0 IS actually superior to native 4K TAA in most regards. It's far more temporally stable, lines are sharper and more defined, text is better and more legible... and higher framerates come with better motion resolution.

DLSS 2.0 completely embarrasses checkerboard rendering in one of the best examples of that tech... Death Stranding.

The rest of your post is just ridiculously laughable... not nearly enough games support DLSS for Nvidia to just go "hey, let's not push boundaries at 4K or 8K"... my god. You'll see that, once again, Nvidia GPUs trounce AMD GPUs, native or not.

Rumours are circling that the PS5 has tech on par with DLSS 3.0. Regardless, my point stands: you ain't laughing at checkerboarding now. "but but but but it ain't native 4k".
There are no rumors of the sort. There's absolutely NOTHING out there currently that suggests the PS5 has anything on par with DLSS at all... let alone some made-up "DLSS 3.0". Nothing out there suggests that the PS5 has enough INT4/8 capability to come even close.
 

thelastword

Banned
Slightly different, but I remember the PC crowd laughing at Sony's checkerboarding technique for the Pro. They won't be laughing when Sony unveils their next-gen checkerboarding, as DLSS has shown them the light.
Look how they will backtrack and go back to the native 8K hype train when Checkerboard 2.0 or AMD's solution is shown. Just you wait. Still, all this talk about DLSS is just hype to sell Ampere cards. DLSS was supposed to be revolutionary 3 years ago on Turing. The hype train for it has begun yet again, but where is the adoption? The same goes for RTX, "it should have just worked," until people saw how much it sucked the life-energy out of those cards.....

Now all DLSS does is lower the GPU footprint of expensive future Ampere cards so Nvidia can finally sell the idea that ray tracing works and is viable. But as I've said before, ray tracing will only take off when the consoles say so. DLSS hype did not sell the feature in 2018. The ML reconstruction technique that most devs adopt will be the one that's more prevalent on consoles.........


Rumours are circling that the PS5 has tech on par with DLSS 3.0. Regardless, my point stands: you ain't laughing at checkerboarding now. "but but but but it ain't native 4k".
That geometry engine is probably going to be the best thing this gen among a sea of great tech and features.... Yet it's great to see that at launch the PS5 has no need to use reconstruction or 3x3 VRS yet. They are pushing native 4K with ray tracing. I think that speaks to how well Cerny's design works to give the best performance in real time vs the best performance on paper...

Yet, I hear you. I remember all too well the PC guys' approach to checkerboard rendering, even when Sony games were the best-looking games last gen with the feature implemented, like Horizon, Death Stranding, Detroit, GOW, etc.... At least the consoles are cheap, but sucking up to Nvidia at $1400-1500 on account of 1080p DLSS instead of 4K 120fps or 8K 60fps is a bit puzzling.....
 

Bo_Hazem

Banned
Yeah, or more precisely 36x2 = 72 CUs, which would be similar to their butterfly approach on the PS4 Pro. It lines up perfectly for an upgrade. Who knows if 40+40 might not make it in, since yields have been so good on PS5....

No, I meant 40+40 for Big Navi, the premium model, and 36+36 for the lesser model, just like the 5700 and 5700 XT. The PS5 Pro is gonna be a 20.6-22 TF beast a few years from now! Day one for me.
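For what it's worth, those teraflop figures roughly follow from the CU counts if you assume PS5-class clocks; the clocks below are my assumption, not anything announced:

```python
# Rough FP32 estimates for hypothetical 36+36 and 40+40 configurations.
# ASSUMPTIONS: 64 SPs per CU, 2 FP32 ops per clock, PS5-class boost clocks.
def tflops(cus: int, clock_ghz: float) -> float:
    return cus * 64 * 2 * clock_ghz / 1000

print(f"72 CUs @ 2.23 GHz ≈ {tflops(72, 2.23):.1f} TF")  # ≈ 20.6 TF
print(f"80 CUs @ 2.15 GHz ≈ {tflops(80, 2.15):.1f} TF")  # ≈ 22.0 TF
```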
 
No, I meant 40+40 for Big Navi, the premium model, and 36+36 for the lesser model, just like the 5700 and 5700 XT. The PS5 is gonna be a 20.6-22 TF beast a few years from now! Day one for me.
I'm almost sure the PS5 will be half the TF of that. Maybe PS6/Pro... Unless the SSD adds teraflops... Not like I haven't heard that before.
 

Bo_Hazem

Banned
Stop equating DLSS to checkerboarding... it's far superior... and yes, DLSS 2.0 IS actually superior to native 4K TAA in most regards. It's far more temporally stable, lines are sharper and more defined, text is better and more legible... and higher framerates come with better motion resolution.

DLSS 2.0 completely embarrasses checkerboard rendering in one of the best examples of that tech... Death Stranding.

The rest of your post is just ridiculously laughable... not nearly enough games support DLSS for Nvidia to just go "hey, let's not push boundaries at 4K or 8K"... my god. You'll see that, once again, Nvidia GPUs trounce AMD GPUs, native or not.


There are no rumors of the sort. There's absolutely NOTHING out there currently that suggests the PS5 has anything on par with DLSS at all... let alone some made-up "DLSS 3.0". Nothing out there suggests that the PS5 has enough INT4/8 capability to come even close.

DLSS 2.0 in that video comparison is running on a 2080 Ti, you should know better. And it still has some major flaws. Checkerboarding is old and should get smarter now, and it goes toe-to-toe with native 4K on a 2080 Ti.

Timestamped:





And more flaws of DLSS 2.0's exaggerated sharpening:

 