
AMD RDNA 3 GPUs To Be More Power Efficient Than NVIDIA Ada Lovelace GPUs, Navi 31 & Navi 33 Tape Out Later This Year

tusharngf

Member


According to KittyYYuko, a very reliable leaker who has posted multiple correct pieces of information regarding NVIDIA & AMD GPUs in the past, AMD is expected to tape out multiple Navi 3X GPUs based on its RDNA 3 graphics architecture later this year. As per his latest tweet, AMD will be taping out its first MCD (Multi-Compute Die) GPU, the Navi 31, soon! In addition to Navi 31, AMD is also expected to tape out its Navi 33 GPU in the fourth quarter of 2021, while Navi 32 is expected to tape out by Q1 2022.


AMD RDNA 3 'Radeon RX' Navi 3X GPU Design, Power Efficiency & Memory Layout



Tweets from Greymon55 discuss the architectural upgrades and what to expect from RDNA 3 GPUs in general. The leaker states that while both NVIDIA Ada Lovelace and AMD RDNA 3 GPUs are going to be just as power-hungry as the existing GPU lineup, AMD will have the more efficient offerings. As per current rumors, the MCD on RDNA 3 chiplet-based GPUs is turning out to be very power efficient, and it will take NVIDIA at least one generation to catch up in the gaming segment.

NVIDIA is expected to quickly transition to its own MCM GPU lineup, which is expected to offer over a 3x performance boost over Ampere GPUs.


Source: https://wccftech.com/amd-rdna-3-gpu...pus-navi-31-navi-33-tape-out-later-this-year/
 
So Navi 33 is like a 6900 XT but on RDNA3?

I can't believe this power jump. I mean, MCM explains it on AMD's side, but Lovelace looks like a massive boost too, and it's using a single GPU.

600W GPUs?
Yeah, the RTX 4000 series comes with a special mini nuclear power plant installed by default. You'll need to submerge it in water just to cool it down. It's not out yet because Nvidia hasn't found a way to protect your other components, but when they do, I'm sure it will be glorious. We might even start touching 8K at 30 fps :messenger_tears_of_joy:
 

Armorian

Banned
Yeah, the RTX 4000 series comes with a special mini nuclear power plant installed by default. You'll need to submerge it in water just to cool it down. It's not out yet because Nvidia hasn't found a way to protect your other components, but when they do, I'm sure it will be glorious. We might even start touching 8K at 30 fps :messenger_tears_of_joy:

Current GPUs are eating more than 300W, and we are talking about a 2x or even 3x jump with only a small node reduction; these things will eat 500W+.
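As a back-of-the-envelope sketch of where a 500W+ figure could come from (every number here is an illustrative assumption, not a leaked spec):

```python
# Rough power estimate: if performance triples but the node shrink and
# architecture only improve performance-per-watt by ~1.7x, the rest of
# the gain has to come from drawing more power.
# All numbers are illustrative assumptions, not leaked specs.

current_power_w = 320     # assumed board power of a current high-end GPU
perf_multiplier = 3.0     # rumoured generational performance jump
perf_per_watt_gain = 1.7  # assumed efficiency gain from node + architecture

estimated_power_w = current_power_w * perf_multiplier / perf_per_watt_gain
print(f"Estimated board power: {estimated_power_w:.0f} W")  # ~565 W
```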

So when do we think nvidia 4000 series will likely launch?

I guess 2H of 2022

Can't wait to wipe the shit-eating grins off NV fanboys once ML-powered FSR is revealed alongside RDNA 3.

It will be as hard to implement in games as DLSS (it will have to use motion vectors) and will be RDNA3-compatible only, so all the advantages of FSR1 will be gone.
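For context on why motion vectors matter: a temporal upscaler has to reproject the previous frame into the current one before it can accumulate detail, and only the game engine can supply the per-pixel motion data for that. A minimal NumPy sketch of that reprojection step (hypothetical function names, nearest-neighbour warping for brevity):

```python
import numpy as np

def reproject_previous_frame(prev_frame, motion_vectors):
    """Warp last frame's pixels to their current positions using
    engine-supplied per-pixel motion vectors (in pixel units).
    This is the step a purely spatial upscaler like FSR 1.0 skips."""
    h, w, _ = prev_frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # motion_vectors[..., 0] = x displacement, [..., 1] = y displacement
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    return prev_frame[src_y, src_x]

def temporal_accumulate(current, reprojected, alpha=0.1):
    """Exponentially blend the reprojected history with the new frame;
    real upscalers add history rejection/clamping on top of this."""
    return alpha * current + (1 - alpha) * reprojected
```

That engine integration (motion vectors, depth, jittered sampling) is exactly the per-game work DLSS requires today.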
 

Bo_Hazem

Banned
That's all we can hope for. Keep the competition tight so consumers can reap the rewards, meaning prices eventually come back to reality and we won't have to spend 1K on the lowest-tier GPUs. :pie_gsquint:

Honestly, due to current prices, I would rather buy a pre-built PC instead. See this one, for example:


But I'll build/buy when PCIe 5.0 hits.
 

Dampf

Member
Only a bona fide imbecile would look at this slide and think, yeah, there's nothing wrong with it.



They're basically predicting that Hopper, released 2 years after RDNA3, will only be 20% faster :messenger_grinning_sweat:
Hopper is their server architecture, Lovelace their gaming architecture. Hopper is already near tape-out and will likely release in 2022.
 

SantaC

Member
It will have 15K cores, which is 3 times the 6900 XT.

Finally Nvidia will get a beatdown. Nvidia has big problems with its power consumption and won't be able to match AMD's design.
 

Dampf

Member
It will have 15K cores, which is 3 times the 6900 XT.

Finally Nvidia will get a beatdown. Nvidia has big problems with its power consumption and won't be able to match AMD's design.
Yup, it will be very interesting.

Also, regarding raytracing performance: with MCM, AMD could theoretically have one big MIMD chiplet entirely dedicated to raytracing acceleration. Such a design would destroy Ampere's approach in every way, except maybe latency.

I also hope RDNA3 features dedicated matrix acceleration cores, similar to tensor cores. Couple that with an ML-accelerated temporal FSR 2.0 and Nvidia will certainly feel some pressure.
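For a sense of why matrix hardware matters there, here is a quick count of the multiply-accumulates in one small convolution layer of a hypothetical upscaling network at 4K (layer sizes are made up for illustration, not taken from any real DLSS/FSR model):

```python
# MACs for a single 3x3 convolution layer at 4K output resolution.
# Channel counts and kernel size are illustrative assumptions.
h, w = 2160, 3840           # 4K output
c_in, c_out, k = 32, 32, 3  # assumed channels in/out, 3x3 kernel

macs = h * w * c_in * c_out * k * k
print(f"{macs / 1e9:.1f} GMACs per layer per frame")                # ~76 GMACs
print(f"{macs * 60 * 10 / 1e12:.0f} TMACs/s at 60 fps, 10 layers")  # ~46
```

Sustaining tens of TMACs per second inside a few milliseconds of frame budget is exactly the workload tensor-style units are built for.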
 
Last edited:

DonkeyPunchJr

World’s Biggest Weeb
The AMD cycle:

- rabid, hyperbolic speculation about how much ass product++ is going to kick

- it launches and falls short of the hype

- “Nvidia/Intel might be better right now, but just wait until [better drivers/games are optimized for AMD because consoles use the same architecture/games start to benefit from all that VRAM]”

- competitor releases a new product that dominates the market

- “it’s not fair to compare the other guy’s next gen product with AMD’s current one. Just wait for product++”

- repeat
 

Kenpachii

Member
The AMD cycle:

- rabid, hyperbolic speculation about how much ass product++ is going to kick

- it launches and falls short of the hype

- “Nvidia/Intel might be better right now, but just wait until [better drivers/games are optimized for AMD because consoles use the same architecture/games start to benefit from all that VRAM]”

- competitor releases a new product that dominates the market

- “it’s not fair to compare the other guy’s next gen product with AMD’s current one. Just wait for product++”

- repeat

Remember RDNA2 being pegged as a 2080 Ti, yet it sits at 3080/3090 performance. Nobody expected any decent raytracing from it, so that's not a point.
Because of consoles, RDNA2 will be supported for its entire generation without issues. 3080s? Nvidia could ditch them like the 700 series within 2-3 gens, and gens will come fast because of competition.
Nobody expected RDNA2 to win against Nvidia.
If those new GPUs come with 12GB even on the low end, and games get optimized for that at a low price, a 3080 will age much like the 580 did, like a wet noodle, and that could already happen in 2023.

So honestly, your list is laughable to anybody with any experience with GPUs. Nvidia shit the bed multiple times in the past with their GPUs, where people thought a card was fine until they realized it was not, a year or two later into its generation.
 
Last edited:
I hope we'll see such massive jumps from both makers.
Hopper is their server architecture, Lovelace their gaming architecture. Hopper is already near tape-out and will likely release in 2022.
If you believe the rumors, Hopper is set for '24 on their roadmap, but knowing how extremely rare it is to see any rumor from wccftech come to fruition, yeah, it's safe to ignore that as well.
 
Power consumption has never been a point of contention when buying product A over B. Performance always has been and always will be for me. If AMD can top Nvidia, I'll go with them. But it can't just be winning in rasterization; it has to win in raytracing as well.

I'm not saying it can't happen, but look at RDNA2 hype/expectations compared to reality. Rasterization was great, even power consumption was good. But DLSS and raytracing were a night-and-day difference, which turned me off and led me to Ampere instead.
 

DonkeyPunchJr

World’s Biggest Weeb
Remember RDNA2 being pegged as a 2080 Ti, yet it sits at 3080/3090 performance. Nobody expected any decent raytracing from it, so that's not a point.
Because of consoles, RDNA2 will be supported for its entire generation without issues. 3080s? Nvidia could ditch them like the 700 series within 2-3 gens, and gens will come fast because of competition.
Nobody expected RDNA2 to win against Nvidia.
If those new GPUs come with 12GB even on the low end, and games get optimized for that at a low price, a 3080 will age much like the 580 did, like a wet noodle, and that could already happen in 2023.

So honestly, your list is laughable to anybody with any experience with GPUs. Nvidia shit the bed multiple times in the past with their GPUs, where people thought a card was fine until they realized it was not, a year or two later into its generation.
RDNA2 was indeed a pleasant surprise. That came after many years of hype, disappointment, and “just wait, it’s going to get better!” with Vega and RDNA. And that cycle was true for literally a decade on the CPU side of things up until Intel shit the bed with 10nm and gave AMD a chance to catch up.

We’ll see about this rumor. Nothing would make me happier than seeing RDNA3 bitch slap Nvidia. I’ll buy one if it does. But I wish I had a dollar for every wildly optimistic prediction from the AMD evangelists I’ve read over the last 15 years.
 
Last edited:

Dream-Knife

Banned
Three times the performance? We haven't even had that level of jump from the 1080 Ti to the 3080.

Will be cool to see, but I'm sticking with RDNA2 for the next 5 years.
 

rnlval

Member
According to KittyYYuko, a very reliable leaker who has posted multiple correct pieces of information regarding NVIDIA & AMD GPUs in the past, AMD is expected to tape out multiple Navi 3X GPUs based on its RDNA 3 graphics architecture later this year. As per his latest tweet, AMD will be taping out its first MCD (Multi-Compute Die) GPU, the Navi 31, soon! In addition to Navi 31, AMD is also expected to tape out its Navi 33 GPU in the fourth quarter of 2021, while Navi 32 is expected to tape out by Q1 2022.


AMD RDNA 3 'Radeon RX' Navi 3X GPU Design, Power Efficiency & Memory Layout



Tweets from Greymon55 discuss the architectural upgrades and what to expect from RDNA 3 GPUs in general. The leaker states that while both NVIDIA Ada Lovelace and AMD RDNA 3 GPUs are going to be just as power-hungry as the existing GPU lineup, AMD will have the more efficient offerings. As per current rumors, the MCD on RDNA 3 chiplet-based GPUs is turning out to be very power efficient, and it will take NVIDIA at least one generation to catch up in the gaming segment.



NVIDIA is expected to quickly transition to its own MCM GPU lineup, which is expected to offer over a 3x performance boost over Ampere GPUs.


Source: https://wccftech.com/amd-rdna-3-gpu...pus-navi-31-navi-33-tape-out-later-this-year/
While it's good to see AMD's rasterization improvements, AMD's geometry, mesh shader, and raytracing performance remains inferior.
 
AMD hit the ball out of the park when they introduced chiplet designs with Zen 2, and likewise we could see them having the same "Zen 2" moment with the chiplet-design RDNA 3. The top card is rumoured to have around 160 CUs, which is insane. Besides rasterisation (which I think AMD will edge Nvidia out on), I'm curious to see how it'll stack up against Lovelace in terms of raytracing performance.
 

Dream-Knife

Banned
The performance sounds great, but can we go back to cards that ran cooler and didn't have triple-slot coolers and triple 8-pin power connectors? Tons of people are undervolting this gen to get their temps and noise down. Ampere puts out a lot of heat and loves to suck power.
I don't know how Nvidia is doing, but my Red Dragon 6800 runs at 50-60°C, with a junction temp of 68-70°C and a Time Spy score of 15800 with a minor undervolt (980 vs 1025). Without the undervolt, at max load it would run 60°C GPU and 82°C junction. Fans at no more than 70% in a Cooler Master H500 case.

I think the problem is mostly people with poor cases, as exemplified in that Alienware thread yesterday.
 

DonkeyPunchJr

World’s Biggest Weeb
Can't wait to wipe the shit-eating grins off NV fanboys once ML-powered FSR is revealed alongside RDNA 3.
LOL. For the last 9 months all we’ve heard from AMD fanboys is “just wait for FSR it’s gonna be awesome”. Now it’s been out for like 2 weeks and ALREADY it’s “well… wait for version 2.0, I’m sure that’s the one that’s REALLY going to kick ass! You nvidia fanboys are gonna be sorry!!!”
 

Marlenus

Member
3x more performance over Ampere is insane.

If 3x shaders is true, and if perf/shader is comparable, and if perf scaling to that number of shaders does not fall off a cliff, I still expect the uplift to land below 3x.

The 3870 to 4870 jump gave 2.5x the shaders but only about 1.8x the performance.

If 60 WGPs for N31 is accurate and 15K shaders is accurate, then a lot has changed under the hood, so speculating on performance is pure guessing. I think I will wait until AMD gives some info on their perf/watt uplift target before trying to see where performance will land.

If they do 3x the shaders and aim for that kind of performance uplift, I expect TDPs to be around 400W, maybe more.
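As a rough sanity check on that scaling argument, here is a minimal sketch that fits a power-law scaling exponent to the 3870-to-4870 datapoint above and applies it to the rumoured 3x shader jump (the 6900 XT's 5120 shaders times 3 is roughly the rumoured 15K). Purely illustrative; it assumes scaling behaviour carries across very different architectures, which it almost certainly doesn't:

```python
import math

# Historical datapoint from the post: HD 3870 -> HD 4870 had ~2.5x the
# shaders but only ~1.8x the performance.
shader_ratio_hist, perf_ratio_hist = 2.5, 1.8

# Fit a simple power-law model: perf_ratio = shader_ratio ** k
k = math.log(perf_ratio_hist) / math.log(shader_ratio_hist)  # ~0.64

# Apply it to the rumour: 5120 shaders (6900 XT) * 3 ~= 15360 ("15K")
shader_ratio_rumour = 15360 / 5120  # 3.0
print(f"Implied uplift: {shader_ratio_rumour ** k:.2f}x")  # ~2.0x, well under 3x
```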
 
The performance sounds great, but can we go back to cards that ran cooler and didn't have triple-slot coolers and triple 8-pin power connectors? Tons of people are undervolting this gen to get their temps and noise down. Ampere puts out a lot of heat and loves to suck power.

I agree with this. People made fun of the PS4 Pro jet engine, but apparently most PC gamers don't care?

I'm lowering the fan RPM to get the noise down.
 

DonkeyPunchJr

World’s Biggest Weeb
I agree with this. People made fun of the PS4 Pro jet engine, but apparently most PC gamers don't care?

I'm lowering the fan RPM to get the noise down.
Definitely would like lower-wattage cards, but the noise has more to do with cooler design than with power draw. I had a Radeon 4870 (150W) and a 5870 (188W), and both sounded like freaking hairdryers. I've also had a 3080 FTW3 that was pretty quiet (I only ran it in the normal mode, not the overclocked BIOS) and a 3090 FE that was whisper silent. Both easily quieter than a PS4 Pro.

Worst card I ever had was the Radeon VII; that thing sounded like a damn jet engine. I fitted a Morpheus Vega aftermarket cooler on it plus two Be Quiet! 120mm fans, and it was whisper silent after that.
 

PaintTinJr

Member
Power consumption has never been a point of contention when buying product A over B. Performance always has been and always will be for me. If AMD can top Nvidia, I'll go with them. But it can't just be winning in rasterization; it has to win in raytracing as well.
Given the way next-gen gaming will probably rely much more on ROPs and flexible GPU compute, rather than dedicated ASICs or tensor cores, I suspect the reality is that AMD RDNA2 is already in front of Ampere for future games.
I'm not saying it can't happen, but look at RDNA2 hype/expectations compared to reality. Rasterization was great, even power consumption was good. But DLSS and raytracing were a night-and-day difference, which turned me off and led me to Ampere instead.
I was the opposite, but still bought an Ampere card, because there wasn't an RDNA2 card I could buy from an online retailer at the £500 price point I was prepared to spend on a holdover card.

Running old PS4-designed graphics on PC with DLSS and RT (pre-Lumen software RT) doesn't really interest me, as it doesn't show anything new to impress.
 

Rikkori

Member
Not much to be happy about, tbh. It's going to start at a bare minimum of $2K, probably $3K in reality at retail, and unlike this gen you won't even have the benefit of disproportionate returns from mining to offset the costs (cache doesn't help with that). As dumb as the 3090 might have looked, in the end that card will make a profit for its users, even gamers just letting it mine during downtime 8-12 hours a day.

Without a doubt, if AMD gets to MCM before Nvidia, they will win the crown for both raster and raytracing, and not by a little either but by a lot (remember how the RA cores are structured!). Ampere is, imo, a weak effort; Nvidia got caught flat-footed. On paper the raytracing performance looks great, and DLSS is a nice cherry on top, but from an R&D standpoint it's a power-hungry POS with little room to grow, while RDNA just keeps flying and shining brighter with every iteration. The problem for Nvidia is they have nowhere else to go now and have to do a major redesign; if AMD is closer to MCM, gets the jump on them, and takes the crown... well, look at how that's turned out for Intel. It's hard to make a comeback once you fall behind; it usually requires a whole new plan and years of execution until a new "opening" appears, no different than how it was for AMD on both the CPU and GPU fronts.

Regardless, it's the customer who gets bent over and f'ed, because as we can see, $/perf isn't really getting better except on ever-higher-end products. Oh well.
 
Running old PS4-designed graphics on PC with DLSS and RT (pre-Lumen software RT) doesn't really interest me, as it doesn't show anything new to impress.
Not sure where the random PS4 games come into the topic, but I'll bite. What RDNA2 games impressed you, on the other hand? It seems you are trying to imply something without explicitly stating it.
 