
Rumor: AMD's Big Navi performance on par with RTX 3080

Leonidas

Member
Looks like AMD has fallen short of Nvidia yet again. Not that I'm surprised.

Sounds like they got closer this time though thanks to them finally releasing a GPU with more than 64 CU.
 
Regarding RT, it didn't look nice, but perhaps it says something about performance:


I expect 2080 Ti + 30%+ performance, which should not be far from said card, although NV might choose to name it differently, so that the "but the Ti is untouchable" myth created in the post-Maxwell era still lives on.
And it is based on basic math:

2080 Ti = 5700 XT + 36-46%

5700 XT is a 250mm² 36 CU chip on an inferior node with slower RAM.
We know from Sony that RDNA2 can cross 2GHz with reasonable power consumption, so clocks won't be lower.
We know AMD aims at 50% higher perf/W vs the existing Navi cards, closing the only remaining gap they had vs NV.
"Big Navi" is expected to be a 505mm² chip with faster memory and 80 CUs.
On top of it, Ampere is 8nm Samsung (which is more of a 10nm on steroids) while AMD enjoys TSMC 7nm.

Things stack in AMD's favor (rough arithmetic below), so I expect a lot of "but DLSS" and more aggressive "black screen" trolling.
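A minimal back-of-the-envelope sketch of that math; every input here is a rumor or a simplifying assumption, not a confirmed spec (and it uses the 5700 XT's actual 40 CUs, per the correction below):

```python
# Back-of-the-envelope Big Navi estimate from the rumored numbers above.
# All inputs are rumors/assumptions, not confirmed specs.

navi10_cus, big_navi_cus = 40, 80        # 5700 XT actually has 40 CUs (post says 36; see correction below)
cu_scaling = big_navi_cus / navi10_cus   # 2.0x shaders, assuming equal-or-better clocks

# The post's premise: 2080 Ti is roughly 5700 XT +36% to +46%
for ti_over_5700xt in (1.36, 1.46):
    big_navi_over_ti = cu_scaling / ti_over_5700xt
    print(f"Big Navi vs 2080 Ti (ideal scaling): {big_navi_over_ti:.2f}x")
# -> ~1.37x to 1.47x a 2080 Ti in the ideal case; real CU scaling is
#    sublinear, which is why "2080 Ti +30%+" is the hedged expectation.
```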
5700 XT is 40 CUs.
 

martino

Member
I need to know:
if there is a DLSS equivalent (with or without a perf cost)
how the card compares with RT enabled

and if there is a disparity in features or performance there, I expect the card to be priced accordingly, one way or the other.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
No use getting spun up about any of this baseless rumor nonsense now. I’ll wait for real price and specs. Not too much longer. Should be a great time to buy.
 

CrustyBritches

Gold Member
RTX 3080 will likely be at the very top of my price range, so I'm cool if AMD has a card that matches it. I'm most interested in RT performance and that's something we haven't really had any rumors about.
 

Knightime_X

Member
That's a fucked up situation indeed. I avoid getting into it by being picky about retailers I buy from.


Agreed.
I sold the card to someone but made it extremely clear that black screens happen often.
Now, if you disable the "reset if overheating" setting on the motherboard, the card works fine.
But disabling that puts your PC at risk of actually overheating later on if it happens.
The problem is that the card goes nuclear doing absolutely nothing, and the heat it produces is within its intended rating.
So yeah, it's stupid if you ask me.
 

00_Zer0

Member
Since I got back into PC gaming in 2013, I started out with a rig that had an i7 4770K and a GTX 780 in it; eventually I upgraded the same rig with a GTX 1060. I sold that rig last year and now I have a new rig with a Ryzen 3700X and a Sapphire Nitro+ RX 5700. The reason I went with AMD for my graphics card this time was bang for the buck, along with features I liked about the card. It was going to be between this card or the lower-tier 2070 Super with no overclock. There was about $70 between the cards at the time, and I decided to go with the Nitro+. I am not upset that I went with this card, because it is arguably the best 1440p-4K card (on some titles) available. I was kind of burned by Nvidia in the transition between the 780 and the 1060. It seems they stopped supporting the 780 early in favor of their newest graphics card architecture, and it forced me to upgrade sooner than I wanted to. It almost felt like they purposely capped my 780, which until the 1060 was released was a pretty good 1080p card.


Sapphire is one of the best AMD card makers, and they also include great software that works well alongside AMD's official software and feature set. The software is called Trixx (stupid name, I know). It has a feature called Trixx Boost that works alongside AMD's sharpening technology. It lets you use a lower custom resolution, between 70% and 95% of true 4K or true 1440p, and then add AMD's sharpening on top for some pretty good results. This feature kills the original DLSS 1.0 that Nvidia offers. Even though DLSS 2.0 kills Trixx Boost, as of now Trixx Boost supports most games, whereas DLSS 2.0 only works in select games built from the ground up with the feature in mind.
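For a rough sense of what that 70%-95% resolution range means in pixel terms (the percentages come from the post above; actual Trixx Boost presets may differ):

```python
# Rough pixel-count math for Trixx Boost-style resolution scaling.
# The 70%-95% range is from the post above; actual presets may differ.

native = {"1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in native.items():
    for scale in (0.95, 0.85, 0.70):
        rw, rh = int(w * scale), int(h * scale)
        saved = 1 - (rw * rh) / (w * h)
        print(f"{name} @ {scale:.0%}: {rw}x{rh} (~{saved:.0%} fewer pixels to shade)")
# e.g. 4K @ 85% renders 3264x1836, ~28% fewer pixels per frame;
# the sharpening pass then masks most of the upscale blur.
```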

Anyway, I am a bang-for-the-buck kind of guy, and I have never owned a top-of-the-line Nvidia Ti card anyway, so my next graphics card purchase will not be based on brand loyalty but on the price and feature set of the card at the time of purchase.
 
Last edited:

Bolivar687

Banned
Since I got back into PC gaming in 2013, I started out with a rig that had an i7 4770K and a GTX 780 in it; eventually I upgraded the same rig with a GTX 1060. I sold that rig last year and now I have a new rig with a Ryzen 3700X and a Sapphire Nitro+ RX 5700. The reason I went with AMD for my graphics card this time was bang for the buck, along with features I liked about the card. It was going to be between this card or the lower-tier 2070 Super with no overclock. There was about $70 between the cards at the time, and I decided to go with the Nitro+. I am not upset that I went with this card, because it is arguably the best 1440p-4K card (on some titles) available. I was kind of burned by Nvidia in the transition between the 780 and the 1060. It seems they stopped supporting the 780 early in favor of their newest graphics card architecture, and it forced me to upgrade sooner than I wanted to. It almost felt like they purposely capped my 780, which until the 1060 was released was a pretty good 1080p card.


Sapphire is one of the best AMD card makers, and they also include great software that works well alongside AMD's official software and feature set. The software is called Trixx (stupid name, I know). It has a feature called Trixx Boost that works alongside AMD's sharpening technology. It lets you use a lower custom resolution, between 70% and 95% of true 4K or true 1440p, and then add AMD's sharpening on top for some pretty good results. This feature kills the original DLSS 1.0 that Nvidia offers. Even though DLSS 2.0 kills Trixx Boost, as of now Trixx Boost supports most games, whereas DLSS 2.0 only works in select games built from the ground up with the feature in mind.

Anyway, I am a bang-for-the-buck kind of guy, and I have never owned a top-of-the-line Nvidia Ti card anyway, so my next graphics card purchase will not be based on brand loyalty but on the price and feature set of the card at the time of purchase.

AMD does a much better job of supporting their cards in the long run than Nvidia does, and their software utility is also leagues better. The Nvidia control panel makes me feel like I'm running Windows 95, and GeForce Experience is such a joke.
 

SantaC

Member
AMD does a much better job of supporting their cards in the long run than Nvidia does, and their software utility is also leagues better. The Nvidia control panel makes me feel like I'm running Windows 95, and GeForce Experience is such a joke.
Nvidia's control panel has been there for like 20 years now. How about a graphical overhaul?
 
I'd be surprised, to be honest. I'm expecting it to be at best 3070 or 3070 Super (in the future) level. I just hope it's not as power-hungry as Ampere.
 
If true, they just need to sell Big Navi for 600 dollars and hear the screams from Jensen...

At a similar price, nobody is gonna choose AMD over Nvidia.

Not going to happen; it will likely be 700-1000, is my guess.

People seem to have been buying the 5700 XT in good quantities since it launched, no? I would say that discounts your second comment.

As far as performance goes, it is hard to say, as we hear differing rumours about both AMD and Nvidia all the time. I'm going to throw out a prediction that I think will be roughly in line with where we end up, give or take a little. I think Big Navi will either match or exceed the 3080 in performance and likely sit somewhere between the 3080 and 3090. I don't think it will match/beat the 3090, but if it sits between the 3080 and 3090, then I think that would be an amazing win for AMD. They would be competing at the high end again.

Could be just wishful thinking on my part, though; we will have to see how it all pans out.
 

M1chl

Currently Gif and Meme Champion
Thread title suggests AMD will match Nvidia's 3080 Ti, but the article suggests it won't. Maybe worth clarifying the OP?
I was confused too, but then I looked closely and the thread name is missing "Ti", so it's the classic 3080...
 

GymWolf

Member
Not going to happen; it will likely be 700-1000, is my guess.

People seem to have been buying the 5700 XT in good quantities since it launched, no? I would say that discounts your second comment.

As far as performance goes, it is hard to say, as we hear differing rumours about both AMD and Nvidia all the time. I'm going to throw out a prediction that I think will be roughly in line with where we end up, give or take a little. I think Big Navi will either match or exceed the 3080 in performance and likely sit somewhere between the 3080 and 3090. I don't think it will match/beat the 3090, but if it sits between the 3080 and 3090, then I think that would be an amazing win for AMD. They would be competing at the high end again.

Could be just wishful thinking on my part, though; we will have to see how it all pans out.
I'm just trying to think what AMD can offer to get people who are not brand-loyal on their side with a GPU that costs the same as Nvidia's with similar power...

Better drivers? Lol
More games optimized for AMD? Giga lol
More advanced AI upscaling or RTX stuff? Hard to believe

Why the fuck do people buy AMD cards to begin with if prices are similar and they are always one step behind?

I mean, I get the whole "Nvidia are the bad guys and their GPUs are overpriced as fuck" thing, but I don't care about that when I build a PC. I only look at the performance/cost ratio and whether they have special features worth buying.
 
Last edited:

tkscz

Member
AMD is behind in ray tracing, DLSS, and power draw. They need to catch up AND have a competitive price.

DLSS is something AMD can't do at all. While they can use the tech they created for the consoles for ray tracing, DLSS requires a technology that AMD has not shown they have. Tensor cores are not just extra GPU cores put to the side. They process FP16 x2 (which gets put back together as FP32 and sent out as such, but can't be input as FP32) so quickly that they have their own FLOP rate. Half the number of tensor cores by themselves can do nearly double the FLOP rate of CUDA cores or CUs. This is what allows the AI to supersample so quickly and accurately. Some people have run Control at 580p and had DLSS upscale it so perfectly to 1440p that you can't tell the difference. Until AMD creates something akin to tensor cores, don't expect anything like that.
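For scale, a quick worked example of that FLOP-rate point, using the RTX 2080 Ti's published figures; the 64-FMAs-per-clock tensor throughput is Nvidia's documented Turing number, and everything here is approximate:

```python
# Approximate FLOP-rate comparison for an RTX 2080 Ti (published specs).

boost_clock_ghz = 1.545
cuda_cores, tensor_cores = 4352, 544

# Each CUDA core: one FP32 FMA per clock = 2 FLOPs
fp32_tflops = cuda_cores * 2 * boost_clock_ghz / 1000

# Each Turing tensor core: 64 FP16 FMAs per clock = 128 FLOPs
tensor_fp16_tflops = tensor_cores * 128 * boost_clock_ghz / 1000

print(f"FP32 via CUDA cores:   ~{fp32_tflops:.1f} TFLOPS")         # ~13.4
print(f"FP16 via tensor cores: ~{tensor_fp16_tflops:.1f} TFLOPS")  # ~107.6
# An ~8x matrix-math throughput gap is what makes running a
# DLSS-style network every frame cheap enough to be worth it.
```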
 

Evangelion Unit-01

Master Chief
I want to build a new PC in November. Hope nothing slips to 2021. Waiting on this year's Ryzen update for the CPU.

Want to see what both Nvidia and AMD offer on the GPU front. I hope that Big Navi competes with the 3080...
 

Pagusas

Elden Member
Unless they get a DLSS alternative, AMD is dead in the water this gen. Nvidia will be able to hand it to them even if AMD can match raw power output. I mean, we're seeing insane gains in some games with DLSS 2.0, and 3.0 will be even more insane with better IQ.
 

RoboFu

One of the green rats
You guys are weird. The Ti versions are not the most popular. If AMD can compete with the non-Ti and come in at a lower price, then along with the 30XX's huge size and power draw it could easily be a Ryzen moment for AMD.
But then again, AMD could completely fuck it up too. lol
 
Last edited:

llien

Member
Big Navi hype has begun.

Who's that again?

I'd be surprised, to be honest. I'm expecting it to be at best 3070 or 3070 Super (in the future) level. I just hope it's not as power-hungry as Ampere.
It should beat the 2080 Ti comfortably; I doubt the 3070 would do that (and not cost an arm and a leg; card names are, after all, arbitrary, you could call the 2080 a 2070 Super... oh wait... :D)
So it should be at around 3080 level; no leaks needed.
 
Last edited:

Rikkori

Member
Ready for the HYPE TRAIN???



 

GymWolf

Member
DLSS is something AMD can't do at all. While they can use the tech they created for the consoles for ray tracing, DLSS requires a technology that AMD has not shown they have. Tensor cores are not just extra GPU cores put to the side. They process FP16 x2 (which gets put back together as FP32 and sent out as such, but can't be input as FP32) so quickly that they have their own FLOP rate. Half the number of tensor cores by themselves can do nearly double the FLOP rate of CUDA cores or CUs. This is what allows the AI to supersample so quickly and accurately. Some people have run Control at 580p and had DLSS upscale it so perfectly to 1440p that you can't tell the difference. Until AMD creates something akin to tensor cores, don't expect anything like that.
Is their situation that bad?

I mean, not having DLSS is gonna be freaking huge...
 

LordOfChaos

Member
Doesn't change anything; AMD doesn't need to be able to compete in every segment, including ones where the prices are ridiculous. Nvidia created this segment that includes $999 GPUs; it makes no sense for the vast majority of people anyway, and AMD right now isn't in it.

What really matters is how they'll be able to compete with the likes of a 3060 or 3070 at most, as well as AMD's answers to RT and DLSS.


"Shouldn't" might be more the word. The halo product effect is very real, and as has been proven many times in the past, often when Nvidia has the top card but AMD products beat it in price segments below it, some people will just look at what the best card is and pick something by Nvidia below it.

It shouldn't be that way, but it do.
 

llien

Member
DLSS is something AMD can't do at all. While they can use the tech they created for the consoles for raytracing, DLSS requires a technology that AMD has not shown they have. Tensor cores are not just extra GPU cores put to the side.
It is hard to find the part of that statement that is not wrong, chuckle.
1) GPUs are inherently great at ML inference
2) Benchmarks have shown that Volta doesn't do better with tensor cores in conv. inference
3) AMD supports DirectML (that's a generic ML inference API; could be used for... AI stuff, lol)
4) AMD has beaten DLSS 1.0 (which is also ML-based) using an old-school algorithmic approach (FidelityFX; oh, it runs on NV GPUs too); a toy sketch of that idea follows below
5) Last but not least, curiously, AMD supports AI Upscaling (no per-game bells and whistles attached, though)
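To make point 4 concrete, here is a toy NumPy approximation of contrast-adaptive sharpening in the spirit of FidelityFX CAS; the neighborhood, weighting curve, and constants are simplified illustrations of the "old school algorithmic" idea, not AMD's actual shader:

```python
import numpy as np

def toy_cas(img: np.ndarray, sharpness: float = 0.5) -> np.ndarray:
    """Toy contrast-adaptive sharpen, loosely inspired by FidelityFX CAS.

    img: float32 grayscale array (H, W) with values in [0, 1].
    Sharpen strength adapts to local contrast: strong in flat areas,
    weak at hard edges, so edges don't ring. Constants are made up.
    """
    p = np.pad(img, 1, mode="edge")
    # Cross-shaped 4-neighborhood around each pixel, like CAS uses
    n, s = p[:-2, 1:-1], p[2:, 1:-1]
    w, e = p[1:-1, :-2], p[1:-1, 2:]
    lo = np.minimum.reduce([img, n, s, w, e])
    hi = np.maximum.reduce([img, n, s, w, e])

    # Adaptive amount: headroom to the nearest clip point, normalized
    amp = np.sqrt(np.clip(np.minimum(lo, 1.0 - hi) / np.maximum(hi, 1e-5), 0, 1))
    wk = -amp * (0.05 + 0.125 * sharpness)  # negative lobe weight (assumed constants)

    # Normalized 5-tap sharpen: center minus a weighted neighbor average
    out = (img + wk * (n + s + w + e)) / (1.0 + 4.0 * wk)
    return np.clip(out, 0.0, 1.0)
```

The real CAS shader works per RGB channel in a single GPU pass; the point is simply that this is plain arithmetic with no ML inference, which is why it runs on any vendor's hardware.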
 
Last edited:

tkscz

Member
Is their situation that bad?

I mean, not having DLSS is gonna be freaking huge...

Technically speaking, AMD already has image sharpening built into its drivers. The good part is that devs don't have to specially integrate it like they do DLSS, and it runs on all models rather than requiring an RTX card, but I haven't seen anyone compare FidelityFX to DLSS.

It is hard to find the part of that statement that is not wrong, chuckle.
1) GPUs are inherently great at ML inference
2) Benchmarks have shown that Volta doesn't do better with tensor cores in conv. inference
3) AMD supports DirectML (that's a generic ML inference API; could be used for... AI stuff, lol)
4) AMD has beaten DLSS 1.0 (which is also ML-based) using an old-school algorithmic approach (FidelityFX; oh, it runs on NV GPUs too)
5) Last but not least, curiously, AMD supports AI Upscaling

I'm going by what I've read on Nvidia's site and in articles covering DLSS. I'm aware of FidelityFX and its AI upscaling, but I don't see it talked about as much, so I assumed it wasn't as good. Thanks for the information; got any links to those benchmarks?
 
Last edited:

llien

Member
Technically speaking, AMD already has image sharpening built into its drivers. The good part is that devs don't have to specially integrate it like they do DLSS, and it runs on all models rather than requiring an RTX card, but I haven't seen anyone compare FidelityFX to DLSS.
That is wrong.
FidelityFX is an external toolkit that developers can integrate at will.
It runs on AMD, NV, Intel.


And costs just as much as the Radeon VII.
What do you expect the 3080 (that is allegedly beaten by "BIG NAVI") to cost?
 
Last edited:
We will just file this under "no shit"

Seriously, AMD is no match for Intel or Nvidia. The only way AMD can win is by shrinking their chips, but once the big guys shrink, they're simply more efficient, more powerful, and better designed. For instance, AMD at 7nm is only slightly better than Intel at 14nm+ (repeated). Intel will destroy them at 10nm. Nvidia will destroy them at 8nm.
 
Last edited:

tkscz

Member
That is wrong.
FidelityFX is an external toolkit that developers can integrate at will.
It runs on AMD, NV, Intel.


I didn't know that until you told me. I've only heard AMD bring up FidelityFX, so I assumed it was their proprietary software. I was not aware it was open source; that's interesting to know. I wonder how well it works with APUs.

Edit: It does A LOT more than I thought it did (I thought it was just an AI supersampler). Why doesn't anyone talk about this more? Can it be implemented on consoles too?
 
Last edited:

Mister Wolf

Member
Technically speaking, AMD already has image sharpening built into its drivers. The good part is that devs don't have to specially integrate it like they do DLSS, and it runs on all models rather than requiring an RTX card, but I haven't seen anyone compare FidelityFX to DLSS.



I'm going by what I've read on Nvidia's site and in articles covering DLSS. I'm aware of FidelityFX and its AI upscaling, but I don't see it talked about as much, so I assumed it wasn't as good. Thanks for the information; got any links to those benchmarks?



Timestamped for you.
 

Rikkori

Member
I didn't know that until you told me. I've only heard AMD bring up FidelityFX, so I assumed it was their proprietary software. I was not aware it was open source; that's interesting to know. I wonder how well it works with APUs.

Edit: It does A LOT more than I thought it did (I thought it was just an AI supersampler). Why doesn't anyone talk about this more? Can it be implemented on consoles too?

AMD open-sources most stuff on the gaming-effects side.

E.g., a lot of the hair you see in major titles was developed as a derivative of TressFX (all the Square Enix, Guerrilla Games titles, etc.).
The SSSR (stochastic screen-space reflections) recently showcased in Marvel's Avengers was AMD's.
The sharpening everyone is harping on about (even the one now in Nvidia's control panel) is from AMD (CAS).
 

IFireflyl

Gold Member
We will just file this under "no shit"

Seriously, AMD is no match for Intel or Nvidia. The only way AMD can win is by shrinking their chips, but once the big guys shrink, they're simply more efficient, more powerful, and better designed. For instance, AMD at 7nm is only slightly better than Intel at 14nm+ (repeated). Intel will destroy them at 10nm. Nvidia will destroy them at 8nm.

Sure, but it's good for us that AMD is staying in the game. Without them, Nvidia and Intel have free rein to do whatever they want, and they can keep releasing marginally better hardware year after year instead of developing hardware with real performance boosts. A 5% boost in performance is nothing to brag about, but Intel specifically loved to do just that.
 

BluRayHiDef

Banned
I was going to wait for Big Navi to see if it would include a model that could match or come close to the 3090's performance, but screw it, I'm snagging a 3090 by EVGA when it becomes available. With DLSS 3.0 and these rumors that Big Navi won't touch anything above a 3080, I'm sticking with Nvidia (980Ti -> 1080Ti -> 3090). Besides, I've gotten used to Nvidia's ecosystem and quite like it.
 
Last edited:

tkscz

Member


Timestamped for you.


Ask and ye shall receive. Thanks for that.

AMD open-sources most stuff on the gaming-effects side.

E.g., a lot of the hair you see in major titles was developed as a derivative of TressFX (all the Square Enix, Guerrilla Games titles, etc.).
The SSSR (stochastic screen-space reflections) recently showcased in Marvel's Avengers was AMD's.
The sharpening everyone is harping on about (even the one now in Nvidia's control panel) is from AMD (CAS).

Wow, that's really nice of AMD to keep it all open source. They could probably boost GPU sales if they touted these as proprietary to AMD GPUs.
 

Papacheeks

Banned

Good luck.

$800+ MSRP is what is being thrown around by leaks. The 2080 reference cards were $700+. Go look at the AIB cards: $800+. You think, with the amount of power and cooling required, these are going to sit at $700?

LOL.

Keep dreaming.

I'll be shocked if we see Newegg sell them for anything less than $750 for reference cards. These things' MSRP is going to start at $800. All fingers point to another increase in price. If a 3090 is going for $1,400 on average, expect that to trickle down the price stack.

And I'll add that at $700 the Radeon VII was basically being given away. The memory alone cost $368. You were getting an Instinct card that usually costs $2,000+ for $700. There's a reason the second-hand market on these is so high.

Great render cards.
 
Last edited: