
[DF] AMD Radeon 6800 XT/6800 vs Nvidia GeForce RTX 3080/3070 Review - Which Should You Buy?

"And 10GB VRAM is a joke in 2020. "

But no game uses even 9GB. In 2020. How is 10 a joke, then? For it to be a joke, it would have to have issues due to the amount of RAM right now. Which it doesn't. The fact that at some point 10 gigs won't be enough is a certainty. We will have to see when exactly that point in time is. Will it be in 2021? Or in 2025?
 

Papacheeks

Banned
I think the true test is going to be when Unreal Engine 5 and other newer engines come out. The way RT is implemented in older engines isn't really a great solution. Hopefully, with new engine rewrites, we will start to see it used more.
 

supernova8

Banned
I mean if you could have all the bells and whistles at 1440p, I don't mind. My 4k panel is 60Hz anyway, so there's another limitation for me personally.

I do dislike all the nomenclature that seems to have popped up about cards 'destroying' each other or 'wiping the floor' - one idiot youtuber even said the 6800XT 'slaughtered' the 3080. Slaughtered? Fucksake, is this the language we're using now?

Personally I would still spend a decent amount to get high frame rates at 1080p, because I'm running a 165Hz monitor and the difference is crystal clear to me. Might get myself a 3060 Ti and just play stuff at medium settings.
 

supernova8

Banned
I expect them to refresh Ampere on a better node. AMD is on fire though; they finally have enough cash for R&D, and looking at Zen and RDNA, it seems to be paying off. Their performance jumps year over year are amazing.

Yeah, but AMD is taking the piss on pricing with RDNA2. With Zen they undercut Intel by a lot when they were still weaker in gaming, and now that they're actually better, they still only slightly undercut.

The 6800 XT struggles to beat the 3080 convincingly, and yet we're expected to forego good RT performance and other productivity features (which AMD could've built if they'd chosen to) for a measly $50 saving. Once you're up into the $600+ price bracket, who the hell is looking at the pricing and saying "wow, $50, what a bargain!"?

I agree with the people saying they could've reduced the VRAM and dropped the price to $599 for the 6800 XT. At sub-$600 you have a real dilemma.
 
Last edited:

KungFucius

King Snowflake
"And 10GB VRAM is a joke in 2020. "

But no game uses even 9GB. In 2020. How is 10 a joke, then? For it to be a joke, it would have to have issues due to the amount of RAM right now. Which it doesn't. The fact that at some point 10 gigs won't be enough is a certainty. We will have to see when exactly that point in time is. Will it be in 2021? Or in 2025?

The '10GB is a joke' thing is just an ignorant assessment of a single bullet point. The 10GB is faster, more expensive RAM than the 16GB stuff, so it is clearly better than a hypothetical 10GB RX card would be. Nvidia chose it for a reason, and they are competent engineers.

Questions that I have about VRAM:
  • Is usage the same regardless of GPU type, or are there driver/technology dependencies?
  • Does faster VRAM allow for less VRAM to perform equivalently to more, i.e. is bandwidth a factor in determining the required amount?
  • What is the equivalent VRAM required on an RTX or RX card to match the VRAM used by the consoles for 4K?
 

Ascend

Member
Questions that I have about VRAM:
  1. Is usage the same regardless of GPU type, or are there driver/technology dependencies?
  2. Does faster VRAM allow for less VRAM to perform equivalently to more, i.e. is bandwidth a factor in determining the required amount?
  3. What is the equivalent VRAM required on an RTX or RX card to match the VRAM used by the consoles for 4K?
  1. Driver and compression techniques have an influence, although it is relatively minor compared to asset allocation by the developer.
  2. Faster VRAM increases bandwidth, but an increase in bandwidth does not translate into any reduction in VRAM usage in terms of space (a quick calculation below illustrates this).
  3. Hard to say. There are too many variables. But if we assume that consoles will use 10GB as a baseline due to the XSX's 10GB of fast RAM, you are likely going to need more than that quite soon if your PC settings are going to be higher than the console settings.
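
To make point 2 concrete, here is a quick back-of-the-envelope sketch (my own illustration, using the commonly quoted spec-sheet figures for these two cards, so treat the exact numbers as assumptions) showing how peak bandwidth follows from bus width and per-pin data rate, independently of capacity:

```python
# Minimal sketch: VRAM bandwidth vs. capacity (spec-sheet figures, treat as approximate).
def bandwidth_gb_per_s(bus_width_bits: int, data_rate_gbps_per_pin: float) -> float:
    """Peak theoretical bandwidth in GB/s = (bus width in bytes) * per-pin data rate."""
    return (bus_width_bits / 8) * data_rate_gbps_per_pin

cards = {
    # name: (capacity GB, bus width bits, per-pin data rate Gbps)
    "RTX 3080 (GDDR6X)":  (10, 320, 19.0),
    "RX 6800 XT (GDDR6)": (16, 256, 16.0),
}

for name, (capacity, bus, rate) in cards.items():
    print(f"{name}: {capacity} GB, {bandwidth_gb_per_s(bus, rate):.0f} GB/s peak")

# Approximate result: the 10GB card lands around 760 GB/s, the 16GB card around 512 GB/s
# (before counting the 6800 XT's Infinity Cache). More bandwidth does not make assets
# take up less space, so the required capacity stays the same.
```

In other words, the 10GB card can have the higher peak bandwidth, but that has no bearing on how much data has to fit in VRAM at once.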
 
Last edited:
It was the same on the CPU side until this year.
Nah. Intel is stuck on 14nm AND it's giving at least some competition to AMD even now.
NVIDIA not only has no such problems, they are on fire.

Edit: if you compare AMD performance at the same process node, they don't have a chance versus Intel or NVIDIA.
 
Last edited:
Nah. Intel is stuck on 14nm AND it's giving at least some competition to AMD even now.
NVIDIA not only has no such problems, they are on fire.

Edit: if you compare AMD performance at the same process node, they don't have a chance versus Intel or NVIDIA.

To be fair, RDNA2 was a huge step forward in power efficiency. Previously AMD had a 7nm RDNA 1 GPU that only matched 12nm Turing. Now they outperform Nvidia, which is on Samsung 8nm.

The real test will be next gen, when both are on TSMC 5nm.
 
To be fair, RDNA2 was a huge step forward in power efficiency. Previously AMD had a 7nm RDNA 1 GPU that only matched 12nm Turing. Now they outperform Nvidia, which is on Samsung 8nm.

The real test will be next gen, when both are on TSMC 5nm.
Yeah. I absolutely adore AMD and Lisa Su and was genuinely impressed with RDNA2's performance. However, there are no miracles. NVIDIA is on a worse process node, and some of its silicon goes to RT and Tensor Cores. Intel is in the gutter right now.
There will be a bloodbath between these two, like in the good old days! :-D

Edit: plot twist - Apple wins :messenger_tears_of_joy:
 
Last edited:

Ascend

Member
Edit: if you compare AMD performance at the same process node, they don't have a chance versus Intel or NVIDIA.
I'm not a proponent of making up imaginary situations that are highly unlikely to happen to somehow prove that A is inferior and B is superior. Not to mention that engineering for a smaller node is not a walk in the park; it is the main thing that has made Intel stagnant after all.

That being said, AMD was on the same node with the 5700 XT as it is with the 6800 series. The 5700 XT had the same IPC as Turing, but worse power consumption. nVidia moved to a better node with Ampere; AMD remained on the same one, yet achieved similar performance at lower power with more VRAM.
 

FireFly

Member
Nah. Intel is stuck on 14nm AND it's giving at least some competition to AMD even now.
NVIDIA not only has no such problems, they are on fire.
Intel had about 80% more IPC before Zen, so obviously AMD needed some help to catch up. The point is that closing that gap in three generations of CPUs is an insane achievement, and that it would have put AMD in a strong position even if Intel were much more competitive. In that case we would be making the "Just wait until next year" joke about AMD's CPUs and ignoring the progress they were making. Which illustrates my point: if you're not a performance leader, significant architectural progress is often ignored.

As far as Nvidia goes, yes, they are much more competitive than Intel, so I can't see AMD beating them outright. But as of summer last year, AMD was nearly a year late with the 5700 XT, couldn't compete in the enthusiast segment, and didn't have any ray tracing support at all.

Edit: if you compare AMD performance at the same process node, they don't have a chance versus Intel or NVIDIA.
Zen 3 has IPC equal to or better than Tiger Lake, which is Intel's latest architecture. They also clock similarly when you compare 7nm to 10nm SuperFin. So if they were on the same process node, I think AMD would do just fine. Also note that Intel can reach higher clocks on 14nm than they can on even their latest 10nm process, so in that sense 14nm actually gives them an advantage.
 
I'm not a proponent of making up imaginary situations that are highly unlikely to happen to somehow prove that A is inferior and B is superior. Not to mention that engineering for a smaller node is not a walk in the park; it is the main thing that has made Intel stagnant after all.

That being said, AMD was on the same node with the 5700 XT as it is with the 6800 series. The 5700 XT had the same IPC as Turing, but worse power consumption. nVidia moved to a better node with Ampere; AMD remained on the same one, yet achieved similar performance at lower power with more VRAM.
Intel had about 80% more IPC before Zen, so obviously AMD needed some help to catch up. The point is that closing that gap in three generations of CPUs is an insane achievement, and that it would have put AMD in a strong position even if Intel were much more competitive. In that case we would be making the "Just wait until next year" joke about AMD's CPUs and ignoring the progress they were making. Which illustrates my point: if you're not a performance leader, significant architectural progress is often ignored.

As far as Nvidia goes, yes, they are much more competitive than Intel, so I can't see AMD beating them outright. But as of summer last year, AMD was nearly a year late with the 5700 XT, couldn't compete in the enthusiast segment, and didn't have any ray tracing support at all.


Zen 3 has IPC equal to or better than Tiger Lake, which is Intel's latest architecture. They also clock similarly when you compare 7nm to 10nm SuperFin. So if they were on the same process node, I think AMD would do just fine. Also note that Intel can reach higher clocks on 14nm than they can on even their latest 10nm process, so in that sense 14nm actually gives them an advantage.

Those are fair points. I agree - AMD with Zen 3 and RDNA2 has caught up to the competition. Interesting times!
 

notseqi

Member
I don't think we have seen BF5 RT test anywhere before:


[image: BF5 ray tracing benchmark]
Yeah no, 97fps is not enough. RT still RectalTreatment.
 

MH3M3D

Member
Yeah, but AMD is taking the piss on pricing with RDNA2. With Zen they undercut Intel by a lot when they were still weaker in gaming, and now that they're actually better, they still only slightly undercut.

The 6800 XT struggles to beat the 3080 convincingly, and yet we're expected to forego good RT performance and other productivity features (which AMD could've built if they'd chosen to) for a measly $50 saving. Once you're up into the $600+ price bracket, who the hell is looking at the pricing and saying "wow, $50, what a bargain!"?

I agree with the people saying they could've reduced the VRAM and dropped the price to $599 for the 6800 XT. At sub-$600 you have a real dilemma.

They’re not selling at lower prices unless they have to. Right now the demand is too high. As a company they need to make as much money as possible.
 

Ascend

Member
I’ve always been an Nvidia guy and am waitlisted for a 3090 but man the 6800XT looks like a juicy mama.

Not sure what to do
I think AMD deserves some praise for their achievement, and also deserves to see it back in their pockets so that they can keep improving. Many people bought 1st-gen Ryzen despite it not actually being better than Intel. It ultimately paid off with much better products. I would rather give my money to AMD than to nVidia, which not only is already swimming in money but has become too good at extracting it from its users. It ultimately is your choice though. I can understand going nVidia if RT is your thing.

That being said, I don't encourage buying anything for more than $50 over MSRP.
 

waylo

Banned
I think AMD deserves some praise for their achievement, and also deserves to see it back in their pockets so that they can keep improving. Many people bought 1st-gen Ryzen despite it not actually being better than Intel. It ultimately paid off with much better products. I would rather give my money to AMD than to nVidia, which not only is already swimming in money but has become too good at extracting it from its users. It ultimately is your choice though. I can understand going nVidia if RT is your thing.

That being said, I don't encourage buying anything for more than $50 over MSRP.
At the kind of money being asked, I'm not fucking pity-buying a product. The fuck kinda shit is this?
 
At the kind of money being asked, I'm not fucking pity-buying a product. The fuck kinda shit is this?
Buy for performance, regardless of who has it. I switched from AMD to Intel, and right back to AMD. Same with GPUs, except I've been using Nvidia since my r390x started to show its age. Whoever has the best raytracing, rasterization, and feature set is the one who gets my money. Not because I feel bad for them lol.
 
Last edited:

Ascend

Member
At the kind of money being asked, I'm not fucking pity-buying a product. The fuck kinda shit is this?
Pity buying? If you wanna call it that, that's your business. To me, it's not pity buying.

I'm not buying their product because I'm somehow sorry for AMD. I want competition in the market. AMD brought us some great performance this time around. They might not be as good at RT, but the products deliver on general performance and power consumption, something that AMD was not able to do for quite a while. And they have more VRAM to boot. The choice is quite simple...

1) You buy nVidia and support the company that has been practically a monopoly, that has tried extremely hard to lock out the competition, and that artificially jacked up prices with the RTX 2000 series, just so that you yourself can brag about having 'the best of the best' and feel good about your purchase.
or
2) You buy AMD and support the only other company capable of bringing competition to the market. And even if the card has some shortcomings now (and some advantages), ultimately we will all experience better prices and better cards with better features because the competition was able to put more into R&D.

I am not one of those people who want AMD around just so that I can buy nVidia at lower prices. I want balance in the market, and that will not be achieved by blindly supporting the newest shiny toy.

So, as I said, I vote with my money. You're free to vote with yours. And, of course, voting with my money also means not paying ridiculous prices. When I saw the price of the 6800 XT Nitro+, the first thing I said was: I'm not paying that much for that card. That's called being a conscious consumer.

Buy for performance, regardless of who has it.
AMD has got the performance. So does nVidia. The point is, what else do you care about? Just to have the so-called 'best of the best' so that you can get a dopamine rush for a few weeks? Or do you prefer the gaming landscape to become more sustainable with healthy competition?

Whoever has the best raytracing, rasterization, and feature set is the one who gets my money.
Buying something just because it has the feature set is not a wise decision if you're not going to use the majority of that feature set anyway. If you are, good. But the majority of people won't; they simply buy certain things to feel like they are in the cool kids' club. And that is destructive. So just so you know, buying AMD is supporting both camps, while buying nVidia is supporting monopolistic behavior and thus the downfall of PC gaming. I can already see the laugh reacts happening, and I don't care.

Not because I feel bad for them lol.
Neither do I. I actually feel bad for the ones freely taking the pole even deeper without knowing it.
 
Pity buying? If you wanna call it that, that's your business. To me, it's not pity buying.

I'm not buying their product because I'm somehow sorry for AMD. I want competition in the market. AMD brought us some great performance this time around. They might not be as good at RT, but the products deliver on general performance and power consumption, something that AMD was not able to do for quite a while. And they have more VRAM to boot. The choice is quite simple...

1) You buy nVidia and support the company that has been practically a monopoly, that has tried extremely hard to lock out the competition, and that artificially jacked up prices with the RTX 2000 series, just so that you yourself can brag about having 'the best of the best' and feel good about your purchase.
or
2) You buy AMD and support the only other company capable of bringing competition to the market. And even if the card has some shortcomings now (and some advantages), ultimately we will all experience better prices and better cards with better features because the competition was able to put more into R&D.

I am not one of those people who want AMD around just so that I can buy nVidia at lower prices. I want balance in the market, and that will not be achieved by blindly supporting the newest shiny toy.

So, as I said, I vote with my money. You're free to vote with yours. And, of course, voting with my money also means not paying ridiculous prices. When I saw the price of the 6800 XT Nitro+, the first thing I said was: I'm not paying that much for that card. That's called being a conscious consumer.


AMD has got the performance. So does nVidia. The point is, what else do you care about? Just to have the so-called 'best of the best' so that you can get a dopamine rush for a few weeks? Or do you prefer the gaming landscape to become more sustainable with healthy competition?


Buying something just because it has the feature set is not a wise decision if you're not going to use the majority of that feature set anyway. If you are, good. But the majority of people won't; they simply buy certain things to feel like they are in the cool kids' club. And that is destructive. So just so you know, buying AMD is supporting both camps, while buying nVidia is supporting monopolistic behavior and thus the downfall of PC gaming. I can already see the laugh reacts happening, and I don't care.


Neither do I. I actually feel bad for the ones freely taking the pole even deeper without knowing it.
Keep shilling till you make it! More power to you 😂.

Having the best rasterization, the best raytracing, and DLSS support is all the feature set you need. RTX Voice, the webcam tech, and all the other features are icing on the cake that sweetens the deal even more. It's sad that you choose to try and paint Nvidia in a bad light just because you are upset that they are the performance kings.

I don't care how many sales, how much hype, etc. a product may have. I won't shit on it if it's the best out there. That's what you call blatant fanboyism, and it doesn't help technology get better or help uninformed people with their purchasing decisions. You love AMD extremely strongly, which is also fine, but it doesn't help with discussions when you praise it so highly yet shit on Nvidia, when they have the better hardware right now.
 

Ascend

Member
It's sad that you choose to try and paint Nvidia in a bad light just because you are upset that they are the performance kings.
Thanks again for showing your lack of reading comprehension, your lack of ability to listen, your lack of ability to have a mature conversation, your lack of interest in other people's views, and your habit of constantly putting words in other people's mouths.

I don't care how many sales, how much hype, etc. a product may have.
Sure you don't...

I won't shit on it if it's the best out there.
So I guess you're shitting on the RTX 3080, RTX 3070, 2080Ti etc because they're not an RTX 3090...?

That's what you call blatant fanboyism, and it doesn't help technology get better or help uninformed people with their purchasing decisions.
I fully agree!!

You love AMD extremely strongly, which is also fine, but it doesn't help with discussions when you praise it so highly yet shit on Nvidia, when they have the better hardware right now.
If you actually had reading comprehension, you would have noticed that some things that matter go beyond just the hardware itself.
 

waylo

Banned
To say the majority won't use RTX or DLSS is fucking ridiculous. Those are killer features that help push Nvidia as the better value. If we're talking about RTX Voice, or the webcam thing, then yeah, most will probably not find a ton of use in those things. However, they're nice extras that AMD doesn't have.

But having good-to-great ray tracing performance and getting FPS for free with no image loss via DLSS are both key features of the cards and something AMD is sorely lacking.

Your love of AMD is kinda scary dude.
 
Am I the only one who doesn't care about raytracing? Higher FPS at 1080p and 1440p, especially with Smart Access Memory, makes the 6800 XT way more enticing to me.
 
To say the majority won't use RTX or DLSS is fucking ridiculous. Those are killer features that help push Nvidia as the better value. If we're talking about RTX Voice, or the webcam thing, then yeah, most will probably not find a ton of use in those things. However, they're nice extras that AMD doesn't have.

But having good-to-great ray tracing performance and getting FPS for free with no image loss via DLSS are both key features of the cards and something AMD is sorely lacking.

Your love of AMD is kinda scary dude.
I'm glad that the majority understand that Nvidia is doing much better than AMD at the moment, and aren't white knighting for the losing company. It's a shame that some people are so deluded and can't make educated purchases, even after seeing the reality of things.

DLSS has changed my whole opinion on running games below my display resolution and upscaling them to my native res. Because if it wasn't for that, I'd leave raytracing off. A few more days till Cyberpunk comes out, and there's only one place to play it: PC, with DLSS and the only family of GPUs with raytracing.
 

BluRayHiDef

Banned
"And 10GB VRAM is a joke in 2020. "

But no game uses even 9GB. In 2020. How is 10 a joke, then? For it to be a joke, it would have to have issues due to the amount of RAM right now. Which it doesn't. The fact that at some point 10 gigs won't be enough is a certainty. We will have to see when exactly that point in time is. Will it be in 2021? Or in 2025?

Your assertion that no game uses even 9GB of VRAM is false. Watch Dogs: Legion uses 9GB of VRAM at 4K with maximum settings. When I had an RTX 3080, the VRAM usage - according to MSI Afterburner - was over 9GB; on both my RTX 3090s, the VRAM usage is a little higher.

Note: Disregard the MSI Afterburner overlay that lists the graphics card as an RTX 3090 in the first screenshot; I hadn't bothered to rename the listing when I reinstalled my RTX 3080 to run the benchmark.

[screenshots: Watch Dogs: Legion benchmark with MSI Afterburner VRAM overlay, RTX 3080 and RTX 3090]

Just imagine how much VRAM GTA 6 will need when it's finally released. 10GB is going to be a problem for 4K gaming with maximum settings very soon.
 

regawdless

Banned
Your assertion that no game uses even 9GB of VRAM is false. Watch Dogs: Legion uses 9GB of VRAM at 4K with maximum settings. When I had an RTX 3080, the VRAM usage - according to MSI Afterburner - was over 9GB; on both my RTX 3090s, the VRAM usage is a little higher.

Note: Disregard the MSI Afterburner overlay that lists the graphics card as an RTX 3090 in the first screenshot; I hadn't bothered to rename the listing when I reinstalled my RTX 3080 to run the benchmark.

[screenshots: Watch Dogs: Legion benchmark with MSI Afterburner VRAM overlay, RTX 3080 and RTX 3090]

Just imagine how much VRAM GTA 6 will need when it's finally released. 10GB is going to be a problem for 4K gaming with maximum settings very soon.

You have to use Special K or, in Afterburner, the per-process VRAM monitoring. Otherwise it shows allocation, not usage. I'll try it later tonight (Europe, so... in like 12 hours) and measure how much it really uses.
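
For anyone who wants to check this outside of Afterburner, here is a minimal sketch (my own illustration, not something from the posts above) using the NVML bindings for Python that contrasts the device-wide "used" figure, which reflects total allocation on the card, with the per-process figures reported by the driver:

```python
# Minimal sketch: device-wide VRAM "used" (allocation) vs. per-process usage via NVML.
# Requires the pynvml bindings (pip install nvidia-ml-py). Per-process figures may be
# unavailable on some drivers/platforms, in which case usedGpuMemory comes back as None.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"Device-wide: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")

# Graphics processes (games) and compute processes are reported separately.
procs = (pynvml.nvmlDeviceGetGraphicsRunningProcesses(handle)
         + pynvml.nvmlDeviceGetComputeRunningProcesses(handle))
for p in procs:
    used = "n/a" if p.usedGpuMemory is None else f"{p.usedGpuMemory / 2**30:.1f} GiB"
    print(f"PID {p.pid}: {used}")

pynvml.nvmlShutdown()
```

The gap between the device-wide number and the game's own per-process number is roughly what the "allocation vs. usage" argument is about.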
 

TTOOLL

Member
RT on PC is not there yet, still very expensive for negligible differences.

These Nvidia cards will be as trash as the AMD cards in terms of RT in 2 years. Games will be more demanding. Do not buy for RT only yet.
 

TTOOLL

Member
Except ray tracing on Ampere is absolutely here. Right now. Not only is it here, DLSS permits 4K + ray tracing at over 60 frames per second.

[benchmark screenshots: 4K ray tracing + DLSS performance]

That's what I'm saying: in 2 years they will be running new titles at 40 or 30fps, and you'll say it's not good enough anymore. I mean, if people have money to burn like that, let them do it.
 

psorcerer

Banned
Does faster VRAM allow for less VRAM to perform equivalently to more, i.e. is bandwidth a factor in determining the required amount?

When DirectStorage-based games come out, they will stream assets into VRAM. And you will need at least 16GB to get console parity. To have better fps/res you will need more than that.
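
For a rough sense of where a figure like that comes from, here is a back-of-the-envelope sketch (my own arithmetic using the publicly stated Xbox Series X memory layout, not something from the post above):

```python
# Back-of-the-envelope: Xbox Series X memory pools (publicly stated figures, approximate).
xsx_fast_pool_gb = 10.0   # GPU-optimal memory at 560 GB/s
xsx_slow_pool_gb = 6.0    # standard memory at 336 GB/s
xsx_os_reserved_gb = 2.5  # roughly reserved for the OS

available_to_games = xsx_fast_pool_gb + xsx_slow_pool_gb - xsx_os_reserved_gb
print(f"Series X memory available to games: ~{available_to_games:.1f} GB (shared CPU+GPU)")

# A PC splits this working set across system RAM and VRAM, so "parity" depends on how
# much of it ends up GPU-resident; the 16GB claim assumes most of it does.
```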
 

BluRayHiDef

Banned
That's what I'm saying: in 2 years they will be running new titles at 40 or 30fps, and you'll say it's not good enough anymore. I mean, if people have money to burn like that, let them do it.

Your argument is moot. All aspects of a graphics card's performance degrade over time; in two years' time, every card being released now will play the new games of that time less impressively - and not only in terms of ray tracing but also rasterization. That's normal.

What matters now is that with Ampere, games can be played at respectable frame rates with high-quality ray tracing enabled.
 

spyshagg

Should not be allowed to breed
In less than two years' time, this will be the choice:


- 6800 series owners: Disable RT options in the game.

- 3080 owners: Drop resolution and texture sizes due to VRAM limits.

- 3070 owners: Drop resolution and texture sizes due to VRAM limits. Will likely have to disable RT for performance, like the 6800 does today.


The perfect card for both RT and VRAM is not out yet. Maybe next year. Today all we have are compromises. Choose based on what you really see yourself playing, not what brand you like.
 
Last edited:

TTOOLL

Member
Your argument is moot. All aspects of a graphics card's performance degrade over time; in two years' time, every card being released now will play the new games of that time less impressively - and not only in terms of ray tracing but also rasterization. That's normal.

What matters now is that with Ampere, games can be played at respectable frame rates with high-quality ray tracing enabled.

I know it degrades over time, but because RT is not mature enough, these cards' performance will degrade even more and even faster over a shorter time.
 
Last edited:

BluRayHiDef

Banned
In less than two years' time, this will be the choice:


- 6800 series owners: Disable RT options in the game.

- 3080 owners: Drop resolution and texture sizes due to VRAM limits.

- 3070 owners: Drop resolution and texture sizes due to VRAM limits. Will likely have to disable RT for performance, like the 6800 does today.


The perfect card for both RT and VRAM is not out yet. Maybe next year. Today all we have are compromises. Choose based on what you really see yourself playing, not what brand you like.

Ahem, RTX 3090.
 

TTOOLL

Member
In what way is it not mature enough? Next-gen consoles have 2060-level ray tracing performance. Ampere will not suffer ray tracing deterioration that's faster than the natural aging of the card as time goes on.

Directly comparing PC and consoles is already questionable; doing it with consoles that have just launched is even worse. In two years they will be doing things that a 2060 will never be able to do. I mean, they already do.


There's no proof of that.

Evolution in RT will definitely be faster than in other areas that are more mature.
 
Last edited: