
[DF] AMD Radeon 6800 XT/6800 vs Nvidia GeForce RTX 3080/3070 Review - Which Should You Buy?

Bullet Club

Member


BIG NAVI!! Delayed a touch owing to the next-gen console deluge, here's the Digital Foundry verdict on the RX 6800 XT and the RX 6800. Priced below 3080 and above 3070 respectively, AMD's pricing pretty much falls into line with rasterisation performance. There's some innovative engineering here, and full support for next-gen features including VRS and ray tracing... but just how well do the new AMD cards straddle the generational divide?
 

Armorian

Banned
I don't think we've seen a BF5 RT test anywhere before:


[image: BF5 ray tracing benchmark chart]
 

GamingArena

Member
Ouch how? That's literally a single still made to concoct a narrative. If you watched the video, these cards are essentially on par like they were marketed to be.

As long as you don't play any current or future RT games, or care about G-Sync, DLSS, RTX Voice, RTX Broadcast, NVENC, Tensor Cores, dedicated RT hardware, or much faster OpenCL/CUDA rendering performance (Blender, V-Ray, Octane), as much as 2x that of the AMD counterpart, i.e. if you only care about rasterization performance, then yes, go ahead and buy AMD.

Personally, I wouldn't touch AMD even at half of Nvidia's price if it's missing all of the above!
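For what it's worth, the rendering gap is something you can measure yourself rather than take anyone's word for. A minimal sketch in Python, hedged heavily: the scene file is a placeholder, and it assumes a Blender build of this era (2.8x/2.9x) where Cycles still exposes both the CUDA and OpenCL backends.

```python
# Minimal sketch: time a single-frame Cycles render on each backend.
# "scene.blend" is a placeholder; --cycles-device OPENCL assumes a
# Blender build of this era that still ships the OpenCL backend.
import subprocess
import time

def render_time(device: str) -> float:
    """Render frame 1 of the scene headlessly and return wall-clock seconds."""
    start = time.perf_counter()
    subprocess.run(
        ["blender", "-b", "scene.blend", "-E", "CYCLES", "-f", "1",
         "--", "--cycles-device", device],
        check=True, capture_output=True,
    )
    return time.perf_counter() - start

for dev in ("CUDA", "OPENCL"):
    print(f"{dev}: {render_time(dev):.1f}s")
```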
 
Lots of rumors floating around about Nvidia's Hopper and AMD's RDNA3. Will we see multi-chip modules? I think so. If AMD can have their own DLSS and RT cores, then they'll be right back in it with team green. In my opinion, they're not exactly there yet, but they are working on a DLSS implementation, and RDNA3 will be the closest battle yet. I don't plan on upgrading my PC until Hopper/RDNA3 anyway, so I'm in no hurry.
 

nemiroff

Gold Member
Oh wow, look at that, the RTX 3080 with "almost" double the framerate in RT. That stings. Good luck to those who dare go the rasterization path at a time when RT and ML look like the future. But of course, since not all games support it right now, it might end up a colossal dilemma for some.
 
Last edited:

regawdless

Banned
Ouch how? That's literally a single still made to concoct a narrative. If you watched the video, these cards are essentially on par like they were marketed to be.

Huh, interesting take. But they are not on par, because AMD's cards are significantly slower in raytracing, depending on the intensity. And they have not been marketed as being on par in raytracing, at least I haven't seen it. They've been marketed as being on par in traditional rasterization performance. Which they are, more or less.

And my "ouch" was only regarding their raytracing performance in BF5.

Chill.
 

Kholinar

Banned
As long as you don't play any current or future RT games, or care about G-Sync, DLSS, RTX Voice, RTX Broadcast, NVENC, Tensor Cores, dedicated RT hardware, or much faster OpenCL/CUDA rendering performance (Blender, V-Ray, Octane), as much as 2x that of the AMD counterpart, i.e. if you only care about rasterization performance, then yes, go ahead and buy AMD.

Personally, I wouldn't touch AMD even at half of Nvidia's price if it's missing all of the above!

The funniest thing about Nvidia fanboys is that they always think the roles are just fixed in perpetuity. It's astonishing how quickly fanboys discarded rasterisation in favour of RT, a feature they used to rake over the coals for how demanding it was on GPUs, once AMD took the rasterisation crown. Then DLSS was swung above everyone's heads, and guess what? Now AMD have their own equivalent in the works, to be released soon.

G-Sync is basically the same as FreeSync, and the rest of the features you mentioned are only fit for streamers and 3D modellers, not gamers.

I wouldn't worry about AMD's RT. This is quite literally their first attempt at it. What happens when AMD starts to match Nvidia in RT? Will you still be singing the same song? Or did I mistake you, and you're in reality just a non-partisan individual willing to throw money at whichever company serves your interests best? I wonder if we'll see an AMD GPU in your PC then.

If you think Nvidia will remain unchallenged solely because they have been for so long, then you're wrong. AMD is now competitive, and they'll continue to be so for the foreseeable future.
 

Bolivar687

Banned
I'm buying whichever one I can get my hands on first. It's not just that the number of games I will personally play using RT is small, but that even on that list, the implementation in those games does not impress me at all, certainly not enough to warrant the performance hit. I expect the 6800 series' 16GB will end up being much more relevant and appreciated this hardware cycle. By the time RT becomes integral, the 3080 will not be one of the cards you'll want to use to run it.

As it stands now, it's merely the current iteration of the overhyped feature Nvidia enthusiasts perennially use to claim that no one should be allowed to buy Radeon. We used to incessantly hear about how important frametimes, overclocking headroom, and power efficiency were, but now that Big Navi outclasses Ampere in all three of those categories, I guarantee you will not hear a single word about it from Nvidia fans this cycle. Then it was G-Sync, but having used it myself for years now, I can't honestly say it's any smoother than when I used AMD's driver-level VSync solution for variable framerates at high fps. After that, some users on GAF legitimately claimed they could not possibly live without Ansel going forward, but now Nvidia doesn't even bother fully implementing it anymore, not even in their own partnered games.

The level of warrioring in the GPU space is unlike anything else in the gaming industry. The console wars can't even come close because of the extremes of dishonesty which Nvidia fans use in their arguments. Personally I find it exciting that I genuinely don't know which GPU I'm going to get; I just wish they were actually available.
 
Last edited:

diffusionx

Gold Member
The funniest thing about Nvidia fanboys is that they always think the roles are just fixed in perpetuity. It's astonishing how quickly fanboys discarded rasterisation in favour of RT, a feature they used to rake over the coals for how demanding it was on GPUs, once AMD took the rasterisation crown. Then DLSS was swung above everyone's heads, and guess what? Now AMD have their own equivalent in the works, to be released soon.

G-Sync is basically the same as FreeSync, and the rest of the features you mentioned are only fit for streamers and 3D modellers, not gamers.

I wouldn't worry about AMD's RT. This is quite literally their first attempt at it. What happens when AMD starts to match Nvidia in RT? Will you still be singing the same song? Or did I mistake you, and you're in reality just a non-partisan individual willing to throw money at whichever company serves your interests best? I wonder if we'll see an AMD GPU in your PC then.

If you think Nvidia will remain unchallenged solely because they have been for so long, then you're wrong. AMD is now competitive, and they'll continue to be so for the foreseeable future.

If AMD wants to challenge and usurp Nvidia, they are more than welcome to. IMO this 6800 series just doesn't measure up; it's not good enough. I am saying this as someone who bought a Ryzen 1600, back when those chips got great reviews with the caveat that Intel was still a better buy for gaming. I didn't mind sacrificing a bit of CPU performance to stick it to Intel. I'm glad I did that, but for GPUs AMD is still too far behind. I would say they can use this GPU as a building block, but unlike Intel back then, Nvidia is not standing still. They still have work to do.
 
Last edited:

GamingArena

Member
The funniest thing about Nvidia fanboys is that they always think the roles are just fixed in perpetuity. It's astonishing how quickly fanboys discarded rasterisation in favour of RT, a feature they used to rake over the coals for how demanding it was on GPUs, once AMD took the rasterisation crown. Then DLSS was swung above everyone's heads, and guess what? Now AMD have their own equivalent in the works, to be released soon.

G-Sync is basically the same as FreeSync, and the rest of the features you mentioned are only fit for streamers and 3D modellers, not gamers.

I wouldn't worry about AMD's RT. This is quite literally their first attempt at it. What happens when AMD starts to match Nvidia in RT? Will you still be singing the same song? Or did I mistake you, and you're in reality just a non-partisan individual willing to throw money at whichever company serves your interests best? I wonder if we'll see an AMD GPU in your PC then.

If you think Nvidia will remain unchallenged solely because they have been for so long, then you're wrong. AMD is now competitive, and they'll continue to be so for the foreseeable future.

No one is discarding rasterization; the difference there is negligible, placebo-level, whichever side you look at, while the rest of the options are missing, so I'm not sure what you're saying. And personally I use all of the features I mentioned, including rendering, so for someone like me AMD is two gens behind.
 
Last edited:
The funniest thing about Nvidia fanboys is that they always think the roles are just fixed in perpetuity. It's astonishing how quickly fanboys discarded rasterisation in favour of RT, a feature they used to rake over the coals for how demanding it was on GPUs, once AMD took the rasterisation crown. Then DLSS was swung above everyone's heads, and guess what? Now AMD have their own equivalent in the works, to be released soon.

G-Sync is basically the same as FreeSync, and the rest of the features you mentioned are only fit for streamers and 3D modellers, not gamers.

I wouldn't worry about AMD's RT. This is quite literally their first attempt at it. What happens when AMD starts to match Nvidia in RT? Will you still be singing the same song? Or did I mistake you, and you're in reality just a non-partisan individual willing to throw money at whichever company serves your interests best? I wonder if we'll see an AMD GPU in your PC then.

If you think Nvidia will remain unchallenged solely because they have been for so long, then you're wrong. AMD is now competitive, and they'll continue to be so for the foreseeable future.


But AMD didn't take the rasterization crown. The GeForce 3090 is 20% faster at 4K on average, and the GeForce 3080 is 7.4% faster. Not a single review site in the world found AMD winning in rasterization at 4K.

For 1440p, I looked at 13 websites: 10 of them had the 3080 winning, and two had them tied. Hardware Unboxed seems to be the only site that has AMD winning at 1440p.
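As an aside, win counts are a blunt way to read a survey like that; the standard way to boil down per-site results is a geometric mean of the performance ratios. A quick sketch with entirely made-up placeholder numbers (not the actual review data):

```python
# Illustrative only: aggregating a multi-site survey like the one above.
# The ratios below are made-up placeholders (3080 fps / 6800 XT fps per site),
# NOT the actual review data; >1.0 means the 3080 is ahead.
from statistics import geometric_mean

site_ratios = [1.05, 1.04, 1.03, 1.03, 1.02, 1.02, 1.01,
               1.04, 1.05, 1.06, 1.00, 1.00, 0.98]

# Performance ratios multiply rather than add, so reviewers aggregate
# them with a geometric mean instead of an arithmetic one.
margin_pct = (geometric_mean(site_ratios) - 1) * 100
print(f"3080 ahead by {margin_pct:.1f}% on average across {len(site_ratios)} sites")
```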

AMD's first attempt at ray tracing is behind Nvidia's Turing from more than two years ago. They're below the 2080 Ti in ray-traced games. That doesn't bode well for the future, when Nvidia has a better, more performant design, a two-year head start, and bigger interest and investment compared to AMD.

Nvidia is also heavily invested in AI and has dedicated hardware for DLSS in its Tensor Cores, while AMD will most likely have a software implementation. This means that whatever they come up with as a DLSS alternative, it can't match what AI is doing on separate, dedicated hardware.

At this point, independent hardware reviews have the 6800XT slower than Nvidia in traditional rasterization, slower in ray tracing, lacking many features the competition has, and selling for the same price or higher. It just doesn't seem like a product that has much point at its current price. If you put out an inferior offer to the competition, the price must reflect that. At the moment it doesn't.

Lacking next-gen features like DLSS, or delivering poor performance in the ones they do have, like ray tracing, is a big reason to opt for the competition. Ray tracing is already in most of the big games this quarter. It's been announced for other big hitters next year: Hitman 3, Far Cry 6, even The Witcher 3 is getting it. Call of Duty has had it for two years and surely will next year too.

Just imagine shelling out 800 or 900 dollars for an AMD 6800XT Strix, then firing up Cyberpunk, the most anticipated game of the entire year, and all the ray tracing options are greyed out, missing, or the performance isn't there. You just spent almost a thousand dollars on a card, and you have to turn off the next-gen features because their solution is too weak to render more than one ray tracing feature at a time.

Maybe it will be better with RDNA 3. But Hopper will be the mountain they have to climb at that point.
 
Last edited:
AMD did a lot of work to catch up on rasterization while also sort of half-assing an RT solution that works completely differently from how Nvidia implements it. The reality is that Nvidia stopped caring about rasterization two gens ago, which is what allowed AMD to catch up there, but Nvidia are now two gens ahead of AMD in RT.

If you buy $700+ video cards to play yesterday's games, then sure, the AMD solution is as good as the Nvidia one. If you happen to buy those really expensive video cards to play tomorrow's games, well, the Nvidia solution is the only sane choice.

Cyberpunk will be the first test of this. I'm already looking forward to the RT benchmarks in that one.
 
Last edited:

regawdless

Banned
AMD did a lot of work to catch up on rasterization while also sort of half-assing an RT solution that works completely differently from how Nvidia implements it. The reality is that Nvidia stopped caring about rasterization two gens ago, which is what allowed AMD to catch up there, but Nvidia are now two gens ahead of AMD in RT.

If you buy $700+ video cards to play yesterday's games, then sure, the AMD solution is as good as the Nvidia one. If you happen to buy those really expensive video cards to play tomorrow's games, well, the Nvidia solution is the only sane choice.

Cyberpunk will be the first test of this. I'm already looking forward to the RT benchmarks in that one.

I agree, except I wouldn't say that AMD is two gens behind in raytracing. They're one gen behind. Which is significant, because raytracing on the 20xx cards was often not feasible for me, performance-wise.
 

supernova8

Banned
Basically these 6800 cards are a massive leap in performance compared to the 5700 cards, but once you compare them to the 30-series cards (particularly in RT titles) they kinda fall apart.

I suppose it depends on what you think is going to be more important over the next few years.
(A) More VRAM?
(B) More hardware ray tracing capability?

If you think A then you go for AMD (or wait for the 3080 20GB, however much that'll cost).
If you think B you go for NVIDIA.

Also, a lot of the productivity stuff (NVENC, RTX audio, etc.) is still better on the NVIDIA cards, so yeah...

I do like the AMD cards, but they'd only be attractive at a much lower price than NVIDIA's offering.
 

reksveks

Member
I am buying a GPU primarily for Cyberpunk, so I'm going for Nvidia, and I would also like to convert all my h264 videos into h265, so that's another plus for Nvidia, it seems.
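For reference, that kind of batch conversion is typically driven through ffmpeg's hevc_nvenc encoder. A minimal sketch, with the directory and naming scheme as placeholders (assumes an ffmpeg build with NVENC support and an Nvidia GPU):

```python
# Minimal sketch of the h264 -> h265 batch conversion described above, using
# ffmpeg's NVENC HEVC encoder. The "videos" directory and the "_hevc" naming
# scheme are placeholders.
import subprocess
from pathlib import Path

for src in Path("videos").glob("*.mp4"):
    dst = src.with_name(src.stem + "_hevc.mp4")
    subprocess.run(
        ["ffmpeg", "-i", str(src),
         "-c:v", "hevc_nvenc",  # hardware HEVC encode on the GPU
         "-c:a", "copy",        # pass the audio stream through untouched
         str(dst)],
        check=True,
    )
```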
 
Basically these 6800 cards are a massive leap in performance compared to the 5700 cards, but once you compare them to the 30-series cards (particularly in RT titles) they kinda fall apart.

I suppose it depends on what you think is going to be more important over the next few years.
(A) More VRAM?
(B) More hardware ray tracing capability?

If you think A then you go for AMD (or wait for the 3080 20GB, however much that'll cost).
If you think B you go for NVIDIA.

Also, a lot of the productivity stuff (NVENC, RTX audio, etc.) is still better on the NVIDIA cards, so yeah...

I do like the AMD cards, but they'd only be attractive at a much lower price than NVIDIA's offering.


I find that AMD made a series of weird design decisions with these cards. They put a large amount of memory on them, a quantity of which at most half is used in the worst cases right now, usually less. They prioritized rasterization performance almost completely, when it was clear from Nvidia's movements, and even from the consoles powered by AMD's own hardware, that ray tracing is the next step in graphics features. They allocated a lot of silicon to their Infinity Cache, which helps them immensely at lower resolutions but isn't enough to offset the narrow bus at 4K. The 3080 has nearly 50% higher memory bandwidth than the 6800XT, which is kinda ridiculous when you sell your cards as 4K-capable, only to have them bottlenecked by your own design precisely at that resolution. Nvidia has had DLSS for more than two years, and AMD still doesn't have something to offset that. Then you look at the prices, which are the same as or higher than a 3080's.
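The bandwidth figure is easy to verify from the published specs (19 Gbps GDDR6X on a 320-bit bus for the 3080, 16 Gbps GDDR6 on a 256-bit bus for the 6800 XT); a quick sketch of the arithmetic:

```python
# Checking the bandwidth claim from the published specs:
# peak bandwidth (GB/s) = data rate per pin (Gbps) * bus width (bits) / 8.
def bandwidth_gbs(gbps_per_pin: float, bus_bits: int) -> float:
    return gbps_per_pin * bus_bits / 8

rtx_3080  = bandwidth_gbs(19.0, 320)  # GDDR6X, 320-bit -> 760 GB/s
rx_6800xt = bandwidth_gbs(16.0, 256)  # GDDR6, 256-bit  -> 512 GB/s

print(f"3080: {rtx_3080:.0f} GB/s, 6800 XT: {rx_6800xt:.0f} GB/s")
print(f"ratio: {rtx_3080 / rx_6800xt:.2f}x")  # ~1.48x, i.e. nearly 50% higher
```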

It's a weird product. They have designed an expensive next-gen product that doesn't handle next-gen features too well and instead runs last-gen software very well. They didn't have the right priorities with these cards, I feel.

I'm hoping that with the next series of cards, both Nvidia and AMD focus on ray tracing performance, not on being able to run the five-year-old Witcher 3 at 200 frames. I don't care about that. They need to offset the ray tracing cost as much as possible so we can have more and more ray tracing features in games without the huge performance impact.
 
Last edited:

supernova8

Banned
I find that AMD made a series of weird design decisions with these cards. They put a large amount of memory on them, a quantity of which at most half is used in the worst cases right now, usually less. They prioritized rasterization performance almost completely, when it was clear from Nvidia's movements, and even from the consoles powered by AMD's own hardware, that ray tracing is the next step in graphics features. They allocated a lot of silicon to their Infinity Cache, which helps them immensely at lower resolutions but isn't enough to offset the narrow bus at 4K. The 3080 has nearly 50% higher memory bandwidth than the 6800XT, which is kinda ridiculous when you sell your cards as 4K-capable, only to have them bottlenecked by your own design precisely at that resolution. Nvidia has had DLSS for more than two years, and AMD still doesn't have something to offset that. Then you look at the prices, which are the same as or higher than a 3080's.

It's a weird product. They have designed an expensive next-gen product that doesn't handle next-gen features too well and instead runs last-gen software very well. They didn't have the right priorities with these cards, I feel.

I'm hoping that with the next series of cards, both Nvidia and AMD focus on ray tracing performance, not on being able to run the five-year-old Witcher 3 at 200 frames. I don't care about that. They need to offset the ray tracing cost as much as possible so we can have more and more ray tracing features in games without the huge performance impact.

Yeah generally agree with what you're saying.

Basically these AMD cards are good, but they don't give sufficient reason to switch to AMD from NVIDIA, especially at the high end. Considering how relatively bad RT is on the AMD cards, I wonder if even the 3060 Ti would edge out the top-tier AMD cards in ray tracing. Kinda embarrassing if so.

I'm really interested to see what happens with AMD's FidelityFX. It might be the case that the upscaling work done on the consoles via the RDNA2 GPU will mean that when they do roll out FidelityFX it'll instantly support far more games than DLSS does, but I could be making that up completely. I have no idea but I wouldn't be surprised if there's a benefit to having RDNA2 powering the consoles.
 

supernova8

Banned
For those that say they don't care about RT, not gonna lie, I chuckled a bit imagining they are going to play Cyberpunk without any RT effects. Just don't complain about it looking flat :messenger_winking:

Here's a video that actually shows the game without RTX. Personally I'd say it only looks significantly better in some parts and everywhere else the difference is negligible.

If the only way we see the benefits of ray tracing is by them putting mirrors and puddles everywhere, I'd say that's evidence that the current methods of faking it are pretty good.
 

regawdless

Banned
Basically these 6800 cards are a massive leap in performance compared to the 5700 cards, but once you compare them to the 30-series cards (particularly in RT titles) they kinda fall apart.

I suppose it depends on what you think is going to be more important over the next few years.
(A) More VRAM?
(B) More hardware ray tracing capability?

If you think A then you go for AMD (or wait for the 3080 20GB, however much that'll cost).
If you think B you go for NVIDIA.

Also, a lot of the productivity stuff (NVENC, RTX audio, etc.) is still better on the NVIDIA cards, so yeah...

I do like the AMD cards, but they'd only be attractive at a much lower price than NVIDIA's offering.

More VRAM that is way slower, in a card that has way less bandwidth, so the "more VRAM" argument isn't that straightforward and isn't a clear "win" for AMD, because a ton of slow VRAM doesn't help much if the card can't really take advantage of it. Even with Infinity Cache.

Don't expect the 3080 to have any VRAM issues in the next four years when even Flight Simulator only really uses 4GB at 4K max.
 

TriSuit666

Banned
The funniest thing about Nvidia fanboys is that they always think the roles are just fixed in perpetuity. It’s astonishing how quickly fanboys discarded rasterisation in favour of RT - a feature that they used to rake over the coals over how demanding it was for GPUs - once AMD took the rasterisation crown. Then DLSS was swung above everyone’s heads and guess what? Now AMD have their own equivalent in the works to be released soon.

G-SYNC is basically the same as FreeSync, and the rest of the features you mentioned are only fit for streamers and 3D modellers, not gamers.

I wouldn’t worry about AMD’s RT. This is quite literally their first attempt at it. What happens when AMD starts to match Nvidia in RT? Will you still be singing the same song? Or did I mistake you, and you’re in reality just a non-partisan individual willing to throw money at the company they serves your interest best? I wonder if we’ll see an AMD GPU in your PC then.

if you think Nvidia will remain unchallenged solely due to the fact that they have been for so long, then you’re wrong. AMD is now competitive, and they’ll continue to be so for the foreseeable future.

I dunno; for me, I want the new NVENC features for file playback, and if you care to check the reviews that touch on that aspect, the AMD cards are exceptionally weaker there. And in certain apps I use, there are significant problems with AMD cards due to the drivers.

I also kinda agree with the DF assessment that, despite all the fucking hyperbole, the jury is still out on 16GB and whether the cards can utilise it fully without significant penalty.
 

supernova8

Banned
I dunno; for me, I want the new NVENC features for file playback, and if you care to check the reviews that touch on that aspect, the AMD cards are exceptionally weaker there. And in certain apps I use, there are significant problems with AMD cards due to the drivers.

I also kinda agree with the DF assessment that, despite all the fucking hyperbole, the jury is still out on 16GB and whether the cards can utilise it fully without significant penalty.

Yeah, I was occupying a first-class seat on the Big Navi hype train, but it turns out they are just "good" as opposed to completely wiping the floor. They are seemingly really good at 1080p/1440p, but who the hell would spend $500 on a GPU for that resolution?
 

FireFly

Member
They allocated a lot of silicon to their Infinity Cache, which helps them immensely at lower resolutions but isn't enough to offset the narrow bus at 4K. The 3080 has nearly 50% higher memory bandwidth than the 6800XT, which is kinda ridiculous when you sell your cards as 4K-capable, only to have them bottlenecked by your own design precisely at that resolution.
It's not clear they are bottlenecked at 4K. They have the same dropoff going from 1440p to 4K as all the other cards apart from the 3080, which has massively more shader power. And the 6800 has a better bandwidth-to-compute ratio yet doesn't do any better comparatively at 4K.
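To make that dropoff comparison concrete, here's the calculation with purely hypothetical fps numbers (placeholders, not measured data): a card is "bottlenecked at 4K" only if it retains a noticeably smaller share of its 1440p frame rate than its peers.

```python
# Sketch of the "dropoff" comparison: how much of its 1440p frame rate each
# card keeps at 4K. The fps numbers are hypothetical placeholders, not
# measured results.
results = {               # (1440p fps, 4K fps)
    "RTX 3080":   (150, 95),
    "RX 6800 XT": (155, 92),
    "RX 6800":    (135, 80),
}

for card, (fps_1440, fps_4k) in results.items():
    retained = fps_4k / fps_1440 * 100
    print(f"{card}: keeps {retained:.0f}% of its 1440p performance at 4K")
```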

As far as their rationale goes: firstly, saving power gives them greater performance, since they are limited by power before they are limited by die size. And secondly, mobile parts are likely a big factor. With this design, they can probably get 5600 XT-level performance in a regular laptop form factor.
 

kraspkibble

Permabanned.
My money would go to AMD for the 6800XT, but I will sure as hell miss DLSS + good driver support. The RTX 3080 is gimped with a measly 10GB of VRAM, and the 3090 is more than double the price. Both suck ridiculous amounts of power. So it's either a 6800XT or waiting for the inevitable x80 Ti/Super card. Likely it'll only have 12GB, which is also pathetic, and you'll just need to deal with the higher electricity bill.

My RTX 2080 hasn't let me down yet. It should be good enough for Cyberpunk. I'm quite happy to skip these silly cards and wait for Nvidia's next-gen cards.
 
Last edited:

MH3M3D

Member
That's assuming NVIDIA doesn't have a big performance jump up their sleeve.

I expect them to refresh Ampere on a better node. AMD is on fire, though; they finally have enough cash for R&D, and looking at Zen and RDNA, it seems to be paying off. Their year-over-year performance jumps are amazing.
 

Dr.D00p

Member
also kinda agree with the DF assessment that, despite all the fucking hyperbole, the jury is still out on 16GB and whether the cards can utilise it fully without significant penalty.

If both the 3080 & 6800 cards had gone with what they should have, 12GB of VRAM, it would have been the best of both worlds: no debate about whether there was enough of the stuff on the RTX cards, and the 6800 cards could have come in cheaper, especially the vanilla 6800.

RT performance would still suck on the 6800s, of course, but the target audience doesn't really seem to care about that in sufficient numbers to worry AMD this generation.
 

Rickyiez

Member
If both the 3080 & 6800 cards had gone with what they should have, 12GB of VRAM, it would have been the best of both worlds: no debate about whether there was enough of the stuff on the RTX cards, and the 6800 cards could have come in cheaper, especially the vanilla 6800.

RT performance would still suck on the 6800s, of course, but the target audience doesn't really seem to care about that in sufficient numbers to worry AMD this generation.

12 or 16GB of 320-bit GDDR6X would be the best of both worlds. You need the memory bandwidth.
 

TriSuit666

Banned
Yeah, I was occupying a first-class seat on the Big Navi hype train, but it turns out they are just "good" as opposed to completely wiping the floor. They are seemingly really good at 1080p/1440p, but who the hell would spend $500 on a GPU for that resolution?

I mean, if you could have all the bells and whistles at 1440p, I don't mind. My 4K panel is 60Hz anyway, so there's another limitation for me personally.

I do dislike all the nomenclature that seems to have popped up about cards "destroying" each other or "wiping the floor"; one idiot youtuber even said the 6800XT "slaughtered" the 3080. Slaughtered? Fucksake, is this the language we're using now?
 
Last edited:

Ascend

Member
I mean, if you could have all the bells and whistles at 1440p, I don't mind. My 4K panel is 60Hz anyway, so there's another limitation for me personally.

I do dislike all the nomenclature that seems to have popped up about cards "destroying" each other or "wiping the floor"; one idiot youtuber even said the 6800XT "slaughtered" the 3080. Slaughtered? Fucksake, is this the language we're using now?
The whole graphics card sphere has been littered with fanaticism. The thing is, only one side really had ammo for quite a long time. Now the other side also has some ammo, but the side that was on top for all this time doesn't really like it. So tensions rise, and everything is exaggerated.

You have only to look at this thread to see some damage control at work.
 
I remember seeing Hardware Unboxed use Dirt 5 in their RT benchmarks, which had the 6800XT outperforming the 3080 at 94 fps to 70; hell, the 1% low on the 6800XT, at 82, was higher than the 3080's average fps. Best to wait and see how things turn out in the coming days before drawing conclusions from the handful of RT games released these past couple of years. You wouldn't have used GameWorks titles in the past to gauge AMD GPU performance, now would you?
 

TriSuit666

Banned
The whole graphics card sphere has been littered with fanaticism. The thing is, only one side really had ammo for quite a long time. Now the other side also has some ammo, but the side that was on top for all this time doesn't really like it. So tensions rise, and everything is exaggerated.

You have only to look at this thread to see some damage control at work.

Oh yeah, it's long been so, but this round definitely feels heightened. I think it's not only important but absolutely necessary for a level playing field, business-wise. Now that AMD are able to compete again, we should (should) see a bit more sense coming from Nvidia, which is why we're seeing them literally reshape their products before our eyes.

The bigger issue for me is the way both AMD and nVidia have manipulated the market and lied to consumers about the costs of their product.
 
Last edited:

KungFucius

King Snowflake
It's quite clear which card is the better value proposition. But I mean, if $50 means that much to you, when you're already spending $650-$700, then by all means, go AMD.

No. The AMD AIB cards are $780-900, and you will not see many reference boards available. They are all more expensive than the OC 3080s, and some are more expensive than the binned/premium air-cooled ones. AMD let the AIBs price the cards like scalpers. It is obnoxious. Nvidia didn't let that happen, which shows some kind of respect for the consumer that AMD did not. It does seem that Nvidia and AIBs might be prioritizing 3090s, though. It's hard to tell; they stay in stock longer and appear to be available more frequently. It could just be because they are more expensive and don't instantly sell out.

That said, I can't speak to the value; I spazzed out, bought a 3090, and sold my 3080 because I just wanted the stupid thing. So by all means buy the ASUS Strix 6800XT for $900 if you want it.
The bigger issue for me is the way both AMD and nVidia have manipulated the market and lied to consumers about the costs of their product.

What has Nvidia done? There are still 3080 OC cards dropping at MSRP plus ~50 bucks. The higher-end ones do have greater OC features that some value, so some may want the stupid Strix and will not settle for less. Most of the non-FE cards have decent coolers and perform better than the FE in most cases (PC chassis). With the exception of Zotac, I have not seen any base cards from the AIBs, but what would you do? Sell a card for $700-730, or sell an OC card for $750-770 when they sell out instantly? It's better than AMD by a long shot, and the non-OC cards were "available" for the first few weeks after launch.
 
Last edited:

Bo_Hazem

Banned
6900XT is not out yet. December 8th.

We do not talk about the 3090. Only the 3080 exists.

Well, the 6900XT interests me more for being in the sweet spot at only $1000. Wanna see its RT performance. I think games aren't utilizing AMD's RT yet, so people need to be patient. Also, AMD is still cooking its supersampling tech.

[image: AMD RDNA 2 next-gen features slide]


I lean more to RED TEAM. My build is RED as well (2700X + Radeon VII), and without AMD those 30-series cards would be much more expensive. And 10GB of VRAM is a joke in 2020.
 
Last edited:

Ascend

Member
Well, the 6900XT interests me more for being in the sweet spot at only $1000. Wanna see its RT performance. I think games aren't utilizing AMD's RT yet, so people need to be patient. Also, AMD is still cooking its supersampling tech.

[image: AMD RDNA 2 next-gen features slide]


I lean more to RED TEAM. My build is RED as well (2700X + Radeon VII), and without AMD those 30-series cards would be much more expensive. And 10GB of VRAM is a joke in 2020.
I don't expect its RT performance to be much better than the 6800XT's. And I think the 6800XT is much better value compared to the 6900XT.

I agree with you on the 10GB.
 