
Radeon 6700 XT review roundup

Skifi28

Member
And yet again, an AMD card review thread is turned into a DLSS and RTX worshiping thread...
A review means comparing it to other products in order to make an informed decision. How can you not discuss the features that the competition offers? How can you not take into account what the card lacks when talking about pricing? Hardware reviews can't exist in a bubble.
 
Last edited:

Ascend

Member
And yet again you and that other guy try to downplay the benchmarks and technologies you don't like.
Selectively ignoring facts seems to be a hobby of you guys.
Ignoring facts... Uhuh...
Let's throw a few in here regarding DLSS... And let's see you squirm yourself out of this one.

DLSS is overrated, for the simple reason that it causes a very annoying shimmering that nobody talks about. Look at this video:


Look at how it appears in motion, with all that shimmering on the building and on the ground... Even DLSS Quality has it. And when he freezes the footage, suddenly everything looks a lot better for DLSS.
He goes on to compare the screenshots and argue that DLSS Quality looks better than native. But that is deceptive marketing, because it didn't look better in motion with all that additional shimmering. And what kills me is that he never mentions the shimmering, ever, and only talks about the positives the whole video.

DLSS has a lot of keyboard warriors, and obviously a lot of influencers are pushing the tech on behalf of nVidia. But in actuality, it's not nearly as good as it's touted to be. In multiple games it also has a ghosting problem, but that's generally less noticeable than the shimmering.
DLSS looks great in screenshots, but in motion it's another story, which matters a lot in games. And the lower the native res it's upscaled from, the worse it is.

The main reason to use DLSS is if you want to play at 4k, and your graphics card is too slow to do it. Otherwise, it's really not worth it.

But what do I know...
 
Last edited:

llien

Member
Anybody that uses RTX will use DLSS. Not factoring it in is just being special.
Try to use it with a game that supports VRS. Oh wait, you can't.
Why don't we get real and factor in "I run it at 1440p and claim it's 4k" and "I run it at 1080p and claim it's 1440p" openly, as real men? :messenger_beaming:

And also lol at anybody supporting AMD that is worried about driver overhead, u cant make this shit up.
The FUD is so strong within you, Kenpachi. I can see the glow around your posts.
On a serious note, Igor Labs also confirmed HUB's observations (they used a different route though, disabling cores).
At this point, nv drivers eating more of your CPU could be regarded as an established fact.

Didn't notice that. The numbers don't come close to matching what anyone else has for RT benchmarks; must have been a different test setup.
Dude, this is implicitly claiming videocardz is involved in some sort of anti-Nvidia conspiracy.
 
Last edited:

DaGwaphics

Member
^ The rub is that people tend to actually play games vs. staring at a wall. There is some degradation in motion with DLSS, but unless you are playing at really high framerates (unlikely with RT), you are going to have a certain amount of blurring in motion from your display anyway. Unless you are one of the few gaming on a CRT, but there aren't too many of those left. It seems like most players are quite happy with the results they get from DLSS.
 
Last edited:

DaGwaphics

Member
Dude, this is implicitly claiming videocardz is involved in some sort of anti-Nvidia conspiracy.

Hardly, just stating the obvious: their results are an outlier in comparison to EVERY other testbench I've seen. Maybe because they went with 1080p, which I'm sure is the target for users spending $500 on a GPU, LOL.

I don't doubt though that RT performance on AMD will improve as time goes on, simply because of the consoles using the same tech for RT.
 
Last edited:

Haggard

Banned
But what do I know...
Pretty much nothing, or at least you don't give the impression that you have any hardware or software knowledge at all, or you are just willfully ignoring tech history altogether.

And you once again just selectively picked what fits your agenda and ignored the rest. Business as usual for you.
 
Last edited:

Ascend

Member
^ The rub is that people tend to actually play games vs. staring at a wall.
And you think that while playing a game the shimmering will be less than when staring at a wall....???

There is some degradation in motion with DLSS, but unless you are playing at really high framerates (unlikely with RT), you are going to have a certain amount of blurring in motion from your display anyway. Unless you are one of the few gaming on a CRT, but there aren't too many of those left.
I didn't put much emphasis on motion blur, because it bothers fewer people than shimmering.

If you can live with both the shimmering and the motion blur, use DLSS. I have no issue with people using DLSS. I have an issue with touting it as the Messiah of gaming, figuratively speaking. It has its uses, but it is being overhyped, like pretty much everything nVidia puts out.

And reviewers aren't helping. Take a look at Techspot's review... The RTX 3070 gets a 95 score. The 6700XT gets a 70. WTF is that? I can understand if the 3070 is considered a better choice than the 6700xt... But come on.
This blind worshiping of nVidia needs to stop.

Pretty much nothing. Same as that llien guy.
You once again just selectively pick what fits your agenda and ignore the rest.
Uhuh... That is why you have such a strong rebuttal to what I just posted, instead of throwing out personal attacks... 🤷‍♂️

Not to mention accusing others of your own behavior...
 
Last edited:

DaGwaphics

Member
@llien, Gaming Nexus, TechRadar, ArsTechnica, TomsHardware, PCgamer, etc.

It's possible some of these did a 1080p test as well as 1440p, but 1440p was what the review concentrated on, as it should. Obviously, that should be the target resolution at this price.
 
Last edited:

Haggard

Banned
Uhuh... That is why you have such a strong rebuttal to what I just posted, instead of throwing out personal attacks... 🤷‍♂️

You and your "pals" around here literally do nothing but throw out fake news by selectively picking or ignoring tidbits of info.
Someone like you isn't worth having a discussion with, but bullshit needs to be called out for what it is.
If you have arguments at hand, state them.
If not, STFU.
You, the prime example of someone ignoring all technical facts/benchmarks/etc., demanding "arguments"... yeah, that's rich.
I remember discussions between you and that VFX veteran guy where you simply kept shouting that RTX is not RT and DLSS is just TAA...
You guys are not worth a discussion. The amount of sheer stubborn denial on your side is unfathomable.
 
Last edited:

Ascend

Member
You and your "pals" around here literally do nothing but throw out fake news by selectively picking or ignoring tidbits of info.
Someone like you isn't worth having a discussion with. But bullshit needs to be called out for what it is.
Sure. Call it out. I'm still waiting for the rebuttal of the shimmering.

And I thought I was the one ignoring facts, tidbits of info and being selective...
 
Last edited:

Haggard

Banned
Sure. Call it out. I'm still waiting for the rebuttal of the shimmering.

And I thought I was the one ignoring facts, tidbits of info and being selective...
So we're ignoring all the good implementations now and concentrating on one bad example... geez, as if... you'd once again just render everything outside of your narrow spectrum irrelevant.

yeah, go play that retarded game with yourself. :messenger_tears_of_joy:
 
Last edited:

Ascend

Member
For the record, I will not be buying a 6700XT. I will likely be waiting for next gen graphics cards at this point...

But I really don't get what it is with these nVidia fanboys that they have to infest every AMD thread, despite them knowing that they will not be getting AMD products anyway.

This thread will not amount to anything, so, I will be going now. I have better things to do.
 

regawdless

Banned
For the record, I will not be buying a 6700XT. I will likely be waiting for next gen graphics cards at this point...

But I really don't get what it is with these nVidia fanboys that they have to infest every AMD thread, despite them knowing that they will not be getting AMD products anyway.

This thread will not amount to anything, so, I will be going now. I have better things to do.

It feels like you are in search of an AMD worshipping safe space. People are allowed to discuss GPUs even when they don't want to buy them. People who criticize AMD aren't automatically Nvidia fanboys.

I continue to be amazed how passionate you are about that one GPU brand.

Reading through the pages and all the posts from you and llien here... You calling the other side fanboys is...

[irony GIF]
 
Last edited:

DeceptiveAlarm

Gold Member
I tried getting one at BB. There is a Newegg Shuffle for a bunch of them right now. The card may not be perfect, but it should be a nice jump up from my RX 480 at 1440p. 😄
 

turtlepowa

Banned
I tried getting one at BB. There is a Newegg Shuffle for a bunch of them right now. The card may not be perfect, but it should be a nice jump up from my RX 480 at 1440p. 😄
If it's the 8GB version of the RX 480, you should get some nice money for it from miners.
 
Last edited:

GreatnessRD

Member
Shit show launch on AMD direct buy. Ton of people didn't even get the "add to cart" button. I did, but it was already gone. I wish they'd restock the 6800 XT though. That's what I really want, lol
 
Considering what this card is, which is basically an RDNA2 version of the 5700XT, technically speaking it is a good enough performance increase. Not great, but good enough. I find it really ironic that reviewers are trashing this card for its MSRP. But we can't expect better in this environment I guess.
Yup, the unpredictable price makes it pretty hard to evaluate the product.

I think the MSRP thing is a mixed bag. Obviously, if the cards get sold for a much higher price by scalpers, retailers and manufacturers will want to take a portion of the pie, so it will push all MSRPs, or even sale prices, up in the medium to long run, even more than they would have risen in normal times. Old cards have almost doubled in value; this is bound to have an impact on the bean counters at these companies.
The reviews are generally stacked against this card, because of its MSRP. But really... To me that is quite stupid. AMD obviously increased the MSRP to get a higher cut from the pie in this market. nVidia advertised low MSRPs to get good reviews, but nobody could get them at MSRP anyway...
Paying "full MSRP" used to be for losers, now you get serious creds if you buy something at MSRP. The review I watched (Hardware Unboxed) noted that they would look at eBay prices in a couple of weeks and re-evaluate on "actual pricing" then, so they can compare the real performance/$ compared to the competition.
 

turtlepowa

Banned
It is. The only thing is I promised to put a rig together for my boys. I have a 3570K on a board with RAM waiting for it. Should be fine for Fortnite at 1080p.
I would get the boys a 1060 3GB for 120-150 bucks and sell the 480 for 300. The 1060 should be better for Fortnite anyways.
 

DaGwaphics

Member
Would you mind linking one or two concrete tests showing that RDNA2 does not win in Dirt 5, WoW RT, or Fortnite RT, or doesn't tie in Godfall, please?

I don't think those are at all common in most of the benchmark suites. Though the listed figures for Fortnite RT look very low on the Nvidia side. Considering how good DLSS looks in that particular game, I can't see any Fortnite player not using a DLSS mode in that one.

It's certainly possible that AMD sees a big boost going forward, thanks to the consoles.
 

DaGwaphics

Member
More crap from amd. No thanks.

They are more than competitive in general rasterization, so I wouldn't discredit them completely.

If they can get this out there in good numbers so that the MSRP holds, it could still be a good value. I wouldn't hold my breath though.
 

johntown

Banned
For the record, I will not be buying a 6700XT. I will likely be waiting for next gen graphics cards at this point...

But I really don't get what it is with these nVidia fanboys that they have to infest every AMD thread, despite them knowing that they will not be getting AMD products anyway.

This thread will not amount to anything, so, I will be going now. I have better things to do.
Is it wrong to point out facts? NVIDIA just has the better cards right now. The discussions might be more interesting once AMD implements decent RT and a good competitor to DLSS. Until then, their cards are inferior, and any graphs/charts that say otherwise aren't looking at the whole picture, which creates threads like this with people mocking AMD fanboys.
 

llien

Member
I don't think those are at all common in most of the benchmark suites. Though the listed figures for Fortnite RT look very low on the Nvidia side. Considering how good DLSS looks in that particular game, I can't see any Fortnite player not using a DLSS mode in that one.
The context of the discussion is RT capabilities of AMD cards.

The theory of it being vastly inferior to NV needs to explain why they perform so well in recent games.
E.g. why does the 3070 only achieve 65% of the 6700XT's framerate in WoW RT, if RT is such a strong point for it?

DLSS and upscaling techniques in general (of which we know AMD is working on one, sprinkled with AI buzzwords) only add noise to the discussion (it's a separate argument, valid on its own).
 

DaGwaphics

Member
The context of the discussion is RT capabilities of AMD cards.

The theory of it being vastly inferior to NV needs to explain why they perform so well in recent games.
E.g. why does the 3070 only achieve 65% of the 6700XT's framerate in WoW RT, if RT is such a strong point for it?

DLSS and upscaling techniques in general (of which we know AMD is working on one, sprinkled with AI buzzwords) only add noise to the discussion (it's a separate argument, valid on its own).

Honestly, something just seems off with the numbers in that graphic for WoW. Why do the 3060 Ti and the 3070 get the same result, and what specific area was tested?

With the FPS you can hit with RT on ultra in this game at 1440p, the 1080p numbers look suspect.

 
Last edited:

Armorian

Banned
5700XT vs 6700XT at the same clock, so an IPC comparison.


Looks like the RDNA1 -> RDNA2 improvements are mostly a myth and all the performance gain comes from higher clocks. Higher memory bandwidth also gives the 5700XT an advantage in some tests...
 
Last edited:

Kenpachii

Member
Try to use it with a game that supports VRS. Oh wait, you can't.
Why don't we get real and factor in "I run it at 1440p and claim it's 4k" and "I run it at 1080p and claim it's 1440p" openly, as real men? :messenger_beaming:


The FUD is so strong within you, Kenpachi. I can see the glow around your posts.
On a serious note, Igor Labs also confirmed HUB's observations (they used a different route though, disabling cores).
At this point, nv drivers eating more of your CPU could be regarded as an established fact.


Dude, this is implicitly claiming videocardz is involved in some sort of anti-Nvidia conspiracy.

Tell that to AMD. Why are they even busy making a DLSS alternative? Just lower the resolution, guys!

And about driver overhead.

Remember what I said in the last topic that was specifically about it?

Here, I will quote a small part of it:

You need to bench a metric ton of games, at all kinds of different settings, with lots of GPUs across different GPU architectures, and different CPU architectures with different core counts, etc., with different drivers for every card and different patches of the games you are benching, on different hardware setups (memory, motherboards), and base your conclusions on that.

I would add to that: testing other APIs would also be handy, to give a better view of what AMD vs. Nvidia is doing.
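Just to give a feel for how fast that kind of test matrix blows up, here's a rough sketch (every list below is a made-up placeholder, not an actual test plan):

```python
# Sketch of a driver-overhead benchmark matrix; all entries are hypothetical placeholders.
from itertools import product

games    = ["Game A", "Game B", "Game C"]            # ideally dozens, each at several patch levels
apis     = ["DX11", "DX12", "Vulkan"]
gpus     = ["GPU 1", "GPU 2", "GPU 3", "GPU 4"]      # multiple architectures per vendor
drivers  = ["driver old", "driver new"]
cpus     = ["2c/4t", "4c/8t", "6c/12t", "8c/16t"]    # different architectures and core counts
settings = ["720p low", "1080p medium", "1440p ultra"]

matrix = list(product(games, apis, gpus, drivers, cpus, settings))
print(f"{len(matrix)} individual benchmark runs, before any repeats for run-to-run variance")
```

Even with those tiny placeholder lists you're already at several hundred runs, which is the point: a one-game, one-platform test barely scratches the surface.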

So let's look at your Igor Labs test:


Uniform test platform
In order to be able to exclude all possible influences from different motherboards, CPUs, memory modules and operating system installations, I created all benchmarks with an exemplary DirectX12 game on one and the same platform, which scales cleanly from 2 up to 8 cores (SMT on, so 4 to 16 threads). The game uses two graphics cards from NVIDIA and AMD that are roughly equally fast at WQHD resolution, as well as a Ryzen 7 5800X that I've gradually reduced to 2 cores / 4 threads to create the CPU bottleneck. Current drivers are installed and the game has been fully patched. The screen resolution ranges from 720p, 1080p and 1440p to 2160p.
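In rough pseudo-structure, that protocol boils down to something like the sketch below (the fps model is entirely invented for illustration; it is not Igor's harness or data, just the shape of the test):

```python
# Toy sketch of a core-scaling overhead test; the fps model is made up for illustration.
CORE_STEPS = [2, 4, 6, 8]                                              # active cores, SMT on (4-16 threads)
RESOLUTIONS = {"720p": 240, "1080p": 200, "1440p": 140, "2160p": 75}   # invented GPU-bound fps caps

def fps(gpu_bound_cap: float, cores: int, driver_cpu_cost: float) -> float:
    """Hypothetical model: the frame rate is limited by either the GPU or the CPU/driver side."""
    cpu_bound_fps = cores * 40 / driver_cpu_cost  # invented CPU-side throughput per core
    return min(gpu_bound_cap, cpu_bound_fps)

for res, cap in RESOLUTIONS.items():
    for cores in CORE_STEPS:
        nv = fps(cap, cores, driver_cpu_cost=1.2)   # assumed heavier driver
        amd = fps(cap, cores, driver_cpu_cost=1.0)  # assumed lighter driver
        print(f"{res} @ {cores} cores: NV {nv:.0f} fps vs AMD {amd:.0f} fps")
```

In a model like this, any extra driver CPU cost only shows up in the CPU-bound cells (few cores, low resolution) and disappears once the GPU is the limit, which is why the core-count and resolution sweep is the interesting part of the test.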

So he used 1 motherboard, 1 type of memory, 1 BIOS, 1 CPU architecture (hell, one CPU, lol), 1 GPU architecture each from Nvidia and AMD with 1 card each, 1 driver, and 1 install of Windows on 1 Windows revision.

It's even worse than what Hardware Unboxed did. Look, Hardware Unboxed is biased toward AMD, which was obvious from their demonstration, but hey, at least they made a bit of an effort, even if it was useless to say the least; this guy just shat the bed entirely.

Then, on top of that, he tested a single game that is known to be riddled with bugs and performance problems on PC, specifically on Nvidia and Intel. It was so bad it got universally slammed by every outlet, to the point they had to redesign a lot of it. The port looks even worse when you realize it uses the same engine as Death Stranding, which runs like a dream on PC. And with the newest updates, just released, sometimes bringing almost double the performance on Nvidia GPUs, it's pretty fucking clear the game is a mess.

Look, they could be perfectly right about the overhead, and honestly I wouldn't be shocked if it's there, because DX12 simply isn't favorable for Nvidia. With some good, proper testing you could easily showcase this if that's the case. But those hillbilly tests that prove nothing other than their narrative aren't going to prove anything.

And about my bias: I have no bias towards any corporation. I see how it is, and frankly sugarcoating isn't in my vocabulary, which seems to trigger people like you hard. If a card is shit, I will tell you how it's shit and why it's shit. And this 6700 card, a mid-range tier card that costs half a grand, is laughable, and any outlet would slam the card for it. Especially when it can't compete with Nvidia now that ray tracing has started to become a thing.

The problem with you, however, is that you pick whatever obscure or biased outlet fits your bill and ignore every single feature and function unless it favors AMD. That's why I said AMD users suddenly caring about overhead in an API like DX12 (which is just one API) is laughable. AMD has been heavily hit with API overhead for the last decade, and nobody who supports that company seemed to care, report on it, or even mention it; they all covered it up, the same way Hardware Unboxed does with the same idiot tests I've been slamming these outlets for over a decade. Yet now they suddenly care, which makes it even more hypocritical of them.

Anyway, DX12 is under heavy development, unlike DX11. With RTX IO and the DirectStorage that Microsoft is working on, we will see massive changes this year or next, when PS5 games start to hit, to the point that we will see drastic changes on the GPU front from Nvidia. This is why the 3000 series feels kind of like what the 500 series was: a last-gen GPU on steroids. AMD is better positioned for this with their current GPUs, if they get their software under control, which has been an issue with AMD for the last decade now; nobody gets their hopes up on that one anymore, and even the last remaining die-hards have probably all moved over to Nvidia at this point.
 
Last edited:

Rikkori

Member
5700XT vs 6700XT at the same clock, so an IPC comparison.


Looks like the RDNA1 -> RDNA2 improvements are mostly a myth and all the performance gain comes from higher clocks. Higher memory bandwidth also gives the 5700XT an advantage in some tests...

That's a myopic view. RDNA 2 can do 80CU cards, not just 40CU. Don't underestimate the complexity of scaling the arch up or down. Not to mention, "higher clocks" is no trivial matter! Then you also add all the DX12U features, new decoding capabilities, etc. There's significant work done on all those fronts, plus more subtle improvements and changes like the wavefront changes; those won't be apparent in past games, but you'll see it matter, since consoles are RDNA 2 and that's what's going to be the focus.
 

martino

Member
That's a myopic view. RDNA 2 can do 80CU cards, not just 40CU. Don't underestimate the complexity of scaling the arch up or down. Not to mention, "higher clocks" is no trivial matter! Then you also add all the DX12U features, new decoding capabilities, etc. There's significant work done on all those fronts, plus more subtle improvements and changes like the wavefront changes; those won't be apparent in past games, but you'll see it matter, since consoles are RDNA 2 and that's what's going to be the focus.
Yep, I also think people here are reading too much into the past.
 

TriSuit666

Banned
~about even in price/performance ratio with the 3070 if, BIG IF, you disregard RT performance and DLSS.
With those factored in, that card is not really attractive.
I paid less than MSRP for my 3070; the 6700 just looks like a wet rag in comparison.
 

FireFly

Member
5700XT vs 6700XT at the same clock, so an IPC comparison.

Looks like the RDNA1 -> RDNA2 improvements are mostly a myth and all the performance gain comes from higher clocks. Higher memory bandwidth also gives the 5700XT an advantage in some tests...
AMD never promised any big IPC improvements. There are some improvements there at 4K, likely due to the Infinity Cache.
 
That's a myopic view. RDNA 2 can do 80CU cards, not just 40CU. Don't underestimate the complexity of scaling the arch up or down. Not to mention - "higher clocks" is no trivial matter! Then you also add all the DX12U features, new decoding capabilities etc. There's significant work done on all those fronts, not to mention for the more subtle improvements & changes like wavefront changes, those won't be apparent in past games but you'll see it change since consoles are RDNA 2 and that's what's going to be the focus.

Still haven't watched the video, but ComputerBase made an IPC comparison: https://www.computerbase.de/2021-03/amd-radeon-rdna2-rdna-gcn-ipc-cu-vergleich/

RDNA2 has a small reduction in "IPC" compared with RDNA1, but this was a natural consequence of the changes made to clock higher. At 1 GHz it is slower, but at 2 GHz, with everything inside working faster, the game changes and RDNA2 becomes faster.

[image: ComputerBase RDNA1 vs RDNA2 IPC comparison chart]
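As a back-of-the-envelope illustration of that point (the numbers below are invented for the example, not ComputerBase's figures), what actually shows up in games is roughly per-clock throughput times clock frequency:

```python
# Hypothetical illustration: game throughput scales roughly with IPC * clock.
rdna1_ipc, rdna1_clock = 1.00, 1.9   # baseline per-clock throughput, ~1.9 GHz game clock (assumed)
rdna2_ipc, rdna2_clock = 0.97, 2.4   # ~3% lower per-clock throughput, ~2.4 GHz game clock (assumed)

rdna1_perf = rdna1_ipc * rdna1_clock
rdna2_perf = rdna2_ipc * rdna2_clock
print(f"RDNA2 relative to RDNA1: {rdna2_perf / rdna1_perf:.2f}x")  # ~1.23x despite the lower IPC
```

So a small per-clock regression is easily swamped by a clock increase of that size, which is the whole trade-off being described.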


They also tested the CU scaling.

[image: ComputerBase CU scaling chart]


About the changes made to accommodate more CUs: keep in mind that, because of them, the GPUs with fewer CUs have some parts cut down compared with RDNA1.
 
5700XT vs 6700XT at the same clock, so an IPC comparison.


Looks like the RDNA1 -> RDNA2 improvements are mostly a myth and all the performance gain comes from higher clocks. Higher memory bandwidth also gives the 5700XT an advantage in some tests...


In fact, many people were mixing things up, as I said in the past. Some people were claiming a 25% IPC increase; that was false. It was performance per watt that increased a lot with RDNA2 (which is in fact a really good thing).

You can find what I was explaining here.
 
Last edited:

llien

Member
Tell that to AMD. Why are they even busy making a DLSS alternative?
In my opinion for the same reason they had to call it RDNA1, RDNA2 instead of GCNX, GCNX+1.
Mostly marketing.

They already have checkerboard rendering and Fidelity FX CAS; now they need something sprinkled with "AI", "deep learning", and other popular buzzwords.

So he used 1 motherboard, 1 type of memory, 1 BIOS, 1 CPU architecture (hell, one CPU, lol), 1 GPU architecture each from Nvidia and AMD with 1 card each, 1 driver, and 1 install of Windows on 1 Windows revision.
Dude.
You remember that GPU reviews use, let me list it: 1 motherboard, 1 type of memory, 1 BIOS, 1 CPU architecture, hell, 1 CPU, lol?

"Maybe it's just one game" is a valid argument.
"Perhaps it is just those 2 games" is a valid argument (still).
"Perhaps it's those bad console ports" is a valid argument (still).

"Maybe it's that BIOS or RAM" is... bananas.
The problem with you, however, is that you pick whatever obscure or biased outlet fits your bill and ignore every single feature and function unless it favors AMD.
That's your imagination, and let me be more specific.
RT is still a gimmick to me, despite AMD's 6700 XT beating the 3070 in "newer" (in RT terms) games (Dirt 5, WoW RT, Fortnite RT). It is mostly recognized not by visual benefits, but by the major fps hit (CP2077). Poke me if I change my mind about it.
Upscaling will still be upscaling for me, and I won't repeat "better than native" bovine feces either.

What you claim to be the case has never happened (and, be sure, won't happen either).
 

iQuasarLV

Member

This is 100% what I was worried about when I saw the thermals and performance numbers a couple of days ago. You definitely do not want to buy this version of the card. Even if AIB makers put a better cooling solution on the card, it still has to contend with being a 5700 XT in an RDNA2 suit. As a value proposition, this thing sucks with an $80 jump in price.

I believe Steve hinted at it: with AMD controlling the console space with its silicon, they have profit coming from almost every entertainment angle for the next couple of years. That is, until contracts start stripping profit out of the yields because of the time-to-cost terms in those deals. Right now, AMD is just putting shit on the market to keep up with Nvidia's product stack.
 