
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Think the stock situation is going to be as bad as with the latest Ryzen launch? I do, and I am sad. Hope you're all ready with your Distill, Discord, Twitter, etc. stock/price monitors.
 

Rikkori

Member
Think the stock situation is going to be as bad as with the latest Ryzen launch? I do, and I am sad. Hope you're all ready with your Distill, Discord, Twitter, etc. stock/price monitors.
It's going to be much, much worse. With CPUs, production is in much better shape, but demand is also universal (both AMD and Nvidia GPU users want AMD CPUs), which is why they sold out quickly; even so, I'd have much more hope of getting one at a reasonable price within the next two months. GPUs, on the other hand? Ehhh, better hope God hears your prayers; it's the only chance to get one.
 

Ascend

Member
Sapphire 6800 XT Pulse:

[Card photos of the Sapphire 6800 XT Pulse]


 

MadYarpen

Member
Oh, there's also the 6800. Sadly, the 6800 Nitro+ requires a 750 W PSU... The Pulse used to be good value, though; I may try to get that.
 

Irobot82

Member
Yeah, I know. But I think I'll be anxious if I'm below the required wattage... Anyway, it will all come down to what's actually available. So why am I even thinking about this...
I'm just going to keep waiting until supplies normalize a bit. It's still between a 6800 XT and an RTX 3080 for me. I have a 3700X system and went with a 650 W power supply this time, so I'm not worried.
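If you want a rough sanity check instead of just trusting the box, the back-of-envelope math is easy (a sketch in Python; the GPU board power and CPU TDP are AMD's advertised numbers, everything else is my own ballpark assumption):

```python
# Back-of-envelope PSU headroom check for a 3700X + RX 6800 XT build.
# GPU total board power and CPU TDP are AMD's advertised figures;
# everything else here is a rough assumption.
gpu_tbp = 300        # W, reference RX 6800 XT (the 6800 is 250 W; AIB cards draw more)
cpu_peak = 90        # W, Ryzen 7 3700X under load (65 W TDP, higher while boosting)
rest_of_system = 75  # W, motherboard, RAM, drives, fans (assumed)
psu_capacity = 650   # W

load = gpu_tbp + cpu_peak + rest_of_system
print(f"Estimated peak draw: {load} W ({load / psu_capacity:.0%} of PSU capacity)")
# ~465 W, about 72% of a 650 W unit -- workable headroom, and presumably
# why AIB vendors pad their "required PSU" recommendations up to 750 W.
```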
 

MadYarpen

Member
We're getting closer to the release; when can we expect some independent benchmarks & reviews?

And details from AIB partners? So far we've seen only some photos/renders, and no specs or prices...
 

Ascend

Member
We're getting closer to the release; when can we expect some independent benchmarks & reviews?
One day before release. That means the 17th.

And details from AIB partners? So far we've seen only some photos/renders, and no specs or prices...
Nothing official yet. Rumors say they'll be available a week or two after the reference cards launch. Prices, well... expect $50 above MSRP for the good ones. And if they sell like hot cakes, you can forget about MSRP or anything near it.
 
Hopefully raytracing in consoles isn't anything to go off of, performance-wise. Rumors say it's not even as good as Turing in RT. Hopefully AMD rolls out some GPUs for realistic, independent benchmarks.
 

Kenpachii

Member
The 6800 XT will most likely be sold out on day one, so better decide now if you want one. I can't wait to see the benchmarks. Fucking finally, some progress in the GPU market.
 
Hopefully raytracing in consoles isn't anything to go off of, performance-wise. Rumors say it's not even as good as Turing in RT. Hopefully AMD rolls out some GPUs for realistic, independent benchmarks.

All of the rumours so far have the 6000 series being better at RT than Turing but not quite as good as Ampere. No idea where you're getting worse than Turing from.

The only leaked RT benchmark we have so far was Shadow of the Tomb Raider on the 6800, where it performed better than a 3070 or a 2080 Ti. In fact, I think it wasn't even that far behind a 3080 in RT performance. Obviously this was without DLSS turned on. Assuming that leak was true, and assuming the same effect across a wide suite of games, the performance is pretty respectable.

Granted, SotTR and most games use hybrid rendering and will continue to do so for the foreseeable future. With a fully path-traced game like Minecraft we should expect Ampere to pull ahead significantly, but for hybrid rendering the gap should shrink by a fair amount.

As for independent benchmarks/reviews, apparently the review embargo is launch day, the 18th of November. Though I expect some leaks a bit ahead of time showing general performance, and maybe RT performance in one or two games.
 
All of the rumours so far have the 6000 series being better at RT than Turing but not quite as good as Ampere. ...
Rumors are all over the place. Yesterday or the day before, one claimed it's not even as good as Turing. You or I can believe whatever we want, but none of us knows the reality of raytracing performance, especially as AMD didn't cover RT performance at all. You'd imagine they would at least pit it against cards you could potentially buy RIGHT NOW, and not the "paper release" people are spreading in regards to the RTX 30-series cards. One website says this, another says the opposite, and we have no way to validate the supposed leaks, as they're also all over the place in their claims.
 
Rumors are all over the place. Yesterday or the day before, one claimed it's not even as good as Turing. ...

True, we can't know anything for sure until we see independent benchmarking and reviews. But while we only have rumours and potentially dubious benchmark leaks to go off, a few of them do seem to correlate regarding RT performance:

  • We had the Port Royal benchmark leak showing better than Turing RT performance.
  • We had the DXR RT benchmark score given by AMD which was better than Turing.
  • We have the Tomb Raider 6800 RT benchmark leak that I mentioned above.
Granted, all of that could be faked, nonsense, or outliers of some kind, but most of the people I've been following have been pretty accurate with this stuff, so for now I'll treat it as likely until proven otherwise. Of course, if I'm wrong, then so be it.

I would be curious where you saw the rumour of worse-than-Turing RT performance, though. I normally follow this stuff pretty closely on tech Twitter and I don't think I've seen that one yet. I'd appreciate a link if you have it so I can take a look; the more info we have, the better.
 

Ascend

Member
Rumors are all over the place. Yesterday or the day before, one claimed it's not even as good as Turing. ...
The leaks have been fairly consistent, if you know which ones are reliable. RedGamingTech has been the most reliable lately, and he also said AMD's RT is better than Turing's but worse than Ampere's. I think at this point his sources can be trusted. Maybe not 100%, but definitely 90%. Benchmarks will be here soon enough.

Not that it matters for the 6800 series, but apparently RDNA3 will improve RT significantly.
 

psorcerer

Banned
You cannot put "RT" and "performance" in one sentence.
Mainly because there's no such thing as "RT performance"; it depends far more on the particular use case than on anything else.
 
The leaks have been fairly consistent, if you know which ones are reliable. ...
Raytracing is the way to go, though. If you like older games, or even games built for cross-gen, not having the best raytracing is acceptable. But if you're an enthusiast, or care about the span from now until next gen, AMD doesn't seem to be competing. I'll wait on benchmarks to clarify this, though. Especially since Godfall's marketing has already failed by suggesting much more VRAM than needed, even before DX12U/DirectStorage/RTX IO. That's another thing that turned me off AMD's marketing: suggesting far more VRAM than could possibly be needed for everything maxed out at 4K ultra settings.
 

Rikkori

Member
Hopefully raytracing in consoles isn't anything to go off of
They're not. An actual GPU is much beefier than what you'd find in an APU (which is what consoles have), for obvious cost- and design-related reasons. You absolutely cannot infer RDNA 2 desktop GPU performance from console performance, because there are too many ways in which they diverge.
 
They're not. An actual GPU is much beefier than what you'd find in an APU (which is what consoles have), for obvious cost- and design-related reasons. You absolutely cannot infer RDNA 2 desktop GPU performance from console performance, because there are too many ways in which they diverge.
Beefier, yes, but it's a similar architecture. I wouldn't be surprised at all if they're lagging behind Nvidia, as they have been for a good while now. Obviously PC will have better results, but you can at least see where the progression is at from the consoles, together with the rumors online and the fact that AMD hasn't shown anything yet. You'd imagine they would have by now, right?
 
Beefier, yes, but it's a similar architecture. I wouldn't be surprised at all if they're lagging behind Nvidia, as they have been for a good while now. Obviously PC will have better results, but you can at least see where the progression is at from the consoles, together with the rumors online and the fact that AMD hasn't shown anything yet. You'd imagine they would have by now, right?

Which rumours are these? I'm asking because I honestly haven't seen any rumours stating the 6000 series GPUs have worse-than-Turing RT performance. Again, if you can link to a source for that, it would be great so that we can all take a look.

As for not showing RT performance yet, the most likely reason is that they're a little behind Ampere. Do you really think they would bring up a chart on stage at their reveal event showing themselves being beaten in every game vs the 3080/3090? Even if Ampere were only winning by 3-8 fps, it would still be a dumb marketing move.

In addition, anything they do show will be compared to RTX cards with DLSS enabled, which would make their cards look much worse than they comparatively are. So until their FidelityFX Super Resolution software is ready (early next year, maybe?) they likely won't say anything official.

Either way, we only have a few days to wait for benchmarks/reviews/release (18th of Nov), where we'll see everything in action, including head-to-head RT comparisons with the 3080 etc. I would imagine we'll get some benchmark/RT performance leaks before release, possibly from China.

My final prediction is this: in fully path-traced games, the 3000 series takes a significant lead. In hybrid-rendered games, the 3000 series wins, but possibly only by a small FPS margin (it will vary game by game; DLSS off, obviously). The real wild card will be the performance of RT with FidelityFX Super Resolution versus RT with DLSS on Nvidia cards. We know essentially nothing about that so far, so it will be interesting to see.
 
Which rumours are these? I'm asking because I honestly haven't seen any rumours stating the 6000 series GPUs have worse-than-Turing RT performance. ...
If DLSS 2 makes RDNA 2 cards look bad, maybe AMD should work on that, yeah? FidelityFX is shit compared to DLSS 2; that's been proven over and over at this point. If they had anything meaningful to bring to the table, they would have shown it, right?

Maybe I'm just used to the constant promises year after year while getting no upgrade options, because AMD had nothing to compete with Nvidia. You'd imagine they would show something by now? I have a couple of friends who bought the 3080 since they don't have as much patience as me. And I don't blame them, as AMD's marketing blatantly lied about Godfall's VRAM requirements.

We all know Nvidia would fail if the minimum RAM they put on their GPUs weren't more than adequate for next-gen games. The claims otherwise have been proven false time and again.
 

Ascend

Member
If DLSS 2 makes RDNA 2 cards look bad, maybe AMD should work on that, yeah? ...
The PS5 has a checkerboard rendering technique at 4K that is basically indistinguishable from native. Maybe that's what they'll implement.
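For those unfamiliar, the core trick of checkerboarding is shading only half of the 2x2 pixel quads each frame and filling the rest from the previous frame. A toy sketch in Python/NumPy (my simplification; the real console implementations add motion vectors, object IDs, and edge-aware filtering on top of this):

```python
import numpy as np

def checkerboard_reconstruct(curr, prev, frame_idx):
    """Naive checkerboard reconstruction: only half of the 2x2 pixel
    quads are freshly shaded each frame (alternating like a checkerboard);
    the other half is carried over from the previous reconstructed frame."""
    h, w = prev.shape[:2]
    qy, qx = np.indices((h // 2, w // 2))
    live = (qy + qx + frame_idx) % 2 == 0                     # quads shaded this frame
    mask = np.repeat(np.repeat(live, 2, axis=0), 2, axis=1)   # quad -> pixel mask
    out = prev.copy()
    out[mask] = curr[mask]
    return out

# Each frame shades ~50% of the 3840x2160 target, i.e. roughly the cost of
# a 2716x1528 image, which is why it is so much cheaper than native 4K.
```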
 

Irobot82

Member
If DLSS 2 makes RDNA 2 cards look bad, maybe AMD should work on that, yeah? ...
FidelityFX is a suite of open-source features developed by AMD. I believe you're actually thinking of CAS, Contrast Adaptive Sharpening. AMD is working on an open-source DLSS alternative in collaboration with many ISVs. Try to keep up.
 
If DLSS 2 makes RDNA 2 cards look bad, maybe AMD should work on that, yeah? ...

I'm going to leave your silly marketing comments alone (there are a few of them in this thread). Seeing as you're an Nvidia diehard, I think you're standing on incredibly shaky ground, over a bottomless abyss that leads into the event horizon of a black hole, on the subject of misleading or dishonest marketing. joe_biden_cmon_man.gif

As for FidelityFX, you realize I'm not talking about CAS, right? FidelityFX is the name of AMD's open-source, cross-platform suite of software extensions that provides things like TressFX hair simulation, screen-space reflections, CAS, etc.

And the FidelityFX suite itself seems to be highly regarded by developers overall as far as I'm aware so... 🤷‍♂️

So FidelityFX is not something comparable to DLSS; they're completely different things. Like comparing the number 1 to a glass of water.

Now, AMD is currently expanding the FidelityFX suite to include Super Resolution, which should be a direct competitor to DLSS, but open source and cross-platform/vendor. As for AMD not showing Super Resolution yet, they simply haven't finished coding it, unfortunately. You can't show the public something that isn't finished; that would be a disaster. It would definitely have been better for AMD, strategically, to have had it ready for the launch/reveal, and I agree it's a pity they don't.

But it will come in due time, and I'd be surprised if it weren't available by the end of Q1 2021. I would hope they have it ready early next year.

As for the RAM debate, Nvidia skimped on RAM sizes. It is what it is, which is why, as a reaction to AMD, they're scrambling right now and changing their lower-end offerings to have more RAM than their higher-end ones, which is pretty funny. Not to mention a potential 20GB 3080 Ti supposedly in the works for January, with supposedly the same (or slightly lower) performance as the 3090. Anyway, this isn't a thread about Nvidia or the 3000 series, so I don't really want to get into a long-winded debate about whether the RAM sizes on Nvidia cards are enough.
 
I'm going to leave your silly marketing comments alone (there are a few of them in this thread). ...
I just have a bit of understanding of these things, that's all. Unless AMD's marketing has gone to absolute shit, they don't have much to stand on when it comes to marketing. I'm also not a die-hard Nvidia fan; as you'll see in my post history, I've owned more AMD cards than Nvidia ones, which is also why I know first-hand that they aren't trying to compete with Nvidia in the enthusiast market. They never really have, though. They do finally have cards that "fit" into that bracket, but more to undercut Nvidia and take market share, not to specifically take over that spot. They have more money to make with consoles and finishing up previous contracts with Apple.

I mentioned FidelityFX because that's what I meant, not CAS. It doesn't compete with Nvidia's suite of features. And RAM isn't an issue, no matter how hard you try to spread that FUD. Each GPU performs at its respective resolution. You wouldn't expect an RX 580 to run 4K ultra settings, would you? So why expect a 3070 to do the same?
 

PhoenixTank

Member
they aren't trying to compete with Nvidia in the enthusiast market. They never really have, though. They do finally have cards that "fit" into that bracket, but more to undercut Nvidia and take market share, not to specifically take over that spot. They have more money to make with consoles and finishing up previous contracts with Apple.
You lost me here, Mr Schlong.
They haven't been up there for a while, but with these cards they do seem to be competing on the enthusiast end. I think the words you're looking for are something like "halo" products? Those are definitely important for mindshare and for driving sales to the brand as a whole, if not the top-tier part itself.
In the past, yes, "world's fastest GPU" traded hands a lot. Less so leading up to GCN, and IMO it started to fade for AMD when deferred rendering techniques basically neutered CrossFire/SLI.

AMD's margins on consoles are almost certainly slimmer than on Zen and discrete GPUs. They definitely were in the past generation, and while the higher-end nature of the console parts this time will command a tasty premium, I'd be surprised if the margins for good PC parts and the console APUs have flipped. There's probably more money to be had from a wafer filled with Zen 3 CPUs than from RX 6000 cards, though.

I'm out of the loop regarding outstanding Apple contracts for AMD. Are they really that substantial?
 

Ascend

Member
DonJuanSchlong, I don't know why you're laugh-reacting regarding checkerboard rendering. Timestamped:

[Embedded video, timestamped]
Add the sharpening filter(s), and it would be a viable solution. I'm not sure that's what they're going to use, but it's a possible solution that doesn't need to rely on machine learning.
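To show what the "contrast adaptive" part of CAS-style sharpening means, here's a toy grayscale version (my own simplified approximation, not AMD's actual shader math):

```python
import numpy as np

def cas_like_sharpen(img, strength=0.2):
    """Toy contrast-adaptive sharpen for a grayscale image in [0, 1].
    The sharpening weight shrinks where the 3x3 neighborhood already has
    high contrast, which is what keeps this kind of filter from adding
    halos the way a fixed unsharp mask does."""
    h, w = img.shape
    p = np.pad(img, 1, mode='edge')
    shifts = [p[y:y + h, x:x + w] for y in range(3) for x in range(3)]
    contrast = np.maximum.reduce(shifts) - np.minimum.reduce(shifts)
    cross = shifts[1] + shifts[3] + shifts[5] + shifts[7]  # N, W, E, S neighbors
    wgt = strength * (1.0 - contrast)        # back off where contrast is high
    out = img * (1 + 4 * wgt) - wgt * cross  # adaptive Laplacian sharpen
    return np.clip(out, 0.0, 1.0)
```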
 

Antitype

Member
DonJuanSchlong, I don't know why you're laugh-reacting regarding checkerboard rendering. ... Add the sharpening filter(s), and it would be a viable solution.


Probably because it really doesn't hold a candle to DLSS 2+.
Take Control, for example: maxed out at 4K native on my 3080 (undervolted, 1920 MHz @ 850 mV), it runs at 36 fps; with DLSS it's 64 fps on Quality, 75 fps on Balanced, 87 fps on Performance, and 117 on Ultra Performance. Up to and including Balanced mode, it looks indistinguishable from native during gameplay (I mean without resorting to zoomed-in screenshots). You're essentially doubling the fps for free. Even Performance mode looks OK if you don't scrutinize every little detail, and at that point you're at 2.5 times native performance.
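Those jumps track the internal render resolution almost directly; the commonly cited per-axis scale factors for the DLSS 2 modes work out to (approximate figures):

```python
# Approximate internal render resolutions per DLSS 2 mode at 4K output,
# using the commonly cited per-axis scale factors.
modes = {
    "Quality": 2 / 3,            # ~2560x1440
    "Balanced": 0.58,            # ~2227x1253
    "Performance": 1 / 2,        # 1920x1080
    "Ultra Performance": 1 / 3,  # 1280x720
}

out_w, out_h = 3840, 2160
for mode, s in modes.items():
    w, h = round(out_w * s), round(out_h * s)
    print(f"{mode:>17}: {w}x{h} ({s * s:.0%} of native pixels shaded)")
```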
DLSS 2+ is in a league of its own. And the fact that it runs on specialized, dedicated hardware makes it even better compared to whatever solution AMD comes up with.
 

VFXVeteran

Banned
Did they ever do an RT comparison between the 6000 series and the 3000-series Nvidia boards in Watch Dogs? That would be the true test of the performance difference I'm looking for.
 
Did they ever do an RT comparison between the 6000 series and the 3000-series Nvidia boards in Watch Dogs? That would be the true test of the performance difference I'm looking for.

Not that I'm aware of; so far we've only seen a leaked RT-enabled benchmark of Tomb Raider on a 6800.

Only 4 days until release/reviews, and I'm sure some reviewers will use WD:L in their testing suite, so we should find out soon enough.

I'd be curious to know why this specific game is your ultimate litmus test for RT performance between the cards, versus any other RT-enabled game.

Are they doing something special with RT compared to other games? I honestly don't know, as I don't really follow Ubisoft games.
 