
AMD Radeon RX6800/RX6800XT Reviews/Benchmarks Thread |OT|

I'm German, I have a small household of three, and I pay around €90 per month for electricity. AFAIK, Germany is the country with the highest costs for electricity. I think I pay around 30 cents per kilowatt hour.

lol. Then you don't know much about electricity markets.
 
The Caribbean is worse... We have below-average electricity costs for the Caribbean... The monthly rate in US dollars for me is (see the sketch below the tiers):

<= 250 kWh/month = $0.30/kWh
> 250 and <= 350 kWh/month = $0.36/kWh
> 350 kWh/month = $0.39/kWh
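For illustration, a minimal Python sketch of that tier schedule, assuming the bracket rate applies to the whole month's consumption rather than only the kWh above each threshold (the schedule as quoted doesn't say which):

```python
# Hypothetical helper for the tier schedule quoted above.
# Assumption: the bracket rate applies to ALL kWh for the month,
# not just to the kWh above each threshold.
def monthly_bill(kwh: float) -> float:
    if kwh <= 250:
        rate = 0.30
    elif kwh <= 350:
        rate = 0.36
    else:
        rate = 0.39
    return round(kwh * rate, 2)

print(monthly_bill(240))  # 72.0  -> billed at the $0.30/kWh bracket
print(monthly_bill(400))  # 156.0 -> billed at the $0.39/kWh bracket
```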
Why higher price if you consume more? Makes no sense. Or is that because of taxes and shit?
 

MadYarpen

Member
Another day of this mess. Retailers in Poland simply have no information about deliveries.

Aaand some RX 5700 XTs are already priced at €530 (with tax). Unbelievable.
 

llien

Member
According to Bitwit, the higher clockspeeds don't scale linearly with real-world game performance.

Let's talk about OC, shall we?

Computerbase
Perf gain:
6800 with 14.6% OC => 9%
6800XT with 4% OC => 6%



Techpowerup

Sapphire Nitro+ 6800XT
stock clock: 2563
OC clock: 2622 (2.3% only)

Perf gain from OC: 5.5%


Non-linear? Yeah. Both ways (it can be faster than linear).
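To put numbers on that "both ways" point, here's a quick Python sketch (my own illustration, not from either review) that computes scaling efficiency, i.e. performance gain divided by clock gain, for the figures quoted above:

```python
# Scaling efficiency = perf gain / clock gain (1.0 would be perfectly linear).
# Percentages are the Computerbase / TechPowerUp figures quoted above.
cases = {
    "6800 (Computerbase)":         (0.146, 0.09),
    "6800XT (Computerbase)":       (0.04,  0.06),
    "Nitro+ 6800XT (TechPowerUp)": (0.023, 0.055),
}

for name, (clock_gain, perf_gain) in cases.items():
    print(f"{name}: {perf_gain / clock_gain:.2f}x of linear")
# ~0.62x (sub-linear), ~1.50x and ~2.39x (super-linear) -- the super-linear
# cases are plausibly because the OC also raises the power limit, not just
# the clock target, but that is an assumption on my part.
```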


Another day of this mess. Retailers in Poland simply have no information about deliveries.
The 6000 series was just revealed. (First reviews were when, less than 10 days ago???)
No AIB would be able to roll out its own stuff so quickly and en masse.
Let's wait 4-6 weeks before jumping to conclusions.

As far as I'm aware, nothing in this list is actually available to buy: PS5, XSeX, Zen3, RDNA2, Fermi2

Why higher price if you consume more? Makes no sense. Or is that because of taxes and shit?
Because "stop consuming too much".
 

llien

Member
Curious figures (ignore the excuse part):



I wish I knew how many "things" fit per wafer.
I also think Microsoft needs to have a word with AMD on 2 vs 1 wafer allocation. :D

So the estimate is 10k wafers per month for the Zen3/GPU business.
Using default values on this yield calculator.

Zen3 (12.9 by 9.5 mm):
300 mm wafer => 441 chips

RDNA2 (536 mm², assuming 25 by 21.44 mm):
300 mm wafer => 58 chips

Wafers are estimated to be sold at around $17k each.
That means one RDNA2 chip costs around $293 to produce (if my calculations are correct), while Zen3 costs less than $40.


So, per month, if all 7nm wafers are allocated to RDNA2, we'd get 580,000 chips.
If all were allocated to Zen 3 => 4,410,000 chips.

Makes you wonder why CPUs are so expensive.
Yields might be more nuanced than that, and CPUs could yield less due to denser packing of transistors.
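As a sanity check on those numbers, here's a small Python sketch using the common dies-per-wafer approximation plus a simple Poisson yield model. The defect density is my own guess (not from the post or the linked calculator), so the output only roughly matches the 441 / 58 figures above:

```python
import math

WAFER_DIAMETER_MM = 300.0
WAFER_COST_USD = 17_000          # estimate quoted above
DEFECT_DENSITY_PER_CM2 = 0.09    # assumed value for 7nm, NOT from the post

def gross_dies(die_w_mm, die_h_mm):
    # Classic dies-per-wafer approximation: wafer area term minus edge-loss term.
    area = die_w_mm * die_h_mm
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / area - math.pi * d / math.sqrt(2 * area))

def good_dies(die_w_mm, die_h_mm):
    # Simple Poisson yield model: Y = exp(-D0 * A).
    area_cm2 = die_w_mm * die_h_mm / 100.0
    y = math.exp(-DEFECT_DENSITY_PER_CM2 * area_cm2)
    return int(gross_dies(die_w_mm, die_h_mm) * y)

for name, (w, h) in {"Zen3": (12.9, 9.5), "RDNA2 (Navi 21)": (25.0, 21.44)}.items():
    n = good_dies(w, h)
    print(f"{name}: ~{n} good dies/wafer, ~${WAFER_COST_USD / n:.0f} per die")
# With these assumptions: roughly 462 and 63 good dies (~$37 and ~$270 per die),
# in the same ballpark as the quoted 441 / 58; exact numbers depend on the yield
# model and defect density, and salvaged (partially defective) dies are ignored.
```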
 

MadYarpen

Member
The 6000 series was just revealed. (First reviews were when, less than 10 days ago???)
No AIB would be able to roll out its own stuff so quickly and en masse.
Let's wait 4-6 weeks before jumping to conclusions.

Yeah, I think I will wait. Buying a 5700 XT, even if it is substantially cheaper than a 6800 (which atm are crazy expensive), doesn't seem like a great deal, especially at 3440x1440... but I have a 580 now, so that's why I'm a little lost.
 

Ascend

Member
Why higher price if you consume more? Makes no sense. Or is that because of taxes and shit?
It is an incentive for people to reduce their consumption.

Whether true or not:

"Apparently AMD AIBs were running into difficulties getting air freight out of China. I was told that Apple has sucked up a huge allocation on that service with new iPhone shipments. And Apple swings a bigger stick than any AIBs, or AMD, or NVIDIA for that matter. Two AIBs told me they ran into shipping delays and had to pay premiums to move cards, as much as 3X cost to ship. So just another thing driving retail costs up. "

 
Let's talk about OC, shall we?

Computerbase
Perf gain:
6800 with 14.6% OC => 9%
6800XT with 4% OC => 6%



Techpowerup

Sapphire Nitro+ 6800XT
stock clock: 2563
OC clock: 2622 (2.3% only)

Perf gain from OC: 5.5%


Non-linear? Yeah. Both ways (it can be faster than linear).



The 6000 series was just revealed. (First reviews were when, less than 10 days ago???)
No AIB would be able to roll out its own stuff so quickly and en masse.
Let's wait 4-6 weeks before jumping to conclusions.

As far as I'm aware, nothing in this list is actually available to buy: PS5, XSeX, Zen3, RDNA2, Fermi2


Because "stop consuming too much".

Those OC'ing gains are in a benchmark. Guru3D and TechSpot both reviewed the water-cooled 6800XT Strix. They both said it's about a 4% gain over reference.

TechSpot has the Strix at a whopping 10.9% increase in the same benchmark they ran for the Sapphire card that you're referencing.

So yeah, almost $300 (or $120-150 best case for other decent AIBs) over the price of what I paid for my TUF 3080, for less performance at 4K? These AIBs are doing crazy, craven shit to gouge customers while demand is sky high. I thought your white knight Lisa Su would've stopped 'em, because hurr durr AMD is the value champ and cares about the little guy?

6800XT value is a joke in the current market. Nvidia is doing shady shit, but we already knew that. Jokers like Moore's Law trying to claim it's not a worse paper launch than the 3080 are paid shills though. AMD completely screwed the pooch on this one. They care more about profits from CPUs and game consoles, and continue to set their own house on fire by allowing idiots like Frank A to go out and make asses of the whole company. If this is the best attempt they can give to make the enthusiast market more competitive, we should just resign ourselves to another few years of Nvidia dominance. So disappointing. And of course the apologists are already pointing to the next cards, saying 'but they'll be 50% faster and here next year!'. When pigs fly.
 
Those OC'ing gains are in a benchmark. Guru3D and TechSpot both reviewed the water-cooled 6800XT Strix. They both said it's about a 4% gain over reference.

TechSpot has the Strix at a whopping 10.9% increase in the same benchmark they ran for the Sapphire card that you're referencing.

So yeah, almost $300 (or $120-150 best case for other decent AIBs) over the price of what I paid for my TUF 3080, for less performance at 4K? These AIBs are doing crazy, craven shit to gouge customers while demand is sky high. I thought your white knight Lisa Su would've stopped 'em, because hurr durr AMD is the value champ and cares about the little guy?

6800XT value is a joke in the current market. Nvidia is doing shady shit, but we already knew that. Jokers like Moore's Law trying to claim it's not a worse paper launch than the 3080 are paid shills though. AMD completely screwed the pooch on this one. They care more about profits from CPUs and game consoles, and continue to set their own house on fire by allowing idiots like Frank A to go out and make asses of the whole company. If this is the best attempt they can give to make the enthusiast market more competitive, we should just resign ourselves to another few years of Nvidia dominance. So disappointing. And of course the apologists are already pointing to the next cards, saying 'but they'll be 50% faster and here next year!'. When pigs fly.
No point in pointing out some of these guys' obvious flaws in comparisons. We all know what we prefer, whether it be current-gen or next-gen gaming. There will still be some Lisa Su minions fighting to please her. It makes no sense in this day and age to try and white knight for a company that doesn't have the gold medal, especially if you are all about performance and aren't a fanboy for a certain team. Most people would jump on the 5000 GPUs if they were honestly better than "team green", as some triggered people call it.
 
Regarding OC potential on the AIB models (which have a higher power limit than reference), the manual OC on the Sapphire Nitro+ in the Kit Guru review showed an 11-15% performance increase over the reference model. That was in games, by the way, not a synthetic bench, which is pretty impressive overall.

Out of the box the factory-OC AIB models seem to offer a 4-5% performance increase over the reference model, although with a manual OC you should be able to push that to 11+%.

Having said that, the pricing on the AIB models is way too high. I don't know if AMD is to blame or if it's the AIB partners just taking advantage of the current market conditions, but the prices are historically high for AIB models. Nvidia doesn't really fare much better here, with AIB models also going for crazy money.

Looks like it is a shitty time to be a tech enthusiast, doubly so if you live in Europe and have to pay crazy money compared to our US friends.

I'm hoping prices and stock become more sane early next year for GPUs. Once prices become stable I'll decide between the 3080 and 6800XT. I'll be looking at AIB models and I'm happy with the performance of both cards, but the 3080 has a nice advantage in Blender which I like, so that might push me towards Nvidia if the 6800XT prices are the same as or more expensive than Nvidia's. Terrible time for tech customers all round though.
 

CuNi

Member
Didn't see this posted yet. It's aggregated data, centered around the 6800XT being at 100%.


So if this data is true, the 3080 is roughly 3% faster than the 6800XT across all data points.

Mind you, this is non-OC data for both GPUs.
$50 less for only 3% slower is actually quite good, combined with the lower power draw.
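As a rough illustration of that value argument (my own numbers below: I'm assuming the $649 vs $699 reference MSRPs, which is where the "$50 less" comes from, plus the ~3% average gap quoted above):

```python
# Hypothetical perf-per-dollar comparison at reference MSRPs.
cards = {
    "RX 6800 XT": {"price": 649, "relative_perf": 1.00},  # baseline = 100%
    "RTX 3080":   {"price": 699, "relative_perf": 1.03},  # ~3% faster on aggregate
}

baseline_ppd = cards["RX 6800 XT"]["relative_perf"] / cards["RX 6800 XT"]["price"]
for name, c in cards.items():
    ppd = c["relative_perf"] / c["price"]
    print(f"{name}: perf/$ = {ppd:.5f} ({ppd / baseline_ppd:.1%} of the 6800 XT)")
# With these assumptions the 6800 XT comes out ~4-5% ahead in perf-per-dollar,
# which is what "$50 less for only 3% slower" amounts to.
```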
 
Regarding OC potential on the AIB models (which have a higher power limit than reference), the manual OC on the Sapphire Nitro+ in the Kit Guru review showed an 11-15% performance increase over the reference model. That was in games, by the way, not a synthetic bench, which is pretty impressive overall.

Out of the box the factory-OC AIB models seem to offer a 4-5% performance increase over the reference model, although with a manual OC you should be able to push that to 11+%.

Having said that, the pricing on the AIB models is way too high. I don't know if AMD is to blame or if it's the AIB partners just taking advantage of the current market conditions, but the prices are historically high for AIB models. Nvidia doesn't really fare much better here, with AIB models also going for crazy money.

Looks like it is a shitty time to be a tech enthusiast, doubly so if you live in Europe and have to pay crazy money compared to our US friends.

I'm hoping prices and stock become more sane early next year for GPUs. Once prices become stable I'll decide between the 3080 and 6800XT. I'll be looking at AIB models and I'm happy with the performance of both cards, but the 3080 has a nice advantage in Blender which I like, so that might push me towards Nvidia if the 6800XT prices are the same as or more expensive than Nvidia's. Terrible time for tech customers all round though.

Kit Guru showed 3 games for the OC results...
 

Ascend

Member
Come on up to 4K, man.
Nah... I'm fine. I don't get the hype around 4K to be honest. It reminds me of TVs, when they went from 1080p to 4K, and everyone was getting hyped for it. There is no difference in the image quality at all. It all depends on how close you sit to the screen. PPI is much more important since that determines your minimum distance to the screen.
Not to mention the performance drop is astronomical for games.

On another note;
 
Nah... I'm fine. I don't get the hype around 4K to be honest. It reminds me of TVs, when they went from 1080p to 4K, and everyone was getting hyped for it. There is no difference in the image quality at all. It all depends on how close you sit to the screen. PPI is much more important since that determines your minimum distance to the screen.
Not to mention the performance drop is astronomical for games.

On another note;


Lul what? So you want to continue playing on a 22 inch 1080p monitor to get the same PPI as a 43 inch 4K monitor? (Assuming you're sitting close enough to each to use a mouse.)

Otherwise, the image quality is decidedly NOT the same, but whatever you gotta tell yourself.
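For reference, pixel density is just the diagonal pixel count divided by the diagonal size; a quick sketch for the two examples above (my own arithmetic, not from either post):

```python
import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    # Pixels per inch = diagonal resolution in pixels / diagonal size in inches.
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'22" 1080p: {ppi(1920, 1080, 22):.0f} PPI')  # ~100 PPI
print(f'43" 4K:    {ppi(3840, 2160, 43):.0f} PPI')  # ~102 PPI
# Roughly the same density, so at the same viewing distance the 43" panel gives
# ~4x the pixels over ~4x the area rather than a finer dot pitch.
```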
 
Kit Guru showed 3 games for the OC results...


Yeah :)))) he showed the OC results for 3 games that favor the 6800XT and were closest to the 3080 :)))) And it still lost in one of the three games. Then you factor in that you can overclock the 3080 and it goes ahead.

It's really bizarre, the shilling for these cards here. There is literally no reason to get a 6800XT over a 3080. None. Worse performance, worse video decoding, worse features, monumentally worse ray tracing performance, no DLSS, and pretty much the same price. Buying a 6800XT at pretty much the same cost as a 3080 is just a dumb move. You're paying the same money for worse everything.

I have no idea why the nitpicking from a dozen sites until you can form a narrative is still going on here. Both cards are out. The 3080 is better in everything. Period. If the 6800XT were about 150 dollars cheaper, it would be a conversation. Otherwise, there isn't one. If you want a new GPU and have the money, buy a 3080. If you can't find a 3080, wait until you do. The 6800XT is just throwing money in the toilet when it's worse in every possible aspect.
 
Yeah :)))) he showed the OC results for 3 games that favor the 6800XT and were closest to the 3080 :)))) And it still lost in one of the three games. Then you factor in that you can overclock the 3080 and it goes ahead.

It's really bizarre, the shilling for these cards here. There is literally no reason to get a 6800XT over a 3080. None. Worse performance, worse video decoding, worse features, monumentally worse ray tracing performance, no DLSS, and pretty much the same price. Buying a 6800XT at pretty much the same cost as a 3080 is just a dumb move. You're paying the same money for worse everything.

I have no idea why the nitpicking from a dozen sites until you can form a narrative is still going on here. Both cards are out. The 3080 is better in everything. Period. If the 6800XT were about 150 dollars cheaper, it would be a conversation. Otherwise, there isn't one. If you want a new GPU and have the money, buy a 3080. If you can't find a 3080, wait until you do. The 6800XT is just throwing money in the toilet when it's worse in every possible aspect.

Well, I wouldn't go as far as to say it's 'throwing your money in the toilet', but it's certainly not the most advisable purchase, especially at the insane prices the AIBs are charging for 6800XTs right now (worsened by the fact that stock for them is demonstrably worse than it ever was for the 3080, despite what paid youtubers will tell you). DLSS is great and the ray tracing performance is finally bringing it into the 'usable for "experience oriented" games' territory. Of course I won't be turning it on in something like CoD multiplayer. But single player? With that game's highly beneficial DLSS implementation, it's basically a freebie, so, why not?

The efficiency of the AMD cards is really only superior at stock clocks IF you're willing to take the L to the 3080 in performance. As soon as you OC them far enough to meet and beat the stock 3080, you've basically forfeited that advantage. So much for that win. I'm truly disappointed with AMD. I think if anything BOTH Nvidia and AMD overestimated AMD's position, and consequently, Nvidia is still ahead and AMD gave up too much of their price advantage. The only scenario in which it turns out good for them is a pandemic, which, fortunately for them, we happen to be in for the foreseeable future.
 

BluRayHiDef

Banned
There is no difference in the image quality at all.

This is utterly false. As resolution increases, minimum screen sizes necessary to convey the additional detail increase as well. Hence, on an adequately large display - one that is at least 40 inches - 4K looks noticeably better than 1080p; it's sharper and conveys more detail.

As for the performance hit, well, with an RTX 3080 or RTX 3090 you can play games natively in 4K with max settings and experience 50 to 60 frames per second, which is smooth. However, via DLSS, you can experience even higher frame rates in 4K while maintaining picture quality that's equal to native 4K or negligibly worse.

I used a 43" 4K TV as my monitor for four years and am now using a 55" TV as my monitor; 4K looks significantly better than 1080p in my experience.
 
Kit Guru showed 3 games for the OC results...

Yeah unfortunately almost no reviewers are doing a full suite of game benchmarks on the AIB models so we can only take the data points that we have and extrapolate. 🤷‍♂️

Similarly, in some of the actual reviews of the reference models, some places only benchmarked 7 games, for example, and even fewer for RT results. If those are a valid indicator and are counted in aggregate reviews across different systems, CPUs and titles to get an average performance, then Kit Guru's results should be taken as an equally valid indicator.

I actually think there are tons of improvements that could be made to benchmarks/reviews, and it gets especially muddy when different reviewers are testing different titles on different systems, coming up with some overall average, and then having that average aggregated across a bunch of reviews to get a "final" performance score or percentage for any GPU. But that is a different discussion really to what we are discussing.

The simple, undeniable fact about the RX6000 series is that they are OC monsters and that the OC does result in tangible improvements in game performance. Obviously this will vary by title and resolution, but it still remains true. The 3000 series by comparison was pushed pretty hard with a factory OC and is on a worse silicon node from Samsung, so it has almost no OC headroom by comparison.
 
Yeah unfortunately almost no reviewers are doing a full suite of game benchmarks on the AIB models so we can only take the data points that we have and extrapolate. 🤷‍♂️

Similarly, in some of the actual reviews of the reference models, some places only benchmarked 7 games, for example, and even fewer for RT results. If those are a valid indicator and are counted in aggregate reviews across different systems, CPUs and titles to get an average performance, then Kit Guru's results should be taken as an equally valid indicator.

I actually think there are tons of improvements that could be made to benchmarks/reviews, and it gets especially muddy when different reviewers are testing different titles on different systems, coming up with some overall average, and then having that average aggregated across a bunch of reviews to get a "final" performance score or percentage for any GPU. But that is a different discussion really to what we are discussing.

The simple, undeniable fact about the RX6000 series is that they are OC monsters and that the OC does result in tangible improvements in game performance. Obviously this will vary by title and resolution, but it still remains true. The 3000 series by comparison was pushed pretty hard with a factory OC and is on a worse silicon node from Samsung, so it has almost no OC headroom by comparison.


They might be OC "monsters" on paper, but the performance increase isn't there; it's minuscule.




An almost 30% increase in core clock nets you 9% extra performance? As I've said, they're OC monsters on paper. With Nvidia, you can push for an 8% OC and get close to that extra boost.

The situation at this point is as I've already said. A Radeon 6800XT gives you the worst of everything when going up against a 3080. And at the same price, even. That's not an investment that makes any sense, from any angle. Then, when talking about futureproofing, what is more futureproof? The Radeon card that right out of the gate barely runs ray traced games? When the ray tracing era just had its doors blown wide open? The card that just came out and can't run Control from 2019 with ray tracing? And will have similarly poor performance in Cyberpunk? That is the futureproof card? Or the one that blasts Control and will blast Cyberpunk and has DLSS on top of that?

Like I've said, a 6800XT makes sense at 150 bucks less than a 3080, and if you're not interested in seeing the new graphical features of tomorrow.
 

Antitype

Member
It's really bizarre, the shilling for these cards here. There is literally no reason to get a 6800XT over a 3080. None. Worse performance, worse video decoding, worse features, monumentally worse ray tracing performance, no DLSS, and pretty much the same price. Buying a 6800XT at pretty much the same cost as a 3080 is just a dumb move. You're paying the same money for worse everything.

This thread indeed feels like we're in some kind of alternate reality. The narrative being pushed is the complete opposite of pretty much everywhere else. Even r/amd aren't this delusional. The 6800(XT) series aren't bad GPUs, they do compete in raster, but are far behind everywhere else as you already mentioned, and without being significantly cheaper than the competition, they make for a really pointless product.
 

FireFly

Member
This thread indeed feels like we're in some kind of alternate reality. The narrative being pushed is the complete opposite of pretty much everywhere else. Even r/amd aren't this delusional. The 6800(XT) series aren't bad GPUs, they do compete in raster, but are far behind everywhere else as you already mentioned, and without being significantly cheaper than the competition, they make for a really pointless product.
There is no need for AMD to change the prices right now, since they are so heavily supply constrained. When supply becomes more plentiful, they will lower their prices as necessary.
 

Antitype

Member
There is no need for AMD to change the prices right now, since they are so heavily supply constrained. When supply becomes more plentiful, they will lower their prices as necessary.

But will they? Because what goes around is that the unusually high prices of AIB cards compared to reference are due to AMD's high asking price for the chip. The node they went with is in such high demand that it makes no sense to lower their margins when they could instead use the wafers for Ryzens or console SoCs. I doubt they will ever lower their margins; there are just too many products fighting for 7nm capacity at the moment, and as time goes on some will be converted to 5nm. Capacity is not going to increase, nor will demand decrease.
 
This thread indeed feels like we're in some kind of alternate reality. The narrative being pushed is the complete opposite of pretty much everywhere else. Even r/amd aren't this delusional. The 6800(XT) series aren't bad GPUs, they do compete in raster, but are far behind everywhere else as you already mentioned, and without being significantly cheaper than the competition, they make for a really pointless product.

I mostly just see people discussing the cards and new info or data as it comes out, such as AIB models etc...

I've no idea where you are getting "alternate reality" type comments from; if you prefer Nvidia in general or think the 3000 series are better products or better buys then more power to you. I don't think anyone has a problem with that.

I don't really see many people in this thread posting delusional pro-AMD comments or trying to fabricate some kind of alternate reality where the 6000 series is suddenly 20% more powerful than the 3000 series, or really anything particularly unhinged. We are mostly discussing news and info as it comes out.

I can't necessarily speak for other posters, but most of my posts are pretty measured and dealing with facts/data points as we receive them and making reasoned and logical assumptions, extrapolations or predictions. I do tend to correct inaccurate information if I see it and shut down FUD fairly quickly though.

I would question the need some Nvidia leaning people have to constantly post in Radeon related threads with either mostly negative AMD comments or positive Nvidia comments. It gives off a vibe that shows a lack of confidence in your purchase or a fear of competition. "Thou dost protest too much" and all of that. If people really didn't believe AMD was competitive again and offering a solid product then they wouldn't feel the need to constantly reaffirm their love of Nvidia products and dislike of AMD offerings in AMD related threads 🤷‍♂️

Oddly enough, you don't see many AMD-leaning people shitting up Nvidia-related threads (although I'm sure there are some examples, I wouldn't call it widespread).

I think the 3000 series are great cards and have a lot of strengths, I especially like their Blender performance. Anyone who buys one will likely be pretty happy as it is a great product. I also think the 6000 series are great products and have their strengths, AMD have come a really long way, way beyond what almost anyone thought they could achieve just a few months ago, there are certainly good reasons people have for being interested in them or wanting to buy them, I don't really get the controversy or defensiveness coming from some people.

AMD doing well or competing in the high end is not going to make the 3000 cards suddenly become worse; in fact, if anything it will make Nvidia price more competitively or release better products more quickly (such as a 3080 Ti). We the consumers only benefit.

But haven't we mostly worked out the value proposition arguments earlier in the thread, where pretty much everyone said their piece? I don't see why some are so desperate to relitigate all of that again. At this point I doubt most people's minds are going to be changed one way or the other. All we can do from a value/price point of view is keep an eye on the data and wait for pricing across the board for both manufacturers to become somewhat stable/sane, and then we can maybe revisit it once something of note has changed.

For the record, I do think that the AIB prices for the 6000 series are way too high at the moment. I hope they become more sane once stock levels stabilize. If an AIB 6800XT for example was more expensive than an equivalent AIB 3080 then I think the 3080 would be the obviously better buy due mostly to the CUDA advantage (for me) but also the better RT would be a factor too.
 

Bolivar687

Banned
This thread indeed feels like we're in some kind of alternate reality. The narrative being pushed is the complete opposite of pretty much everywhere else. Even r/amd aren't this delusional. The 6800(XT) series aren't bad GPUs, they do compete in raster, but are far behind everywhere else as you already mentioned, and without being significantly cheaper than the competition, they make for a really pointless product.

The only alternate reality is the one where you actually think people are playing the six games that have ray tracing.
 
The only alternate reality is the one where you actually think people are playing the six games that have ray tracing.

There are more than 6 games with ray tracing released just in the last month = )))

And more big names are coming. Call of Duty has had it for the last 2 entries. It will have it from now on. Hitman 3 just announced ray tracing. Far Cry 6: ray tracing. Dying Light 2: ray tracing. Bloodlines 2: ray tracing.

You can expect this to be a big selling point from now on in most big games. I wonder if this time next year, when most of the bigger games of 2021 have ray tracing, people will finally stop saying "there aren't that many games", as if they're still stuck in 2018 at the 2000 series launch.
 
Nah... I'm fine. I don't get the hype around 4K to be honest. It reminds me of TVs, when they went from 1080p to 4K, and everyone was getting hyped for it. There is no difference in the image quality at all. It all depends on how close you sit to the screen. PPI is much more important since that determines your minimum distance to the screen.
Not to mention the performance drop is astronomical for games.

On another note;

The closer you are to the screen, the more 4K matters. Of course there is a massive improvement in image quality. If you render a high resolution texture in the same screen area with 32 pixels instead of 8, then you have a massive detail increase. I think it's crazy how people insist that 1080p ultra is better than 4K when they can't even render those "real" ultra settings because they don't have enough pixels to do so. Obviously all that doesn't matter if you sit too far away. If resolution didn't matter, those 1080p fans would play at 320p or something. Nothing against people that do play in 1080p, but calling a higher resolution a meme is silly.
 

FireFly

Member
But will they? Because what goes around is that the unusually high prices of AIB cards compared to reference are due to AMD's high asking price for the chip. The node they went with is in such high demand that it makes no sense to lower their margins when they could instead use the wafers for Ryzens or console SoCs. I doubt they will ever lower their margins; there are just too many products fighting for 7nm capacity at the moment, and as time goes on some will be converted to 5nm. Capacity is not going to increase, nor will demand decrease.
Now that Apple is moving to 5nm, more space should be available to AMD, and I imagine TSMC will continue to ramp up capacity. According to Microsoft, 5nm doesn't bring a significant cost reduction per transistor, so it makes sense for the majority of AMD's products to stay at 7nm. So I can see the 6800 series hanging around for a while.
 

Ascend

Member
Lul what? So you want to continue playing on a 22 inch 1080p monitor to get the same ppi as a 43 inch 4k monitor? (Assuming you're sitting close enough to each to use a mouse)

Otherwise, the image quality is decidedly NOT the same but whatever you gotta tell yourself
I have an ultrawide, and there is no way I'm going back to 16:9. I bought this monitor for $250 four years ago, and as long as it's working, I'm not going to upgrade it.

I have three 4K TVs in my home, one 42", two 50". I know what 4K looks like.

There are more than 6 games with ray tracing released just in the last month = )))

And more big names are coming. Call of Duty has had it for the last 2 entries. It will have it from now on. Hitman 3 just announced ray tracing. Far Cry 6: ray tracing. Dying Light 2: ray tracing. Bloodlines 2: ray tracing.

You can expect this to be a big selling point from now on in most big games. I wonder if this time next year, when most of the bigger games of 2021 have ray tracing, people will finally stop saying "there aren't that many games", as if they're still stuck in 2018 at the 2000 series launch.
Time to repeat it again:

The 6800 cards have hardware accelerated ray tracing.

I would question the need some Nvidia leaning people have to constantly post in Radeon related threads with either mostly negative AMD comments or positive Nvidia comments. It gives off a vibe that shows a lack of confidence in your purchase or a fear of competition. "Thou dost protest too much" and all of that. If people really didn't believe AMD was competitive again and offering a solid product then they wouldn't feel the need to constantly reaffirm their love of Nvidia products and dislike of AMD offerings in AMD related threads 🤷‍♂️


You know what amazes me? The same people cheering for 4K are cheering for DLSS, which is inevitably not 4K.
 
This thread indeed feels like we're in some kind of alternate reality. The narrative being pushed is the complete opposite of pretty much everywhere else. Even r/amd aren't this delusional. The 6800(XT) series aren't bad GPUs, they do compete in raster, but are far behind everywhere else as you already mentioned, and without being significantly cheaper than the competition, they make for a really pointless product.
LMAO GAF has long had some of the most delusional AMD fanboys I've ever seen. Even r/AMD are properly tearing AMD up for blatant lies about launch availability; this was absolutely a real paper launch, unlike Ampere, where it turned out demand was really that insane. Also, yeah, Nvidia genuinely offers more features for what now seems like exactly the same price, so why bother with Big Navi at similar pricing anyway? The price for the nonexistent AIB Big Navi makes zero sense, not that it matters since they apparently made none to sell to begin with.
 

FireFly

Member
What's the perf with SAM enabled......?
It wasn't tested. However, DF found that at least in Control, enabling SAM didn't have much effect on ray tracing performance. (The gains were bigger with ray tracing disabled)

 

DonkeyPunchJr

World’s Biggest Weeb
LMAO GAF has long had some of the most delusional AMD fanboys I've ever seen. Even r/AMD are properly tearing AMD up for blatant lies about launch availability; this was absolutely a real paper launch, unlike Ampere, where it turned out demand was really that insane. Also, yeah, Nvidia genuinely offers more features for what now seems like exactly the same price, so why bother with Big Navi at similar pricing anyway? The price for the nonexistent AIB Big Navi makes zero sense, not that it matters since they apparently made none to sell to begin with.
I realized this after the Radeon VII launch. It reached performance parity with the 1080 Ti 2 full years later, at the same launch price ($700). The usual suspects here tried to spin it as a good deal.

That was also half a year after RTX 2080 launched, which had about the same performance but also had ray tracing and DLSS. So you know those fanboys said that those things shouldn’t count as a selling point because not many games support them. But Radeon VII’s 16GB of HBM2 was totally a selling point because someday it might give an advantage when/if any games needed that much memory.
 
ROFLMAO :messenger_grinning_squinting:

You guys recognize why r/realamd exists, yes? Or are you genuinely under the impression that r/amd is representative of AMD enthusiasts??? Why am I asking... guys like you are probably the reason why they had to create the alternate sub.
So what you’re saying is....there’s now a Resetera version of r/AMD too? When the cult in the regular sub just isn’t strong enough, you need to splinter off and create your own true believers’ sub? That’s hilarious, and the way you unironically tell me this like it’s a good thing is all I need to know about the real diehard AMD fans these days.
 
Nah... I'm fine. I don't get the hype around 4K to be honest. It reminds me of TVs, when they went from 1080p to 4K, and everyone was getting hyped for it. There is no difference in the image quality at all. It all depends on how close you sit to the screen. PPI is much more important since that determines your minimum distance to the screen.
Not to mention the performance drop is astronomical for games.

A post like this coming from you doesn't surprise me. One of the most delusional things that can be said. You literally still use a 1080p monitor (which is fine) but then take it a step further and try to convince others that 4K offers no benefit in image quality over 1080p.

I've never seen someone so sensitive to others supposedly "spreading FUD" about AMD while you spread FUD constantly about practically everything.

Just the other day you made a thread claiming that the reason some people couldn't find an Ampere GPU was because Nvidia was intentionally selling them all to bitcoin miners.

Why not make the same thread about Big Navi? I mean obviously the reason why no one can find any Big Navi GPUs is because of Lisa Su's conspiracy to enrich bitcoin miners at the expense of gamers.
 

Ascend

Member
Jesus. The nVidia trolls are here again in full force. You'd think they'd actually be using their graphics cards for gaming if they were oh so amazing...

A post like this coming from you doesn't surprise me. One of the most delusional things that can be said. You literally still use a 1080p monitor (which is fine) but then take it a step further and try to convince others that 4K offers no benefit in image quality over 1080p.
If you actually had reading comprehension, you would have noticed that I said PPI is more important than resolution itself. Obviously, everyone glossed over that little fact and started quote mining to misrepresent my position.

I've never seen someone so sensitive to others supposedly "spreading FUD" about AMD while you spread FUD constantly about practically everything.
Anyone who is truly neutral is quite aware that people are a LOT harder on AMD than they are on nVidia... Anyone else is either in denial or blind. It's the truth, and I know you won't like it, and I don't care.

Timestamped (01:01:35 onwards):


In before "but those guys are AMD shills!!!!" and "I didn't listen enough to hear where they actually criticize AMD!!!"


Just the other day you made a thread claiming that the reason some people couldn't find an Ampere GPU was because Nvidia was intentionally selling them all to bitcoin miners.

Why not make the same thread about Big Navi?
Because there are no financial numbers to support it, unlike in nVidia's case...? Evidence is there or it isn't.

I mean obviously the reason why no one can find any Big Navi GPUs is because of Lisa Su's conspiracy to enrich bitcoin miners at the expense of gamers.
You would love for that to be true, wouldn't you? Unfortunately there is zero evidence of that for AMD, and quite a bit of it for nVidia. In fact, considering the current mining performance of the 6800 cards, no one could actually argue that AMD is catering to miners. nVidia is ignoring you because they can sell more volume to miners. Live with it.

LMAO GAF has long had some of the most delusional AMD fanboys I've ever seen.
For some people, the quicker they call others fanboys, the more it says about their own views. And that includes you.

Even r/AMD are properly tearing AMD up for blatant lies about launch availability; this was absolutely a real paper launch, unlike Ampere, where it turned out demand was really that insane.
As gullible as always I see. As long as it's nVidia...

Also, yeah, Nvidia genuinely offers more features for what now seems like exactly the same price, so why bother with Big Navi at similar pricing anyway? The price for the nonexistent AIB Big Navi makes zero sense, not that it matters since they apparently made none to sell to begin with.
Why are you here again? If you're not interested in the 6800 series, go find another thread to post in. Nobody needs you here, and I'm sure you can do something more productive or enjoyable with your time, like, actually using your nVidia card.
 