
[DF] AMD vs NVIDIA - Ray Tracing Performance Deep Dive Feat. RX 6800XT & RTX 3080

Ascend

Member
Comparison of RTX 3080 and RX 6800XT in WoW Shadowlands:

The RTX 3080 is faster overall at 4K, even without RT. But let's compare each card's own internal scaling with RT:

RTX 3080
RT Shadows Off: 86.7 avg / 67.0 min
RT Shadows Fair: 74.6 / 56.0
RT Shadows Good: 74.3 / 55.0
RT Shadows High: 67.1 / 49.0

As a percentage of the RT-off performance (avg / min), that is:
Fair: 86.0% / 83.6%
Good: 85.7% / 82.1%
High: 77.4% / 73.1%

The same for the 6800XT gives us:

6800XT
RT Shadows Off: 69.6 avg / 57.0 min
RT Shadows Fair: 60.2 / 49.0
RT Shadows Good: 60.0 / 48.0
RT Shadows High: 50.2 / 40.2

As a percentage of the RT-off performance (avg / min), that is:
Fair: 86.5% / 86.0%
Good: 86.2% / 84.2%
High: 72.1% / 70.5%
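
For anyone who wants to check the math, here's a quick Python sketch that reproduces those percentages. The only inputs are the avg/min FPS figures listed above; nothing else is taken from the video.

```python
# Relative RT scaling: what fraction of its own RT-off framerate each card
# keeps at every RT Shadows setting. Values are the avg/min FPS listed above.
results = {
    "RTX 3080": {"Off": (86.7, 67.0), "Fair": (74.6, 56.0),
                 "Good": (74.3, 55.0), "High": (67.1, 49.0)},
    "RX 6800 XT": {"Off": (69.6, 57.0), "Fair": (60.2, 49.0),
                   "Good": (60.0, 48.0), "High": (50.2, 40.2)},
}

for card, runs in results.items():
    off_avg, off_min = runs["Off"]
    print(card)
    for setting in ("Fair", "Good", "High"):
        avg, mn = runs[setting]
        print(f"  {setting}: {avg / off_avg:.1%} avg / {mn / off_min:.1%} min")
```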

So in actuality, RDNA2 is scaling equally with Ampere here (and better on minimums), except at the High setting, where it falls behind. And even at High, it's not THAT much worse... That is actually quite interesting. nVidia will definitely push to crank RT higher to put their cards in the best light, even when it's not necessary. I'm getting GameWorks/Tessellation vibes.

The 720p results are a whole other story.
 
Really? You're trying to use raytraced shadows in WoW to claim that "ahkchuallly RDNA 2 raytracing scales just as good as Ampere. PS. Nvidia is evil"

Go outside. ALL YOU DO is shill for AMD and their inferior GPUs. Don't you get tired of it?
Seriously 😂. I can't imagine putting this much time and effort into patrolling threads and shilling all day long. As much as I love my computer, I'm not going to go on a tirade for the entirety of my GAF account about how this company is so much better than that company. Speak with your wallet instead of shilling with your keyboard.

With that said, I don't think WoW is a great example of raytracing "dominance". I'm sure there are multitudes of better and more recent examples out there.
 

Ascend

Member
Really? You're trying to use raytraced shadows in WoW to claim that "ahkchuallly RDNA 2 raytracing scales just as good as Ampere. PS. Nvidia is evil"

Go outside. ALL YOU DO is shill for AMD and their inferior GPUs. Don't you get tired of it?
Seriously 😂. I can't imagine putting this much time and effort into patrolling threads and shilling all day long. As much as I love my computer, I'm not going to go on a tirade for the entirety of my GAF account about how this company is so much better than that company. Speak with your wallet instead of shilling with your keyboard.

With that said, I don't think WoW is a great example of raytracing "dominance". I'm sure there are multitudes of better and more recent examples out there.

I simply repeated the results with some additional insight. That you don't like it is your problem.

I'm not interested in your fanboy wars. The fact that scaling seems about equal up to a certain point is a much more interesting topic to discuss, because it actually concerns the architectures, rather than pointless name-calling and attempts at shaming and discrediting other users for your own grandeur. Maybe you should look in the mirror and spend your own time better, rather than looking at what you think I am doing.
Even better, put me on ignore so you don't have to see my posts. But you can't, can you? I'm flattered that I'm so important to you though. Unfortunately, it's not mutual. Just FYI.

Now let's keep things on-topic.
 
I simply repeated the results with some additional insight. That you don't like it is your problem.

I'm not interested in your fanboy wars. The fact that scaling seems about equal up to a certain point is a much more interesting topic to discuss, because it actually concerns the architectures, rather than pointless name-calling and attempts at shaming and discrediting other users for your own grandeur. Maybe you should look in the mirror and spend your own time better, rather than looking at what you think I am doing.
Even better, put me on ignore so you don't have to see my posts. But you can't, can you? I'm flattered that I'm so important to you though. Unfortunately, it's not mutual. Just FYI.

Now let's keep things on-topic.
Imagine using WoW as a talking point to try and compare the architectures? Imagine being fueled so much by your own ego that you can only post positive AMD / negative Nvidia stuff. So egotistical that you think others feel you are so important, and that we can't put you on ignore because we would miss your posts soooo much.



No one is on my ignore list.
 
Really? You're trying to use raytraced shadows in WoW to claim that "ahkchuallly RDNA 2 raytracing scales just as good as Ampere. PS. Nvidia is evil"

Go outside. ALL YOU DO is shill for AMD and their inferior GPUs. Don't you get tired of it?
He admitted a while back that he doesn't actually buy video cards; his job is apparently just to do this for a living.

People who actually buy video cards aren't the target market for his posts. Think about that for a moment.

Since I am someone who buys video cards, and I now have a 3090, I am not the target market for his posts. So I just kind of scroll past them when I see them.
 

DGrayson

Mod Team and Bat Team
Staff Member
I simply repeated the results with some additional insight. That you don't like it is your problem.

I'm not interested in your fanboy wars. The fact that scaling seems about equal up to a certain point is a much more interesting topic to discuss, because it actually concerns the architectures, rather than pointless name-calling and attempts at shaming and discrediting other users for your own grandeur. Maybe you should look in the mirror and spend your own time better, rather than looking at what you think I am doing.
Even better, put me on ignore so you don't have to see my posts. But you can't, can you? I'm flattered that I'm so important to you though. Unfortunately, it's not mutual. Just FYI.

Now let's keep things on-topic.

Imagine using WoW as a talking point to try and compare the architectures? Imagine being fueled so much by your own ego that you can only post positive AMD / negative Nvidia stuff. So egotistical that you think others feel you are so important, and that we can't put you on ignore because we would miss your posts soooo much.


Ok, let's everyone dial it back a bit and get back on topic.
 

Bo_Hazem

Banned
I simply repeated the results with some additional insight. That you don't like it is your problem.

I'm not interested in your fanboy wars. The fact that scaling seems about equal up to a certain point is a much more interesting topic to discuss, because it actually concerns the architectures, rather than pointless name-calling and attempts at shaming and discrediting other users for your own grandeur. Maybe you should look in the mirror and spend your own time better, rather than looking at what you think I am doing.
Even better, put me on ignore so you don't have to see my posts. But you can't, can you? I'm flattered that I'm so important to you though. Unfortunately, it's not mutual. Just FYI.

Now let's keep things on-topic.

New waves of games could show better utilization of AMD's cards, or use fewer rays but cast them faster, favoring the higher clock speed while achieving a similar final image.
 

FireFly

Member
So in actuality, RDNA2 is scaling equally with Ampere here (and better on minimums), except at the High setting, where it falls behind. And even at High, it's not THAT much worse... That is actually quite interesting. nVidia will definitely push to crank RT higher to put their cards in the best light, even when it's not necessary. I'm getting GameWorks/Tessellation vibes.

The 720p results are a whole other story.
Based on the DF video, RT shadows are a best case for AMD. It's with reflections and/or GI that the gap becomes big.
 
Based on the DF video, RT shadows are a best case for AMD. It's with reflections and/or GI that the gap becomes big.
Yes, this is why Spider Man Miles Morales basically only uses RT shadows with very sparing use of 1/4 resolution reflections. That's pretty much what PS5 hardware with RDNA2 can handle. We will not be seeing any real RTX GI on the consoles this gen, for example Metro Exodus with full RTX GI is probably beyond anything these consoles can handle.
 

FireFly

Member
Yes, this is why Spider Man Miles Morales basically only uses RT shadows with very sparing use of 1/4 resolution reflections. That's pretty much what PS5 hardware with RDNA2 can handle. We will not be seeing any real RTX GI on the consoles this gen, for example Metro Exodus with full RTX GI is probably beyond anything these consoles can handle.
It doesn't seem very sparing to me based on the videos. You can see it on the sides of buildings, puddles, cars, polished floors and metallic objects. John from DF said you can see reflections "around every corner" of the game world.

Full GI may be off the table, but RTXGI looks to be quite scalable, and Lumen targets 1440p60, so at 1440p30 you could possibly add other effects such as ray-traced reflections.
 

Caio

Member
And now, even though I'm not a PC gamer, because I'm a graphics whore I will be forced to buy an RTX 4090 and build the most powerful PC in the whole GAF community, just to play the best multiplatform games at max resolution, details, fps and full ray tracing. Thank you Nvidia, this will cost me more than $3000. It might sound like a joke, but I probably will do this. There is always a first time to become a PC gamer too. Why not start with an RTX 4090? It should be out in 2022, right?
 
Really? You're trying to use raytraced shadows in WoW to claim that "ahkchuallly RDNA 2 raytracing scales just as good as Ampere. PS. Nvidia is evil"

Go outside. ALL YOU DO is shill for AMD and their inferior GPUs. Don't you get tired of it?

Seriously 😂. I can't imagine putting this much time and effort into patrolling threads and shilling all day long. As much as I love my computer, I'm not going to go on a tirade for the entirety of my GAF account about how this company is so much better than that company. Speak with your wallet instead of shilling with your keyboard.

With that said, I don't think WoW is a great example of raytracing "dominance". I'm sure there are multitudes of better and more recent examples out there.
AMD doesn't even hide that they have an army of unpaid shills on forums and social media.

And well, "unpaid" might be stretching it, as we learned they got priority access to buying the RX 6800, which could be flipped for a few hundred bucks of profit.



Mind you, this is just the one case that surfaced - we don't know how much this happened in the past without it leaking outside.
 

Md Ray

Member
We will not be seeing any real RTX GI on the consoles this gen, for example Metro Exodus with full RTX GI is probably beyond anything these consoles can handle.
Nah, I don't think so. We will definitely see some form of ray-traced GI on consoles, but those games would have to target 30fps. The UE5 demo already showcases ray-traced, real-time dynamic global illumination, called Lumen. And it was running on the shaders, i.e. it's software-based, non-triangle RT bounce lighting. That said, it's a lighter solution compared to the hardware-accelerated RT GI found in Metro Exodus, which is more demanding.

Anyway, this means the UE5 demo wasn't utilizing the PS5's RT hardware yet, so there's potential for an even better-looking implementation of GI than Lumen on consoles, one that will also run faster thanks to hardware acceleration when the console's RAs (Ray Accelerators) are utilized.
 

regawdless

Banned
I am not interested in paying over $700 to lower graphics settings for playable framerates.

I never said to ignore it, but to not put a high priority on it.


That statement is kinda confusing to me.

Take Cyberpunk for example. Raytracing effects are part of the graphical settings that you can choose to enable. I'd think enthusiasts who buy GPUs for 700+ bucks would want to be able to max out their games.
If you don't enable raytracing, you are already lowering the settings by definition, because the game is designed with raytracing in mind and it has a huge visual impact.

Buying an AMD card would therefore mean that, in the increasing number of games with raytracing, you have to lower the settings by a bigger margin for playable fps compared to Nvidia's offerings. And again, devs are optimizing their RT implementations for the currently available cards for playable framerates.

If you want even higher fps, you can enable only RT reflections at lower quality. That still results in better visuals at a lower performance cost. Or only use RT lighting. Or disable DLSS and reduce the RT quality if you don't like DLSS for some reason.

That's the thing, you can customize it to your individual liking for the best balance of visuals vs fps. Just like with any other visual setting.

Not trying to attack you, only wanting to wrap my head around your statement:
As I've explained, raytracing is part of the visual settings. And it is important to you not to lower the settings on a 700+ bucks card, meaning you want to use the highest settings. While you simultaneously say that the highest settings don't have a high priority.

That's confusing me.

Or do you somehow exclude raytracing from the visual settings? Which, in the case of Cyberpunk again, would be pretty weird, because they are among the most impactful settings for the visual quality of the game.
 

martino

Member
That statement is kinda confusing to me.

Take Cyberpunk for example. Raytracing effects are part of the graphical settings that you can choose to enable. I'd think enthusiasts who buy GPUs for 700+ bucks would want to be able to max out their games.
No, spending money and having no compromises on settings, without putting any brain into what you do, doesn't make you an enthusiast; it just makes you an annoying rich person.

edit: but yes, RT will be the difference this gen... Lumen excuse or not.
 

regawdless

Banned
No, spending money and having no compromises on settings, without putting any brain into what you do, doesn't make you an enthusiast; it just makes you an annoying rich person.

edit: but yes, RT will be the difference this gen... Lumen excuse or not.

Ok then more like this?
A consumer who is willing to pay 700 bucks for a GPU expects viable support of high-end features, and wants to be able to use their brainz to adjust them to their liking regarding fps vs quality.

Also, being rich would be pretty cool, I'm trying to get there.
 

AllBizness

Banned
Not watching. Digital Foundry are biased af and don't even hide it when it comes to AMD vs Nvidia. Not to mention Nvidia sponsors them. Ray tracing is way overblown; we don't have the horsepower to do it yet. I'll gladly take a lower-cost AMD build over an Nvidia/Intel one.
 

regawdless

Banned
Not watching. Digital Foundry are biased af and don't even hide it when it comes to AMD vs Nvidia. Not to mention Nvidia sponsors them. Ray tracing is way overblown; we don't have the horsepower to do it yet. I'll gladly take a lower-cost AMD build over an Nvidia/Intel one.

Huh. That's like, prime trolling or incredibly stupid.

So PS5 owners can play the great looking Spider-Man with cool RT reflections on hardware that is significantly slower than a 3080.
And on high-end PCs, people are playing stuff with different raytracing effects like Control.

But we don't have the horsepower yet?

You know that devs implement and optimize these effects with target hardware and target performance in mind? And they then adjust the RT settings to run on these cards?

Not to mention the rest of your comment. Jeez, what's wrong with you.
 

martino

Member
Not watching. Digital Foundry are biased af and don't even hide it when it comes to AMD vs Nvidia. Not to mention Nvidia sponsors them. Ray tracing is way overblown; we don't have the horsepower to do it yet. I'll gladly take a lower-cost AMD build over an Nvidia/Intel one.
So because the "right places" lack the tech to do something correctly, the people reporting progress on it are biased?
Your world is upside down.
 

Ascend

Member
That statement is kinda confusing to me.

Take Cyberpunk for example. Raytracing effects are part of the graphical settings that you can choose to enable. I'd think enthusiasts who buy GPUs for 700+ bucks would want to be able to max out their games.
If you don't enable raytracing, you are already lowering the settings by definition, because the game is designed with raytracing in mind and it has a huge visual impact.

Buying an AMD card would therefore mean that, in the increasing number of games with raytracing, you have to lower the settings by a bigger margin for playable fps compared to Nvidia's offerings. And again, devs are optimizing their RT implementations for the currently available cards for playable framerates.

If you want even higher fps, you can enable only RT reflections at lower quality. That still results in better visuals at a lower performance cost. Or only use RT lighting. Or disable DLSS and reduce the RT quality if you don't like DLSS for some reason.

That's the thing, you can customize it to your individual liking for the best balance of visuals vs fps. Just like with any other visual setting.

Not trying to attack you, only wanting to wrap my head around your statement:
As I've explained, raytracing is part of the visual settings. And it is important to you not to lower the settings on a 700+ bucks card, meaning you want to use the highest settings. While you simultaneously say that the highest settings don't have a high priority.

That's confusing me.

Or do you somehow exclude raytracing from the visual settings? Which, in the case of Cyberpunk again, would be pretty weird, because they are among the most impactful settings for the visual quality of the game.

You know what's confusing to me? That you're arguing that people paying 700+ bucks don't want to lower settings, and at the same time that lowering settings is a viable option.

You can't run RT at max settings in Cyberpunk with playable framerates, even with respectable DLSS settings (which is a fancy resolution-lowering method with AI upscaling), so by default you're lowering settings. That means you're paying 700+ bucks to play with lowered settings.
That means RT is too large a performance hog to be considered a priority for the majority of people. Especially at this price. The game still looks extremely good without RT. Sure, it does look better with RT. Sure, the Ampere cards are faster. But that doesn't mean much when all current hardware drowns with it enabled.

So either you think lowering settings on a 700+ graphics card is not justifiable, or RT has too high a performance impact. You can't have it both ways.

If the RTX 3080 and the 6800XT were sub-$300 cards, we wouldn't be having this conversation.

Yes, this is why Spider Man Miles Morales basically only uses RT shadows with very sparing use of 1/4 resolution reflections. That's pretty much what PS5 hardware with RDNA2 can handle. We will not be seeing any real RTX GI on the consoles this gen, for example Metro Exodus with full RTX GI is probably beyond anything these consoles can handle.
RT is there to make the life of developers easier in the long run. And at this point, all consoles have RT. But there is a reason that the games that use it in a noticeable manner have an option to turn it off.

As for the PC... Whenever you enable DLSS, you're automatically reducing the RT resolution to whatever the internal rendering resolution is. And additionally, DXR is sort of a black box on PC, while consoles can be optimized a lot better.

The consoles and PC are in the same boat; if the GPU can normally handle 4K/60, you have to be content with either upscaled 1080p/60 or upscaled 1440p/30, or somewhere around there.
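
To put rough numbers on that last point, here's a small sketch of how the internal render resolution (where the rays actually get traced) shrinks under DLSS. The per-axis scale factors are the commonly cited ones for DLSS 2's modes and are assumptions on my part, used purely for illustration:

```python
# Internal render resolution for a given DLSS output resolution. RT effects
# are computed at this internal resolution, not at the output resolution.
# Per-axis scale factors below are the commonly cited DLSS 2 values (assumed).
DLSS_SCALE = {
    "Quality": 2 / 3,          # ~66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K output, DLSS {mode}: rays traced at {w}x{h}")
# Quality at a 4K output lands around 2560x1440 and Performance at 1920x1080,
# which is why "4K + DLSS" RT behaves more like a 1440p or 1080p RT workload.
```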
 

AllBizness

Banned
Huh. That's like, prime trolling or incredibly stupid.

So PS5 owners can play the great looking Spider-Man with cool RT reflections on hardware that is significantly slower than a 3080.
And on high-end PCs, people are playing stuff with different raytracing effects like Control.

But we don't have the horsepower yet?

You know that devs implement and optimize these effects with target hardware and target performance in mind? And they then adjust the RT settings to run on these cards?

Not to mention the rest of your comment. Jeez, what's wrong with you.
Wtf does PS5 have to do with this?
 

regawdless

Banned
You know what's confusing to me? That you're arguing that people paying 700+ bucks don't want to lower settings, and at the same time that lowering settings is a viable option.

You can't run RT at max settings in Cyberpunk with playable framerates, even with respectable DLSS settings (which is a fancy resolution-lowering method with AI upscaling), so by default you're lowering settings. That means you're paying 700+ bucks to play with lowered settings.
That means RT is too large a performance hog to be considered a priority for the majority of people. Especially at this price. The game still looks extremely good without RT. Sure, it does look better with RT. Sure, the Ampere cards are faster. But that doesn't mean much when all current hardware drowns with it enabled.

So either you think lowering settings on a 700+ graphics card is not justifiable, or RT has too high a performance impact. You can't have it both ways.

Maybe I didn't write it in the best way. People want the maximum visual effects and performance possible for the price. Of course gamers adjust and scale the settings to their target fps, that's why PC gaming is so great.

What is a playable framerate for you?

I'm playing Cyberpunk at 1440p with all raytracing maxed at 50-70 fps (depending on the scene) with DLSS quality mode. I add a bit of ReShade sharpening and clarity, and it looks very clean and extremely close to native.
Same with Control: maxed out with all RT, it runs at 40+ fps at 1440p even without DLSS. With DLSS it's obviously way better.

Curious what your definition of playable fps is.
 

Ascend

Member
Maybe I didn't write it in the best way. People want the maximum visual effects and performance possible for the price. Of course gamers adjust and scale the settings to their target fps, that's why PC gaming is so great.
That, I agree with.

What is a playable framerate for you?
Considering variable refresh rate monitors, 60 fps average is fine, with minimums above 40 fps.

I'm playing Cyberpunk at 1440p with all raytracing maxed at 50-70 fps (depending on the scene) with DLSS quality mode. I add a bit of ReShade sharpening and clarity, and it looks very clean and extremely close to native.
Same with Control: maxed out with all RT, it runs at 40+ fps at 1440p even without DLSS. With DLSS it's obviously way better.
That's fair. Which card? If your card cost more than $500, I would not find that good value. But that's just me.

Curious what your definition of playable fps is.
It's a combination of fps, resolution and price for me. That is one package. For playing at 1080p-ish, I consider anything under $350 reasonable, but preferably sub-$250. For playing at 1440p-ish, I consider up to $500-ish reasonable. For playing at 4K, I consider up to $750 reasonable. Anything above that is a waste of money. All need to be 60 fps average.

Those are my standards. They don't have to be everyone's. I value performance for the money. All of that is achievable with the current cards (if there were actual stock). We only just got to the point of 4K being viable at max settings, after 4K has been around for a long time.
But if you enable RT, everything shifts down two tiers, i.e. the $750 card performs like a $350 card, so to speak. And the 1440p card is suddenly a 720p card. And I doubt performance will get better over time for these cards. This is the best RT performance we're going to get from them.
 

regawdless

Banned
That, I agree with.


Considering variable refresh rate monitors, 60 fps average is fine, with minimums above 40 fps.


That's fair. Which card? If your card cost more than $500, I would not find that good value. But that's just me.


It's a combination of fps, resolution and price for me. That is one package. For playing at 1080p-ish, I consider anything under $350 reasonable, but preferably sub-$250. For playing at 1440p-ish, I consider up to $500-ish reasonable. For playing at 4K, I consider up to $750 reasonable. Anything above that is a waste of money. All need to be 60 fps average.

Those are my standards. They don't have to be everyone's. I value performance for the money. All of that is achievable with the current cards (if there were actual stock). We only just got to the point of 4K being viable at max settings, after 4K has been around for a long time.
But if you enable RT, everything shifts down two tiers, i.e. the $750 card performs like a $350 card, so to speak. And I doubt performance will get better over time for these cards. This is the best RT performance we're going to get from them.

Thanks for the clarification. I'm playing with a 3080 and an 8700k at 5ghz.

If it's not worth it for you, that's perfectly reasonable. It's all a matter of budget. I just don't like the general sentiment that raytracing isn't feasible or possible yet. Because it is, at least for people like me who are stupid enough to drop 800 bucks on a GPU.

And because of the consoles, I truly believe that we'll see more toned down, cheaper implementations that'll run well on cheaper cards.
 

Ascend

Member
Thanks for the clarification. I'm playing with a 3080 and an 8700k at 5ghz.

If it's not worth it for you, that's perfectly reasonable. It's all a matter of budget. I just don't like the general sentiment that raytracing isn't feasible or possible yet. Because it is, at least for people like me who are stupid enough to drop 800 bucks on a GPU.

And because of the consoles, I truly believe that we'll see more toned down, cheaper implementations that'll run well on cheaper cards.
Yeah. If you want to pay $800 for ray tracing, you can use it at 1080p and, to some extent, 1440p, so you can argue that RT can be used. Possible, it definitely is right now. The whole argument is that the performance loss is too big. You pay with a kidney for it, so to speak. So I can't currently call it a feasible solution considering the prices.

We have to remember that the market for that is quite small though. There is a group of people that wants to be at the cutting edge of tech, and they are willing to pay for it. The majority of people are not, and are still sitting on their GTX 1060s, which is why nVidia advertises specifically to that group.
I fall somewhere in the middle: I am extremely interested in tech and graphics, but I'm 'careful' with my money. I never buy consoles at launch, I never buy the highest-end hardware, I buy most games later and at a discount, etc. Not because I don't have the money (I just bought a car in cash last week), but because I want to be efficient with it, especially with things that are not mandatory.
 