
Quake 2 RTX Vulkan Ray Tracing/Path Tracing Benchmarks. RTX 3080 vs. RX 6800 XT.

Ascend

Member
The only "negativity" I've expressed about AMD is in regard to ray tracing and DLSS. I've given them props, especially on their CPUs. I don't harp over and over in every thread on how evil Nvidia is, or constantly downplay DLSS, ray tracing, etc. If you ever noticed, I don't defend Nvidia. You love to defend AMD every chance you get. Big difference there.


Although, it's kinda funny to see you and llien fighting so hard to protect AMD on GAF, and elsewhere I'm sure. I am neutral, and it kills you that I've been using Nvidia for the past couple of years, ever since they had better performance. If that changes, I might switch back, just like I went from Intel to AMD. See, I'm not against AMD; I'm on a Ryzen CPU and waiting till the 5xxx CPUs are back in stock. How am I not neutral again? Imagine if Nvidia competed in the CPU industry right now, you'd turn soooo red.

Again, it's ironic you say virtue signaling. 😂
So your answer is you have never called out shilling against AMD. That's settled. Thank you.

Keep pretending to be neutral though.
 

Buggy Loop

Member
That's not it.

AMD's architectures are dictated by console makers' demands: they want a universal solution, without additional silicon that would do nothing in traditional rasterization.

Nvidia, on the other hand, found a brilliant way to utilise the tech they need for their professional users, for RT and DLSS, in gaming scenarios.

The RT BVH accelerators are there and remain idle until an RT function is called. What AMD mostly skipped is the ML part, since their "ML acceleration" is done in the pipeline with 8-bit/4-bit int.

So how did Nvidia manage to make an ML monster (nearly triple the tensor ops of Turing), put in dedicated RT cores with an ASIC BVH structure (more silicon), and still have that good rasterization?

It's quite a bump from RDNA 1 to RDNA 2 on AMD's side, but it feels like they are a generation behind what you can do with the silicon area, considering that rasterization was not even the main focus of Ampere, yet it stayed in the ring and actually, statistically, wins in general at 1440p and certainly at 4K (and VR).
 

Ascend

Member
The RT BVH accelerators are there and remain idle until an RT function is called. What AMD mostly skipped is the ML part, since their "ML acceleration" is done in the pipeline with 8-bit/4-bit int.

So how did Nvidia manage to make an ML monster (nearly triple the tensor ops of Turing), put in dedicated RT cores with an ASIC BVH structure (more silicon), and still have that good rasterization?

It's quite a bump from RDNA 1 to RDNA 2 on AMD's side, but it feels like they are a generation behind what you can do with the silicon area, considering that rasterization was not even the main focus of Ampere, yet it stayed in the ring and actually, statistically, wins in general at 1440p and certainly at 4K (and VR).
I don't see this silicon area advantage you're talking about. We can't compare die sizes because they're not on the same process, but we can compare transistor counts.

The RTX 3090/3080 has 28.3 billion transistors, while the 6900 XT/6800 XT has 26.8 billion. Quite a few things can be done with 1.5 billion additional transistors. Besides, the Infinity Cache, which is basically a bandwidth booster, takes up quite a bit of space on the die.
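As a back-of-the-envelope sketch, the gap between those two counts works out to roughly 5-6%. The figures are the public counts quoted above; nothing here says how either vendor actually spent the budget:

```python
# Back-of-the-envelope check of the transistor counts quoted above,
# in billions. Public figures for the two GPUs; this says nothing
# about how the budget was spent on either die.
ga102_bn = 28.3    # RTX 3090/3080 (GA102)
navi21_bn = 26.8   # 6900 XT/6800 XT (Navi 21)

delta_bn = ga102_bn - navi21_bn            # extra transistors on GA102
pct_more = (ga102_bn / navi21_bn - 1) * 100

print(f"{delta_bn:.1f}B more transistors, about {pct_more:.1f}% extra")
```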

The rasterization performance of nVidia is no mystery. They simply increased the number of FP32 cores. Those are harder to keep busy at lower resolutions, which is why the 6800 cards pull ahead there.
 
Last edited:


What a joke. Time for the AMD guys to stop posting those fake Watchdogs RT benchmarks.



A 3080 is almost 40% faster at 1440p and almost 50% faster at 4K. Using DLSS, this is playable all the way up to 4K; the 6800 XT, I guess, would be playable with RT only at 1080p. This situation with the borked RT rendering on earlier AMD drivers also shows you how poor the coverage is from the "tech press" and reviewers in general. Just as it is with them using Dirt 5. Just how it was with the GeForce 970, where half a year went by before Reddit users were posting about iffy performance regarding VRAM. The entirety of the tech press in the world didn't catch that. What the fuck were they testing, then? Not even profoundly arrogant guys like Gamers Nexus, for example. It's quite pathetic.
 

Neo_game

Member
For RDNA 2 CUs, BVH traversal workloads are done on the shader units while box and intersection tests are done on the RT hardware, hence large-scale BVH traversal workloads have a blocking effect on normal shader workloads.

When box and intersection tests are active within the CU, the texture units can be blocked. The RT payload needs to be broken down into smaller sizes to minimize the blocking effect.
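To make that split concrete, here is a toy Python sketch of BVH traversal under the model described above: the traversal loop (stack handling, node scheduling) is ordinary shader-style code, and only the box test stands in for the fixed-function unit. The node layout and names are purely illustrative, not AMD's actual hardware interface:

```python
# Toy model of the RDNA 2 split: the BVH traversal loop runs as
# ordinary shader code, while only the box test maps to the
# fixed-function intersection unit. Purely illustrative.

def box_test(origin, inv_dir, lo, hi):
    """Slab test against an AABB: the part RDNA 2 accelerates in hardware."""
    tmin, tmax = 0.0, float("inf")
    for axis in range(3):
        t1 = (lo[axis] - origin[axis]) * inv_dir[axis]
        t2 = (hi[axis] - origin[axis]) * inv_dir[axis]
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(bvh, origin, direction):
    """Traversal loop: runs on the shader ALUs, competing with other work."""
    inv_dir = tuple(1.0 / d if d != 0.0 else float("inf") for d in direction)
    hits, stack = [], [0]          # start at the root node
    while stack:                   # plain shader code on RDNA 2
        node = bvh[stack.pop()]
        if not box_test(origin, inv_dir, node["lo"], node["hi"]):
            continue
        if "prim" in node:         # leaf: record the primitive
            hits.append(node["prim"])
        else:                      # inner node: push children
            stack.extend(node["children"])
    return hits

# A two-leaf BVH: a root box containing primitives "A" and "B".
bvh = [
    {"lo": (0.0, 0.0, 0.0), "hi": (4.0, 4.0, 4.0), "children": [1, 2]},
    {"lo": (0.0, 0.0, 0.0), "hi": (1.0, 1.0, 1.0), "prim": "A"},
    {"lo": (3.0, 3.0, 3.0), "hi": (4.0, 4.0, 4.0), "prim": "B"},
]
print(traverse(bvh, (0.5, 0.5, -1.0), (0.1, 0.1, 1.0)))  # ['A']
```

The "break the payload into smaller sizes" advice then amounts to keeping loops like `traverse` short-lived so they don't monopolize the CU's shader resources.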

RDNA 2's hardware RT is fundamentally inferior when compared to NVIDIA's RTX design.

AMD traded ~100 mm² of die area for 128 MB of L3 cache instead of more robust RT hardware, yet Big Navi's PCB designs are at mid-range, 256-bit-bus BOM cost levels.

I guess all the other RT games where Nvidia crushes AMD are also developed by Nvidia.

Yeah. That optimization is called much better RT hardware.

I never said AMD has better hardware RT, and I do not think anyone expected it either. However, this Quake 2 RTX is optimized for Nvidia graphics cards. Nvidia programmers have worked on it; that is why we are seeing a literal 2x performance advantage here.
 

Mister Wolf

Member
A 3080 is almost 40% faster at 1440p and almost 50% faster at 4K. Using DLSS, this is playable all the way up to 4K; the 6800 XT, I guess, would be playable with RT only at 1080p. This situation with the borked RT rendering on earlier AMD drivers also shows you how poor the coverage is from the "tech press" and reviewers in general. Just as it is with them using Dirt 5. Just how it was with the GeForce 970, where half a year went by before Reddit users were posting about iffy performance regarding VRAM. The entirety of the tech press in the world didn't catch that. What the fuck were they testing, then? Not even profoundly arrogant guys like Gamers Nexus, for example. It's quite pathetic.

I've always liked Wccftech. They usually cover something I'm interested in without any bloat added.
 

magaman

Banned
It's telling just how inefficient and underdeveloped real-time ray tracing is when the best examples of it are Minecraft and Quake, games that are decades old or otherwise graphically unimpressive. This tech really needs another generation before it's ready. Prebaked lighting runs better and looks 90% the same.
 

Buggy Loop

Member
I never said AMD has better hardware RT, and I do not think anyone expected it either. However, this Quake 2 RTX is optimized for Nvidia graphics cards. Nvidia programmers have worked on it; that is why we are seeing a literal 2x performance advantage here.

Ah yeah. While Nvidia was providing a beta branch of Vulkan RT because the API was not ready, AMD, part of the consortium, just sat there while they watched Nvidia create function calls in the API that would benefit Nvidia and nobody else. Then the Khronos Group just released it as is and looked at AMD like this ¯\_(ツ)_/¯

If you think Nvidia somehow poisoned the well for decades-old maths (RT is not Nvidia's invention), then that makes AMD the biggest sleeper on the job when it came to API development.
 
Last edited:

regawdless

Banned
It's telling just how inefficient and underdeveloped real-time ray tracing is when the best examples of it are Minecraft and Quake, games that are decades old or otherwise graphically unimpressive. This tech really needs another generation before it's ready. Prebaked lighting runs better and looks 90% the same.

Depends on the game and usage. Dynamic lighting looks way better with ray tracing. Switching to night time and then firing the standard gun in Q2 RTX looks really amazing; no other game has achieved this kind of realism and believability regarding lighting and shadows.
 
So your answer is you have never called out shilling against AMD. That's settled. Thank you.

Keep pretending to be neutral though.





Who do you think you are, dude? 🤣😂🤣😂 Are you holding a full-on conversation with yourself there? I never even answered your question. I'm just pointing out how ridiculous you look by going apeshit on anyone who tells the truth about AMD having lower performance than Nvidia.

You need to stop taking the internet so seriously, as it can affect, and seemingly is affecting, your personal life. You are so quick to quote and go to war with anyone who doesn't praise AMD or, better yet, doesn't apologize on behalf of Nvidia.

Indeed they did. But their loud & proud defenders didn't apologize. They are constantly overstaying their welcome, not only here but also in the 6800 series review thread and other threads where their noses don't belong. And I have no problem rubbing this fact in their faces until they apologize. And they aren't going to either, which is why they are worse than nVidia.


Lmfao, what kinda weird fetish shit is that? Do I need to show you all my receipts for the AMD/ATI products I currently or previously owned? Do I need to get an AMD tattoo on my ass crack? You are borderline mental; get that checked out. AMD is not your friend, even if you are on their payroll.



I'm not sure if I can say you are simping hella hard for AMD, but this isn't twitch or REEEE.




 

Ascend

Member
Not after what happened last weekend.
Ironically, he's among them... But for the record... I'll just show this;

Waiting for Ascend and LLien to shit up the thread as per usual. They are way worse, than the worst Nvidia fanboys. But entertaining nonetheless.

It would be cool if AMD had the same engineers from the Ryzen team working on the GPUs, as the two are night and day different.
Poisoning the well (or attempting to poison the well) is a type of informal fallacy where adverse information about a target is preemptively presented to an audience, with the intention of discrediting or ridiculing something that the target person is about to say.

The likes of him thrive on fallacies and bullying. Pretty much like nVidia. At least nVidia apologized. These people don't, and keep on hammering.
 
Last edited:

alucard0712_rus

Gold Member
I never said AMD has better hardware RT, and I do not think anyone expected it either. However, this Quake 2 RTX is optimized for Nvidia graphics cards. Nvidia programmers have worked on it; that is why we are seeing a literal 2x performance advantage here.
It runs "poor" even on NVIDIA; it's a fully path-traced game. I don't think it will be any better on AMD even with all possible optimizations.
 

Mister Wolf

Member
Depends on the game and usage. Dynamic lighting looks way better with ray tracing. Switching to night time and then firing the standard gun in Q2 RTX looks really amazing; no other game has achieved this kind of realism and believability regarding lighting and shadows.


Imagine Halo Infinite at night with all of its plasma weapons using that same raytraced emissive effect.
 

quest

Not Banned from OT
Water is wet and the sky is blue; what other obvious things need to be stated? The patents said it all: this version of RT was about saving die space so they could dedicate those savings to traditional rendering. It came at the price of performance, of course. No reason for anyone to get their panties twisted up over it.
 
Last edited:

Ascend

Member
What the hell is wrong with you, like seriously. Your obsession with Nvidia being evil and bad can't be healthy.

I'm baffled.
Did you just miss the Hardware Unboxed fiasco?
And it's got more to do with the blind nVidia followers than nVidia itself. The amount of brainwashed sheep defending nVidia for this was baffling. They have the audacity to say that AMD is not my friend, but defend nVidia's actions in such a case. And I'm supposed to remain fine and dandy when those people start trashing everyone that agrees with Hardware Unboxed.

Nothing is wrong with me.
 
Did you just miss the Hardware Unboxed fiasco?
And it's got more to do with the blind nVidia followers than nVidia itself. The amount of brainwashed sheep defending nVidia for this was baffling. They have the audacity to say that AMD is not my friend, but defend nVidia's actions in such a case. And I'm supposed to remain fine and dandy when those people start trashing everyone that agrees with Hardware Unboxed.

Nothing is wrong with me.
If AMD did the same, I would support AMD. Fuck anyone who purposely excludes the big differentiator between GTX and RTX GPUs, and the biggest thing in today's GPUs, only to then shill for AMD GPUs and even skew the benchmarks. If the shoe were on the other foot, you wouldn't be singing the same song. That's why so many people have asked what is wrong with you, in every single thread where you witch-hunt anyone who bought an Nvidia GPU. You have more posts in the Nvidia off-topic thread than in AMD threads. What kinda weirdo shit is that? Don't try and play the victim now.
 
Last edited:

Ascend

Member
Who do you think you are dude 🤣😂🤣😂. Are you holding a full on conversation with yourself there? I never even answered your question.
Oh, I'm well aware that you didn't answer it. You pretended to answer it as a way to escape the fact that you're one of the most biased, passive-aggressive assholes on here, one who needs to constantly virtue signal to everyone else how good you are.
That's why you didn't answer. Because you can't answer. You have no evidence of ever speaking out against someone shilling against AMD. You only call out so-called anti nVidia shills, and never call out shilling against AMD, because you're an nVidia fanboy pretending to be neutral.
You're like feminists. When you get caught for spouting BS you invent something else to create a problem about.

I'm just pointing out how ridiculous you look by going ape shit on anyone who says the truth about AMD having lower performance than Nvidia.
Nope. AMD's RT performance is lower than nVidia's. I never said otherwise. Prove where I said otherwise. I'll wait.
Oh. You're gonna avoid answering this one too, aren't you?

You need to stop taking the internet so seriously, as it can affect, and seemingly is affecting, your personal life. You are so quick to quote and go to war with anyone who doesn't praise AMD or, better yet, doesn't apologize on behalf of Nvidia.
nVidia already did that. You should apologize for yourself.
Maybe you should take the internet more seriously. After all, we are all humans behind the keyboard. But you have no issue trashing others for fun. Unfortunately for you, this voice is not so easily silenced.

Lmfao what kinda weird fetish shit is that? Do I need to show you all of my receipts of AMD/ATI products that I currently/previously owned?
I don't need to see receipts. And the apology is not to nVidia, or to anyone else; the apology is for your own BS actions. And you damn well know that. But one can't expect someone who only uses fallacies to have an honest conversation.

Do I need to get an AMD tattoo on my ass crack?
Would beat that nVidia dick you love riding so much.

You are borderline mental, get that checked out.
🖕

AMD is not your friend, even if you are on their payroll.
I am not on their payroll and I have no delusions about them being a business out for my money. But keep spouting nVidia's marketing drivel. I'm sure you're doing that for free.

I'm not sure if I can say you are simping hella hard for AMD, but this isn't twitch or REEEE.
BS. I'm anti-fanaticism. Sure, laugh it off. You can't see it because your fanaticism for nVidia has gotten to your brain.

You have more posts in the Nvidia off topic than AMD threads. What kinda weirdo shit is that. Don't try and play the victim now.
Prove it. I'm pretty sure that I don't. Or are you again;

Poisoning the well (or attempting to poison the well) is a type of informal fallacy where adverse information about a target is preemptively presented to an audience, with the intention of discrediting or ridiculing something that the target person is about to say.

Asshole.

If AMD did the same, I would support AMD.
Suuuuuuuuuuuuuure. Yet you have zero evidence that you have ever stood up against shilling for nVidia or against AMD.
 
Last edited:

regawdless

Banned
Did you just miss the Hardware Unboxed fiasco?
And it's got more to do with the blind nVidia followers than nVidia itself. The amount of brainwashed sheep defending nVidia for this was baffling. They have the audacity to say that AMD is not my friend, but defend nVidia's actions in such a case. And I'm supposed to remain fine and dandy when those people start trashing everyone that agrees with Hardware Unboxed.

Nothing is wrong with me.

You hated Nvidia before that, this incident hasn't changed your mind. This is just more oil for your biased fire of Nvidia hate.

Why do you let your personal distaste for Nvidia and their politics poison your view and why do you bring that distaste into every technical discussion about Nvidia, raytracing etc? It's like you're on a crusade.

It's getting tiresome.
 


What a joke. Time for the AMD guys to stop posting those fake Watchdogs RT benchmarks.


It is good that we can finally see the game running on AMD with proper ray tracing effects. I remember we had a bunch of posters spreading FUD and misinformation that the developers somehow left out RT effects on purpose, or purposely rendered lower-quality effects on this game for AMD.

Even when corrected multiple times, as this was called out in AMD's reviewer documents under "known issues" for the missing RT reflections, some people still pretended they didn't see it so they could brand-warrior in some threads. I'm glad to see it finally cleared up and retested. Oddly enough, it turns out that it wasn't an AMD driver issue; Ubisoft simply had not officially enabled the RT functionality on AMD cards yet, which they did with a recent patch.

As for the performance itself, it is actually pretty respectable compared to Nvidia cards, granted that is without DLSS enabled. Obviously the Nvidia cards take the lead at their respective tiers, but for a title where the RT implementation/PC port was sponsored by Nvidia and optimised for their cards, that is not a terrible result at all. A far cry from fully path-traced titles like Minecraft and Quake, for example.
 

Ascend

Member
You hated Nvidia before that, this incident hasn't changed your mind. This is just more oil for your biased fire of Nvidia hate.
I freely admit that this is 100% true. Because that is not the first incident and it will not be the last.
TWIMTBP
PhysX
DX10.1
GameWorks
970 3.5GB
GPP
Then there's the HUB incident.

And I'm sure I forgot a few.

Tell me. Is that better than all those shills that pretend to be neutral? What do you prefer? Passive aggressiveness or honest open dislike?

Why do you let your personal distaste for Nvidia and their politics poison your view and why do you bring that distaste into every technical discussion about Nvidia, raytracing etc? It's like you're on a crusade.
Where in this thread, have I said something that is technically inaccurate?

It's getting tiresome.
What's getting tiresome is the likes of you believing lies others are saying about me and then also having a go at me so that you can feel like you're on the 'right side'.
 
Last edited:
Holy meltdown! Lol, and I'm the passive aggressive one.... Can you imagine having that kind of intimate relationship with a company?! Yikes 😳.

Anyways, it's good that the Vulkan drivers and everything have been updated, so devs, consumers, etc. can run all the benches, games, and so on. It should have been that way from day one, but at least they didn't wait till next year. Hoping RDNA3 punches above its weight, so ray tracing on ALL sides gets a boost. Whichever one wins, they may be closer, or further apart, than ever. Supposedly the 3080 Ti arrives in February, and maybe another SKU from AMD shortly after.
 

Ascend

Member
Holy meltdown! Lol, and I'm the passive aggressive one.... Can you imagine having that kind of intimate relationship with a company?! Yikes 😳.
I'm still waiting for your evidence that;

a) You have spoken out against people shilling for nVidia or against AMD
b) Where I supposedly said that AMD's RT is better than nVidia's
c) I supposedly have more posts in the Nvidia off topic thread than AMD threads

Anyways, it's good that the Vulkan drivers and everything have been updated, so devs, consumers, etc. can run all the benches, games, and so on. It should have been that way from day one, but at least they didn't wait till next year. Hoping RDNA3 punches above its weight, so ray tracing on ALL sides gets a boost. Whichever one wins, they may be closer, or further apart, than ever. Supposedly the 3080 Ti arrives in February, and maybe another SKU from AMD shortly after.
And there's the virtue signaling again. "RDNA3 punches above its weight", i.e. implying it's pretty much guaranteed to be slower than nVidia's, and thus needs to 'punch above its weight' to actually have a chance of catching up. The truth always shines through in the language used.

But, I agree. It's good that the Vulkan API has been upgraded to support RT on both nVidia and AMD cards. Maybe in a couple of years RT will actually be usable for a reasonable amount of money.

What's wrong with PhysX? I liked it and miss it a lot.
You could buy a cheap nVidia card and use it just for PhysX, alongside an AMD card. nVidia disabled that option in their drivers and then blamed AMD for it.
And prepare for the number of people who are going to jump on me saying that it was indeed AMD's fault.
 
Last edited:

Dampf

Member
There's a world of difference in terms of visual sharpness and clarity between DLSS on and native. 1440p DLSS means sub-1080p. It's whatever. It also gimps RT effects like reflections, so you get to enjoy 960p RT reflections instead of 1440p. So let's not kid ourselves about how good it is; it might still be worth using for the performance, but you're nowhere near native.






Cyberpunk 2077's TAA has integrated sharpening which you can't turn off

(https://forums.cdprojektred.com/index.php?threads/forced-sharpening-in-1-04.11045372/)

DLSS sharpening, however, is inactive:
https://pastebin.com/mg5Mh3sV

[DLSS]
Enable = false
EnableMirrorScaling = true
MirrorScaling = -1.000000
Quality = 2
Sharpness = 0.000000

This is the reason the game appears blurry with DLSS on. However, this can be fixed by using ReShade CAS or Sharpening in the Nvidia Control Panel.

Here's a fair comparison: DLSS Performance + CAS (1440p output, 720p internal) vs. native 1440p with the game's integrated sharpening.


The DLSS image, even in Performance mode, actually has more detail than native rendering. Look at the 6909 number on the car beneath the left brake light: the number is visible with DLSS, but not with TAA. The car textures are more detailed, and on the wall textures above the car there are additional wear spots.
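For reference on the resolution math in this exchange, the internal render resolution follows from a per-axis scale factor for each DLSS 2 quality mode. The factors below are the commonly cited defaults; treat the exact values as an assumption, since individual games can override them:

```python
# Internal render resolution implied by a DLSS 2 quality mode.
# The per-axis scale factors are the commonly cited defaults;
# treat them as an assumption, since games can override them.
DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Return the (width, height) DLSS actually renders at internally."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_resolution(2560, 1440, "Performance"))  # (1280, 720)
print(internal_resolution(2560, 1440, "Quality"))      # (1707, 960)
```

At a 1440p output, Performance mode lands at 1280x720 (the "720p internal" above) and Quality mode at roughly 1707x960, which matches the sub-1080p and 960p figures mentioned earlier in the thread.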
 
Last edited:

regawdless

Banned
I freely admit that this is 100% true. Because that is not the first incident and it will not be the last.
TWIMTBP
PhysX
DX10.1
GameWorks
970 3.5GB
GPP
Then there's the HUB incident.

And I'm sure I forgot a few.

Tell me. Is that better than all those shills that pretend to be neutral? What do you prefer? Passive aggressiveness or honest open dislike?


Where in this thread, have I said something that is technically inaccurate?


What's getting tiresome is the likes of you believing lies others are saying about me and then also having a go at me so that you can feel like you're on the 'right side'.

See, the problem is that you are using categories like the "right side" or the wrong side. You're very agenda-driven. Openly admitting that you dislike Nvidia doesn't make it better.

Like, what's your goal? You won't change anything, because Nvidia doesn't care about you. You just ridicule yourself in an online forum, while members and lurkers shake their heads, asking themselves how invested someone can be in the actions of a company that sells GPUs.
 
Last edited:

Soltype

Member
You could buy a cheap nVidia card to use it just for PhysX along an AMD card. nVidia disabled that option in their drivers and then blamed AMD for it.
And prepare for the amount of people that are going to jump on me saying that it was indeed AMD's fault.
Nvidia bought PhysX, right? Why would they allow people to use a competitor's product with their investment?
 

Ascend

Member
See, the problem is that you are using categories like the "right side" or wrong side.
Oh give me a break. It's quite clear that if you say anything negative about RT on here, people jump on you for disagreeing. You're only allowed to be positive towards RT. That's when the sides were created. I did not make them up.

You're very agenda driven.
And all the ones hyping up RT aren't?

Openly admitting that you dislike Nvidia doesn't make it better.
Well then. I guess that says enough.

Like, what's your goal?
I don't think you honestly want to know. It's quite clear that I'm pretty much anti-hype, because hype makes people make stupid buying decisions. And RT is the biggest hype right now.

You won't change anything because Nvidia doesn't care about you.
I know nVidia doesn't care about me. They don't care about anyone else in here, and that's the point. But the adoration of RT is through the roof.

You just ridicule yourself in an online forum, while members and lurkers are shaking their heads, asking themselves how invested someone can be regarding the actions of a company that sells GPUs.
This is not about AMD vs nVidia. This is about gamers vs corporations. The AMD vs nVidia framing is what people paint it as, so they can ignore the fact that nVidia is a shit company and still feel good about buying their products.

quote-first-they-ignore-you-then-they-laugh-at-you-then-they-fight-you-then-you-win-mahatma-gandhi-68010.jpg


Nvidia bought Physx right?Why would they allow people to use a competitors product with their investment?
Again... You could have had a main Radeon graphics card, bought a $50 nVidia card, and used that $50 nVidia card to enable PhysX. Both graphics cards would need to be in your system at the same time. And nVidia disabled that feature by disabling PhysX as soon as any Radeon graphics card was detected.
You still bought THEIR card to run it, and then you couldn't use it for PhysX anymore unless you removed the Radeon card from your system.
 
Last edited:

regawdless

Banned
Oh give me a break. It's quite clear that if you say anything negative about RT on here, people jump on you for disagreeing. You're only allowed to be positive towards RT. That's when the sides were created. I did not make them up.


And all the ones hyping up RT aren't?


Well then. I guess that says enough.


I don't think you honestly want to know. It's quite clear that I'm pretty much anti-hype. Because hype makes people make stupid buying decisions. And RT is the biggest hype right now.


I know nVidia doesn't care about me. They don't care about anyone else in here, and that's the point. But the adoration of RT is through the roof.


This is not about AMD vs nVidia. This is about gamers vs corporations. The AMD vs nVidia thing is what people paint it as to ignore the fact that nVidia is a shit company, so that they can still feel good about buying their products.

quote-first-they-ignore-you-then-they-laugh-at-you-then-they-fight-you-then-you-win-mahatma-gandhi-68010.jpg

So your goal is to prevent people from buying Nvidia products because you don't like their politics.

That's great.

Keep up the good fight, our blessed morally superior crusader. Don't let the corps win.
 

Ascend

Member
So your goal is to prevent people from buying Nvidia products because you don't like their politics.

That's great.

Keep up the good fight, our blessed morally superior crusader. Don't let the corps win.
That's quite narrow. When the MSRP of the 6800 XT Nitro+ was revealed to be over $100 higher than the reference card's, I immediately said that I would not buy that card. Want the proof? Here;



So I will repeat yet again, since apparently it shot out the other ear: this is not about AMD vs nVidia. It's about people not making stupid buying decisions based on hype.

But thanks for judging....
 
Last edited:

regawdless

Banned
That's quite narrow. When the MSRP of the 6800XT Nitro+ was revealed to be over $100 higher than the reference card, I immediately said that I would not buy that card. Want the proof? Here;



So I will repeat yet again, since apparently it shot out the other ear. This is not about AMD vs nVidia. It's about people not making stupid buying decisions based on hype.

So you're not at all about Nvidia vs AMD, you just want to help people make the right buying decision. While openly hating Nvidia, which surely has no influence at all on your assessment of what is a good and intelligent buying decision and what isn't.

Very convincing, I must say.

Sorry for the derail, let's get back to the topic and stop the individual psychoanalysis :messenger_tears_of_joy: :messenger_winking:
 
Last edited:

Soltype

Member
Again... You could have had a main Radeon graphics card, bought a $50 nVidia card, and used that $50 nVidia card to enable PhysX. Both graphics cards would need to be in your system at the same time. And nVidia disabled that feature by disabling PhysX as soon as any Radeon graphics card was detected.
You still bought THEIR card to run it, and then you couldn't use it for PhysX anymore unless you removed the Radeon card from your system.
I ain't mad at them on that one; they're making sure you have 2 of their cards to run their stuff. That's straight up protecting an investment.
 

Lethal01

Member
Nvidia is clearly ahead. That said, I think it's reasonable to claim that AMD is underperforming partly because Nvidia has had more time to optimize their stuff, having launched over a year earlier.
 

Lethal01

Member
Someone really needs to explain to me why a feature meant to streamline the development process for artists is being sold to gamers like it is...

Again, I’m not saying you shouldn’t be impressed with the results or anything but this is just a graphics feature that doesn’t radically change the game in a significant way.

Anyone remember PhysX in Alice: Madness Returns? The entire game felt empty and soulless without it. Ray tracing just makes everything look "different", but not better.


If something like this made tangible differences, like world destruction, particle effects, fluid physics, etc., I could be on board with it, but ray tracing is just so... eh


Because it's not just streamlining the development pipeline. It's making the graphics in those games better.
You could say "it's not better, just different", while I could say that Cyberpunk or Ghost of Tsushima isn't "better" looking than Mario 64, just different.
 

Blond

Banned
Because it's not just streamlining the development pipeline. It's making the graphics in those games better.
You could say "it's not better, just different", while I could say that Cyberpunk or Ghost of Tsushima isn't "better" looking than Mario 64, just different.

I have every available game with ray tracing on the PS5 and I can think of 2 instances that looked "Better/Different"

1. DMC 5, stage 12: there were a bunch of puddles that suddenly reflected, and it made the scene have a bit of depth, but before that I forgot the ray tracing mode was even on.

2. The Miami stage on COD: Cold War, looks like a lit neon night after it rains.

Beyond that, nothing looks "better". Ray tracing in Spiderman is less impressive because you could honestly screenshot identical scenes and not notice a difference. The ONE TIME I thought "Yes, that's for sure ray tracing!", I turned it off and realized the reflections/lights looked identical, but the game looked like it lacked ambient occlusion, which I'm sure they could implement in the normal game.

Like I said: different, but not "better." This is a feature meant for developers, not gamers. One of those background tasks we shouldn't concern ourselves with, yet here we are.
 
Last edited:

Chiggs

Member
I gotta hand it to Ascend here. Busting out the Gandhi quote just takes the showmanship to lofty new heights. A tip of the hat, good sir.

Tell me RTX isn't the single most divisive and volatile subject on this forum. It just pulls you in, and soon you're saying and doing things you'd never imagine yourself saying or doing.
 
Last edited:

Buggy Loop

Member
You hated Nvidia before that, this incident hasn't changed your mind. This is just more oil for your biased fire of Nvidia hate.

Why do you let your personal distaste for Nvidia and their politics poison your view and why do you bring that distaste into every technical discussion about Nvidia, raytracing etc? It's like you're on a crusade.

It's getting tiresome.

He probably accepted this incident or turns a blind eye to it


That’s why we say AMD is not your friend. Neither Nvidia nor AMD is my friend; they’re corporations that make cards, and I give my money to whichever makes the most sense, period. YouTubers don’t deserve your simping either, ffs 🤦‍♂️, especially not for a freebie.
 

Lethal01

Member
I have every available game with ray tracing on the PS5 and I can think of 2 instances that looked "Better/Different"

1. DMC 5, stage 12: there was a bunch of puddles that suddenly reflected, and it gave the scene a bit of depth, but before that I'd forgotten the ray tracing mode was even on.

2. The Miami stage in COD: Cold War, which looks like a neon-lit night after it rains.

Beyond that, nothing looks "better." Ray tracing in Spider-Man is less impressive because you could honestly screenshot identical scenes and not notice a difference. The ONE TIME I thought "Yes, that's for sure ray tracing!" I turned it off and realized the reflections/lights looked identical; the game just looked like it lacked ambient occlusion, which I'm sure they could implement in the normal game.

Like I said: different, but not "better." This is a feature meant for developers, not gamers. One of those background tasks we shouldn't concern ourselves with, yet here we are.

Every single window you swing past looks noticeably worse without ray tracing to me. I could be here all day listing specific examples. Same goes for Watch Dogs. Cyberpunk 2077 looks a generation ahead using ray tracing.

I suppose we can agree there are specific scenes where it adds nothing. Obviously, if a game is only using ray tracing for reflective surfaces, then a totally rough scene won't gain any benefit. But the majority of the time it's adding a lot.
 

3liteDragon

Member
Benchmarks paid by nvidia don't count.
Nah man, these benchmarks are real. I actually thought in the beginning that AMD would be able to compete with NVIDIA when it came to ray-tracing performance. I was pretty naive about this because I didn't know about the R&D process NVIDIA had going for years before all this to get to where they are now with real-time ray tracing, but then I read the white paper for Ampere and learned just how much of the RT process is hardware-accelerated.


They have HW acceleration for BVH traversal, ray/triangle and ray/bounding-box intersection tests, and instance transformation (someone correct me if I’m wrong on this, but I’m guessing that's hardware for rapidly updating asset changes, like breaking glass, where every bit of shattered glass gets updated in the BVH and ray-traced in real time instead of the object being removed from the BVH entirely). The Ampere cards are level 3 cards in ray tracing; there are six levels of RT in total, and achieving FULLY HW-accelerated RT takes more time and research (and of course, more custom hardware).

  • Level 0: Legacy solutions
  • Level 1: Software on traditional GPUs
  • Level 2: Ray/box and ray/tri-testers in hardware
  • Level 3: Bounding Volume Hierarchy (BVH) processing in hardware
  • Level 4: BVH processing and coherency sorting in hardware
  • Level 5: Coherent BVH processing with Scene Hierarchy Generation (SHG) in hardware

As has been pointed out before and elsewhere, ray tracing is not a new subject or a new computing technique. The following is a bit of an expansion on the six levels proposed by Imagination.


Level 0: There have been many ambitious Level 0 attempts, but all unfortunately failed, and yet new designs with custom APIs continue to be announced. The biggest reason for failure was the discontinuity with how traditional GPUs process data; part of it has been trying to create a whole new paradigm. Without continuity, a completely new and incompatible ecosystem is imposed, and it doesn't offer an evolutionary adoption path. Imagination Technologies' OpenRL was the first attempt to have a link with standard 3D APIs such as OpenGL.


Level 1: Ray tracing is treated as an app and runs on conventional processors, x86 being the most common. Such a software solution ensures continuity with the existing ecosystem; compute/shader paths are used to execute the ray-tracing functionality. However, because a scene can have so many rays in flight simultaneously, even a 2-, 4-, or 16-core CPU will struggle with the computational load. For a real-time experience, one must use many tricks, hacks, and shortcuts, as well as limit the resolution.


An example is Adshir's LocalRay, where the secondary rays are handled a priori in coherent beams. This improves not only parallelism and performance but cache usage as well, and it is not limited in resolution or usage and needs no tricks.


Level 2: Ray/box and ray/triangle testers can be implemented with standard fused multiply-add operations on a GPU's shader cores, but this repeated operation is expensive (cycles/power/area cost). A Level 2 solution offloads a large part of the ray-tracing job to dedicated hardware, improving efficiency.
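To make the "repeated multiply-add" point concrete, here's a hypothetical Python rendering of the classic ray/box "slab" test; the per-axis multiply-adds below are exactly the kind of repeated FMA work that dedicated Level 2 testers take off the shader cores. The function name and argument layout are made up for illustration, not taken from any vendor's design:

```python
# Slab test: intersect the ray with each axis-aligned pair of box planes
# and check that the entry/exit intervals overlap.

def ray_aabb_hit(origin, inv_dir, box_min, box_max):
    """inv_dir holds 1/direction per axis (use a large value for zero)."""
    t_near, t_far = 0.0, float("inf")
    for axis in range(3):
        # Distances along the ray to the two slab planes on this axis.
        t0 = (box_min[axis] - origin[axis]) * inv_dir[axis]
        t1 = (box_max[axis] - origin[axis]) * inv_dir[axis]
        if t0 > t1:
            t0, t1 = t1, t0
        t_near = max(t_near, t0)   # latest entry across all slabs
        t_far = min(t_far, t1)     # earliest exit across all slabs
    return t_near <= t_far         # overlapping interval means a hit

# A ray from the origin along +x hits a unit box centred at (5, 0, 0):
print(ray_aabb_hit((0, 0, 0), (1.0, 1e9, 1e9), (4.5, -0.5, -0.5), (5.5, 0.5, 0.5)))
```

A GPU tracing millions of rays runs this little routine billions of times per frame, which is why baking it into fixed-function units pays off.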


Level 3: Bounding Volume Hierarchy (BVH) processing provides a more extensive offloading of data-flow management in hardware. The BVH helps cut down the amount of ray testing needed through a hierarchical testing system, thus making real-time ray tracing possible. Tracing a ray through the acceleration structure is much more complicated than just ray/box and ray/triangle testing. Complex and dynamic data flow is required, where each box test step decides what happens next, e.g., more hierarchical box tests and/or triangle tests. There are significant opportunities to streamline this process by moving the full BVH tree-structure walking into hardware. It can improve execution efficiency, bandwidth, and caching efficiency, enabling the next level of ray-tracing acceleration.
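The "each box test decides what happens next" data flow above can be sketched as a stack-based tree walk. This is a toy Python illustration under invented assumptions (the node layout and the `hit_box`/`hit_tri` callbacks are made up); it shows the dynamic control flow that a Level 3 design moves into hardware, not any actual implementation:

```python
# Stack-based BVH traversal: box tests steer the walk, triangle tests
# only run once a leaf is reached.

def closest_hit(nodes, ray, hit_box, hit_tri):
    """nodes: dict node_id -> ("inner", child_ids) or ("leaf", tri_ids).
    hit_box(ray, node_id) / hit_tri(ray, tri_id) return a distance or None."""
    best_t, best_tri = float("inf"), None
    stack = [0]                        # start at the root node
    while stack:
        node = stack.pop()
        if hit_box(ray, node) is None:
            continue                   # one box test culls the whole subtree
        kind, items = nodes[node]
        if kind == "leaf":
            for tri in items:          # leaf reached: run the triangle tests
                t = hit_tri(ray, tri)
                if t is not None and t < best_t:
                    best_t, best_tri = t, tri
        else:
            stack.extend(items)        # inner node: descend into children
    return best_tri
```

Every iteration branches on the previous test's result, which is exactly the irregular, data-dependent flow that makes this step worth dedicated hardware.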


Level 4: BVH processing with coherency sorting in hardware can increase the processing and bandwidth efficiency of ray tracing. Ray tracing struggles with coherency, as bouncing rays generate ever more divergence in ray directions. Each ray needs to walk through the BVH structure, and if each ray follows a different path, the result is very poor memory-access and caching efficiency. As divergent rays also hit different objects, this mismatches the SIMD nature of all modern GPU architectures: different ray hits mean different shaders. Adding coherency sorting across the rays in flight helps with SIMD and BVH memory-access efficiency for higher real-world ray-rate utilization. This type of hardware coherency engine is similar to Imagination Technologies' tile-based deferred rendering (TBDR), which uses a unique sorting block to ensure coherent processing of pixels; the coherency engine enables the same for rays. A hardware ray coherency sorting engine enables this 4th level of efficiency.
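One very simple software analogue of this idea is bucketing rays by the sign octant of their direction, so rays likely to walk similar BVH paths (and invoke similar shaders) get traced together. This sketch is purely illustrative, assuming a made-up `(origin, direction)` ray tuple; real hardware sorters are far more elaborate:

```python
# Group rays into 8 direction octants so each batch traverses the BVH
# more coherently than an arbitrary ray ordering would.

def sort_rays_by_octant(rays):
    """rays: list of (origin, direction) tuples, grouped by direction octant."""
    def octant(direction):
        # One bit per axis sign gives a 3-bit key: 8 buckets of roughly
        # coherent rays.
        return sum(1 << axis for axis in range(3) if direction[axis] < 0)
    return sorted(rays, key=lambda ray: octant(ray[1]))
```

The payoff is the same one described above: rays in the same bucket tend to touch the same BVH nodes and cache lines, and tend to hit surfaces that run the same shader, which suits the GPU's SIMD execution.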


Level 5: Full acceleration of the ray-tracing process in hardware. Building an efficient BVH structure is complex and expensive. It can be done on the CPU and/or the GPU using a variety of algorithms and approaches. However, achieving optimal Level 5 efficiency calls for a dedicated hardware solution. A hardware BVH builder enables much higher performance with high efficiency for very detailed dynamic 3D scenes. When this capability is added to a design below Level 4, it can be recognized as a "Plus" level, e.g., a Level 2 Plus solution.
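For a feel of what the builder actually does, here is a toy median-split BVH construction in Python. Everything in it is an invented simplification (leaf size of 2, longest-axis split, bare centroid tuples); production builders use surface-area heuristics and heavy parallelism, which is precisely why a dedicated hardware builder is the hard part of Level 5:

```python
# Recursive median split: sort primitives along the widest axis of their
# centroids and cut the list in half until the leaves are small enough.

def build_bvh(tris):
    """tris: list of (centroid, ...) tuples. Returns a nested tree of
    ("leaf", tris) and ("node", left_subtree, right_subtree)."""
    if len(tris) <= 2:
        return ("leaf", tris)
    # Split along the axis where the triangle centroids are most spread out.
    spread = lambda a: max(t[0][a] for t in tris) - min(t[0][a] for t in tris)
    axis = max(range(3), key=spread)
    ordered = sorted(tris, key=lambda t: t[0][axis])
    mid = len(ordered) // 2
    return ("node", build_bvh(ordered[:mid]), build_bvh(ordered[mid:]))
```

Even this toy version has to sort and recurse over every primitive, and a dynamic scene (the shattering glass example earlier in the thread) has to redo or refit this work every frame.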

AMD’s RDNA 2 cards are level 2 cards in ray-tracing since they only have custom hardware acceleration for ray/triangle and ray/bounding box intersection tests, that’s pretty much it.


The part highlighted in red here is what the RX 6000 series cards (level 2 RT solution) have HW acceleration for, versus what NVIDIA has HW acceleration for (level 3 RT solution).
ZPQT8Cf.jpg


NVIDIA’s so far ahead that they’ve even moved on to ray-traced MOTION BLUR.
lRgqNtN.jpg


I think AMD's RT performance will improve a lot with RDNA 3 and 4; I don’t think it’s fair to just shit on them, since this is their first attempt at it, and it’ll only get better from here. But I think NVIDIA will probably achieve level 5 by the time the RTX 50 or 60 series cards are out because of how early they started R&D on this. I was actually planning to get the 3080 but might as well wait for the 3070 Ti next year.
 
Last edited: