
Whatever you do, don't buy an RX 5000 Series GPU. And if you have one, sell it.

So...basically we are back at "the card doesn't have RT so card = bad"?

You are stanning here for a card with very weak RT performance, mentioning that to get it running well in new games you can dial down or adjust settings. If you are going to do that, then what is the point of even turning on RT on a card with such weak RT performance to begin with? You realise that you can simply turn off RT on the 5700XT, right? And with RT off it will outperform the 2060 and 2060 Super. Otherwise, all of the same arguments you make for a 2060, for example, can apply to the 5700XT as well.

Simply put, the 5700XT is a hands-down better, more performant card than the 2060 or 2060 Super. Your RT and DX12 feature set arguments don't really stand up to scrutiny, sorry.
 

spyshagg

Should not be allowed to breed
This is just a stupid argument. It won't run DXR titles at Ultra Max Settings in 3 years, and it doesn't have to, but it certainly will run with great, next-gen graphics and great performance if you adjust the settings accordingly.

No, they won't work great with lower settings! Neither card from 2019 in that price bracket will! They struggle today, let alone in 3 years.

Your argument is lost. It happens. Get over it.
 
Funnily enough, for a 2018 architecture, Turing does indeed have hardware support for the DX12U feature set.

DX-12-Feature-Capabilities.jpg

That's crazy impressive indeed considering it's from 2018. I remember mesh shaders from the Turing presentation, but I didn't know it even has sampler feedback.

Is that from AnandTech?
 

Dampf

Member
So...basically we are back at "the card doesn't have RT so card = bad"?

You are stanning here for a card with very weak RT performance, mentioning that to get it running well in new games you can dial down or adjust settings. If you are going to do that, then what is the point of even turning on RT on a card with such weak RT performance to begin with? You realise that you can simply turn off RT on the 5700XT, right? And with RT off it will outperform the 2060 and 2060 Super. Otherwise, all of the same arguments you make for a 2060, for example, can apply to the 5700XT as well.

Simply put, the 5700XT is a hands-down better, more performant card than the 2060 or 2060 Super. Your RT and DX12 feature set arguments don't really stand up to scrutiny, sorry.
The point of turning RT on is to get dramatically better visual quality. Even at lower settings, it makes a huge difference once it's turned on in games like Metro, Control and Watch Dogs. And with DLSS, most of the time you can gain back much more performance than RT at moderate settings costs.

And again, you assume you can turn off RT forever. This won't happen, read my OP please:

Next, there is Raytracing. Raytracing is not a gimmick. Raytracing saves so much time for devs, and it gets more efficient each day; it's going to be an integral part of next-gen games in the future. Even the consoles support it in decent fashion, and it is the future of rendering, don't let anyone tell you otherwise. Nvidia recently released their RTXGI SDK, which allows for dynamic GI using light probes updated via Raytracing, and it doesn't destroy performance, it is extremely efficient. This means developers don't have to pre-bake lighting anymore, saving so much time and cost when developing games. RTXGI will work on any DXR capable GPU, including consoles and RDNA2. However, the 5700XT is not even capable of emulating it in software (well, it could be if AMD enabled DXR support), meaning a game using RTXGI as its GI solution won't even boot up on a 5700XT anymore. However, given that the RT capable userbase is growing each day with Turing, Ampere, Pascal, RDNA2 and the consoles, this is likely a non-issue for devs. If AMD decides to suddenly implement DXR support for the 5700XT, you could still play the game, but with much worse performance and visual quality than the DX12U capable GPUs due to the lack of hardware acceleration for Raytracing.
 
And again, you assume you can turn off RT forever. This won't happen, read my OP please:

Let me make this clear for you with a definitively true statement: You will absolutely always be able to turn off RT in games. In the same way that you can turn up/down/off AA, Shadows, AO, AF and the dozens of other graphical effects that have existed for decades and been incorporated into game engines far longer than RT has.

This argument that RT will be mandatory within the next 2 years on PC is completely bizarre and totally detached from reality. Even in 10 years' time, when all engines are built for RT from the ground up, there will absolutely always be a fallback raster option and the ability to turn it off.
 

mr.dilya

Banned
You seem pretty angry. What did I do? Did I make you feel insecure about your purchase? Maybe that's a good thing, as cards like yours hold back game development.

I mean, as long as you want to play cross gen titles with last gen console graphics, you will be fine.


lol @ holding back game development.

You seem like the angry one, going out of your way to tell people to sell off cards that are running the games they want to play just fine, to go try to get a card that isn't even available. You are a terrible salesman. Nvidia should fire you and hire someone more competent.
 

Dampf

Member
Let me make this clear for you with a definitively true statement: You will absolutely always be able to turn off RT in games. In the same way that you can turn up/down/off AA, Shadows, AO, AF and the dozens of other graphical effects that have existed for decades and been incorporated into game engines far longer than RT has.

This argument that RT will be mandatory within the next 2 years on PC is completely bizarre and totally detached from reality. Even in 10 years' time, when all engines are built for RT from the ground up, there will absolutely always be a fallback raster option and the ability to turn it off.
Nope, sorry. We will simply use DXR in software to still cater to that 1060 userbase.

You really want us to waste so much money and effort on manually baking lights and using all sorts of tricks to give you a nice visual presentation? You know that process costs a lot more than the few RDNA and Maxwell users could possibly bring in...? And remember, with new console generations, PC specs rise...

In 2-3 years, many PC games will require DXR/VKRT to work properly. Raytracing finally allows developers to breathe and put money and time where it counts.
 
Nope, sorry. We will simply use DXR in software to still cater to that 1060 userbase.

You really want us to waste so much money and effort on manually baking lights and using all sorts of tricks to give you a nice visual presentation? You know that process costs a lot more than the few RDNA and Maxwell users could possibly bring in...? And remember, with new console generations, PC specs rise...

In 2-3 years, many PC games will require DXR to work properly. Raytracing finally allows developers to breathe and put money and time where it counts.

Who is this "we" and "us" you talk about? Are you a game developer at an AAA studio or something? I think not given your demonstrated lack of knowledge in the PC gaming space and your detached from reality conclusions.

You can believe whatever you want, no matter how delusional, but don't expect the rest of us to turn off our brains and buy into your fantasy. I think you really need to ask Jensen for a raise, or at the very least a new limited edition leather jacket or something.

I don't see this going anywhere productive and I don't fancy wasting my time bashing my head against a brick wall so I think I'm out.
 

Papacheeks

Banned
Nope, sorry. We will simply use DXR in software to still cater to that 1060 userbase.

You really want us to waste so much money and effort on manually baking lights and using all sorts of tricks to give you a nice visual presentation? You know that process costs a lot more than the few RDNA and Maxwell users could possibly bring in...? And remember, with new console generations, PC specs rise...

In 2-3 years, many PC games will require DXR/VKRT to work properly. Raytracing finally allows developers to breathe and put money and time where it counts.



And this:



Software-based ray tracing through custom engines like Crytek's.

It can be done, so just stfu; you really have no clue, and everyone in here has tried their hardest to be nice in correcting your nonsensical outlook on GPUs.

Feel bad for Ryujin
 

mr.dilya

Banned
I think this would be a better thread if stock weren't so abysmal right now. As it stands, selling your 5700 XT because OP told you to is cutting off your nose to spite your face. And depending on when stock (and, just as important, price) gets sorted out, it might not be the worst stopgap out there.

Exactly. It's one thing to recommend the 3060ti over the 5700xt. All things being equal, if the 3060ti were actually available and someone was looking to upgrade, no question you get the 3060ti.

But this dude is a weirdo telling people to sell their GPUs just because he doesn't like them. Lol what.
 

Dampf

Member


And this:



Software-based ray tracing through custom engines like Crytek's.

It can be done, so just stfu; you really have no clue, and everyone in here has tried their hardest to be nice in correcting your nonsensical outlook on GPUs.

Feel bad for Ryujin

I never said it wasn't possible. I say all the time that it would be possible for AMD to enable DXR on the RDNA cards. It probably would do a decent job at software Raytracing, given its compute capabilities.

Or of course you can build custom solutions like Crytek did, but that entails huge development time and effort that is better spent elsewhere. So either AMD enables DXR on the 5700XT or it will have even more trouble in the future...
 

smbu2000

Member
The 5700XT is still a good card. If you can get one at a good price, then it should play most games fairly well at 1080P/1440P.

I sold off my 2080Ti and I'm using my 5700XT for now.
 

Emedan

Member
The irony is, everyone believes Intel and Nvidia are the bad guys, a belief driven by nothing but their product pricing. But then again, why would anyone price their product the same as a competitor that is half as capable? What no one seems to notice is that the closer AMD's CPUs/GPUs get to Intel/NV in performance, the closer the prices get too. They're basically in the same league now on both fronts, so there are no excuses anymore to underdeliver in any aspect.

I wouldn't call Nvidia a bad guy; they've been innovating the past decade. Intel, on the other hand, has been so complacent the last 10 years that it has held back a whole industry. Thank God AMD kicked them in the nuts with Zen. Price parity has been Intel lowering theirs due to actual competition from AMD; they can't sell a $1000 CPU that's outperformed by a CPU half that price. Fucking bullshit decade-old garbage Intel architecture.
 

Knightime_X

Member
I had an RX 5700 XT Challenger by ASRock.
Straight trash card.

All it did was crash with a black screen.
I lost count of how many times I screwed around with the drivers.

I did get it to work by replacing it with an RTX 2060 Super.
Haven't had a problem since!
 