
Whatever you do, don't buy an RX 5000 Series GPU. And if you have one, sell it.

Dampf

Member
Hello,

I still come across RX 5000 recommendations on the internet, specifically for the 5700XT, because it has a good price-to-performance ratio. That might be correct, but only in the present, and certainly not in the future.

Why not? Because this card's architecture, RDNA 1, is horribly outdated. Let me explain.

This card will simply age like milk because it lacks the feature set necessary for next-generation gaming. It has no DX12 Ultimate support, no DirectStorage, no INT8/INT4 ML operations, and of course no hardware-accelerated raytracing. In short, it lacks every feature the next-gen consoles have, including the lowest common denominator, the Series S, and thus it will fall significantly behind even that one once true next-gen games hit.

The DX12U features are crucial for next-generation graphics and performance, so let me explain why:

With Sampler Feedback Streaming, modern GPUs from the Ampere, Turing and RDNA 2 architectures, as well as the new Xbox consoles, have around 2.5-3.5x the effective VRAM of cards without SFS (basically, the 5700XT). (Inform yourself about what this technology does here: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html ) Simply speaking, it allows much finer control of texture MIP levels, meaning your VRAM can be used far more efficiently than before. That will result in much higher resolution textures for DX12 Ultimate compatible cards without the need to increase physical VRAM, and it eliminates stuttering and pop-in. Basically, even an entry-level RTX card like a 2060 Super has around 20 GB or more of effective VRAM compared to a 5700XT.
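To put a rough number on that "effective VRAM" claim, here is a toy Python model. The texture size, visible-tile fraction and finest-mip choice are my own illustrative assumptions, not figures from the spec; the point is only that residency drops by a large factor when you stream just the mips, and tiles of the finest mip, that sampler feedback reports as actually used.

```python
def full_mip_chain_bytes(size: int, bytes_per_texel: int = 4) -> int:
    """Memory for a square texture with its whole mip chain resident."""
    total = 0
    while size >= 1:
        total += size * size * bytes_per_texel
        size //= 2
    return total

def streamed_bytes(size: int, finest_needed_mip: int,
                   visible_fraction: float, bytes_per_texel: int = 4) -> int:
    """Memory when only the needed mips are resident: the finest needed mip
    is loaded tile-by-tile (only the visible fraction), coarser mips fully."""
    mip_size = size >> finest_needed_mip
    total = int(mip_size * mip_size * bytes_per_texel * visible_fraction)
    mip_size //= 2
    while mip_size >= 1:
        total += mip_size * mip_size * bytes_per_texel
        mip_size //= 2
    return total

full = full_mip_chain_bytes(4096)    # ~85 MiB for a 4K RGBA8 texture
lean = streamed_bytes(4096, 1, 0.3)  # mip 1 suffices, 30% of its tiles visible
print(f"{full / lean:.1f}x less memory resident")  # roughly 8x in this toy case
```

The exact multiplier obviously depends on the scene, but this is the mechanism behind the "2.5-3.5x effective VRAM" figure.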

Next, we have mesh shading. That is a key technology, as it replaces current vertex shaders and allows for much, much finer LOD and higher geometry density, similar to what you saw in the PS5 Unreal Engine 5 demo. On PS5, that demo uses Sony's customized next-generation Geometry Engine, which is far beyond the standard GE from RDNA 1. On PC, Nanite will most likely use mesh shaders, which the 5700XT does not support, meaning it is either incapable of handling that much geometry or will have huge performance issues when trying to emulate it in software.
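As a sketch of the idea (a toy model of my own, not Epic's or Microsoft's actual algorithm): a mesh-shading pipeline can pick an LOD per small cluster of triangles, choosing the coarsest level whose geometric error still projects to under about a pixel on screen. Here each coarser LOD is assumed to roughly double the error.

```python
import math

def select_cluster_lod(distance_m: float, base_error_m: float,
                       screen_h_px: int = 1080, fov_y_deg: float = 60.0,
                       max_lod: int = 8, budget_px: float = 1.0) -> int:
    """Coarsest LOD whose projected geometric error stays under budget_px."""
    # Pixels covered by one metre of world-space size at this distance.
    px_per_m = screen_h_px / (2.0 * distance_m *
                              math.tan(math.radians(fov_y_deg) / 2.0))
    lod, err = 0, base_error_m
    # Keep coarsening while the NEXT level's error would still be sub-pixel.
    while lod < max_lod and (err * 2.0) * px_per_m <= budget_px:
        err *= 2.0
        lod += 1
    return lod

# Close-up clusters keep full detail, distant ones drop many levels:
print(select_cluster_lod(1.0, 0.001))    # 0
print(select_cluster_lod(100.0, 0.001))  # 6
```

The point is that the LOD decision happens per cluster on the GPU, which is exactly what the fixed vertex-shader pipeline on RDNA 1 can't do natively.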

VRS can give you a 10-30% or even higher performance boost at almost no image-quality cost.
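That range is easy to sanity-check with arithmetic. A small Python sketch, with a tile mix I made up for illustration: at a 2x2 rate one pixel-shader invocation covers four pixels, so the saving depends on how much of the screen tolerates coarse shading (and it is a saving in pixel-shader invocations, not directly in frame time).

```python
def vrs_invocation_savings(tile_mix: dict) -> float:
    """tile_mix maps shading rate n (1 = full rate, 2 = 2x2 coarse, 4 = 4x4)
    to the fraction of screen tiles using that rate; fractions sum to 1."""
    shaded = sum(frac / (rate * rate) for rate, frac in tile_mix.items())
    return 1.0 - shaded

# Half the screen at full rate, 40% at 2x2, 10% at 4x4:
saving = vrs_invocation_savings({1: 0.5, 2: 0.4, 4: 0.1})
print(f"{saving:.1%} fewer pixel-shader invocations")  # 39.4%
```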

Next, there is raytracing. Raytracing is not a gimmick. It saves devs a lot of time, it gets more efficient every day, and it is going to be an integral part of next-gen games. Even the consoles support it in decent fashion; it is the future of rendering, don't let anyone tell you otherwise. Nvidia recently released their RTXGI SDK, which allows for dynamic GI using light probes updated via raytracing, and it doesn't destroy performance, it is extremely efficient. This means developers don't have to pre-bake lighting anymore, saving a lot of time and cost when developing games. RTXGI will work on any DXR-capable GPU, including consoles and RDNA 2. However, the 5700XT is not even capable of emulating it in software (well, it could be if AMD enabled DXR support), meaning a game using RTXGI as its GI solution won't even boot on a 5700XT. Given that the RT-capable userbase is growing every day with Turing, Ampere, Pascal, RDNA 2 and the consoles, this is likely a non-issue for devs. If AMD suddenly decided to implement DXR support for the 5700XT you could still play the game, but with much worse performance and visual quality than the DX12U-capable GPUs, due to the lack of hardware acceleration for raytracing.
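To illustrate why the probe approach behind RTXGI stays cheap (a toy scalar version of my own; the real SDK stores directional irradiance per probe plus visibility data to avoid light leaks): rays only update a sparse grid of probes, and each shaded point then just interpolates the 8 probes around it, so the per-pixel cost is tiny.

```python
import math

def probe_irradiance(pos, grid_origin, spacing, probes):
    """Trilinear interpolation over a regular probe grid.
    probes: dict mapping integer grid coords (i, j, k) -> irradiance value."""
    f = [(p - o) / spacing for p, o in zip(pos, grid_origin)]
    base = [math.floor(c) for c in f]
    t = [c - b for c, b in zip(f, base)]
    total = 0.0
    # Blend the 8 corner probes of the cell containing pos.
    for dx in (0, 1):
        for dy in (0, 1):
            for dz in (0, 1):
                w = (t[0] if dx else 1 - t[0]) * \
                    (t[1] if dy else 1 - t[1]) * \
                    (t[2] if dz else 1 - t[2])
                total += w * probes[(base[0] + dx, base[1] + dy, base[2] + dz)]
    return total

# Probes brighten along x; a point halfway across the cell sees the average:
grid = {(i, j, k): float(i) for i in (0, 1) for j in (0, 1) for k in (0, 1)}
print(probe_irradiance((0.5, 0.2, 0.7), (0, 0, 0), 1.0, grid))  # ~0.5
```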

Remember, everything I talked about also applies to AMD's new RDNA2 architecture. It fully supports these features as well. Basically, RDNA1 to RDNA2 is a much, much bigger jump than you might realize.

The 5700XT might be fine for the cross-generation period if you can live with reduced visual quality, but once next-gen games using more DX12U features and DirectStorage hit, the RX 5000 series will struggle heavily, while even Turing cards, the competitor to RDNA 1, will still push next-gen console settings at higher framerates than the consoles thanks to DLSS, since they support the next-gen feature set.

In short, do not buy a 5700XT or any card from the RDNA 1 generation. And while we're at it, don't buy used Pascal cards either if you plan to play next-gen games.
 

Dampf

Member
Just buy it if it's super cheap and you're on a budget. The terrible performance the OP is describing is grossly exaggerated; as long as you're gaming at 1080p/1440p you'll be fine.

I don't get the hate honestly. I own a 2080 Ti btw, cus I know I sound like a defensive 5700XT owner right now lol
Please read my OP.

I am not talking about cross generation games, but about next gen games built around the DX12U featureset.

Current generation titles are fine, and cross-gen will be if you can sacrifice visual quality and performance. But next-gen games will run horribly on that card, or might not run at all (if AMD still doesn't bother to enable DXR).
 

ZywyPL

Banned
The card wasn't worth it on its release date to begin with. At this point, due to upcoming DX12U features, I'm more interested to see how the Turing cards will age, because technically they already have all the features, just sitting there waiting to be used, and who knows, maybe it'll turn out that despite their ridiculous prices they weren't that bad of an investment after all.

But then again, when will DX12U kick in? My guess is not within the next 2-3 years, because A) PS4/XB1 will still be supported until then, and B) PCs aren't any better in regard to adopting new tech, and by that time we will have RDNA 3 and Hopper GPUs.

So as always when it comes to PC, buy for the now and enjoy it. If you can get a dirt-cheap 5700 today and you game on a 1080p 60Hz display, it'll do a great job for the next couple of years.
 

GHG

Gold Member
Well, Hardware Unboxed still seems proud that they recommended those outdated cards to their viewers. As a tech channel, that is just irresponsible.

Honestly, I'm starting to think they are just AMD shills. Either that or they have an irrational hate hard-on for Nvidia.

When reviewing the 6800 series cards they completely skipped over ray tracing saying "it's not worth it at the moment".

That's not how you objectively review a piece of hardware.
 

MadYarpen

Member
Well, I was thinking about it considering the lack of RDNA 2 supply, but the prices went crazy.

In some cases they are around 550 EUR in my country (with taxes).
 

Dampf

Member
Old cards getting old for future games?
I would never have thought that.

If you give me 500 bucks I'll buy another one : )
That is not a valid argument, as the Turing RTX series from 2018 has the next-gen feature set, while RDNA 1 from 2019 does not. Turing has ML acceleration, DirectStorage support, hardware raytracing and the DX12 Ultimate feature set, and thus will age quite nicely, as opposed to RDNA 1.

Honestly, I'm starting to think they are just AMD shills. Either that or they have an irrational hate hard-on for Nvidia.

When reviewing the 6800 series cards they completely skipped over ray tracing saying "it's not worth it at the moment".

That's not how you objectively review a piece of hardware.

Sometimes it seems like they are shills, but I don't think they really are; most of the time they are objective. However, Steve clearly has a negative opinion about raytracing, and his mistake is that he lets his personal preference affect Hardware Unboxed's reviews. That is clearly a huge mistake because, as you said, it doesn't lead to a good, objective review of the hardware.

And I personally disagree with him that RT is not worth it. RT in Watch Dogs, Control and Metro is absolutely worth it, it really looks so much better.
 
Last edited:

Papacheeks

Banned
Please read my OP.

I am not talking about cross generation games, but about next gen games built around the DX12U featureset.

Current generation titles are fine, and cross-gen will be if you can sacrifice visual quality and performance. But next-gen games will run horribly on that card, or might not run at all (if AMD still doesn't bother to enable DXR).

But they have shown games like AC Valhalla being played on it without issue? Games scale, you know that? It's not like anyone with a 20 series card is fucked for playing games. You just won't be able to do crazy ray tracing or have the game on ultra.

This thread makes no sense.
 

mr.dilya

Banned
Honestly, I'm starting to think they are just AMD shills. Either that or they have an irrational hate hard-on for Nvidia.

When reviewing the 6800 series cards they completely skipped over ray tracing saying "it's not worth it at the moment".

That's not how you objectively review a piece of hardware.

He says the same thing you just said: that he now recommends getting a 3060 Ti over a 5700XT if you are coming from an older card. So how is he shilling for AMD?
 

Dampf

Member
But they have shown games like AC Valhalla being played on it without issue? Games scale, you know that? It's not like anyone with a 20 series card is fucked for playing games. You just won't be able to do crazy ray tracing or have the game on ultra.

This thread makes no sense.
AC Valhalla is not a next gen game, it is a current gen game running at 60 FPS on the next gen consoles.

It doesn't even use ANY DX12 Ultimate feature.

And I already stated the RTX 20 series is fine, due to its compatible feature set. I never said anything about cranking settings to ultra in next-gen-only games; that is not what this thread is about.
 
Last edited:

bad guy

as bad as Danny Zuko in gym knickers
Age like milk? I love me some good cheese together with some old games.
 
Last edited:

Silver Wattle

Gold Member
That is not a valid argument, as the Turing RTX series from 2018 has the next-gen feature set, while RDNA 1 from 2019 does not. Turing has ML acceleration, DirectStorage support, hardware raytracing and the DX12 Ultimate feature set, and thus will age quite nicely, as opposed to RDNA 1.



Sometimes it seems like they are shills, but I don't think they really are; most of the time they are objective. However, Steve clearly has a negative opinion about raytracing, and his mistake is that he lets his personal preference affect Hardware Unboxed's reviews. That is clearly a huge mistake because, as you said, it doesn't lead to a good, objective review of the hardware.

And I personally disagree with him that RT is not worth it. RT in Watch Dogs, Control and Metro is absolutely worth it, it really looks so much better.
He doesn't hate RT, FFS.
He, like many who can see past the hype, is saying RT is in its infancy and needs more time to get better.
 

Papacheeks

Banned
AC Valhalla is not a next gen game, it is a current gen game running at 60 FPS on the next gen consoles.

It doesn't even use ANY DX12 Ultimate feature.

What you talking bout willace?


Ubisoft has just revealed the official PC system requirements for Assassin’s Creed Valhalla. According to the specs, Assassin’s Creed Valhalla will be the first AC game that will support DirectX 12. As such, we hope that the game will perform better than all the previous Assassin’s Creed games.

Additionally, the PC version will feature a built-in benchmark tool, as well as in-depth customization options. Naturally, PC gamers can expect support for ultra-wide monitors, as well as uncapped framerates.

For gaming at 1080p/30fps with the High preset, Ubisoft suggests using an AMD Ryzen 5 1500 or Intel i7-4790 with 8GB of RAM and an AMD RX 570 or NVIDIA GeForce GTX 1060. For gaming at 1080p/60fps with the High preset, PC gamers will need an AMD Vega 64 or GeForce GTX 1080. For gaming at 1440p/60fps with the Very High preset, PC gamers will need an AMD RX 5700XT or GeForce RTX 2080 SUPER.

Assassin’s Creed Valhalla releases on November 10th. Contrary to Watch Dogs: Legion, this new AC game will not be using any real-time Ray Tracing effects.

All those cards look like older cards? HMM? :goog_unsure:
 

Dampf

Member
What you talking bout willace?




All those cards look like older cards? HMM? :goog_unsure:
DirectX 12 ULTIMATE.

Dude... I am talking about the additions to DX12: mesh shaders, Sampler Feedback, VRS and DXR 1.1. It is an evolution of DX12 and is only supported on Ampere, Turing, RDNA 2 and the new Xbox consoles. https://devblogs.microsoft.com/directx/announcing-directx-12-ultimate/
 

Kholinar

Banned
Ah, yes. Because the majority of gamers have Ampere and RDNA 2, or even Turing! Quit it. As long as the mass market continues to have the likes of 1060s and RX 580s, they'll continue to be supported. Don't buy into this fearmongering; there's still a good 2 years left for these cards.
 
What an odd thing to get worked up about enough to make a thread.

Personally, I wouldn't recommend buying older GPUs when a new generation has launched for both Nvidia and AMD, unless you can get them pretty cheap. Having said that, for 1080p/1440p the 5700XT will run perfectly fine for the foreseeable future. Once stock of the new GPUs normalizes, expect prices for older models to drop significantly.

Is this thread simply a reaction to the HW Unboxed 3060 Ti review? If so, you could at least make that clear and include the review in your OP. It would help give the thread a bit more context.
 

Dampf

Member
Ah, yes. Because the majority of gamers have Ampere and RDNA 2, or even Turing! Quit it. As long as the mass market continues to have the likes of 1060s and RX 580s, they'll continue to be supported. Don't buy into this fearmongering; there's still a good 2 years left for these cards.
A good 2 years of sacrificing visual quality and performance, yes. That is cross gen.

After that, it will be a bloodbath for RDNA1 because of the lacking featureset.
 

Papacheeks

Banned
DirectX 12 ULTIMATE.

Dude... I am talking about the additions to DX12: mesh shaders, Sampler Feedback, VRS and DXR 1.1. It is an evolution of DX12 and is only supported on Ampere, Turing, RDNA 2 and the new Xbox consoles. https://devblogs.microsoft.com/directx/announcing-directx-12-ultimate/

Ok? So you don't play with those features? Look, I get what you're saying. But with PC gaming the whole point is scalability. I know someone who's going to play Cyberpunk without all the ray tracing. And they are adding DXR support for it, so you can play it with or without. It depends on how much those bells and whistles mean to you.

Just because Ultimate and DXR 1.1 only support those GPUs for those specific features of the API doesn't mean you can't scale and play said game on older hardware. You're just not going to be able to use those features and benefits. But the game will still run and look really good.

Anything you bought last year, even if it's RDNA 1, will still run all of those games using VRS and DXR 1.1; it's just that you won't use those features.

So you joined in June? Now this idiotic thread makes more sense. I guess you have not been around PC gaming that much? This is no different than the transition from DX9 to DX10. Everyone was saying you needed an 8800 Ultra just to play Crysis. But then I was playing it in DX9 with an SLI config of 7900 GTXs. And it ran better than the 8800 Ultra, and the difference was minimal in my experience with the game.

I then replayed it once I had a GTX 285. It looked nice, and the lighting and tessellation were better. But it wasn't some giant leap that made my experience on DX9 complete shit.

I feel you have not been PC gaming for a while. This new transition is nothing new to us PC gamers. It's just that it's advancing more quickly because the consoles are now on an equal footing in terms of API usage.
 
Last edited:
Already sold my 5700XT not long ago for £300 and put that towards an MSI GeForce RTX 2080 Ti Gaming X Trio a local chap was selling for £600. Nice upgrade for £300, thanks Ampere hype :messenger_ok:

The 5700XT is still a nice GPU at the right price, though, given it trades blows with a 2070S in rasterisation performance. Funnily enough, there are people out there who just want as much performance as possible for as little as they can get it; the latest & greatest features usually don't factor in.
 

Kholinar

Banned
A good 2 years of sacrificing visual quality and performance, yes. That is cross gen.

After that, it will be a bloodbath for RDNA1 because of the lacking featureset.

And Pascal too. Why are you only singling out RDNA 1? The thread title should be updated to feature both architectures.

I also question your notion that these cards will suddenly become obsolete once the new feature set comes out. You do realise there's something called 'low settings'? A setting the majority of gamers play on anyway?
 

mr.dilya

Banned
And Pascal too. Why are you only singling out RDNA 1? The thread title should be updated to feature both architectures.

I also question your notion that these cards will suddenly become obsolete once the new feature set comes out. You do realise there's something called 'low settings'? A setting the majority of gamers play on anyway?

Because, ironically, he's much more of a shill for Nvidia than the Hardware Unboxed guy is for AMD.
 

Papacheeks

Banned
This thread should be closed; the OP makes no sense. Sounds like he/she read a bunch of articles from Digital Foundry and thought the end times for old hardware were coming.

I run a Radeon VII and it eats a lot of current games, barring ray tracing.
 

Dampf

Member
Ok? So you don't play with those features? Look, I get what you're saying. But with PC gaming the whole point is scalability. I know someone who's going to play Cyberpunk without all the ray tracing. And they are adding DXR support for it, so you can play it with or without. It depends on how much those bells and whistles mean to you.

Just because Ultimate and DXR 1.1 only support those GPUs for those specific features of the API doesn't mean you can't scale and play said game on older hardware. You're just not going to be able to use those features and benefits. But the game will still run and look really good.

Anything you bought last year, even if it's RDNA 1, will still run all of those games using VRS and DXR 1.1; it's just that you won't use those features.
The question is, for how long will you be able to play without these features? DXR 1.1 and VRS are easy to integrate into cross-gen games, which is why that is the only way DX12U is used right now. Sampler Feedback and mesh shaders fundamentally change how the engine works, so once these features are used, it's going to be a very hard time for RDNA 1. As I said, once games are built with that feature set in mind, any card that does not support it will have a hard time. It's kind of like Kepler doing DX12 but, like, 10 times worse. When cross-gen ends, there will already be a huge userbase of next-gen consoles, RDNA 2, Ampere, Hopper, RDNA 3 and Turing. And with every new console generation, system requirements rise. I could see only DX12U cards being supported once true next-gen titles hit.

As for raytracing, it won't take long before it replaces certain raster effects like shadows and GI, because it's so much easier for devs. DXR in software mode can be used on non-HW-RT GPUs, leaving no one behind.
 
Last edited:

Papacheeks

Banned
The question is, for how long will you be able to play without these features? DXR 1.1 and VRS are easy to integrate into cross-gen games, which is why that is the only way DX12U is used right now. Sampler Feedback and mesh shaders fundamentally change how the engine works, so once these features are used, it's going to be a very hard time for RDNA 1. As I said, once games are built with that feature set in mind, any card that does not support it will have a hard time. It's kind of like Kepler doing DX12 but, like, 10 times worse. When cross-gen ends, there will already be a huge userbase of next-gen consoles, RDNA 2, Ampere, Hopper, RDNA 3 and Turing. And with every new console generation, system requirements rise. I could see only DX12U cards being supported once true next-gen titles hit.

Games and engines scale. I think you have no fucking idea wtf you're talking about. Go look at the Unreal 5 demo, which is using the new stuff. It was running on a PS5 and will scale to a mobile device. So you do the fucking math.

Seriously, it's like you only recently discovered PC gaming, read a bunch of Nvidia PR and made a thread.

You do know that a 5700XT was running the new Crysis remaster with forced ray tracing through software? It didn't run super well, but it could be done. You're making this out like you won't be able to run any of these games, and that's not the case at all.

DXR adoption is still slow because it's a black box at the moment. I really think you need to take a step back for a moment and watch some videos or read some more on the subject you're talking about.
 
Last edited:

Ascend

Member
The 5700XT is aging better than the 2070S. Just saying.

The only reason to not go for the 5700XT is because the RTX 3060Ti is better value. If you can find one at or near MSRP....
 

Dampf

Member
Games and engines scale. I think you have no fucking idea wtf you're talking about. Go look at the Unreal 5 demo, which is using the new stuff. It was running on a PS5 and will scale to a mobile device. So you do the fucking math.

Seriously, it's like you only recently discovered PC gaming, read a bunch of Nvidia PR and made a thread.
You clearly have no clue about the concept of the lowest common denominator in game development. The Series S will be that one in the next generation, and it has the important feature set that is crucial for next-gen graphics.

Do the math.

By the way, the PS5 has the same feature set as DX12U, it just goes by different names. Its next-gen Geometry Engine boosts geometry way above RDNA 1's Geometry Engine capabilities, similar to mesh shading on PC.

Speaking of mobile phones, even Snapdragon phones will have DX12 Ultimate soon.
  • Microsoft is collaborating with Qualcomm to bring the benefits of DirectX feature level 12_2 to Snapdragon platforms.
 
Last edited:

Mithos

Member
Let's see...

RTX 3060 Ti, price point equivalent to US$590...
No thanks, get back to me when it's equivalent to US$399-449.
 
Last edited:

Ascend

Member
Just buy it if it's super cheap and you're on a budget. The terrible performance the OP is describing is grossly exaggerated; as long as you're gaming at 1080p/1440p you'll be fine.

I don't get the hate honestly. I own a 2080 Ti btw, cus I know I sound like a defensive 5700XT owner right now lol
It's because it's AMD. The 'cool kids' like beating up the underdog because they think it makes them feel superior.
 

Papacheeks

Banned
You clearly have no clue about the concept of the lowest common denominator in game development. The Series S will be that one in the next generation, and it has the important feature set that is crucial for next-gen graphics.

Do the math.

By the way, the PS5 has the same feature set as DX12U, it just goes by different names. Its next-gen Geometry Engine boosts geometry way above RDNA 1's Geometry Engine capabilities, similar to mesh shading on PC.

Speaking of mobile phones, even Snapdragon phones will have DX12 Ultimate soon.
  • Microsoft is collaborating with Qualcomm to bring the benefits of DirectX feature level 12_2 to Snapdragon platforms.

Actually, no. It does not have the same feature set; it's very custom. It has a similar architecture to RDNA 2, but it is very custom. They use their own API; they do not use DX12 or any derivative of it. Which is why a lot of developers and software engineers are talking about Insomniac right now and their implementation of ray tracing in their engine. It's on a different level, and that has nothing to do with DX12U.

You think Microsoft is not going to have DX12U scale? Mobile app/game makers are not going to isolate millions/billions of users for some DX12U benefits. They are going to scale the app/game, and those with the hardware to take advantage of it will.

Like seriously again I ask wtf is this thread?
 

GHG

Gold Member
He says the same thing you just said: that he now recommends getting a 3060 Ti over a 5700XT if you are coming from an older card. So how is he shilling for AMD?

In fairness, I haven't seen their 3060 Ti review and was responding to someone who suggested they were recommending the 5700XT over that card.

What I said was based on how they went about reviewing the 6800 cards, which was not informative in the slightest and full of bias in AMD's favour.
 

mr.dilya

Banned
In fairness, I haven't seen their 3060 Ti review and was responding to someone who suggested they were recommending the 5700XT over that card.

What I said was based on how they went about reviewing the 6800 cards, which was not informative in the slightest and full of bias in AMD's favour.

The OP is an obnoxious idiot. Watch the review. I watched it and came away with the idea that the 3060 Ti is the best value at that price point right now, dethroning the 5700XT.
 

Dampf

Member
Actually, no. It does not have the same feature set; it's very custom. It has a similar architecture to RDNA 2, but it is very custom. They use their own API; they do not use DX12 or any derivative of it. Which is why a lot of developers and software engineers are talking about Insomniac right now and their implementation of ray tracing in their engine. It's on a different level, and that has nothing to do with DX12U.

You think Microsoft is not going to have DX12U scale? Mobile app/game makers are not going to isolate millions/billions of users for some DX12U benefits. They are going to scale the app/game, and those with the hardware to take advantage of it will.

Like seriously again I ask wtf is this thread?
I said similar. Of course it doesn't use DX12; Sony uses their own API to realize their vision.

Yes, for cross-generation, DX12U-capable games will scale. I never said they didn't. I am talking about games 2-3 years from now.
 
Last edited:

mr.dilya

Banned
The 5700XT is aging better than the 2070S. Just saying.

The only reason to not go for the 5700XT is because the RTX 3060Ti is better value. If you can find one at or near MSRP....

Exactly. And then you look at something like the 2060, which is already pretty much obsolete even though it is a ray tracing card. Further evidence that ray tracing is more gimmick than compelling tech right now.
 

Dampf

Member
Exactly. And then you look at something like the 2060, which is already pretty much obsolete even though it is a ray tracing card. Further evidence that ray tracing is more gimmick than compelling tech right now.
The 2060 is a lot more powerful than the Series S, and with DLSS it is far above the Series X. It is fully compatible with the next-gen feature set and will certainly age much better than the 5700XT. Don't spread misinformation or tell me I am an idiot. I just state the facts here: RDNA 1 is incompatible with the next-gen feature set.
 
Last edited:

ZywyPL

Banned
Honestly in starting to think they are just AMD shills. Either that or they have an irrational hate hard on for Nvidia.

When reviewing the 6800 series cards they completely skipped over ray tracing saying "it's not worth it at the moment".

That's not how you objectively review a piece of hardware.

Almost everyone does it nowadays. All those sites and reviewers thrive on catchy AMD vs Intel/Nvidia headlines, while the actual content is absolutely awful. Take one of the recent Zen 3 reviews (it was Hardware Canucks if I remember correctly) with a headline like "AMAZING GAMING PERFORMANCE! Intel, didn't see it coming!", only to post slides from 4 titles for a few seconds throughout an entire 34-minute video. The rest is just desperate convincing of how awesome the CPUs really are, plus tons of useless fake benchmarks that have nothing to do with real gaming performance, Intel bad, and whatnot. So sorry, but after such a long review, with such a brave title and just 4 slides showing actual gaming performance, the only logical step is to hit the unfollow button. And those are Zen 3 reviews, where AMD actually does a great job; most RDNA 2 reviews are straight-up garbage, like they really got paid by AMD to praise the products no matter what and not mention any shortcomings. I applaud DF's review, where they didn't hesitate to show RT performance and even put the RDNA 2 cards against DLSS, where they get completely outdone.

The irony is, everyone lives in the belief that Intel and Nvidia are the bad guys, which is driven by nothing but their product pricing. But then again, why would anyone price their product the same as competition that is half as capable? No one seems to notice that the closer AMD's CPUs/GPUs get to Intel/NV in performance, the closer the prices get too. They're basically in the same league now on both fronts, so there are no excuses anymore to underdeliver in any aspect.
 

mr.dilya

Banned
The 2060 is a lot more powerful than the Series S, and with DLSS it is above the Series X. Don't spread misinformation or tell me I am an idiot. I just state the facts here: RDNA 1 is incompatible with the next-gen feature set.

The 5700XT outperforms the 2060 and the 2060 Super thoroughly and decisively. Are you saying that right now it makes more sense to buy a 2060 over a 5700XT? Really?
 

Dampf

Member
The 5700XT outperforms the 2060 and the 2060 Super thoroughly and decisively. Are you saying that right now it makes more sense to buy a 2060 over a 5700XT? Really?
God, why does no one understand.

That is the perspective now. This is the present; this is just the beginning of cross-generation.

Yes, I am absolutely saying that. For the near and far future, the 2060 will run next-gen games with much better visual quality and performance than the Series S. The 5700XT won't, because it lacks feature-set parity. What is so hard to understand? You can't compare current/cross-gen games to games built around the next-gen consoles and modern GPUs. The 5700XT is outdated, similar to Pascal, because it lacks the hardware features to compete.
 
Last edited:

mr.dilya

Banned
God, why does no one understand.

That is the perspective now. This is the present; this is cross-generation.

Yes, I am absolutely saying that. For the near and far future, the 2060 will run next-gen games with much better visual quality and performance than the Series S. The 5700XT won't, because it lacks feature-set parity. What is so hard to understand? You can't compare current/cross-gen games to games built around the next-gen consoles and modern GPUs. The 5700XT is outdated, similar to Pascal, because it lacks the hardware features to compete.

It's not even doing it now, but you want people to believe that it will in the future.

In 2 to 3 years people will be looking to upgrade their GPUs anyway.

Bruh, you just sound like a snake oil salesman.
 

spyshagg

Should not be allowed to breed
Buy a used 5700XT for $300; that is the correct recommendation. The product is good.

If you want new, wait to see what AMD has in store in the $400 range.
 

Dampf

Member
It's not even doing it now

"Now"... I'm speaking to a wall. Mesh shaders and Sampler feedback are not even used. Neither is DirectStorage on PC. And with games Using Raytracing and DLSS 2.0 the 2060 is already far ahead of the 5700xt in terms of visual quality and performance.
 
Last edited:

Hydroxy

Member
Didn't read your post but I agree. The 5700XT is done. Either buy a 3060 Ti or wait for the 3060 non-Ti/Radeon 6700.
 