
Navi21 XT/6800XT(?) allegedly scores more than 10,000 in Fire Strike Ultra

notseqi

Member
Do not care considering it took them 6 months to fix their last set of drivers. As someone who grew up during '90s-era PC gaming, the thought of spending hours troubleshooting driver issues and glitches has me terrified. I left that shit in the '90s where it belongs.
Care to elaborate on your setup and what was going wrong? I hear complaints about AMD cards but it has been absolute plug&play for me.
 

jigglet

Banned
Care to elaborate on your setup and what was going wrong? I hear complaints about AMD cards but it has been absolute plug&play for me.

I don't have an AMD card, I've just been going off videos like this:



They're one of the most reputable hardware channels around and huge AMD fans, so I think they're pretty impartial. I think it was in this video where he talked about how a lot of retailers had confided in them that they preferred to push Nvidia GPUs because so many AMD returns were making it unprofitable for them.

The card came out in June / July I think; the driver update that fixed most (not all) of people's issues came out in Jan (which this video is all about). That's 6 months.
 
I don't have an AMD card, I've just been going off videos like this:



They're one of the most reputable hardware channels around and huge AMD fans, so I think they're pretty impartial. I think it was in this video where he talked about how a lot of retailers had confided in them that they preferred to push Nvidia GPUs because so many AMD returns were making it unprofitable for them.

The card came out in June / July I think; the driver update that fixed most (not all) of people's issues came out in Jan (which this video is all about). That's 6 months.


The launch drivers weren't broken. They broke the drivers with the Adrenalin update in December. It was fixed after 1.5 months.

The drivers of my RTX 3080 are still giving me bad times.

Those are problems I expect from new architectures.

GCN drivers were rock solid over the years, with the exception of some launch drivers, which were always fixed within 1 to 2 weeks.
 

Ascend

Member
Possible raytracing benchmarks:


Looks like there is a chance the 6800 XT will be competitive with the 3070 in raytracing.
Seems like the rumors of RT being better than Turing but worse than Ampere are correct.

In my book, that's a good enough result.
 

FireFly

Member
Seems like the rumors of RT being better than Turing but worse than Ampere are correct.

In my book, that's a good enough result.
From a performance per transistor/CU perspective, Big Navi may be inferior to Turing. It seems they are investing more budget into scaling rasterisation performance.
 

Papacheeks

Banned
People putting all their eggs into DLSS need a reality check real soon. Next year and going into 2022 you're going to see more updated engines or re-written engines that use RT in better ways, on top of having it not be as taxing with how they code it into their engine.

When we start seeing that stuff, I think you're going to see RDNA 2-3 age much better than Nvidia.
 

Ascend

Member
From a performance per transistor/CU perspective, Big Navi may be inferior to Turing. It seems they are investing more budget into scaling rasterisation performance.
It indeed seems worse than Turing per CU. But I don't think it will really matter. People will see that in practice it can surpass an RTX 2000 series card, and they can still consider upgrading to them.
More importantly... It might be one of the performance metrics that will push AMD to keep the prices just a bit lower than the 3000 series, which is good for us.
 
500 notes and we've got a winner. Give us a return to the glorious days of cards like the 9800 Pro.

Not a hope that their 3080 competitor comes in at that price, sorry.

However, there is supposed to be a further cut-down Navi21 with 64 CUs, so a 6800 (non-XT) that should be a little weaker than the 3080 but way more powerful than the 3070.

That one miiiight come in at maybe 550, but that also might be too low. Hard to say, but keep your expectations realistic.
 

ZywyPL

Banned
People putting all their eggs into DLSS need a reality check real soon. Next year and going into 2022 you're going to see more updated engines or re-written engines that use RT in better ways, on top of having it not be as taxing with how they code it into their engine.

But if that happens, then DLSS will boost the performance even further anyway, no?
 

regawdless

Banned
*rushes home to run fire strike ultra to validate his 3080 purchase in fear*

Hehe, just kidding. Great to see very fast cards from AMD as well. Let's push forward as much as possible; the more progress across the board the better.
 

NoviDon

Member
If those figures are correct, and they're doing this without dedicated raytracing cores... what will 3rd-gen Navi be like 🥵 Nvidia better stop playing and step it up or they'll get curbstomped like Intel.
 
People putting all their eggs into DLSS need a reality check real soon. Next year and going into 2022 you're going to see more updated engines or re-written engines that use RT in better ways, on top of having it not be as taxing with how they code it into their engine.

When we start seeing that stuff, I think you're going to see RDNA 2-3 age much better than Nvidia.


The inferior-performing card with fewer features will age much better? How so? How are people needing a reality check with DLSS when the heaviest games in the last 2 years have it and it boosts performance? 2 of the biggest launches of the year, Watch Dogs and Cyberpunk, will both have it.
 

llien

Member
But what does that mean tho?
To me it means that when a company does something you strongly condemn, you buy less from it.

against RT and AI
That is your reading comprehension problem at best, straw-manning at worst.
I'm telling you there is only a handful of games that use hardware RT, most of them for basic gimmicks, and this isn't going to change any time soon, because the tech neither has market penetration nor delivers on its core promise: ease of development.
Your understanding of what "AI" means likely differs from mine.

show them they need to do better, not support mediocrity.
I find this to be a weird approach, but to each their own.

I already agreed with you about the performance preview.
Then we agree on the facts, but not on how to interpret them.
I see it as paid ninja-shilling and covering their asses later on, when someone calls BS.

But if that happens, then DLSS will boost the performance even further anyway, no?
Yeah, lowering resolution will boost performance indeed.
 

ZywyPL

Banned
At the cost of image quality.

What looks better (and I am not sure I have seen it compared): DLSS 4K or native 4K with a few subtle settings turned down?

But DLSS gives the same or even better image quality, which is why there is so much hype around it recently. The 1.0 version boosted the performance at the cost of image clarity, and it was indeed a questionable trade-off, but 2.0 makes it a no-brainer feature; in best-case scenarios you are getting twice the framerate AND a more detailed image, so what's not to like? And it only keeps getting better over time.
 

Papacheeks

Banned
I only care if they have a viable alternative to DLSS, and no, some ReShade-caliber sharpening filter doesn't count.

Wow.

DLSS is not the be-all and end-all. And it's a by-product of software not being where it needs to be in terms of engines being built around RT.

These DLSS cheerleaders will not age well with their comments in the next year or so.

The inferior-performing card with fewer features will age much better? How so? How are people needing a reality check with DLSS when the heaviest games in the last 2 years have it and it boosts performance? 2 of the biggest launches of the year, Watch Dogs and Cyberpunk, will both have it.

Those titles are based on older engines that had RT added. Cyberpunk, by the time it comes out, is engine-wise super old. You think RT was built in back in 2013 when they were concepting and building their engine for The Witcher 3?
There's a reason the next-gen patch comes next year. And also, if you actually used your head and paid attention, how RT is currently being shown is in puddle reflections and reflections off of surfaces like windows/cars.
Implementation going into next year and beyond will change as we get more into the new gen.

But keep believing DLSS is the future. Its creation is to compensate for how taxing RT implementation currently is.
Here's my question to you and others, if you can answer it honestly:

What's more plausible? DLSS being the future of how we render, or game engines advancing their infrastructure in how they code/handle advanced lighting and RT implementation?
RT in game engines is still super new, as in maybe 2 years old.
Do you think, as DirectX 12 Ultimate gets re-written and new engines like Unreal Engine 5 get completed next year, we will see better implementation than the current brute-forcing?

Because all signs point to yes.
 

llien

Member
The launch drivers weren't broken.
There were no clear signs of that whatsoever.
A bunch of reviewers have been using the 5700 series for months, in different hardware setups, with zero issues.
The most likely reason is the same as with the 3080 black screens and the unusually high RMA rate of the 2080 Ti => hardware.

DLSS gives the same or even better image quality
No, it does not. Stop listening to shills and use your own eyes and, God forbid, your brain.

From a performance per transistor/CU perspective,
There are different ways to pack transistors; lower density leads to higher clocks. It's a tradeoff.
That's why this metric is rather misleading.
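A toy illustration of that point (made-up numbers, not real chip data): if you treat performance as roughly transistors × clock, a design that packs fewer transistors but clocks higher can land in about the same place, so "performance per transistor" on its own doesn't tell you which design is better.

```python
# Toy example: two hypothetical GPU designs with different density/clock tradeoffs.
# Numbers are made up purely to illustrate the argument, not real chip data.
def rough_perf(transistors_bn: float, clock_ghz: float) -> float:
    # Crude proxy: performance scales with transistor count times clock speed.
    return transistors_bn * clock_ghz

dense_low_clock = rough_perf(transistors_bn=18.6, clock_ghz=1.7)     # packed tighter, clocks lower
sparse_high_clock = rough_perf(transistors_bn=17.0, clock_ghz=1.9)   # packed looser, clocks higher

print(dense_low_clock, sparse_high_clock)  # ~31.6 vs ~32.3 -> similar overall performance,
                                           # despite different "perf per transistor"
```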
 

Marlenus

Member
But DLSS gives the same or even better image quality, which is why there is so much hype around it recently. The 1.0 version boosted the performance at the cost of image clarity, and it was indeed a questionable trade-off, but 2.0 makes it a no-brainer feature; in best-case scenarios you are getting twice the framerate AND a more detailed image, so what's not to like? And it only keeps getting better over time.

Not entirely true. DLSS 4K can look great, but not always, and it will never look better than a native image.

This is where I miss HardOCP, as they would probably compare DLSS 4K to native 4K at similar frame rates and show which has better IQ for a desired frame rate.
 

Mister Wolf

Member
Wow.

DLSS is not the be-all and end-all. And it's a by-product of software not being where it needs to be in terms of engines being built around RT.

These DLSS cheerleaders will not age well with their comments in the next year or so.

Not for you. I've played Control/Metro Exodus and seen firsthand how much of a performance hit multiple raytracing techniques running together causes. Anyone who thinks Cyberpunk will be any different is silly.
 

Papacheeks

Banned
Not for you. I've played Control/Metro Exodus and seen firsthand how much of a performance hit multiple raytracing techniques running together causes. Anyone who thinks Cyberpunk will be any different is silly.

I think seeing the big discrepancy in RT ON/OFF basically proves my point. If you have to design a software-based AI system to do a lot of the work because RT takes so much to do, then doesn't that prove my point about engine optimizations and being built around RT instead of it being added on?

New engines coming next year and beyond are going to show what can be done without DLSS.

Crytek's update to their old Crysis build being run on a 5700 XT with software ray tracing shows promise that it can be done. It's only a matter of time, and of how they code it, before it gets more optimized.

DLSS is just an in-between tech until things get figured out.

Which will be soon.
 

Mister Wolf

Member
I think seeing the big discrepancy in RT ON/OFF basically proves my point. If you have to design a software-based AI system to do a lot of the work because RT takes so much to do, then doesn't that prove my point about engine optimizations and being built around RT instead of it being added on?

New engines coming next year and beyond are going to show what can be done without DLSS.

Crytek's update to their old Crysis build being run on a 5700 XT with software ray tracing shows promise that it can be done. It's only a matter of time, and of how they code it, before it gets more optimized.

DLSS is just an in-between tech until things get figured out.

Which will be soon.

I'm not worried about promise, I'm worried about right now. Cyberpunk is almost here and it will feature more raytracing techniques than Control. Now I can play with DLSS with all of it turned on and get image quality better than 1440p (around 1800p) upscaling from 1080p, or I can not on some AMD card and be stuck playing at 1080p. That's what is going down in a couple of weeks.
 

ZywyPL

Banned
Not entirely true. DLSS 4K can look great, but not always, and it will never look better than a native image.

It will; it basically already does, as native 4K doesn't have any AA by definition, so there's shimmering, jaggies etc. all over the place. I think a lot of people got misled by DF vids comparing DLSS with 4K+TAA where they called it "native 4K", which is just plain wrong. And speaking of TAA, just like any post-processing AA it has its cons, namely a slightly blurred image, so at the end of the day, yeah, you are initially rendering at a higher resolution, but then the image gets blurred anyway.
 

Papacheeks

Banned
I'm not worried about promise, I'm worried about right now. Cyberpunk is almost here and it will feature more raytracing techniques than Control. Now I can play with DLSS with all of it turned on and get image quality better than 1440p (around 1800p) upscaling from 1080p, or I can not on some AMD card and be stuck playing at 1080p. That's what is going down in a couple of weeks.

When you say ray tracing, have you seen the game? It's all surface reflections off of anything that has a reflective surface, which seems to be more in the city than in the wasteland sections. If there's an option to do what the PS5 does and bring the resolution of the ray-traced reflections down, then it's not going to be taxing.
 

Mister Wolf

Member
When you say ray tracing, have you seen the game? It's all surface reflections off of anything that has a reflective surface, which seems to be more in the city than in the wasteland sections. If there's an option to do what the PS5 does and bring the resolution of the ray-traced reflections down, then it's not going to be taxing.

Yes, I've seen the game. And I'm referring mainly to the raytraced GI and emissive textures. If you played Metro Exodus on PC you would understand how demanding they are. Throw in raytraced shadows plus ambient occlusion and you have a hog, not even counting the reflections. Not gimped reflections either.
 
Not gonna lie, but I honestly think some of the people are rooting for AMD only because their favorite console uses them. That's it. The real PC gamers know where the performance is, and they know what the budget option is. The only reason to go with AMD with its current lineup is if you could save money over getting the RTX 2070. Otherwise, there wouldn't be much of an option to do so, and suddenly a synthetic benchmark is supposed to mean better raytracing and rasterization? Why am I getting multiple deja vu memories right now...
 
When you say ray tracing, have you seen the game? It's all surface reflections off of anything that has a reflective surface, which seems to be more in the city than in the wasteland sections. If there's an option to do what the PS5 does and bring the resolution of the ray-traced reflections down, then it's not going to be taxing.



Ray-Traced Diffuse Illumination
Ray-Traced Reflections
Ray-Traced Ambient Occlusion
Ray-Traced Shadows

This game will be a visually different game on a 3080 than anywhere else.
 

spyshagg

Should not be allowed to breed
I've been through times where ATI was the premium, then Nvidia, then AMD, and now Nvidia.

A bad spell does not mean the company no longer knows how to make products. On these components, a bad decision can take half a decade to fix. Look at Intel.
 

Senua

Member
Not gonna lie, but I honestly think some of the people are rooting for AMD only because their favorite console uses them. That's it. The real PC gamers know where the performance is, and they know what the budget option is. The only reason to go with AMD with its current lineup is if you could save money over getting the RTX 2070. Otherwise, there wouldn't be much of an option to do so, and suddenly a synthetic benchmark is supposed to mean better raytracing and rasterization? Why am I getting multiple deja vu memories right now...
Leave thelastword alone!!!11


Also fucking lol at people downplaying DLSS
 
Leave thelastword alone!!!11


Also fucking lol at people downplaying DLSS
But... But... But FidelityFX is better than DLSS!!

Me: *But how? Any examples?*

Death Stranding!

Me: *Looks at Death Stranding... DLSS still looks better... Any other examples?*

Screeches + Random tirade of irrelevant stuff, and no proof.




We all know if Nvidia was used in consoles, DLSS would be the holy grail... But since it's PC and possibly Nintendo Switch exclusive... Screeches
 

llien

Member
Why am I getting multiple deja vu memories right now...
It is the green reality distortion field.
NV routinely beats AMD at synthetics, and the 5700 XT is a hands-down better card than the 2070.


Me: *Looks at Death Stranding... DLSS still looks better...
You look at Death Stranding where? In videos of the types known for shilling, or on your own monitor?


Any other examples?*
In a world where only a handful of games supports D2.0, which other game supports both D2.0 and FidelityFX?

This game will be a visually different game on a 3080 than anywhere else.
Surely, developers have invested a lot of effort for those 12 thousand (was it 20?) users who got that card to enjoy their game.
 
Not gonna lie, but I honestly think some of the people are rooting for AMD only because their favorite console uses them. That's it. The real PC gamers know where the performance is, and they know what the budget option is. The only reason to go with AMD with its current lineup is if you could save money over getting the RTX 2070. Otherwise, there wouldn't be much of an option to do so, and suddenly a synthetic benchmark is supposed to mean better raytracing and rasterization? Why am I getting multiple deja vu memories right now...

While you are probably right that some people are rooting for AMD simply because they provide the GPU/CPU in their console of choice, there are several key differences between the AMD of today and the AMD of 5 years ago.

In that timeframe AMD were almost bankrupt, their CPUs had almost no marketshare/mindshare and far worse performance than Intel, and they were completely starved of cash for R&D and on the verge of shutting their doors.

Since then they got their act together with Ryzen CPUs and their market cap and revenue have skyrocketed. Their Ryzen CPUs are in fact so good, and have continued to improve generation to generation, that they are now the market leader in terms of performance and DIY desktop CPU sales (of new chips).

The company has since grown immensely in size and gone from strength to strength. This is now an AMD with the cash and talent to invest in R&D to compete.

That is one huge chunk of the puzzle; the other is the departure of Raja from the Radeon group. He is now over at Intel, continuing to flounder in the graphics department, so the Radeon group has competent leadership with a renewed focus now that their CPU business is stable and in a good position.

We saw the first steps of this with the RDNA1 architecture, where the 5700 XT punched way above its weight and competed very, very well with Nvidia in the mid-range and below. Anyone paying attention should have been able to see that this was the beginning of a change of pace for the Radeon group.

Now RDNA2 seems to be further refining and improving that architecture, with help from the Ryzen engineers. This is not the Radeon group of old. To make things even better for them, Nvidia decided to own-goal themselves by cheaping out with the inferior Samsung 8nm node.

This node has caused them tons of problems, from high power draw/heat, which also limits clockspeed, performance and overclocking potential, to bad yields of the chips themselves, which have contributed to Nvidia's supply shortages (along with them rushing the launch to get ahead of AMD).

If you are a fan of technology in general and want to see competition in the market again, especially in the high end, to keep both competitors innovating and honest, then you should hopefully be pleasantly surprised by the 28th of October reveal. If not, and if you were, hypothetically, a raging Nvidia fanboy, then you are going to be very, very unpleasantly surprised come the 28th of October.

These benchmarks also do not exist in isolation; we have evidence of high clock speeds, large chips with up to 80 CUs on the top die, etc.

For comparison, the 5700 XT was 251 mm² with 40 CUs. Now double that size for Navi21 (500+ mm² is the current estimate), double the CUs, increase the clock speeds heavily, increase power efficiency by 50% (according to AMD's targets) and add IPC and other architectural improvements/gains, and of course you are going to get something far more powerful than a 5700 XT.
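As a rough back-of-envelope sketch of that argument (purely illustrative maths, not a leak): assuming raster throughput scales roughly with CU count × clock, with a small factor for IPC/architectural gains, and plugging in placeholder guesses for Navi21's clocks and uplift, you land somewhere around 2.5x a 5700 XT.

```python
# Back-of-envelope scaling estimate. The Navi21 inputs below are placeholder
# assumptions for illustration, not confirmed specifications.
def relative_throughput(cus: int, clock_ghz: float, arch_factor: float = 1.0) -> float:
    # Crude proxy: raster throughput ~ CU count * clock * per-CU efficiency.
    return cus * clock_ghz * arch_factor

rx_5700_xt = relative_throughput(cus=40, clock_ghz=1.9)                      # RDNA1 baseline
navi21_guess = relative_throughput(cus=80, clock_ghz=2.2, arch_factor=1.1)   # assumed clocks + IPC gain

print(f"Navi21 / 5700 XT ≈ {navi21_guess / rx_5700_xt:.1f}x")  # ~2.5x with these placeholder inputs
```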

Simply put, the figures all pretty much match up with performance in the 3080 tier. I understand people don't want to get burned by expecting too much and being let down, which is a fair mindset to have. However, you can't base all future trends and developments on the past; look at what Ryzen has done to Intel compared to where AMD was before with their processors. That was an incredible jump and it just continued to skyrocket over time. It really is not inconceivable for AMD to knock it out of the park again.

Just to be clear, Big Navi/Navi 21/Radeon doing well and competing with 3080 won't take away your 3080, it will still exist, you can still buy it (once it launches for real) and it will still be a great performing card with good features. The difference is this time there will be real competition from AMD and a genuine alternative to Nvidia. Most people should be happy about that.
 

RoboFu

One of the green rats
I have a 2080 and I don't use DLSS when available; it just always has some weird look to it. I do think it will continue to get better.
 

Krappadizzle

Gold Member
AquaticSquirrel Absolutely right in regard to AMD now versus then. They have really stepped up their game and it's rather impressive what they've done with the CPU side of their business. I'm hopeful that that success starts boiling over onto their GPU side as well, because they have been killing it lately. Really wish I had waited just a bit and gone Ryzen instead of the i7-9700K I put in my system last year.

I'm kicking myself too for not investing in AMD stock when they started the Ryzen lineup, but I was so sure that it was just gonna be another Bulldozer all over again. Annoyed and glad to be wrong on that front.
 
In a world where only a handful of games supports D2.0, which other game supports both D2.0 and FidelityFX?


Surely, developers have invested a lot of effort for those 12 thousand (was it 20?) users who got that card to enjoy their game.


Ok, so in the last 2 years, we have DLSS in these games:

Anthem
Battlefield 5
Control
Death Stranding
Final Fantasy 15
Avengers
Metro Exodus
Monster Hunter World
Wolfenstein
Tomb Raider

Now, these are not all of them. I pointed out the major titles that came out very recently, in the last 2 years. OK?

We will also have DLSS right in the next month in Watch Dogs Legion, in Call of Duty and in Cyberpunk. OK, now I await your response, since you're so bent on dismissing this tech: in what games should it have been? In the 2 years since it has existed? Huh? What major AAA games that are extremely taxing on hardware have come out besides these that DLSS is missing from? Do you expect DLSS in games like Doom that run at 500 frames on a 3080? In games like Shovel Knight maybe? No, because it doesn't make any sense there. You only need it in the heaviest triple-A games. Which, it seems, it's got covered.
 
AquaticSquirrel Absolutely right in regard to AMD now versus then. They have really stepped up their game and it's rather impressive what they've done with the CPU side of their business. I'm hopeful that that success starts boiling over onto their GPU side as well, because they have been killing it lately. Really wish I had waited just a bit and gone Ryzen instead of the i7-9700K I put in my system last year.

I'm kicking myself too for not investing in AMD stock when they started the Ryzen lineup, but I was so sure that it was just gonna be another Bulldozer all over again. Annoyed and glad to be wrong on that front.

Yeah they really have come a tremendously long way since 5 years ago. I'm glad to start to see that pouring over in the GPU department now with RDNA2.

Yeah I really missed the boat on buying AMD stocks back during the first Ryzen launch. But eh I wasn't as stable financially back then, plus I've never invested in stocks before. After I purchase a house early next year I'm going to start investing where I can.
 

Krappadizzle

Gold Member
Ok, so in the last 2 years, we have DLSS in these games:

Anthem
Battlefield 5
Control
Death Stranding
Final Fantasy 15
Avengers
Metro Exodus
Monster Hunter World
Wolfenstein
Tomb Raider

Now, these are not all of them. I pointed out the major titles that came out very recently, in the last 2 years. OK?

We will also have DLSS right in the next month in Watch Dogs Legion, in Call of Duty and in Cyberpunk. OK, now I await your response, since you're so bent on dismissing this tech: in what games should it have been? In the 2 years since it has existed? Huh? What major AAA games that are extremely taxing on hardware have come out besides these that DLSS is missing from? Do you expect DLSS in games like Doom that run at 500 frames on a 3080? In games like Shovel Knight maybe? No, because it doesn't make any sense there. You only need it in the heaviest triple-A games. Which, it seems, it's got covered.
I really like DLSS as well. But I won't pretend that it's had a great showing. We've had a handful of games in two years' time that use it, and of those games only a few use it effectively. Metro's implementation is less than stellar, and it took Battlefield 5 a long time before it was actually useful as well. Death Stranding, Wolfenstein, and Control are probably the best examples one could use to showcase the positives of DLSS. I do think that DLSS will have a big impact going forward, but if we are being honest, it's only just now getting the push it should have had 2 years ago.

And even then, we really need to see an AMD equivalent, which they've yet to show, but I bet they have something cooking in the kitchen. It'd be real nice if DLSS were platform-agnostic. But Nvidia doesn't like to share their toys as much, which is understandable, but we'd get more DLSS support in general if everyone could use it.
 
While you are probably right that some people are rooting for AMD simply because they provide the GPU/CPU in their console of choice, there are several key differences between the AMD of today and the AMD of 5 years ago.

In that timeframe AMD were almost bankrupt, their CPUs had almost no marketshare/mindshare and far worse performance than Intel, and they were completely starved of cash for R&D and on the verge of shutting their doors.

Since then they got their act together with Ryzen CPUs and their market cap and revenue have skyrocketed. Their Ryzen CPUs are in fact so good, and have continued to improve generation to generation, that they are now the market leader in terms of performance and DIY desktop CPU sales (of new chips).

The company has since grown immensely in size and gone from strength to strength. This is now an AMD with the cash and talent to invest in R&D to compete.

That is one huge chunk of the puzzle; the other is the departure of Raja from the Radeon group. He is now over at Intel, continuing to flounder in the graphics department, so the Radeon group has competent leadership with a renewed focus now that their CPU business is stable and in a good position.

We saw the first steps of this with the RDNA1 architecture, where the 5700 XT punched way above its weight and competed very, very well with Nvidia in the mid-range and below. Anyone paying attention should have been able to see that this was the beginning of a change of pace for the Radeon group.

Now RDNA2 seems to be further refining and improving that architecture, with help from the Ryzen engineers. This is not the Radeon group of old. To make things even better for them, Nvidia decided to own-goal themselves by cheaping out with the inferior Samsung 8nm node.

This node has caused them tons of problems, from high power draw/heat, which also limits clockspeed, performance and overclocking potential, to bad yields of the chips themselves, which have contributed to Nvidia's supply shortages (along with them rushing the launch to get ahead of AMD).

If you are a fan of technology in general and want to see competition in the market again, especially in the high end, to keep both competitors innovating and honest, then you should hopefully be pleasantly surprised by the 28th of October reveal. If not, and if you were, hypothetically, a raging Nvidia fanboy, then you are going to be very, very unpleasantly surprised come the 28th of October.

These benchmarks also do not exist in isolation; we have evidence of high clock speeds, large chips with up to 80 CUs on the top die, etc.

For comparison, the 5700 XT was 251 mm² with 40 CUs. Now double that size for Navi21 (500+ mm² is the current estimate), double the CUs, increase the clock speeds heavily, increase power efficiency by 50% (according to AMD's targets) and add IPC and other architectural improvements/gains, and of course you are going to get something far more powerful than a 5700 XT.

Simply put, the figures all pretty much match up with performance in the 3080 tier. I understand people don't want to get burned by expecting too much and being let down, which is a fair mindset to have. However, you can't base all future trends and developments on the past; look at what Ryzen has done to Intel compared to where AMD was before with their processors. That was an incredible jump and it just continued to skyrocket over time. It really is not inconceivable for AMD to knock it out of the park again.

Just to be clear, Big Navi/Navi 21/Radeon doing well and competing with 3080 won't take away your 3080, it will still exist, you can still buy it (once it launches for real) and it will still be a great performing card with good features. The difference is this time there will be real competition from AMD and a genuine alternative to Nvidia. Most people should be happy about that.
I'm glad to see AMD doing great, as most of my builds were AMD based, until the last couple of years.

Even if they can compete with the 3080, without having ample raytracing abilities or a solid answer to DLSS, I still wouldn't buy one for $100 less. I bought a 2080 Ti a few years ago, not just because it had much better rasterization than the 5700 XT, but because of raytracing + DLSS. I'd even get the 2070/S over it, for the same reason. I can completely understand the hype for these cards if they do in fact cover these bases, but for now I'll wait and see if they'll be much better than what next-gen consoles are displaying.
 

Ascend

Member
DLSS is great if implemented well, but the main thing hampering it is that it requires "manual" implementation. If you have an alternative that works for all games, like rendering at 80% resolution and using sharpening, it's simply a handier feature, because it's universal. The only thing AMD needs to do here is to automatically incorporate the scaling & sharpening, give it a proper name and market it. Right now you have to do it manually.
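To make the "render at 80% and sharpen" idea concrete, here is a minimal sketch of that kind of pipeline. It is not AMD's actual CAS shader; it just uses nearest-neighbour scaling plus a plain unsharp-mask filter, and the 80% scale and 0.5 strength are illustrative assumptions.

```python
import numpy as np

def downscale(img: np.ndarray, scale: float = 0.8) -> np.ndarray:
    """Nearest-neighbour downscale to `scale` of the original size (the cheaper render)."""
    h, w = img.shape
    ys = (np.arange(int(h * scale)) / scale).astype(int)
    xs = (np.arange(int(w * scale)) / scale).astype(int)
    return img[np.ix_(ys, xs)]

def upscale(img: np.ndarray, out_shape: tuple) -> np.ndarray:
    """Nearest-neighbour upscale back to the display resolution."""
    h, w = img.shape
    ys = np.arange(out_shape[0]) * h // out_shape[0]
    xs = np.arange(out_shape[1]) * w // out_shape[1]
    return img[np.ix_(ys, xs)]

def sharpen(img: np.ndarray, strength: float = 0.5) -> np.ndarray:
    """Unsharp mask: add back the difference between the image and a 3x3 box blur."""
    pad = np.pad(img, 1, mode="edge")
    blur = sum(pad[dy:dy + img.shape[0], dx:dx + img.shape[1]]
               for dy in range(3) for dx in range(3)) / 9.0
    return np.clip(img + strength * (img - blur), 0.0, 1.0)

# Render at 80% resolution, upscale to native, then sharpen.
native_frame = np.random.rand(1080, 1920)      # stand-in for a natively rendered frame
low_res = downscale(native_frame, scale=0.8)   # the cheaper 80%-resolution render
output = sharpen(upscale(low_res, native_frame.shape), strength=0.5)
```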

As for DLSS 2.0 vs FidelityFX...
But... But... But FidelityFX is better than DLSS!!

Me: *But how? Any examples?*

Death Stranding!

Me: *Looks at Death Stranding... DLSS still looks better... Any other examples?*

Screeches + Random tirade of irrelevant stuff, and no proof.
"But Death Stranding is a high-falutin' game with auteur aspirations, and this means that tiny details, like sparkly highlights in a cut scene, matter. Until Nvidia straightens this DLSS wrinkle up, or until the game includes a "disable DLSS for cut scenes" toggle, you'll want to favor FidelityFX CAS, which looks nearly identical to "quality DLSS" while preserving additional minute details and adding 2-3fps, to boot. "


But they also mention:

"Nvidia reached out to Ars Technica to point to moments where DLSS, as rendered exclusively on Nvidia's RTX line of GPUs, takes a commanding lead over the AMD method of CAS and upscaling. You'll see this most prominently in the game's "city" sequences, and the DLSS model has obviously been trained on common game-graphics elements like power lines and line-covered walls. I've included those comparison images, along with two of my own representative CAS versus DLSS captures, so you can get a sense of what you're missing out on if your GPU is outside the Nvidia RTX family. "

Short version: they both have advantages and disadvantages. Neither FidelityFX CAS nor DLSS 2.0 is perfect.

Me? I'll be sticking with my 21:9 1080p monitor for a while so that I don't have to use any upscaling tech.
 

Ascend

Member
I'm glad to see AMD doing great


Your post history indicates that's...


without having ample raytracing abilities
Did you even look at the last page...?

 


Your post history indicates that's...



Did you even look at the last page...?

A quick search through my history would say otherwise....



I'm hoping they deliver. I think they might have something up their sleeve, hence Nvidia's prices. But we shall see.








Exactly. These rumors sound great and all, but so did the RX 4xx/5xx, as well as the 57xx series and so forth. I've heard it a million times, but I want it to come true for real this time. Not that I would switch from Nvidia, but it would make them stay competitive, price-wise.
People swore up and down the RTX 30XX series would be very powerful, but cost more than the 20XX series. Nvidia killed the "more expensive" rumor by revealing its performance, series lineup and pricing.

Is it too much to ask for AMD to just spill the beans already? I'd love to be impressed and have my doubts removed, but I just can't help thinking they have dropped the ball on their GPUs yet again.

I'm completely manufacturer-agnostic between the two rivals. As a matter of fact, I'll even throw Intel into the mix. WHOEVER can provide me with the best performance gets my money. It used to be Intel for me in regards to CPUs, and my past 3 GPUs were all Nvidia, after switching from an AMD R9 390X a few years ago.

If AMD cannot beat out Nvidia in the enthusiast market with Big Navi, I'll continue to go with the proven gods of graphics. And I'm pretty much decided on AMD for processors from here on out, till Intel can give me a reason to come back to them.




 

Ascend

Member
A quick search through my history would say otherwise....
Really...? Let's see;

If you have to get a Red card to suffice instead of the best of the best, then yes lol
That alone is enough to discount your claim. I would keep going, but I have better things to do with my time. It's obvious that you're simply touting the latest, newest, shiniest stuff that nVidia propagates, because that's what you like. That's fine by me, but that you don't freely admit it when called out is a problem. But whatever.

In any case...


 
Really...? Let's see;


That alone is enough to discount your claim. I would keep going, but I have better things to do with my time. It's obvious that you're simply touting the latest, newest, shiniest stuff that nVidia propagates, because that's what you like. That's fine by me, but that you don't freely admit it when called out is a problem. But whatever.

In any case...



My posts even back it up. I didn't go back and edit anything because you think you had a "gotcha" moment, when you failed at that. If you haven't noticed by now, I get the best of the best. Did the 5700 XT or 2080 Ti have better performance? Did the 480/580/590 have better performance than the GTX 1080/Ti? If the answer is no, then my buying decision would go with the one that does. If they can beat the 3080 or 3090 this time around, I'd buy it.


Also, next time read the context of that quote. Maybe spend more time doing that instead of trying to prove a point you can't. I mentioned the 30XX cards not being in stock, so getting a Red card would suffice for the lack of a 30XX. It's pretty obvious too.
 

RoboFu

One of the green rats
Really...? Let's see;


That alone is enough to discount your claim. I would keep going, but I have better things to do with my time. It's obvious that you're simply touting the latest, newest, shiniest stuff that nVidia propagates, because that's what you like. That's fine by me, but that you don't freely admit it when called out is a problem. But whatever.

In any case...




13k would be an insane score.
 