
Rumor: AMD's Big Navi performance on par with the RTX 3080

GymWolf

Member
Sure, but it's good for us that AMD is staying in the game. Without them, NVIDIA and Intel have free rein to do whatever they want, and they can keep releasing marginally better hardware year after year instead of developing hardware with real performance boosts. A 5% boost in performance is nothing to brag about, but Intel specifically loved to do just that.
Not to be that guy, but has AMD being in the game really changed any of the bolded part? I think Intel and NVIDIA are doing whatever the fuck they want even with AMD being kinda competitive...
 

Papacheeks

Banned
Not to be that guy, but has AMD being in the game really changed any of the bolded part? I think Intel and NVIDIA are doing whatever the fuck they want even with AMD being kinda competitive...

I think Nvidia got a wake-up call, though, when it came to passing the cost of ray tracing on to consumers. And if the rumors about pricing are true, they may get another one.
 

GymWolf

Member
Errrr, Intel basically cut their high-end CPU by 50% 🤣
I'm sorry, I don't understand what you're saying. Can you explain?!

Oh wait, are you talking about prices? I'm not well informed about Intel prices in the last 2-3 years, unfortunately. I'm the kind of guy who only follows prices and all the other stuff when he actually has to buy something, so every 2-4 years (basically since Ryzen came out) :lollipop_grinning_sweat:
 

Zato

Banned
Lol, same rumours all the time.

AMD will not have the fastest card or close to it.


Their highest tier will be lucky to compete with the 3080.
 

IFireflyl

Gold Member
Not to be that guy, but has AMD being in the game really changed any of the bolded part? I think Intel and NVIDIA are doing whatever the fuck they want even with AMD being kinda competitive...

I didn't say AMD has already changed the game, but rather that them being in the game is good for customers moving forward. You're right that NVIDIA and Intel were basically doing whatever they wanted, but that's because there was no competition. Intel in particular found out the hard way that AMD was bringing the heat with their CPUs. That means Intel can't just rest on their laurels anymore lest AMD actually overtake them.

If the only person you're competing with is yourself, there isn't an incentive to go above and beyond in your work.
But if you're competing with others, there is an incentive to try harder so as not to be beaten by your competition.
 

GymWolf

Member
I didn't say AMD has already changed the game, but rather that them being in the game is good for customers moving forward. You're right that NVIDIA and Intel were basically doing whatever they wanted, but that's because there was no competition. Intel in particular found out the hard way that AMD was bringing the heat with their CPUs. That means Intel can't just rest on their laurels anymore lest AMD actually overtake them.

If the only person you're competing with is yourself, there isn't an incentive to go above and beyond in your work.
But if you're competing with others, there is an incentive to try harder so as not to be beaten by your competition.
Yeah yeah, I know the general rules of the market. I was just thinking about what AMD did to change how Nvidia does things... well, at least Intel did something.
 

IFireflyl

Gold Member
Yeah yeah, I know the general rules of the market. I was just thinking about what AMD did to change how Nvidia does things... well, at least Intel did something.

True, nothing has changed with NVIDIA yet. They've been the powerhouse in GPUs for so long, and AMD hasn't really broken out in the GPU department in recent years. But as long as AMD is releasing GPUs that are roughly in line with NVIDIA's it will be good for consumers. Maybe not this go-round, but the GPUs coming after Ampere would benefit from the AMD/NVIDIA competition, either because pricing will be more balanced or because the features will actually be worth the price.
 

GymWolf

Member
True, nothing has changed with NVIDIA yet. They've been the powerhouse in GPUs for so long, and AMD hasn't really broken out in the GPU department in recent years. But as long as AMD is releasing GPUs that are roughly in line with NVIDIA's it will be good for consumers. Maybe not this go-round, but the GPUs coming after Ampere would benefit from the AMD/NVIDIA competition, either because pricing will be more balanced or because the features will actually be worth the price.
We can only hope. The ball is in AMD's court (with Nvidia's dick very close to their asshole).
 
Sure, but it's good for us that AMD is staying in the game. Without them, NVIDIA and Intel have free rein to do whatever they want, and they can keep releasing marginally better hardware year after year instead of developing hardware with real performance boosts. A 5% boost in performance is nothing to brag about, but Intel specifically loved to do just that.
I don't disagree, but they will never be the best, if that makes sense...
 

SantaC

Member
We will just file this under "no shit"

seriously, AMD is no match for Intel or nVidia. The only way AMD can win is by shrinking their chips, but once the big guys shrink, they're simply more efficient and powerful and better designed. For instance, AMD at 7nm is only slightly better than Intel at 14nm+(repeated). Intel will destroy them at 10nm. nVidia will destroy them at 8nm.
Actually, Zen 3 is about to take the performance crown from Intel.
 
Hasn't this been the case for several years now? Fury beat the 980, Vega 64 competed with the 1080, and the Radeon VII competed with the 2080. The issue has always been that AMD hasn't surpassed the xx80 Ti, which has led to the amusing mindset across the PC community that AMD can't compete, irrespective of GPU performance tier.
 

The Skull

Member
We will just file this under "no shit"

seriously, AMD is no match for Intel or nVidia. The only way AMD can win is by shrinking their chips, but once the big guys shrink, they're simply more efficient and powerful and better designed. For instance, AMD at 7nm is only slightly better than Intel at 14nm+(repeated). Intel will destroy them at 10nm. nVidia will destroy them at 8nm.

Rumours and leaks from Intel are that 10nm is absolutely shit. So much so that they might be skipping it, hence the 14nm++++++++.
 

IFireflyl

Gold Member
Hasn't this been the case for several years now? Fury beat the 980, Vega 64 competed with the 1080, and the Radeon VII competed with the 2080. The issue has always been that AMD hasn't surpassed the xx80 Ti, which has led to the amusing mindset across the PC community that AMD can't compete, irrespective of GPU performance tier.

The Vega 64 was released a full year after the GTX 1080, and it was comparable to (but still trailed) the GTX 1080 in performance while being similarly priced. The Radeon VII was released two years after the GTX 1080 Ti, and it has worse performance. Not to mention that the RTX 2080 and RTX 2080 Ti also beat it, and the 2070 is virtually tied with it performance-wise (while also being cheaper).

AMD is holding Intel's feet to the fire on the CPU side, but they really need to step up their game when it comes to GPUs. There is no point in releasing their GPU after NVIDIA's if it isn't going to beat NVIDIA on either price or performance.
 

martino

Member
Need good RT perf and a DLSS answer, but given Nvidia's policy I think it's...
(reaction GIF)
 

llien

Member
seriously, AMD is no match for Intel or nVidia. The only way AMD can win is by shrinking their chips
Oh boy, I don't know where to start...

vs Intel
1) The first Zen was on an inferior process and still beat Intel on perf/watt.
2) In terms of security, Intel is inherently broken, to the point where even the newest chips are vulnerable.
3) The chiplet design is genius; it allows AMD to produce CPUs with an insane number of cores for a relatively low price (see the yield sketch after this list).
4) In terms of mobile chips, as I said above, Intel is not behind process-wise, but have you seen what the 4xxx (Renoir) notebook CPUs are doing to Intel? To illustrate: (embedded benchmark charts)
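On point 3, the cost claim follows from basic die-yield arithmetic. Here is a minimal sketch using the standard Poisson yield model; the defect density is an illustrative assumption rather than TSMC's actual figure, and the ~74mm² area is the published Zen 2 CCD size.

```python
import math

def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    """Poisson yield model: the fraction of dies with zero defects."""
    return math.exp(-die_area_mm2 * defects_per_mm2)

D0 = 0.001            # assumed defect density (defects/mm^2), illustrative only
chiplet = 74.0        # ~74 mm^2 Zen 2 CCD
monolithic = 8 * 74.0 # hypothetical monolithic die with the same total core area

print(f"chiplet yield:    {poisson_yield(chiplet, D0):.1%}")     # ~92.9%
print(f"monolithic yield: {poisson_yield(monolithic, D0):.1%}")  # ~55.3%
```

Eight small dies that mostly yield are far cheaper per working core than one big die that often doesn't, which is how the high core counts become economically feasible.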




vs Nvidia
GPUs are so much simpler than CPUs.
No need for a genius of Jim Keller's caliber here; normal funding will cut it. Oh wait, hold on, it already does.
As for the "but node advantage" argument, AMD was traditionally the first to embrace new nodes in the GPU business.
 

gatti-man

Member
Nothing about ray tracing performance, and without DLSS any Big Navi is still going to come up short against the 3080. Why even buy a 3080 if you aren't interested in ray tracing?
 

llien

Member
Nothing about ray tracing performance, and without DLSS any Big Navi is still going to come up short against the 3080. Why even buy a 3080 if you aren't interested in ray tracing?

RT performance is barely relevant (in all those 20 games that support it or are promised to support it, lol). And the "can't we render it at a lower resolution, then upscale... then claim it's a higher resolution" thing is a funny take; no thanks.

A GPU faster than the 2080/2080 Super makes sense on PCs for users who game at 4K but want higher framerates than consoles. The 3070 / Big Navi's smaller brother should be right there for 500-600 bucks.
 

godhandiscen

There are millions of whiny 5-year olds on Earth, and I AM THEIR KING.
Considering the 3080 is the equivalent of the x70 tier of past generations, it doesn't seem like anything has changed compared to previous generations.
 

Spukc

always chasing the next thrill
Besides that, DLSS and RTX mean fuck all if tested; they're easy to manipulate.
I want no-DLSS and RTX running side by side.
 
I feel like this should be pretty obvious.
Big Navi has 80 Compute Units (5120 shaders), and the RTX 3080 has 68 Streaming Multiprocessors (4352 shaders).

You just need to take a look at the 5700 XT vs the 2070 Super. Both have 2560 shaders. If you lock both at the same frequency, they perform pretty much identically. The only advantage Turing had over Navi 10 was that it could clock higher, and could scale a bit better to those clocks.
Ampere won't have such an advantage, as there appears to be no significant clockspeed increase this time around. Even if it could clock higher than the 1700MHz rated boost (according to leaks), it's already consuming a ridiculous 320W. The thermal headroom you'd need for higher clocks would be ludicrous.
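To put the shader-count comparison in numbers, here is a rough back-of-envelope sketch. The 2-FLOPs-per-shader-per-clock math (one FMA) is standard, but the boost clocks are assumptions pulled from the leaks discussed here, not confirmed specs, and it mirrors the post's Turing-style shader counting for the 3080.

```python
# Rough peak-FP32 estimate from the rumored shader counts in this thread.
# Boost clocks are assumptions, not confirmed specifications.

def fp32_tflops(shaders: int, boost_ghz: float) -> float:
    """Peak theoretical TFLOPS: shaders * clock * 2 (one FMA = 2 FLOPs)."""
    return shaders * boost_ghz * 2 / 1000.0

big_navi = fp32_tflops(80 * 64, 2.0)  # 80 CUs x 64 shaders, assumed ~2.0 GHz boost
rtx_3080 = fp32_tflops(68 * 64, 1.7)  # 68 SMs x 64 shaders, 1.7 GHz boost per the leak

print(f"Big Navi ~{big_navi:.1f} TFLOPS, RTX 3080 ~{rtx_3080:.1f} TFLOPS")
```

Under these assumed clocks the raw throughput comes out in the same ballpark, which is the post's point about Ampere losing the clockspeed advantage.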


PC Gamer reports it's on par with the 2080 Ti.


The bigger issue is you would have AMD hardware in your system.

This is exactly what Nvidia wants.
Can't have a competitive market if nobody buys the competing product.

Who's that again?
It's this guy:

(embedded tweet)
Note the date. It's also a reply to an earlier tweet of his own that got the GA100 die size reasonably accurate (leaked approx. 800mm², actual 826mm²; good enough) and also got the transistor count pretty damn accurate (leaked approx. 55 billion transistors, actual 54.2 billion; again, good enough).

He also had a tweet from Feb 2019 (which unfortunately I cannot find) where he noted that the fully unlocked GA100 die would have 128 SMs (8192 CUDA cores), which was absolutely correct. This tweet has him calling that the final A100 chip would have 1 GPC cut down, so only 108 SMs (6912 CUDA cores) active. He nailed the chip specs around a year in advance, and the final active processor specification a full month in advance. He's been basically spot on with pretty much every Nvidia leak so far.
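For what "good enough" means quantitatively, a quick sanity check on those two leaks (numbers taken straight from the post):

```python
# Relative error of the leaked GA100 figures quoted above.
def pct_error(leaked: float, actual: float) -> float:
    return abs(leaked - actual) / actual * 100

print(f"die size:    {pct_error(800, 826):.1f}% off")      # ~3.1%
print(f"transistors: {pct_error(55e9, 54.2e9):.1f}% off")  # ~1.5%
```

Both within a few percent, roughly a year ahead of the announcement.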

Here's another tweet:

(embedded tweet)
That's where he leaked the existence of GDDR6X, which nobody believed because there were no JEDEC specs for GDDR6X. A few weeks before the launch of gaming Ampere, lo and behold, Micron confirmed the existence of GDDR6X memory, which happens to be a non-JEDEC-certified VRAM spec.

A very reliable source of information.
 

IFireflyl

Gold Member
This is exactly what Nvidia wants.
Can't have a competitive market if nobody buys the competing product.

I am an (unapologetic) NVIDIA fanboy. Even if AMD releases a GPU that's slightly better than NVIDIA's I will stick with NVIDIA for now. They have a more consistent track record, in terms of both the GPU's power AND its support. Back in the day, when NVIDIA and AMD were running circles around each other, I remember plenty of games that had better support for NVIDIA than AMD. If that changes in the future and AMD proves it can match NVIDIA, I might change my mind. But for now, I'm in NVIDIA's pocket.

However, I do hope that AMD steps up (soon) and delivers a GPU that is on par with NVIDIA's top-tier GPU. This would only be beneficial to me as we would then have a real GPU war that would either give us lower-priced GPUs or GPUs with more useful features/functionality.
 

Ellery

Member
I am an (unapologetic) NVIDIA fanboy. Even if AMD releases a GPU that's slightly better than NVIDIA's I will stick with NVIDIA for now. They have a more consistent track record, in terms of both the GPU's power AND its support. Back in the day, when NVIDIA and AMD were running circles around each other, I remember plenty of games that had better support for NVIDIA than AMD. If that changes in the future and AMD proves it can match NVIDIA, I might change my mind. But for now, I'm in NVIDIA's pocket.

However, I do hope that AMD steps up (soon) and delivers a GPU that is on par with NVIDIA's top-tier GPU. This would only be beneficial to me as we would then have a real GPU war that would either give us lower-priced GPUs or GPUs with more useful features/functionality.

Is money a factor in your purchase decision, or do you buy the best GPU each generation?
(Genuine question, no mean intentions or anything.)
 
I am an (unapologetic) NVIDIA fanboy. Even if AMD releases a GPU that's slightly better than NVIDIA's I will stick with NVIDIA for now. They have a more consistent track record, in terms of both the GPU's power AND its support. Back in the day, when NVIDIA and AMD were running circles around each other, I remember plenty of games that had better support for NVIDIA than AMD. If that changes in the future and AMD proves it can match NVIDIA, I might change my mind. But for now, I'm in NVIDIA's pocket.

However, I do hope that AMD steps up (soon) and delivers a GPU that is on par with NVIDIA's top-tier GPU. This would only be beneficial to me as we would then have a real GPU war that would either give us lower-priced GPUs or GPUs with more useful features/functionality.
Well, you're not gonna get that.
That exact mentality is what has led to the current market situation. AMD, and ATi before them, have been thought of as a mechanism for getting Nvidia to lower their prices. You can't provide the software development resources for driver development etc. if nobody buys your GPU. That costs money.

Since the HD 5000 series, people have patted AMD on the back for delivering very good value products, and then promptly gone and bought an Nvidia card instead as soon as the price dropped. Back then, AMD's drivers weren't even that bad, which is the amazing thing. Nvidia had their fair share of appalling drivers from that era. In one infamous case, a driver accidentally removed the software thermal limiter on the GPU and literally caused the GPU die to go into thermal runaway and catch fire. Nvidia have had bumpgate, a problem where the "bumps" on Nvidia chips would wear and break due to thermal stress (lol Fermi). This problem was awful, and yet Nvidia never accepted fault and instead blamed OEMs. If you want to know why Apple to this day do not use Nvidia GPUs: bumpgate is the cause.
Then there was the infamous GTX 970 3.5GB fiasco, as recently as Maxwell. The GTX 970 was advertised as a core-reduced 980 with 4GB of RAM and 64 ROPs. The reality was that Nvidia had not just disabled CUDA cores; they had also disabled part of a memory controller and its associated ROPs. So only 3.5GB of VRAM was actually usable at full speed, and only 56 ROPs were active.
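The 970's segmented memory is easy to put in numbers. A minimal sketch based on Nvidia's post-launch disclosure of the layout (the per-chip bandwidth figure assumes the card's stock 7 Gbps effective GDDR5):

```python
# GTX 970 memory layout: 8 x 512 MB GDDR5 chips, but one L2/ROP partition
# is disabled, so only 7 chips sit on the full-speed path; the 8th shares
# an L2 port and runs far slower.

chips, chip_mb = 8, 512
fast_chips = 7

fast_pool = fast_chips * chip_mb            # 3584 MB: the "3.5 GB" fast segment
slow_pool = (chips - fast_chips) * chip_mb  # 512 MB slow segment

per_chip_gbs = 7.0 * 32 / 8  # 7 Gbps effective x 32-bit bus per chip = 28 GB/s

print(f"fast segment: {fast_pool} MB @ {fast_chips * per_chip_gbs:.0f} GB/s")  # 196 GB/s
print(f"slow segment: {slow_pool} MB @ {per_chip_gbs:.0f} GB/s")               # 28 GB/s
```

The advertised 224 GB/s figure only holds if all eight chips could be read in parallel, which they couldn't.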
More recently than that, did people forget the frequency with which 2080 Tis failed and needed to be RMA'd?

The reality is that Nvidia has had their fair share of problems, with drivers and even hardware failures. During GCN, AMD's drivers were nearly faultless, as succeeding architectures were derivatives of the previous set; the drivers had matured and everything was perfectly fine. The drivers were so good, in fact, that the HD 7970 (Tahiti), which started off slower than the GTX 680 (GK104), is now in some circumstances faster than the GTX 780 Ti (GK110). The only time I remember drivers being a problem was Navi 10, where the move to RDNA was handled exceptionally poorly driver-wise.

Why has everyone forgotten all this shit? Instead, AMD and ATI have had to suffer this "bad driver" meme that has been haunting them for decades.

I am by no means an AMD fanboy, but if that's how you perceive me then so be it. I've been following this market for years now, and I've noticed a pattern. Nvidia's problems are either forgotten or framed as user error. Meanwhile, AMD's problems are not only remembered, but hang over them like a spectre and are used to caveat any GPU launch before it's even happened. "Well, they might have the performance crown, but their drivers are bad, so I'll buy Nvidia anyway."

I suppose you could blame a combination of willful ignorance and/or foolhardy brand loyalty on the part of the PCMR community, plus the fact that Nvidia deployed a systematic astroturfing campaign in partnership with the Arbuthnot Entertainment Group (AEG), offering freebies to prominent members of the tech community and forums to talk up Nvidia products while under an NDA never to mention that they were receiving kickbacks. The current state of the GPU market is the logical conclusion of a campaign set in motion over a decade ago.

So now you can look at the 5700/5700 XT as an example of AMD no longer playing the price-cut game. They'll undercut Nvidia by as little as they can get away with. Why bother cutting their own margins into oblivion when Nvidia will cut GeForce prices to be $50-100 above the equivalent Radeon and people go GeForce anyway? They will now try to maximise the profit on any GPU they sell. If Nvidia raise prices by $200 per tier, AMD will happily set their prices $50-100 below that benchmark.

If they don't sell that well? Well, who cares? AMD sells 20-30 million console APUs every year, which creates great revenue even if margins are lower. And the rest of their energy will be spent on the datacentre, where margins are much higher than in PC gaming.

The PCMR has spoken, and they seem to be willing to pay more for Nvidia at any given price or tier. So I guess we'll just have to deal with it.
 

nochance

Banned
Given current rumours, that would still put them a generation behind. I miss the Radeon 9800 days.

AMD has squandered their 7nm opportunity on every front, and the only thing that they really have going for them seems to be the easily excitable and very militant fanbase.
 

IFireflyl

Gold Member
Is money a factor in your purchase decision, or do you buy the best GPU each generation?
(Genuine question, no mean intentions or anything.)

Money really isn't a huge factor. I have no desire to pay $2,500 for the RTX Titan, but I bought the 2080 Ti when it came out. If the rumors about the 3090 are true (meaning it turns out to be $2,000 USD), then I'll probably get the next card down. But if it's $1,500 or less, I'm jumping on it.
 

Ellery

Member
Money really isn't a huge factor. I have no desire to pay $2,500 for the RTX Titan, but I bought the 2080 Ti when it came out. If the rumors about the 3090 are true (meaning it turns out to be $2,000 USD), then I'll probably get the next card down. But if it's $1,500 or less, I'm jumping on it.

Okay, thanks for sharing your point of view. I would probably buy a lot more high-end GPUs if I had a really good job, but I make less than $80K a year and don't have a 4K, ultrawide, or triple-monitor setup. So my 2080 at 1440p/144Hz is still blasting through all games without breaking a sweat.
 
This is a new GPU launch for AMD, so expect to be disappointed.

Expect it to be less powerful and more expensive than the rumors led you to believe.

AT BEST, Big Navi will be on par with a 3080, but with far worse ray tracing performance and no answer for DLSS.

Prove me wrong AMD.
 

IFireflyl

Gold Member
Well, you're not gonna get that.
That exact mentality is what has led to the current market situation. AMD, and ATi before them, have been thought of as a mechanism for getting Nvidia to lower their prices. You can't provide the software development resources for driver development etc. if nobody buys your GPU. That costs money.

When companies manufacture a product, they have to take into consideration the cost of maintaining that product. For GPUs, that means driver support. Regardless of the number of sales, the company needs a plan for how they will implement driver support, how often they'll deploy updates, and the most efficient way to make that happen. AMD wasn't a nobody when it came to GPUs. They sold enough GPUs that they should have had the resources available for driver development.

Since the HD 5000 series, people have patted AMD on the back for delivering very good value products, and then promptly gone and bought an Nvidia card instead as soon as the price dropped. Back then, AMD's drivers weren't even that bad, which is the amazing thing. Nvidia had their fair share of appalling drivers from that era. In one infamous case, a driver accidentally removed the software thermal limiter on the GPU and literally caused the GPU die to go into thermal runaway and catch fire. Nvidia have had bumpgate, a problem where the "bumps" on Nvidia chips would wear and break due to thermal stress (lol Fermi). This problem was awful, and yet Nvidia never accepted fault and instead blamed OEMs. If you want to know why Apple to this day do not use Nvidia GPUs: bumpgate is the cause.
Then there was the infamous GTX 970 3.5GB fiasco, as recently as Maxwell. The GTX 970 was advertised as a core-reduced 980 with 4GB of RAM and 64 ROPs. The reality was that Nvidia had not just disabled CUDA cores; they had also disabled part of a memory controller and its associated ROPs. So only 3.5GB of VRAM was actually usable at full speed, and only 56 ROPs were active.

I built my first PC in 2011. I can't speak to every individual issue that NVIDIA (or AMD) had, nor was that the point of my post. You're reading far too much into what I said if you're defending AMD this hard.

More recently than that, did people forget the frequency with which 2080 Tis failed and needed to be RMA'd?

I have never had an issue with my 2080 Ti, and I don't know how many failures there were.

The reality is that Nvidia has had their fair share of problems, with drivers and even hardware failures. During GCN, AMD's drivers were nearly faultless, as succeeding architectures were derivatives of the previous set; the drivers had matured and everything was perfectly fine. The drivers were so good, in fact, that the HD 7970 (Tahiti), which started off slower than the GTX 680 (GK104), is now in some circumstances faster than the GTX 780 Ti (GK110). The only time I remember drivers being a problem was Navi 10, where the move to RDNA was handled exceptionally poorly driver-wise.

Great?

Why has everyone forgotten all this shit? Instead, AMD and ATI have had to suffer this "bad driver" meme that has been haunting them for decades.

How did my one post equate to what you're asking? I never said AMD was some garbage company that didn't know how to make a GPU. I said that NVIDIA tended to perform better than AMD back in the day (early 2010s), and that stuck with me. AMD hasn't led the way in the GPU arena once since then.

I am by no means an AMD fanboy, but if that's how you perceive me then so be it. I've been following this market for years now, and I've noticed a pattern. Nvidia's problems are either forgotten or framed as user error. Meanwhile, AMD's problems are not only remembered, but hang over them like a spectre and are used to caveat any GPU launch before it's even happened. "Well, they might have the performance crown, but their drivers are bad, so I'll buy Nvidia anyway."

You're the only one that mentioned AMD's issues. I only said that NVIDIA's GPUs were (slightly) more powerful than AMD's. And that was true. And is still true.

I suppose you could blame a combination of willful ignorance and/or foolhardy brand loyalty on the part of the PCMR community, plus the fact that Nvidia deployed a systematic astroturfing campaign in partnership with the Arbuthnot Entertainment Group (AEG), offering freebies to prominent members of the tech community and forums to talk up Nvidia products while under an NDA never to mention that they were receiving kickbacks. The current state of the GPU market is the logical conclusion of a campaign set in motion over a decade ago.

This has nothing to do with my post.

So now you can look at the 5700/5700 XT as an example of AMD no longer playing the price-cut game. They'll undercut Nvidia by as little as they can get away with. Why bother cutting their own margins into oblivion when Nvidia will cut GeForce prices to be $50-100 above the equivalent Radeon and people go GeForce anyway? They will now try to maximise the profit on any GPU they sell. If Nvidia raise prices by $200 per tier, AMD will happily set their prices $50-100 below that benchmark.

If they don't sell that well? Well, who cares? AMD sells 20-30 million console APUs every year, which creates great revenue even if margins are lower. And the rest of their energy will be spent on the datacentre, where margins are much higher than in PC gaming.

Alright?

The PCMR has spoken, and they seem to be willing to pay more for Nvidia at any given price or tier. So I guess we'll just have to deal with it.

Wow. I didn't realize that one person spoke for the entire "PCMR", and I certainly didn't realize that I was that person. Thanks for educating me. It feels good to have all of this power.
 

Ellery

Member
Big if true! But how will the drivers & the prices compare?

There are so few Big Navi rumors that it's hard to make an educated guess. It probably points to a late November/December release.

As far as the drivers go, the "worry" about them has been greatly exaggerated; people weirdly think that AMD drivers crash all the time or have other strange issues, which is not true. I used an R9 290X for 5 years before upgrading to an Nvidia RTX 2080 and never had an issue. I know it's anecdotal evidence, but there are also regular "driver reviews" that compare how well Nvidia and AMD drivers are doing.
One thing I hugely admired about my R9 290X was that it became better over time. As much as I hate the phrase, it "aged like fine wine", thanks in part to the generous amount of VRAM AMD cards have always come with and the wide memory bandwidth.

As for the price, it is usually AMD that slightly undercuts the rival Nvidia product. So if a 5700 XT is around as fast as an RTX 2070, it's usually 5-15% cheaper or so, and I think the same would happen with the Big Navi card if it competes with the RTX 3080.

If I had to take a completely blind guess at the best-case scenario, I'd say the Big Navi card slots in somewhere between the RTX 3080 and the RTX 3090 (probably much closer to the RTX 3080) and costs around ~$999. That would then be answered by an RTX 3080 Ti (maybe?) from Nvidia that slightly beats the Big Navi card for the same price, forcing AMD to lower theirs.

The worst-case scenario would be between the 3070 and 3080, where Nvidia can answer with an RTX 3070 Ti.

(Both of these scenarios are pure speculation on my end, only considering what AMD told us about RDNA2 improvements and extrapolating performance from the 250mm² 5700 XT with regard to diminishing returns and lower clocks within TDP limits. A rough version of that extrapolation is sketched below.)
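For what that extrapolation might look like in numbers, here is a minimal sketch. The scaling-efficiency and clock-uplift factors are pure assumptions chosen for illustration, not anything AMD has stated:

```python
# Naive performance extrapolation from Navi 10 (5700 XT) to an 80-CU Big Navi.
# Both tuning factors below are guesses, chosen only to illustrate the method.

navi10_cus = 40
big_navi_cus = 80
scaling_efficiency = 0.85  # assumed: doubling CUs rarely doubles real-world FPS
clock_factor = 1.10        # assumed: modest clock uplift within the TDP limit

relative_perf = (big_navi_cus / navi10_cus) * scaling_efficiency * clock_factor
print(f"~{relative_perf:.2f}x the 5700 XT")  # ~1.87x under these assumptions
```

Swap in your own efficiency and clock guesses to see how quickly the result slides between the two scenarios above.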
 
When companies manufacture a product they have to take into consideration the cost of maintaining that product. In regards to GPU, that would be driver support. Regardless of the number of sales the company needs to have a plan for how they will implement driver support, how often they'll deploy updates, and the most efficient way to make that happen. AMD wasn't a nobody when it came to GPUs. They sold enough GPUs that they should have had the resources available for driver development.
Well, my rant wasn't aimed directly at you specifically.
I've just become somewhat weary of seeing "I hope AMD becomes competitive again so Nvidia are forced to bring prices back down." Unfortunately, I ended up roping you into that frustration.

How did my one post equate to what you're asking? I never said AMD was some garbage company that didn't know how to make a GPU. I said that NVIDIA tended to perform better than AMD back in the day (early 2010's), and that stuck with me. AMD hasn't led the way in the GPU arena once since then.

You're the only that mentioned AMD's issues. I only said that the NVIDIA GPUs were (slightly) more powerful than AMDs. And that was true. And is still true.

The 2010s onwards were very competitive. AMD took the performance crown no less than twice in that period, mind you. The R9 290X was faster than the GTX Titan. The R9 Fury X was more or less tied with the 980 Ti. Nvidia didn't really pull away from AMD in terms of performance until Pascal in 2016. Until that point, AMD consistently offered better value products.

This has nothing to do with my post.

You are correct. I went on a bit of a rant. Apologies.


???
You explained how you hoped AMD would bring competition so you could benefit from lower Nvidia prices.
I just offered my hypothesis as to why I think that won't happen.
 

IFireflyl

Gold Member
Well, my rant wasn't aimed directly at you specifically.
I've just become somewhat weary of seeing "I hope AMD becomes competitive again so Nvidia are forced to bring prices back down." Unfortunately, I ended up roping you into that frustration.

Ah, yes. I think we have all been there at one point or another.

The 2010s onwards were very competitive. AMD took the performance crown no less than twice in that period, mind you. The R9 290X was faster than the GTX Titan. The R9 Fury X was more or less tied with the 980 Ti. Nvidia didn't really pull away from AMD in terms of performance until Pascal in 2016. Until that point, AMD consistently offered better value products.

The R9 290X was better than the GTX Titan, but the R9 290X was released more than 8 months after the GTX Titan launched. Less than 11 months later, NVIDIA took back the number one slot with the GTX 980. As for the R9 Fury X, I think you're misremembering, because the GTX 980 Ti was drastically superior to the R9 Fury X.


You are correct. I went on a bit of a rant. Apologies.

No apologies necessary. I get it, and I'm glad to know I wasn't the overall problem. :messenger_relieved:
 