I am an (unapologetic) NVIDIA fanboy. Even if AMD releases a GPU that's slightly better than NVIDIA's, I will stick with NVIDIA for now. They have a more consistent track record, and I mean that in terms of the GPU's power AND its support. Back in the day, when NVIDIA and AMD were running circles around each other, I remember plenty of games that had better support for NVIDIA than AMD. If that changes in the future and AMD proves it can match NVIDIA, I might change my mind. But for now, I'm in NVIDIA's pocket.
However, I do hope that AMD steps up (soon) and delivers a GPU that is on par with NVIDIA's top-tier GPU. This would only be beneficial to me as we would then have a real GPU war that would either give us lower-priced GPUs or GPUs with more useful features/functionality.
Well you're not gonna get that.
That exact mentality is what has led to the current market situation. AMD, and ATi before them, have been treated as a mechanism for getting Nvidia to lower their prices. But you can't fund the software development resources for driver work and the like if nobody buys your GPUs. That costs money.
Since the HD 5000 series, people have patted AMD on the back for delivering very good value products, and then promptly gone and bought an Nvidia card instead as soon as the price dropped. The amazing thing is that AMD's drivers weren't even that bad back then, while Nvidia had their fair share of appalling drivers from that era. In one infamous case, a driver accidentally removed the software thermal limiter on the GPU and literally caused the die to go into thermal runaway and catch fire. Nvidia have also had bumpgate, a problem where the "bumps" on Nvidia chips would wear and break due to thermal stress (lol Fermi). This problem was awful, and yet Nvidia never accepted fault and instead blamed OEMs. If you want to know why Apple to this day do not use Nvidia GPUs, bumpgate is the cause.
Then there was the infamous GTX 970 3.5GB fiasco, as recently as Maxwell. The GTX 970 was advertised as a core-reduced 980 with 4GB of RAM and 64 ROPs. In reality, Nvidia had not just disabled CUDA cores, but had also disabled part of a memory controller and its associated ROPs. So only 3.5GB of VRAM was actually usable at full speed, and only 56 ROPs were active.
More recently than that, did people forget the frequency with which 2080 Tis failed and needed to be RMA'd?
The reality is that Nvidia has had their fair share of problems, with drivers and even hardware failures. During the GCN era, AMD's drivers were nearly faultless, since each succeeding architecture was a derivative of the previous one; the drivers had matured and everything was perfectly fine. The drivers were so good, in fact, that the HD 7970 (Tahiti), which started off slower than the GTX 680 (GK104), is now in some circumstances faster than the GTX 780 Ti (GK110). The only time I remember drivers being a problem was Navi 10, where the move to RDNA was handled exceptionally poorly on the driver side.
Why has everyone forgotten all this shit? Instead, AMD and ATi have had to suffer a "bad driver" meme that's been haunting them for decades.
I am by no means an AMD fanboy, but if that's how you perceive me then so be it. I've been following this market for years now, and I've noticed a pattern: Nvidia's problems are either forgotten or framed as user error, while AMD's problems are not only remembered, but hang over them like a spectre and are used to caveat any GPU launch before it's even happened. "Well, they might have the performance crown, but their drivers are bad, so I'll buy Nvidia anyway."
I suppose you could blame a combination of willful ignorance and/or foolhardy brand loyalty on the part of the PCMR community, plus the fact that Nvidia deployed a systematic astroturfing campaign in partnership with the Arbuthnot Entertainment Group (AEG), offering freebies to prominent members of the tech community and forums to talk up Nvidia products while under an NDA never to mention that they were receiving kickbacks. The current state of the GPU market is the logical conclusion of a campaign set in motion over a decade ago.
So now you can look at the 5700/5700 XT as an example of AMD no longer playing the price-cut game. They'll undercut Nvidia by as little as they can get away with. Why cut their own margins into oblivion when Nvidia will just cut GeForce prices to sit $50-100 above the equivalent Radeon, and people will go GeForce anyway? AMD will now try to maximise the profit on any GPU they sell. If Nvidia raise prices by $200 per tier, AMD will happily set their prices $50-100 below that benchmark.
And if they don't sell that well? Well, who cares? AMD sell 20-30 million console APUs every year, which creates great revenue even if margins are lower. The rest of their energy will be spent on the datacentre, where margins are much higher than in PC gaming.
The PCMR has spoken, and they seem to be willing to pay more for Nvidia at any given price or tier. So I guess we'll just have to deal with it.