
Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

jfoul

Member
Yeah. I feel pretty confident that EVGA will take care of its customers.

I have the FTW and am in queue for the FTW+ (probably for another few weeks lol).

I got the FTW+ yesterday, and it seems to be a winner. Performs better than my G1 with the luxury of 0db mode. It also has Samsung memory compared to the Hynix on my G1.

I'll be sticking with this card unless the GTX980 drops in price over the next 90 days. It's nice to be back with EVGA, with the step-up option in my back pocket.
 

Ryne

Member
I got the FTW+ yesterday, and it seems to be a winner. Performs better than my G1 with the luxury of 0db mode. It also has Samsung memory compared to the Hynix on my G1.

I'll be sticking with this card unless the GTX980 drops in price over the next 90 days. It's nice to be back with EVGA, with the step-up option in my back pocket.

Can't wait for my FTW+ to arrive, I don't know why I paid for ground shipping when I'm so impatient. Shipping from Industry, CA to Mississauga, Ontario, Canada is taking forever.
 
http://www.pcper.com/news/Graphics-Cards/NVIDIA-Responds-GTX-970-35GB-Memory-Issue

Nvidia speaks.

The GeForce GTX 970 is equipped with 4GB of dedicated graphics memory. However the 970 has a different configuration of SMs than the 980, and fewer crossbar resources to the memory system. To optimally manage memory traffic in this configuration, we segment graphics memory into a 3.5GB section and a 0.5GB section. The GPU has higher priority access to the 3.5GB section. When a game needs less than 3.5GB of video memory per draw command then it will only access the first partition, and 3rd party applications that measure memory usage will report 3.5GB of memory in use on GTX 970, but may report more for GTX 980 if there is more memory used by other commands. When a game requires more than 3.5GB of memory then we use both segments.
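Nvidia's statement amounts to a priority allocator over two pools: requests fill the fast 3.5GB segment first and spill into the slow 0.5GB segment only once the fast one is full. As a rough illustration (this is not Nvidia's actual driver logic; the sizes are just the figures from the statement above), the policy could be sketched as:

```python
# Hypothetical sketch of the two-segment policy Nvidia describes.
# Sizes in MB: a 3584 MB high-priority pool and a 512 MB low-priority pool.
FAST_MB, SLOW_MB = 3584, 512

def allocate(requests_mb):
    """Serve each request in order, fast segment first; return (fast_used, slow_used)."""
    fast_used = slow_used = 0
    for req in requests_mb:
        fast_free = FAST_MB - fast_used
        to_fast = min(req, fast_free)   # fill the high-priority segment first
        fast_used += to_fast
        slow_used += req - to_fast      # remainder spills to the slow segment
    return fast_used, slow_used

# A game using 3 GB touches only the fast segment...
print(allocate([3072]))   # (3072, 0)
# ...while one needing 3.75 GB spills 256 MB into the slow segment.
print(allocate([3840]))   # (3584, 256)
```

On this model, third-party tools reporting "3.5GB in use" on a 970 is exactly what you'd expect whenever a game's working set fits the fast pool.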
 

Their testing methodology isn't good at showing what actually occurs. Better would be:
1. Run the game with low textures at something like 4K, so the VRAM never fills to even 3.5GB, then measure performance.

2. Then set the textures to ULTRA, filling VRAM beyond 3.5GB at 4K, and measure performance again. This would show the actual cost of filling that last 512MB.
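To illustrate what that two-pass comparison would surface, here is a toy calculation over invented frame-time logs (the numbers are made up, not measurements): one run standing in for "low textures, under 3.5GB" and one for "ultra textures, spilled past 3.5GB" at the same resolution.

```python
# Compare two hypothetical runs by average fps and worst-case frame time (ms).
# The frame-time values are invented purely to show the shape of the comparison.
def summarize(frame_times_ms):
    avg_fps = round(1000.0 * len(frame_times_ms) / sum(frame_times_ms), 1)
    worst_ms = max(frame_times_ms)
    return avg_fps, worst_ms

low_tex   = [16, 17, 16, 17, 16, 17]   # stays inside the fast 3.5 GB segment
ultra_tex = [16, 17, 16, 60, 16, 58]   # hypothetical spill-induced spikes

print(summarize(low_tex))     # (60.6, 17)
print(summarize(ultra_tex))   # (32.8, 60)
```

The worst-case column is the one that would expose the cost of the last 512MB; an average alone smooths the spikes away.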
 
Their testing methodology isn't good at showing what actually occurs. Better would be:
1. Run the game with low textures at something like 4K, so the VRAM never fills to even 3.5GB, then measure performance.

2. Then set the textures to ULTRA, filling VRAM beyond 3.5GB at 4K, and measure performance again. This would show the actual cost of filling that last 512MB.

Why even bother testing? We know what the performance of that last 0.5GB is, and it's super shit; that has been tested extensively already. Whatever they do in their drivers to juggle VRAM around to mitigate it is beside the point. They're just skirting the issue, with no guarantee that they can avoid it in all cases.

I just don't understand why they had to lie when this card would have been perfectly fine sold as a 3.5GB GPU.
 

Piers

Member
I just don't understand why they had to lie when this card would have been perfectly fine sold as a 3.5GB GPU.

Presumably because it's not as attractive as 4GB, and 3GB would be seen as too low for many. Regardless, people who actually want 4GB will have to re-invest in a new graphics card as of now.
 

teiresias

Member
This means hardware reviewers and journalists will now have to carefully scrutinize the performance and architecture of mid-range cards that are simply higher-end card cores with portions of their hardware disabled, in order to explain to their readers how those core changes affect things like memory bandwidth. They should have been doing this anyway, but hopefully this makes it apparent to them.
 

IceIpor

Member
That depends on how important drivers are to you.

With the whole 970 memory thing still under investigation, it seems silly to jump to conclusions and buy one thing or the other based on something which is still unclear. If you're concerned, wait and see what Nvidia finds out and says.

Well, AMD drivers are pretty on par with Nvidia now, so that doesn't enter into the equation.

Now if having full 4GB is important to you (instead of 3.5GB for whatever reason), then I would steer clear of the 970 until it gets revised now that Nvidia have spoken.
 
Well, AMD drivers are pretty on par with Nvidia now, so that doesn't enter into the equation.

Now if having full 4GB is important to you (instead of 3.5GB for whatever reason), then I would steer clear of the 970 until it gets revised now that Nvidia have spoken.

I'm pretty sure it has a full 4GB and all 4GB works. I'll go with the card that has the better drivers, thanks.
 
 

M3d10n

Member

So, it's the 2GB GTX 660 all over again? They did the same shit back then: a 1.5GB "partition" that was fast and a 0.5GB partition that was on a 64-bit bus or something like that, and was very slow if it had to be used.

However, there were 1.5GB and 3GB variants that weren't affected by this issue. They really should have released the 970 as 3.5GB instead of adding an extra, virtually unusable 0.5GB just to tick a box.

If they update the drivers to "fix" this, it will probably simply shut off access to the slow memory and cap the card at 3.5GB.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
How does it perform at 1440p? I could also game at that resolution on it, I guess, or even at 1080p if needed.

I finished AC: Unity and I had it set at 1440p with all features maxed and FXAA, downsampled to 1080p. I capped it at 30fps with RivaTuner and it never dipped below 30fps once during gameplay. A small minority of the cutscenes (like the one just before the final battle) briefly dipped to the mid-20s.
I tried the same thing at 4K and it was a solid 30fps sometimes, but it spent a lot of time in the mid-20s and dipped as low as 17fps worst-case. This was with a single 970 at 1450MHz.
 

Helmholtz

Member
So am I better off going with an R9 290x over the 970?
I recently bought an R9 290, and for what it's worth I like it well enough so far. I got one with a pretty serious heatsink/3-fan cooler though, because I had heard that these cards run hot, and so far it's been quite quiet and has run pretty cool. There's some very minor coil whine unfortunately, but nothing that I'm too concerned about, and that would vary card to card.
I didn't get a 970 mainly because the price difference between a 290 and a 970 was pretty significant where I live, and I think the performance of the 290 should be able to do what I want it to.
If you want to wait though, I think AMD will be announcing a new line soon?
 

Dezzy

Member
I finished AC: Unity and I had it set at 1440p with all features maxed and FXAA, downsampled to 1080p. I capped it at 30fps with RivaTuner and it never dipped below 30fps once during gameplay. A small minority of the cutscenes (like the one just before the final battle) briefly dipped to the mid-20s.
I tried the same thing at 4K and it was a solid 30fps sometimes, but it spent a lot of time in the mid-20s and dipped as low as 17fps worst-case. This was with a single 970 at 1450MHz.

Are you using DSR to downsample from 1440p? Or an actual 2560x1440 monitor with the game running at 1080p?

I have a GTX 970, i7 4790k, and 16GB of RAM, and I currently use a 1920x1200 (16:10) monitor.
I can play Far Cry 4, AC: Unity and Dragon Age: Inquisition all at 60fps. I've been itching for a 1440p display though, and I'm wondering how big of a hit my fps would take.
I've used DSR to play at 2560x1600 to test the performance and it gets quite choppy in all three of those games.

I've heard that using DSR to downscale from 1440p performs worse than actually playing on a 1440p monitor, and I wonder if anyone can confirm that?
 

mephixto

Banned
So, it's the 2GB GTX 660 all over again? They did the same shit back then: a 1.5GB "partition" that was fast and a 0.5GB partition that was on a 64-bit bus or something like that, and was very slow if it had to be used.

However, there were 1.5GB and 3GB variants that weren't affected by this issue. They really should have released the 970 as 3.5GB instead of adding an extra, virtually unusable 0.5GB just to tick a box.

If they update the drivers to "fix" this, it will probably simply shut off access to the slow memory and cap the card at 3.5GB.

The 970 perform as intended with 4GB:

https://www.youtube.com/watch?v=VrkIF1-e9ds
 

mephixto

Banned
Is that at 1080p? If not, then it's not really doing what the GTX 980 is easily able to do.

That's 4K all maxed; you can't get that memory usage at 1080p. People claim that games can't access more than 3.5GB on the 970, or that games are unplayable beyond that point.
 

XBP

Member
That's 4K all maxed

Which is where the problem lies. A 980 is able to reach 4GB VRAM usage at 1080p without forcing insane resolutions and effects in the game. If a game wants to use 4GB of VRAM at 1080p, a 980 will allow it to use that.
 

TAJ

Darkness cannot drive out darkness; only light can do that. Hate cannot drive out hate; only love can do that.
Are you using DSR to downsample from 1440p? Or an actual 2560x1440 monitor with the game running at 1080p?

I have a GTX 970, i7 4790k, and 16GB of RAM, and I currently use a 1920x1200 (16:10) monitor.
I can play Far Cry 4, AC: Unity and Dragon Age: Inquisition all at 60fps. I've been itching for a 1440p display though, and I'm wondering how big of a hit my fps would take.
I've used DSR to play at 2560x1600 to test the performance and it gets quite choppy in all three of those games.

I've heard that using DSR to downscale from 1440p performs worse than actually playing on a 1440p monitor, and I wonder if anyone can confirm that?

I'm using DSR and don't have a 1440p display to compare with.
 
Which is where the problem lies. A 980 is able to reach 4GB VRAM usage at 1080p without forcing insane resolutions and effects in the game. If a game wants to use 4GB of VRAM at 1080p, a 980 will allow it to use that.

If the card doesn't NEED all 4GB it won't use it. If the card does need it, the special 0.5GB pool is activated. This isn't that hard to understand, guys.
 

jfoul

Member
Newegg dropped the price on the EVGA 970 FTW+ to $384.99 with free shipping. The discounts below should still be active. I contacted Newegg and got a $5 gift card because I bought it for $389.99 on the 21st.

Visa checkout: $25 Off Your $200+ Purchase with code: VCOJAN25
  • Offer valid through 1/26/15
Pay with AMEX after activating the $25 credit
  • "Get a one-time $25 statement credit by using your enrolled Card to spend a total of $200 or more online at www.Newegg.com by 2/28/2015."
 

Trouble

Banned
As a recent upgrader to a 970, after reading all the hubbub and Nvidia's explanation I have to say... I ain't even mad. I upgraded my PC from an old i7 with a 560ti to a new i5 with a 970 and everything runs buttery smooth on ultra settings. I'm eyeballing that ASUS gsync 1440 monitor because I feel like I'm wasting potential at 1080.
 

dr_rus

Member
So when is Nvidia or the other card manufacturers going to release 8GB GTX 970/980 cards? :/

Why? Are there any games which require more than 4 at the moment?

I'm guessing that GM200 card will have 6 - if you're ready to pay a thousand for that.
 

Kezen

Banned
Why? Are there any games which require more than 4 at the moment?

I'm guessing that GM200 card will have 6 - if you're ready to pay a thousand for that.

It's possible (and even likely) that the new Titan will have 12GB of VRAM, but the rest of the series will have 6.
 

Xdrive05

Member
So, reading that...what exactly is verdict on the 970s?

Confirmed to be "gimped" hardware in a 3.5GB fast + 0.5GB very slow configuration, all of them, but Nvidia maintains the extra performance loss is minimal. Nvidia gives dubious reasons why they think it's minimal (average fps instead of frame times).

Because it's the weekend, no tech sites have had time to independently research the issue and show actual results at the edge cases where this will matter most.

So we're still in wait and see mode.
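The objection to average fps as a metric can be made concrete with a toy example (invented numbers, purely to show the gap between the two measures): two runs with identical average fps but very different frame-time consistency.

```python
# Two synthetic frame-time traces (ms) with the same average fps. The values
# are invented to show why an average can hide spill-induced stutter.
def avg_fps(ft_ms):
    return 1000 * len(ft_ms) / sum(ft_ms)

def p99_frame_time(ft_ms):
    # Crude 99th-percentile frame time: the 99th value of 100 sorted samples.
    return sorted(ft_ms)[int(0.99 * len(ft_ms)) - 1]

smooth = [20] * 100              # every frame takes 20 ms
spiky  = [15] * 90 + [65] * 10   # mostly fast, with periodic 65 ms hitches

print(avg_fps(smooth), avg_fps(spiky))                # 50.0 and 50.0
print(p99_frame_time(smooth), p99_frame_time(spiky))  # 20 vs 65
```

Both traces average 50fps, but only the frame-time percentile reveals the hitching, which is exactly the edge case reviewers would need to probe.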
 

dr_rus

Member
It's possible (and even likely) that the new Titan will have 12gb of VRAM, but the rest of the series will have 6.
It's highly unlikely in my opinion. There is no need for a gaming card with that much memory. Everyone should also consider that the first HBM cards are likely to have rather conservative memory pools.
 

AmyS

Member
I'm guessing the Titan II might have 10GB with 1 SMM disabled, and the cheaper GTX 1080 would be an 8GB card with weaker DP and 2 SMMs disabled.

Then a refresh in early 2016 on either the same 28nm process (lol) or 20nm - a Titan II Black with no SMMs disabled and a GTX 1080 Ti with only one SMM disabled.

In late 2016 come the first cards in the Pascal family (mid-sized GPx04?) on 16nm FinFET. In 2017 we get 'Big Pascal', while the professional/server/supercomputer markets get the Volta architecture (for the Tesla and maybe Quadro lines), which we won't see on the consumer/gaming side until 2018.

Remember Volta was originally meant to follow Maxwell but last year Nvidia announced Pascal.

It seems Volta was too ambitious for 2016 with both 1 TB/sec 3D stacked DRAM and NVLink. So they moved NVLink (and perhaps other features) to a later time and with the first generation of cards with 3D stacked RAM (HBM) the bandwidth will be high but not 1 TB/sec high, thus Pascal.

2013 roadmap: [image]

2014 roadmap: [image]


Volta might be to Pascal what Maxwell is to Kepler.


I know someone will come along and either pick apart my theory, or at least correct it or expand on it since I know absolutely nothing.
 
So Nvidia has admitted that they sold a 3.5GB card with 0.5GB stapled on the side, and some folks are OK with this? I mean, if I bought a V8 car only to discover that it was a V7, I'd expect a refund. How is this different?

Jesus, Nvidia are slimy. Do they have so little confidence in their cards that they have to indulge in this deceptive BS? Maxwell is an excellent architecture too, so it's utterly unnecessary. I guess they've gotten so used to 'creative' communications around their mobile chips that it's leaking back over to the GPUs.
 

garath

Member
Nice. My EVGA FTW to FTW+ step up just passed the queue phase. Waiting on them to approve my invoice and I'll pay and ship it out. Fingers crossed for no coil whine!!
 

TeaFan

Member
I posted this in the new PC thread but I think it's better off here. Is it worth me upgrading to a 980 from a 2GB GTX 770? I play on a 120Hz monitor at 1080p.
 

garath

Member
I posted this in the new PC thread but I think it's better off here. Is it worth me upgrading to a 980 from a 2GB GTX 770? I play on a 120Hz monitor at 1080p.

Honestly, I don't think I would. But I tend to stick to every other generation. It's more cost efficient.

You really shouldn't be having that many problems with the 770 right now. Maybe by the end of the year.

But that's strictly opinion. Look at some benchmarks and determine if you find the extra frames worth the $500+ price tag.
 
hey GAF, what would be a good model of the GTX 970 to pick up? I'm looking at the Gold edition....but I really don't need to spend $399 on it do I?

I can just pick up one of these correct?


http://www.amazon.com/dp/B00NH5T1UA/?tag=neogaf0e-20


http://www.amazon.com/dp/B00NN0GEXQ/?tag=neogaf0e-20

Out of these two I'd go with the MSI one. But if I had the choice I'd go with an EVGA one (SSC or FTW+); their warranty and step-up program are unmatched from what I keep hearing.
 

ViciousDS

Banned
Out of these two I'd go with the MSI one. But if I had the choice I'd go with an EVGA one (SSC or FTW+); their warranty and step-up program are unmatched from what I keep hearing.

Yes, I had an EVGA in the past and their customer service is fantastic, along with the warranty/step-up program.

I'll just grab the EVGA one then
 
Well, what with the recent purchase of a 4k gsync monitor, the 3.5gb issue, and realizing my return period is almost up on my 970s, I'm opting to return them and buying 980s while I'm able. Hopefully I get lucky again with regards to coil whine.
 
hey GAF, what would be a good model of the GTX 970 to pick up? I'm looking at the Gold edition....but I really don't need to spend $399 on it do I?

I can just pick up one of these correct?
My MSI Golden Editions are solid. Both of my cards have nice headroom, and even in SLI the top GPU never breaches 65C. But the Golden Edition chips are not binned, so the MSI Gaming edition is likely just as good (typical silicon lottery). The only reason to get a Golden is if you want the best air cooler.

Edit: Might be worth noting, my Goldens do not have coil whine. Not sure if I got lucky, or if there was a revision that resolved the issue.
 

Jonm1010

Banned
I can't seem to find anyone else that has had this problem but I just installed my 970 and have wanted to try some downsampling of older games but for whatever reason my Nvidia Control Panel does not have a DSR option

What the heck?

It is up to date, acknowledges the 970 but nothing is there. Google has turned up nothing and like most things Im assuming Im just an idiot and am overlooking something.
 

The End

Member
I can't seem to find anyone else that has had this problem but I just installed my 970 and have wanted to try some downsampling of older games but for whatever reason my Nvidia Control Panel does not have a DSR option

What the heck?

It is up to date, acknowledges the 970 but nothing is there. Google has turned up nothing and like most things Im assuming Im just an idiot and am overlooking something.

It's enabled in the Nvidia control panel by default; go check the game itself and it should display whatever whackadoo resolutions are supported by your monitor.
 
I can't seem to find anyone else that has had this problem but I just installed my 970 and have wanted to try some downsampling of older games but for whatever reason my Nvidia Control Panel does not have a DSR option

What the heck?

It is up to date, acknowledges the 970 but nothing is there. Google has turned up nothing and like most things Im assuming Im just an idiot and am overlooking something.

Are you sure you're looking in the right place in the Nvidia control panel?

Also GeForce Experience can turn on DSR for you if it deems it "optimal" for a given game.
 
I can't seem to find anyone else that has had this problem but I just installed my 970 and have wanted to try some downsampling of older games but for whatever reason my Nvidia Control Panel does not have a DSR option

What the heck?

It is up to date, acknowledges the 970 but nothing is there. Google has turned up nothing and like most things Im assuming Im just an idiot and am overlooking something.
It was not enabled by default for me.
This guide shows how to turn it on in the Control Panel:
http://nvidia.custhelp.com/app/answ...w-to-enable-dynamic-super-resolution-in-games.
 