
Nvidia Launches GTX 980 And GTX 970 "Maxwell" Graphics Cards ($549 & $329)

Yeah. I was on day 80 of the 90 day window when I put in for it. I have a feeling it'll be a few weeks at least before I get out of the queue. The cards are STILL in high demand.

Is there anything I need to know about the process? Do I just need to register and provide a copy of the invoice?
 

garath

Member
Is there anything I need to know about the process? Do I just need to register and provide a copy of the invoice?

Register the card, upload the invoice, and then you can start the process. It's relatively simple. Verify your info is correct, plug in what you paid, pick the step-up card and shipping type, and it'll give you the step-up price and enter you in the queue.

After that it's waiting. Once you've cleared the queue and a card is available, you'll have 7 days to pay the step-up price, then you have about 10 days to get the card to them with all the original pack-ins. Once they get it, they'll ship you your new card with the shipping you chose at step-up time.
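As a rough sketch of the math involved (all prices and the shipping figure below are made up for illustration, not EVGA's actual pricing):

```python
# Step-Up pricing, roughly: you owe the difference between the new
# card's price and what you originally paid, plus your chosen shipping.
# All figures here are hypothetical examples.

def step_up_price(original_paid, new_card_price, shipping=0.0):
    """Amount owed when stepping up, plus chosen shipping cost."""
    return max(new_card_price - original_paid, 0.0) + shipping

# e.g. stepping up from a $329 card to a hypothetical $389 card
print(round(step_up_price(329.00, 389.00, shipping=9.99), 2))  # → 69.99
```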
 

nicoga3000

Saint Nic
EVGA is probably my top choice right now. I have a FTW and am stepping up to a FTW+. I didn't like my ASUS: lowest base clock, poor overclocker, louder coil whine than my EVGA. I like the support EVGA shows its products as well.

Glad to hear someone else has high praise for EVGA. Makes me feel better about dropping $350+ on a GPU. :3
 

garath

Member
Glad to hear someone else has high praise for EVGA. Makes me feel better about dropping $350+ on a GPU. :3

They've always had top notch customer service and they stand behind their products. That was my major motivation. The other was that they have the highest base clocks. I really find using software to overclock annoying and I generally leave my cards factory.
 

Bladelaw

Member
I've been using EVGA cards for 10 years. I've had 2 RMAs and both were processed without fuss. I highly recommend them.

The 970 looks like a good upgrade for me. I'm coming from a 660Ti and annoyed I can't crank max settings on some newer releases. I'm only running 1080p so it should give me enough headroom for if/when I upgrade monitors. At least through this console generation.
 

garath

Member
I've been using EVGA cards for 10 years. I've had 2 RMAs and both were processed without fuss. I highly recommend them.

The 970 looks like a good upgrade for me. I'm coming from a 660Ti and annoyed I can't crank max settings on some newer releases. I'm only running 1080p so it should give me enough headroom for if/when I upgrade monitors. At least through this console generation.

660ti to 970 was my exact upgrade. It's a huge improvement.
 
With tax season coming I'll have some money to play with. Has anyone had experience upgrading from an EVGA 760? Or should I wait until the next generation (I fear my i5-2500k also may be beginning to choke)
 
With tax season coming I'll have some money to play with. Has anyone had experience upgrading from an EVGA 760? Or should I wait until the next generation (I fear my i5-2500k also may be beginning to choke)

Nah, I have that exact CPU OC'd to 4.5Ghz and I have no problems at all with my two GTX 770 4GB cards. All games butter smooth. You shouldn't have any bottlenecks with that CPU on the new cards either.
 

Xdrive05

Member
I missed the 970/980 hype train, but now I'm in the market for a 970. Any particular brands/models I need to know about? Or know to avoid?

Is there a quality round-up review for the 970 cards?
 
I missed the 970/980 hype train, but now I'm in the market for a 970. Any particular brands/models I need to know about? Or know to avoid?

Is there a quality round-up review for the 970 cards?

Personally I usually get either EVGA or MSI; I've never tried Asus, but I've yet to hear complaints. Typically I choose MSI because their revisions often have a bit of an edge over EVGA, or at least they do when I'm in the market for one. But so far EVGA seems to have the edge in the base models. Honestly, though, I'd wait for the inevitable revisions with more RAM; it's basically tradition at this point.

Evga or MSI 970? Tempted to go with the new SSC

EVGA has better clock speeds for the 970...for now.
 

Vlaphor

Member
So bizarre... So I wasn't experiencing coil whine. I just assumed that was what was going on because I'd heard so much about it. It turns out that by default my 970 ACX 2.0 had the fans turned off unless it got close to 90 degrees, at which point it blasted the fans for 5-10 seconds. It was so loud my friend over Skype thought someone was vacuuming in the room. I checked my case to see if there was an airflow issue, and after feeling that the card was super hot to the touch, I discovered that the fans weren't spinning at all. I had to go into EVGA PrecisionX and enable the automatic fan control. So damn weird that that wasn't on by default. Running super cool with no issues now.

I decided to check my video card after dealing with idle temps in the upper 40's and found that mine was in the same way. Enabled automatic fan control and now my idles are in the low 30's. Strange.


...and when I shut down Precision X, the temps raise. Do I just need to leave it on in the background or something?
 

garath

Member
I decided to check my video card after dealing with idle temps in the upper 40's and found that mine was in the same way. Enabled automatic fan control and now my idles are in the low 30's. Strange.


...and when I shut down Precision X, the temps raise. Do I just need to leave it on in the background or something?

Idle temps in the 40s with no fans are fine and what the BIOS is programmed for. The fans will not turn on until 60C. That's the "silent at idle" feature of the majority of the 970s. Undead hero sounds like he had a bug with the fan curve where the fans weren't kicking on until the 90s. That's way off base.
 
I decided to check my video card after dealing with idle temps in the upper 40's and found that mine was in the same way. Enabled automatic fan control and now my idles are in the low 30's. Strange.


...and when I shut down Precision X, the temps raise. Do I just need to leave it on in the background or something?

Geez, it seems like it. I just checked and the fans weren't running. Opened PrecisionX and it was also in the upper 40's, but then the fans started spinning. There has to be something wrong here.

Edit:

Idle temps in the 40s with no fans are fine and what the BIOS is programmed for. The fans will not turn on until 60C. That's the "silent at idle" feature of the majority of the 970s. Undead hero sounds like he had a bug with the fan curve where the fans weren't kicking on until the 90s. That's way off base.

That makes sense, but this makes me wonder if I'll still need to have PrecisionX open in the background anytime I'm playing a game.
 

nicoga3000

Saint Nic
They've always had top notch customer service and they stand behind their products. That was my major motivation. The other was that they have the highest base clocks. I really find using software to overclock annoying and I generally leave my cards factory.

Got it installed over lunch. Max settings on Warframe on my rig puts my temp in the 34C range. On top of that, this card is silent - my case fans run louder. I'm sure it'll make noise when I press it harder, but I'm not currently playing anything with much more demanding specs and PhysX to merit trying. I could install Skyrim and put it on Ultra I guess?

Thanks for your input, though. Definitely happy with my decision to go EVGA on this.
 

garath

Member
Geez, it seems like it. I just checked and the fans weren't running. Opened PrecisionX and it was also in the upper 40's, but then the fans started spinning. There has to be something wrong here.

Edit:



That makes sense, but this makes me wonder if I'll still need to have PrecisionX open in the background anytime I'm playing a game.

Unless it's bugged, you won't need PrecisionX open for the standard fan curve. The standard curve is fans off until 60C, then it'll slowly ramp up. The card will usually peak around 72 under load until the fans bring it back down. You'll average 70, give or take, on a 100% GPU game.
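That kind of curve is roughly this shape (the exact ramp points below are guesses for illustration, not EVGA's actual BIOS values):

```python
# Sketch of a "silent at idle" fan curve: fans stay off until 60C,
# then ramp linearly toward full speed. The fan_on / full_speed /
# min_duty numbers are illustrative guesses, not the real BIOS curve.

def fan_duty(temp_c, fan_on=60, full_speed=85, min_duty=30):
    """Return fan duty cycle (%) for a given GPU temperature."""
    if temp_c < fan_on:
        return 0                      # fans fully off at idle
    if temp_c >= full_speed:
        return 100
    # linear ramp: min_duty% at fan_on, rising to 100% at full_speed
    span = full_speed - fan_on
    return min_duty + (100 - min_duty) * (temp_c - fan_on) / span

print(fan_duty(45))   # idle in the 40s: fans off, as described
print(fan_duty(72))   # typical load peak: fans partway up
```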
 

Vlaphor

Member
Unless it's bugged, you won't need PrecisionX open for the standard fan curve. The standard curve is fans off until 60c then it'll slowly ramp up. The card will usually peak around 72 under load until the fans bring it back down. You'll average 70 give or take on a 100% GPU game.

That does make sense. I was peaking at around mid 70s during Unity. I'll try it in a bit and see how it goes.
 

Daunt

Neo Member
Guys I need some help.

I have had my MSI GTX 980 for months now and thought I have had a stable overclock time and time again.

I ran it for hours at a time on Uniengine, furmark, OCCT you name it and it has been stable but then I will play certain games and BOOM I get a solid colour screen crash (it will literally go a solid colour for 15-30 seconds before coming back into the game)

This is because you're assuming that stability in benchmark tools should equate to a stable overclock for games. Hint: it doesn't. Stop that.

If you want to ensure your OC is stable, always, always test it on actual games. Relying on benchmark tools seems to be as popular a misconception on the internet as thinking that using Vsync to cap frames is a good idea, which is even worse.

If your overclock doesn't pass the game test, it's not actually stable.
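The advice above amounts to a search problem: find the highest offset that passes in real games, not benchmarks. A sketch, where `passes_game_test` is a stand-in callback for actually playing your crash-prone titles:

```python
# Binary-search the highest stable clock offset, using real games as
# the pass/fail test. `passes_game_test` is a hypothetical callback;
# in practice it means hours in the games that actually crash.

def find_stable_offset(passes_game_test, lo=0, hi=300, step=13):
    """Highest offset (MHz, in ~`step` MHz clock bins) that passes."""
    best = lo
    while lo <= hi:
        mid = (lo + hi) // 2
        if passes_game_test(mid):
            best = mid          # stable so far; try higher
            lo = mid + step
        else:
            hi = mid - step     # crashed; back off
    return best

# Pretend anything over +180 MHz crashes in games even though it
# passed Uniengine/Furmark/OCCT:
print(find_stable_offset(lambda mhz: mhz <= 180))  # → 170
```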

With tax season coming I'll have some money to play with. Has anyone had experience upgrading from an EVGA 760? Or should I wait until the next generation (I fear my i5-2500k also may be beginning to choke)

Like the other guy (Trev) said, your 2500k is going to be fine. Most likely for a few years yet, considering the (lack of) quality for console hardware and ports -- the only area you'll struggle is with PC exclusives.

I have the same processor at 4.5 too, no worries about bottlenecks.

Does anyone have the Gigabyte WindForce 3 variant of GTX970?
My current card is Gigabyte's gtx760 windforce variant and I've been happy with it, I'm assuming that GTX970 version is good too?

Yep, I bought the G1 Windforce in October. Previous card was a MSI 7850.

You won't get exactly a 2x perf boost from a 760 to a 970, but you'll be close. It's swell, go get one!
 

Xdrive05

Member
Should we wait until the 970 memory allocation issue is addressed before buying?

I really want one, but if this is a legit issue (and a lasting issue), I may just pass. 3.5gb vs. 4gb will probably matter at some point in the coming few years I'm sure. :-/
 
If my FTW didn't have any coil whine at all, my answer would be definitely not.

However, I'm paying to see if the FTW+ resolves the light coil whine I have on my FTW.
New Features:
  • Included Backplate with thermal pads cooling topside memory
  • Dual Bios
  • MMCP – Memory MOSFET Cooling Plate
  • New heatsink with straight copper pipes
  • 6 Phase Power
  • Higher power limit with 8+6 pin power connect
  • New video output config (DVI-I, HDMI, DP, DP)

Even the SSC+ is better than the original FTW. I'm pretty sure EVGA is also allowing step-up from the FTW to the SSC+. The SSC+ has all of the above improvements except the backplate with memory cooling, and slightly lower clocks.

Thanks. Seems to be worth it. I missed the original backplate deal, and better cooling is a concern as the card runs hotter than expected.
 

jfoul

Member
Heads up for anybody looking to buy a videocard from Newegg soon. The $25 Visa Checkout deal stacks with the $25 AMEX credit.

Visa checkout: $25 Off Your $200+ Purchase with code: VCOJAN25
  • Offer valid through 1/26/15
Pay with AMEX after activating the $25 credit
  • "Get a one-time $25 statement credit by using your enrolled Card to spend a total of $200 or more online at www.Newegg.com by 2/28/2015."
 
With tax season coming I'll have some money to play with. Has anyone had experience upgrading from an EVGA 760? Or should I wait until the next generation (I fear my i5-2500k also may be beginning to choke)

I upgraded my EVGA 760 to an EVGA 970 FTW a couple months back and it was totally worth it. Obviously the performance bump will depend on what kind of game you're playing but I can't think of a game that doesn't run nicely at 1080p with my 970 at mostly maxed out settings. Maybe you'll have to drop down AA and whatnot but it was definitely a noticeable increase in performance.
 

Dries

Member
Cross post from PC thread:

Hey guys, I'm wondering whether to set my PhysX to be handled by my CPU (an OC'ed 2500K 4.4Ghz) or my GPU (GTX 980).

My 980 is always fully loaded in games (as seen in MSI Afterburner) because I DSR the shit out of every game. My CPU cores are usually around 40-50% usage during gaming.

It would seem to me that my CPU still has some load to spare, so I should let that handle my PhysX. Am I thinking right?
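For a sense of why DSR keeps the 980 pegged: each DSR factor multiplies the pixel count, so the GPU is rendering far more than the native resolution. A quick sketch of the arithmetic (factors shown are NVIDIA's standard DSR options):

```python
# DSR factors scale pixel *count*, so each axis scales by sqrt(factor).
# 4.00x DSR at 1080p renders internally at 4K before downsampling.

def dsr_resolution(width, height, factor):
    """Internal render resolution for a given DSR factor."""
    scale = factor ** 0.5
    return round(width * scale), round(height * scale)

print(dsr_resolution(1920, 1080, 4.00))  # → (3840, 2160)
print(dsr_resolution(1920, 1080, 2.25))  # → (2880, 1620)
```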
 

JaseC

gave away the keys to the kingdom.
Cross post from PC thread:

You could probably get away with that with older games, but in newer stuff it'd severely cripple your framerate. Hardware-accelerated PhysX is practically designed to run poorly on CPUs.
 

lumi7890

Member
Should we wait until the 970 memory allocation issue is addressed before buying?

I really want one, but if this is a legit issue (and a lasting issue), I may just pass. 3.5gb vs. 4gb will probably matter at some point in the coming few years I'm sure. :-/

I'm also concerned about this issue. Should I just get a GTX 980 instead? I don't want to get something that's going to give me problems.
 
Should we wait until the 970 memory allocation issue is addressed before buying?

I really want one, but if this is a legit issue (and a lasting issue), I may just pass. 3.5gb vs. 4gb will probably matter at some point in the coming few years I'm sure. :-/

Well, I just bought a 970. Doesn't seem like it affects real gaming usage so far.
Although, to be fair, I don't plan to keep this card for too long.

The 980 is not worth $200 more, sadly.
 

Branson

Member
So this 980 is uh, Insane coming from a 580. I love it. It destroys everything I've thrown at it. Mordor maxed at 60 is a glorious thing.
 

BNGames

Member
Just went SLI. Need to clean up a bit in here, but I wanted to make sure the card worked before i started pulling the rest of this apart.

[image: 16152469139_639f3391cd_h.jpg]
 

cheezcake

Member
Should we wait until the 970 memory allocation issue is addressed before buying?

I really want one, but if this is a legit issue (and a lasting issue), I may just pass. 3.5gb vs. 4gb will probably matter at some point in the coming few years I'm sure. :-/

Hmmm, weird. I just tested this in AC: Unity. At first it went to ~3550 MB usage with normal performance; I changed FXAA to MSAA 4x and it immediately bumped up to ~3900 MB usage. Of course there was a performance decrease to about ~35fps avg, but nothing insane like this thread's implying.

If it matters my exact model is an MSI GTX 970 GAMING 4G
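The jump from FXAA to 4x MSAA is about what you'd expect from back-of-envelope math: MSAA stores multiple samples per pixel, so render targets grow roughly with the sample count. A crude estimate (the bytes-per-sample figure is an assumption, not Unity's real allocation):

```python
# Rough VRAM cost of MSAA: a multisampled color+depth target scales
# roughly linearly with the sample count. 8 bytes/sample is a crude
# guess (RGBA8 color + D24S8 depth); real engines vary a lot.

def msaa_framebuffer_mb(width, height, samples, bytes_per_sample=8):
    """Approx MB for one color+depth target at `samples` per pixel."""
    return width * height * samples * bytes_per_sample / 1024**2

base = msaa_framebuffer_mb(1920, 1080, 1)   # no MSAA
msaa4 = msaa_framebuffer_mb(1920, 1080, 4)  # 4x MSAA
print(round(base), round(msaa4))  # roughly 47 MB extra per target,
                                  # multiplied across an engine's targets
```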
 
Hmmm, weird. I just tested this in AC: Unity. At first it went to ~3550 MB usage with normal performance; I changed FXAA to MSAA 4x and it immediately bumped up to ~3900 MB usage. Of course there was a performance decrease to about ~35fps avg, but nothing insane like this thread's implying.

If it matters my exact model is an MSI GTX 970 GAMING 4G

It was the same for me, in exact every way.
 

curlycare

Member
Seems like Gigabyte is releasing a new BIOS for their G1 970s soon, probably to fix the DVI issues some are having:
Release for HYNIX Memory
NVIDIA Source BIOS Version: DP:84.04.31.00.B8, DD:84.04.31.00.B7
Base Clock/Boost: Clock:1178/1329MHz
Memory Clock:7010MHz
Improve compatibility with some monitors
Modify fan duty in MS-DOS mode
 

AmyS

Member
Nvidia To Reveal GM200 Between March 17-20 At The GPU Tech Conference

Nvidia will reportedly reveal GM200 in March. The GPU core will power next-generation flagship Quadro and GeForce GTX (Titan X) products.
This will occur during the GPU Technology Conference, between the 17th and the 20th of March. The new GPU will reportedly be showcased in the new Quadro M6000 from Nvidia. However, we should see the new GPU soon after in the next GeForce GTX Titan card (Titan X / Titan II) from Nvidia.

GM200 is a massive GPU, quite possibly the largest ever produced by Nvidia. In our exclusive analysis of the GPU we found that GM200 could easily be larger than 600mm². This means that Nvidia may have approached the reticle limit of TSMC’s 28nm process.

GM200 GPU Core Arrives In Two Months

Leaked but unconfirmed specifications for the GPU include 24 SMM units for a total of 3072 CUDA cores, a 384-bit GDDR5 memory interface and 96 render output units. The GPU was pictured on a PCB most likely belonging to a Quadro M6000 card, with 12GB of GDDR5 memory.


In addition to technical specifications of GM200 there’s also information pertaining to the next Titan in particular. The Titan X will reportedly employ a native fan-off function that completely disables the fan below a certain temperature limit to reduce noise. This should prove especially effective when the GPU is idling while you browse the web or work on a document. There’s also talk of pricing for the Titan X going up as high as $1350, a 35% increase over the previous, arguably exorbitant, launch price for the original GTX Titan.

If history repeats itself, which it often does, then we’re going to see GM200 arrive to market in a professional workstation graphics card first. Following that Nvidia may choose to release it in an uber expensive GTX Titan type card a few months afterwards. Finally the GPU should arrive in ordinary GeForce GTX, possibly post 900 series, SKUs.

Nvidia’s traditional 3-digit naming scheme is running out of steam, so non-Titan GeForce GTX cards based on GM200 will most likely adopt a new naming structure, especially if Nvidia plans to release more than one non-Titan card: 980 Ti won’t leave room for a second card, and 985 / 985 Ti simply don’t represent the performance delta between GM204 and GM200.

We don’t really know what naming structure will eventually take place but we might see something like X80 and X70, with the GTX prefix going back to its original suffix location. So something like an Nvidia Geforce X80 GTX, or Nvidia GeForce X70 GTX where the X represents a 10 as with Roman numerals. For subsequent generations Nvidia could simply add an additional numeral such as X180 or X280. I’m just brainstorming here and would love to hear what you think Nvidia’s future naming structure should look like. Please share your thoughts in a comment below. Not just for GM200 cards but for future Pascal based cards as well.
http://wccftech.com/nvidia-reveal-gm200-march-gpu-conference/

original source:
http://www.sweclockers.com/nyhet/19923-nvidia-avtacker-gm200-pa-gpu-technology-conference-2015

GTX 1080 or 1080 Ti for me.
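The leaked specs pin down peak throughput with a bit of arithmetic. A quick sanity check (the 7 Gbps memory speed and ~1 GHz core clock are assumptions; the article doesn't state clocks):

```python
# Sanity-checking the leaked GM200 numbers: peak bandwidth and FP32
# throughput follow directly from bus width, data rate, core count,
# and clock. The 7 Gbps and 1.0 GHz figures are assumed, not leaked.

def gddr5_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a GDDR5 interface."""
    return bus_width_bits / 8 * data_rate_gbps

def fp32_tflops(cuda_cores, clock_ghz):
    """Peak FP32 TFLOPS (2 ops per core per clock, via FMA)."""
    return cuda_cores * 2 * clock_ghz / 1000

print(gddr5_bandwidth_gbs(384, 7.0))     # → 336.0 (GB/s)
print(round(fp32_tflops(3072, 1.0), 2))  # → 6.14 (TFLOPS at 1 GHz)
```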
 

ZOONAMI

Junior Member
New Features:
  • Included Backplate with thermal pads cooling topside memory
  • Dual Bios
  • MMCP – Memory MOSFET Cooling Plate
  • New heatsink with straight copper pipes
  • 6 Phase Power
  • Higher power limit with 8+6 pin power connect
  • New video output config (DVI-I, HDMI, DP, DP)

Even the SSC+ is better than the original FTW. I'm pretty sure EVGA is also allowing step-up up from the FTW to the SSC+. The SSC+ has all of the above improvements except the backplate with memory cooling and slightly lower clocks.


The big reason I intentionally purchased an original FTW is that it is only 9.5 inches versus 10.1. I have a Hadron Air, so this is a big deal. I had a GTX 780 in there and it was nearly impossible to fit, and even then only with a ton of cabling hitting the back end of the card. The original 970 FTW fits perfectly in the Hadron. 10.1 inches would run into cabling problems again.
 

Xdrive05

Member
This memory issue keeps being mentioned, but I have yet to see someone affected by it.

I've seen a lot of posts/screenshots showing a 980 using all of its VRAM in the same scene where the 970 clearly caps at 3.5-3.6. But more concerning, I've seen clips showing that when a 970 is "forced" to go over 3.6, it hiccups really hard and slows down way more than one would expect from the settings increase that caused it to go "over"... implying performance issues accessing that last 500MB.

Hopefully this is just mass hysteria based on unscrupulous posters. But what has been shown, coupled with Nv's silence is concerning.
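A crude model shows why spilling past 3.5GB could hurt more than the settings bump alone would suggest: if the last 0.5GB sits on a much slower partition, average bandwidth over the whole working set collapses once it's touched. The partition speeds below are rough community-reported figures, not confirmed specs:

```python
# Toy model of a split memory pool: a fast 3.5 GB partition and a
# slow 0.5 GB partition. Bandwidth figures are approximate
# community estimates, not official numbers.

FAST_GB, FAST_GBS = 3.5, 196.0   # main partition
SLOW_GB, SLOW_GBS = 0.5, 28.0    # slow upper partition

def avg_bandwidth_gbs(working_set_gb):
    """Bandwidth averaged over a uniformly-streamed working set."""
    fast = min(working_set_gb, FAST_GB)
    slow = max(working_set_gb - FAST_GB, 0.0)
    # total time to stream the set once, then GB / time
    return working_set_gb / (fast / FAST_GBS + slow / SLOW_GBS)

print(round(avg_bandwidth_gbs(3.5)))  # → 196 (stays in fast pool)
print(round(avg_bandwidth_gbs(4.0)))  # → 112 (the slow 0.5 GB takes
                                      #   as long as the fast 3.5 GB)
```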
 
There are two new versions for Samsung memory, for whatever reason.

If you run GPU-Z, it will display what brand of memory you have.

No, there are two BIOSes for Samsung memory also. Or, more specifically, not-yet-working links to them.
EDIT: Yeep should have refreshed before posting.

Just checked with GPU-Z, it's Hynix... which from what I recall reading doesn't overclock as well as Samsung memory, oh well (haven't tried overclocking the memory yet).

Since I'm not having any issues with my card for now, I'm going to wait until the dust settles on the new bios.
 
Just checked with GPU-Z, it's Hynix... which from what I recall reading doesn't overclock as well as Samsung memory, oh well (haven't tried overclocking the memory yet).

Since I'm not having any issues with my card for now, I'm going to wait until the dust settles on the new bios.
I can vouch for the Hynix memory overclock limitations. Whether it's because of voltage or something else, I can't go above a +299MHz overclock on my Hynix VRAM without crashing Heaven on scene 11. It sucks.

Like you, I'll wait to see what this BIOS does before I flash it, and I'll probably wind up modding it anyway.

That said, anybody know a way to lock the GPU's target resolution even when the display it's connected to is off? I finally managed to get Nvidia Gamestream working via Limelight, but the game changes resolution and visual dimensions whenever I turn off my display.
 

Alo81

Low Poly Gynecologist
MFAA now works on nearly all dx10/dx11 games.

NVIDIA Multi-Frame Sampled Anti-Aliasing (MFAA) is a new Anti-Aliasing (AA) technique exclusive to 2nd generation Maxwell Architecture GPUs, improving upon the performance of Multisample Anti-Aliasing (MSAA) to give you faster frame rates in your favorite games, whilst simultaneously improving image quality.



Last November we introduced MFAA support for a variety of games, and today we're extending MFAA support to almost all DX10 and DX11 games. Furthermore, you can now bypass the MFAA setup and activation process by enabling MFAA with a single click in supported GeForce Experience games. Simply click 'Optimize' in GeForce Experience and we'll do the rest. Learn more here.

GeForce Experience 2.2.2 MFAA OPS

For games not supported by GeForce Experience you'll need to open the NVIDIA Control Panel, navigate to 'Manage 3D Settings', change the 'Multi-Frame Sampled AA (MFAA)' option to “On”, and click 'Apply'. Next, in a game with MSAA graphics options, set the anti-aliasing option 2x or 4x MSAA, and our driver will take care of the rest.

[Screenshots: MFAA configuration in the NVIDIA Control Panel (global and per-game) and in-game, GeForce Game Ready 344.75 WHQL driver]

When enabled, the result will be MFAA anti-aliasing comparable to 4x MSAA when 2x MSAA is selected, and comparable to 8x MSAA when 4x MSAA is selected. Performance, meanwhile, will be 10-30% faster thanks to MFAA's innovative technology:
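The temporal trick behind that claim can be sketched in a few lines: alternate two different 2-sample patterns across consecutive frames so the filter effectively integrates four distinct sample positions. The offsets below are illustrative, not NVIDIA's actual programmed pattern:

```python
# Sketch of the MFAA idea: two alternating 2x MSAA sample patterns,
# blended temporally, cover as many distinct positions as one 4x
# pattern. Sample offsets here are made up for illustration.

FRAME_A = [(0.25, 0.25), (0.75, 0.75)]  # 2 samples on even frames
FRAME_B = [(0.75, 0.25), (0.25, 0.75)]  # 2 samples on odd frames

def effective_samples(frames):
    """Distinct sample positions the temporal filter integrates."""
    return sorted(set(pos for frame in frames for pos in frame))

# Two alternating 2-sample patterns yield 4 effective positions,
# which is why 2x MFAA is claimed comparable to 4x MSAA:
print(len(effective_samples([FRAME_A, FRAME_B])))  # → 4
```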
 