
GeForce GTX 970s seem to have an issue using all 4GB of VRAM, Nvidia looking into it

I doubt that Nvidia would replace a card over something like this. At most, maybe a free game or some such as compensation, and that's only if it turns out to be a hardware issue.

It would not be impossible to form a class action lawsuit if Nvidia tries to hand out $5 game codes or something similar as compensation. They have a responsibility, in both financial and customer-loyalty terms, to deliver the product they sold, or at least to set up an exchange program for an actual 4GB version. This is one of their best-selling GPUs so far (IIRC), and it would be absurd and foolish of them to screw their customer base over. The card is too expensive a piece of hardware to brush something like this under the rug, or to pretend a code for a discounted Steam game is adequate compensation for falsely advertising one of its main features.

This all depends on it actually being a hardware issue, though, and one that can't be patched.
 

Parsnip

Member
Well, I'm glad I had to cancel my GTX 970 order for other reasons.
Still planning to get a new card later, but definitely not until this clears up one way or another.
 

riflen

Member
It's most likely the issue described in the overclock.net thread and is a result of the design of the GPU, combined with the approach NVIDIA take in creating the product tiers.

It's hard to say because we don't know which SMM units are disabled on any particular 970, but the effective memory bus width for a 970 could be 208-bit, meaning a percentage of the 4GB VRAM would not be addressable at the full rate.

Similar things happened with various Kepler GPUs, so it's nothing new, and I expect AMD GPUs will exhibit it as well. No one's getting anything for free from this. NVIDIA quote the memory controller bus width, which is 256-bit, not the effective bus width of the individual product.
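As a back-of-the-envelope check on riflen's point, peak bandwidth scales linearly with bus width. (The 7.0 Gbps per-pin GDDR5 data rate below is my assumption, typical of cards in this class, not a figure from the post.)

```python
# Rough sketch: peak GDDR5 bandwidth = bus width (bits) x per-pin data rate / 8.
# The 7.0 Gbps data rate is an assumed typical value, not an official spec.
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps=7.0):
    """Theoretical peak memory bandwidth in GB/s."""
    return bus_width_bits * data_rate_gbps / 8

peak_bandwidth_gbs(256)  # 224.0 GB/s at the quoted 256-bit width
peak_bandwidth_gbs(208)  # 182.0 GB/s at the speculated 208-bit effective width
```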
 

Serandur

Member
If it's not 4GB, this is lawsuit material. The question is, who is going to sue?

Arguably, it seems fairly obvious Nvidia were always aware of this. Technically it is 4GB, and all of it can be allocated, but with a severe performance penalty that makes 970s try to avoid using anything past ~3.5GB whenever they can (similar to 660s/660 Tis).

I think Nvidia would technically be covered because it is 4GB, and 970s can, under some circumstances, use all of it. I think they knew beforehand that it wouldn't be a valid point of legal contention, and that's why they went ahead and did it anyway. But it is scummy. I think the best hope is to get one of the major sites reporting on this, but I doubt any successful legal action will come of it.
 
If it's not 4GB, this is lawsuit material. The question is, who is going to sue?

everyone.gif
 
So how would this affect SLI setups? Because for me, I know it should be 8GB of VRAM, but even 7GB, plus a possible update to 8GB, is a massive improvement for me.
 

Akronis

Member
So how would this affect SLI setups? Because for me, I know it should be 8GB of VRAM, but even 7GB, plus a possible update to 8GB, is a massive improvement for me.

SLI mirrors VRAM; it does not add the two cards' memory together. You have 4GB of usable VRAM like everyone else.
 
SLI mirrors VRAM; it does not add the two cards' memory together. You have 4GB of usable VRAM like everyone else.

Sorry, I'm new to SLI; what do you mean by mirroring VRAM?

Oh, and if some 970s are reaching 4GB, then it sounds like it could (hopefully) be a software issue.
 
So how would this affect SLI setups? Because for me, I know it should be 8GB of VRAM, but even 7GB, plus a possible update to 8GB, is a massive improvement for me.

SLI in AFR clones your memory setup. You have access to 4GB in total (not 8) with 2x the grunt. But because SLI users often run higher-resolution screens and higher graphical settings, this may affect you even more than normal users. I know, for example, my brother's SLI 970 rig suffers from it quite a bit (1440p).
 

Xdrive05

Member
http://forums.guru3d.com/showthread.php?t=396471

Guru3d started a new thread about it in light of the new findings.

Nvidia continues a very loud silence on this issue. It took a tsunami of customer outcry for them to even acknowledge it, and there's still no official response or explanation.

Stinks of prior knowledge, hoping no one would catch it, and now working on a damage control plan.

It's already Friday. If they don't say something by close of business today, then chances are this will drag on.
 
SLI in AFR clones your memory setup. You have access to 4GB in total (not 8) with 2x the grunt. But because SLI users often run higher-resolution screens and higher graphical settings, this may affect you even more than normal users. I know, for example, my brother's SLI 970 rig suffers from it quite a bit (1440p).

So it's going to affect me even worse? I've literally just bought a new SLI setup. I'm only using it at 1080p for a while anyway. I got the SLI to prepare for when I get a new monitor.
 
The GTX 980 isn't worth it, they said. The cost / benefit ratio favors the GTX 970, they said. Who's laughing now? I SAY WHO?!? Just kidding and I hope it's resolvable for both y'alls' and Nvidia's sake. It's a really good thing all of the 970s were sold out when I decided to upgrade.
 
So it's going to affect me even worse? I've literally just bought a new SLI setup. I'm only using it at 1080p for a while anyway. I got the SLI to prepare for when I get a new monitor.

It will affect you more so if you game at higher resolutions/ higher graphical settings (which SLI users tend to do). What new monitor are you getting?
 

Spartix

Member
Wow, I just blew close to $500 CAD on an MSI Gold Edition 970 about 3 weeks ago... Granted, I can't really say I've had any major problems with it while gaming so far (the Shadow of Mordor ultra texture pack maintained a steady 60fps, no problem). It's just kind of unnerving knowing my card could be... I don't know? Better?
 
It will affect you more so if you game at higher resolutions/ higher graphical settings (which SLI users tend to do). What new monitor are you getting?

Not sure. Wasn't even thinking about it just yet. Was gonna get a 1440p or a 4k one when I can spare the room. Will be using my TV for a while so max of 1080/60. Unless I can get 120Hz on it.
 
For me the real problem is the games that come out a year or two from now that will require more VRAM; that's when the problem will be glaring.
 

Kinthalis

Banned
Sorry I'm new to SLIs, what do you mean by mirror VRAM?

Oh and if some 970s are reaching 4Gb then it sounds like it could (hopefully) be a software issue.

In the current SLI implementation (AFR - alternate frame rendering) each card takes turns (alternates) rendering a frame of the game. So GPU 1 renders frame 1, GPU 2 renders frame 2, then GPU1 renders frame 3, etc.

In order to accomplish this, both GPUs need access to the same rendering assets, so both cards' buffers contain the same set of assets, effectively mirroring the information required to render the needed frames.

Games, therefore, cannot access more than the amount of VRAM available to any single GPU. To do so would mean at least one GPU lacking the assets required to properly draw its next frame.
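The scheme described above can be sketched in a few lines (a toy illustration of AFR and mirrored VRAM, not real driver code):

```python
# Toy illustration of alternate-frame rendering (AFR) and mirrored VRAM.
def afr_assignments(num_frames, num_gpus=2):
    """Under AFR the GPUs simply take turns: frame i goes to GPU i % n."""
    return [frame % num_gpus for frame in range(num_frames)]

def usable_vram_mb(per_gpu_vram_mb):
    """Assets are mirrored on every card, so capacity doesn't pool:
    a game can only count on one card's worth of VRAM."""
    return min(per_gpu_vram_mb)

afr_assignments(6)            # [0, 1, 0, 1, 0, 1]
usable_vram_mb([4096, 4096])  # 4096, not 8192
```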
 

Kinthalis

Banned
So it's going to affect me even worse? I've literally just bought a new SLI setup. I'm only using it at 1080p for a while anyway. I got the SLI to prepare for when I get a new monitor.

Got ninja'd, but anyway: yeah, SLI users usually go SLI BECAUSE they are gaming on higher-resolution panels and like to run things at max settings.

Max settings = higher VRAM usage; higher resolution = higher VRAM usage. So you are more likely to get hit by a sudden performance drop, when assets residing in a slow part of the VRAM are requested by the GPU, than someone with a single card running the game at medium-high settings at 1080p.
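To put rough numbers on "higher resolution = higher VRAM usage", here is a sketch with assumed values (4 bytes per pixel, 3 buffered frames); render targets are only one slice of a game's VRAM footprint, on top of which come depth buffers, shadow maps, textures, and so on.

```python
# Rough estimate of render-target memory at a given resolution.
# Assumes 4 bytes per pixel (RGBA8) and 3 buffered frames -- both
# assumed values; real games use far more VRAM for other assets.
def render_targets_mb(width, height, bytes_per_pixel=4, buffers=3):
    return width * height * bytes_per_pixel * buffers / 2**20

render_targets_mb(1920, 1080)  # ~23.7 MB
render_targets_mb(2560, 1440)  # ~42.2 MB
render_targets_mb(3840, 2160)  # ~94.9 MB
```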
 

LilJoka

Member
You know, this is quite frustrating if I've effectively upgraded from a 780 3GB to a 970 3.5GB. The driver allocating 4GB on a 980 in Watch Dogs but only 3.5GB on a 970, added to the drop in bandwidth on the last 500MB, really makes me think Nvidia knew and capped it in the drivers to keep the experience smooth in the short run.

If this is true I will be demanding my money back from scan.co.uk.
 

UnrealEck

Member
As good as the 900 series is for price-to-performance and TDP, it has had problems with coil whine and now this rather major issue.
 

Kuro

Member
OP needs to put this quote from GURU3D in

Each memory controller is 64-bit; 4 memory controllers in total = a 256-bit memory bus.
Assume three of the four raster engines each have one SMM disabled, leaving one raster engine with all 4 SMMs intact.
Mathematically:
16 SMM = 256-bit = 4096 MB
13 SMM = 208-bit = 3328 MB

208-bit = the effective width after disabling SMMs, with 256-bit being the actual memory controller width.
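The quoted arithmetic just scales the 256-bit bus and 4096 MB by the fraction of SMMs left enabled (13/16); a quick check of those numbers:

```python
# Reproduces the Guru3D back-of-envelope math quoted above: effective bus
# width and full-speed VRAM scale with the fraction of enabled SMMs.
# Whether memory really scales this way is the thread's speculation.
def effective_specs(smm_enabled, smm_total=16, bus_bits=256, vram_mb=4096):
    scale = smm_enabled / smm_total
    return bus_bits * scale, vram_mb * scale

effective_specs(16)  # (256.0, 4096.0) -- fully enabled GM204
effective_specs(13)  # (208.0, 3328.0) -- the speculated 970 configuration
```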
 

LilJoka

Member
OP needs to put this quote from GURU3D in

Very interesting, could be the answer to it.

I'm reading that the bandwidth drops by around 90% or more once 3.5GB of VRAM or higher is hit... is that right? WOW

Yes, from 150 GB/s to less than 20, even single-digit GB/s. Only the last 500MB though, not the whole 4GB.

The 970 tries very hard not to use this portion of VRAM whilst gaming, and when it does, the game's fps is all over the place.
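A toy model of why that matters, using the thread's rough figures (150 GB/s fast segment, 20 GB/s slow segment — forum estimates, not official specs): even a 0.5 GB slow tail nearly halves the blended bandwidth of a 4 GB working set.

```python
# Toy model: blended bandwidth when the tail of the working set falls
# into a slow VRAM segment. 150/20 GB/s are the thread's rough estimates.
def blended_bandwidth(used_gb, fast_gb=3.5, fast_bw=150.0, slow_bw=20.0):
    """Effective GB/s for streaming `used_gb` once, tail in the slow segment."""
    slow_gb = max(0.0, used_gb - fast_gb)
    time_s = min(used_gb, fast_gb) / fast_bw + slow_gb / slow_bw
    return used_gb / time_s

blended_bandwidth(3.5)  # ~150 GB/s: fits entirely in the fast segment
blended_bandwidth(4.0)  # ~83 GB/s: the 0.5 GB slow tail dominates
```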
 
In the current SLI implementation (AFR - alternate frame rendering) each card takes turns (alternates) rendering a frame of the game. So GPU 1 renders frame 1, GPU 2 renders frame 2, then GPU1 renders frame 3, etc.

Sorry for being such an idiot about this, but wouldn't this mean each card only has to do half the frames, so it would be like having twice the power?
 

riflen

Member
Yes, it's a little sneaky to quote the memory controller width rather than the effective bus width (especially on products where those values differ), but if the 970 had been sold as 208-bit, most people would still have bought it, because the vast majority don't know what that means and purchase on price vs. performance alone. It would be just another number that's better on the 980. It's also what their competition does.
 

LilJoka

Member
Sorry for being such an idiot about this, but wouldn't this mean each card only has to do half the frames, so it would be like having twice the power?

Yes, that's right: 2 cards, ideally twice the power, as each has twice as long to deliver its frame in time.

But VRAM is mirrored, so the 4GB that could do ultra textures at 1600p is likely not going to manage it with 3.5GB of fast VRAM.
 
Yes, that's right: 2 cards, ideally twice the power, as each has twice as long to deliver its frame in time.

But VRAM is mirrored, so the 4GB that could do ultra textures at 1600p is likely not going to manage it with 3.5GB of fast VRAM.

Well s***.

I guess it won't last as long as I'd have hoped without upgrading :(
 
Similar things happened with various Kepler GPUs, so it's nothing new and I expect AMD GPUs will exhibit it as well.

I don't know if performance on all the memory controllers on my R9 290 is equal, since I've never tested that. But I do know that AMD's driver has no issue allocating 4GB of VRAM with no stuttering, and neither does Mantle, for that matter.

I can post a screenshot of BF4 with Mantle reporting 4GB usage and no stuttering later if you want proof. Mantle doesn't use AMD's memory management, so in theory it should be the best test for this sort of thing; it overlays a frametime graph and everything. But I'll have to re-download the game, which will take a long-ass time.

I really don't think bringing AMD into this discussion is necessary though.
 

Kinthalis

Banned
Sorry for being such an idiot about this, but wouldn't this mean each card only has to do half the frames, so it would be like having twice the power?

Well, kind of. Each GPU still has to handle the full complexity of each frame (other SLI methods used in the past, such as tiling, spread out the workload of each single frame), but yeah, each card has half the number of frames to process for any given frame rate. That's why you can, given proper optimization, achieve nearly 100% scaling in GPU-bound scenarios.
 

RCSI

Member
It would not be impossible to form a class action lawsuit if Nvidia tries to hand out $5 game codes or something similar as compensation. They have a responsibility, in both financial and customer-loyalty terms, to deliver the product they sold, or at least to set up an exchange program for an actual 4GB version. This is one of their best-selling GPUs so far (IIRC), and it would be absurd and foolish of them to screw their customer base over. The card is too expensive a piece of hardware to brush something like this under the rug, or to pretend a code for a discounted Steam game is adequate compensation for falsely advertising one of its main features.

This all depends on it actually being a hardware issue, though, and one that can't be patched.

I hadn't thought about it in that regard. I'll admit this is the first product I've purchased that potentially has a hardware defect; I've been too used to being on the receiving end of failures of service. Thank you for informing me of the actions that could follow if it does turn out to be a hardware issue.
 

curlycare

Member
Don't know if it's relevant, but I tried playing Crysis 3 at 4K with 8xMSAA, max textures, and everything else on low. Memory usage was around 92-95% and the framerate was a pretty stable 11 (lol), but that's to be expected, I think. Nevertheless, there was no stuttering besides the low framerate. Has the low performance after reaching the VRAM cap been consistently low fps, or just stuttering?

This is with a Gigabyte G1 970.
 
I got an MSI 970 over the holiday and it's been superb... but I suppose I'm probably not pushing it to its absolute limit yet.

I just can't get over the fact that they sold a 3GB/208-bit card as a 4GB/256-bit one.

That is legitimately upsetting.
 

Kinthalis

Banned
Don't know if it's relevant, but I tried playing Crysis 3 at 4K with 8xMSAA, max textures, and everything else on low. Memory usage was around 92-95% and the framerate was a pretty stable 11 (lol), but that's to be expected, I think. Nevertheless, there was no stuttering besides the low framerate. Has the low performance after reaching the VRAM cap been consistently low fps, or just stuttering?

This is with a Gigabyte G1 970.

The problem is that memory allocated does not necessarily mean memory utilized.
 
Don't know if it's relevant, but I tried playing Crysis 3 at 4K with 8xMSAA, max textures, and everything else on low. Memory usage was around 92-95% and the framerate was a pretty stable 11 (lol), but that's to be expected, I think. Nevertheless, there was no stuttering besides the low framerate. Has the low performance after reaching the VRAM cap been consistently low fps, or just stuttering?

This is with a Gigabyte G1 970.
Before commenting, you should check your VRAM usage whilst testing. I suggest MSI Afterburner with the RivaTuner OSD.
 

LilJoka

Member
Don't know if it's relevant, but I tried playing Crysis 3 at 4K with 8xMSAA, max textures, and everything else on low. Memory usage was around 92-95% and the framerate was a pretty stable 11 (lol), but that's to be expected, I think. Nevertheless, there was no stuttering besides the low framerate. Has the low performance after reaching the VRAM cap been consistently low fps, or just stuttering?

This is with a Gigabyte G1 970.

It's too hard to test with a single card in games, since the fps is too low already; it's easier to test with SLI. Monitor GPU usage once the 970's VRAM gets close to 4GB.

It's also obvious in Watch Dogs: a 970 allocates 3.5GB where a 980 allocates 4GB at the same settings.
 

Kuro

Member
This doesn't affect me too much considering I only game at 1080p, but it ruins my plans for SLI in the future to play at higher resolutions. I don't even know what kind of memory requirements games at high-ultra settings will have at 1080p in the next year, so it could possibly start to be a problem. I'm going to have to sell my 970 and pick up a 980 at this point.
 
OP needs to put this quote from GURU3D in

Each memory controller is 64-bit; 4 memory controllers in total = a 256-bit memory bus.
Assume three of the four raster engines each have one SMM disabled, leaving one raster engine with all 4 SMMs intact.
Mathematically:
16 SMM = 256-bit = 4096 MB
13 SMM = 208-bit = 3328 MB

208-bit = the effective width after disabling SMMs, with 256-bit being the actual memory controller width.

I just can't get over the fact that they sold a 3GB/208-bit card as a 4GB/256-bit one

Added to OP. The drama intensifies.
 

Kuro

Member
I got an MSI 970 over the holiday and it's been superb... but I suppose I'm probably not pushing it to its absolute limit yet.



That is legitimately upsetting.

I guess it's technically a 4GB 208-bit card, but that bus is what causes the massive drop in bandwidth past 3GB.
 

curlycare

Member
The problem is that memory allocated does not necessarily mean memory utilized.
Okay I see.

Before commenting, you should check your VRAM usage whilst testing. I suggest MSI Afterburner with the RivaTuner OSD.
I was monitoring VRAM usage with HWiNFO.

It's too hard to test with a single card in games, since the fps is too low already; it's easier to test with SLI. Monitor GPU usage once the 970's VRAM gets close to 4GB.

It's also obvious in Watch Dogs: a 970 allocates 3.5GB where a 980 allocates 4GB at the same settings.
Yeah it might be useless to test with one card.

Edit: I received the invoice for my card just today so I'm trying to find some silver lining. :(
 
As horrible and shitty as this is, I'm still upgrading from a 1GB AMD Radeon to SLI 970s. I'm going to see a gigantic difference, and hopefully it will turn out to be patchable.

DX12 should also add more performance.

I, myself, am staying cautiously optimistic.
 

Kayant

Member
It's most likely the issue described in the overclock.net thread and is a result of the design of the GPU, combined with the approach NVIDIA take in creating the product tiers.

It's hard to say because we don't know which SMM units are disabled on any particular 970, but the effective memory bus width for a 970 could be 208-bit, meaning a percentage of the 4GB VRAM would not be addressable at the full rate.

Similar things happened with various Kepler GPUs, so it's nothing new, and I expect AMD GPUs will exhibit it as well. No one's getting anything for free from this. NVIDIA quote the memory controller bus width, which is 256-bit, not the effective bus width of the individual product.

I really hate how companies can get away with doing stuff like this.

My 970 was supposed to be super future-proof. I really thought the whole idea of getting one was that the premium for the 980 wasn't worth it. It really sucks if it's indeed hardware-related :( even though it hasn't affected me yet.

You know, this is quite frustrating if I've effectively upgraded from a 780 3GB to a 970 3.5GB. The driver allocating 4GB on a 980 in Watch Dogs but only 3.5GB on a 970, added to the drop in bandwidth on the last 500MB, really makes me think Nvidia knew and capped it in the drivers to keep the experience smooth in the short run.

If this is true I will be demanding my money back from scan.co.uk.

Might consider the same thing. Do you have any experience with returns from them? I got mine in December last year.
 

LiquidMetal14

hide your water-based mammals
Does anyone have a customer service number for Gigabyte? I'm researching their returns process in case I have to go down that route.
 

LilJoka

Member
I really hate how companies can get away with doing stuff like this.

My 970 was supposed to be super future-proof. I really thought the whole idea of getting one was that the premium for the 980 wasn't worth it. It really sucks if it's indeed hardware-related :( even though it hasn't affected me yet.



Might consider the same thing. Do you have any experience with returns with them? I got mine December last year.

I have bought over £10,000 worth of parts from them over the years, so they'd better sort it out, lol.

But when I had to RMA a water pump, I was literally interrogated for user error by their "water cooling expert"... They didn't acknowledge my purchase history, or that I was an Intel engineering employee, either.

That's the only time I've dealt with them for an RMA, and obviously I expect a small battle. I got mine around December too, I think.

Let's wait for Nvidia to play their cards.
 