
GeForce GTX 970s seem to have an issue using all 4GB of VRAM, Nvidia looking into it

Status
Not open for further replies.

LilJoka

Member
The GPU usage fluctuated, but not by a great deal (99% down to 85%), and the dips didn't coincide with the VRAM usage increases/decreases.

I just played again and the VRAM / RAM usage that I referenced before did not replicate itself. RAM usage stayed around 6.9GB when the VRAM fluctuated between 3.2GB and 3.7GB.

I have 4 CSV files of data that I will have to import into a spreadsheet to see if there are any wild and wonderful patterns I can find.

Compare VRAM usage vs GPU usage vs Frame times.
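Those CSV logs can also be correlated directly in code instead of eyeballed in a spreadsheet. A minimal sketch using only the Python standard library; the column names (`vram_mb`, `frametime_ms`) are assumptions and would need to match whatever the logging tool actually writes:

```python
import csv
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation of two equal-length numeric series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def load_log(path):
    """Read a monitoring-tool CSV log into a list of dicts of floats."""
    with open(path, newline="") as f:
        return [{k: float(v) for k, v in row.items()} for row in csv.DictReader(f)]

# Usage (hypothetical column names):
# rows = load_log("log.csv")
# vram = [r["vram_mb"] for r in rows]
# frametime = [r["frametime_ms"] for r in rows]
# print(pearson(vram, frametime))
```

A correlation near zero would back up the observation that the VRAM swings don't line up with the GPU usage or frame-time swings.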
 
Compare VRAM usage vs GPU usage vs Frame times.

I have uploaded a document showing VRAM usage / frame times / System RAM usage.

Apologies for the number of entries; my polling rate was set at 300ms rather than 1000ms.

https://docs.google.com/spreadsheets/d/16EEpiC47KilVHzt1rEfLUfFULMth52UxSc0qSU77M78/pubhtml

There are moments that drop but I was messing around in the menus at varying points so that is probably the cause of some odd drop outs.

Edit: Here is another one but at a polling rate of 1000ms.

https://docs.google.com/spreadsheets/d/1JnGt0ITa0fixDdWGjzlYv3_W7SWbkrVnRWr2Lg9zLjk/pubhtml
 

LilJoka

Member
I have uploaded a document showing VRAM usage / frame times / System RAM usage.

Apologies for the number of entries; my polling rate was set at 300ms rather than 1000ms.

https://docs.google.com/spreadsheets/d/16EEpiC47KilVHzt1rEfLUfFULMth52UxSc0qSU77M78/pubhtml

No, that's good; I should have done that too, even at 10ms!

Looking at the graphs of your data, it looks like you have a frame limiter enabled?

[Image: k3B6TTq.jpg]


Could you repeat the test with more than 3600MB of VRAM in use, around 3800-3900MB?
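For what it's worth, a frame limiter usually shows up in a log as the bulk of frame times sitting right at a fixed cap (about 16.7ms for 60fps). A rough heuristic sketch; the 90% share and 0.5ms tolerance are arbitrary assumptions, not anything from the tools being used here:

```python
from statistics import median

def looks_frame_limited(frame_times_ms, tolerance_ms=0.5, share=0.9):
    """Heuristic: treat the log as frame-limited when most samples sit
    within tolerance_ms of the median frame time (the presumed cap)."""
    cap = median(frame_times_ms)
    near_cap = sum(1 for t in frame_times_ms if abs(t - cap) <= tolerance_ms)
    return near_cap / len(frame_times_ms) >= share

# A 60fps-capped log: almost every sample sits at ~16.7ms.
# looks_frame_limited([16.7] * 95 + [22.0] * 5)  -> True
```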
 

Faith

Member
Well, if you're looking for a game to try, the Star Citizen demo hits the 3.5GB cap for me at 1440p on the medium or high preset and stays around 35fps (on one 970). I'll also be really interested in how far The Witcher 3 pushes memory usage.
Star Citizen? Ok, let's try it out. I'm going to check frame times also.
 
GTA V could be one of those games where we see normal vram usage past 3.5GB.
It would suck if, say, you couldn't set a certain option to ultra even though you have the frame rate for it, while the 980 has no issue.
 

The Llama

Member

You should take two things away from that simple description. First, despite initial reviews and information from NVIDIA, the GTX 970 actually has fewer ROPs and less L2 cache than the GTX 980. NVIDIA says this was an error in the reviewer’s guide and a misunderstanding between the engineering team and the technical PR team on how the architecture itself functioned.

Error in the reviewer's guide? Yep. That's what happened. Just a misunderstanding. Of course.
 

Mr Swine

Banned
So did I make a mistake buying a 970 yesterday? Is it possible for Nvidia to release a 970 that doesn't have the slow 512MB of RAM?
 
I miss being in blissful ignorance to this. It hasn't changed the fact that I am extremely pleased with the card, it hasn't changed that I feel like all of my games are performing great. It is just like having an annoying itch that I can't scratch.
 

Mengy

wishes it were bannable to say mean things about Marvel
I miss being in blissful ignorance to this. It hasn't changed the fact that I am extremely pleased with the card, it hasn't changed that I feel like all of my games are performing great. It is just like having an annoying itch that I can't scratch.

Yeah, that's how I feel as well. I'm pissed at the deception, yet I'm super happy with my 970. Quite the conundrum!

In the end, there's not much I can do about it now anyway. I bought the card, my bad adopting new tech too fast I suppose. All I can do now is enjoy my 970 and be aware of the 3.5GB limit and learn from my mistake for the future. I'll tell you one thing, I'll never buy an Nvidia card early again. Not ever. I'll always wait a year or so to wait for the skeletons to get cleaned out first. Fool me once, shame on you, fool me twice shame on me...
 

Lain

Member
Disappointing.
Sure the 970 is still a great card, but had I known about this before, I wouldn't have bought it and instead opted for a 980 after saving a bit more.
To me, Nvidia's behaviour feels deliberately dishonest.
 
Nvidia has a shitstorm on their hands right now... the internet mob will not let this slide. I would not want to be in the seat of whoever was responsible for the "internal miscommunication" (assuming that story is true).
 

Honey Bunny

Member
Extremetech have tested frametimes

www.extremetech.com/extreme/198223-...idias-penultimate-gpu-have-a-memory-problem/2

With that said, our 4K test did pick up a potential discrepancy in Shadows of Mordor. While the frame rates were equivalently positioned at both 4K and 1080p, the frame times weren’t. The graph below shows the 1% frame times for Shadows of Mordor, meaning the worst 1% times (in milliseconds).

[Image: S-of-M.png]



The 1% frame times in Shadows of Mordor are significantly worse on the GTX 970 than the GTX 980. This implies that yes, there are some scenarios in which stuttering can negatively impact frame rate and that the complaints of some users may not be without merit. However, the strength of this argument is partly attenuated by the frame rate itself — at an average of 33 FPS, the game doesn’t play particularly smoothly or well even on the GTX 980.
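For context, "the worst 1% times" can be computed from a raw frame-time log like so. Whether ExtremeTech averages the worst 1% of samples or reports the 99th-percentile value isn't stated in the quote, so averaging here is an assumption:

```python
def worst_1pct_frame_time_ms(frame_times_ms, pct=1.0):
    """Mean of the largest pct% of frame-time samples (worst stutter)."""
    ordered = sorted(frame_times_ms, reverse=True)
    k = max(1, round(len(ordered) * pct / 100))
    return sum(ordered[:k]) / k

# 199 smooth ~10ms frames plus one 50ms spike: the 1% metric (worst
# 2 of 200 samples) surfaces the spike that the average would hide.
# worst_1pct_frame_time_ms([10.0] * 199 + [50.0])  -> 30.0
```

This is why 1% (or 0.1%) frame times catch stutter that average FPS smooths over: a handful of long frames barely moves the mean but dominates the tail.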
 

bwat47

Neo Member
Error in the reviewer's guide? Yep. That's what happened. Just a misunderstanding. Of course.

It's a huge screw-up, and could definitely land Nvidia in some hot water depending on various countries' consumer protection laws, but I highly doubt it was intentional. I liked Anandtech's take on whether or not they thought it was intentional:

Now as NVIDIA is in full damage control mode at this point, consideration must be given as to whether NVIDIA’s story is at all true; NVIDIA would hardly be the first company to lie when painted into a corner by controversy. With that in mind, given the story that NVIDIA has provided, do we believe them? In short, yes we do.

To be blunt, if this was intentional then this would be an incredibly stupid plan, and NVIDIA as a company has not shown themselves to be that dumb. NVIDIA gains nothing by publishing an initially incorrect ROP count for the GTX 970, and if this information had been properly presented in the first place it would have been a footnote in an article extoling the virtues of the GTX 970, rather than the centerpiece of a full-on front page exposé. Furthermore if not by this memory allocation issues then other factors would have ultimately brought these incorrect specifications to light, so NVIDIA would have never been able to keep it under wraps for long if it was part of an intentional deception. Ultimately only NVIDIA can know the complete truth, but given what we’ve been presented we have no reason to doubt NVIDIA’s story.

If anything, it highlights very poor internal communication within Nvidia as a company.
 
You know what else sucks, a lot of us were forced to pay marked up prices because these cards were in such high demand. Had the specs been accurately reported in the first place then the demand would probably have been somewhat lower and prices wouldn't have been jacked up as much if at all.

So not only did we buy a falsely advertised product, but we also paid a premium on top of MSRP because it was falsely advertised.
 

Zane

Member
You know what else sucks, a lot of us were forced to pay marked up prices because these cards were in such high demand. Had the specs been accurately reported in the first place then the demand would probably have been somewhat lower and prices wouldn't have been jacked up as much if at all.

So not only did we buy a falsely advertised product, but we also paid a premium on top of MSRP because it was falsely advertised.

I disagree. People bought this card in droves because of the benchmarks, not because of the specs on the box. Also, no, you weren't forced to pay marked-up prices at all. I was able to grab the MSI version for retail price on Amazon, and that was a week before Christmas.

I feel cheated :/

First PC I build and I'm stuck with this. Thanks Nvidia.

Stuck with a great card that will easily last you the next 3 or 4 years at least. :'(
 

seph1roth

Member
I don't know anything about technicalities and shit, but is there any possibility of solving the problem with a software update?

Also, does this affect ALL 970s?

If it has no solution, Nvidia should return the money to everyone who wants a refund; this is not a joke.
 
I own 2 GTX 970s. From a practical standpoint, this hasn't proven to be an issue for me, but the situation remains unacceptable. I have a hard time believing that engineers familiar with the card's true specifications didn't notice the inaccurately reported specs while review after review was being published. I'm very curious where Nvidia goes from here with this issue.
 

The Llama

Member
I don't know anything about technicalities and shit, but is there any possibility of solving the problem with a software update?

Also, does this affect ALL 970s?

If it has no solution, Nvidia should return the money to everyone who wants a refund; this is not a joke.

It's a hardware issue that affects every 970 because of the way the chip is manufactured.
 

Honey Bunny

Member

From this article, page 2. Bolded is what is interesting.

As a result NVIDIA has segmented the GTX 970’s memory into the now-familiar 3.5GB and 512MB segments. In the case of the 3.5GB segment, this behaves otherwise identically to a fully enabled card such as the GTX 980, with the 1KB stride being striped over 7 crossbar ports, and hence 7 DRAM modules. Meanwhile the 8th and final DRAM module sits in its own 512MB segment, and must be addressed by the crossbar on its own.

This in turn is why the 224GB/sec memory bandwidth number for the GTX 970 is technically correct and yet still not entirely useful as we move past the memory controllers, as it is not possible to actually get that much bandwidth at once on the read side. GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.

And yet:

[Image: AsTRT1H.png]
 

owasog

Member
From this article, page 2. Bolded is what is interesting.

This in turn is why the 224GB/sec memory bandwidth number for the GTX 970 is technically correct and yet still not entirely useful as we move past the memory controllers, as it is not possible to actually get that much bandwidth at once on the read side. GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.
And yet:

[Image: AsTRT1H.png]
The 970 is a 196GB/s card below 3.5GB, and even less than that above it? That's not what they're selling, now is it?
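The quoted figures do check out arithmetically: each 32-bit crossbar port at 7GHz effective moves 28GB/s, so seven ports give 196GB/s for the 3.5GB segment and the lone eighth port 28GB/s for the 512MB segment. Because reads are XOR between the two segments, every cycle spent on the slow segment comes out of the fast segment's budget. A sketch of that; the linear cycle-sharing model is my simplification, not anything Anandtech measured:

```python
def port_bandwidth_gbs(mem_clock_ghz=7.0, bus_bits_per_port=32):
    """Bandwidth of one 32-bit crossbar port: 7 GHz effective * 4 bytes."""
    return mem_clock_ghz * bus_bits_per_port / 8

def effective_read_bandwidth_gbs(slow_fraction):
    """Effective read bandwidth when a given fraction of crossbar cycles
    serve the 512MB segment (1 port) instead of the 3.5GB segment (7 ports).
    Reads are XOR: each cycle serves one segment or the other, never both."""
    fast = 7 * port_bandwidth_gbs()  # 196 GB/s, the 3.5GB segment
    slow = 1 * port_bandwidth_gbs()  # 28 GB/s, the 512MB segment
    return (1 - slow_fraction) * fast + slow_fraction * slow
```

Under this model the advertised 224GB/s (196 + 28) is never reachable on reads; spending even half the cycles on the slow segment drops the effective figure to 112GB/s.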
 

Xdrive05

Member
Nvidia generally has really good drivers, so hopefully their smoke-and-mirrors VRAM driver management actually keeps pace with new demanding games.

As much heat as they are drawing right now, I'm sure they're dedicating gobs of resources to software R&D.

Just waiting patiently to see if any price drops come of this.
 

Honey Bunny

Member
It's a bit inconvenient having two threads about this, maybe a mod can merge the two? Just a suggestion. I do like your OP NakedSnake so I wouldn't want it to get lost in the shuffle.
 
It's a bit inconvenient having two threads about this, maybe a mod can merge the two? Just a suggestion. I do like your OP NakedSnake so I wouldn't want it to get lost in the shuffle.

It's actually 3 threads, as some people are still discussing this in the card's launch thread... I was worried yet another thread would be created today; it's a problem with the "new news, new thread" policy.

FWIW I've been trying to keep the OP updated as new articles and findings emerge. The thread title could use an update though.
 

Zukuu

Banned
They need to cut the price and market it as a 3.5gb card. Everything is fine then - at least for future buyers. As it stands, there aren't any comparable cards at that price point yet, correct? So it's not like you have options anyway.
 

Honey Bunny

Member
It's actually 3 threads, as some people are still discussing this in the card's launch thread... I was worried yet another thread would be created today; it's a problem with the "new news, new thread" policy.

FWIW I've been trying to keep the OP updated as new articles and findings emerge. The thread title could use an update though.

Yeh I suppose as long as the info gets out there.

Er....Nvidia have removed the 100+ page thread on this issue from their official forum now, lol.

Only accessible via direct link. https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/114/

It's supposed to show up here https://forums.geforce.com/default/board/155/

I'm sure this is just a misunderstanding.
 
Yeh I suppose as long as the info gets out there.

Er....Nvidia have removed the 100+ page thread on this issue from their official forum now, lol.

Only accessible via direct link. https://forums.geforce.com/default/topic/803518/geforce-900-series/gtx-970-3-5gb-vram-issue/114/

It's supposed to show up here https://forums.geforce.com/default/board/155/

That Nvidia sense of class... why are such good engineers cursed with such crappy marketeers? I feel for reviewers on this: right now it's an edge case that was easy to miss, but because Nvidia have "veracity issues", reviewers are going to have to further slow down the test process with tools to stress VRAM. At this rate they'll have to start popping off the heatsinks to see if there's a real chip in there or just a video playback unit that runs a loop of Far Cry 4 at 1000fps.
 
Yeah, that's how I feel as well. I'm pissed at the deception, yet I'm super happy with my 970. Quite the conundrum!

In the end, there's not much I can do about it now anyway. I bought the card, my bad adopting new tech too fast I suppose. All I can do now is enjoy my 970 and be aware of the 3.5GB limit and learn from my mistake for the future. I'll tell you one thing, I'll never buy an Nvidia card early again. Not ever. I'll always wait a year or so to wait for the skeletons to get cleaned out first. Fool me once, shame on you, fool me twice shame on me...

I just love the kick in the pants I'm feeling. This is the first Nvidia card I've bought; I was an AMD user pretty much forever and really thought Nvidia was this bastion of quality, and it obviously gets a lot of praise around here due to excellent driver support and whatnot. This has really put a sour taste in my mouth, and I may jump back to AMD once that extra 0.5GB that isn't being utilized properly starts really showing its colors in future game releases. I'm really hoping this card lasts me a good two years, especially for the Rift CV1, but boy does this info put a nagging doubt in the back of my head. Time will tell, I suppose.
 

cyberheater

PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 PS4 Xbone PS4 PS4
From this article, page 2. Bolded is what is interesting.

This in turn is why the 224GB/sec memory bandwidth number for the GTX 970 is technically correct and yet still not entirely useful as we move past the memory controllers, as it is not possible to actually get that much bandwidth at once on the read side. GTX 970 can read the 3.5GB segment at 196GB/sec (7GHz * 7 ports * 32-bits), or it can read the 512MB segment at 28GB/sec, but not both at once; it is a true XOR situation. Furthermore because the 512MB segment cannot be read at the same time as the 3.5GB segment, reading this segment blocks accessing the 3.5GB segment for that cycle, further reducing the effective memory bandwidth of the card. The larger the percentage of the time the crossbar is reading the 512MB segment, the lower the effective memory bandwidth from the 3.5GB segment.

And yet:

[Image: AsTRT1H.png]

Holy fuck. They really did gimp the memory controller for this card. It really is effectively a 3.5GB graphics card.
 