
NVIDIA GeForce RTX 3090 to have 24GB GDDR6X VRAM, RTX 3080 10GB VRAM

Bullet Club

Banned
NVIDIA GeForce RTX 3090 to get 24GB GDDR6X VRAM, RTX 3080 to get 10GB VRAM

The memory specifications of the upcoming RTX 3090 and RTX 3080 Ampere cards have just been confirmed by Videocardz. According to Videocardz’s internal sources at Add-in-Board (AIB) partners, the NVIDIA GeForce RTX 3090 will feature 24 GB of GDDR6X memory, while the GeForce RTX 3080 will initially launch with 10 GB of GDDR6X memory. There’s a huge difference between the two cards when it comes to VRAM capacity, but only these specs have been confirmed for now.

There is no word on any Ti model, but the RTX 3090 appears to be the flagship GPU of the Ampere lineup, directly replacing the TITAN. The RTX 3090 will feature the GA102-300-A1 GPU, which sports 5248 CUDA cores across 82 SMs, a 20% increase in cores over the RTX 2080 Ti. We don’t have any details on the TMU (texture mapping unit) count, ROP (raster operations pipeline) count, or clock speeds yet. Assuming the RTX 3090 comes with 24 GB of VRAM, we are looking at a 384-bit wide memory bus and roughly 912 GB/s of bandwidth, approaching 1 TB/s. Memory speeds for both the RTX 3090 and 3080 are expected to be around 19 Gbps.

It is important to note that gaming and HPC GPU variants have different configurations, so we can’t take the GA102-300-A1 die as a reference if the RTX 3090 is going to target both the consumer and the HPC/enterprise markets. The GeForce RTX 3080, on the other hand, has been confirmed to get 10 GB of GDDR6X memory, which is far less than the RTX 3090, but this could also mean we get an RTX 3080 SUPER variant with 16 GB of VRAM in the future. These SUPER cards might also come in 20 GB flavors, but for now only 10 GB has been confirmed.

There is a chance that Nvidia will launch an RTX SUPER series next year, though. The RTX 3080 will feature the GA102-200-KD-A1 GPU die, a cut-down SKU with the same 4352 CUDA cores as the RTX 2080 Ti, for a total of 68 SMs. Assuming the memory runs at 19 Gbps across a 320-bit wide bus interface, we can expect a bandwidth of up to 760 GB/s.
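Those bandwidth figures follow directly from the rumored numbers. As a quick sanity check (a minimal sketch; the 19 Gbps data rate and both bus widths are rumors from the article, not confirmed specs): peak bandwidth is the per-pin data rate times the bus width, divided by 8 bits per byte.

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte
def bandwidth_gb_s(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gb_s(19, 384))  # rumored RTX 3090, 384-bit bus -> 912.0 GB/s
print(bandwidth_gb_s(19, 320))  # rumored RTX 3080, 320-bit bus -> 760.0 GB/s
```

A flat 1 TB/s on a 384-bit bus would need roughly 21 Gbps memory, so the 19 Gbps rumor puts the 3090 just under that mark.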

Nvidia is using the high-end GA102 GPU for the RTX 3080 as well, which is an upgrade over the TU104 core featured on the RTX 2080. This could also mean that the new Ampere card will have higher wattage requirements and thermals, and that it will fall into the high-end enthusiast segment. The GeForce RTX 3070 is also expected to be announced alongside these two high-end cards, but we don’t have any specs for that SKU yet.

NVIDIA’s flagship RTX 3090 graphics card was also pictured recently, and it is a massive 3-slot GPU. The model appears to have the same irregularly shaped PCB that leaked earlier. My guess is that this is a sample of the reference Founders Edition card.



NVIDIA’s upcoming RTX 3000 series Ampere cards are rumored to be built on Samsung’s 8nm process node. Nvidia has also recently ceased production of the RTX 20 series GPUs to make room for the Ampere lineup. Nvidia is hosting a GeForce Special Event on September 1st, where we expect the company to announce the next-gen Ampere gaming GPUs.

Stay tuned for more!

Source: DSOG
 

Ellery

Member
The 3080 still has less VRAM than the 1080 Ti? Seems odd.

It still easily smokes the 1080 Ti in every way possible, and there would never be a scenario where the 1080 Ti's 11GB would be better than the RTX 3080's 10GB of GDDR6X.

Nvidia has always been a little bit greedy on the VRAM side of things, but it usually is enough until they want you to upgrade to the next card (which is as often as possible).
 

GHG

Member
This is a strange decision.

It's almost as if they are forcing people who want to play at 4k towards the 3090.

If the 3080 is indeed ~30% faster than the 2080 Ti then it would be perfectly capable at 4k as well, but the VRAM could start to become an issue.
 
How will this compare to the consoles, which will have 13.5GB available? The 3000 cards have higher bandwidth, but purely from a capacity standpoint this seems odd.
 

RNG

Member
That's a pretty huge gap in VRAM, from 10GB to 24GB. Plus I'm expecting a similarly huge gap in price between the two cards.
 

iJudged

Banned
PC gaming has become a joke. Besides mod support and high-end emulation there is nothing that justifies that big price gap anymore if you play mainstream games.
I AGREE, it's overkill on the prices and power. Right now we don't really need anything more than a 2080 Ti power-wise. I still have my GTX 1080 Ti and run most games on the highest settings at 2K, since my monitor is a 32in curved 2K @ 165Hz G-Sync/FreeSync; I don't really need more. But I will be upgrading to a next-gen GPU because of RTX, and I am not spending more than $700. That will do for the next 5 years, I bet.
 

GreatnessRD

Member
An 850W PSU will easily be enough for the RTX 3090. What else would draw like 500W in the system when the 3090 is like 350W? A 95W Ryzen CPU?
I said that in jest really. More so because of the beefy size and the new 12-pin connector. But overall just joking. Doesn't bother me either way because the 3090 isn't even in my wheelhouse. Dem bills, yo
 

notseqi

Member
To be fair, the existence of ultra-enthusiast high-end cards aimed at the 1% of the 1% is not indicative of a big price gap between PC and consoles from a mainstream standpoint.
It never was comparable, but tabbing out of a game to watch some gentlemen's special interest documentaries is still not as easy on consoles.

2K is mad for a gfx card though. The most I ever spent was 350.
 

iJudged

Banned
That thing is so big man
[The Office "that's what she said" reaction gif]


I am sorry
 

thief183

Member
How did you expect more, after the flagship card two generations ago was $699 and the flagship card from the last generation was $999?
Mostly curious; no offense or judgement directed at you regarding your expectations for a future purchase decision.

Because the 3090 should be seen as the new Titan.
 

Nikana

Go Go Neo Rangers!
This is a strange decision.

It's almost as if they are forcing people who want to play at 4k towards the 3090.

If the 3080 is indeed ~30% faster than the 2080 Ti then it would be perfectly capable at 4k as well, but the VRAM could start to become an issue.

I concur. It "seems" purposefully lower.
 
I AGREE, it's overkill on the prices and power. Right now we don't really need anything more than a 2080 Ti power-wise. I still have my GTX 1080 Ti and run most games on the highest settings at 2K, since my monitor is a 32in curved 2K @ 165Hz G-Sync/FreeSync; I don't really need more. But I will be upgrading to a next-gen GPU because of RTX, and I am not spending more than $700. That will do for the next 5 years, I bet.
I want to upgrade but I just can't justify it. I bought 2 GTX 970s at the release of The Witcher 3 and I was blown away. Today's cards cost even more and the graphics are just slightly better. Hell, even channels like Digital Foundry have to zoom in 300% to find substantial differences. If I can play games like Cyberpunk and RDR2 at 4K/60fps on the next gen, I think I'm out of PC gaming for good. A shame, because I've been there since my first card, the Voodoo 3 2000...
 

diffusionx

Gold Member
I don’t even know if the 3090 would fit into my case. It seems like it is designed for 900D-type absolute units. I was looking forward to getting the top-of-the-line card, but Nvidia seems to have priced me out of it for all sorts of reasons.

And that huge gap in VRAM between the 3090 and 3080 doesn’t make sense. Part of me wonders if they are going to do 3080/3080 Ti/3090, with the 90 positioned as the Titan-type card as opposed to the Ti. They could even release the 3080 Ti down the road. I’ll probably wait for that one...

I want to upgrade but I just can't justify it. I bought 2 GTX 970s at the release of The Witcher 3 and I was blown away. Today's cards cost even more and the graphics are just slightly better. Hell, even channels like Digital Foundry have to zoom in 300% to find substantial differences. If I can play games like Cyberpunk and RDR2 at 4K/60fps on the next gen, I think I'm out of PC gaming for good. A shame, because I've been there since my first card, the Voodoo 3 2000...

Next-gen consoles will be able to do stuff like RDR2 and Cyberpunk at 4K/60fps, sure, but not the next gen of games that comes after. The 3090 seems like a ludicrous card, but it already has more RAM than the next-gen systems have in their entirety. IMO there is going to be a huge gap between console and PC next gen, far wider than it was for the current gen, especially once RT takes hold.

No need to panic now man, cards will keep coming out and prices will drop.
 

Ellery

Member
Because the 3090 should be seen as the new Titan.

I think that is going to depend on whether the RTX 3090 is labelled as a GeForce card. The GeForce segment of Nvidia was always about gaming, and the Titan cards were never classified as GeForce cards. They are Titan cards. You can see on nvidia.com that the GeForce and Titan cards are separated and not sold under the same classification.

I don't understand why it should be seen as a Titan card. It has a numeric name and not a halo name like the Titan cards (from what we know now; it might still change by September 1st, and the card we call the RTX 3090 could indeed be a much more expensive TITAN card).
 

Jonsoncao

Banned
PC gaming has become a joke. Besides mod support and high-end emulation there is nothing that justifies that big price gap anymore if you play mainstream games.
Loading an MPNN with about 2 conv layers and 3M parameters in its dense layers in TensorFlow 2 requires 12 GB of memory. A training batch can then contain only 8 sample molecules or proteins if you use CPU mode and only have 16 GB of RAM.

This thing will be huge in the deep learning community, as you can afford bigger batches in training and fit bigger, more exotic networks in memory.

Gamers are just too narcissistic to realize this is not for them.
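
As a rough illustration of the batch-size point (a back-of-envelope sketch with hypothetical numbers, not measurements of any real MPNN; it assumes float32 training with Adam, where weights, gradients, and two moment buffers amount to roughly four copies of the parameters, and activations kept for backprop grow linearly with batch size):

```python
# Back-of-envelope GPU memory budget for training. The model numbers
# below are illustrative assumptions, not measurements.

BYTES_FP32 = 4

def train_mem_gb(n_params: int, act_mb_per_sample: float, batch: int) -> float:
    # Weights + gradients + Adam's two moment buffers ~= 4 copies of the params.
    param_bytes = 4 * n_params * BYTES_FP32
    # Activations saved for backprop scale linearly with batch size.
    act_bytes = batch * act_mb_per_sample * 1024**2
    return (param_bytes + act_bytes) / 1024**3

# Hypothetical graph network: only 3M parameters but ~1.5 GB of activations
# per molecule (graph models tend to be activation-heavy, not weight-heavy).
for batch in (8, 16, 64):
    print(f"batch {batch}: ~{train_mem_gb(3_000_000, 1500, batch):.1f} GB")
```

Under those assumptions a 10 GB card tops out around a batch of 6, while 24 GB roughly doubles what fits, which is exactly the bigger-batches, bigger-networks point.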
 

diffusionx

Gold Member
I think that is going to depend on whether the RTX 3090 is labelled as a GeForce card. The GeForce segment of Nvidia was always about gaming, and the Titan cards were never classified as GeForce cards. They are Titan cards. You can see on nvidia.com that the GeForce and Titan cards are separated and not sold under the same classification.

I don't understand why it should be seen as a Titan card. It has a numeric name and not a halo name like the Titan cards (from what we know now; it might still change by September 1st, and the card we call the RTX 3090 could indeed be a much more expensive TITAN card).

Or they are trying to “mainstream” it by putting the “Titan level” card alongside their other ones. In the past they were always separate: released earlier, in a very different price bracket, etc. Now if they’re saying, OK, consider the Titan just another GeForce, it’s now the xx90 variant, that would make some sense. Well, as much sense as a $2000 GPU the size of a barn door can make.
 

Ellery

Member
Or they are trying to “mainstream” it by putting the “Titan level” card alongside their other ones. In the past they were always separate: released earlier, in a very different price bracket, etc. Now if they’re saying, OK, consider the Titan just another GeForce, it’s now the xx90 variant, that would make some sense. Well, as much sense as a $2000 GPU the size of a barn door can make.

It could make sense from a full-fat-chip perspective, I agree. Give the gamers the absolute biggest behemoth they have, with enough VRAM and no disabled units. What would they do for a Titan then, or do they just drop that card?
 
That's shared RAM.

Yes. But that's what the 2.5GB of the 16GB is for. 13.5GB is just for the GPU, so it's more. Like I said though, the 3000 cards have more bandwidth. I'm trying to look at the pros and cons of both.

Don't next-gen consoles share VRAM and system memory?

Yes, it's 2.5GB for system memory.

Of the 13.5GB, only part is for the GPU.
Sony is more flexible; Microsoft has 10GB of faster RAM + 6GB of slower RAM (so they were aiming at 10GB for the GPU).

So effectively MS has 10GB of RAM that can practically be used for the GPU, plus 3.5GB of slower RAM which can also be used for the GPU but will have to be leveraged differently. PS5 should be around the same; it's slightly slower overall, but more of its memory is available at a single higher speed.
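
To put numbers on that split (the Series X pool sizes, speeds, and the 2.5GB OS reservation are Microsoft's published figures; PS5's OS reservation hasn't been published, so the 13.5GB figure used in this thread is an assumption, and how a game divides its pool between CPU and GPU data is up to the developer):

```python
# Publicly stated next-gen console memory layouts: (capacity GB, bandwidth GB/s).
xbox_series_x = {
    "total_gb": 16,
    "os_reserved_gb": 2.5,          # leaves 13.5 GB visible to games
    "gpu_optimal_pool": (10, 560),  # 10 GB @ 560 GB/s
    "slower_pool": (6, 336),        # 6 GB @ 336 GB/s (the OS reserve lives here)
}

ps5 = {
    "total_gb": 16,
    "unified_pool": (16, 448),      # one pool, one speed: 448 GB/s
    # OS reservation not officially published; this thread assumes ~13.5 GB for games.
}
```

By that reading, the rumored 3080 matches the Series X's GPU-optimal pool on capacity (10GB) while beating it on bandwidth (760 GB/s vs 560 GB/s).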
 
To be fair, the existence of ultra-enthusiast high-end cards aimed at the 1% of the 1% is not indicative of a big price gap between PC and consoles from a mainstream standpoint.
I know that, I bought an ASUS GeForce 3 Ti 500 back in the day. :messenger_winking:
It's not only about the high-end cards, but also the mid-to-high cards. Just look at a 2070 Super: that's 500-600 euros and you can't even run RDR2 at 1440p/60fps on ultra. Meanwhile on the Xbox One X you get native 4K/30fps with mid-to-high settings (still looks awesome) for 399. And I bet that, besides the fps, most PC gamers wouldn't notice the "bad" LOD or shadow quality while playing. Maybe I'm too old, but I won't pay nearly triple the price just so I can stand still in the game and observe 10 more bushes and trees on the horizon.
 

Rickyiez

Member
Friendly reminder that the GTX 970, with only "3.5GB" of VRAM, was still fine for its time. I have a 1080 Ti with a 1440p screen and it barely reaches 7GB usage, so stop with the memes.
 

diffusionx

Gold Member
I know that, I bought an ASUS GeForce 3 Ti 500 back in the day. :messenger_winking:
It's not only about the high-end cards, but also the mid-to-high cards. Just look at a 2070 Super: that's 500-600 euros and you can't even run RDR2 at 1440p/60fps on ultra. Meanwhile on the Xbox One X you get native 4K/30fps with mid-to-high settings (still looks awesome) for 399. And I bet that, besides the fps, most PC gamers wouldn't notice the "bad" LOD or shadow quality while playing. Maybe I'm too old, but I won't pay nearly triple the price just so I can stand still in the game and observe 10 more bushes and trees on the horizon.

Not really...



I agree that RDR2 looks awesome on the One X, but they pulled back a good deal to get to 4K/30fps. What sort of performance does the 2070 Super get with those settings? Better than the One X, I am sure.

Friendly reminder that the GTX 970 with only "3.5GB" of VRAM is still fine. I have a 1080 Ti with a 1440p screen and it barely reaches 7GB usage, so stop with the memes.

The 970 is fine for 1080p but most of us moved past that long ago...
 

martino

Member
Friendly reminder that the GTX 970 with only "3.5GB" of VRAM is still fine. I have a 1080 Ti with a 1440p screen and it barely reaches 7GB usage, so stop with the memes.

Also, RAM usage is supposed to change drastically this gen.
Good luck filling your screen with more than 10GB of mostly in-current-frame assets.
 

Godfavor

Member
Yes. But that's what the 2.5GB of the 16GB is for. 13.5GB is just for the GPU, so it's more. Like I said though, the 3000 cards have more bandwidth. I'm trying to look at the pros and cons of both.



Yes, it's 2.5GB for system memory.



So effectively MS has 10GB of RAM that can practically be used for the GPU, plus 3.5GB of slower RAM which can also be used for the GPU but will have to be leveraged differently. PS5 should be around the same; it's slightly slower overall, but more of its memory is available at a single higher speed.

You forgot to add RAM for the CPU in your console gaming calculations.
 