
NVIDIA GeForce RTX 3090 to have 24GB GDDR6X VRAM, RTX 3080 10GB VRAM

supernova8

Banned
Current predictions look right, ±5%


The fucking nerve if they charge almost double the 3080's price for the 3090 when it's only slightly better. Plus, $400 for the 3060 is a fuckin dick move. I hope AMD wipes the floor with their 3060 and shows NVIDIA up as the price-gouging bastards they've always been.

edit: probably a little too harsh. I guess if you're getting close to 2080 Ti performance for 1/3 of the launch price it's not that bad but I still hope AMD comes out with a real corker.
 
Last edited:

Rikkori

Member
The fucking nerve if they charge almost double the 3080's price for the 3090 when it's only slightly better. Plus, $400 for the 3060 is a fuckin dick move. I hope AMD wipes the floor with their 3060 and shows NVIDIA up as the price-gouging bastards they've always been.
That's why they're giving it 24 GB. Have to say it does make me sad how much they gimped the VRAM. My only hope is that the NVCache rumours are true.
 

Spukc

always chasing the next thrill
The fucking nerve if they charge almost double the 3080's price for the 3090 when it's only slightly better. Plus, $400 for the 3060 is a fuckin dick move. I hope AMD wipes the floor with their 3060 and shows NVIDIA up as the price-gouging bastards they've always been.
good luck waiting for that magical AMD card that will never release
 

supernova8

Banned
good luck waiting for that magical AMD card that will never release

To be clear I'm not expecting them to beat NVIDIA in performance, I'm just expecting a relatively competitive card at a low price. Since AMD has pretty much dominated Intel in CPUs, there's more headroom to turn their attention to GPUs.

They spent a few years getting Ryzen into full swing and I think last year's 5700XT was the first sign of AMD starting to get back into the GPU fight.

Just remember when Ryzen first came out and the reviews were like "yeah pretty good but err.. stick with Intel for now" and then the next year it was like "shiiiiiiiit". I doubt we're that far away from the "shiiiiit" moment with Radeon. We've already seen what they can pack into a very tight power envelope for RDNA2 with the next-gen consoles so imagine that as a full-fat GPU and you have a potential killer on your hands.
 
Last edited:

Spukc

always chasing the next thrill
To be clear I'm not expecting them to beat NVIDIA in performance, I'm just expecting a relatively competitive card at a low price. Since AMD has pretty much dominated Intel in CPUs, there's more headroom to turn their attention to GPUs.

They spent a few years getting Ryzen into full swing and I think last year's 5700XT was the first sign of AMD starting to get back into the GPU fight.

Just remember when Ryzen first came out and the reviews were like "yeah pretty good but err.. stick with Intel for now" and then the next year it was like "shiiiiiiiit". I doubt we're that far away from the "shiiiiit" moment with AMD. We've already seen what they can pack into a very tight power envelope for RDNA2 with the next-gen consoles so imagine that as a full-fat GPU and you have a potential killer on your hands.
just around the corner yeah ANY DAY NOW :pie_roffles:
 

Spukc

always chasing the next thrill
Haha yeah I know it's somewhat of a running joke. Totally get that. But that was back when AMD was totally hopeless with no money, no hope. It's a different company now.
Aye, I would be first in line having a laugh at team green getting fucked by AMD like they did to Intel.
But I honestly can't care until the moment they release something. The last stuff AMD did was a joke. And people should not care about them until they show something.
 
Last edited:

Ellery

Member
I think it is always best to be realistic about AMD. I don't know how much R&D money they have now, but it is probably a fifth or less of what Nvidia has.

And I guess we can take it as fact now that Ampere is going to be a gigantic GPU with extremely high power draw, and if Nvidia sees the need to bring something this brutal then it will be very hard for AMD to catch up with that. I would still expect Big Navi to slot in somewhere between the 2080 Ti and the RTX 3090, but I also expect the gap between the 2080 Ti and 3090 to be at least 60% in terms of raw performance we see in gaming and synthetic benchmarks.

Going from a 5700 XT to something that would compete with a 2080 Ti + 60% is probably way out of reach. I hope they can make a well-rounded high-end GPU that competes with the RTX 3080.
 

supernova8

Banned
Aye, I would be first in line having a laugh at team green getting fucked by AMD like they did to Intel.
But I honestly can't care until the moment they release something. The last stuff AMD did was a joke. And people should not care about them until they show something.

The 5700XT seemed to review pretty well but it came out too late. I believe it was July 2019? The RTX 2060 was already out by Jan 2019, so anybody looking for a mid-range card probably had that. Plus, NVIDIA killed off the 5700XT with the Super cards before AMD even got their products out.

I guess it might be fair to say AMD has actually made some good progress in the GPU space; the only problem is that they're up against actual competition.
 

Polygonal_Sprite

Gold Member
What will the 3070 and 3080 use power-draw-wise vs the 2070 I currently have? I don't want to upgrade my PSU. Are those two standard 8-pin too?

I personally can't even imagine going with AMD, just because of DLSS. It's basically 100% more performance in the games that support it if you're targeting 4K.
 
Last edited:

supernova8

Banned
I think it is always best to be realistic about AMD. I don't know how much R&D money they have now, but it is probably a fifth or less of what Nvidia has.

And I guess we can take it as fact now that Ampere is going to be a gigantic GPU with extremely high power draw, and if Nvidia sees the need to bring something this brutal then it will be very hard for AMD to catch up with that. I would still expect Big Navi to slot in somewhere between the 2080 Ti and the RTX 3090, but I also expect the gap between the 2080 Ti and 3090 to be at least 60% in terms of raw performance we see in gaming and synthetic benchmarks.

Going from a 5700 XT to something that would compete with a 2080 Ti + 60% is probably way out of reach. I hope they can make a well-rounded high-end GPU that competes with the RTX 3080.

(1) If you think NVIDIA is bigger than AMD, have you seen how much bigger Intel is than AMD? Personally, I'd say NVIDIA going for a massive GPU with an extremely high power draw suggests they've been spooked a little by AMD.

(2) Don't forget they went from Excavator CPUs to Ryzen in the space of two years (I'm not saying it took two years to make it). Plus, since Raja Koduri left AMD, Lisa Su has been running Radeon herself in the interim. I think we should be cautiously optimistic about what AMD produces.
 

Ellery

Member
(1) If you think NVIDIA is bigger than AMD, have you seen how much bigger Intel is than AMD? Personally, I'd say NVIDIA going for a massive GPU with an extremely high power draw suggests they've been spooked a little by AMD.

(2) Don't forget they went from Excavator CPUs to Ryzen in the space of two years (I'm not saying it took two years to make it). Plus, since Raja Koduri left AMD, Lisa Su has been running Radeon herself in the interim. I think we should be cautiously optimistic about what AMD produces.

I agree with both. Currently rocking a Ryzen 3700X because I was extremely impressed by the impact Lisa Su had, and the stock has gone through the roof since she took over. The 5700 XT looks promising, but unlike Intel I don't see Nvidia screwing up. They seem much more focused and passionate about pushing forward.
 

magaman

Banned
The fucking nerve if they charge almost double the 3080's price for the 3090 when it's only slightly better. Plus, $400 for the 3060 is a fuckin dick move. I hope AMD wipes the floor with their 3060 and shows NVIDIA up as the price-gouging bastards they've always been.

edit: probably a little too harsh. I guess if you're getting close to 2080 Ti performance for 1/3 of the launch price it's not that bad but I still hope AMD comes out with a real corker.

Make more money. Then you can afford nicer things.
 

DeaDPo0L84

Member
A 3080 Ti with the rumored 20GB sounds perfect for me, but of course it's not launching in September, and I really want a ray tracing card that can push Cyberpunk in November, so the 3090 looks to be what I need.
 
Last edited:

llien

Member
To be clear I'm not expecting them to beat NVIDIA in performance,
The 5700 series does exactly that: beats similarly priced NV cards on performance.

until the moment they release something.

The 5700 was a joke? Release "something"?
What the heck is wrong with you?

Ampere is going to be a gigantic GPU with extremely high power draw and if Nvidia sees the need to bring something this brutal then it will be very hard for AMD to catch up with that.
What does "catching up with that" even mean? We are talking about less than 1% of the GPU market.
 
Last edited:

kiphalfton

Member
The 3090 is nowhere near a Titan; a Titan VRAM pool should be 32GB+. 24GB is basically what you would expect the 3080 Ti to sit at.

Then about the 3080.

The 3080 is a budget card sold for top-end money. Anything below 16GB of VRAM has no future, simple as that.

I would not even consider the card as a 3060 option.

People are currently looking through this generation's glasses. Next generation the baseline for VRAM will be 16GB of memory; the baseline will go up massively.

This reminds me of the 580 1.5GB: totally solid GPU performance and memory performance, and the VRAM was fine for ultra settings on any game until PS4 games arrived. It couldn't even run Unity or Watch Dogs on any setting because of the VRAM bottleneck. Now it probably won't be as bad for the 3080, but expect medium settings being a thing for the card through the entire generation, if not low.

Any GPU at $400+ when the next-gen consoles are out needs 16GB of VRAM.

10GB makes absolutely no sense in any way other than that they eyeballed that number. The only way that card has any reason to exist is if it's dirt cheap, like 200 bucks. And we all know that won't be the case.

It's clear to me that the entire 3000 series besides the 3090 is going to be replaced in 6 months with proper versions once AMD has their cards on the market. Anybody buying into these cards will be burned hard.

And as for me, with a 1080 Ti, I will be sitting this clusterfuck of a card generation out until they offer something half decent.

That doesn't make sense. 780Ti > 980Ti > 1080Ti > 2080Ti was 3GB > 6GB > 11GB > 11GB. If what you said is true, and the RTX 3080Ti is simply the RTX 3090, RAM wouldn't just jump from 11GB to 24GB "just because", especially when the purported RAM for the RTX 3070 and RTX 3080 is pretty close to what it was for Turing.

The RTX 3090 being a Titan replacement would make sense, as Nvidia isn't going to indirectly admit they overpriced the Turing Titan by releasing an Ampere Titan for less. $2500 was too much, and they know it. Instead they change the name and act like pricing the Turing Titan at $2500 didn't happen. It's correcting course on pricing, as they went from $1200 with the Titan Xp to $2500 for the Titan RTX. The same thing is probably going to happen with the RTX 3080Ti, where it's obviously not going to be as expensive as the RTX 2080Ti, since pricing-wise it would slot in right between the RTX 3080 and RTX 3090 (i.e. somewhere between $800 and $1400, assuming leaked prices are correct).
 
Last edited:

GHG

Gold Member
Leaked specs in one chart:

[chart: leaked NVIDIA 3000 series specs]


Also:

NVIDIA GeForce RTX 3090 features GA102-300 GPU with 5248 cores and 24GB of GDDR6X memory across a 384-bit bus. This gives a maximum theoretical bandwidth of 936 GB/s. Custom boards are powered by dual 8-pin power connectors which are definitely required since the card has a TGP of 350W.

The GeForce RTX 3080 gets 4352 CUDA cores and 10GB of GDDR6X memory. This card will have a maximum bandwidth of 760 GB/s thanks to the 320-bit memory bus and 19 Gbps memory modules. This SKU has a TGP of 320W and the custom models that we saw also require dual 8-pin connectors.
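For anyone who wants to sanity-check those numbers: theoretical bandwidth is just the bus width in bytes times the per-pin data rate. A minimal sketch in Python (the 3090's 19.5 Gbps rate is inferred from the quoted 936 GB/s over a 384-bit bus; it isn't stated in the leak):

```python
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical memory bandwidth in GB/s:
    bus width in bytes times the per-pin data rate in Gbps."""
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(384, 19.5))  # 936.0 -> matches the RTX 3090 figure
print(bandwidth_gb_s(320, 19.0))  # 760.0 -> matches the RTX 3080 figure

# Power side: dual 8-pin connectors (150 W each) plus the PCIe slot (75 W)
# give 375 W available, just above the quoted 350 W TGP.
print(2 * 150 + 75)  # 375
```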


Good news on the dual 8 pins for both the 3090 and 3080.

Come on EVGA, don't let me down.
 

Rikkori

Member
I'm wondering if the 3070 at $600 is the 8GB or 16GB model. (Since earlier on there were rumors that there were going to be 2 models of it, one launching in Sept, and the other in Oct)
That would be 8 GB. We have no leaks about the double-VRAM variants besides that they might exist. Sadly I expect those to be $100-$200 more.
 
Man, I used to get so excited when a new gen of video cards was about to release. Maybe it's all in my head, but it feels like NV just keeps getting more and more [blatantly] money-grubby as the years go by. Like, fuck me in the ass if you absolutely must, but don't laugh maniacally in my ear the whole time you're doing it.
 

FireFly

Member
That doesn't make sense. 780Ti > 980Ti > 1080Ti > 2080Ti was 3GB > 6GB > 11GB > 11GB. If what you said is true, and the RTX 3080Ti is simply the RTX 3090, RAM wouldn't just jump from 11GB to 24GB "just because", especially when the purported RAM for the RTX 3070 and RTX 3080 is pretty close to what it was for Turing.

The RTX 3090 being a Titan replacement would make sense, as Nvidia isn't going to indirectly admit they overpriced the Turing Titan by releasing an Ampere Titan for less. $2500 was too much, and they know it. Instead they change the name and act like pricing the Turing Titan at $2500 didn't happen. It's correcting course on pricing, as they went from $1200 with the Titan Xp to $2500 for the Titan RTX. The same thing is probably going to happen with the RTX 3080Ti, where it's obviously not going to be as expensive as the RTX 2080Ti, since pricing-wise it would slot in right between the RTX 3080 and RTX 3090 (i.e. somewhere between $800 and $1400, assuming leaked prices are correct).
The reason for the jump is likely that on the 3090 they're using a 384-bit bus with 12 x 32-bit memory controllers that would otherwise have been addressing 1 GB of memory each. So if Micron only does GDDR6X in 1 GB and 2 GB flavours, which seems plausible, their choices for a "full" configuration are 12 GB or 24 GB. And increasing memory by only 1 GB over the 2080 Ti probably wasn't going to cut it, especially if AMD are going for 16 GB.
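To make that arithmetic explicit, here's a toy sketch (the 1 GB/2 GB module densities are the post's assumption, not a confirmed spec):

```python
# Twelve 32-bit controllers hang off a 384-bit bus, one GDDR6X module each.
controllers = 384 // 32  # 12

# With only 1 GB and 2 GB module densities available, the "full"
# configurations come out to 12 GB or 24 GB -- nothing in between.
for module_gb in (1, 2):
    print(f"{controllers} x {module_gb} GB = {controllers * module_gb} GB")
```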

The Titan is $2500 because it's not a gaming card, but rather intended for developers and scientists. In the context of a business purchase, I don't think $2500 is a huge amount of money, and if it wasn't selling, Nvidia was free to reduce pricing. I presume that if a successor to the Titan was created, it would be based on Nvidia's A100 architecture.
 
Last edited:

Kenpachii

Member
Man, I used to get so excited when a new gen of video cards was about to release. Maybe it's all in my head, but it feels like NV just keeps getting more and more [blatantly] money-grubby as the years go by. Like, fuck me in the ass if you absolutely must, but don't laugh maniacally in my ear the whole time you're doing it.

It's because of no competition, really. I hope AMD curb stomps Nvidia into the shitter this generation. But who am I kidding, I have zero faith in their GPU department.
 
I thought the 5700XT was one terrific card from AMD. 2080 perf for half the price. [2080 launched at $700]

The RDNA2 replacement for the 5700XT [6700XT?] should put it in 2080 Ti territory with ray tracing support for, what, 1/3rd the cost. Simply put, amazing. But AMD definitely needs something to counter DLSS.
 

martino

Member
I thought the 5700XT was one terrific card from AMD. 2080 perf for half the price. [2080 launched at $700]

The RDNA2 replacement for the 5700XT [6700XT?] should put it in 2080 Ti territory with ray tracing support for, what, 1/3rd the cost. Simply put, amazing. But AMD definitely needs something to counter DLSS.
It would be great and would make sure I won't buy an Nvidia card for a long time.

edit: I forgot there is also HDMI 2.1 support for TVs.
 
Last edited:

GHG

Gold Member
I thought the 5700XT was one terrific card from AMD. 2080 perf for half the price. [2080 launched at $700]

The RDNA2 replacement for the 5700XT [6700XT?] should put it in 2080 Ti territory with ray tracing support for, what, 1/3rd the cost. Simply put, amazing. But AMD definitely needs something to counter DLSS.

The first one of the two to implement something like DLSS at the driver level wins.

Knowing Nvidia they will probably do it but make it exclusive to the 3xxx series onwards.
 

Ellery

Member
What does "catching up with that" even mean? We are talking about less than 1% of the GPU market.

Well, at the time of writing that post I was referring to having the best infrastructure to deliver the best GPU. The general "capabilities" of delivering like Nvidia does.
 

supernova8

Banned
The 5700 series does exactly that: beats similarly priced NV cards on performance.

The 5700XT is better than the 2060 Super in most games and is often neck and neck with the 2070 Super, but I suppose if you consider stuff like ray tracing and DLSS, the NVIDIA cards are better.
 

baphomet

Member
So are we expecting these to go on sale on the 1st, or will they be available at some point later in the month?

Going by, say, the 2000 series?
 

llien

Member
if you consider stuff like ray tracing
Yeah, let's consider stuff like "ray tracing": what % of games support it, or what % of games need it to look great:



Looks like 0 (zero, null).

Saying "but dlss" is basically saying "but fancy upscaling".

Well, at the time of writing that post I was referring to having the best infrastructure to deliver the best GPU.
We have exactly zero RTX Titan owners in this thread, so the "best infrastructure" song is clearly not yours to sing.
 

RoadHazard

Gold Member
To be fair, the existence of ultra-enthusiast high-end cards aimed at the 1% of the 1% is not indicative of a big price gap between PC and consoles from a mainstream standpoint.

Build a $500 PC with PS5 equivalent specs, please. I'd buy that.
 

Ellery

Member
Ok, you can go up to $600 for the PC, and you have until November to do it.

Sure, do you already have components you are using from your previous build, or do you plan on a fresh build?
Do you have a Windows 10 license? Can you build it yourself, or do you need help with that?
 

RoadHazard

Gold Member
Sure, do you already have components you are using from your previous build, or do you plan on a fresh build?
Do you have a Windows 10 license? Can you build it yourself, or do you need help with that?

No existing components, just a TV to plug it into.
 

GymWolf

Member
It seems like many people are gonna wait for the inevitable 3080 Super...

The question is, can I survive with my 2070S for another 12-18 months before that thing comes out? Is it enough to play Cyberpunk at 1440p60 with decent details? This shit can't even sustain 1440p60 ultra in Remnant of the fucking Ashes with hell on screen :ROFLMAO:

It's bad...
 
Last edited:
I'm in the market for a $200-300 replacement for my 1070 Ti. Looks like I'll be lucky to get a GeForce 3040 with that money seeing as how they keep increasing prices.
 

Ellery

Member
Why wouldn't they just go with 12GB for the 3080?

  1. For the next few years 10GB is enough for 1440p.
  2. Good 4K monitors are expensive. Nvidia knows their audience and they can milk 4K gamers for more money with the 3090. Those people have no alternative anyway.
  3. Having just 10GB is cheaper for Nvidia. VRAM is one of the most expensive components for them.
  4. A 10GB card puts people on a more frequent upgrade interval, because they sooner realize that the VRAM is a potential bottleneck.
  5. People look at performance first in reviews and then at the price. The price would go up with more VRAM, but the performance wouldn't.
  6. GPU reviews are a reflection of the current PC gaming landscape with titles from the last few years, and the 3080 10GB will be enough for them, so reviewers won't run into VRAM shortcomings.
  7. Based on that, people are going to see that 10GB is enough for them currently. What the future holds nobody knows.
  8. The 10GB 3080 gives a very clear distinction from the 24GB 3090, and it makes the extremely expensive $1500 RTX 3090 look like a reasonable buy because it flaunts so much VRAM. Basic consumer psychology.
  9. It leaves more room for future cards that may slot between the 3080 and 3090 to be impressive. Nvidia can go with anything between 10 and 20GB then and people will be impressed.
 