
NVIDIA GeForce RTX 3090 TecLab Review Leaked: 10% Faster Than RTX 3080

Thelastword shitting on nvidia? That’s new!

With proper drivers and once the release is complete, with more cards available, the performance gap will surely increase.

More cores, more compute units, and more RT and Tensor cores will surely translate to big gains in DLSS and such.

But yeah, we’re not getting double the perf of the 3080.
 
Few things:

1. 3080s have very little OC headroom, close to zero appreciable gains.

2. RT performance increase is tiny. HU pegged it at just 10% over Turing.

3. 3090 being just 10% faster than 3080 is absolutely shocking. This makes the claims 'RDNA2 will compete with 3080 probably but not the 3090!' seem odd now. If it gets close to the former, it won't be far off the latter, of course.

4. Ampere has been a disappointment and I laugh at all the idiots that were scrambling to get a '$700' (reality: closer to $800) card with just 10GB memory which draws 330-360W, after being suckered in by Nvidia's marketing show earlier this month like complete PC noobs.
 

Armorian

Banned
If people are buying the 3090 for the VRAM, why not just wait for the next 3080 iteration with more VRAM then? I get the feeling that this whole thing is a bait-and-switch.

Nvidia just wants to sell as many 3090s as possible; after some time they will drop a 3080 with 20GB...

With proper drivers and once the release is complete, with more cards available, the performance gap will surely increase.

But yeah, we're not getting double the perf of the 3080.

NO, this card is ~20% faster than the 3080 ON PAPER; only ~10% in real life is not surprising.

Pay 2x the money for it if you want. Maybe in native 8K the differences will be bigger.
 

M1chl

Currently Gif and Meme Champion
It's worth it for me due to the VRAM; the Titan RTX was just slightly faster than the 2080Ti, so I was fully expecting this...
 
If people are buying the 3090 for the VRAM, why not just wait for the next 3080 iteration with more VRAM then? I get the feeling that this whole thing is a bait-and-switch.

This all the way. I really wish people would stop parting with their money so easily, even if they have it. Do not fall for the 3090 trick. Details about cards with more VRAM have already leaked, and they are coming. If you really want to spend your money, get a 3080 now, and later this year put the leftover money you were going to spend on a 3090 toward a fast Zen3 processor.

Stop being brainless.

It's worth it for me due to the VRAM; the Titan RTX was just slightly faster than the 2080Ti, so I was fully expecting this...

Come on man you know that amount of VRAM is excessive.
 

M1chl

Currently Gif and Meme Champion
Come on man you know that amount of VRAM is excessive.
Nah, it isn't for me; I am doing machine learning stuff, so I need a lot of memory : ) Also for crypto mining, where higher VRAM is always welcomed : )
 


ZywyPL

Banned
More cores, more compute units, and more RT and Tensor cores will surely translate to big gains in DLSS and such.

But all of that at lower clocks... I imagine that a liquid-cooled 3090 pushed to ~2GHz (42TF) will indeed provide a considerable jump from the 3080, but for air-cooled cards I guess the performance boost will be, sadly, very small. Maybe that quad-slot version from Gigabyte will have frequencies as high as AIB 3080s, but that's all I can think of.
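
For reference, that 42TF figure is just the paper math: marketed CUDA cores × 2 FLOPs per clock (one FMA) × clock speed. A quick Python sketch (my arithmetic, using Nvidia's doubled core counts, so these are peak paper numbers rather than real game throughput):

Code:
# Peak FP32 TFLOPS = CUDA cores * 2 FLOPs/clock (one FMA) * clock (GHz) / 1000.
# Uses Nvidia's marketed (doubled) core counts, so these are paper figures;
# real game throughput is lower (see the INT/FP discussion further down).
def peak_tflops(cuda_cores, clock_ghz):
    return cuda_cores * 2 * clock_ghz / 1000.0

print(peak_tflops(10496, 1.70))  # RTX 3090 stock boost -> ~35.7 TF
print(peak_tflops(10496, 2.00))  # RTX 3090 at ~2GHz    -> ~42.0 TF
print(peak_tflops(8704, 1.71))   # RTX 3080 stock boost -> ~29.8 TF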
 

M1chl

Currently Gif and Meme Champion
You're comparing wrong cards. The Titan was massively faster than the 2080.

The 3090 (Ampere's Titan), is slightly faster than the 3080.
Was it? I honestly did not play any games on that card, so I don't know. I bought it for machine learning, and there it absolutely destroyed the 2080Ti, mainly due to VRAM; however, compute was faster as well...
 

Nodial

Member
I'm wondering if those benchmarks have been taken with a stable or beta version of the drivers; this may change the results, especially considering the 3090 has not been released yet, so those drivers may not properly cover its capabilities.
 

thelastword

Banned
Meh, that rumored 20-25% performance uplift was very promising, but for a mere 10% (if not less) it's absolutely not worth double the price; it's better to just OC the 3080 and call it a day. The only way NV could make it gaming-worthy would be to allow that Ultra Performance DLSS mode to be used not just for 8K but also to upscale games to 4K from 1080p instead of 1440p, giving it quite an edge compared to the 3080/70.
You say 20-25%; I read elsewhere that some people were expecting a 30-35% uplift.......I could see a slight possibility in a very obscure VRAM-fiendish game at 8K, so outliers more or less, but I don't see any way a massive uptick can be sustained over the 3080 apart from these edge cases.....

People are saying OC your 3080, but the 3080 only did 2300MHz on LN2. On air and water, the OC headroom isn't much, and there the power draw will increase even more from its astronomical heights. Curiously, the power draw on the 3090 should be much more than the 3080 too, so OC'ing that means you need more than 750 Watts. Emtek's power rating for its 3-fan 3090 has a TDP of over 410 Watts already.....

https://www.dsogaming.com/news/emte...t-gpu-has-a-max-tdp-rating-of-over-410-watts/



What resolution are these test results at?

Wouldn't the benefit mainly be expected at 4K?
4K resolution, average fps is shown in the table....

I want some Crysis benches......TBT, Nvidia's cores don't carry the same prowess as before; they boosted the count for marketing reasons, and the performance increase is not 20% just because the 3090 has 20% more CUDA cores than the 3080..

Nah, it isn't for me; I am doing machine learning stuff, so I need a lot of memory : ) Also for crypto mining, where higher VRAM is always welcomed : )
Radeon VII's memory and bandwidth did help in many work-related scenarios. Though the rumors have died down a bit, I still believe AMD may have a card with high VRAM and bandwidth with HBM2E......Having lots of VRAM onboard is something AMD has always done. I would not be surprised if they have a 32GB card cooking......Maybe CDNA, maybe RDNA 2.


I'm still trying to wrap my head around why Nvidia is releasing such low-VRAM cards now, when in a month's time they will release 12-16-20GB variants of their GPU stack......That's really believing your customers will buy anything you put out, and they do. Some got burnt buying 2080Tis for over $1200 mere days and months ago, and now there is much more performance in a 3080 for only $700. Some want to go and spend $1500 on a 3090 when they've already heard rumors of a 3080 Super with 20GB of VRAM, which will no doubt be priced lower than a 24GB 3090......Yet there is a reason why Nvidia has high-VRAM cards ready, and it's not out of the kindness of their hearts. It's because AMD has high VRAM counts in their RDNA 2 cards.....Nvidia's first set of cards is just to get the Jensen-tattooed guys to pay for yet another Ferrari to add to his garage. Look at how angry some of them are because they can't get a 3080.......They will probably buy the 3080 now, and in a month buy the 3080 Super and sing Nvidia's praises for blessing them with double the RAM for maybe $1000....

What I think is happening is that 10GB 3080s are just for the "will buy anything Nvidia" guys; they will make bank on these guys, but during that small space of time they are focusing on manufacturing 3080S 20GB GPUs in their factories as we speak, to go against AMD in October. That's why you can't get 3080 stock; they made very little of these.....Soon Nvidia will cut off 3080 production and will have only the 12GB 3060S, the 16GB 3070S and the 20GB 3080S on the market......Could be as soon as late October, early November...
 

M1chl

Currently Gif and Meme Champion
Honestly, CUDA is the reason to go with nVidia; they have an amazing and mature SDK for the type of thing I want to do. AMD has some vague OpenCL documentation and that's it. nVidia is light years ahead in this regard.
 
What I think is happening is that 10GB 3080s are just for the "will buy anything Nvidia" guys...Soon Nvidia will cut off 3080 production...Could be as soon as late October, early November...
Is this for real? I want a 3090 just for the VRAM, but as seen with the 3080s, I'm afraid the 3090 will be an absolute waste of money. Where did you get the info on new 3080s in October/November? Commonly it has been like 6-9 months after the initial release before a new iteration is revealed. 2 months is really soon.
 

BluRayHiDef

Banned
As far as I am concerned, the RTX 3090 is the only currently viable upgrade from my 1080Ti. I'm not going from an xx80Ti to a regular xx80 with less VRAM, so I'm going to the top, baby!
 

thelastword

Banned
Imagine the people who bought that, convinced they are gonna play 4K60 ultra+ RTX for an entire gen :ROFLMAO: :ROFLMAO:
Just the other day in one of the Nvidia threads, someone was saying finally the 4K 60fps cards are here. People always say these things when the games being shown are last-gen games.....And yet not even all these last-gen games run at 4K 60fps....With only a 10% uptick in RT performance over Turing, after earlier articles claimed the RT performance gain of Ampere over Turing would be massive, it only means the Nvidia hype machine is just crazy, but too blatant for generally smart PC tinkerers not to put two and two together....So it must be the koolaid.

So when we start getting next-gen games built with shadow, lighting and reflection RT all at once, with improvements to volumetrics, shaders and effects.....I want to see how the 3080 will manage at 4K. The brute-force approach is something Nvidia has always done, but here doubling cores is not 1:1, so the performance they promised is not there; hence why they were pushing DLSS so much, because they know that at native resolutions the uptick is not that great.....8K gaming on a 3090 (fine print: through DLSS) and high VRAM which will not benefit most gamers right now......Only $1500, the more you buy, right......The 3080 10GB at $700, a card that should have had this performance and price since Turing, is now being marketed as such a great deal, because customers were offered less performance per generation with Turing at highly marked-up prices......I wish I could say I'd want to be working for NV; it's a dog-eat-dog world, and business is crude..... and maybe I'd have a huge garage of exotic cars working there, but I'd feel bad for gouging customers' loyalty like that....

Honestly CUDA is the reason to go with nVidia, they having amazing and mature SDK for this type of thing which I want to do. AMD has some vague Open CL documentation and that's it. nVidia is light years ahead in this regard.
Well, this isn't a thread to prevent you from buying what you want or to chastise you for it. It's your money; I'm just pointing out some real stats and facts and how everything is shaking out.......With the information, you do what's best for you and what suits you. People have a right to purchase anything for whatever reason they deem fit, tbh....It could be as simple as 'I love Jensen Huang's black leather jacket'......In truth I'm not being sarcastic here; people buy things for whatever suits them. An item is pink, they love the design, they have the disposable cash: it happens.....I'm just saying, looking at your justification, it's work-related reasons, so looking at some rumors and waiting a little to see how everything shakes out is never a bad thing....There will be benches on the PRO software you use, I'm sure.......
 

M1chl

Currently Gif and Meme Champion
That made me laugh, thanks : D
 

thelastword

Banned
Is this for real? I want a 3090 just for the VRAM, but as seen with the 3080s, I'm afraid the 3090 will be an absolute waste of money. Where did you get the info on new 3080s in October/November? Commonly it has been like 6-9 months after the initial release before a new iteration is revealed. 2 months is really soon.
Well, the Gigabyte leak is sound, and the @_rogame leaks are sound. With that, and looking at NV's history, where they released Supers and Tis to compete with the 5700XT, 5700, 5600XT and 5500.....I can say affirmatively that they have cards ready for when AMD launches RDNA 2, which is said to compete from the top end to the low end, unlike the highest RDNA1 SKU, which started from high-mid.
 

nosseman

Member
You're comparing wrong cards. The Titan was massively faster than the 2080.

The 3090 (Ampere's Titan), is slightly faster than the 3080.

Isn't the Ti "dead"?

The 3080 is, performance/price-wise, the new 2080 Ti - especially with the rumored 20GB memory model.

2070 - 2304 shading units.
2080 - 2944 shading units.
2080 Ti - 4352 shading units.
RTX TITAN - 4608 shading units.

3070 - 5888 shading units.
3080 - 8704 shading units.
3090 - 10496 shading units.

Perhaps the 3090 is the new "Ti" and we have yet to see the new Titan. With the 20xx series, the Titan was released a couple of months after the other cards.

Last generation the Titan was 2500 dollars, and comparable models have not become cheaper in the last 3-4 generations. Perhaps there is a "real" Titan coming soon - for 2500-3000 dollars.
 

llien

Member
4K is there, guys, because the ~2080/2080S-class consoles are targeting it as the base resolution.
Although the fps they'll target is likely 30.


3070 - 5888 shading units.
3080 - 8704 shading units.
3090 - 10496 shading units.
No, not really.
The real number of shading units is half of what's claimed.
It's just that Huang figured that since the new units can do either 1 INT + 1 FP or 2 FP ops per clock, he could claim double the number.
 

Ascend

Member
Big Navi will disappoint.
Indeed... It will disappoint the ones that bought a 3000 series card or were losing their shit because they weren't able to get one.

I think you guys are looking at it all wrong.

It's not that the 3090 is so bad but that the 3080 is so good.
We are getting within 10% of the performance of a $1500 card for only $700, amazing.:messenger_beaming:
It's the same chip. The pricing of the 3090 is atrocious.

Why does it matter what someone does with their money? There are plenty of performance cars out there, and some will spend double the money, just to get a fraction of a second on a circuit.
Because it unnecessarily hikes the prices for the ones that actually buy what they need rather than what they feel they need.
Let me put it this way... Do you worry about other people wearing a mask?

If people are buying the 3090 for the VRAM, why not just wait for the next 3080 iteration with more VRAM then? I get the feeling that this whole thing is a bait-and-switch.
This. A 20GB 3080 has pretty much been confirmed.
 

BluRayHiDef

Banned
No, not really.
The real number of shading units is half of what's claimed.
It's just that Huang figured that since the new units can do either 1 INT + 1 FP or 2 FP ops per clock, he could claim double the number.
So, in reality, the number of hardware/physical shading units in the RTX 3090 is 5,248? Is each unit larger and/or faster than a unit in the RTX Titan?
 

ZywyPL

Banned
So, in reality, the number of hardware/physical shading units in the RTX 3090 is 5,248? Is each unit larger and/or faster than a unit in the RTX Titan?

It depends on how much INT calculation there is. NV figured out that typically only about 30-35% of instructions are INT while gaming, so this was a better design choice than a fixed 1:1 ratio between INT and FP cores. So usually there are around 7-7.5k CUDA cores available for games on the 3090, and about 5.5-6k on the 3080.
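
To put rough numbers on that (just forum math, not an official spec; the 30-35% INT share is the assumption):

Code:
# If ~30-35% of a game's shader instructions are INT, those occupy issue
# slots, so roughly 65-70% of the marketed core count is left doing FP32.
# This mirrors the estimate above; it is a model, not a measured figure.
for name, marketed_cores in (("RTX 3090", 10496), ("RTX 3080", 8704)):
    for int_share in (0.30, 0.35):
        fp32_effective = marketed_cores * (1 - int_share)
        print(f"{name}: ~{fp32_effective:.0f} FP32-effective cores "
              f"assuming {int_share:.0%} INT instructions")
# RTX 3090 -> ~6,800-7,350 and RTX 3080 -> ~5,650-6,100, which lines up
# with the rough 7-7.5k and 5.5-6k figures above.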
 

llien

Member
So, in reality, the number of hardware/physical shading units in the RTX 3090 is 5,248? Is each unit larger and/or faster than a unit in the RTX Titan?
Yes. Notably, each supports 2 FP ops instead of 1 INT + 1 FP op, sort of "doubling" the TFLOPS figure. It is a far cry from having twice as many units, though.

It's been "here" for the last 2 years.
This time it's here the way 1080p has been.
 

BluRayHiDef

Banned
It depends on how much INT calculation there is. NV figured out that typically only about 30-35% of instructions are INT while gaming, so this was a better design choice than a fixed 1:1 ratio between INT and FP cores. So usually there are around 7-7.5k CUDA cores available for games on the 3090, and about 5.5-6k on the 3080.

So, does the following demonstrate what you mean?

RTX 3090's "10,496" CUDA cores / 2 = 5,248 CUDA cores

5,248 CUDA cores * 0.30 = 1,574.4 CUDA cores for INT operations

5,248 CUDA cores * 0.35 = 1,836.8 CUDA cores for INT operations

Average number of CUDA cores available per clock = 5,248 CUDA cores for FP operations + 1,574.4 or 1,836.8 CUDA cores for INT operations = 6,822.4 to 7,084.8 total CUDA cores
 

MiguelItUp

Member
Man, this was not what I was expecting at all, lmao. So clearly the 3080, or even the 3070, would make the most sense. At least until we see 3070 numbers, the 3080 it is!

Pretty wild that we all flipped out about the 3090 and it's... strong, sure, but not what we were expecting, lmao.
 

llien

Member
So, does the following demonstrate what you mean?

RTX 3090's "10,496" CUDA cores / 2 = 5,248 CUDA cores

5,248 CUDA cores * 0.30 = 1,574.4 CUDA cores for INT operations

5,248 CUDA cores * 0.35 = 1,836.8 CUDA cores for INT operations

Average number of CUDA cores available per clock = 5,248 CUDA cores for FP operations + 1,574.4 or 1,836.8 CUDA cores for INT operations = 6,822.4 to 7,084.8 total CUDA cores

No. Cores are generic programmable units. There are no "cores for int operations".
Think of them as mini CPUs.
 
I don't know much about how RAM works in video cards, but the 3080/90 use GDDR6X as opposed to the plain GDDR6 that the 2080Ti used. I don't think you can compare 11GB in a 1080Ti to 10GB in a 3080/90.

[Image: gddr6x_comparison.jpg - GDDR6X vs GDDR6 bandwidth comparison]



There's not an orders-of-magnitude difference in bandwidth. 10GB is still 10GB; it's not a lot of RAM, but it is enough, in my opinion.
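
To put numbers on "not orders of magnitude": bandwidth is just effective data rate × bus width / 8. A rough sketch using the public spec-sheet figures (numbers from memory, so double-check them):

Code:
# Memory bandwidth (GB/s) = data rate (Gbps per pin) * bus width (bits) / 8.
# Spec-sheet values; capacity is the point of "10GB is still 10GB".
cards = [
    # (name,        Gbps, bus bits, GB, type)
    ("GTX 1080 Ti", 11.0, 352, 11, "GDDR5X"),
    ("RTX 2080 Ti", 14.0, 352, 11, "GDDR6"),
    ("RTX 3080",    19.0, 320, 10, "GDDR6X"),
    ("RTX 3090",    19.5, 384, 24, "GDDR6X"),
]
for name, gbps, bus, cap, mem in cards:
    print(f"{name}: {cap}GB {mem}, {gbps * bus / 8:.0f} GB/s")
# 3080 vs 2080 Ti: ~760 vs ~616 GB/s, about 23% more -- faster memory,
# but the capacity question stays the same.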
 

ZywyPL

Banned
So, does the following demonstrate what you mean?

Yeah, almost exactly like that. Here is the (gaming) Ampere SM scheme, to better illustrate the whole INT vs FP confusion (which is still confusing, lol):

[Image: ampere_architektura.png - gaming Ampere SM diagram]




Basically, in the RTX 3090, 5248 is the fixed/guaranteed number of FP32 cores, whereas from what I found on the internet, more than half of those "INT+FP32" cores are actually occupied by INT calculations, leaving around ~2500 additional FP32 cores on top of the fixed ones.
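
In code form, that guess looks like this (again, the >50% INT occupancy is internet guesswork, not a measured figure):

Code:
# GA102 (RTX 3090): 5248 FP32-only cores are always available for FP32;
# the other 5248 dual-purpose cores issue either INT32 or FP32 per clock.
# The "more than half occupied by INT" share below is the guess from above.
FP32_ONLY = 5248
DUAL_USE = 5248

for int_occupancy in (0.50, 0.55, 0.60):
    extra_fp32 = DUAL_USE * (1 - int_occupancy)
    total = FP32_ONLY + extra_fp32
    print(f"INT occupancy {int_occupancy:.0%}: "
          f"~{extra_fp32:.0f} extra FP32 cores, ~{total:.0f} total")
# At 50-60% INT occupancy that's ~2100-2600 extra FP32 cores on top of
# the fixed 5248, in line with the ~2500 figure above.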
 

BluRayHiDef

Banned
No. Cores are generic programmable units. There are no "cores for int operations".
Think of them as mini CPUs.

What I mean by "cores for INT operations" are cores that - for a particular cycle - are executing an INT operation instead of a floating-point operation, which means each such core can be counted twice: once as a core for floating-point operations and once as a core for integer operations. This is why I added the 30% to 35% of cores that would be handling INT operations to the total number of actual, physical cores (5,248 cores).
 