
NVIDIA GeForce RTX 3090 TecLab Review Leaked: 10% Faster Than RTX 3080

thelastword

Banned
TecLab appears to have gotten leaky again and published a detailed review of the RTX 3090 (spotted and compiled by Videocardz). Since the video was taken down the last time they did this, WhyCry took the liberty of compiling all the juicy bits, and according to their review the RTX 3090 is on average just 10% faster than the RTX 3080 (while being twice as expensive). While this is certainly due to some bottleneck, these are very interesting (and controversial) results.

TecLab leaks NVIDIA RTX 3090 review: 10% faster than the RTX 3080
TecLab is quickly approaching cult status in the leak scene as the husky-mask-wearing vigilantes break the usual embargo to publish juicy videos with details earlier than anyone else. This time around, the subject of the video is the anxiously anticipated RTX 3090. Spoiler alert, however: the RTX 3090 only seems capable of delivering 10% more performance than the RTX 3080 with the current state of drivers and game code.

Before we begin, the preliminaries are as follows: an Intel Core i9-10900 clocked at a solid 5 GHz was used with 32 GB of RAM at 4133 MHz. Needless to say, this is an absolutely solid configuration that comfortably exceeds what most gamers have available. A Galax HOF PRO M.2 1 TB SSD was used, and a flurry of titles was tested. While the review doesn't explicitly name the RTX 3080 and RTX 3090, it does refer to the 5,000-yuan flagship and the 10,000-yuan flagship, so it is very clear which cards are meant.

The following data was compiled by Videocardz. As we can see below, the average performance increase across a panel of 16 synthetics and titles is roughly 10% (the exact number is actually 8.8%, which is even lower) over the RTX 3080. This isn't particularly surprising in itself: the RTX 3080 already doubles the CUDA core count of the RTX 2000 series, so any software-based bottlenecks that show up there would show up on the RTX 3090 just as easily.

NVIDIA GeForce RTX 3090 vs RTX 3080 (compilation by Videocardz)
Score / 4K AVG FPS | RTX 3090 | RTX 3080 | 3090/3080
3DMark Time Spy Extreme | 9948 | 9000 | +10.5%
3DMark Port Royal | 12827 | 11981 | +7.1%
Metro Exodus RTX/DLSS OFF | 54.4 | 48.8 | +10.2%
Metro Exodus RTX/DLSS ON | 74.5 | 67.6 | +10.2%
Rainbow Six Siege | 275 | 260 | +5.8%
Horizon Zero Dawn | 84 | 76 | +10.5%
Forza Horizon | 156 | 149 | +4.7%
Far Cry | 107 | 99 | +8.1%
Assassin's Creed Odyssey | 71 | 65 | +9.2%
Shadow of the Tomb Raider RTX/DLSS Off | 91 | 83 | +9.6%
Shadow of the Tomb Raider RTX/DLSS On | 111 | 102 | +8.8%
Borderlands 3 | 67.6 | 61.3 | +10.3%
Death Stranding DLSS/RTX ON | 175 | 164 | +6.7%
Death Stranding DLSS/RTX OFF | 116 | 104 | +11.5%
Control DLSS/RTX ON | 71 | 65 | +9.2%
Control DLSS/RTX OFF | 62 | 57 | +8.8%
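For what it's worth, the quoted 8.8% average is consistent with the per-title deltas above. Here's a quick sanity check in Python (the figures are transcribed from the table; nothing here is official):

```python
# Per-title RTX 3090 vs RTX 3080 deltas (%), transcribed from the table above.
deltas = [10.5, 7.1, 10.2, 10.2, 5.8, 10.5, 4.7, 8.1,
          9.2, 9.6, 8.8, 10.3, 6.7, 11.5, 9.2, 8.8]

# Simple arithmetic mean across the 16 synthetics and games.
average = sum(deltas) / len(deltas)
print(f"Average uplift: {average:.1f}%")  # -> Average uplift: 8.8%
```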

We have also attached some screenshots of the benchmarks below, but the question then becomes: is the RTX 3090 really just 9% faster than the RTX 3080? And if so, why? The first answer that comes to mind is that the core-count increase we saw in the 3000 series is simply too big for current software stacks to handle. While the drivers have (probably) been updated to handle the massive throughput, game code and engines have to scale up to take advantage of the available processing power as well. This is somewhat like games that are optimized primarily for a single CPU core and therefore don't scale perfectly across many.
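To make that single-core analogy concrete, here is a minimal Amdahl's-law sketch. This is my own illustration rather than anything from the review, and the parallel fraction p is an assumed number chosen purely to show how a ~20% wider GPU can land at only ~10% more frames when part of the frame time doesn't scale with core count:

```python
def amdahl_speedup(p: float, s: float) -> float:
    """Overall speedup when only a fraction p of the work
    benefits from a resource speedup s (Amdahl's law)."""
    return 1.0 / ((1.0 - p) + p / s)

# Assumption: the 3090 brings ~20% more CUDA cores (s = 1.2), but only
# ~55% of frame time actually scales with core count (p = 0.55).
uplift = (amdahl_speedup(p=0.55, s=1.2) - 1.0) * 100
print(f"Expected uplift: {uplift:.1f}%")  # -> Expected uplift: 10.1%
```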

What we are seeing with the RTX 3080 and 3090 appears to be a similar problem, where the hardware is being bottlenecked by software. AMD's GPUs are usually fondly referred to as FineWine, but if my gut instinct is correct, the RTX 3000 series is going to turn out to be the biggest load of FineWine silicon the gaming world has ever seen. With the cards only delivering half of the performance promised by NVIDIA, I am fairly certain we are going to get massive incremental performance improvements pushed via software.
 
1. Yea this card isn't the best choice for gamers, at all, video production and the like? yeeee

2. There's no way in hell you would've posted this thread if it was a massive gain
2.) Have you seen his post history? This is like comparing the 2080 Ti to the RTX Titan. Different strokes for different folks. But of course thelastword thought it was a "gotcha" to PC folks.

Commence thread backfire
 

Kenpachii

Member
It's the same chip, so it was always going to be an OC'd version of it.
You buy the card for the VRAM.

Sadly, it's the only next-gen-ready GPU Nvidia has in its lineup atm.

If it has 16-24GB of HBM2e VRAM, then it's worth it. Otherwise, it's a crap investment over the 3080.

And that's exactly why they didn't release a 16gb 3080.
 

skneogaf

Member
I'm okay with 3080 performance but want more than 10GB of RAM; I don't need 24GB though, so I'm hoping a Super or Ti version of the 3080 will land between the 3080 and 3090 prices.

I'm actually fine with my RTX 2080 Ti, but I want HDMI 2.1 for my LG C9 and its 4K@120Hz G-Sync RGB 10-bit ability.
 

waylo

Banned
Sadly, it's the only next-gen-ready GPU Nvidia has in its lineup atm.
 

ancelotti

Member
What an odd choice of CPU-bottlenecked games. I think there will be greater separation in games that utilize the GPU better, but yeah, it's not great value for most gamers.
 

Aidah

Member
I'm okay with 3080 performance but want more than 10GB of RAM; I don't need 24GB though, so I'm hoping a Super or Ti version of the 3080 will land between the 3080 and 3090 prices.

I'm actually fine with my RTX 2080 Ti, but I want HDMI 2.1 for my LG C9 and its 4K@120Hz G-Sync RGB 10-bit ability.
Apparently people are having issues with 4K/120/gsync on C9 and CX. Don't know if it's on LG or Nvidia's side.
 

Bolivar687

Banned
You're better off buying the best factory-overclocked 3080 and finding out how much more room you have for overclocking beyond that. And you're still hundreds of dollars below the 3090.

It's not a Titan. It's marketed as a gaming card and will get gaming drivers. The size and TDP are not amenable to running multiples of these for production.

There's a reason why the 3090 exists and we are going to find out what it is on October 28.
 
10GB isn't enough for me to upgrade my 1080 Ti. 24GB is overkill. Something needs to exist in between these two SKUs. A 3080 Ti with 20GB and almost identical performance to the 3090, but at $1000, is the card to get. 1080 Ti brethren, hold the line. Don't give in for the false prophet 10GB babby.
 

BluRayHiDef

Banned
10GB isn't enough for me to upgrade my 1080 Ti. 24GB is overkill. Something needs to exist in between these two SKUs. A 3080 Ti with 20GB and almost identical performance to the 3090, but at $1000, is the card to get. 1080 Ti brethren, hold the line. Don't give in for the false prophet 10GB babby.
I have an EVGA GTX 1080Ti and I'm planning on jumping ship to the RTX 3090. I'm not waiting for an RTX 3080Ti.

 

Kenpachii

Member
Yup, the 10GB 3080 can't even keep up with a Series S, it's a sad state of affairs 😱💩

What do you mean by this?

PS4 games use a 3GB VRAM allocation; that is what games are designed around.
PC uses up to 6GB (RE2), maybe even higher for some games at top settings, but let's keep that out of the comparison.

Why does PC use more VRAM?
- Higher base settings (which made 4GB pretty much the high-setting standard for this generation vs 3GB on consoles).
- Higher settings that consume more VRAM.

Next generation, PC will consume 10GB of VRAM, and we know this because of:
- The 10GB VRAM allocation on the Xbox Series X.
- The PS5 could go higher, as it doesn't have a split pool.

But then you realize the following (rough math sketched below):
- The PC baseline goes up (let's say 25%, as with the PS4) for higher settings = 12GB of VRAM needed.
- Ray tracing consumes more VRAM through more aggressive settings than on consoles = more than 12GB of VRAM.
- GPU compression, which also reserves VRAM = more than 12GB of VRAM.
- Higher settings could go over that 12GB pretty darn easily, as we saw this generation, to the point of even doubling it.
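Here is a minimal sketch of that back-of-envelope math. The 10GB console allocation comes from the post above, the 25% settings bump is the poster's own figure, and the ray-tracing and compression overheads are stand-in guesses, not measured numbers:

```python
# Back-of-envelope VRAM projection following the reasoning above.
# All overhead factors are assumptions, not measured figures.
console_vram_gb = 10.0        # Xbox Series X GPU-optimal memory pool
pc_settings_bump = 1.25       # assumed ~25% extra for higher PC base settings
rt_overhead_gb = 1.5          # guess: more aggressive ray-tracing settings
compression_reserve_gb = 0.5  # guess: VRAM reserved by GPU compression

projected = console_vram_gb * pc_settings_bump + rt_overhead_gb + compression_reserve_gb
print(f"Projected next-gen PC VRAM need: ~{projected:.1f} GB")  # -> ~14.5 GB
```

Under those (debatable) inputs you land comfortably above 12GB, which is the crux of the 10GB complaint.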

This is why people wanted 16GB as a minimum for the next GPU lineup, but Nvidia being Nvidia, as scummy a company as they are, knew this and pretty much cut down the VRAM amounts for the simple reason that they don't want you to sit on a GPU for more than a generation. You need reasons to upgrade.

Why do you think the 3090 has 24GB of VRAM and not 12GB if that 10GB was more than enough? Because even Nvidia knows this. They delivered you a 3080 Ti as the 3080 but nuked the VRAM massively so that people still shell out 2080 Ti-level, if not higher, budgets to actually have a card that's next-gen ready.

But why would they not just bolt a weaker chip onto that 3080? Because they probably need the GPU performance increase versus competition like AMD to showcase that they are still the top dog.

Otherwise you would have never seen Nvidia make the following decisions:

- A high-end chip for a cheaper card; why not launch the 3080 as a 3080 Ti and charge $1,200 for it?
- A massive monstrosity of a GPU that heats up your room like nothing else.
- Insane power consumption.
- A massive card that probably doesn't fit in many builds.

They would instead have made a sleek card with, again, a minimal performance increase for maximum gains at low energy consumption and heat output, and called it a day.

This is why I say the 3070 and 3080 are not next-gen ready, but the 3090 is, and this is why most high-end PC gamers are waiting for Nvidia to drop the real 16-20GB cards.

3080 and 3070 buyers are pretty much beta testers, so devs can get their games optimized for the new architecture and Nvidia can build drivers on player feedback, so that when the real cards come out the problems are solved.
Sadly, a lot of people new to PC gaming don't know this and buy into GPUs and hardware they don't understand, just to get burned hard. The same as the people who were still paying $1,200 for 2080 Tis a few months back.
 

thelastword

Banned
I believe Nvidia made it really clear that the 3080 is their flagship gaming card (well... until they announce the 3080Ti). The 3090 is basically a Titan card that happens to not have the Titan branding.
Thing is, people still want it, thinking it's gonna give them a huge performance uptick over the 3080.....
 

skneogaf

Member
Apparently people are having issues with 4K/120/gsync on C9 and CX. Don't know if it's on LG or Nvidia's side.

Yeah, I saw that G-Sync isn't working properly, but I'm quite sure it will be fixed.

The 4K@120Hz full colour etc. is definitely what I desire.
 

zcaa0g

Banned
Going to wait to see more reviews next week. It wouldn't be shocking if the performance difference is slight when excluding the VRAM factor, but I need to see more benchmarks across different games.
 

GreatnessRD

Member
Big Navi will disappoint.
I really, really hope you're wrong. If it can get close to the 3080 and cheaper, I think a lot of people will be happy. I don't expect anything AMD put out to compete with the 3090.

I think you guys are looking at it all wrong.

It's not that the 3090 is so bad; it's that the 3080 is so good.
We are getting within 10% of a $1,500 card's performance for only $700. Amazing. 😁

In gaming, yes. If that's what you were going for. But yeah, the 3080 is a solid card.
 

BluRayHiDef

Banned
PS4 games use a 3GB VRAM allocation; that is what games are designed around.
PC uses up to 6GB (RE2), maybe even higher for some games at top settings, but let's keep that out of the comparison.

Why does PC use more VRAM?
- Higher base settings (which made 4GB pretty much the high-setting standard for this generation vs 3GB on consoles).
- Higher settings that consume more VRAM.

Next generation, PC will consume 10GB of VRAM, and we know this because of:
- The 10GB VRAM allocation on the Xbox Series X.
- The PS5 could go higher, as it doesn't have a split pool.

But then you realize the following:
- The PC baseline goes up (let's say 25%, as with the PS4) for higher settings = 12GB of VRAM needed.
- Ray tracing consumes more VRAM through more aggressive settings than on consoles = more than 12GB of VRAM.
- GPU compression, which also reserves VRAM = more than 12GB of VRAM.
- Higher settings could go over that 12GB pretty darn easily, as we saw this generation, to the point of even doubling it.

This is why people wanted 16GB as a minimum for the next GPU lineup, but Nvidia being Nvidia, as scummy a company as they are, knew this and pretty much cut down the VRAM amounts for the simple reason that they don't want you to sit on a GPU for more than a generation. You need reasons to upgrade.

Why do you think the 3090 has 24GB of VRAM and not 12GB if that 10GB was more than enough? Because even Nvidia knows this. They delivered you a 3080 Ti as the 3080 but nuked the VRAM massively so that people still shell out 2080 Ti-level, if not higher, budgets to actually have a card that's next-gen ready.

But why would they not just bolt a weaker chip onto that 3080? Because they probably need the GPU performance increase versus competition like AMD to showcase that they are still the top dog.

Otherwise you would have never seen Nvidia make the following decisions:

- A high-end chip for a cheaper card; why not launch the 3080 as a 3080 Ti and charge $1,200 for it?
- A massive monstrosity of a GPU that heats up your room like nothing else.
- Insane power consumption.
- A massive card that probably doesn't fit in many builds.

They would instead have made a sleek card with, again, a minimal performance increase for maximum gains at low energy consumption and heat output, and called it a day.
Brilliant deductive reasoning skills and explanation. Now I'm even more confident in my decision to get an RTX 3090.
 

thelastword

Banned
I really, really hope you're wrong. If it can get close to the 3080 and cheaper, I think a lot of people will be happy. I don't expect anything AMD put out to compete with the 3090.



In gaming, yes. If that's what you were going for. But yeah, the 3080 is a solid card.
CDNA cards will be the product for workstations, professional graphics and compute on the AMD side.....


As far as I can see, Nvidia has marketed the 3090 as a gaming card and many people were saying in a number of Nvidia threads that they would be getting this card for gaming.....

GEFORCE RTX 3090
THE BFGPU

The GeForce RTX™ 3090 is a big ferocious GPU (BFGPU) with TITAN class performance. It’s powered by Ampere—NVIDIA’s 2nd gen RTX architecture—doubling down on ray tracing and AI performance with enhanced Ray Tracing (RT) Cores, Tensor Cores, and new streaming multiprocessors. Plus, it features a staggering 24 GB of G6X memory, all to deliver the ultimate gaming experience.

https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

At twice the price of the 3080, that 8.8% uptick overall must be really worth it....
 
CDNA cards will be the product for workstations, professional graphics and compute on the AMD side.....


As far as I can see, Nvidia has marketed the 3090 as a gaming card and many people were saying in a number of Nvidia threads that they would be getting this card for gaming.....

GEFORCE RTX 3090
THE BFGPU

The GeForce RTX™ 3090 is a big ferocious GPU (BFGPU) with TITAN class performance. It’s powered by Ampere—NVIDIA’s 2nd gen RTX architecture—doubling down on ray tracing and AI performance with enhanced Ray Tracing (RT) Cores, Tensor Cores, and new streaming multiprocessors. Plus, it features a staggering 24 GB of G6X memory, all to deliver the ultimate gaming experience.

https://www.nvidia.com/en-us/geforce/graphics-cards/30-series/rtx-3090/

At twice the price of the 3080, that 8.8% uptick overall must be really worth it....
Why does it matter what someone does with their money? There are plenty of performance cars out there, and some people will spend double the money just to gain a fraction of a second on a circuit.
 

Siri

Banned
Apparently people are having issues with 4K/120/gsync on C9 and CX. Don't know if it's on LG or Nvidia's side.

I just saw that. 120Hz 4K G-Sync support would be the chief reason for me to replace my 2080 Ti.

Wow - I'm feeling really good about my 2080 Ti right now. It's been a great two years at 4K with this card.... and now it looks like I'll be heading into the fall with it. At least I'm ready for Cyberpunk.

That said, I really hope a fix is found soon - but how the hell did this get overlooked? I'm worried about that.
 

thelastword

Banned
I was thinking that maybe the card would shine a bit more at 4K, but it only has a 20% uptick in CUDA cores over the 3080, which we all know is not a 1:1 representation.......so perhaps that explains it. Unless you get a game that is a VRAM fiend, you will not see bigger divides than what is seen here.....Can-it-run-Crysis benches against the 3080 would have been interesting, though....

Now Nvidia is pitching this as an 8K card (DLSS, mind you), but with all the talk we heard of a huge uptick in RT cores and generally doubling performance with Ampere.....the 3090 can't even hit 60fps with ray tracing at 4K in older titles like Control and Metro......When the demanding next-gen games come, how will it fare........I'm not seeing a huge upgrade in RTX performance over Turing like they promised.....
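As a back-of-envelope check on that point: the CUDA core counts below are NVIDIA's published specs, and the 8.8% figure is the leak's reported average, so the "scaling efficiency" is just the ratio of the two:

```python
# Published CUDA core counts for both cards.
cores_3090, cores_3080 = 10496, 8704
core_uplift = cores_3090 / cores_3080 - 1.0  # ~0.206, i.e. ~20.6% more cores

observed_uplift = 0.088  # 8.8% average from the leaked review

# How much of the extra hardware actually shows up as frames.
scaling_efficiency = observed_uplift / core_uplift
print(f"Cores: +{core_uplift:.1%}, observed: +{observed_uplift:.1%}, "
      f"efficiency: {scaling_efficiency:.0%}")  # -> +20.6%, +8.8%, 43%
```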
 
I've been saying I was gonna go for a 3090, but I might just hold onto this 2080 Super we got until one of those rumored 20GB 3080 cards comes out, to be honest.
 

Shakka43

Member
So there's a good chance that the 3080 Super reaches 3090 performance at a hopefully much lower cost. I was thinking of getting a 3080, but with the pre-order fiasco, and having already pre-ordered a PS5 and an Oculus Quest 2, I think I'd better save my money for a Super/Ti card.
 

BluRayHiDef

Banned
I was thinking that maybe the card would shine a bit more at 4K, but it only has a 20% uptick in CUDA cores over the 3080, which we all know is not a 1:1 representation.......so perhaps that explains it. Unless you get a game that is a VRAM fiend, you will not see bigger divides than what is seen here.....Can-it-run-Crysis benches against the 3080 would have been interesting, though....

Now Nvidia is pitching this as an 8K card (DLSS, mind you), but with all the talk we heard of a huge uptick in RT cores and generally doubling performance with Ampere.....the 3090 can't even hit 60fps with ray tracing at 4K in older titles like Control and Metro......When the demanding next-gen games come, how will it fare........I'm not seeing a huge upgrade in RTX performance over Turing like they promised.....

Ray Tracing is extremely demanding and its implementation via consumer graphics cards is still novel. So, it's unreasonable to expect even the RTX 3090 to be able to process ray tracing at 4K while maintaining 60 frames per second. Hence, there is DLSS.

Via its Tensor cores, the RTX 3090 can render games with ray tracing at a sub-4K resolution while maintaining 60 frames per second, upscaling its output in real time with picture quality that rivals native 4K. I'd say that's a win.

Considering the enormity of 4K resolution and of subsequent resolutions (e.g. 8K, 16K, etc.), relying on brute force to render games is unwise; relying on smarts (i.e. machine learning) is requisite.
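The brute-force-versus-smarts point is easy to quantify: DLSS wins by shading far fewer pixels and reconstructing the rest. A quick illustration with standard resolutions (nothing here is card- or game-specific):

```python
# Pixel counts for common DLSS internal resolutions vs a native 4K output.
def pixels(width: int, height: int) -> int:
    return width * height

native_4k = pixels(3840, 2160)       # 8,294,400 pixels
internal_1440p = pixels(2560, 1440)  # typical DLSS Quality internal res at 4K
internal_1080p = pixels(1920, 1080)  # typical DLSS Performance internal res at 4K

print(f"4K / 1440p pixel ratio: {native_4k / internal_1440p:.2f}x")  # -> 2.25x
print(f"4K / 1080p pixel ratio: {native_4k / internal_1080p:.1f}x")  # -> 4.0x
```

Rendering internally at 1440p or 1080p means shading only 1/2.25 or 1/4 of a 4K frame's pixels, which is where the headroom for 60fps with ray tracing comes from.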
 
I think people need to remember the 3090 is also aimed at media-creation users, hence the 24GB of memory. This is kinda like Nvidia's version of the Radeon VII; what pushes up the price is the memory.
 

ZywyPL

Banned



Meh, that rumored 20-25% performance uplift was very promising, but for a mere 10% (if not less) it's absolutely not worth double the price; it's better to just OC the 3080 and call it a day. The only way NV could make it gaming-worthy would be to allow that Ultra Performance DLSS mode to be used not just for 8K but also to upscale games to 4K from 1080p instead of 1440p, giving it quite an edge over the 3080/70.
 