
New Nvidia RTX 4000 Ada Lovelace Cards Announced | RTX 4090 ($1,599) October 12th | RTX 4080 ($1,199)

FireFly

Member
If nvidia is trying to stall the mid-tier 40 cards to clear out the 30's, doesn't that mean amd has an opportunity to get mid-tier 7000's out as fast as they can? A 7600xt up against 3060's would be a great position, right? Would nvidia be forced to release the 4060/70?
That is if AMD themselves don't have stock to clear. The 7600 XT (Navi 33) is rumoured to only be releasing next year anyway.
 

Crayon

Member
That is if AMD themselves don't have stock to clear. The 7600 XT (Navi 33) is rumoured to only be releasing next year anyway.

That's a good point. You just hear about nvidia when it comes to mad overstock. Haven't specifically heard that amd doesn't have the same problem, even on a smaller scale. They must have fewer chips on hand because their marketshare is so much smaller, but it could still be an issue.

Anyone know how much more popular nvidia was for mining? That would surely be a factor if their "marketshare" with miners was a lower percentage than that of actual gamers.
 

lukilladog

Member
I wonder how difficult the 4090 will be to get at launch

Let's see. Miners are mourning crypto, nvidia was trying to cut chip orders at TSMC two months ago, $1,000+ cards are still a niche market, most people think they've got plenty of gpu power for console ports, and high-end 3000-series miner cards are starting to flood second-hand markets... it now all depends on how hard Nvidia goes with scarcity tactics.

P.S. And let's not forget about inflation :messenger_grinning_sweat:
 

OZ9000

Banned
Rising GPU prices here to stay.

"The idea that the chip is going to go down in price is a story of the past."
During the Q&A session, Jensen Huang was asked about GPU prices. His response was very telling.
"Moore's Law is dead. […] A 12-inch wafer is a lot more expensive today. The idea that the chip is going to go down in price is a story of the past," said Nvidia CEO Jensen Huang.
 

peronmls

Member
Why do they keep using the same design every generation?

Because consumers are idiots and will buy it regardless.
 

Rbk_3

Member
Let's see. Miners are mourning crypto, nvidia was trying to cut chip orders at TSMC two months ago, $1,000+ cards are still a niche market, most people think they've got plenty of gpu power for console ports, and high-end 3000-series miner cards are starting to flood second-hand markets... it now all depends on how hard Nvidia goes with scarcity tactics.

You would think if supply isn't an issue they would want to get as many 4090s out there as possible before AMD's announcement, no? Creating artificial stock issues seems foolish.
 
Rising GPU prices here to stay.

The pricing strategy is an insult to consumers; it has little to do with rising GPU costs. It's pretty clear that Nvidia want to clear out their 30 Series stock first.

Nvidia's hot and loud approach for Lovelace is also backfiring on them. The die size for the 4090 is around 600 mm², while Navi 31 is rumoured to be around 350 mm², almost half the size.

AMD are at a massive advantage here to price their cards much more competitively than Nvidia, but we'll have to wait and see.
 
You would think if supply isn't an issue they would want to get as many 4090s out there as possible before AMD's announcement, no? Creating artificial stock issues seems foolish.
Rumors are the 4090 was done in August and could have been released already. Probably plenty of stock available in October.
 

lukilladog

Member
You would think if supply isn't an issue they would want to get as many 4090s out there as possible before AMD's announcement, no? Creating artificial stock issues seems foolish.

They have plenty of marketshare, and their company shares need a little boost from a new product that "everybody wants".
 

GreenAlien

Member
That AI texture enhancement stuff is wild. I wonder if it will only work with games that are very moddable to begin with, like Morrowind. If it works for a wide range of games that could really be a killer feature.
 

twilo99

Member
They had the opportunity with RDNA 2, and they had their cards priced $50 cheaper than Nvidia's but with missing features. Not to mention that AMD's MSRP was a joke; they wanted to stop producing reference cards for a reason, they did not want to sell at those prices, and the AIBs, for the same cooler solution as Ampere, were sometimes $100-$150 more.

I'm not even sure they would be ready to ramp up to supply increased demand on the scale that Nvidia sees. The 3090 alone sold more than the entire 6000 series.

[image: GPU sales chart]


Say AMD thinks they have a winner on their hands and somehow attractive pricing. To ramp up production to compete (and it has to be ready for launch day, since ramping up takes time and otherwise it would already be the next generation), they would be at risk of having millions of cards go unsold, since the competition can always change its price too.

I really doubt we would get an AMD "nice guy" moment with a competitive card for >$100 less. Zen was a desperate move, an on-the-brink-of-bankruptcy, do-or-die situation. GPUs don't seem to be their core business, and they seem fine getting 1-2% of the userbase.

Eye-opening stats
 

twilo99

Member
No worries, you can always settle for AMD's offering.

I'm on my second AMD card for the last 3 years and I really don't think I'm missing out on anything major. Perhaps it depends on the games you play...

Unfortunately price wise they will probably be very similar.
 

Ironbunny

Member
I'm on my second AMD card for the last 3 years and I really don't think I'm missing out on anything major. Perhaps it depends on the games you play...

Unfortunately price wise they will probably be very similar.

You ain't missing anything. Take this from a guy with a 3080.
 

OZ9000

Banned
If DLSS 3 adds more latency, then it sounds worse than DLSS 2.

Why can't I find the latency video anywhere on YouTube? Did Nvidia delete it?
 

//DEVIL//

Member
Yes, I meant the 4070 in 4080 12GB disguise
Uhh, where does it say the 4080 12GB is less powerful than the 3090/3090 Ti?

Different architecture. Just because the 3090 has more CUDA cores doesn't make it better.

Not defending the 3090 (I have that card and I love it), but we just don't have numbers. And even if the 3090 is more powerful, with 3rd-gen ray tracing and DLSS 3 vs DLSS 2 on the 3090… I don't have high hopes.
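(A quick back-of-envelope sketch in Python of why core counts alone don't settle it. The boost clocks are approximate spec-sheet figures, and peak TFLOPS ignores bandwidth, cache, and architectural differences, so treat it as paper math only.)

```python
# Peak FP32 throughput = 2 FLOPs per FMA * shader cores * clock (GHz) / 1000.
# Paper math only: real-game performance depends on far more than peak TFLOPS.

def peak_fp32_tflops(cuda_cores: int, boost_ghz: float) -> float:
    """Peak FP32 TFLOPS assuming one FMA (2 FLOPs) per core per clock."""
    return 2 * cuda_cores * boost_ghz / 1000

print(f"RTX 3090      (10496 cores @ ~1.70 GHz): ~{peak_fp32_tflops(10496, 1.70):.1f} TFLOPS")
print(f"RTX 4080 12GB (7680 cores @ ~2.61 GHz):  ~{peak_fp32_tflops(7680, 2.61):.1f} TFLOPS")
# ~35.7 vs ~40.1: fewer cores but a much higher clock puts the 4080 12GB ahead on paper.
```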
 

twilo99

Member
I listened to a few podcasts and I guess some of the price increases can be attributed to TSMC raising prices.

One good thing about the new cards is that they are no longer using Samsung!


You ain't missing anything. Take this from a guy with a 3080.

I'm sure ray tracing looks much better on certain titles, but it hasn't really bothered me much.

I think if you are a Twitch streamer Nvidia is almost a must-have because of their software, at least that's what I've heard.

The 3080 is a very nice card tho. Are you going to upgrade to the non-cut-down 4080?

I think I'll stick with my 6800 XT for a while longer.
 

winjer

Gold Member
Uhh, where does it say the 4080 12GB is less powerful than the 3090/3090 Ti?

Different architecture. Just because the 3090 has more CUDA cores doesn't make it better.

Not defending the 3090 (I have that card and I love it), but we just don't have numbers. And even if the 3090 is more powerful, with 3rd-gen ray tracing and DLSS 3 vs DLSS 2 on the 3090… I don't have high hopes.

That's true. But nvidia didn't say anything about improvements to the shader units.
And they always make a point of talking about what's new.
Which is what they did when they went on about the new RT cores and tensor cores, as well as all their features.
But the shaders are probably the same as Ampere's.
The only real difference is the L2 cache size, but considering the reduction in memory bus width, it might end up the same.
 

Ironbunny

Member
I listened to a few podcasts and I guess some of the price increases can be attributed to TSMC raising prices.

One good thing about the new cards is that they are no longer using Samsung!




I'm sure ray tracing looks much better on certain titles, but it hasn't really bothered me much.

I think if you are a Twitch streamer Nvidia is almost a must-have because of their software, at least that's what I've heard.

The 3080 is a very nice card tho. Are you going to upgrade to the non-cut-down 4080?

I think I'll stick with my 6800 XT for a while longer.

I have a mental cap at €999 = $999, so the 4000 series is out of the question for me. I always turn RTX off :) Waiting to see what AMD has to offer at that 999 price. Nvidia's screen capture system is kinda neat, yeah.
 

Loxus

Member
Uhh, where does it say the 4080 12GB is less powerful than the 3090/3090 Ti?

Different architecture. Just because the 3090 has more CUDA cores doesn't make it better.

Not defending the 3090 (I have that card and I love it), but we just don't have numbers. And even if the 3090 is more powerful, with 3rd-gen ray tracing and DLSS 3 vs DLSS 2 on the 3090… I don't have high hopes.
Right here.
[image: 4080 12GB vs 3090 Ti benchmark comparison]

Note:
This is without 4000-Series features.
 

//DEVIL//

Member
That's true. But nvidia didn't say anything about improvements to the shader units.
And they always make a point of talking about what's new.
Which is what they did when they went on about the new RT cores and tensor cores, as well as all their features.
But the shaders are probably the same as Ampere's.
The only real difference is the L2 cache size, but considering the reduction in memory bus width, it might end up the same.
L2 cache will make a huge difference.
Right here.
[image: 4080 12GB vs 3090 Ti benchmark comparison]

Note:
This is without 4000-Series features.
The 3090 Ti is barely beating it. If anything, in the 2 games you didn't highlight the 3090 Ti is getting destroyed.

And that's without the DLSS 3 advantage and the better ray tracing on the new cards.

Your 3090 and mine are about 4060 Ti level now lol
 

Valonquar

Member
I mainly play FFXIV on PC so I guess I can milk my 2080 another year or more. Maybe when they release 7.0 with "graphical updates" a beefier card will matter more.
 

Celcius

°Temp. member
Nvidia says falling GPU prices are ‘a story of the past’:
https://www.digitaltrends.com/computing/nvidia-says-falling-gpu-prices-are-over/

Nvidia also did a Q&A with Reddit yesterday:
Q: What makes the GeForce RTX 4080 16 GB and 12 GB graphics cards keep the same “4080” name if they have completely different amounts of CUDA cores and are different chips?

Nvidia: "The GeForce RTX 4080 16GB and 12GB naming is similar to the naming of two versions of RTX 3080 that we had last generation, and others before that. There is an RTX 4080 configuration with a 16GB frame buffer, and a different configuration with a 12GB frame buffer. One product name, two configurations.

The 4080 12GB is an incredible GPU, with performance exceeding our previous generation flagship, the RTX 3090 Ti and 3x the performance of RTX 3080 Ti with support for DLSS 3, so we believe it’s a great 80-class GPU. We know many gamers may want a premium option so the RTX 4080 16GB comes with more memory and even more performance. The two versions will be clearly identified on packaging, product details, and retail so gamers and creators can easily choose the best GPU for themselves."

Oof
 

GHG

Member
The 4080 12GB is an incredible GPU, with performance exceeding our previous generation flagship, the RTX 3090 Ti and 3x the performance of RTX 3080 Ti with support for DLSS 3, so we believe it’s a great 80-class GPU.

I wish they would cut the bullshit here.

Just give us an indication without DLSS muddying the waters and be done with it.

Not every game uses DLSS and I certainly don't like using it in VR in most titles that support it.

At least reviews will tell us what we need to know, but this obfuscation in the meantime is ridiculous.
 

//DEVIL//

Member
Nvidia says falling GPU prices are ‘a story of the past’:
https://www.digitaltrends.com/computing/nvidia-says-falling-gpu-prices-are-over/

Nvidia also did a Q&A with Reddit yesterday:
Q: What makes the GeForce RTX 4080 16 GB and 12 GB graphics cards keep the same “4080” name if they have completely different amounts of CUDA cores and are different chips?

Nvidia: "The GeForce RTX 4080 16GB and 12GB naming is similar to the naming of two versions of RTX 3080 that we had last generation, and others before that. There is an RTX 4080 configuration with a 16GB frame buffer, and a different configuration with a 12GB frame buffer. One product name, two configurations.

The 4080 12GB is an incredible GPU, with performance exceeding our previous generation flagship, the RTX 3090 Ti and 3x the performance of RTX 3080 Ti with support for DLSS 3, so we believe it’s a great 80-class GPU. We know many gamers may want a premium option so the RTX 4080 16GB comes with more memory and even more performance. The two versions will be clearly identified on packaging, product details, and retail so gamers and creators can easily choose the best GPU for themselves."

Oof
Jensen .. bend over .. panties down …. Shove the biggest 4090 you can find bish.
 

lmimmfn

Member
Having bought a 780 Ti, 980 Ti, and 1080 Ti on release and skipped the 2000 series, I had the cash for a 3090 but couldn't get one with the mining etc., and I wasn't going to be ripped off.
Instead of keeping that cash for the 4000 series, I opted to replace my media centre with a lower-power one and get a Steam Deck, so I rarely have to turn on my power-hungry PC.

I'm out of this nonsensical GPU rat race. I will buy a PS5 before I pay those nvidiot prices.
 

Reallink

Member
The pricing strategy is an insult to consumers; it has little to do with rising GPU costs. It's pretty clear that Nvidia want to clear out their 30 Series stock first.

Nvidia's hot and loud approach for Lovelace is also backfiring on them. The die size for the 4090 is around 600 mm², while Navi 31 is rumoured to be around 350 mm², almost half the size.

AMD are at a massive advantage here to price their cards much more competitively than Nvidia, but we'll have to wait and see.

The raw chip price is largely inconsequential given the end-product pricing we're talking about here. A 600 mm² 4090 chip likely only costs Nvidia somewhere between $250 and $300 each. This means that if AMD and Nvidia for some reason both sold their final product at cost, AMD could only undercut them by $130 or so, an amount that is effectively meaningless to someone looking for a $1600+ GPU to put in their $3000+ system. It'd be nowhere close to $130 in reality though, because Nvidia's >90% larger economy of scale would easily offset AMD's pitiful volume pricing. In other words, regardless of what AMD does, Nvidia can within the hour come close enough to their price for it not to matter. The 4080s are much smaller, and any kind of die size advantage is proportionally diluted.
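(As a rough sanity check on that per-die cost estimate, here's a back-of-envelope sketch in Python. The ~$17,000 TSMC N4/N5 wafer price is a rumoured figure and the yield rates are assumptions; none of this is confirmed by Nvidia or TSMC.)

```python
import math

# Dies per 300 mm wafer, using the standard approximation:
# usable wafer area divided by die area, minus edge losses.
def dies_per_wafer(die_area_mm2: float, wafer_diameter_mm: float = 300) -> int:
    r = wafer_diameter_mm / 2
    return int(math.pi * r**2 / die_area_mm2
               - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

wafer_cost = 17_000            # USD, rumoured N4/N5 wafer price (assumption)
die_area = 600                 # mm^2, approximate AD102 size

n = dies_per_wafer(die_area)   # ~90 candidate dies per wafer
for y in (1.0, 0.7, 0.6):      # assumed yield rates
    print(f"yield {y:.0%}: ~${wafer_cost / (n * y):,.0f} per good die")
# yield 100%: ~$189, 70%: ~$270, 60%: ~$315 -- in the ballpark of $250-300,
# and selling cut-down dies (as the 4090 does) pushes effective yield up.
```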
 

PhoenixTank

Member
The raw chip price is largely inconsequential given the end-product pricing we're talking about here. A 600 mm² 4090 chip likely only costs Nvidia somewhere between $250 and $300 each. This means that if AMD and Nvidia for some reason both sold their final product at cost, AMD could only undercut them by $130 or so, an amount that is effectively meaningless to someone looking for a $1600+ GPU to put in their $3000+ system. It'd be nowhere close to $130 in reality though, because Nvidia's >90% larger economy of scale would easily offset AMD's pitiful volume pricing. In other words, regardless of what AMD does, Nvidia can within the hour come close enough to their price for it not to matter. The 4080s are much smaller, and any kind of die size advantage is proportionally diluted.
Apparently the defect density on N4/N5 is pretty good, but unless it is very nearly perfect, yields will have a significantly disproportionate effect on monolithic dies vs several chiplets of the same total area.
I would've thought moving from GA102's 628 mm² on Samsung 8nm to AD102's 608 mm² on TSMC N4 makes for a massive chip comparatively on the new node.
I have no idea whether AMD are going for two chiplets or four or even more, but that leaves a little room to scale down for non-halo models that compete with AD103 and AD104.
Of course, most/all of the chips Nvidia are selling to consumers aren't perfect dies, which will help them out too. Standard practice and all, while AMD have to deal with increased assembly costs to put the chiplets together.
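(To put rough numbers on the monolithic-vs-chiplet point, here's a sketch in Python using the simple Poisson yield model Y = e^(-D*A). The defect density is an assumed N5-class figure, not a published TSMC number.)

```python
import math

# Poisson yield model: fraction of defect-free dies given defect density D
# (defects per cm^2) and die area A (converted from mm^2 to cm^2).
def poisson_yield(area_mm2: float, d0_per_cm2: float) -> float:
    return math.exp(-d0_per_cm2 * area_mm2 / 100)

D0 = 0.10   # assumed defects/cm^2 for an N5-class node
print(f"608 mm^2 monolithic die: {poisson_yield(608, D0):.0%} defect-free")   # ~54%
print(f"150 mm^2 chiplet:        {poisson_yield(150, D0):.0%} defect-free")   # ~86%
# Defects hit the big monolithic die disproportionately hard, which is why
# harvesting imperfect dies as cut-down SKUs matters so much for Nvidia.
```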

I think I will be sticking out this generation for as long as my 1080Ti can hold on but if nothing else I'm keen to see how this all stacks up in 6 weeks or so.
 

Reallink

Member
Apparently the defect density on N4/N5 is pretty good, but unless it is very nearly perfect, yields will have a significantly disproportionate effect on monolithic dies vs several chiplets of the same total area.
I would've thought moving from GA102's 628 mm² on Samsung 8nm to AD102's 608 mm² on TSMC N4 makes for a massive chip comparatively on the new node.
I have no idea whether AMD are going for two chiplets or four or even more, but that leaves a little room to scale down for non-halo models that compete with AD103 and AD104.
Of course, most/all of the chips Nvidia are selling to consumers aren't perfect dies, which will help them out too. Standard practice and all, while AMD have to deal with increased assembly costs to put the chiplets together.

I think I will be sticking out this generation for as long as my 1080Ti can hold on but if nothing else I'm keen to see how this all stacks up in 6 weeks or so.

Right, the 4090 is already built from "reject" cut-down AD102s. The perfect dies are all going to the inevitable Ti/Titan.
 

hlm666

Member
How much of a concern is the reduced bandwidth on the gimped 4080?
It's got much more L2 cache, so a comparison to Ampere won't be accurate; the cache will increase effective bandwidth to some degree, like AMD's large L3 cache does. Going to need 3rd-party benchmarks to get an idea of how that and the extra frequency play out.
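(Rough illustration: every L2 hit is a DRAM access that never happens, so effective bandwidth scales roughly like raw bandwidth divided by the miss rate. The bus widths and 21 Gbps data rate below are the announced specs; the hit rates are made-up illustrative values, not measurements.)

```python
# Toy model: if a fraction `hit` of accesses are served from L2, DRAM only
# sees the misses, so the card behaves as if it had raw / (1 - hit) bandwidth.

def effective_bandwidth(raw_gbs: float, hit_rate: float) -> float:
    return raw_gbs / (1 - hit_rate)

raw_4080_12gb = 192 / 8 * 21    # 192-bit bus @ 21 Gbps -> 504 GB/s
raw_3090ti = 384 / 8 * 21       # 384-bit bus @ 21 Gbps -> 1008 GB/s

for hit in (0.0, 0.3, 0.5):     # illustrative L2 hit rates
    print(f"hit rate {hit:.0%}: 4080 12GB acts like "
          f"~{effective_bandwidth(raw_4080_12gb, hit):.0f} GB/s "
          f"(3090 Ti raw: {raw_3090ti:.0f} GB/s)")
# At a 50% hit rate the 504 GB/s card matches the 1008 GB/s card's raw figure.
```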
 

IDKFA

I am Become Bilbo Baggins
I have a few questions for the GAF PC gaming elite.

1. How much do these cards cost to make? What's the mark up on these cards?

2. Will the next AMD cards due in November be comparable in power to these Nvidia cards?

3. Will AMD launch their new cards at a lower price?

Kind regards,

Somebody who wants a decent gaming PC, but doesn't want to sell a kidney.
 

THE DUCK

voted poster of the decade by bots
I mean, it might be different if they were suddenly offering graphics at a whole other level, but the games don't seem to be there yet. So the more I think about it, the less sense this pricing seems to make.
 

Mister Wolf

Gold Member
I have a few questions for the GAF PC gaming elite.

1. How much do these cards cost to make? What's the mark up on these cards?

2. Will the next AMD cards due in November be comparable in power to these Nvidia cards?

3. Will AMD launch their new cards at a lower price?

Kind regards,

Somebody who wants a decent gaming PC, but doesn't want to sell a kidney.

They will be similar in raster performance but most likely fall behind in ray tracing and image upscaling techniques. If you do not value either of those two, then an AMD card is fine. Keep in mind that AMD is not our friend and will price their lineup of cards just as high as Nvidia's.
 

IDKFA

I am Become Bilbo Baggins
They will be similar in raster performance but most likely fall behind in ray tracing and image upscaling techniques. If you do not value either of those two, then an AMD card is fine. Keep in mind that AMD is not our friend and will price their lineup of cards just as high as Nvidia's.

So AMD's next offering won't perform as well, but also won't be any better in terms of pricing?!

That's a shame. I really want to get a gaming rig, but I don't want to get a bang-average GPU. I want a card that's going to be future-proof and last many years. However, I'm not willing to spend the sort of money Nvidia are asking for.
 

Mister Wolf

Gold Member
So AMD's next offering won't perform as well, but also won't be any better in terms of pricing?!

That's a shame. I really want to get a gaming rig, but I don't want to get a bang-average GPU. I want a card that's going to be future-proof and last many years. However, I'm not willing to spend the sort of money Nvidia are asking for.

Look at the release prices of their previous generation of cards, and the generation before that, to get your answer. Every generation people try to pretend AMD might be willing to undercut Nvidia on prices, and it hasn't happened yet. If you want a good GPU, there will be plenty of 3000 series cards available new and used on the market at cheap prices. These cards are perfectly fine. If you want the latest and greatest from Nvidia, then you gotta pay the cost to be the boss.

These are great prices:

https://www.amazon.com/dp/B099ZCG8T5/?tag=neogaf0e-20

https://www.amazon.com/dp/B097DTK1F6/?tag=neogaf0e-20

https://www.amazon.com/dp/B09CML48LD/?tag=neogaf0e-20

Tons of used 3070 Ti/3080/3080 Ti/3090/3090 Ti on eBay at bargain prices from sellers that will provide some sort of warranty.
 

IDKFA

I am Become Bilbo Baggins
Look at the release prices of their previous generation of cards, and the generation before that, to get your answer. Every generation people try to pretend AMD might be willing to undercut Nvidia on prices, and it hasn't happened yet. If you want a good GPU, there will be plenty of 3000 series cards available new and used on the market at cheap prices. These cards are perfectly fine. If you want the latest and greatest from Nvidia, then you gotta pay the cost to be the boss.

These are great prices:

https://www.amazon.com/dp/B099ZCG8T5/?tag=neogaf0e-20

https://www.amazon.com/dp/B097DTK1F6/?tag=neogaf0e-20

https://www.amazon.com/dp/B09CML48LD/?tag=neogaf0e-20

I wouldn't even say those are great prices.

When the 3080 first launched, it started at £649 in the UK. That's a reasonable price for a top-of-the-range GPU.

The 4080 is launching in the UK for £1269. That's an absolutely insane difference! Even the 4080 12GB (which, let's be honest, is the 4070) costs £949.

I can't justify those costs, so unless the price comes down, I'm out of the PC gaming market.
 

Mister Wolf

Gold Member
I wouldn't even say those are great prices.

When the 3080 first launched, it started at £649 in the UK. That's a reasonable price for a top-of-the-range GPU.

The 4080 is launching in the UK for £1269. That's an absolutely insane difference! Even the 4080 12GB (which, let's be honest, is the 4070) costs £949.

I can't justify those costs, so unless the price comes down, I'm out of the PC gaming market.

High-end PC gaming is now like buying a sports car. Not for most people. The consoles are fine for the masses.
 