
New Nvidia RTX 4000 Ada Lovelace Cards Announced | RTX 4090 ($1,599) October 12th | RTX 4080 ($1,199)

DonkeyPunchJr

World’s Biggest Weeb
Card A also has twice the VRAM. Really depends on the quality of those magically generated extra frames, doesn't it?
Yeah, I definitely want to see what the final results look like for DLSS 3.

As for VRAM, even the 10GB on the 3080 hasn’t been a bottleneck for any games AFAIK. Time will tell, but my guess is we won’t see any games where ≤12GB handicaps performance in 4K gaming over the next few years.
 

skneogaf

Member
What is the power usage difference between a 3090 Ti and a 4090?

Also, from the 3 games without DLSS, it looks like the 12GB 4080 (4070) is comparable to a 3090 Ti?
 
AMD needs to come in with nothing over 1,000 dollars.
Their 4080 competitor should be 700-800 dollars and easily match the RTX 4080 16GB.

Then we pray Nvidia amends their partner prices.

LMFAO, dictionary definition of why AMD will NEVER EVER cross 20% market share in GPUs. You want AMD to be aggressive on pricing, not so you can buy them, but so that Nvidia undercuts them after a potential online backlash.

Guess what? This has been every Nvidia fan's wishful thinking since time immemorial (if my memory serves me right, since AMD lost the crown to Fermi, with the HD 5870 being their last hurrah in the high-end single-GPU space), and NEVER has Nvidia budged on pricing, nor will they now.

RDNA3 will not outperform or trade blows with Ada, as usual, and I won't be surprised if the gulf is more than 10% this time, as Nvidia is back on TSMC.

Oh, and you really think Uncle Jensen and Niece Lisa aren't "in" on this whole pricing thing that's been going on between these two companies for a few years? Blood's thicker than water.

You WILL buy your RTX 4090 at $1,600 or 2,000 EUR and you will cope. Get ready for a $2,000 RTX 5090 two years from now.

Cry Love GIF by Pudgy Penguins
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
LMFAO, dictionary definition of why AMD will NEVER EVER cross 20% market share in GPUs. You want AMD to be aggressive on pricing, not so you can buy them, but so that Nvidia undercuts them after a potential online backlash.

Guess what? This has been every Nvidia fan's wishful thinking since time immemorial (if my memory serves me right, since AMD lost the crown to Fermi, with the HD 5870 being their last hurrah in the high-end single-GPU space), and NEVER has Nvidia budged on pricing, nor will they now.

RDNA3 will not outperform or trade blows with Ada, as usual, and I won't be surprised if the gulf is more than 10% this time, as Nvidia is back on TSMC.

Oh, and you really think Uncle Jensen and Niece Lisa aren't "in" on this whole pricing thing that's been going on between these two companies for a few years? Blood's thicker than water.

You WILL buy your RTX 4090 at $1,600 or 2,000 EUR and you will cope. Get ready for a $2,000 RTX 5090 two years from now.

Cry Love GIF by Pudgy Penguins
CUDA is king.
But I still don't want to pay over 1,000 dollars for a GPU.

I can wait for prices to "normalize"; I only play at 3440x1440 so I'm still sorted for a bit.
But cutting down GPU render times always entices me.
 

hlm666

Member
DLSS 3 might reduce power consumption, according to some info on VideoCardz: Cyberpunk uses 100 watts less power with DLSS 3 compared to native. If I've still got Cyberpunk installed I'll have a look later and see if my 3080 uses less power with DLSS 2 enabled; power use isn't something I generally looked at when playing around with DLSS.

 

Reallink

Member
I have a few questions from the GAF PC gaming elite.

1. How much do these cards cost to make? What's the mark up on these cards?

2. Will the next AMD cards due in November be comparable in power to these Nvidia cards?

3. Will AMD launch their new cards at a lower price?

Kind regards,

Somebody who wants a decent gaming PC, but doesn't want to sell a kidney.

The 4090 chip likely costs Nvidia $250-300 to produce, the 24GB of RAM $150-200, and the board/cooler/fans around $100 combined. So around $500-600 to produce a $1,600 4090. The 4080 chips are around half the size (less than half in the 12GB's case), so something like $120-170 per die, $80-120 for the RAM, and maybe $60-80 for the less robust boards/coolers. No idea how much they sell the chips to AIBs for, but based on EVGA walking away from 80% of their revenue over it, I'd have to assume it's at least 3-4x the wholesale prices I estimated above.
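
Quick sanity check on that arithmetic, using the same guesstimates as above (these are the post's estimates, not confirmed BOM figures):

# Back-of-envelope check of the estimates above (the poster's guesses, not real BOM data).
die_4090     = (250, 300)   # estimated 4090 die cost range, USD
vram_24gb    = (150, 200)   # estimated 24GB GDDR6X cost range, USD
board_cooler = (100, 100)   # estimated board + cooler + fans, USD

low  = die_4090[0] + vram_24gb[0] + board_cooler[0]    # 500
high = die_4090[1] + vram_24gb[1] + board_cooler[1]    # 600

msrp = 1600
print(f"estimated build cost: ${low}-{high}")
print(f"implied gross margin at MSRP: {1 - high/msrp:.0%} to {1 - low/msrp:.0%}")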

Wait, so now the 4070 is not only slower than a 3090 but on par with a vanilla 3080? Dafuq?

This seems a bit of a stretch.

Not a stretch; the 3090s are on average (i.e., in multi-game benchmarks) only 10-15% faster in raster than a vanilla 3080, though some games are outliers. The 12GB 4080 is shown being decidedly slower than a 3090 Ti in raster, ergo it'll be within spitting distance of a vanilla 3080 in games that don't support or don't update DLSS.
 
Last edited:

GymWolf

Member
The 4090 chip likely costs Nvidia $250-300 to produce, the 24GB of RAM $150-200, and the board/cooler/fans around $100 combined. So around $500-600 to produce a $1,600 4090. The 4080 chips are around half the size (less than half in the 12GB's case), so something like $120-170 per die, $80-120 for the RAM, and maybe $60-80 for the less robust boards/coolers. No idea how much they sell the chips to AIBs for, but based on EVGA walking away from 80% of their revenue over it, I'd have to assume it's at least 3-4x the wholesale prices I estimated above.



Not a stretch; the 3090s are on average (i.e., in multi-game benchmarks) only 10-15% faster in raster than a vanilla 3080, though some games are outliers. The 12GB 4080 is shown being decidedly slower than a 3090 Ti in raster, ergo it'll be within spitting distance of a vanilla 3080 in games that don't support or don't update DLSS.
what in the actual fuck?
 

kittoo

Cretinously credulous
I'm usually not concerned about power consumption, heat, and the size of the cards, but this shit is volcanic! I don't live in Iceland and don't need a heater in my room all year round.
 

GymWolf

Member
"Free" room warming and people are against that?

I play with the window open so the room where i have the pc during winter is usually around 10 degree celsius or lower and i have to turn up the air conditioner to keep the room from being assaulted by fucking penguins, with the gpu warming the whole room i'm just gonna cheap out on the AC, easy peasy.

Not even joking.
 
Last edited:

Buggy Loop

Member
At this point they should revise the motherboard standard and just slap RAM, CPU & SSD on the GPU. It’ll be like 5 slots by then? Then you have a GPU that is basically the PC case.

I would buy that

The sandwiched mini-ITX SFF builds are almost that already


nyln8p0ehdt41.jpg
 
Last edited:

twilo99

Member
DLSS 3 might reduce power consumption, according to some info on VideoCardz: Cyberpunk uses 100 watts less power with DLSS 3 compared to native. If I've still got Cyberpunk installed I'll have a look later and see if my 3080 uses less power with DLSS 2 enabled; power use isn't something I generally looked at when playing around with DLSS.


That actually makes sense, software is magical like that sometimes.
 
Does anyone know when the review embargo lifts?

Waiting for the DF new features showcase/review.


Still mad about the pricing... expected $900 for the real 4080.
 

OZ9000

Banned
Not a stretch; the 3090s are on average (i.e., in multi-game benchmarks) only 10-15% faster in raster than a vanilla 3080, though some games are outliers. The 12GB 4080 is shown being decidedly slower than a 3090 Ti in raster, ergo it'll be within spitting distance of a vanilla 3080 in games that don't support or don't update DLSS.
Jesus. Who the fuck are Nvidia trying to fool here?
 
Will DLSS 3.0 come to the 3000 series as well?
It's limited to the 40 series. They use a dedicated hardware unit alongside the tensor cores called the optical flow accelerator. It's what Nvidia uses to inject artificial frames between real frames to double framerates, *cough* framerate interpolation *cough*. It's basically their way of saying "Hey, you have a 30 series GPU? Well, you're fucked. You're gonna need a 40 series GPU for DLSS 3, it's exclusive." The consumer is now regarded as a massively milkable cash cow and Nvidia is the milkmaid. Only this time, they'll use "AI" to milk you instead and then claim it's gonna cost you extra.
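
For anyone who hasn't followed the frame-generation thing: the idea is literally synthesizing an extra frame between two rendered ones. A deliberately naive Python/NumPy sketch below, with the big caveat that DLSS 3 uses optical flow plus a trained model rather than blind blending; this only illustrates the frame-count math, not Nvidia's actual method.

import numpy as np

def naive_midframe(a: np.ndarray, b: np.ndarray) -> np.ndarray:
    # Crudest possible "generated" frame: a 50/50 blend of two real frames.
    # Real frame generation uses motion/optical-flow vectors; plain blending ghosts badly.
    return ((a.astype(np.float32) + b.astype(np.float32)) / 2).astype(a.dtype)

# 30 rendered frames in -> roughly double the displayed frames out
rendered = [np.random.randint(0, 256, (1080, 1920, 3), dtype=np.uint8) for _ in range(30)]
displayed = []
for a, b in zip(rendered, rendered[1:]):
    displayed += [a, naive_midframe(a, b)]
displayed.append(rendered[-1])
print(len(rendered), "->", len(displayed))   # 30 -> 59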
 
Last edited:

skneogaf

Member
I bet modders can't wait to see if they can get DLSS 3 to work on non-4000 series cards!

If I were to sell my 3-month-old 3090 Ti, how much would I be likely to get for it?
 

RoboFu

One of the green rats
How much do you expect for the actual 4070? $699 at least, I think; I'm scared to think less. And for what? The 4080 12GB is something like a 3090 judging by Nvidia's slides, so maybe a touch better than a base 3080 for a bigger MSRP? Not right.

The 4080 12GB is really a 4070. It’s pretty messed up that they named two different cards 4080 when they don’t have the same number of cores.
 

Diogodx

Neo Member
DLSS 3 might reduce power consumption, according to some info on VideoCardz: Cyberpunk uses 100 watts less power with DLSS 3 compared to native. If I've still got Cyberpunk installed I'll have a look later and see if my 3080 uses less power with DLSS 2 enabled; power use isn't something I generally looked at when playing around with DLSS.

What's probably happening is that when you activate DLSS it lowers the render resolution, so the GPU runs into the CPU limit. Since it's not being fully utilised, power goes down. Native 4K is GPU-bound, so you see the full power draw, which is 460W.

So DLSS 3 is not reducing power per se.
 

hlm666

Member
What's probably happening is that when you activate DLSS it lowers the render resolution, so the GPU runs into the CPU limit. Since it's not being fully utilised, power goes down. Native 4K is GPU-bound, so you see the full power draw, which is 460W.

So DLSS 3 is not reducing power per se.
It's using the new Overdrive RT mode, so I would be surprised if they were not still GPU-bound, but what you're saying is definitely the logical conclusion. The test was at 1440p, so native would have been 1440p and 1080p for DLSS Quality, although they don't state the DLSS mode.

edit: went and ran the Cyberpunk benchmark maxed with Psycho RT on my 3080 at 1440p; power use hit 368 watts native and 352 with DLSS on Quality. GPU use was pegged at 98% for native but did drop slightly with DLSS, to 96% at one point only, but that point wasn't the lowest I saw power go, so it might be a factor; my quick and dirty test isn't good enough to draw a conclusion though.

On a separate note, my Cyberpunk doesn't have the new Overdrive mode yet for obvious reasons, but that 4090 using the heavier Overdrive RT mode is over 2.5 times faster than my 3080 using the less demanding Psycho RT, at the same power (about 350 watts when both are using DLSS).
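
If anyone wants to repeat that kind of quick-and-dirty check in a more repeatable way, something like the sketch below logs power and utilisation while the benchmark runs. It uses the nvidia-ml-py (pynvml) bindings; the 60-second window and 1-second sample rate are arbitrary choices of mine, nothing official. Logging utilisation alongside power should also show whether the DLSS run is dropping into a CPU limit, per the discussion above.

import time
import pynvml  # pip install nvidia-ml-py

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU in the system

watts, util = [], []
for _ in range(60):                          # sample for ~60 seconds while the benchmark runs
    watts.append(pynvml.nvmlDeviceGetPowerUsage(gpu) / 1000)    # API reports milliwatts
    util.append(pynvml.nvmlDeviceGetUtilizationRates(gpu).gpu)  # % GPU utilisation
    time.sleep(1)

print(f"avg {sum(watts)/len(watts):.0f} W, peak {max(watts):.0f} W, avg util {sum(util)/len(util):.0f}%")
pynvml.nvmlShutdown()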
 
Last edited:

Celcius

°Temp. member
If you plan to get a 4090 you might want to watch this:




I just checked and the nice Corsair HX1000i (80+ Platinum, 1000W, etc.) that I bought in 2020 is ATX12V 2.4, not 3.0. Oof.
 

Pagusas

Elden Member
It’s so hard not to pull the trigger on a 4090… but I realize I have no games worthy of needing it. My 3090 is still a beast, and the games I’m looking forward to most, Starfield, Elder Scrolls 6, Witcher 3 Remastered, Witcher 4, GTA 6, Squadron and so on, are so far off that by the time any of them are out we’ll be on the 5xxx series, or at least the 4090 Ti will be on shelves.

But man do I like new shiny expensive hardware…
 

RoadHazard

Gold Member
Will there be no low/mid-range RTX cards (60, 70) this time around at all, nothing sanely priced? Or are those coming later?
 

octiny

Banned
At this point they should revise the motherboard standard and just slap RAM, CPU & SSD on the GPU. It’ll be like 5 slots by then? Then you have a GPU that is basically the PC case.

I would buy that

The sandwiched mini-ITX SFF builds are almost that already


nyln8p0ehdt41.jpg

For context.

14.1" x 6.4" x 2.95" (4.4 Liters) is the size of the new Aorus 4090 Master

11.2" x 6.9" x 3.9" (4.9 Liters) is the size of my entire Velka 5 v2 build w/ 6800 XT (Dell 2-slot OEM version) & internal PSU

hnFtaFY.jpg

j9a5nJ7.jpg

kIXXWun.jpg

QRhHulY.jpg
 
Last edited:

DonkeyPunchJr

World’s Biggest Weeb
If you plan to get a 4090 you might want to watch this:




I just checked and the nice Corsair HX1000i (80+ Platinum, 1000W, etc.) that I bought in 2020 is ATX12V 2.4, not 3.0. Oof.


Didn’t watch the video yet but Corsair has these cables that are compatible with all type 4 power supplies. Is there any reason this wouldn’t work?
 

Xellos

Member
Impressive cards but too expensive and power hungry for my ITX build. All I want is 3080-3080 ti performance with 12+GB VRAM at 200-250 watts (the lower the better) and a reasonable ($400-500) price, so hopefully Nvidia (or AMD) can deliver that next year.
 

Celcius

°Temp. member

Didn’t watch the video yet but Corsair has these cables that are compatible with all type 4 power supplies. Is there any reason this wouldn’t work?
The biggest point of the video is that ATX12V 3.0 delivers power AND data between the PSU and video card. If you have a “dumb” power supply that isn’t 3.0 compliant, then even if you use an adapter, the power supply is only delivering power, not data, between the two.
 
Last edited:

DonkeyPunchJr

World’s Biggest Weeb
The biggest point of the video is that ATX12V 3.0 delivers power AND data between the PSU and video card. If you have a “dumb” power supply that isn’t 3.0 compliant, then even if you use an adapter, the power supply is only delivering power, not data, between the two.
Okay I had a chance to watch it. From what I can tell those 4 pins are for the PSU to tell the GPU how much power it’s capable of supplying.

I’m not sure how much this would matter if you have a high-wattage PSU with a single 12V rail (which most of them do).
 

PhoenixTank

Member
The biggest point of the video is that ATX12V 3.0 delivers power AND data between the PSU and video card. If you have a “dumb” power supply that isn’t 3.0 compliant, then even if you use an adapter, the power supply is only delivering power, not data, between the two.
I'm trying to follow the first problem through and I didn't get a good feel for it from that video.
I may be reading the spec incorrectly, but it seems like if both Sense0 & Sense1 are open/not grounded then the spec calls for minimum power only as a fail-safe: 100W start-up, 150W sustained. Cool, underutilised but no magic smoke escaping today. The spec allows for combinations of open or ground to signal the power limits, simplest via not connecting pins, or... perhaps at the PSU end to do it properly? They're not allowed to change dynamically, though. I guess the problem is that the many 8-pin to 12VHPWR adapters provided with GPUs literally have no idea what the limit of your PSU actually is. If they don't connect any of these sideband pins at all, they're limited to low power. Sooo they have to ground or disconnect pins in the adapter to suit the GPU they're selling you? Which becomes a big problem if you plug that into a PSU that can't handle that wattage.
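
For reference, my understanding of how those two required sideband pins advertise the limit, as a little lookup sketch. Treat the exact wattages and which pin maps to which tier as an assumption from my memory of the spec tables, not gospel; the only case stated above is the open/open fail-safe.

# Assumed SENSE0/SENSE1 -> sustained power limit mapping for the 12VHPWR sideband
# (wattages and pin-to-tier assignment are my recollection of the spec; verify before trusting).
SENSE_TO_SUSTAINED_W = {
    ("open",   "open"):   150,  # fail-safe: 100W at start-up, 150W sustained
    ("ground", "open"):   300,
    ("open",   "ground"): 450,
    ("ground", "ground"): 600,
}

def psu_limit_watts(sense0: str, sense1: str) -> int:
    # Anything unrecognised falls back to the minimum, the way the fail-safe is meant to work:
    # underutilised, but no magic smoke.
    return SENSE_TO_SUSTAINED_W.get((sense0, sense1), 150)

print(psu_limit_watts("open", "open"))      # 150
print(psu_limit_watts("ground", "ground"))  # 600

The worry in the adapter case is the reverse direction: an adapter hard-wired to advertise 450W or 600W plugged into a PSU that can't actually deliver it.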

The PSU manufacturers that are selling a full cable that goes from PSU to 12VHPWR for existing power supplies can in theory match the sideband pin setup to the PSU the client orders, e.g. "What model do you have? OK, buy this one." As long as they don't do something stupid and the client doesn't use the cables with a different PSU, those should be safe with existing ATX 2 PSUs, as far as I can tell?
Jay is talking about data transfer, but it doesn't look crazy smart like that. While Sense0 & Sense1 are required and fairly simple, the other two pins are a bit smarter and can be dynamic, but they're also optional. Things are meant to function without catching fire even if they're not there.

Have I made a dumb assumption anywhere here? Apart from hoping that optional actually means optional and people won't be stupid?

There is the follow-up matter, at about 7 minutes in, of the many 8-pin to 12VHPWR adapters drawing unevenly over those 8-pins, which is another can of worms and probably another strike specifically against using that kind of adapter.
 

RoboFu

One of the green rats
So it looks like these cards are made to work with ATX 3 PSUs, so they could possibly melt cables on ATX 2 PSUs. Lol
 

Dirk Benedict

Gold Member
These cards are fucking huge and power hungry, also... what Jensen said... even his memes will no longer shield him from criticisms. I am waiting. I have a 3090 and I want to see how the market settles Nvidia's bullshit, before jumping on the 4090.
 

Codiox

Member
My EVGA 12GB 3080 will tide me over for a long while.

Thank you nvidia.
Same here. As long as these game devs pump out cross-gen shit that by definition needs to run on an OG Xbone from 2013 with Jaguar CPUs, we have nothing to fear.

I'm playing only on a TV and always want 60fps, but I don't care for 4K; I can easily drop down to 1600p or even 1440p and it wouldn't bother me.

We're settled for the next few years, especially with the overclocked EVGA 3080 card.
 

DenchDeckard

Moderated wildly



What a shit show all of this is. The timing of EVGA stepping away makes even more sense now, they will have seen the writing on the wall.

AMD are needed more than ever now.

Absolutely agree, what an absolute joke!

I just got an RM1000x thinking it was going to be future-proofed. Ain't no way I'm redoing my cable management; the PC is looking perfect.
 

GHG

Member
Absolutely agree, what an absolute joke!

I just got an RM1000x thinking it was going to be future-proofed. Ain't no way I'm redoing my cable management; the PC is looking perfect.

At this rate I'll just wait for 3080 Ti prices to fall further and get one of those. Don't really care for the DLSS 3.0 features.
 
Top Bottom