
New Nvidia RTX 4000 Ada Lovelace Cards Announced | RTX 4090 ($1,599) October 12th | RTX 4080 ($1,199)

//DEVIL//

Member
Looks like it does provide near double the fps.
[Chart: GeForce RTX / Nvidia Reflex Overwatch 2 performance (FPS) and system latency]
Of a 3080 at 2K. There's a 30 to 40 frame difference between the 3090 and the 3080 in this game at 2K.

If this game is anything to go by, the 3090 is on par with the 4080 12GB.

Which is in line with previous gens, and with what I said before: the 4080 12GB (aka the 4070) is about the same performance as the 3090/3090 Ti, with a 5 to 10 frame difference that won't mean anything.

Same goes for the difference between the 3070 and the 2080 Ti.

With that being said, I still want to buy the 4090 (I think). I'm motivated, but not much, because my 3090 FE is not only more than enough for me even at 4K, it also looks sexy and well built (and I got it for $800 Canadian lol).
 

GHG

Member
Worst priced card ever. $200 more and you get a much better card with 50% more performance. Seriously, the 16GB should have been $1,000 max.

Yep that's my take on it as well. Unless AMD come up with something remarkable it's looking like it's between the 3090 (if the prices fall a bit more) or the 4080 12gb for me.

Will need to see how the 12GB 4080 holds up at 4K and in VR though; I suspect the reduced bandwidth could pose issues when pushed at higher resolutions.
 
Last edited:

Fredrik

Member
Little? That's a huge upgrade. I just don't know how much of it will transfer to getting more kills or enjoying the game more. Also, we don't have 500+Hz monitors, so you won't even be able to match that monster.
It's not a huge upgrade going from a 3070 Ti to a 4080 12GB relative to the cost.
4090 is certainly a beast though, in performance, price and SIZE lol
 

//DEVIL//

Member
Yep that's my take on it as well. Unless AMD come up with something remarkable it's looking like it's between the 3090 (if the prices fall a bit more) or the 4080 12gb for me.

Will need to see how the 12GB 4080 holds up at 4K and in VR though; I suspect the reduced bandwidth could pose issues when pushed at higher resolutions.
Agreed.
The nerd in me wants a 4090. But in reality I'm really happy with the 3090 FE I got two weeks ago, with receipt and in mint condition, purchased by a gamer (not a miner) four months ago, for $800 Canadian (~$600 US?).

So I'm really happy with that card. I did a proper undervolt, and my temps don't reach 64°C in CoD with the fan curve at 40%.

I'm not in a rush, but I admit: the 4090 is really tempting.
 
Last edited:

Fredrik

Member
The Strix is 358mm long. How can it not fit in a very standard ATX case, which oftentimes comes with over 370mm of clearance?
What case is that? Lancool 2?
It says 384mm max for the graphics card. Could be a scenario where you have to kind of rotate it in to get it between the corners of the case.
 

Chiggs

Gold Member
What retailer can I walk in and buy this at? Is it worth calling Best Buy?

Order online from Best Buy and then choose ship to store. Get your profile set up in advance.

If AMD had something better or close enough to a 4090, they wouldn't be letting Nvidia sell them for 3 weeks before they even announce anything. At the very least they'd be doing played out social media marketing like teasing shadowy images of the halo card with quotes about patience being a virtue or something. Or "leaking" benchmarks.

Not so sure it's that cut and dried. AMD did announce their RDNA3 GPU reveal the day of Nvidia's 40 series press conference. Also, Nvidia played it (and is still playing it) very close to the chest with their performance figures.

What we "know" about the 7000 series?
  • Chiplet design
  • Traditional power connectors
  • One card with 24GB of RAM
  • Most likely cheaper
All of these things, at least for me, point to "wait and see."

Even if they can't touch the 4090's RT performance, they certainly will with rasterization. And if the price is right...
 
Last edited:

OZ9000

Banned
Order online from Best Buy and then choose ship to store. Get your profile set up in advance.



Not so sure it's that cut and dried. AMD did announce their RDNA3 GPU reveal the day of Nvidia's 40 series press conference. Also, Nvidia played it (and is still playing it) very close to the chest with their performance figures.

What we "know" about the 7000 series?
  • Chiplet design
  • Traditional power connectors
  • One card with 24GB of RAM
  • Most likely cheaper
All of these things, at least for me, point to "wait and see."

Even if they can't touch the 4090's RT performance, they certainly will with rasterization. And if the price is right...
November can't come soon enough. I'm excited to see what they have in store.

Or worst case scenario, I wait until the RTX 5000 series (and stick to 1080p gaming for a year lol)
 
Last edited:

Chiggs

Gold Member
November can't come soon enough. I'm excited to see what they have in store.

Or worst case scenario, I wait until the RTX 5000 series (and stick to 1080p gaming for a year lol)

I think it's smart to wait. There's nothing to lose, really. The 4090 will allegedly be in plentiful supply. Feel free to quote this a month from now when we're all in scalper hell. ;)

But yeah, I would see what AMD has to offer. This is rumored to be their best GPU in quite some time.
 
What case is that? Lancool 2?
It says 384mm max for the graphics card. Could be a scenario where you have to kind of rotate it in to get it between the corners of the case.
I always thought of something like the Fractal Design Pop Silent/Air as pretty standard. They have 380mm clearance for the GPU. Yes, installing it could prove to be a bit tricky still.
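The fit question above is just a length comparison. A quick sketch in Python, using the lengths and clearances quoted in the thread (Strix ~358mm, Lancool II ~384mm, Fractal Pop ~380mm); treat these numbers as approximations and verify against the actual spec sheets:

```python
# Quick fit check for the numbers discussed above. Card lengths and case
# clearances are the figures quoted in the thread; double-check them on
# the manufacturers' spec sheets before buying.

def gpu_fits(card_length_mm: float, clearance_mm: float,
             margin_mm: float = 10.0) -> bool:
    """True if the card fits with a small margin for cables/installation."""
    return card_length_mm + margin_mm <= clearance_mm

print(gpu_fits(358, 384))  # Strix in a case with 384mm clearance
print(gpu_fits(358, 380))  # Fractal Pop's 380mm clearance: tight but okay
print(gpu_fits(358, 360))  # 360mm clearance: doesn't fit with margin
```

The 10mm margin is an arbitrary allowance for power cables and the "rotate it in between the corners" maneuver described above.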
 

OZ9000

Banned
I think it's smart to wait. There's nothing to lose, really. The 4090 will allegedly be in plentiful supply. Feel free to quote this a month from now when we're all in scalper hell. ;)

But yeah, I would see what AMD has to offer. This is rumored to be their best GPU in quite some time.
All I ask is for FSR to match DLSS 2 in quality.

The input lag from DLSS 3 sounds like garbage.
 

gypsygib

Member
They won't. They never do.
People would have been much better off going with a 6800 vs a 3070. Sure, the 3070 has RT, but due to VRAM limitations you can't use it in most games without lowering texture settings and several others, or the performance will just be too low to make it worth it, even with DLSS. And often you could have gotten a 6800 XT for $20-50 more than a 3070 Ti, also making it a much better buy than Nvidia's price equivalent, which suffered from the same RT limitations as the 3070.

Also, as a 3070 Ti owner (it was the same price as a 3070 during the GPU shortage of 2020-2021), RT really isn't that impressive... yet. It's really just a settings upgrade, akin to going from medium to ultra, rather than a major graphical improvement, and in most cases just not worth it.

I've only had Nvidia cards, but RDNA3 could be a real contender if AMD doesn't screw it up with pricing.
 
Last edited:

LiquidMetal14

hide your water-based mammals
According to the PSU estimators on several PSU supplier sites, for what I have and adding a 4090 I would need 790W. I have a 1000W Platinum EVGA Supernova.
 

SlimySnake

Flashless at the Golden Globes
ASUS has officially lost their mind 🤣

The 4090 Strix
I really don't understand wtf is going on with these cards. They are $1,600. Why are these fucking manufacturers going with inexpensive and large cooling solutions? And why is the chip so big and power hungry anyway? Is it still on the 8nm Samsung node?

According to the PSU estimators on several PSU supplier sites, for what I have and adding a 4090 I would need 790W. I have a 1000W Platinum EVGA Supernova.
Yeah, that won't do. My 3080 12GB was pulling almost 400 watts, sometimes going over, when running Cyberpunk with an uncapped framerate, and because it also pushes the CPU pretty hard, it was pushing my entire PC to over 650 watts. Maybe if you cap your CPU at 65 watts it might be enough, but you might end up bottlenecking some of the next-gen games.
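A rough tally along those lines; the component figures below are illustrative placeholders, not measurements, except the ~400W GPU figure quoted in the post:

```python
# Rough system-draw tally like the one described above. Only the GPU figure
# comes from the post; CPU and "everything else" numbers are illustrative.
draw_w = {
    "gpu_peak": 400,               # 3080 12GB spiking in Cyberpunk
    "cpu_load": 180,               # CPU pushed hard by an uncapped framerate
    "mobo_ram_storage_fans": 70,   # rest of the system
}
total = sum(draw_w.values())
print(total)  # 650 -- in line with the >650W whole-PC figure above
```

On a 1000W unit that leaves some headroom on paper, but transient spikes (discussed later in the thread) can eat into it quickly.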
 

LiquidMetal14

hide your water-based mammals
Yeah, that won't do. My 3080 12GB was pulling almost 400 watts, sometimes going over, when running Cyberpunk with an uncapped framerate, and because it also pushes the CPU pretty hard, it was pushing my entire PC to over 650 watts. Maybe if you cap your CPU at 65 watts it might be enough, but you might end up bottlenecking some of the next-gen games.
Running a Vision 3090 now that's OC'd, so I'm doing some calculations and research.
 

Hot5pur

Member
I have a hard time understanding the hype for these cards. Is everyone gaming on 4k monitors trying to hit 90 fps?

Have you seen most modern games? They don't need these beefy cards, save for a very small handful. A 3080 will easily get you through the next 2-3 years at 1440p and probably at 4k with a few tweaks to keep things mostly on high/ultra.

We still haven't phased out last gen consoles, and the current gen hardware is limiting how much devs can push things. On PC you might get your "ultra" settings and "psycho" raytracing but most of the time these are not that noticeable.

And of course Nvidia now needs to justify the new cards with new graphics modes that chew up performance while hardly making a difference in IQ. If these prices are the new normal, a console is a no-brainer: a small compromise in FPS and IQ, but massive savings that can be put towards games instead.
 

Chiggs

Gold Member
I have a hard time understanding the hype for these cards. Is everyone gaming on 4k monitors trying to hit 90 fps?

Have you seen most modern games? They don't need these beefy cards, save for a very small handful. A 3080 will easily get you through the next 2-3 years at 1440p and probably at 4k with a few tweaks to keep things mostly on high/ultra.

We still haven't phased out last gen consoles, and the current gen hardware is limiting how much devs can push things. On PC you might get your "ultra" settings and "psycho" raytracing but most of the time these are not that noticeable.

And of course Nvidia now needs to justify the new cards with new graphics modes that chew up performance while hardly making a difference in IQ. If these prices are the new normal, a console is a no-brainer: a small compromise in FPS and IQ, but massive savings that can be put towards games instead.

I largely agree. The RTX 4090 really makes sense if you're both a gamer and a content creator.

There are a few titles where it will go beast mode, like Flight Simulator 2020, which is a sight to behold. Outside of that, the pickings are slim.
 

PhoenixTank

Member
Running a Vision 3090 now that's OC'd, so I'm doing some calculations and research.
You should be okay at 1kW tbh. The 3090 Ti has the same listed 450W TGP as the 4090, for which they recommend an 850W+ PSU. The 30-series test system did include a 10900K rather than a 5900X, so it may vary a little depending on what else you have.
It's important to pick a power supply that can soak transient spikes, though.
 

Fredrik

Member
I really don't understand wtf is going on with these cards. They are $1,600. Why are these fucking manufacturers going with inexpensive and large cooling solutions? And why is the chip so big and power hungry anyway? Is it still on the 8nm Samsung node?
Nvidia is on 4nm now, and from what I've seen in the unboxing videos the PCB is like 1/2-1/3 the size of the partner cards; the bulkiness is all about the cooling.

There is the MSI 4090 Suprim Liquid X with an AIO cooler; it's small... ish, and expensive.
But then you need to fit the radiator in the case too, and with a Ryzen 9 you might want an AIO for the CPU as well, which could end with a packed case.
 
Last edited:

twilo99

Member
People would have been much better off going with a 6800 vs a 3070. Sure, the 3070 has RT, but due to VRAM limitations you can't use it in most games without lowering texture settings and several others, or the performance will just be too low to make it worth it, even with DLSS. And often you could have gotten a 6800 XT for $20-50 more than a 3070 Ti, also making it a much better buy than Nvidia's price equivalent, which suffered from the same RT limitations as the 3070.

Also, as a 3070 Ti owner (it was the same price as a 3070 during the GPU shortage of 2020-2021), RT really isn't that impressive... yet. It's really just a settings upgrade, akin to going from medium to ultra, rather than a major graphical improvement, and in most cases just not worth it.

I've only had Nvidia cards, but RDNA3 could be a real contender if AMD doesn't screw it up with pricing.

I had a 3070 Ti and replaced it with a 6800 XT, and in my experience, for my usage, the 6800 XT is much better.
 

daninthemix

Member
I have a hard time understanding the hype for these cards. Is everyone gaming on 4k monitors trying to hit 90 fps?

Have you seen most modern games? They don't need these beefy cards, save for a very small handful. A 3080 will easily get you through the next 2-3 years at 1440p and probably at 4k with a few tweaks to keep things mostly on high/ultra.

We still haven't phased out last gen consoles, and the current gen hardware is limiting how much devs can push things. On PC you might get your "ultra" settings and "psycho" raytracing but most of the time these are not that noticeable.

And of course Nvidia now needs to justify the new cards with new graphics modes that chew up performance while hardly making a difference in IQ. If these prices are the new normal, a console is a no-brainer: a small compromise in FPS and IQ, but massive savings that can be put towards games instead.
I disagree. I've seen for years that it's basically impossible to have too much GPU power. At 4K you are always GPU limited, much more so since ray tracing was introduced.
 

Fredrik

Member
According to the PSU estimators on several PSU supplier sites, for what I have and adding a 4090 I would need 790W. I have a 1000W Platinum EVGA Supernova.
I put a new rig into PCPartPicker and ended up at around 800W. Then I watched a guide that said you should take that figure ×1.5 and also add an "Nvidia tax" of 100W to deal with power spikes and leave some headroom. Ended up at roughly 1200W 😳
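That rule of thumb works out like this (a sketch; the 1.5× factor and the 100W "Nvidia tax" are the guide's numbers quoted above, not a PSU vendor's own recommendation):

```python
# Sketch of the sizing rule described above: estimated system draw times a
# 1.5x headroom factor, plus a flat ~100W allowance for transient spikes.
# Both factors come from the guide mentioned in the post.

def recommended_psu_w(estimated_draw_w: float,
                      headroom: float = 1.5,
                      spike_allowance_w: float = 100.0) -> float:
    return estimated_draw_w * headroom + spike_allowance_w

print(recommended_psu_w(800))  # 1300.0 -- in the ~1200W ballpark quoted above
```

Applied literally to the 800W estimate it lands slightly above the "roughly 1200W" figure, which shows how aggressive that headroom rule is compared with Nvidia's own 850W+ recommendation.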
 
Last edited:

jaysius

Banned
Someone has probably already said this, but isn't the 4090 overkill? Who needs 507 fps? And yes, I understand that this is 1440p and that 4K would be lower, but presumably 4K would still be over 200 FPS.
Shhh, you're ruining their dick swinging.

The high end GPU game is really dumb.

It's because of all the stagnation in the industry.

These cards are too big, and these numbers don't have any practical value anymore.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Someone has probably already said this, but isn't the 4090 overkill? Who needs 507 fps? And yes, I understand that this is 1440p and that 4K would be lower, but presumably 4K would still be over 200 FPS.
Absolutely overkill for an eSports title.

A lot of people here have 4K120 panels. Assuming they want to run native, games like Dying Light 2, Valhalla, Far Cry 6, Watch Dogs: Legion and Cyberpunk will still have the 4090 as the bottleneck.
 
These cards are too big, and these numbers don't have any practical value anymore.
The cards were built by the partners expecting a 600W power envelope coming out of the Samsung 8nm process. With the switch to TSMC 5nm mid-game, and companies like Asus not ready to go back to the drawing board on their designs, we now have 450W cards with 600W worth of cooling. They should be very efficient and quiet.
 
Last edited:

The_Mike

I cry about SonyGaf from my chair in Redmond, WA
I know a high-end PC rig is way better and more expensive than consoles, but you could buy four Series X consoles for the cost of one of these GPUs.

Nvidia makes it hard to be a PC gamer this time around.
 

M1chl

Currently Gif and Meme Champion
I really don't understand wtf is going on with these cards. They are $1,600. Why are these fucking manufacturers going with inexpensive and large cooling solutions? And why is the chip so big and power hungry anyway? Is it still on the 8nm Samsung node?


Yeah, that won't do. My 3080 12GB was pulling almost 400 watts, sometimes going over, when running Cyberpunk with an uncapped framerate, and because it also pushes the CPU pretty hard, it was pushing my entire PC to over 650 watts. Maybe if you cap your CPU at 65 watts it might be enough, but you might end up bottlenecking some of the next-gen games.
Given that it has 5 times the transistor count of the XSX, I would say that power consumption isn't, relatively speaking, high. It's on TSMC's 4nm node. As for the cooling, the FE version has a vapor chamber; the other AIBs probably don't care enough to research new coolers. If the EVGA story is anything to go by, it's hard to make money off the high-end chips.
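The "5 times the transistor count" claim checks out against the publicly quoted die figures (AD102 ~76.3B transistors, Xbox Series X SoC ~15.3B; treat both as approximate vendor numbers):

```python
# Back-of-envelope check of the transistor-count comparison above, using
# publicly quoted approximate figures for each chip.

AD102_TRANSISTORS = 76.3e9     # RTX 4090 die
XSX_SOC_TRANSISTORS = 15.3e9   # Xbox Series X SoC

ratio = AD102_TRANSISTORS / XSX_SOC_TRANSISTORS
print(f"~{ratio:.1f}x")  # ~5.0x, matching the post
```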
 

Admerer

Member
The cards were built by the partners expecting a 600W power envelope coming out of the Samsung 8nm process. Switching to TSMC 5nm mid-game while companies like Asus weren't ready to go back to the drawing board for their designs, we are now expecting 450W cards with 600W worth of cooling. They should be very efficient and quiet.
Yeah, but still, even if I could afford the 4090 (the only one worth getting), it won't fit in my case 😂.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I know a high-end PC rig is way better and more expensive than consoles, but you could buy four Series X consoles for the cost of one of these GPUs.

Nvidia makes it hard to be a PC gamer this time around.
Don't buy a 4090?
You can get a 3060 Ti for ~$450.
 
I'll either get a founders or I won't get one for the foreseeable future.
I'd expect AMD to not only deliver traditional power connectors on their cards but also much smaller ones, maybe even a little more affordable. Or are you firmly in the "AMD is shit no matter what they produce" camp?
 

Fredrik

Member
What in the holy fucking name of our lord and savior is that?

Are they taking the piss? Do they now expect us to change the case on top of the PSU?

I'll either get a founders or I won't get one for the foreseeable future.
Lol yeah, I've been laughing for a while now over that whole video; no matter how much of a fan you are, you need to see the absurdity of it all.

Time-stamped 😄
 
Do they now expect us to change the case on top of the PSU?
It's a bit funny for sure, but case and mainboard standards should have already been updated with, e.g., an actual 2-3-4 slot mount on both ends of the GPU. Fix the length of GPUs. And maybe flip the fan orientation and do 6-7-8 slots or whatever. The high end is clearly not flat anymore, so aiming at proper CPU-like coolers would probably make it smaller in total volume and less complicated.

While the lower end should probably abandon PCIe slots altogether and shrink like many tiny PCs already do.

The market needs just two form factors: one without dGPUs/PCIe, and one acknowledging that current and future designs have clearly outgrown what AGP/PCIe was intended for.

edit: the comparison to a PS5 is actually quite telling, more about how huge the PS5 is than the 4090. 10 floppys (SoC + PSU of course) against 80 (without mobo and CPU)... I mean, relatively, the Nvidia card seems small.
 
Last edited:

GymWolf

Member
I have a hard time understanding the hype for these cards. Is everyone gaming on 4k monitors trying to hit 90 fps?

Have you seen most modern games? They don't need these beefy cards, save for a very small handful. A 3080 will easily get you through the next 2-3 years at 1440p and probably at 4k with a few tweaks to keep things mostly on high/ultra.

We still haven't phased out last gen consoles, and the current gen hardware is limiting how much devs can push things. On PC you might get your "ultra" settings and "psycho" raytracing but most of the time these are not that noticeable.

And of course now Nvidia needs to justify the new cards with new graphics modes that chew up performance to justify these new cards, while hardly making a different in IQ. If these prices are the new normal a console is a no brainer for a small compromise in FPS and IQ but massive savings that can be put towards games instead.
I want to play everything at 4K60 with ultra/high settings, and the minimum requirement to do that is a 3090/6900, and you probably still need to tinker with settings on some heavy games like vanilla Cyberpunk and its next DLC, or broken, unoptimized stuff like Starfield.

If games get even heavier in the next 1-3 years, then a 3090 is not gonna be enough anymore.

We always act like the newest GPU is overkill for gaming, and every fucking gen we eat our words because broken console ports or heavy-ass games keep being released.

The 4000/7000 series is for people who want to be future-proof for the next 1-2 years while playing at the highest standards.

And let's not even start talking about people who want to play at 4K120 or 144... overkill my ass.
 
Last edited: