
New Nvidia RTX 4000 Ada Lovelace Cards Announced | RTX 4090 ($1599) October 12th | RTX 4080 ($1199)

//DEVIL//

Member
I have an O11 Air Mini. I have a Strix 3090 in it and I still have around 3 cm of space before I hit the fans on the front (which, in the worst-case scenario, I can remove one of).

I am also using a mini-ITX Z690-I motherboard. I have like 7 slots before I hit the fan seated at the bottom of the case.

I am good with the 4090. The question is, should I buy it? Where are the games that require it? I don't care that much about ray tracing, and by the time the big games come out next year (assuming they come next year), that 4090 will probably be replaced with a 4090 Ti/Super for the same price.
 

Reallink

Member
With frame interpolation effectively indistinguishable from native in practice (i.e. rare occlusion artifacts only existing for like 8 ms) and input lag lower than rasterization can accomplish, you won't be able to buy 4090s for like 6 months to a year. 3XXX Copium snorters notwithstanding, this shit's gonna be selling out instantly at $2500+ until they release the Ti variants. There are so many middle-aged PCMRs with six-figure salaries who'll be tripping over themselves to buy these things at any price.
 
It's not an apples-to-apples comparison. DLSS 3 has DLSS 2 upscaling running at the same time in addition to frame generation, so you have to compare DLSS 2 latency to DLSS 3 latency.
I mean, I'm not going to self-impose rules on how I interpret the benefits. If DLSS 3 looks as good as native res but also runs at 2.5x the framerate while delivering more or less the same or improved input latency, that sounds like a big win to me.

Nobody should expect an interpolated image to have the same or better latency than native frames. I think in most cases this is a fine tradeoff, especially if you are playing something like a single-player game where it's not really all that important to run at a native 144 Hz and up. Most competitive games are already quite easy to run at high frames, so I think where DLSS 3 will really shine is in the ultra-demanding titles, which are generally single-player experiences.
 
I mean, I'm not going to self-impose rules on how I interpret the benefits. If DLSS 3 looks as good as native res but also runs at 2.5x the framerate while delivering more or less the same or improved input latency, that sounds like a big win to me.

Nobody should expect an interpolated image to have the same or better latency than native frames. I think in most cases this is a fine tradeoff, especially if you are playing something like a single-player game where it's not really all that important to run at a native 144 Hz and up. Most competitive games are already quite easy to run at high frames, so I think where DLSS 3 will really shine is in the ultra-demanding titles, which are generally single-player experiences.
My bad, I didn't mean to sound like I do so either, and I inherently agree. On 50-series cards I imagine the latency increase will be much smaller.
 

Killer8

Member
Frame Generation is not for latency-dependent games.
So for sure not any FPS, but if you are playing a slower "cinematic" experience, DLSS Frame Generation could be beneficial for playing at 4K or even 8K.

It will benefit the overall look of the game, sure, since it is making games look perceptually smoother.

However, I think a lot of people make the connection in their mind when it comes to frame rate talk that higher number = feels better (key word here being feels, not looks). Even if someone isn't technically minded enough to explain what a frame rate or latency even is, generally they can still tell you that a higher framerate 'feels smoother', 'better', 'more responsive', or whatever else.

I don't know if I'd go as far as calling Nvidia's marketing deceptive. It's not as if they're lying by saying that there are higher frame counts being generated by the 4000 series. But it does seem like a lie by omission to dodge the huge caveat that this type of DLSS 3-driven framerate gain is only going to be a perceptual one. It will not actually be felt in the controls like a) people typically expect, and b) the benchmarks Nvidia is touting would normally suggest.
 
Shame these fuckers are gonna sell out at $2500 probably. And holy shit have you ever seen a console generation get completely outclassed like this in two short years?
I don't believe it happened before. The PS360 era was the most impressive relative to PC hardware; with the PS4/One it took a year to get a GPU on PC with 2x or so the performance, but this gen there were roughly 2x-performance cards even before the new consoles launched. I'm glad the CPU is capable on those consoles at least, but then again, where are the games that show us incredible destruction or crazy physics simulations that take advantage of the extra CPU cycles? I've seen none of that so far in any console exclusive or multiplat.
 
When I noticed that the Founders 4090 is the same size as my 3070 Ti Suprim X I stopped worrying about the size; it fits nicely in an old Corsair Obsidian 550D 👍
No idea about the partner cards though.
Can you even get the Founders Edition in Sweden though?
 
Did you watch the video?

DLSS 2 + Reflex
DLSS 3 (which is the same DLSS 2 + Reflex, plus interpolation)

Portal: +3 ms
Spider-Man: +15 ms
Cyberpunk: +23 ms

If TVs can do 20 ms interpolation I don't see how this is revolutionary.
I'm comparing these DLSS 3 results to native.

Portal: 73 ms improvement
Native 129 ms
DLSS 3: 56 ms

Spider-Man: 1 ms improvement
Native 39 ms
DLSS 3: 38 ms

Cyberpunk: 54 ms improvement
Native 108 ms
DLSS 3: 54 ms

A TV's motion interpolation would add 20-100 ms of input lag, not to mention introduce far more noticeable artifacts.
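Sanity-checking the deltas above (the figures are copied from this post, so treat them as illustrative, not authoritative):

```python
# Latency figures quoted above, in milliseconds.
# "dlss3" = DLSS 3 frame generation enabled (which includes Reflex).
latency_ms = {
    "Portal RTX":     {"native": 129, "dlss3": 56},
    "Spider-Man":     {"native": 39,  "dlss3": 38},
    "Cyberpunk 2077": {"native": 108, "dlss3": 54},
}

# Improvement of DLSS 3 over native for each title.
for game, t in latency_ms.items():
    delta = t["native"] - t["dlss3"]
    print(f"{game}: native {t['native']} ms -> DLSS 3 {t['dlss3']} ms ({delta} ms better)")
```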
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I have an O11 Air Mini. I have a Strix 3090 in it and I still have around 3 cm of space before I hit the fans on the front (which, in the worst-case scenario, I can remove one of).

I am also using a mini-ITX Z690-I motherboard. I have like 7 slots before I hit the fan seated at the bottom of the case.

I am good with the 4090. The question is, should I buy it? Where are the games that require it? I don't care that much about ray tracing, and by the time the big games come out next year (assuming they come next year), that 4090 will probably be replaced with a 4090 Ti/Super for the same price.
Are you anywhere near 4K120 in any modern title with settings beyond peasant?
If yes, then you could probably live without a 4090, cuz you are already probably getting as much from your panel as you can.
If you aren't hitting 4K120 or thereabouts, then your CPU is probably chilling... might as well actually give it a workout and actually play at 4K120.
 
I hate to bother you guys, but you're all way smarter and better informed about this than me.

Has anyone seen performance testing of the 4090 on tensor loads? I'm interested from a ML perspective, but haven't come across anything. Thanks!
 

Crayon

Member
I hate to bother you guys, but you're all way smarter and better informed about this than me.

Has anyone seen performance testing of the 4090 on tensor loads? I'm interested from a ML perspective, but haven't come across anything. Thanks!

I think all we've got so far is some sketchy bar graphs from Nvidia.
 
Shame these fuckers are gonna sell out at $2500 probably. And holy shit have you ever seen a console generation get completely outclassed like this in two short years?
Of course. I remember buying the GeForce 8800 GTX (I think I've still got the card somewhere in my storage), which released at retail before the PlayStation 3 and wiped the absolute fucking floor with it in performance. I was playing PC videogames at 2560x1600 16 years ago.
 

Buggy Loop

Member
Choose wisely :messenger_beaming:


everyone GIF
 

Fredrik

Gold Member
Can you even get the Founders Edition in Sweden though?
No idea tbh, but the MSI 4090 Suprim X is the same length as the 3070 Ti I have. I bet there are smaller cards too; the Suprim X is usually a big boy.

The bigger issue imo is the price… I’ve seen some local prices for the 4090 cards and they’re literally like 4 Xbox Series Xs… Or a Samsung 49” G7 Odyssey Neo… Or a Marantz SR7015… Or an LG OLED Evo… Or a Gibson Les Paul…
Etc etc
 

//DEVIL//

Member
Are you anywhere near 4K120 in any modern title with settings beyond peasant?
If yes, then you could probably live without a 4090, cuz you are already probably getting as much from your panel as you can.
If you aren't hitting 4K120 or thereabouts, then your CPU is probably chilling... might as well actually give it a workout and actually play at 4K120.
I am at 4K 144, and most games don't come close to 4K 144 unless DLSS is enabled. In MW2, for example, I was able to get 4K 144 frames, but if there is big stuff going on it drops to 120 or so for a second before going back up. It's not a locked 144 frames.

By no means do I hate my 3090 (after all, I kinda got it for $900 CAD 3 months ago, which is like what, $700 US?).

Not complaining or anything. But since I have a 12th-gen i7 and 32 GB of DDR5-5600 CL36, I was thinking of going all out. While paying this much is not fun, it's not the end of the world. But when it's so much and there are no games to justify the price... that's where I'm torn. Like, who the hell cares about ray tracing in Cyberpunk when characters look like frogs??
 

GHG

Gold Member
Will get one of the 3080 Ti, 3090 or the 16GB 4080, dependent on availability and pricing at launch.

I'm not convinced the 4xxx series will prove to be a big leap over the 3xxx series when DLSS 3.0 is taken out of the equation. But I'm happy to be proven wrong because if that's the case then even the 12GB 4080 could be a possibility for me.
 

Diogodx

Neo Member
I'm comparing these DLSS 3 results to native.

Portal: 73 ms improvement
Native 129 ms
DLSS 3: 56 ms

Spider-Man: 1 ms improvement
Native 39 ms
DLSS 3: 38 ms

Cyberpunk: 54 ms improvement
Native 108 ms
DLSS 3: 54 ms

A TV's motion interpolation would add 20-100 ms of input lag, not to mention introduce far more noticeable artifacts.
You're comparing a game running at 3840x2160 with one at 1920x1080 + Reflex; of course the second option will have lower latency. It would be crazy if not.
 

thuGG_pl

Member
I want to believe that but everything so far points to interpolation

First you asked if it's just a copy of one of the frames. It's not.
And now you're talking about interpolation, which is not simply copying frames.

It is technically interpolation, because the new frame is generated from the two frames around it. That's basically the definition of interpolation.
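In the plain mathematical sense, yes. A toy sketch of the naive version, just to illustrate the definition: a straight 50/50 blend of the two surrounding frames. (DLSS 3's motion-vector/optical-flow approach is far more sophisticated, but the generated frame is still derived from its neighbours.)

```python
import numpy as np

def midpoint_frame(frame_a: np.ndarray, frame_b: np.ndarray) -> np.ndarray:
    """Naive temporal interpolation: average the two surrounding frames.
    Widen to uint16 first so the sum doesn't overflow 8-bit pixels."""
    return ((frame_a.astype(np.uint16) + frame_b.astype(np.uint16)) // 2).astype(np.uint8)

a = np.zeros((2, 2, 3), dtype=np.uint8)      # dark frame
b = np.full((2, 2, 3), 200, dtype=np.uint8)  # bright frame
mid = midpoint_frame(a, b)                   # every pixel ends up at 100
```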
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I am at 4K 144, and most games don't come close to 4K 144 unless DLSS is enabled. In MW2, for example, I was able to get 4K 144 frames, but if there is big stuff going on it drops to 120 or so for a second before going back up. It's not a locked 144 frames.

By no means do I hate my 3090 (after all, I kinda got it for $900 CAD 3 months ago, which is like what, $700 US?).

Not complaining or anything. But since I have a 12th-gen i7 and 32 GB of DDR5-5600 CL36, I was thinking of going all out. While paying this much is not fun, it's not the end of the world. But when it's so much and there are no games to justify the price... that's where I'm torn. Like, who the hell cares about ray tracing in Cyberpunk when characters look like frogs??
So there are plenty of games right now that would require more power to hit 4K120, and in those that can hit it you could push higher with DLAA?
I don't understand what your question is.

I cant hit anywhere near 4K120 in:
FarCry 6
Assassins Creed Valhalla
Borderlands 3
Control
Cyberpunk 2077
Deathloop
Forza Horizon 5
Guardians of the Galaxy
Red Dead Redemption 2 from frikken 2019

And a plethora more games.

And by not hit 4K120 I mean miles off with a 3080'10G or a 3080'12G; the 12G gets closer because of its wider memory bus, but it still ain't 4K120... and I doubt a 3090 is pulling much higher, an extra 5, maybe 10 fps... so still some ways off, no?
So where are the games that require this much power... literally everywhere.
 

I Master l

Member
First you asked if it's just a copy of one of the frames. It's not.
And now you're talking about interpolation, which is not simply copying frames.

It is technically interpolation, because the new frame is generated from the two frames around it. That's basically the definition of interpolation.
If an object is hidden in frame 1 and visible in frame 3, then the AI-generated frame 2 should know what the object looks like in between. I'm not sure if DLSS 3 is capable of that.
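That disocclusion case is exactly where a naive interpolator falls apart. A toy 1-D illustration (a plain 50/50 blend standing in for whatever DLSS 3 actually does; the values are made up):

```python
import numpy as np

# Frame 1: a bright object (value 255) is still hidden; frame 3: it's revealed.
frame1 = np.array([0, 0, 0, 0], dtype=np.float32)
frame3 = np.array([0, 255, 0, 0], dtype=np.float32)

# A blend-only interpolator has no information about what the revealed
# region really looked like, so it produces a half-brightness ghost.
frame2 = (frame1 + frame3) / 2
print(frame2)  # the "ghost" pixel sits at 127.5, neither hidden nor revealed
```

DLSS 3 uses motion vectors and an optical flow field to warp pixels instead of blending them, which handles most motion well; disoccluded regions are precisely where its rare artifacts show up.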
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Will get one of the 3080 Ti, 3090 or the 16GB 4080, dependent on availability and pricing at launch.

I'm not convinced the 4xxx series will prove to be a big leap over the 3xxx series when DLSS 3.0 is taken out of the equation. But I'm happy to be proven wrong because if that's the case then even the 12GB 4080 could be a possibility for me.
$1300 for the 4080'16G is such a tough pill to swallow when 3080s were frikken 7-800 dollars.
If I can scam sell off my 3080 for basically MSRP in the sticks then I could possibly....possibly justify spending the dough on a 4080'16 (preferably a 4080'20G) or 4090.
Otherwise I think I'm sticking with Ampere.


The 4080'12G?

Disgusting.
 

Chiggs

Member
These idiots on eBay are selling their 3090s for $500-$600... the card is still plenty capable, so why not keep it around just in case, or maybe throw it in another system?
 

Hawk269

Member
It has been a while since I built my last rig and I have been out of touch on some stuff. I have a Corsair AX1500i power supply. I know I have the wattage to run a 4090, but since the power supply was bought a few years ago, would it be compatible with the 4090?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
These idiots on eBay are selling their 3090s for $500-$600... the card is still plenty capable, so why not keep it around just in case, or maybe throw it in another system?
People are selling 3090s for 600 dollars?
Links my man, links.
Cuz at 600 dollars the 3090 is a better buy than the 900 dollar 4080'12G that it fights.
 

kiphalfton

Gold Member
With frame interpolation effectively indistinguishable from native in practice (i.e. rare occlusion artifacts only existing for like 8 ms) and input lag lower than rasterization can accomplish, you won't be able to buy 4090s for like 6 months to a year. 3XXX Copium snorters notwithstanding, this shit's gonna be selling out instantly at $2500+ until they release the Ti variants. There are so many middle-aged PCMRs with six-figure salaries who'll be tripping over themselves to buy these things at any price.

Let me guess: you don't have anything even relatively recent graphics-card-wise, have some cheapo graphics card, or are a console pleb... but still feel the need to "talk down" to RTX 3000 series owners.
 

Celcius

°Temp. member
It has been a while since I built my last rig and I have been out of touch on some stuff. I have a Corsair AX1500i power supply. I know I have the wattage to run a 4090, but since the power supply was bought a few years ago, would it be compatible with the 4090?
Yes, you should have no problems at all; the card comes with an adapter (16-pin 12VHPWR to standard 8-pin PCIe connectors).
 

LiquidMetal14

hide your water-based mammals
It's marginally interesting seeing the 8K stuff, but this won't matter for a long while. We need to master 4K120+ with (just about) everything before even eyeing some other unnecessary plateau.
 

Celcius

°Temp. member
How long do you guys think it will take the 4090 to be readily available on shelves? About 2 months?
 

GHG

Gold Member
$1300 for the 4080'16G is such a tough pill to swallow when 3080s were frikken 7-800 dollars.
If I can scam sell off my 3080 for basically MSRP in the sticks then I could possibly....possibly justify spending the dough on a 4080'16 (preferably a 4080'20G) or 4090.
Otherwise I think I'm sticking with Ampere.


The 4080'12G?

Disgusting.

Honestly I don't really care at this point, I just want double the performance of my current 2070 super and at least 12gb VRAM for ~$1000. So whatever card ultimately gets me there I'll take it. I can sell my old card for about $400 so net it's a $600 upgrade which I'll be more than happy with.

It's funny because the 2070 super was only ever meant to be a stop-gap due to the timing of my latest PC build being awkward and just a month prior to the 3xxx series releasing. Little did I know what chaos was about to happen in the GPU market at the time. So as far as I'm concerned I might as well just get on with it to get the performance I want.

How long do you guys think it will take the 4090 to be readily available on shelves? About 2 months?

Honestly, from day 1. You only need to look at how the latest CPU/motherboard releases have been received, along with the news out of Apple yesterday regarding the new iPhones, to get an idea of what is coming as far as these new GPUs are concerned.

It's not last year's market anymore, far from it.
 

Chiggs

Member
People are selling 3090s for 600 dollars?
Links my man, links.
Cuz at 600 dollars the 3090 is a better buy than the 900 dollar 4080'12G that it fights.
Which idiots? Send them my way. I'd buy a 3090 for 500-600.

About a day left on this one.

Bet it goes for a little over $600.

And here's one near me (Atlanta) going for $220. Check Craigslist, people!

https://atlanta.craigslist.org/atl/sop/d/kennesaw-nvidia-rtx-3090/7526981188.html
 

hlm666

Member
These idiots on eBay are selling their 3090s for $500-$600... the card is still plenty capable, so why not keep it around just in case, or maybe throw it in another system?
Mining is dead, so even if these are not ex-mining cards (which they are probably going to lie about anyway), they have to deal with the prices from those cards being sold off.
 

Bboy AJ

My dog was murdered by a 3.5mm audio port and I will not rest until the standard is dead
Not happy about the pricing, but OK. Jensen wants to bleed you dry. But what's up with how huge it is?? I am a small-form-factor gamer. I have a 3080 FE and it works great at two-slot size. This new card is huge! Ridiculous.

I’ll pass due to size, mainly. Plus the 12 GB 4080 really being a 4070 is just scummy.
 