
New Nvidia RTX 4000 Ada Lovelace Cards Announced | RTX 4090 ($1599) October 12th | RTX 4080 ($1199)

DenchDeckard

Moderated wildly
When I noticed that the Founders 4090 is the same size as my 3070 Ti Suprim X I stopped worrying about the size; it fits nicely in an old Corsair Obsidian 550D 👍
No idea about the partner cards though.

Are you sure that's right? The 4090 is bigger than any other card; it's bigger than a 3090 Ti.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
About a day left on this one.

Bet it goes for a little over $600.

And here's one near me (Atlanta) going for $220. Check Craigslist, people!

https://atlanta.craigslist.org/atl/sop/d/kennesaw-nvidia-rtx-3090/7526981188.html
200 dollars for a 3090?
Sus as fuk.



But looks like 700 dollars for a 3090 is actually a thing.
God damn.......sell a 3080'10G just to get a 3090?
Decisions, decisions.

This is like the mining craze all over again for me.
When the LHR chips dropped I swapped a 3070FHR for a 3080LHR.
 

GymWolf

Member
Aren't second-hand GPUs too risky?

You have no idea how they were treated, the amount of crazy overclocking they had to sustain, how many FurMark tests they had to endure, etc.


On the other hand, prices are still highway robbery here in Europe...
 
Last edited:

Bojji

Member
Aren't second-hand GPUs too risky?

You have no idea how they were treated, the amount of crazy overclocking they had to sustain, how many FurMark tests they had to endure, etc.


On the other hand, prices are still highway robbery here in Europe...

There is always a risk that there's a scammer on the other side, but I've bought tons of used GPUs in my life and all of them worked correctly. The only GPU that ever died on me (a GTX 680) was brand new 🤣
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Aren't second-hand GPUs too risky?

You have no idea how they were treated, the amount of crazy overclocking they had to sustain, how many FurMark tests they had to endure, etc.


On the other hand, prices are still highway robbery here in Europe...
Realistically, as long as they weren't mining you should be okay.
Nvidia's boost tech doesn't allow overclocks to melt GPUs anymore.
vBIOS modding could fuck them up, but the number of people vBIOS modding is pretty low; you'd be incredibly unlucky to buy one.
But the concern is with the memory chips, cuz they can and will run hot forever if a GPU was mining 24/7, and that will reduce their lifespan and could even kill a module.
 

winjer

Gold Member
But the concern is with the memory chips, cuz they can and will run hot forever if a GPU was mining 24/7, and that will reduce their lifespan and could even kill a module.

And it gets even worse with cards that use GDDR6X, as these run much hotter than GDDR6 modules.
 

OZ9000

Banned
Aren't second-hand GPUs too risky?

You have no idea how they were treated, the amount of crazy overclocking they had to sustain, how many FurMark tests they had to endure, etc.


On the other hand, prices are still highway robbery here in Europe...
I have purchased three used GPUs (GTX 970, RTX 2070 Super, RTX 2080 Ti) and all of them have worked well without issues.

I think there is greater risk with the 3000 series, however. So many used 3080s are flooding eBay, and I'm absolutely certain the vast majority have been used for mining. I will often ask directly whether the card has been used for mining and be met with zero response.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I have purchased three used GPUs (GTX 970, RTX 2070 Super, RTX 2080 Ti) and all of them have worked well without issues.

I think there is greater risk with the 3000 series, however. So many used 3080s are flooding eBay, and I'm absolutely certain the vast majority have been used for mining. I will often ask directly whether the card has been used for mining and be met with zero response.
Pre-mining and GDDR6X, it was a non-issue buying second-hand GPUs.

GTX 500 series and under could be dicey, with overclocks and heat actually fucking up GPUs eventually, but otherwise you were pretty safe from there onwards.
I personally killed a GTX 570 with overclocks and heat: random artifacts and rain showers -> green lines everywhere -> nothing but green -> dead.
It was old anyway, and it had been my OC beast for years already, so I don't feel bad for it; he went down like a champ.

In my home country, something like 90% of our LAN meet PCs would have second-hand GPUs from the West or Asia (I'm originally from SADC).
So yeah, before mining started burning memory components it was pretty safe; Nvidia's protections made it nigh impossible to kill a GPU without actually going out and trying.

Now even RX 570s can come with faulty memory modules....the 3000 series, especially cards with GDDR6X, are particularly vulnerable to owners not giving a fuck about the temperatures of the memory modules and cooking them.
 

poodaddy

Gold Member
Pre-mining and GDDR6X, it was a non-issue buying second-hand GPUs.

GTX 500 series and under could be dicey, with overclocks and heat actually fucking up GPUs eventually, but otherwise you were pretty safe from there onwards.
I personally killed a GTX 570 with overclocks and heat: random artifacts and rain showers -> green lines everywhere -> nothing but green -> dead.
It was old anyway, and it had been my OC beast for years already, so I don't feel bad for it; he went down like a champ.

In my home country, something like 90% of our LAN meet PCs would have second-hand GPUs from the West or Asia (I'm originally from SADC).
So yeah, before mining started burning memory components it was pretty safe; Nvidia's protections made it nigh impossible to kill a GPU without actually going out and trying.

Now even RX 570s can come with faulty memory modules....the 3000 series, especially cards with GDDR6X, are particularly vulnerable to owners not giving a fuck about the temperatures of the memory modules and cooking them.
Damn, the nostalgia. My EVGA GTX 570 was the first card I ever RMA'd (eventually I did a 980 as well down the road), and I RMA'd it because of my own irresponsible overclocking causing issues. I didn't realize other people had issues with the same card; I figured I was just new to everything and made silly mistakes lol.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Damn, the nostalgia. My EVGA GTX 570 was the first card I ever RMA'd (eventually I did a 980 as well down the road), and I RMA'd it because of my own irresponsible overclocking causing issues. I didn't realize other people had issues with the same card; I figured I was just new to everything and made silly mistakes lol.
The good ol' days of overclocking for tangible gains.
I half miss them, but at the same time it's good that you no longer have to work to effectively get the most out of your silicon.
 

Rbk_3

Member
LoL... they are crazy......PC gaming is dead for me.
Long live consoles!!!

You can build a PC that will shit on the consoles for under $1200. Just because they are releasing expensive next-gen top-end cards doesn't mean PC gaming is dead. I just built a 12400 / 3060 Ti machine similar to the one below for my buddy and it does 140-180 FPS in Warzone at 1440p. The best you will do on consoles is 1080p 120.

 
You can build a PC that will shit on the consoles for under $1200. Just because they are releasing expensive next-gen top-end cards doesn't mean PC gaming is dead. I just built a 12400 / 3060 Ti machine similar to the one below for my buddy and it does 140-180 FPS in Warzone at 1440p. The best you will do on consoles is 1080p 120.

This

We know these consoles have a GPU in them really no better than a 2060 Ti, and the CPU is roughly the same as an R7 3700X. Just make sure you build a PC at that level or above, and you'll play every single game with ease for the entire console generation since, no matter what, at the very worst that hardware level will be the baseline.

Edit: So maybe, after looking, an R7 2700X + RTX 2080 as your baseline will put you on par with consoles, and anything above is gravy.
 
Last edited:

sachos

Member
Leaving the pricing debacle and the scummy 4080 12GB aside, this is AMAZING tech! Those Portal gains are insane. There is a weird hate bandwagon going around every time they announce new tech; the same happened with RTX and DLSS.

This tech will be critical in the future for reaching strobe-less, CRT-like motion quality on sample-and-hold displays.

I wonder what the average consumer's reaction will be once they start testing this tech for themselves; will people be able to tell the input lag increase or spot the artifacts? I tried really hard to catch them in their video and was able to see some of them, but most of the time it looked like the real deal. The worst I saw was some shimmer around Peter when walking around that stadium floor (this could have been some SSR artifacts too?) and when swinging around at night in one of the last clips.
 
Last edited:

yamaci17

Member
This

We know these consoles have a GPU in them really no better than a 2060 Ti, and the CPU is roughly the same as an R7 3700X. Just make sure you build a PC at that level or above, and you'll play every single game with ease for the entire console generation since, no matter what, at the very worst that hardware level will be the baseline.

Edit: So maybe, after looking, an R7 2700X + RTX 2080 as your baseline will put you on par with consoles, and anything above is gravy.
I agree with you on some points, but bragging about 8 GB GPUs is not going to be a healthy position going forward.

8 GB GPUs are in a problematic situation and next-gen games will only rub more salt in the wound. After seeing how Forza, Spider-Man and other DX12 titles interact with GPUs, I can safely say that I cannot take 8, 10 or 12 GB at face value going forward. Most games will only use a maximum of 80-85% of your total VRAM budget. Think of 8 GB as 6.4 GB (the 80% rule), 10 GB as more like 8 GB, and 12 GB as roughly 9.6 GB. Considering both consoles can FULLY allocate an entire 10 GB as video memory to games (the remaining ~6 GB is for system RAM + CPU operations), there's no way in hell 8 GB GPUs will be able to "keep up" with consoles in terms of fidelity, especially texture quality, going forward.

On the CPU side, there are already a lot of games where even a 3700X cannot match PS5-equivalent performance, so a 2700X is a lopsided comparison at this point.

It is not ideal to build PCs that merely "match" the consoles in current games. Things will go super sour once actual next-gen games hit the market; both VRAM and CPU requirements will skyrocket.

The only way to properly future-proof yourself for console-equivalent texture quality going forward will be to have an 11-12 GB GPU. With anything less, you will have to make huge compromises on texture quality for mere improvements in VRAM consumption, which defeats the whole purpose of having a lot of grunt (such as a 3070 Ti).
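For illustration, here is the arithmetic that rule of thumb implies (a rough sketch of the 80% heuristic described above; the fraction is the poster's assumption, not a measured figure):

```python
# Rough sketch of the "80% rule" for usable VRAM described above.
# The 0.80 fraction is the poster's heuristic, not a measured figure.

def usable_vram_gb(total_gb, usable_fraction=0.80):
    """Estimate how much of a card's VRAM games can realistically use."""
    return total_gb * usable_fraction

for total in (8, 10, 12, 16):
    print(f"{total:>2} GB card -> ~{usable_vram_gb(total):.1f} GB realistically usable")

# Consoles, by contrast, can dedicate a full ~10 GB to video memory,
# which is why an 8 GB card (~6.4 GB usable) struggles to match console textures.
```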
 
I agree with you on some points, but bragging about 8 GB GPUs is not going to be a healthy position going forward.

8 GB GPUs are in a problematic situation and next-gen games will only rub more salt in the wound. After seeing how Forza, Spider-Man and other DX12 titles interact with GPUs, I can safely say that I cannot take 8, 10 or 12 GB at face value going forward. Most games will only use a maximum of 80-85% of your total VRAM budget. Think of 8 GB as 6.4 GB (the 80% rule), 10 GB as more like 8 GB, and 12 GB as roughly 9.6 GB. Considering both consoles can FULLY allocate an entire 10 GB as video memory to games (the remaining ~6 GB is for system RAM + CPU operations), there's no way in hell 8 GB GPUs will be able to "keep up" with consoles in terms of fidelity, especially texture quality, going forward.

On the CPU side, there are already a lot of games where even a 3700X cannot match PS5-equivalent performance, so a 2700X is a lopsided comparison at this point.

It is not ideal to build PCs that merely "match" the consoles in current games. Things will go super sour once actual next-gen games hit the market; both VRAM and CPU requirements will skyrocket.

The only way to properly future-proof yourself for console-equivalent texture quality going forward will be to have an 11-12 GB GPU. With anything less, you will have to make huge compromises on texture quality for mere improvements in VRAM consumption, which defeats the whole purpose of having a lot of grunt (such as a 3070 Ti).
There will never be a single game that comes out for the PS5 and Series X that you won't be able to play on a 2700X/2080, even if it looks or performs better on the console, which in most cases probably isn't going to be the case. You are looking at this from a purely technical perspective and wanting like-for-like performance, whereas I am looking at it from the perspective that having that as your baseline means you will have a good experience moving forward.
 

lukilladog

Member
Pre-mining and GDDR6X, it was a non-issue buying second-hand GPUs.

GTX 500 series and under could be dicey, with overclocks and heat actually fucking up GPUs eventually, but otherwise you were pretty safe from there onwards.
I personally killed a GTX 570 with overclocks and heat: random artifacts and rain showers -> green lines everywhere -> nothing but green -> dead.
It was old anyway, and it had been my OC beast for years already, so I don't feel bad for it; he went down like a champ.

In my home country, something like 90% of our LAN meet PCs would have second-hand GPUs from the West or Asia (I'm originally from SADC).
So yeah, before mining started burning memory components it was pretty safe; Nvidia's protections made it nigh impossible to kill a GPU without actually going out and trying.

Now even RX 570s can come with faulty memory modules....the 3000 series, especially cards with GDDR6X, are particularly vulnerable to owners not giving a fuck about the temperatures of the memory modules and cooking them.

The 3000 series throttles when it reaches unsafe temperatures, and miners in general would rather improve the cooling or reduce clocks and voltages to keep profits high. The number one killer of cards is thermal cycling, I think, and mining cards did not see much of that lol; I think most 3000-series mining cards will be fine.

PS: That being said, I would only buy from a local seller who lets me test the memory with MSI Afterburner, or from an online marketplace that offers a full refund (like eBay or MercadoLibre).
 
Last edited:

DenchDeckard

Moderated wildly
Yeah, I double checked the spec sheets on the Nvidia and MSI websites and the 4090 is actually smaller than my 3070 Ti, unless they're lying.

4090 Founders


3070Ti Suprim X

Wow, that's crazy. Maybe the Suprim is just huge. I have a 3080 Gaming Trio X and I don't think it's that big. I'll double check :) Thanks for the heads up.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The 3000 series throttles when it reaches unsafe temperatures, and miners in general would rather improve the cooling or reduce clocks and voltages to keep profits high. The number one killer of cards is thermal cycling, I think, and mining cards did not see much of that lol; I think most 3000-series mining cards will be fine.

PS: That being said, I would only buy from a local seller who lets me test the memory with MSI Afterburner, or from an online marketplace that offers a full refund (like eBay or MercadoLibre).
Reducing the core clocks and voltage doesn't do anything to prevent the memory from running hot; hell, at some point you can't even reduce the memory clocks further, cuz then you are losing more hashrate than it's worth to even run the card.
And a lot of manufacturers don't put thermal pads on the backplate to help cool the memory modules.
Except for, I think, EVGA, pretty much all GPU partners use the core temp to dictate what the GPU as a whole does, so the memory could be melting itself but the core is "fine", so the GPU assumes everything is all good.
Thermal cycling can kill the core; GDDR6X literally just fries itself from running at those high temps for so long, even sustained.
Common "knowledge" in mining circles was that it's fine at ~100 degrees......but no one....literally no one knows how long the memory modules are happy at such high temps.
So that's why mining GPUs are hit and miss on the memory modules; the next time they hit high temps could be the last.
You could just be gaming for an hour or whatever and that's the straw that broke the camel's back.
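To make that hashrate trade-off concrete, here is a tiny sketch with purely hypothetical numbers (the coin earnings, hashrates, wattages and power cost are all assumptions for illustration, not real mining figures):

```python
# Illustrative only: every constant below is a made-up assumption, not real data.
# The point: dropping memory clocks cools GDDR6X but also cuts hashrate, and past
# some point the card earns so little it isn't worth running at all.

POWER_COST_PER_KWH = 0.15        # USD per kWh (assumed)
EARNINGS_PER_MHS_PER_DAY = 0.02  # USD earned per MH/s per day (assumed)

def daily_profit(hashrate_mhs: float, board_power_w: float) -> float:
    revenue = hashrate_mhs * EARNINGS_PER_MHS_PER_DAY
    power_cost = board_power_w / 1000 * 24 * POWER_COST_PER_KWH
    return revenue - power_cost

# hypothetical memory-clock settings: (label, hashrate in MH/s, board power in W)
for label, mhs, watts in [("stock mem clock", 100, 300),
                          ("-500 MHz", 85, 270),
                          ("-1500 MHz", 45, 240)]:
    print(f"{label:16s}: ~${daily_profit(mhs, watts):+.2f}/day")
```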
 

Fredrik

Member
Wow, that's crazy. Maybe the Suprim is just huge. I have a 3080 Gaming Trio X and I don't think it's that big. I'll double check :) Thanks for the heads up.
Yup, so now you get why I'm not too worried 🙃 It's not that there's lots of space left, but I just have an old Corsair Obsidian 550D, so I can't see why people with new cases would have issues. Unless new cases are smaller?

Anyway, instead of burning the money on stocks or some other crap that doesn't give me much fun, I've decided to do some sort of upgrade. Just don't know what yet. Pootin is destroying the world anyway, so I might just as well jump in big this time on a complete PC.

One thought: if this launch won't decrease Nvidia's market share, then what will? I'm a big fan, hardly unbiased, but even I think the pricing this time is absolute insanity.
 

lukilladog

Member
Reducing the core clocks and voltage doesn't do anything to prevent the memory from running hot; hell, at some point you can't even reduce the memory clocks further, cuz then you are losing more hashrate than it's worth to even run the card.
And a lot of manufacturers don't put thermal pads on the backplate to help cool the memory modules.
Except for, I think, EVGA, pretty much all GPU partners use the core temp to dictate what the GPU as a whole does, so the memory could be melting itself but the core is "fine", so the GPU assumes everything is all good.
Thermal cycling can kill the core; GDDR6X literally just fries itself from running at those high temps for so long, even sustained.
Common "knowledge" in mining circles was that it's fine at ~100 degrees......but no one....literally no one knows how long the memory modules are happy at such high temps.
So that's why mining GPUs are hit and miss on the memory modules; the next time they hit high temps could be the last.
You could just be gaming for an hour or whatever and that's the straw that broke the camel's back.

I would not worry about chips running near Tjmax for 1 or 2 years just because some miner got a lemon lol (that is like 1/5th to 1/10th of industry standards); maybe if you want to use your card for 10 years or something lol.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I would not worry about chips running near Tjmax for 1 or 2 years just because some miner got a lemon lol (that is like 1/5th to 1/10th of industry standards); maybe if you want to use your card for 10 years or something lol.
Okay.


On Topic
If my predictions and the leaks, which so far have been correct in terms of chips, CUDA and VRAM, hold up, the "baby" AD102 is much better value than even the 4080'16G.
Assuming it's 200 dollars more expensive, you are getting a lot more performance from that chip than from the 16G.
Going up another 200 to the 4090, however, isn't netting you that much more performance.
I think I can hold out for a while till we see if that 20G ever becomes a thing.



AD102-300 - $1600
AD102-250 - $1400
AD103-300 - $1200
AD104-400 - $900
AD106-300 - What are you doing....go buy a 3080
AD106-210 - You just really hate 3070s?
 

Reallink

Member
https://www.newegg.com/asus-geforce-rtx-4090-rog-strix-rtx4090-o24g-gaming/p/N82E16814126593

I had forgotten that the non FE cards would be more expensive. Yep… my 3090 lives on for now.

Looks like we found the price mandate EVGA threw their hissy fit about; Nvidia are clearly requiring AIBs to sell at least one model at FE MSRP. There's no chance in hell they're doing this of their own accord. They'll probably ship one unit of those and tens of thousands of the $1999 ones. They probably also placed a 4090 price cap at $1999 for the AIBs' esoteric bullshit.
 
Last edited:

KungFucius

King Snowflake
Looks like we found the price mandate EVGA threw their hissy fit about; Nvidia are clearly requiring AIBs to sell at least one model at FE MSRP. There's no chance in hell they're doing this of their own accord. They'll probably ship one unit of those and tens of thousands of the $1999 ones. They probably also placed a 4090 price cap at $1999 for the AIBs' esoteric bullshit.
They have the ___ model at MSRP, then the ____ OC model for $1699. I forget the 3000 series details, but the OC always tends to be minimal, like 50 or 100 MHz, and the non-OC ones were never available. It is bullshit that only really works if demand > supply and everyone colludes. It's stupid and only makes the FEs more popular. I mean, if they thought their tiny OC was really marketable, why wouldn't they make the base model the OC one and say, look, this thing runs 3% faster than the FE at the same price? So I suspect the mandate is that the MSRP ones also need to run at spec.

Odd. I have been stalking Best Buy and people are still buying 3090 Tis at $1100. They were in stock yesterday and not now. FFS people, wait a few weeks for the good 4080.
 

OZ9000

Banned
I'm seeing some YouTube vids on how these are not performing as well as Nvidia claimed??
In the absence of DLSS 3.0, yes, the performance increase isn't too impressive.

I don't think the new cards represent good value for money, personally. I think it's worth waiting for November to see AMD's offering.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm seeing some YouTube vids on how these are not performing as well as Nvidia claimed??
DLSS 3 Frame Generation is what Nvidia was using to get those 2-4x claims.

The 4080'12G is almost certainly going to have strict embargoes on its reviews, cuz it's not going to look that impressive.
 

yamaci17

Member
Yeah, you can shave off GPU-bound input lag and bring overall latency down,
but that also works for normal DLSS 2. That's why, in the end, in a situation like this:

say you have 60 fps at 75% GPU load, you will have the most optimal input lag possible. Interpolating to 120 fps in this case will actually increase input lag in a noticeable way.

The only situations where interpolated 120 fps will have the same lag as 60 fps are situations where you have heavy GPU-bound input lag.

And that comparison is skewed; you can always eliminate GPU-bound input lag regardless of what tech you use.

So if you knew how to reduce input lag beforehand at any arbitrary framerate, then frame interpolation will always net you an increase in input lag.
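A minimal back-of-the-envelope sketch of that argument (an assumed, simplified model, not Nvidia's actual pipeline): interpolation has to hold the newest real frame back roughly one real-frame interval so the generated in-between frame can be shown first, so once GPU-bound lag is already eliminated, interpolation can only add latency:

```python
# Simplified latency model (an assumption for illustration, not Nvidia's real pipeline).
# With frame generation, the newest real frame is held back ~1 real-frame interval
# so an interpolated frame can be displayed between two real frames.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

real_fps = 60
native_lag = frame_time_ms(real_fps)                  # ~16.7 ms (already GPU-unbound)
framegen_lag = native_lag + frame_time_ms(real_fps)   # held-back frame adds ~1 interval

print(f"native 60 fps        : ~{native_lag:.1f} ms")
print(f"interpolated 120 fps : ~{framegen_lag:.1f} ms (smoother motion, higher input lag)")
```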
 
Last edited:

01011001

Banned
DLSS 3 Frame Generation is what Nvidia was using to get those 2-4x claims.

The 4080'12G is almost certainly going to have strict embargoes on its reviews, cuz it's not going to look that impressive.

I think everyone, including the reviewers, should just agree to stop calling that card the 4080 12GB and simply call it what it really is: the 4070.
That alone would instantly put into perspective how overpriced it is.
 
Last edited:
Shame these fuckers are gonna sell out at $2500 probably. And holy shit have you ever seen a console generation get completely outclassed like this in two short years?
Yes. Every time? Especially when we are talking about GPUs that cost as much as $1599 while the entire console costs $500.

Back in the day we also had PC AAA games that actually made proper use of all this hardware, unlike today.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I think everyone, including the reviewers, should just agree to stop calling that card the 4080 12GB and simply call it what it really is: the 4070.
That alone would instantly put into perspective how overpriced it is.
I fully agree.
We should just call it a 4070 so we don't have to keep distinguishing between it and the true 4080.

Tinfoil hat time!!!!!!
The reason there is no 4080'12G Founders Edition is because Nvidia had already made the chassis with 4070 badges on them.
When greed took over, they released a new vBIOS to AIBs that has the 4070 show up as a 4080'12G.
They can't fit the AD104 chip and PCB in the 4080 chassis, so no Founders Edition.

The rumored 10GB card that was supposed to be the 4060 Ti still has an AD104, so it shares a PCB with the AD104 in the 4080'12G, and it will have a Founders Edition cuz they don't have to change the badging on the chassis; they will just repurpose the ones that were supposed to go to the AD104-400.
 

GymWolf

Member
DLSS 3 Frame Generation is what Nvidia was using to get those 2-4x claims.

The 4080'12G is almost certainly going to have strict embargoes on its reviews, cuz it's not going to look that impressive.
I'm fucking sorry? So now reviews can't even share results if they are bad?

Isn't the whole point of a GPU review to have precise benchmarks?
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm fucking sorry? So now reviews can't even share results if they are bad?

Isn't the whole point of a GPU review to have precise benchmarks?
If you don't think your product is gonna be hot, put an embargo on releasing the results.
The reviewers will review it either way, but at least you can save face for a day or 2 during your launch window.

Hell, Digital Foundry already have the FPS results for the RTX 4090 vs 3090.......they just aren't allowed to tell us what those numbers are until whenever the embargo lifts.
 

GymWolf

Member
If you don't think your product is gonna be hot, put an embargo on releasing the results.
The reviewers will review it either way, but at least you can save face for a day or 2 during your launch window.

Hell, Digital Foundry already have the FPS results for the RTX 4090 vs 3090.......they just aren't allowed to tell us what those numbers are until whenever the embargo lifts.
Oh OK, I was thinking of an embargo as something that hides negative results.
 

01011001

Banned
I fully agree.
We should just call it a 4070 so we don't have to keep distinguishing between it and the true 4080.

Tinfoil hat time!!!!!!
The reason there is no 4080'12G Founders Edition is because Nvidia had already made the chassis with 4070 badges on them.
When greed took over, they released a new vBIOS to AIBs that has the 4070 show up as a 4080'12G.
They can't fit the AD104 chip and PCB in the 4080 chassis, so no Founders Edition.

The rumored 10GB card that was supposed to be the 4060 Ti still has an AD104, so it shares a PCB with the AD104 in the 4080'12G, and it will have a Founders Edition cuz they don't have to change the badging on the chassis; they will just repurpose the ones that were supposed to go to the AD104-400.
 

Reallink

Member
Okay.


On Topic
If my predictions and the leaks, which so far have been correct in terms of chips, CUDA and VRAM, hold up, the "baby" AD102 is much better value than even the 4080'16G.
Assuming it's 200 dollars more expensive, you are getting a lot more performance from that chip than from the 16G.
Going up another 200 to the 4090, however, isn't netting you that much more performance.
I think I can hold out for a while till we see if that 20G ever becomes a thing.



AD102-300 - $1600
AD102-250 - $1400
AD103-300 - $1200
AD104-400 - $900
AD106-300 - What are you doing....go buy a 3080
AD106-210 - You just really hate 3070s?

The 4080s are both going to bomb catastrophically at these prices; there's no way they stay there once the 3XXX inventory starts to clear or Ti variants release. The 12GB will drop to like $599 and the 16GB to $899, then the Ti will take the $1099 or $1199 slot.
 
Last edited: