
NVIDIA launches GeForce RTX 40 SUPER series starting January 27: $999 RTX 4080S, $799 RTX 4070 TiS and $599 RTX 4070S

amigastar

Member
I really wish AMD were a serious competitor in terms of market share. nVidia do whatever they damn well please and the uber graphics whores are bending over and taking everything with enthusiasm.
I've had Nvidia graphics cards since 2000, so 24 years now. Never really cared for AMD or ATI cards.
 

Celcius

°Temp. member
I really wish AMD were a serious competitor in terms of market share. nVidia do whatever they damn well please and the uber graphics whores are bending over and taking everything with enthusiasm.
Same... I haven't cared about Radeon since they were owned by ATI. They used to be really cool but now their hardware is seen as the budget option and their software is constantly behind nvidia's innovations. Their ray tracing performance is a generation behind as well.

I miss ATI:
(image: r420angle.jpg)
 

ShirAhava

Plays with kids toys, in the adult gaming world
Great upgrade if you are coming from pascal or before

I'm sticking with my 2070 for at least a year longer
 
And so continues the absolute trash value proposition that is PC gaming. Components are more expensive than entire consoles, used to run unoptimized ports that perform worse despite better hardware. Sticking with my 1070 forever at this rate. AMD, where you at? Can you at least pretend to care and offer something competitive???
 

CuNi

Member
My guy… you good?

He is right, tho.
While the 4090 is undeniably a beast and a monster in terms of performance, the GPU market is in a horrible state.
I remember the days when an XX70 was 300-400€ with great performance, and the only thing better was the XX80. Now the 70-tier already starts at 550-600€ and has moved down a segment, since not only does an 80-tier exist, there's now a 90-tier above that.
 

kiphalfton

Member
He is right, tho.
While the 4090 is undeniably a beast and a monster in terms of performance, the GPU market is in a horrible state.
I remember the days when an XX70 was 300-400€ with great performance, and the only thing better was the XX80. Now the 70-tier already starts at 550-600€ and has moved down a segment, since not only does an 80-tier exist, there's now a 90-tier above that.

Blame mining in 2017 and again in 2021. It's no coincidence that after the 2017 boom, when the RTX 2000 series came out, pricing sucked relative to the GTX 1000 series. Then it fell off, but after the 2021 boom we got the RTX 4000 series and pricing sucked once again.

Unfortunately I see prices continuing to suck for the RTX 5000 series, despite mining being done for, because idiots are still scalping 4090s on eBay. If those are flying off the shelves, Nvidia will read it as "well, guess we can increase prices, since people are scooping this shit up."
 
And so continues the absolute trash value proposition that is PC gaming. Components are more expensive than entire consoles, used to run unoptimized ports that perform worse despite better hardware. Sticking with my 1070 forever at this rate. AMD, where you at? Can you at least pretend to care and offer something competitive???
PC gaming has been getting watered down for a while now. A lot of us oldies have kinda moved on (I'm barely hanging in here), while this new wave of gamers lets these companies do whatever they want.
 

Jayjayhd34

Member
And so continues the absolute trash value proposition that is PC gaming. Components are more expensive than entire consoles, used to run unoptimized ports that perform worse despite better hardware. Sticking with my 1070 forever at this rate. AMD, where you at? Can you at least pretend to care and offer something competitive???

This is just plain wrong. "Ports" don't really exist anymore, since everything is x86 now; "port" referred to transferring code from a unique console architecture to x86 hardware. So let's call them what they are: PC versions.

There are far more good PC versions releasing than bad ones: Alan Wake 2, Baldur's Gate 3, AC Mirage, Avatar, loads more. Most of the time it's not even the developer's fault; it's time constraints from the publisher.

They have to develop for three platforms (Series S, Series X, PS5), and then on PC support graphics cards with different features and advancements going back 6-8 years. Not every developer can afford to abandon old hardware like Remedy did with Alan Wake 2, even though it would be fucking awesome considering how it looks.

Just take a look at games that stay in development for as long as they need; Sony's games are a good example of this: developed solely with one platform in mind, with all the time they need.

Even if a game releases with stutter, unoptimized and buggy, 99% of the time it'll be fixed (and cheaper) a month or two down the line. Look at The Callisto Protocol now, or Hogwarts Legacy, or even Batman: Arkham Knight: all released in terrible states, all run flawlessly now.
 

twilo99

Member
And so continues the absolute trash value proposition that is PC gaming. Components are more expensive than entire consoles, used to run unoptimized ports that perform worse despite better hardware. Sticking with my 1070 forever at this rate. AMD, where you at? Can you at least pretend to care and offer something competitive???

Used RDNA2 cards are a good proposition currently
 
Still no successor to the GTX 1650 or A2000 (i.e. an SFF 3050/3060), and no consumer RTX 4000 Ada around?
Something that's truly entry-level and compatible with PCs that originally shipped with integrated GPUs and lackluster PSUs?
I don't understand why no one is even remotely interested in that segment. It's either go all in or go home.
 
If you haven't bought in at this point, I'd advise sitting these cards out. There aren't many GPU-intensive games announced for this year, so it's pointless to upgrade. The last big push was with Cyberpunk. Well, there's AW2, but that sold poorly anyway, so not many people are playing it.

Nvidia will release Blackwell later this year, and if you buy these cards, the buyer's remorse will be real. For me, there is only one good time to buy a card, and that's when a new architecture launches. If you buy the new architecture, you get two years before the next architecture releases, which reduces buyer's remorse significantly. If you choose to buy the old architecture, you get a huge discount. That's pretty much it. Anytime after that is a waste imo, unless your GPU dies or something.
 

kittoo

Cretinously credulous
If you haven’t bought in at this point, I’d advise sitting these cards out. There are not too many GPU intensive games announced for this year so it’s pointless to upgrade. The last big push was with cyberpunk. Well there’s AW2 but that sold poorly anyway so not many people are playing it.

Nvidia will release Blackwell later this year and if you buy these cards, the buyers remorse will be real. For me, there is only 1 good time to buy a card and that’s when a new architecture launches. If you buy the new architecture, you get 2 years before the next architecture releases thus reducing buyers remorse significantly. If you choose to buy the old architecture, you get a huge discount. That’s pretty much it. Anytime after that is a waste imo except your GPU dies or something.

I plan to rock my 3080 10GB till 5000 series launches. I usually like to have about 3 times jump in performance whenever I upgrade. This is how I've upgraded over the years-

8800GTS (2006) -> 470GTX (2010) -> 780 (2013) -> 1070 (2016) -> 3080 (2020)
 

Celcius

°Temp. member
Still stupidly overpriced with stingy VRAM amount.
How much VRAM did you expect on the 4070 Super and 4070 Ti Super?
16GB seems perfect for this tier of card at the moment IMHO.
Additional VRAM would just increase the price for no reason.
 

SmokedMeat

Gamer™
I’m going to wait until the 50XX series hits, and hopefully that’s not a mess like the 40 Series launch.

Then again if no games are really pushing my 7900XT there would still be no point in upgrading.
 

XOMTOR

Member
And so continues the absolute trash value proposition that is PC gaming. Components are more expensive than entire consoles, used to run unoptimized ports that perform worse despite better hardware. Sticking with my 1070 forever at this rate. AMD, where you at? Can you at least pretend to care and offer something competitive???
Yea, I just don't see the point of desktop PC gaming any more; the cost/benefit ratio just seems awful (oooh, shinier grafix). SteamDeck and the likes I get; those offer something completely different and exciting.
 

MacReady13

Member
Yea, I just don't see the point of desktop PC gaming any more; the cost/benefit ratio just seems awful (oooh, shinier grafix). SteamDeck and the likes I get; those offer something completely different and exciting.
I'm sort of here at the moment. I'm looking at the Legion Go as a little PC gaming machine to go along with my OLED Steam Deck, my PS5 and Switch. I truly want to get back into PC gaming (I did build a machine last year but the less said about that build the better and I had to sell the parts off).
I feel like now is a great time to jump in but fuck me, to buy a 4070 gaming PC from Scorptec (Aussie PC gaming site) which I can get built from them with parts I choose, I cannot get it below $3400. And that doesn't include keyboard and mouse! Yet I can use that money to get a new OLED tv and a Legion Go and still play PC games on the TV or on the go! Just don't know what to do to be honest. The price barrier to get into PC gaming is just huge at the moment and the cost of living crisis doesn't help...
 

GreatnessRD

Member
Still slightly overpriced, in my opinion, but at least the prices are coming down. Curious to see how AMD responds with their own price cuts because there's no way they can keep the 7900 XTX at $999 or the 7900 XT at $849. (Though both are routinely under these official prices)
 

kiphalfton

Member
How much VRAM did you expect on the 4070 Super and 4070 Ti Super?
16GB seems perfect for this tier of card at the moment IMHO.
Additional VRAM would just increase the price for no reason.

VRAM is cheap... Nvidia just wants to keep perpetuating the idea that VRAM should come at a big premium.

There's no reason the RTX 3080 should ever have had only 10GB of VRAM while it jumped to 24GB with the RTX 3090, especially when the price-to-performance between the two was so piss poor.

Yeah, the VRAM gap was made smaller with the RTX 4080 and RTX 4090... but the price gap was made smaller too.
 

Omnipunctual Godot

Gold Member
I plan to rock my 3080 10GB till 5000 series launches. I usually like to have about 3 times jump in performance whenever I upgrade. This is how I've upgraded over the years-

8800GTS (2006) -> 470GTX (2010) -> 780 (2013) -> 1070 (2016) -> 3080 (2020)
I'm still hanging onto my 1080 Ti. But I HAD to make the jump from a 6700K to a 12700K last year. That change alone should be enough for me to wait until the next GPU gen.
 

TheShocker

Member
I wonder how long after launch the prebuilts will start showing up? I have no interest in building my own, but I do have a Micro Center near me and I've heard good things about their brand of PCs.
 

Celcius

°Temp. member
VRAM is cheap... Nvidia just wants to keep perpetuating the idea that VRAM should come at a big premium.

There's no reason the RTX 3080 should ever have had only 10GB of VRAM while it jumped to 24GB with the RTX 3090, especially when the price-to-performance between the two was so piss poor.

Yeah, the VRAM gap was made smaller with the RTX 4080 and RTX 4090... but the price gap was made smaller too.
I mean, true, the 3080 should have had more VRAM, but that was years ago and Nvidia doesn't even sell that card anymore. Are we still going to complain about the GTX 970 and its 3.5GB of usable VRAM too?
It feels to me like people saying AMD has bad drivers... that was true in the past, but it's simply not the case anymore.
 
What config could you possibly be running to need 2800W as an ordinary user/gamer (no mining or AI builds)?
I only know that the outlet behind all my audio/video stuff is on a 120V 15A circuit: 1800 watts before the breaker trips. I wish it were a 20A circuit; then I could handle 2400 watts. :(
But for now my TV, receiver, and one or two devices don't come close to maxing it out. YET.
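The circuit figures above are just P = V × I. A quick sketch in Python, using only the 120V/15A and 20A numbers from the post (everything else here is illustration):

```python
def circuit_watts(volts, amps):
    """Power available at the breaker's rated current: P = V * I."""
    return volts * amps

print(circuit_watts(120, 15))  # 1800 -> the 15A circuit from the post
print(circuit_watts(120, 20))  # 2400 -> what a 20A circuit would allow
```

Worth noting that North American electrical code generally limits continuous loads to 80% of the breaker rating, so the sustained headroom is lower than these peak figures.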
 

DaGwaphics

Member
I only know that the outlet behind all my audio/video stuff is on a 120V 15A circuit: 1800 watts before the breaker trips. I wish it were a 20A circuit; then I could handle 2400 watts. :(
But for now my TV, receiver, and one or two devices don't come close to maxing it out. YET.

I was referring more to the 2800W PSU in the thumbnail. What kind of builds actually need that?

But, yeah I can see where if you have a lot of equipment on a single circuit you might have an issue. Do you use a lot of it at the same time?

I can't see myself ever building a PC that needs more than 1000w from the wall, that seems crazy enough as it is. I'd only go that high if no other hardware combinations were available.
 

kiphalfton

Member
I mean, true, the 3080 should have had more VRAM, but that was years ago and Nvidia doesn't even sell that card anymore. Are we still going to complain about the GTX 970 and its 3.5GB of usable VRAM too?
It feels to me like people saying AMD has bad drivers... that was true in the past, but it's simply not the case anymore.

That's not the point.

The point is we went from a $699 card (RTX 3080) to a $1499 card (RTX 4080), and really the only notable difference between the two cards was the amount of VRAM (performance difference was negligible).

You said VRAM amounted to higher costs, and clearly Nvidia tries to perpetuate that BS rationale. Clearly that's the case, or there wouldn't be some ridiculous upcharge for the RTX 3090. Otherwise what exactly lends to that $800 premium?

You do know there's a database of RAM/VRAM costs that shows $3.55 per 1GB module (or $7.10 for a 2GB module)? And that's spot-market pricing, so it'd be even cheaper for Nvidia, since they have contracts in place with Samsung, Micron, and SK Hynix. Yet somehow, some way, "VRAM would just increase the price for no reason"?

I'm guessing you don't even have any real concept of how much VRAM costs. And are acting like it would add some substantial costs to those cards.
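Taking the spot figure above at face value, the back-of-envelope chip cost is trivial to compute. A sketch: the $3.55/GB number comes from the post, Nvidia's actual contract pricing is unknown, and this ignores bus-width and PCB changes that extra modules can require.

```python
SPOT_USD_PER_GB = 3.55  # spot-market price per 1GB GDDR6 module, per the post above

def extra_vram_cost(extra_gb, usd_per_gb=SPOT_USD_PER_GB):
    """Rough memory-chip cost of adding extra_gb of VRAM at spot prices."""
    return extra_gb * usd_per_gb

# e.g. taking a 10GB card to 16GB adds roughly this much in chips alone:
print(round(extra_vram_cost(6), 2))  # 21.3
```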
 

Celcius

°Temp. member
That's not the point.

The point is we went from a $699 card (RTX 3080) to a $1499 card (RTX 4080), and really the only notable difference between the two cards was the amount of VRAM (performance difference was negligible).

You said VRAM amounted to higher costs, and clearly Nvidia tries to perpetuate that BS rationale. Clearly that's the case, or there wouldn't be some ridiculous upcharge for the RTX 3090. Otherwise what exactly lends to that $800 premium?

You do know there's a database of RAM/VRAM costs that shows $3.55 per 1GB module (or $7.10 for a 2GB module)? And that's spot-market pricing, so it'd be even cheaper for Nvidia, since they have contracts in place with Samsung, Micron, and SK Hynix. Yet somehow, some way, "VRAM would just increase the price for no reason"?

I'm guessing you don't even have any real concept of how much VRAM costs. And are acting like it would add some substantial costs to those cards.
If you want to be mad then you can be mad. Like I said, the RTX 4070 Super, 4070 Ti Super, and 4080 Super all have enough VRAM to get the job done.
I have nothing else to say to you lol.
 

kiphalfton

Member
If you want to be mad then you can be mad. Like I said - the rtx 4070 super, 4070 ti super, and 4080 super all have enough vram to get the job done.
I have nothing else to say to you lol.

Never fails: every time somebody is pressed and can't come up with objective evidence to back up what they're saying, they make it about the other person "being mad", etc.
 

Hudo

Member
This all wouldn't have happened if 3dfx were still around!

A real tragedy that they fucked themselves.
 

FireFly

Member
That's not the point.

The point is we went from a $699 card (RTX 3080) to a $1499 card (RTX 4080), and really the only notable difference between the two cards was the amount of VRAM (performance difference was negligible).

You said VRAM amounted to higher costs, and clearly Nvidia tries to perpetuate that BS rationale. Clearly that's the case, or there wouldn't be some ridiculous upcharge for the RTX 3090. Otherwise what exactly lends to that $800 premium?

You do know there's a database of RAM/VRAM costs that shows $3.55 per 1GB module (or $7.10 for a 2GB module)? And that's spot-market pricing, so it'd be even cheaper for Nvidia, since they have contracts in place with Samsung, Micron, and SK Hynix. Yet somehow, some way, "VRAM would just increase the price for no reason"?

I'm guessing you don't even have any real concept of how much VRAM costs. And are acting like it would add some substantial costs to those cards.
I guess you mean the 3090 rather than 4080? The 3090 used a more expensive clamshell design with 24 chips due to the absence of 1.5/2 GB GDDR6X memory modules. The 3080 could have shipped with 12 GB for a fully enabled GA102, like the 3080 Ti, but it makes sense that they cut it down for better yields.
 
Still no successor to the GTX 1650 or A2000 (i.e. an SFF 3050/3060), and no consumer RTX 4000 Ada around?
Something that's truly entry-level and compatible with PCs that originally shipped with integrated GPUs and lackluster PSUs?
I don't understand why no one is even remotely interested in that segment. It's either go all in or go home.

A new RTX 3050 6GB is coming next month.

It's basically a consumer A2000 (70W card).
 

Jayjayhd34

Member
Never fails: every time somebody is pressed and can't come up with objective evidence to back up what they're saying, they make it about the other person "being mad", etc.

It might very well be scummy, but it's nothing new to the tech industry; the same thing happens with all sorts of gadgets. Phones are a great example, with some older models having more RAM than newer models, and insane price increases for different storage options. TVs too: selling cut-down models, then offering the better one at a premium price even though the manufacturing cost is probably nowhere near what they ask.

You've got to remember they're not just recouping the manufacturing cost from these products; they need to make back R&D costs as well. I would expect things like frame gen and DLSS have cost them billions to develop, and I seriously believe that's responsible for the higher prices.
 