
RTX 4070 to cost $599. Launches April 13.

They kept the 12GB, which is good. It's going to sell a lot, I reckon, and this will be what most people buy for 1440p/1080p gaming, which is the majority.
The majority will buy the 4060 and 4050 like they always do. The most popular card on Steam is the 3060; before that it was the 1650, and before that the 1060.
 

THE DUCK

voted poster of the decade by bots
But your 3060 is the problem. That video card barely matches a PS5 or Xbox. If that were the card I had, I'd be playing on a console; fuck PC.

Me, I don't like optimizing. If I'm not going to get the absolute best, why waste money on graphics that look like what a $400 PS5 Digital would do, or close to it (which is what the 3060 is; I honestly think the PS5 is even much better than a 3060 due to hardware optimization vs. PC).

DLSS is an upscaling technique. Consoles have the same thing in the form of FSR 2.0 or whatever. It may not look as good, but then again, in the heat of gameplay, who really pauses Call of Duty and says to himself, "look at that pixel, it's not clear, god damn it!!"?

To me, a gaming PC is about having a clear edge over consoles. That's why I get a high-end card. (It's not ideal, it's not logical, and it makes no financial sense whatsoever), but if we can afford it, why not? A 3060 is not that (with all respect, I would rather play on my PS5).

Hell, at 4K it seems the 3080 is not enough these days.

If I didn't have 500+ games between Steam and the Epic store, I would probably never bother with PC gaming. But when you have that big a library, you're kinda stuck.

That, and I also don't want to give MS or Sony money to play online. They can fuck off with that practice. I would rather pay triple the money just to make a point. Pay-to-play online needs to die. It's a scam (that, and paying to get an upgraded-graphics version of a game. What a shady practice, while PC users 15 years from now will still get the better-looking version of said game, if not from the devs then from the community).

Ya, the 3060 was really just for regular PC use mostly; I probably have 10 hours of gaming on it. I thought about getting something beefier, but I'd probably just play Xbox, PS5 and Switch anyhow. I hate optimizing and/or the issues that PC brings too; that's why I gravitate to consoles.
I am more locked into Xbox for my library. I have a ton of Xbox games purchased, so I know what you mean by stuck. Though my Sony library has been growing lately, so it's evening out a bit. My PC library is only like 50 games.
I'm OK with the online pay, as I always found value in Xbox Gold and/or Game Pass or PS+; for me it always paid off. So on both now, online play is included in something I would buy anyhow.
I'm probably best off waiting for the "pro" machines rather than upgrading my PC card. It just seems tempting for some reason, like I would suddenly switch, even though I wouldn't......
 

OZ9000

Banned
The 4080 is absolutely indefensible, I've seen hardcore Jensen dick suckers even tap out on that one.
The 4080 costs almost as much as the 4090 in the UK.

Why pay £1400 for one when you can get a 4090 for £1700?

As far as I'm concerned, the 4090 should have been priced like the 4080, the 4080 like the 4070 Ti, the 4070 Ti like the 4070, and so on.
 

SmokedMeat

Gamer™
It has more VRAM, less power usage, supports DLSS3 and costs less. The 3080 will be outclassed by this card.

More VRAM and much less memory bandwidth. I have my doubts that it will be outclassed; I expect them to be evenly matched, with the 4070 losing at 4K.
Fake frames aren't pulling off much of a win, especially when image quality suffers, but that's what Nvidia wants people using to make up for them cutting back on their cards.
 
You are right. I just hope the 12GB of VRAM carries over to the 4060 and hopefully the 4050. I need the VRAM not only for games but also for AI.

You're crazy if you think you're getting anywhere close to that on a card with a 96-bit bus. Basically, daddy jacket saw what AMD mommy did with the 6500 XT / 6400 and decided to one-up her and cheap out on framebuffer, because suckers will line up to buy this crap at full price anyway: a whole $100 more than what the same class cost in 2016, and maybe even $100 more than its predecessor.

Nice to have a cousin leading the rival company, so you can price-fix over dinner at home.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I've been an xx70 user since the GTX 260 Core 216 (it was the xx70 of that generation).
But this gen, I don't think so.
Hell, I'm really tempted to just skip the generation and glide with a 3080 (no, I don't just crank the settings to ultra and then bitch that the game is unoptimized).
Unless Nvidia releases that 4080 20GB we know they have, for a price similar to or lower than the current 4080......Ada can suck my dick.

Time for some "retro" gaming....a 6800 GS, and a Dreamcast for my console.
Man, these things brought me so much joy way back when. (Same time last year; I do this every year to keep myself from getting too jaded by modern gaming.)
 

Vognerful

Member
Would this be a good upgrade from a 2060 Super? Taking into consideration that I have a tiny case but an 800-watt UPS.
 

amigastar

Member
Would this be a good upgrade from a 2060 Super? Taking into consideration that I have a tiny case but an 800-watt UPS.
I'm also upgrading from an RTX 2060. I think the jump to an RTX 4070 will be huuuge. 800 watts should be enough for the 4070.
 

Leonidas

Member
Navi 32 is intended to compete with AD104 (4070 Ti/4070), and should do fine from a margins perspective. It just apparently isn't ready yet.
Being on par with raster while losing 30% in RT, and currently lacking a DLSS3-like feature isn't competing IMO (that's the best case I think for N32)... maybe they will have a DLSS3-like feature by the time they launch.

AMD needs to be at least 20% cheaper than Nvidia, IMO, due to their lack of features, higher power draw, and average RT performance.
 

DaGwaphics

Member
Do scalpers still go after these things? Or is that a dead fad? 🤔

That's basically dead outside the 4090. With 4080s/4070 Tis piled up in stores all over the place, there isn't much chance of a return. That's good news, really, but it comes along with the GPUs being out of budget for many.
 

Crayon

Member
Being on par with raster while losing 20% in RT, and lacking a DLSS3-like feature isn't competing IMO (that's the best case I think for N32)... maybe they're waiting for their DLSS-3 competitor to become ready...

AMD needs to be at least 20% cheaper than Nvidia, IMO, due to their lack of features, higher power draw, and average RT performance.

Both at $800, the 7900 XT and the 4070 Ti trade blows; the 4070 Ti wins in RT. Meanwhile the 7900 XT has TWENTY gigs of VRAM vs. 12. I don't see why that one should be cheaper.
 

Leonidas

Member
Both at $800, the 7900 XT and the 4070 Ti trade blows; the 4070 Ti wins in RT. Meanwhile the 7900 XT has TWENTY gigs of VRAM vs. 12. I don't see why that one should be cheaper.
The post I was responding to was talking about N32 (7800 XT).

7900 XT is okay vs. 4070 Ti at the same price.
 

Crayon

Member
The post I was responding to was talking about N32 (7800 XT).

7900 XT is okay vs. 4070 Ti at the same price.

Yeah, I don't think N32 is going to do too hot if the 7900 XT ties with the 4070 Ti. 😬 Red's and green's tiers are well off from each other in naming now, no thanks to AMD's choice to go with those XTX naming shenanigans and Nvidia apparently moving the old naming scheme upmarket.
 
Considering I paid $460 (paid $80) in 2021 to step up to a 3060 Ti from a $380 2060, it's in line with their current prices (RIP EVGA).

Not saying it's cheap. In fact, it's way more than what I would spend on a GPU. I haven't bought above an xx60-series card since the 9800 GT/8800 GTX; back then high end was $399 to $500.
My 3060 Ti is still running strong, so I'm not upgrading anytime soon.
 

FireFly

Member
Being on par with raster while losing 30% in RT, and currently lacking a DLSS3-like feature isn't competing IMO (that's the best case I think for N32)... maybe they will have a DLSS3-like feature by the time they launch.

AMD needs to be at least 20% cheaper than Nvidia, IMO, due to their lack of features, higher power draw, and average RT performance.
So the Meta Review tells us that at 1440p the 4070 Ti is 14.8% faster than the 7900 XT in RT, and in raster the 7900 XT is 8.4% faster (https://tinyurl.com/5ajez335). The rumoured core counts of the 4070 and 7800 XT indicate that they are 77% and 71%, respectively, of their bigger brothers. However, the leaked boost clock on the 4070 is 5% lower than on the 4070 Ti (2475 MHz vs 2610 MHz), while the 7800 XT is rumoured to be clocked higher.

So as a baseline we should expect the 4070 to match up with the 7800 XT the way the 4070 Ti matches up with the 7900 XT: slightly slower in raster and a bit faster in RT. In raster we would expect the 4070 to perform just under a 3080 at 1440p and the 7800 XT to perform just under a 3080 Ti.
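For anyone who wants to sanity-check that scaling argument, here is a minimal back-of-the-envelope sketch in Python. It assumes the rumoured core-count ratios and leaked clocks quoted above, plus a purely illustrative +5% clock bump for the 7800 XT, and it deliberately ignores bandwidth, cache and architectural differences:

```python
# Naive performance scaling: perf ~ cores x clock.
# Inputs are the rumoured/leaked figures from the post, not confirmed specs;
# the +5% clock for the 7800 XT is an assumption for illustration only.

def scaled_perf(core_ratio: float, clock_ratio: float) -> float:
    """Estimate performance relative to the bigger sibling (ignores bandwidth/cache)."""
    return core_ratio * clock_ratio

rtx_4070_vs_4070ti = scaled_perf(0.77, 2475 / 2610)  # ~0.73
rx_7800xt_vs_7900xt = scaled_perf(0.71, 1.05)        # ~0.75 (assumed +5% clock)

print(f"4070    ≈ {rtx_4070_vs_4070ti:.0%} of a 4070 Ti")
print(f"7800 XT ≈ {rx_7800xt_vs_7900xt:.0%} of a 7900 XT")
```

On those assumptions both cards land at roughly three quarters of their bigger siblings, which is why the 4070 vs. 7800 XT matchup is expected to mirror the 4070 Ti vs. 7900 XT one.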
 

Leonidas

Member
So the Meta Review tells us that at 1440p the 4070 Ti is 14.8% faster than the 7900 XT in RT, and in raster the 7900 XT is 8.4% faster (https://tinyurl.com/5ajez335). The rumoured core counts of the 4070 and 7800 XT indicate that they are 77% and 71%, respectively, of their bigger brothers. However, the leaked boost clock on the 4070 is 5% lower than on the 4070 Ti (2475 MHz vs 2610 MHz), while the 7800 XT is rumoured to be clocked higher.

So as a baseline we should expect the 4070 to match up with the 7800 XT the way the 4070 Ti matches up with the 7900 XT: slightly slower in raster and a bit faster in RT. In raster we would expect the 4070 to perform just under a 3080 at 1440p and the 7800 XT to perform just under a 3080 Ti.
Even if that's the case, AMD would still need to be around 20% cheaper. Losing DLSS3, launching later, and worse efficiency and RT are a big deal for me, but I might be willing to give all that up if I can save an extra $100+ (as someone who frequently upgrades GPUs anyway, I wouldn't be burdened by the lack of features/RT for too long).

But if AMD doesn't announce anything by the time the 4070 launches (less than two weeks from now), it will be too late for me...
 

FireFly

Member
Even if that's the case, AMD would still need to be around 20% cheaper. Losing DLSS3, launching later, and worse efficiency and RT are a big deal for me, but I might be willing to give all that up if I can save an extra $100+ (as someone who frequently upgrades GPUs anyway, I wouldn't be burdened by the lack of features/RT for too long).

But if AMD doesn't announce anything by the time the 4070 launches (less than two weeks from now), it will be too late for me...
I am not talking about your personal preferences but the market as a whole. Right now the prices on the 7900 XT seem to be stable at around $800. So we would expect AMD to follow a similar strategy and price the 7800 XT against the 4070 at $600. They will adjust when the market decides having a slight edge in rasterisation performance isn't good enough.
 

Djin1980

Neo Member
My PC is an i7-9700K @ 3.60 GHz paired with an EVGA RTX 2080 (powered by a 550W PSU). I'm considering buying a 4070 because it falls a bit short for racing games on an ultrawide (1440) screen/VR (plus I'm tired of the fan noise; it blows like crazy at 88°C).
What do you think, guys? I think my CPU could be a bottleneck and the PSU might be too weak for the 4070.
 

Leonidas

Member
I am not talking about your personal preferences but the market as a whole.
The market has told AMD they can't get away with charging Nvidia prices. Even AMD realizes this, it's why the 7900 XTX is cheaper than RTX 4080...
It's why the 7900 XT was pilloried at $900 and the market adjusted it to $800 and I think we'll see that drop even further in the not too distant future...
 
You are right. I just hope the 12GB of VRAM carries over to the 4060 and hopefully the 4050. I need the VRAM not only for games but also for AI.
Oh man, the rumors point to 8GB for the 4060. I just hope the 4050 gets 8GB of VRAM like the desktop 3050 did. Nvidia will do whatever it wants because people will always buy their stuff.
 
More VRAM and much less memory bandwidth. I have my doubts that it will be outclassed; I expect them to be evenly matched, with the 4070 losing at 4K.
Fake frames aren't pulling off much of a win, especially when image quality suffers, but that's what Nvidia wants people using to make up for them cutting back on their cards.
That's been the trend for most Lovelace cards. The reason is that Nvidia stuffed in a big L2 cache so memory performance vastly increased, and at the same time they reduced the VRAM memory bandwidth since the L2 made up for it. That said, there are scenarios where the reduced bandwidth is not fully made up for by the L2, but it seems Nvidia did a good enough job with this strategy.
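A toy sketch of that tradeoff, with all numbers assumed purely for illustration (none of them are measured figures for any specific card):

```python
# Illustrative only: a large on-die L2 can partially offset a narrower VRAM bus.
# Requests served from L2 are far cheaper than trips out to GDDR, so the higher the
# hit rate, the less the raw VRAM bandwidth matters. All values below are assumptions.

def effective_bandwidth(vram_gbs: float, l2_gbs: float, hit_rate: float) -> float:
    """Blend L2 and VRAM bandwidth by the fraction of traffic the L2 absorbs."""
    return hit_rate * l2_gbs + (1.0 - hit_rate) * vram_gbs

vram_gbs = 500.0  # ballpark for a 192-bit card
l2_gbs = 2000.0   # assumed L2 bandwidth

for res, hit_rate in [("1440p", 0.55), ("4K", 0.35)]:  # hit rate falls as the working set grows
    print(f"{res}: ~{effective_bandwidth(vram_gbs, l2_gbs, hit_rate):.0f} GB/s effective")
```

That is the intuition behind "evenly matched at 1440p, losing at 4K": at 4K the working set grows, the cache hit rate drops, and the narrower bus starts to show.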
 

PeteBull

Member
My PC is an i7-9700K @ 3.60 GHz paired with an EVGA RTX 2080 (powered by a 550W PSU). I'm considering buying a 4070 because it falls a bit short for racing games on an ultrawide (1440) screen/VR (plus I'm tired of the fan noise; it blows like crazy at 88°C).
What do you think, guys? I think my CPU could be a bottleneck and the PSU might be too weak for the 4070.
I had an 8700K, slightly OC'd, paired with a GTX 1080, upgraded to a 3080 Ti, and the CPU is still OK, so yours will be too, unless you're aiming at 1080p/1440p and very high framerates, way above 60.
https://www.techpowerup.com/gpu-specs/geforce-rtx-2080.c3224 That's where your GPU lands currently. You can guesstimate that the RTX 4070 will be around RTX 3070 Ti/3080 performance; it has 12 gigs of VRAM but a smaller bus, so of course there will be outliers among games, but that's the rough ballpark you can expect.

You've got to dig deep, because depending on independent benchmarks it might turn out that a currently available/discounted 6800/XT/3080 gives you better performance, or at least better value (price/perf ratio), even bought brand new with warranty.

I'm guesstimating around a 40-50% performance increase vs. your current card, but whether it's worth it depends heavily on actual prices: even if MSRP is $599, street price might be $650 or even close to $700, at least for some models.
Remember, the RTX 4070 is definitely a better card than your current RTX 2080, but the question is: is it better performance/value-wise than the currently available, relatively cheap 6800/XT/3080?

PSU-wise you will be fine unless your current one is some low-quality no-name shit, but I doubt you chose one of those (unless you got a prebuilt; that's where the bad PSUs usually end up :p ).
Here's proof: a 3070, which actually has a 220W TDP (so supposedly 20W higher than the 200W 4070), paired with your CPU of course, works fine with 550W PSUs: https://www.whatpsu.com/psu/cpu/Intel-Core-i7-9700K/gpu/NVIDIA-GeForce-RTX-3070
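If you want to ballpark the power budget yourself, a rough sanity check looks like the sketch below. The TDP/TGP values are the vendors' nominal numbers and the "rest of system" figure is a guess; real transient spikes can go well above these, so treat it as a ballpark, not a guarantee:

```python
# Rough PSU sanity check for the i7-9700K + RTX 4070 build discussed above.

parts_watts = {
    "i7-9700K (CPU)": 95,       # Intel's rated TDP; all-core boost can draw more
    "RTX 4070 (GPU)": 200,      # Nvidia's stated TGP
    "board/RAM/SSD/fans": 75,   # generous allowance for everything else
}

psu_watts = 550
load = sum(parts_watts.values())
print(f"Estimated sustained load: {load} W on a {psu_watts} W PSU ({load / psu_watts:.0%})")
```

That comes out to roughly two thirds of the PSU's rating on paper, which lines up with the "you'll be fine unless it's a no-name unit" advice.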
 

FireFly

Member
The market has told AMD they can't get away with charging Nvidia prices. Even AMD realizes this, it's why the 7900 XTX is cheaper than RTX 4080...
It's why the 7900 XT was pilloried at $900 and the market adjusted it to $800 and I think we'll see that drop even further in the not too distant future...
Yes, and that's why the 7800 XT will likely be priced against the 4070 despite being expected to have a close to 10% rasterisation advantage. Maybe AMD will need to lower prices further in the future, but we are talking about Navi 32 being able to compete in current market conditions.

(In any case Navi 32 should have better relative margins due to the die size being substantially lower)
 

PeteBull

Member
You are right. I just hope the 12GB of VRAM carries over to the 4060 and hopefully the 4050. I need the VRAM not only for games but also for AI.
Impossible; it's a smaller die, so it will be 8 gigs for the 4060, maybe 10 gigs depending on how cut-down the die is, but very likely 8 gigs for the 4060/Ti and 6 gigs for the 4050/Ti if/when they get announced. Nvidia wants/needs you to buy their cards often, every second gen at the latest, hence we very rarely get offered products like back in the Pascal days, when we got the GTX 1070 and 1080 with 8 gigs and the 1080 Ti with a fricken 11 gigs of VRAM.

They are smart and won't make the Pascal mistake again, aka giving gamers (their customers) something relatively cheap that won't need replacing after 3-4 years max. Nowadays such a product is the 4090, but the price is set so high that the only people who buy it are people with the budget to replace their GPU every gen anyway, because they always go for the best possible.
 

adamosmaki

Member
Seems like quite a few people in this thread pretend that inflation doesn't exist :messenger_grinning_smiling:
Seems like there are quite a few shills forgetting that 4-5 years ago mid-range GPUs were $300-400. A $40-50 price increase every few years is fair, but paying $600 for an RTX 4070 isn't inflation. It's a ripoff.
 

PeteBull

Member
Seems like there are quite a few shills forgetting that 4-5 years ago mid-range GPUs were $300-400. A $40-50 price increase every few years is fair, but paying $600 for an RTX 4070 isn't inflation. It's a ripoff.
It all depends on its actual performance. If on average it's just as good as a 3070 Ti, then it's a big rip-off; if it's similar to or even slightly better than the 10-gig 3080, it's worth its price and will sell quickly, especially considering it will have 12 gigs of VRAM, so 50% more than the 3070 Ti and 20% more than the 3080.

https://www.techpowerup.com/gpu-specs/geforce-rtx-3070-ti.c3675 Here's the 3070 Ti: $600 MSRP, 8 gigs of VRAM, launched at the end of May 2021, so during the crypto boom, aka a fake MSRP; street price was around 2x higher.

But let's just compare value vs. value: if 2 years later we're effectively getting an actual 3080 12GB for the same MSRP, and it's not a fake MSRP but the actual street price, I count that as progress: 20% faster with 50% more VRAM for the same price.

Inflation matters and is really big, even in the US, but progress in tech should outpace inflation; we'll know in under 2 weeks how it looks.

Here's an inflation calculator: https://www.bls.gov/data/inflation_calculator.htm Quick math: $499 in May 2021 (so at the launch of the 3070) is $558 in February 2023, so if the 4070 costs $599 we're paying only about $40 over the inflation-adjusted price; that's not some crazy markup.
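Spelled out, the arithmetic in that paragraph is just this (the inflation factor below is simply what the quoted BLS result implies for May 2021 to February 2023; nothing here is pulled from an official CPI series directly):

```python
# RTX 3070 launch MSRP (May 2021) carried forward to Feb 2023 using the factor
# implied by the BLS calculator result quoted above ($499 -> $558).

msrp_3070_2021 = 499
inflation_factor = 558 / 499  # ~1.118

adjusted_2023 = msrp_3070_2021 * inflation_factor
rtx_4070_msrp = 599

print(f"Inflation-adjusted 3070 price: ${adjusted_2023:.0f}")                  # ~$558
print(f"4070 premium over that:        ${rtx_4070_msrp - adjusted_2023:.0f}")  # ~$41
```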

In the end we've got to see independent benchmarks to be sure how good a value this card is going to be; of course we can tell its ballpark performance (3070 Ti to 3080), but the actual specifics matter here.

Remember, in the end perceived value is decided by the overall market. The $1600 RTX 4090 was considered too expensive by many; the thing is, those "many" weren't the market, aka the target audience of the 4090, and the actual target audience happily bought those cards without any complaints.

The same is going to happen here: someone who is a customer in the $350-400 budget range is going to be super vocal on forums/Twitter/wherever his bubble is, but the actual target, aka people who are willing to spend $600, will decide on their own, using independent benchmarks and comparing the card's value to others in the same bracket. That's why the RTX 4080 sells extremely badly at $1200 and why the RX 7900 XT had to lower its price from $900 to $800 MSRP: the actual market decides if a product is good, not one person or a bunch of people who wouldn't buy it anyway.
 

Leonidas

Member
Seems like there are quite a few shills forgetting that 4-5 years ago mid-range GPUs were $300-400. A $40-50 price increase every few years is fair, but paying $600 for an RTX 4070 isn't inflation. It's a ripoff.
RTX xx60 series has been $330-$400 since 2018. It's due for inflation correction this year.

RTX xx70 series has been $500 since 2018. Adjusted for inflation today that's around $600.

RTX 4070 will easily become the best selling current gen GPU upon launch if it releases at $599. AMD has nothing to compete against it.

You say it's a ripoff at $600, what wouldn't be a ripoff? $500? So you want them to sell it for cheaper (inflation adjusted) than the 3070 and 2070 Super?

Or is $550 not a rip-off? If so are you really going to bitch about $50 when they upped the VRAM and produced it on the most advanced node with excellent efficiency?

Keep crying and holding on to your ancient GPUs :messenger_grinning_sweat:
 

Leonidas

Member
Yes, and that's why the 7800 XT will likely be priced against the 4070 despite being expected to have a close to 10% rasterisation advantage. Maybe AMD will need to lower prices further in the future, but we are talking about Navi 32 being able to compete in current market conditions.

(In any case Navi 32 should have better relative margins due to the die size being substantially lower)
But you do agree that the 7800 XT needs to be cheaper than the 4070, though, right? Considering it's lacking features, has higher power draw, and is coming however many months later.

AMD's pricing seems weird. It looks like there's going to be something like a $400 MSRP gap between the 7900 XT and the 7800 XT. The 7900 XT was the biggest ripoff of this GPU generation.
 

nbkicker

Member
I've got an i9-9900K with 32GB of RAM and a 2080 Super. I had been using a 1440p 144Hz monitor but recently upgraded to a 4K 144Hz one; at the minute I'm using both monitors for game dev and the 4K one for gaming. I'm going to be upgrading in a few months and am tempted to go for the 4080. I know it might slightly bottleneck at 4K, but I can update my motherboard and processor in a few years and it should still be decent. Although I do wonder how much of an upgrade from my current GPU (and how much of a downgrade from a 4080) the 4070 and 4070 Ti would be if I went for one of them.
 


FireFly

Member
But you do agree that the 7800 XT needs to be cheaper than the 4070, though, right? Considering it's lacking features, has higher power draw, and is coming however many months later.

AMD's pricing seems weird. It looks like there's going to be something like a $400 MSRP gap between the 7900 XT and the 7800 XT. The 7900 XT was the biggest ripoff of this GPU generation.
Again it depends on the market, not on me. Last generation AMD was pricing higher with worse ray tracing performance and no DLSS alternative at all.
 

64bitmodels

Reverse groomer.
I went from a 1060 to a 3080 Ti…
I honestly don't know what to upgrade to. I've heard that Nvidia support on Linux is actually not as bad as it once was, so they're not off the table now.

My PC was simply not as futureproofed as I thought; thanks to the outdated mobo and small case, I might have more trouble upgrading than I expected. I don't want to build another one so soon, though, because it's only 1 year old.
 

64bitmodels

Reverse groomer.
RTX xx60 series has been $330-$400 since 2018. It's due for inflation correction this year.

RTX xx70 series has been $500 since 2018. Adjusted for inflation today that's around $600.

RTX 4070 will easily become the best selling current gen GPU upon launch if it releases at $599. AMD has nothing to compete against it.

You say it's a ripoff at $600, what wouldn't be a ripoff? $500? So you want them to sell it for cheaper (inflation adjusted) than the 3070 and 2070 Super?

Or is $550 not a rip-off? If so are you really going to bitch about $50 when they upped the VRAM and produced it on the most advanced node with excellent efficiency?

Keep crying and holding on to your ancient GPUs :messenger_grinning_sweat:
OK, cool, inflation is a thing. Counterpoint: these companies are exponentially richer than you and I; they're able to take a few losses. Besides, you know for a fact that even if inflation goes back to normal, these prices will stay.
 

Leonidas

Member
Again it depends on the market, not on me. Last generation AMD was pricing higher with worse ray tracing performance and no DLSS alternative at all.
After crypto ended we saw what gamers were willing to pay (HUB did monthly charts comparing what cards were actually selling for vs. MSRP)

The latest month showed Nvidia 30-series cards still selling an average of 4% over MSRP while AMD cards were selling 26% under MSRP.

AMD knows what the market is and unless they don't want to sell cards, they will undercut the 4070...

OK, cool, inflation is a thing. Counterpoint: these companies are exponentially richer than you and I; they're able to take a few losses.
What does that have to do with anything?

I reward companies that give me what I want at the right price. 4070 does that (at $599).

Besides, you know for a fact that even if inflation goes back to normal, these prices will stay.
Intel would love for that to happen, they could actually sell GPUs if AMD/Nvidia overpriced their low-end/mid-range GPUs...
 

Reallink

Member
I don't see a logical reason to get this over a 3080...

Not an option, and never was. You haven't been able to buy new 3080s for at least 4-6 months; the only thing left is refurbs and a few straggling Chinese off-brands that are very likely repackaged mining-farm chips. All the high-end 30-series inventory dried up long before there was any notable discounting; AIB 3080s pretty much never went under $800. The only notable "deals" were EVGA's modest $50-$100-off "fire sale" (which sold out within an hour or two) and the occasional sale from other AIBs trying to move them at around $780 (which would also sell out fast). There was never any surplus of 30-series chips like the clickbait rumor peddlers claimed.
 

DaGwaphics

Member
AMD knows what the market is and unless they don't want to sell cards, they will undercut the 4070...

Seems unlikely, though, because the 7900 XT is price-matched with the 4070 Ti. I'd expect them to do the same with the 6800 XT, which would already be $50 lower than last gen for them, and they'll have 4GB more memory there.

The interesting play for AMD will be the 7700 XT, as that 12GB part will match up against the 8GB 4060 series at a time when 8GB cards are quickly being relegated to 1080p duty.
 

PeteBull

Member
OK, cool, inflation is a thing. Counterpoint: these companies are exponentially richer than you and I; they're able to take a few losses. Besides, you know for a fact that even if inflation goes back to normal, these prices will stay.
For prices to have any potential to come down, you would need not just a stop/slowdown of inflation but actual deflation; I remember Japan was once close to that, or in the range of mild deflation.
I don't live in the US, and in my country inflation wasn't small even before the Russian invasion of Ukraine; after the invasion it got really nasty, so think of it like this: whatever you're suffering inflation-wise, there are many people who are suffering much more.
Just be glad your country doesn't border Russia, Belarus and Ukraine at the same time like mine :)
 

yamaci17

Member
Just your friendly reminder that frame generation itself gobbles up 1 GB to 2 GB of VRAM depending on the game and implementation.

Good luck running frame gen on a limited 8-12 GB of VRAM in upcoming games. Stay away from these GPUs.
 