
RTX 4070 to cost $599. Launches April 13.

hlm666

Member
Is a 1000 watt power supply enough? I don't remember what mine is, and I don't think there is a way to see unless I open up the case?
That would be massive overkill for the 4070; you don't even need 1000 watts for a 4090. If you plan on transferring the PSU to future builds/systems it might be worth it, though.
 

amigastar

Member
Is a 1000 watt power supply enough? I don't remember what mine is, and I don't think there is a way to see unless I open up the case?
Idk man, I asked Google and yes, 1000 watts should be enough. My personal opinion is the same: 1000 watts is enough for a 3090.
Unless you mean the 4070 rather than the 3090, in which case 1000 watts is overkill.
 
Last edited:
People actually thought Nvidia would just lower prices to how they were before scalping LMAO

 

OZ9000

Banned
GPU experts - how much better is this than a 3070 or a 1660 Ti? I am stuck with a 1660 Ti. Is it just worth waiting for a 4070 at this point? I almost got a 3090.
It will be a huge upgrade over the 1660 Ti. My 1660 Ti could barely play some of the latest titles even at 1080p. I guess it depends on how close the performance of this card will be to the 4070 Ti.
 

rofif

Can’t Git Gud

I hope it's true...
You hope it's $600?!
Which means $700 from retailers,
which means €800 lol because fuck Europe.

$600 for a mid-range card is a joke, man.
And we already know that 12GB is barely scraping by nowadays... Imo there should be no cards with less than 16GB
 
Last edited:
Thanks crypto "entrepreneurs"!

Used is the way to go, it seems, while praying that without a warranty the card won't die too soon.

Absolutely bonkers that the 1650 is still their entry-level card, or the 3060 if former GTS prices are now entry level, and those kind of suck even more than those insanely priced newer 40-series cards.
 

THE DUCK

voted poster of the decade by bots
Imagine people who bought a fucking 1080 Ti 8 years ago for $599 and they can still play games on high settings at 60 fps now. Today's GPU market is a joke

I'm not crazy about high prices either, but is this card really that out of line? I mean adjusted for inflation the 1080 Ti was $749, and the 4070 is three times as powerful...
Plus you're probably still going to sell that 1080 Ti for $200, making the new card $399 to triple your performance... not completely horrible.
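For anyone who wants to sanity-check that kind of math, it's just a price-index ratio plus the resale offset. A rough Python sketch using the prices cited upthread and placeholder CPI values (plug in real index figures, these are only illustrative):

```python
# Rough inflation-adjustment plus resale math (illustrative numbers only).
old_card_launch_price = 599.0   # launch price cited upthread (nominal USD)
cpi_then = 245.0                # placeholder CPI for the launch year
cpi_now = 300.0                 # placeholder CPI for 2023

adjusted = old_card_launch_price * (cpi_now / cpi_then)
print(f"Old launch price in today's dollars: ~${adjusted:.0f}")

# Net upgrade cost after selling the old card
new_card_price = 599.0
resale_value = 200.0
print(f"Effective upgrade cost: ${new_card_price - resale_value:.0f}")
```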
 

Three

Member
I'm not crazy about high prices either, but is this card really that out of line? I mean adjusted for inflation the 1080 Ti was $749, and the 4070 is three times as powerful...
Plus you're probably still going to sell that 1080 Ti for $200, making the new card $399 to triple your performance... not completely horrible.
The 1080 Ti was the flagship at the time if I remember right. Your "3x as powerful" is meaningless here because you've used the generational advancement to compensate for price increases in the low-to-mid range. The flagship today is far more powerful than the 1080 Ti.
 
Last edited:

Corndog

Banned
3080 performance for nearly 3080 price, 2.5 years later. With a bit more VRAM but limited to a 192-bit bus. So basically an old xx60 Ti-tier product rebadged and sold for $600.

Man, Nvidia are smoking something.
Ya. That bus sucks. I understand it's hard to shrink memory controllers, but it should be at least 256-bit, or go with stacked RAM.
 

Kenpachii

Member
I guess people were expecting AMD to massively undercut them. Problem is that people somehow still buy Nvidia so AMD probably thought "fuck it why should we have to undercut".
Both CEOs are related to each other. Big chance it's a push to up GPU prices before competition starts again.

It's a joke.
 

Buggy Loop

Member
Seems like quite a few people in this thread pretend that inflation doesn't exist :messenger_grinning_smiling:

With crazy silicon wafer prices and TSMC's premiums, this card actually falls back down to earth compared to the Ampere series. It's +$100 over the 3070 (of course that card was almost never at that price).

But the 4000 series baseline cards have much beefier cooling solutions that would typically have carried a premium on AIB cards in the past.

It's maybe expensive, but we're not at 4080 levels of stupid.
 

Hot5pur

Member
You hope it's $600?!
Which means $700 from retailers,
which means €800 lol because fuck Europe.

$600 for a mid-range card is a joke, man.
And we already know that 12GB is barely scraping by nowadays... Imo there should be no cards with less than 16GB
I comfortably play all games at 4k 60 with DLSS (which looks nearly identical) with a 3080 10 GB.

For lower resolution this would be plenty even without DLSS. So not sure where this "we need 16 GB" is coming from.

Seems the 4070 may come in around a 3080 in performance and will be a decent 4k card. Accounting for inflation this may have been closer to $500. Basically inflation chewed up the performance to price gain of this new generation. With the 4070ti and above that is just pure fleecing.

If I was thinking about gaming in 2023 I would 100% go with a PS5 or Xbox. PC ports are often shit anyway. The benefits of PC gaming are cheap keys on day 1 and free online. Also some games are better with m/kb and peripheral flexibility. I also use the PC for 3D modeling and other stuff.
 

GreatnessRD

Member
Won't lie, I'm actually surprised Nvidia didn't go the Mooreslawisdead route and make it $750 like King Goofy said it was gonna be. $600 is still atrocious. $549 would've been the sweet spot in my opinion. $499 really, but we know they gotta up the price. I'm sure AMD is just fine chugging along at their pace of being #2, but they better stop bluffin' because Intel is slowly creeping up behind them. Just don't buy all this wild shit, folks. We'd get the correct prices if people had patience.
 

rofif

Can’t Git Gud
I comfortably play all games at 4k 60 with DLSS (which looks nearly identical) with a 3080 10 GB.

For lower resolution this would be plenty even without DLSS. So not sure where this "we need 16 GB" is coming from.

Seems the 4070 may come in around a 3080 in performance and will be a decent 4k card. Accounting for inflation this may have been closer to $500. Basically inflation chewed up the performance to price gain of this new generation. With the 4070ti and above that is just pure fleecing.

If I was thinking about gaming in 2023 I would 100% go with a PS5 or Xbox. PC ports are often shit anyway. The benefits of PC gaming are cheap keys on day 1 and free online. Also some games are better with m/kb and peripheral flexibility. I also use the PC for 3D modeling and other stuff.
My 3080 10GB is showing VRAM limits in two or three games. FPS is fine at 4K with DLSS. Raw 4K is starting to struggle in a few games.
 

SmokedMeat

Gamer™
3080 performance for nearly 3080 price, 2.5 years later. With a bit more VRAM but limited to a 192-bit bus. So basically an old xx60 Ti-tier product rebadged and sold for $600.

Man, Nvidia are smoking something.

Yep. 2GB of extra VRAM, and they cut down the memory bus versus the 3080, just like the 4070 Ti.

That’s because they want you upgrading again in two years.
 
Last edited:

DaGwaphics

Member
Sure has a tiny bandwidth increase. Might be a bottleneck.

The 4000 series has more on-chip cache, so they've backed down the memory bus width, just like AMD did. Seems like there would still be situations where the wider bus would be preferable, though.
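For a rough sense of what the narrower bus actually costs, peak bandwidth is just bus width times data rate. A back-of-the-envelope sketch, treating the memory data rates (~21 Gbps on the 4070, ~19 Gbps on the 3080) as approximate assumptions rather than spec-sheet values:

```python
# Peak theoretical memory bandwidth: (bus width in bits / 8) * data rate in Gbps.
# The data rates below are approximate assumptions, not official figures.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(f"4070, 192-bit @ ~21 Gbps: {bandwidth_gb_s(192, 21):.0f} GB/s")
print(f"3080, 320-bit @ ~19 Gbps: {bandwidth_gb_s(320, 19):.0f} GB/s")
# The bigger on-chip cache on the 40 series is meant to absorb part of that gap.
```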
 
Last edited:

manfestival

Member
It is pretty amazing that we have reached a $600 MID-range product. To think at one point I stood my ground at $300. The post-crypto-boom market is wild.
 

hinch7

Member
Isn't this only 20% more expensive than the last one?
Until you realise how cut down this AD104 configuration is: it has the exact same core count as last gen's GA104, with fewer ROPs, where they would naturally give you higher specs gen on gen. They are selling you smaller dies at hyper-inflated prices, and you are only netting performance increases from the higher clocks and the added cache. It actually went backwards in some ways, with a narrower bus and fewer ROPs.

If AMD can offer higher specs per tier even at inflated prices, the specs here should at least be better.
 
Last edited:

hlm666

Member
I see, thanks.
Here are some power measurements. It's GPU only, but unless you have overclocked the hell out of your CPU and have a lot of drives and USB devices attached, the rest of your system shouldn't be going over 300 watts, and even that's a bit high. What is your current PSU?
Edit: just saw you don't know what the PSU is. No, unless you have something like a Corsair i-series PSU with the USB header attached, you can't see its output capacity with software.

[attached chart: GPU-only power consumption measurements]
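If anyone wants to budget this themselves rather than eyeball it, here's a minimal sketch that just sums assumed component draws and adds headroom for transients; every wattage below is a placeholder guess, not a measurement:

```python
# Rough PSU sizing: sum estimated component draw, then add headroom for spikes.
# All wattages are placeholder assumptions; check your own parts.
components_watts = {
    "gpu": 200,               # a 4070-class card under load
    "cpu": 150,               # mainstream CPU, stock settings
    "motherboard_and_ram": 50,
    "drives_fans_usb": 50,
}

load = sum(components_watts.values())
suggested_psu = load * 1.5   # ~50% headroom for transients and efficiency
print(f"Estimated load: {load} W, suggested PSU: ~{suggested_psu:.0f} W")
```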
 
Last edited:

Buggy Loop

Member
Resident Evil games and TLOU.

RE2, RE3, and Village have no problem with your card; whatever they report in-game is bullcrap. You can push those games to max with a 3080.

The in-menu VRAM estimate predicts 12GB, and you end up with 7.8GB allocated and 6.2GB in actual use. The RE Engine is bonkers with these predictions.
[screenshot: in-menu VRAM estimate vs. actual usage]


Same with RE3: 15GB predicted in the menu.
[screenshot: RE3 in-menu VRAM estimate]


So I hope you didn't tweak down the game based on menu predictions. Get the real-usage VRAM overlay in Afterburner if you don't have it already.

RE4 only has a memory leak with RT on, otherwise you can max it.

TLOU has a memory leak from the get-go, and it's on the list of fixes for a future patch.

So not bad, two recent bugged games.
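If you'd rather script the check than run an overlay, here's a minimal sketch using NVIDIA's NVML bindings (pynvml, assuming the package is installed). Note it reports device-wide VRAM usage, not a single game's allocation, so treat it as a ballpark next to the Afterburner numbers:

```python
# Query actual VRAM usage via NVML (pip install pynvml).
# Reports device-wide usage, not a single game's allocation.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)       # first GPU in the system
mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 1024**3:.1f} GiB of {mem.total / 1024**3:.1f} GiB")
pynvml.nvmlShutdown()
```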
 

THE DUCK

voted poster of the decade by bots
The 1080 Ti was the flagship at the time if I remember right. Your "3x as powerful" is meaningless here because you've used the generational advancement to compensate for price increases in the low-to-mid range. The flagship today is far more powerful than the 1080 Ti.

While I understand that one may have been a flagship at the time, saying the power level is meaningless is wrong, as we will always need to take into consideration the market, component costs, and actual power level to determine value.
I can concede that the value level is lower, however it's not quite as bad as what is being implied (people are acting like it's 5x as much or something). The fact is it's still a very powerful card and is comparable to other hardware on the market at that price point today (some would even say cheaper than the competition for overall power).
 

rofif

Can’t Git Gud
RE2, RE3, and Village have no problem with your card; whatever they report in-game is bullcrap. You can push those games to max with a 3080.

The in-menu VRAM estimate predicts 12GB, and you end up with 7.8GB allocated and 6.2GB in actual use. The RE Engine is bonkers with these predictions.
[screenshot: in-menu VRAM estimate vs. actual usage]


Same with RE3: 15GB predicted in the menu.
[screenshot: RE3 in-menu VRAM estimate]


So I hope you didn't tweak down the game based on menu predictions. Get the real-usage VRAM overlay in Afterburner if you don't have it already.

RE4 only has a memory leak with RT on, otherwise you can max it.

TLOU has a memory leak from the get-go, and it's on the list of fixes for a future patch.

So not bad, two recent bugged games.
Village is fine, but RE2, RE3, and RE4 will crash on maxed settings with RT because of VRAM.
I am using the dedicated VRAM overlay; it's going overboard and crashing at about 9700MB used.
 
That is a fair price if the card meets the standard expected from the xx70 series.

This was my original plan to build a PC, before I was offered a large bonus to switch airlines after my last contract ended. I splurged and bought a 4090, but no way I would have spent that much normally for a GPU. The prices of modern high-end cards are absurd, but this price is fair***. Hell, my travel gaming laptop has a 3070 Ti, and that's laptop form, and it's still a beast.

***If it performs within reason during testing.
 
While I understand that one may have been a flagship at the time, saying the power level is meaningless is wrong, as we will always need to take into consideration the market, component costs, and actual power level to determine value.
I can concede that the value level is lower, however it's not quite as bad as what is being implied (people are acting like it's 5x as much or something). The fact is it's still a very powerful card and is comparable to other hardware on the market at that price point today (some would even say cheaper than the competition for overall power).
The problem is that the better the tech gets, the fewer viable chips they produce. As the process shrinks, the failure rate goes up. I don't know what the failure rate is for the 4090, but I expect it's very high, and costly. I don't completely understand it, but I'm guessing the process for creating less powerful chips is quite a bit more successful, and therefore cheaper.

It's been a while since I heard it talked about, but I know failure rates on high-end hardware production were getting worse. Imagine the increased cost if half or more of your silicon could never make it to market.
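That intuition lines up with the standard yield math: under a simple Poisson yield model, the share of defect-free dies falls off exponentially with die area, so big flagship dies waste far more of each wafer. A rough sketch with made-up numbers (the defect density and die areas are purely illustrative):

```python
import math

# Simple Poisson yield model: yield = exp(-die_area * defect_density).
# Defect density and die areas are illustrative assumptions, not real fab data.
def poisson_yield(die_area_mm2: float, defects_per_mm2: float) -> float:
    return math.exp(-die_area_mm2 * defects_per_mm2)

defect_density = 0.001  # defects per mm^2, made up for illustration

for name, area_mm2 in [("large flagship-class die (~600 mm^2)", 600),
                       ("mid-range-class die (~300 mm^2)", 300)]:
    print(f"{name}: ~{poisson_yield(area_mm2, defect_density):.0%} defect-free dies")
```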
 