
RTX 4070 to cost $599. Launches April 13.

PeteBull

Member
Just your friendly reminder that frame generation itself gobbles up 1-2 GB of VRAM, depending on the game and implementation.

Good luck running frame gen on a limited 8-12 GB of VRAM in upcoming games. Stay away from these GPUs.
You aren't forced to use DLSS 3, aka frame generation. Don't come into buying a GPU with a mindset of "no, I won't buy this or that specific model/vendor because of some weird reason." Always look at independent benchmarks and shop in your price bracket, aka budget. If, say, you can comfortably spend around $600 on a GPU, take your time, check what's available to you, and then make a knowledgeable purchasing decision. If you want the best deal, you shouldn't look at these cards with a fanboy mindset, but with the clear mind of a customer who wants the best possible bang for their buck. That's the path that leads to making the best purchasing decision and not regretting it in the future.
 

yamaci17

Member
You aren't forced to use DLSS 3, aka frame generation. Don't come into buying a GPU with a mindset of "no, I won't buy this or that specific model/vendor because of some weird reason." Always look at independent benchmarks and shop in your price bracket, aka budget. If, say, you can comfortably spend around $600 on a GPU, take your time, check what's available to you, and then make a knowledgeable purchasing decision. If you want the best deal, you shouldn't look at these cards with a fanboy mindset, but with the clear mind of a customer who wants the best possible bang for their buck. That's the path that leads to making the best purchasing decision and not regretting it in the future.
Wdym, you have to. The 4060 Ti and 3070 literally cost the same, whereas:

The 4060 Ti has less bandwidth (it will be problematic at 1440p and above, where the 3070 is more capable)
The 4060 Ti has the same VRAM
Similar performance
Same PRICE ($500, most likely)

How is that progression? You pay the same price for a 3070-equivalent GPU because Jensen promises you 1.5x-2x frames with FG. But that will only work properly at 1080p, and with games that are not too limiting in terms of VRAM.

It's not weird or anything. See how problematic things have become with 8 GB. It is not a fanboy mindset. If anything, I despise AMD cards. I'd sooner get consoles than swap my 8 GB 3070 for a 12 GB 6700 XT.

The 4060 Ti and 4070 Ti are not going to be the best bang for your buck when, 1-1.5 years down the line, they become problematic. The 4060 Ti could even be problematic today. I'm sure it will crap out with frame gen at 1440p. If you think a $500 GPU only being capable of 1080p is okay in 2023, all the power to you.

The only justification for the 4070 Ti and 4060 Ti being $800/$500 is frame generation, and if that feature becomes unusable after 1-2 years when VRAM usage creeps up further, it is a legal scam.

All the power to you. What I see is FG adding 1-2 GB worth of VRAM on top of games that already stress 8 GB and 12 GB to their limits at 1080p and 1440p respectively. They've practically planned the obsolescence of FG on these lower-end cards. Sorry, but this is the brutal truth. The 4060 Ti should have 12 GB, and the 4070/4070 Ti should've had 16 GB, more so if FG is going to add extra VRAM usage completely unrelated to devs and their targets on consoles.
 
Last edited:

Spyxos

Gold Member
Just your friendly reminder that frame generation itself gobbles up 1-2 GB of VRAM, depending on the game and implementation.

Good luck running frame gen on a limited 8-12 GB of VRAM in upcoming games. Stay away from these GPUs.
At what resolution?
 

yamaci17

Member
At what resolution?
8 GB for 1080p
and 12 GB for 1440p



See here; the game runs somewhat okayish at 1440p DLSS Quality with ray tracing. It drops frames, however. VRAM is maxed out. The GPU is actually more than capable here; if not for VRAM, the framerate would've been more stable. But at least it does not drop into the 30s and 40s.

Move to 22:50, where he enables frame generation. Since VRAM was already tapped out at the aforementioned settings, frame gen adds additional VRAM pressure on top (pushing game assets out of the way, as the game will never allocate more than 11.2 GB of a 12 GB budget. You can also blame the developer on this one; they're not gonna change their stance on it. Many games do this. I've cried about it a lot. It is not going to change).

Now see how the 4070 Ti craps out! It drops frames horrendously into the 30s. Notice it did not do that without FG. If you want my honest opinion, this is just pathetic. The GPU has the grunt to run 100+ fps there with frame generation, but it drops frames horrendously because of the VRAM limitation. This is just a joke: a brand-new $800+ product with premium features, running at its destined resolution of 1440p. Of course you can turn down settings, but yeah, at $800 that will be a hard pill to swallow.

Should've been 16 GB.

The game literally runs better without FG at those settings. The GPU simply isn't equipped with enough VRAM to reliably use its own premium features. I find this a joke.
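
To put the numbers above into perspective, here is a rough back-of-the-envelope headroom check. The ~11.2 GB allocation cap and the 1-2 GB frame-gen overhead are the figures from this post; the baseline game usage is an illustrative assumption, not a measurement.

# Rough VRAM headroom sketch for a 12 GB card at 1440p with RT.
# The 11.2 GB allocation cap and the 1-2 GB frame-gen overhead come from the post above;
# the baseline usage figure is an assumption for illustration only.
ALLOC_CAP_GB = 11.2            # engine refuses to allocate the full 12 GB pool
GAME_USAGE_GB = 10.5           # assumed usage at 1440p DLSS Quality + RT (illustrative)
FG_OVERHEAD_GB = (1.0, 2.0)    # reported frame generation overhead range

for fg in FG_OVERHEAD_GB:
    headroom = ALLOC_CAP_GB - (GAME_USAGE_GB + fg)
    verdict = "fits" if headroom >= 0 else "over budget -> assets evicted, stutter"
    print(f"FG overhead {fg:.1f} GB: headroom {headroom:+.1f} GB ({verdict})")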




This is profoundly more relevant for the 4070 laptop and the 4060 Ti/4060 desktop cards that also ship with 8 GB of VRAM. And of course, the meme 4050 6 GB.

Sorry, but I have to criticise these. These GPUs are only worth a damn because of frame generation, literally nothing else. If that feature needs additional gigabytes of VRAM, it is too problematic; these GPUs are equipped with VRAM amounts that simply don't have the luxury to run stuff like this.
 
Last edited:

PeteBull

Member
Wdym, you have to. The 4060 Ti and 3070 literally cost the same, whereas:

The 4060 Ti has less bandwidth (it will be problematic at 1440p and above, where the 3070 is more capable)
The 4060 Ti has the same VRAM
Similar performance
Same PRICE ($500, most likely)

How is that progression? You pay the same price for a 3070-equivalent GPU because Jensen promises you 1.5x-2x frames with FG. But that will only work properly at 1080p, and with games that are not too limiting in terms of VRAM.

It's not weird or anything. See how problematic things have become with 8 GB. It is not a fanboy mindset. If anything, I despise AMD cards. I'd sooner get consoles than swap my 8 GB 3070 for a 12 GB 6700 XT.

The 4060 Ti and 4070 Ti are not going to be the best bang for your buck when, 1-1.5 years down the line, they become problematic. The 4060 Ti could even be problematic today. I'm sure it will crap out with frame gen at 1440p. If you think a $500 GPU only being capable of 1080p is okay in 2023, all the power to you.

The only justification for the 4070 Ti and 4060 Ti being $800/$500 is frame generation, and if that feature becomes unusable after 1-2 years when VRAM usage creeps up further, it is a legal scam.

All the power to you. What I see is FG adding 1-2 GB worth of VRAM on top of games that already stress 8 GB and 12 GB to their limits at 1080p and 1440p respectively. They've practically planned the obsolescence of FG on these lower-end cards. Sorry, but this is the brutal truth. The 4060 Ti should have 12 GB, and the 4070/4070 Ti should've had 16 GB, more so if FG is going to add extra VRAM usage completely unrelated to devs and their targets on consoles.
I'm not telling you or anyone else to buy a 4070 or 4060/Ti. I'm just saying, if you are shopping in that price range, come into it with an open mind. If you can find another product with more VRAM but similar performance and price/perf, obviously go with that, and check your local street prices, not MSRPs, which are often misleading.
Example from my country: a very established online store that even has its own YouTube channel with 370k+ subs, so you can tell it isn't some niche mom-and-pop shop. https://www.youtube.com/@MoreleTV

A quick look: you've got there, at the same price, a 4070 Ti Ventus (so lowest grade) https://www.morele.net/karta-graficzna-msi-geforce-rtx-4070-ti-ventus-3x-oc-12gb-gddr6x-12486402/ vs a low-grade model of the RX 6800 XT https://www.morele.net/karta-grafic...erc-319-core-16gb-gddr6-rx-68xtalfd9-9805954/

And here you can easily check how well both of those cards perform: https://www.techpowerup.com/gpu-specs/radeon-rx-6800-xt.c3694 As you can see, Nvidia's product has 21% higher performance despite 4 GB less VRAM, so a logically thinking customer is gonna choose that one over the other.

Always go for value, and like I said, look at your local prices, not MSRP.
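
If you want to turn "go for value" into an actual number, price per unit of relative performance is the crude way to do it. A minimal sketch, assuming the ~21% performance gap from the TechPowerUp comparison above and placeholder prices (substitute your own local street prices):

# Crude price-per-performance comparison; lower cost per performance unit is better.
# Relative performance: the linked TechPowerUp page puts the 4070 Ti roughly
# 21% ahead of the 6800 XT. Prices are placeholders, not real street prices.
cards = {
    "RTX 4070 Ti 12GB": {"price": 850.0, "rel_perf": 1.21},
    "RX 6800 XT 16GB":  {"price": 850.0, "rel_perf": 1.00},
}

for name, c in cards.items():
    print(f"{name}: {c['price'] / c['rel_perf']:.0f} per unit of relative performance")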
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I guess the minimum spec for every upcoming game, AAA or otherwise, is now an A770 with 16GB of VRAM?






Ohh, it's not?
People seem to have forgotten PC games have settings.
 

blastprocessor

The Amiga Brotherhood
Looking at the generations, you essentially have to move up the stack to get the previous generation's x80 performance. So Nvidia is making midrange performance more costly to attain.

GeForce GTX 1060 6GB ~ GeForce GTX 980 4GB

GeForce RTX 2060 6GB ~ GeForce GTX 1080 8GB

GeForce RTX 3060 Ti 8GB ~ GeForce RTX 2080 Super 8GB
(2080 ~ current-generation consoles)

GeForce RTX 4070 12GB ~ GeForce RTX 3080 10GB (at a guess)
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Do you really want to play Last of Us with medium texture settings?

Don't you dare mention that absolute insult of a medium setting to me.
And with a 10 or even 8GB card you can get away with using High Environment settings.

But yes, I would use medium settings in most games if my VRAM were limited, because unlike in TLoU, where the difference in texture quality between medium and high is the difference between vaseline in your eyes and 20/20 vision, most games have a gradual step.
TLoU straight up borked whatever the fuck the medium environment setting was supposed to be.
That's not even compression; that's a straight-up "don't bother loading it" setting.
 

blastprocessor

The Amiga Brotherhood
Don't you dare mention that absolute insult of a medium setting to me.
And with a 10 or even 8GB card you can get away with using High Environment settings.

But yes, I would use medium settings in most games if my VRAM were limited, because unlike in TLoU, where the difference in texture quality between medium and high is the difference between vaseline in your eyes and 20/20 vision, most games have a gradual step.
TLoU straight up borked whatever the fuck the medium environment setting was supposed to be.
That's not even compression; that's a straight-up "don't bother loading it" setting.

There appear to be more texture layers (e.g., dirt/stains) in TLOU (look under the window, the ground, etc.), hence I'd imagine that uses more VRAM.
 
Last edited:

Spyxos

Gold Member
Don't you dare mention that absolute insult of a medium setting to me.
And with a 10 or even 8GB card you can get away with using High Environment settings.

But yes, I would use medium settings in most games if my VRAM were limited, because unlike in TLoU, where the difference in texture quality between medium and high is the difference between vaseline in your eyes and 20/20 vision, most games have a gradual step.
TLoU straight up borked whatever the fuck the medium environment setting was supposed to be.
That's not even compression; that's a straight-up "don't bother loading it" setting.
Medium settings, medium settings, medium settings :) I wouldn't have a problem with medium settings if the card weren't just 2 years old. After 4-5 years I'm happy the game runs well at all, but this is way too early.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
There appear to be more texture layers (e.g., dirt/stains) in TLOU (look under the window, the ground, etc.), hence I'd imagine that uses more VRAM.
TLoU's Medium Environment Textures setting is an affront to textures.

I do believe you are right that the ND engine layers textures on top of each other, so it likely has to load a bunch of textures at once regardless, but that's neither here nor there, because at its core this is a slightly modified version of TLoU2... which on a base PS4 has better textures in larger environments.
Get fucked.
They borked something with this medium setting; literally every other medium setting in the game is fine, but the medium environment textures setting is straight-up disgusting.
 

DaGwaphics

Member
Then I can still point to broken Resi 4, Hogwarts and Forspoken as examples. This is just the beginning.

The writing is basically on the wall now (as it clearly has been since the day the current-gen consoles launched): 8GB will survive only at 1080p, and you'll likely need to turn down the texture quality. The horrible 1% lows on the 3060 Ti, 3070, and the rest of the 8GB cards were a foregone conclusion to anyone who was paying attention (at more than 1080p with the highest-quality textures).

The reason it will be that way is simply that developers now can do it that way, thanks to the memory in the consoles, and they will.
 
Last edited:

Djin1980

Neo Member
I had an 8700K, slightly OCed, paired with a GTX 1080, upgraded to a 3080 Ti, and the CPU is still okay, so yours will be too, unless you're aiming at 1080p/1440p and very high framerates, way above 60.
https://www.techpowerup.com/gpu-specs/geforce-rtx-2080.c3224 That's where your GPU lands currently. You can guesstimate that the RTX 4070 will be around RTX 3070 Ti/3080 performance; it has 12 GB of VRAM but a smaller bus, so of course there will be outliers among games, but that's the rough ballpark you can expect.

You gotta dig deep, because depending on independent benchmarks it might turn out that the currently available/discounted 6800/XT/3080 give you better performance, or at least better value aka price/perf ratio, even bought brand new with warranty.

Guesstimating around a 40-50% performance increase vs your current card, but whether it's worth it highly depends on actual prices; even if the MSRP is $599, the street price might be $650 or even close to $700, at least for some models.
Remember, the RTX 4070 is definitely a better card than your current RTX 2080, but the question is: is it better performance/value-wise than the currently available, relatively cheap 6800/XT/3080?

PSU-wise you will be fine, unless your current one is some low-quality no-name shit, but I doubt you chose one of those (unless you got a prebuilt; that's where bad PSUs usually end up :p ).
Here's proof: a 3070, which actually has a 220W TDP (so supposedly 20W higher than the 200W 4070), paired with your CPU of course, works fine with 550W PSUs https://www.whatpsu.com/psu/cpu/Intel-Core-i7-9700K/gpu/NVIDIA-GeForce-RTX-3070
Thanks for the extensive reply. I'll wait for some reviews, but the more I think about it, the more I want to skip this gen and wait for the 5xxx series; a +50% performance increase is a bit low for that price.
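
For what it's worth, the quoted PSU advice can be sanity-checked with rough wattage arithmetic. A minimal sketch, where the 200 W / 220 W GPU TDPs come from the post above and the CPU, platform and transient figures are assumptions for illustration:

# Rough PSU headroom estimate, a sketch only.
# GPU TDPs (200 W RTX 4070, 220 W RTX 3070) are the figures quoted above;
# CPU, rest-of-system and transient numbers are illustrative assumptions.
PSU_W = 550
CPU_W = 95              # assumed i7-9700K-class gaming load
REST_W = 75             # fans, drives, RAM, motherboard (assumption)
SPIKE_FACTOR = 1.3      # crude allowance for short power transients (assumption)

for gpu, gpu_w in (("RTX 4070", 200), ("RTX 3070", 220)):
    steady = gpu_w + CPU_W + REST_W
    peak = steady * SPIKE_FACTOR
    status = "OK" if peak <= PSU_W else "tight"
    print(f"{gpu}: ~{steady} W steady, ~{peak:.0f} W with spikes on a {PSU_W} W PSU ({status})")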
 

Thyuda

Member
The whole 4000 series is shaping up to be the biggest joke in recent GPU history, absolutely horrendous. I'm glad I got a 3080 when they were still available and okay-ish in price.
 

ToTTenTranz

Banned
Unless Nvidia releases that 4080'20G we know they have, for a price similar to or lower than that of the current 4080... Ada can suck my dick.

IIRC the AD103 chip in the 4080 uses a 256-bit bus, so it's either the current 16GB, or twice that at 32GB using clamshell operation (not going to happen with Nvidia on a consumer GPU).
The only way to get 20GB on a 4080 is either by cutting the memory channels from 8 to 5 (160-bit) or by putting an uneven amount of memory per channel like the GTX 970 or the GTX 660 Ti did.

TL;DR: the 4080's chip is designed for 16GB VRAM.
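
The capacity options fall straight out of the bus width: each 32-bit channel carries one GDDR6X package (2 GB at today's densities), and clamshell mode puts two packages on a channel. A quick sketch of that arithmetic, assuming standard 2 GB packages (this is only the math, not a SKU list):

# Possible VRAM capacities from bus width, assuming 32-bit channels and
# 2 GB GDDR6X packages. Clamshell doubles the packages per channel.
PACKAGE_GB = 2
CHANNEL_BITS = 32

def capacities(bus_bits):
    channels = bus_bits // CHANNEL_BITS
    return channels * PACKAGE_GB, channels * PACKAGE_GB * 2  # (normal, clamshell)

for bus in (256, 192, 160, 320):
    normal, clamshell = capacities(bus)
    print(f"{bus}-bit bus: {normal} GB normal, {clamshell} GB clamshell")

# 256-bit -> 16 GB / 32 GB  (the 4080 as shipped)
# 160-bit -> 10 GB / 20 GB  (cutting the channels from 8 to 5, as above)
# 320-bit -> 20 GB / 40 GB  (how a wider chip would reach 20 GB)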
 

scydrex

Member
Looking at the generations, you essentially have to move up the stack to get the previous generation's x80 performance. So Nvidia is making midrange performance more costly to attain.

GeForce GTX 1060 6GB ~ GeForce GTX 980 4GB

GeForce RTX 2060 6GB ~ GeForce GTX 1080 8GB

GeForce RTX 3060 Ti 8GB ~ GeForce RTX 2080 Super 8GB
(2080 ~ current-generation consoles)

GeForce RTX 4070 12GB ~ GeForce RTX 3080 10GB (at a guess)
The 3060 is worse than or equal to a 2070. How is the 3060 Ti equal to a 2080? I compared the 3060 vs the 2070 today, and it's not faster than a 2070.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
IIRC the AD103 chip in the 4080 uses a 256-bit bus, so it's either the current 16GB, or twice that at 32GB using clamshell operation (not going to happen with Nvidia on a consumer GPU).
The only way to get 20GB on a 4080 is either by cutting the memory channels from 8 to 5 (160-bit) or by putting an uneven amount of memory per channel like the GTX 970 or the GTX 660 Ti did.

TL;DR: the 4080's chip is designed for 16GB VRAM.
The rumored 4080'20G uses a cut-down AD102 on a 320-bit bus.

The gap between the 4080 and 4090 is simply too vast for it to not exist.




Note that Kopite has been right about some 99% of all the Nvidia chips that have come out... the names have changed somewhat, and with the whole 4080'12G/4070 Ti/4070 saga the chips were all right; the names were just speculation.
Even the 4060 and 4060 Ti chips were leaked by Kopite a long time ago.
 
Last edited:

Reallink

Member
The rumored 4080'20G uses a cut-down AD102 on a 320-bit bus.

The gap between the 4080 and 4090 is simply too vast for it to not exist.




Note that Kopite has been right about some 99% of all the Nvidia chips that have come out... the names have changed somewhat, and with the whole 4080'12G/4070 Ti/4070 saga the chips were all right; the names were just speculation.
Even the 4060 and 4060 Ti chips were leaked by Kopite a long time ago.

The absolute best-case scenario is that the 4080 Ti takes over the 4080's price point, i.e. is priced within $200-$300 of the 4090. So I'm not really sure what you're waiting for; you've already lost 6 months of use trying to save a measly 15-20%. Just buy a 4090.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The absolute best-case scenario is that the 4080 Ti takes over the 4080's price point, i.e. is priced within $200-$300 of the 4090. So I'm not really sure what you're waiting for; you've already lost 6 months of use trying to save a measly 15-20%. Just buy a 4090.
No.





Okay, yes, if the 4080'20G isn't announced by GTC Fall.

I'm very patient. I only play at 1440p, and past 100fps I can't really tell the difference... literally, my monitor goes all the way to 165, but I lock pretty much every game at 120.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I never understand comments about inflation, since... it's just repeating that prices increased, which is what we're complaining about :messenger_confused:.
To put the real price in context.

Saying you paid xyz back in 1902 means shit-all without context of what that xyz was actually worth... by giving the inflation-adjusted price, we now have an idea of what it was actually worth.
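
Adjusting a nominal price into today's money is a one-liner once you have the price-index figures for the two years. A minimal sketch; the index values below are placeholders, not real CPI data:

# Inflation-adjusting a historical price, a minimal sketch.
# The CPI values are placeholders -- substitute real index numbers for the two years.
def real_price(nominal, cpi_then, cpi_now):
    """Convert a nominal price from a past year into today's money."""
    return nominal * (cpi_now / cpi_then)

# e.g. a $500 card from a few years back, with made-up index values:
print(f"${real_price(500, cpi_then=250.0, cpi_now=300.0):.0f} in today's money")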
 

dave_d

Member
I never understand comments about inflation, since... it's just repeating that prices increased, which is what we're complaining about :messenger_confused:.
I guess people are trying to justify it, when all I'm seeing is Econ 101: demand is high relative to supply, so prices go up. (Should I give the example of natural gas?)
 

CrustyBritches

Gold Member
I'm guessing this lands somewhere between the 3080 and 3080 Ti at 1080p/1440p, so basically a 6800 XT. I've been kind of bouncing between the idea of getting a 4070 or a 6800 XT. The 6800 XT is like $30-40 cheaper, has more memory, and sometimes includes TLOU 1, which might be sellable for $20. The 4070 has DLSS + DLSS 3, and probably better RT performance. I considered used, but at this price I'd rather have new with the warranty.
 

DaGwaphics

Member
I guess people are trying to justify it, when all I'm seeing is Econ 101: demand is high relative to supply, so prices go up. (Should I give the example of natural gas?)

GPU demand is at a 10-year low, but go on.

The vendors have just decided to stretch their stack by moving the high end to $1k and above. They are willing to accept fewer sales for higher margins; it is what it is. And before someone says something about inflation and rising costs: both companies are making more profit per GPU than they ever have at MSRP pricing (though AMD has now discounted away a good portion of their margins on the 6000 series).
 

PeteBull

Member
To put the real price in context.

Saying you paid xyz back in 1902 means shit-all without context of what that xyz was actually worth... by giving the inflation-adjusted price, we now have an idea of what it was actually worth.
Exactly this. In countries with big inflation it means nothing what you paid for stuff a few years back, and over the last two years it looks like the US experienced a big jump in inflation too (European countries of course did much worse, but the US isn't the near-zero-inflation heaven it was a few years ago anymore). So you can never expect your $500 to have the same value it once had; otherwise you're getting dangerously close to that immigrant grandfather complaining/bragging that 80 years ago he came fresh off the boat with only 20 bucks in his pocket (of course he never mentions that food cost 20 cents or so, i.e. that money was worth much, much more).

It's like Canadians or Australians saying "I'm not paying $90-100 for this game" when a few years back they paid visibly less; their money is simply worth much less than it was a few years ago, and much less versus the same amount in US dollars.
 

PeteBull

Member
GPU demand is at a 10-year low, but go on.

The vendors have just decided to stretch their stack by moving the high end to $1k and above. They are willing to accept fewer sales for higher margins; it is what it is. And before someone says something about inflation and rising costs: both companies are making more profit per GPU than they ever have at MSRP pricing (though AMD has now discounted away a good portion of their margins on the 6000 series).
Inflation is a real thing, and over the last two years it was especially nasty around the globe. Of course your point is valid too: Nvidia/AMD, their board partners, and even retailers got used to ungodly fricken margins because of the crypto boom.

A good example of this: in my country, the owner of one of the big online tech/PC-component stores (and a football club owner at the same time) is one of the richest guys around. The thing is, he wasn't anywhere close to that a few years back; he got a sizeable chunk of his wealth very recently, selling PC components (especially GPUs) over the last two years on the back of the crypto boom, and landed at number 38 on the country's rich list for 2022, when only a year prior he wasn't even in the top 100. This year, for 2023, he is already in the top 20, even though his football club pays the highest salaries/contracts to its players while being from a small city with barely any fans coming to matches; the stadium only has 5,000 seats lol :)
 
Last edited:

SmokedMeat

Gamer™
Do you really want to play Last of Us with medium texture settings?

I can confirm Resident Evil 4 looked poor with medium textures on my old 8GB 3070 Ti.
I'm sure The Last of Us would look equally bad.

People need to do their due diligence and read up on these cards before just settling for whatever slop Nvidia wants to shovel out. We're not seeing meaningful improvements or value because people just blind-buy, and they know it.
 

blastprocessor

The Amiga Brotherhood
The 3060 is worse than or equal to a 2070. How is the 3060 Ti equal to a 2080? I compared the 3060 vs the 2070 today, and it's not faster than a 2070.

See TechPowerUp re: your query; the 3060 Ti's relative performance is 2% higher than the 2080 Super's.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I can confirm Resident Evil 4 looked poor with medium textures on my old 8GB 3070 Ti.
I'm sure The Last of Us would look equally bad.

People need to do their due diligence and read up on these cards before just settling for whatever slop Nvidia wants to shovel out. We're not seeing meaningful improvements or value because people just blind-buy, and they know it.
You bought a 3070 Ti (obviously during the pandemic)... I wouldn't talk to anyone about settling for slop.



 
Last edited:

SmokedMeat

Gamer™
You bought a 3070 Ti (obviously during the pandemic)... I wouldn't talk to anyone about settling for slop.

I bought it during the pandemic for MSRP, which was extremely difficult to do unless you were camped out at the one retailer that happened to get FE models in.

With things being back to normal, there's no reason not to research all of your options.
 

SolidQ

Member
Very interesting video about VRAM with a UE5 dev. It's the first time I've heard that RTX IO needs 32GB of VRAM :messenger_grinning_smiling:
 
Last edited:

winjer

Gold Member