
Rumor: NVIDIA GeForce RTX 4070 Graphics Card Specs, Performance, Price & Availability (300W + 36TF performance)

The performance doesn't scale linearly with resolution though; to go from FHD to 4K you need "just" 2-2.5x more computing power, depending on the title. So 4K120 in reality needs about 4-5x more power than 1080p60, which is still a lot, but the upcoming RTX 4000 and RDNA3 cards should have enough of it.
Yeah, I figured I was oversimplifying it. Thank you for clarifying how it actually works.
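The arithmetic above can be sketched quickly (the 2-2.5x resolution cost per title is the post's estimate, not a measured figure):

```python
# Rough sketch of the scaling arithmetic above: 4K has 4x the pixels of 1080p,
# but (per the post) real-world compute cost only grows ~2-2.5x from FHD to 4K,
# so 4K120 ends up needing roughly 4-5x the power of 1080p60.

def compute_multiplier(res_cost: float, fps_ratio: float) -> float:
    """Total GPU-power multiplier: resolution cost times framerate ratio."""
    return res_cost * fps_ratio

fps_ratio = 120 / 60  # 1080p60 -> 4K120 doubles the framerate target

low = compute_multiplier(2.0, fps_ratio)   # lighter title
high = compute_multiplier(2.5, fps_ratio)  # heavier title

print(f"4K120 needs {low:.0f}x-{high:.0f}x the power of 1080p60")
# -> 4K120 needs 4x-5x the power of 1080p60
```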
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I don't think these games actually use that much VRAM. Pretty sure they just "hold" as much as they can for cache?

On my PC at 1080p/max it takes about 5GB on my 6GB card, and even that is too much I think; some of it must be the cache.
The in-game menu is borked:
it lists basically double what the game would actually reserve.
 

Korranator

Member
Actually, a 3070 will play 4K60 just fine. Sure, you may have to tweak the settings a bit, but it can be done.

Also, for the life of me I still don't understand why people insist on playing games on ultra settings, unless it's a dick-measuring contest. I thought gamers understood that ultra settings are for benchmarks and screenshots only? There are plenty of videos out there proving that point.


 

RedPyramidHead

Gold Member
Also, for the life of me I still don't understand why people insist on playing games on ultra settings, unless it's a dick-measuring contest.

 

Beanbox

Member
Actually, a 3070 will play 4K60 just fine. Sure, you may have to tweak the settings a bit, but it can be done.

Also, for the life of me I still don't understand why people insist on playing games on ultra settings, unless it's a dick-measuring contest. I thought gamers understood that ultra settings are for benchmarks and screenshots only? There are plenty of videos out there proving that point.



Depends on the game and the specific setting, but overall you're right. Also, people don't insist; most gamers are in the mid-range segment. There's just a segment of enthusiasts who want to spend money and see how many FPS they can get while maxing everything out.
It's less about the visual quality and more about the thrill of overclocking and chasing performance.
 

hlm666

Member
So they really want RTX 3080 owners to NOT buy the RTX 4080.
It was bad enough with the old specs, but this is getting insulting.
They are cutting up that chip some more?

They better not try to be clever and say the 4080 Ti is the full AD103.
The 4080 Ti better be AD102 and at least... at least 320-bit.


P.S.

This is not a Spec Bump... it's a Spec Dump.
The old specs had the RTX 4080 with 10240 CUDA cores; this drops it down to 9728.
What does a 3080 do in TSE, about ~10k? If this was priced at $700 like the 3080, it would be a pretty compelling upgrade. Maybe there'll be some larger increases in RT than raster? I think it would make a good upgrade over a 3080 if the price is the same; it won't be, though ($900-999, as per my call earlier in this thread). The prices across the board are going to make all these 4000-series cards good or bad, and I feel most will lean in the bad direction. The 4090 looks awesome compared to the 3090, but at $2k RRP, with a Ti or Titan follow-up above that price, they start to look a little less impressive.

I'm more interested in seeing the pricing at this point than the performance metrics. If they're pulling this kind of stuff and looking to push their customers to more expensive tiers to get the performance uplift they want, it doesn't imply they think AMD are going to be a threat either. We need AMD to do what they did with the 4050hd/4070hd (great price/perf), but they seem happy pricing around Nvidia prices and pretending a 10% perf win in some games will gain market share. Nvidia aren't Intel, and AMD need to stop waiting for them to screw up; they did it with the Samsung-node 3000 series and failed to punish it, and they aren't getting another chance for a while.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
What does a 3080 do in TSE, about ~10k? If this was priced at $700 like the 3080, it would be a pretty compelling upgrade. Maybe there'll be some larger increases in RT than raster? I think it would make a good upgrade over a 3080 if the price is the same; it won't be, though ($900-999, as per my call earlier in this thread). The prices across the board are going to make all these 4000-series cards good or bad, and I feel most will lean in the bad direction. The 4090 looks awesome compared to the 3090, but at $2k RRP, with a Ti or Titan follow-up above that price, they start to look a little less impressive.

I'm more interested in seeing the pricing at this point than the performance metrics. If they're pulling this kind of stuff and looking to push their customers to more expensive tiers to get the performance uplift they want, it doesn't imply they think AMD are going to be a threat either. We need AMD to do what they did with the 4050hd/4070hd (great price/perf), but they seem happy pricing around Nvidia prices and pretending a 10% perf win in some games will gain market share. Nvidia aren't Intel, and AMD need to stop waiting for them to screw up; they did it with the Samsung-node 3000 series and failed to punish it, and they aren't getting another chance for a while.

Realistically, Nvidia could easily double the performance of all their tiers with the chips they've got.
I'm just being a bit of a stickler because the gen-on-gen upgrade from 3080 -> 4080 and 3070 -> 4070 is so much smaller than the jump from 3090 -> 4090.
Coming from Ampere, where the 3080 wasn't "that" far off a 3090, it feels like the upgrade is being purposely held back.
The 4080 and 4090 have a huge gap between them.
 

hlm666

Member
Realistically, Nvidia could easily double the performance of all their tiers with the chips they've got.
I'm just being a bit of a stickler because the gen-on-gen upgrade from 3080 -> 4080 and 3070 -> 4070 is so much smaller than the jump from 3090 -> 4090.
Coming from Ampere, where the 3080 wasn't "that" far off a 3090, it feels like the upgrade is being purposely held back.
The 4080 and 4090 have a huge gap between them.
That's completely valid. I've got a 3080 and I'm not hugely interested in the 4080 unless it's cheaper than I expect, because of what you're saying here. The price gap between them might be bigger than the 3080/3090 one, though.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
The price gap between them might be bigger than the 3080/3090 one, though.
Hmm, that's a good point.
But it would seem unlikely.

The 3090 was double the price of the 3080.

If we assume Nvidia is adding a $100 price premium for "reasons", the 4080 will be ~$800 MSRP.
$1,600 for a 4090 seems insane, but then I remember people were buying 3090s for $2k.

I'm praying to the moons and all the elder gods they don't price hike these things and scalpers/miners stay the fuck away... can we get another crypto crash ASAP?

I'm in no rush to upgrade as it is, since I don't think a game will truly crush a 3080 at 3440x1440, but those high framerates... and CUDA are all I'm hunting for.

I'll probably end up waiting to see what the 4080 Ti is; there's too much space between the 4080 and 4090 for there not to be a card that slots in between the two.
 

Kenpachii

Gold Member
Actually, a 3070 will play 4K60 just fine. Sure, you may have to tweak the settings a bit, but it can be done.

Also, for the life of me I still don't understand why people insist on playing games on ultra settings, unless it's a dick-measuring contest. I thought gamers understood that ultra settings are for benchmarks and screenshots only? There are plenty of videos out there proving that point.



Because ultra actually gives better settings that people want; the same goes for modding even beyond ultra.

Them pretending you can't see a difference is just them being special. Sure, in some games it's minor at best, and some settings can have a huge impact, like in RDR2, where they're shit-optimized on PC and you can ignore them, but that's a far cry from most games.
 

Rickyiez

Member


A 4070 at 300W is possibly faster than the 3090 at 350W. That's pretty sweet. Not a big fan of the 4080's consumption though; hopefully AMD has something just as fast that consumes less.

 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not


A 4070 at 300W is possibly faster than the 3090 at 350W. That's pretty sweet. Not a big fan of the 4080's consumption though; hopefully AMD has something just as fast that consumes less.


A 300W xx70?
That's pretty sweet?

Mate, what?


Let's look at the last few generations:

  • 970 - 150W
  • 1070 - 150W
  • 2070 - 174W
  • 3070 - 220W
  • 4070 - 300W????

Sweet?
Mate, Nvidia are taking the piss.
If it wasn't for kopite being such a reliable leaker, I wouldn't believe Nvidia would be cocky enough to make an xx70 that's 300W+ when we've been skating the 200W line for years and years in the midrange.
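The gen-on-gen jumps in that list work out as follows (the 300W figure is the rumor under discussion, not a confirmed spec):

```python
# Gen-on-gen TDP jumps for the xx70 tier, using the figures in the list above.
# The 4070's 300W is the rumored number this thread is about, not official.
tdp_watts = {"970": 150, "1070": 150, "2070": 174, "3070": 220, "4070": 300}

cards = list(tdp_watts)
for prev, curr in zip(cards, cards[1:]):
    pct = (tdp_watts[curr] - tdp_watts[prev]) / tdp_watts[prev] * 100
    print(f"{prev} -> {curr}: {pct:+.0f}%")
# The 3070 -> 4070 step (+36%) is the largest jump in the list.
```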
 

Rickyiez

Member
A 300W xx70?
That's pretty sweet?

Mate, what?


Let's look at the last few generations:

  • 970 - 150W
  • 1070 - 150W
  • 2070 - 174W
  • 3070 - 220W
  • 4070 - 300W????

Sweet?
Mate, Nvidia are taking the piss.
If it wasn't for kopite being such a reliable leaker, I wouldn't believe Nvidia would be cocky enough to make an xx70 that's 300W+ when we've been skating the 200W line for years and years in the midrange.
Well, the 30-series was already hinting at heavy bumps in power consumption. At least it's getting 3090 performance or better with 50W less. Meanwhile, some here still believe the 4080 will only be as fast as a 3090 🤷‍♂️
 

64bitmodels

Reverse groomer.
Now we're calling the xx70 line "high end"? I classify it as mid-tier.

high end: xx80 and xx90
mid tier: xx70
economy tier: xx60

Anything below is low tier. It looks like this gen will make the difference between tiers even larger.
Ehhhhh.
Economy tier implies the 60 cards are budget options, which is true to an extent, but the 50 series fits that far better; the 60 cards should be mid-tier, and anything lower than that, low tier.

Anyway, it's surprising that no one here is mentioning the Nvidia A2000.
Is it expensive? Yeah, it's $700 ($520 on Amazon), but it's 3060 power in the size of a low-profile 1650 (although it has less VRAM, at 6GB).
Not only that, but the power consumption is so fucking LOW that it can run off PCIe slot power alone, with 3060 performance. Nvidia should be marketing and pushing LP GPUs like this more; I hadn't even heard of it until this guy's video.

 
I'm pretty sure a 3080 will be more than good enough until the 5000 series comes out, so I'm not going to bother with this. Maybe by then the shortages will be over and it won't take 1.5 years to easily get one that isn't scalped.

The whole 3000-series launch, the upselling, the Newegg Shuffle, etc. left a bad taste in my mouth that'll be there for a couple more years.
 

//DEVIL//

Member


A 4070 at 300W is possibly faster than the 3090 at 350W. That's pretty sweet. Not a big fan of the 4080's consumption though; hopefully AMD has something just as fast that consumes less.

I don't believe these numbers.

I don't think there was ever a generational jump where a card is 100% more powerful than the previous generation (if I'm not mistaken).
 

Reallink

Member
Ehhhhh.
Economy tier implies the 60 cards are budget options, which is true to an extent, but the 50 series fits that far better; the 60 cards should be mid-tier, and anything lower than that, low tier.

Anyway, it's surprising that no one here is mentioning the Nvidia A2000.
Is it expensive? Yeah, it's $700 ($520 on Amazon), but it's 3060 power in the size of a low-profile 1650 (although it has less VRAM, at 6GB).
Not only that, but the power consumption is so fucking LOW that it can run off PCIe slot power alone, with 3060 performance. Nvidia should be marketing and pushing LP GPUs like this more; I hadn't even heard of it until this guy's video.

Because there are only like 3 people on the planet who care about the size or power draw of the $800 GPU they're putting in their 3-cubic-foot PC gaming tower. Marketing such a product would be flushing marketing dollars down the toilet.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Well, the 30-series was already hinting at heavy bumps in power consumption. At least it's getting 3090 performance or better with 50W less. Meanwhile, some here still believe the 4080 will only be as fast as a 3090 🤷‍♂️
The 3070 was like a 25% increase... and that's almost entirely because the 2070 was power-starved at that TDP; it should have been 200W (see the 2070 Super).
Even the 3070's 220W TDP is actually closer to 200W in the real world.
300W is ridiculous: a ~36% increase?
Breaking into 300W is not a good thing for the midrange, because you effectively can't have a midrange PSU now; your midrange computer needs high-end parts to keep up with the "midrange GPU"?
I don't believe these numbers.

I don't think there was ever a generational jump where a card is 100% more powerful than the previous generation (if I'm not mistaken).
None of these figures are out of the realm of possibility.

The 4090 Ti/Titan being full AD102 and some way ahead of the next chip down is different from the 3090 Ti, which is only a few cores over a base 3090; I'd say it's about an overclock ahead of the 3090.
The 3090 Ti was a joke, Nvidia spitting in the mouth of the people who bought them.
This time the Titan is a sizeable upgrade over the 4090, so that kind of skews the gen-on-gen increase.
I can see why Nvidia wants this to be the first card launched... they pretty much already knew it was going to be near double a 3090.

But if we look just a generation ago, the 3080 Ti over the 2080 Ti was about 55%.
Ada is an increase in core count, architectural improvements and a node shrink; gains should be higher than last gen. Here are my TSE predictions:

2080 Ti -> 3080 Ti: ~55% uplift.
3090 -> 4090: ~60% uplift.
3080 -> 4080: ~62% uplift.
3070 -> 4070: ~60% uplift.
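Plugging those predicted uplifts into rough TSE baselines gives ballpark scores. The ~10k 3080 figure comes from earlier in the thread; the 3090 and 3070 baselines below are placeholders for illustration only:

```python
# Applying the thread's predicted gen-on-gen uplifts to rough TSE baselines.
# Only the 3080's ~10k score is cited in the thread; the 3090 and 3070
# baselines here are illustrative placeholders, and the uplifts are guesses.
baseline_tse = {"3090": 10500, "3080": 10000, "3070": 8000}

predicted_uplift = {
    "4090": ("3090", 0.60),
    "4080": ("3080", 0.62),
    "4070": ("3070", 0.60),
}

for new_card, (old_card, uplift) in predicted_uplift.items():
    predicted = baseline_tse[old_card] * (1 + uplift)
    print(f"{old_card} -> {new_card}: ~{predicted:,.0f} TSE")
```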
 

OZ9000

Member
A 300W xx70?
That's pretty sweet?

Mate, what?


Let's look at the last few generations:

  • 970 - 150W
  • 1070 - 150W
  • 2070 - 174W
  • 3070 - 220W
  • 4070 - 300W????

Sweet?
Mate, Nvidia are taking the piss.
If it wasn't for kopite being such a reliable leaker, I wouldn't believe Nvidia would be cocky enough to make an xx70 that's 300W+ when we've been skating the 200W line for years and years in the midrange.
I'm still rocking my 2070S and it holds up incredibly well.

I'll probably cave in for the 4070, but that power jump is staggering.

The trend of ever-increasing power draw is pretty fucking retarded.
 

//DEVIL//

Member
The 3070 was like a 25% increase... and that's almost entirely because the 2070 was power-starved at that TDP; it should have been 200W (see the 2070 Super).
Even the 3070's 220W TDP is actually closer to 200W in the real world.
300W is ridiculous: a ~36% increase?
Breaking into 300W is not a good thing for the midrange, because you effectively can't have a midrange PSU now; your midrange computer needs high-end parts to keep up with the "midrange GPU"?

None of these figures are out of the realm of possibility.

The 4090 Ti/Titan being full AD102 and some way ahead of the next chip down is different from the 3090 Ti, which is only a few cores over a base 3090; I'd say it's about an overclock ahead of the 3090.
The 3090 Ti was a joke, Nvidia spitting in the mouth of the people who bought them.
This time the Titan is a sizeable upgrade over the 4090, so that kind of skews the gen-on-gen increase.
I can see why Nvidia wants this to be the first card launched... they pretty much already knew it was going to be near double a 3090.

But if we look just a generation ago, the 3080 Ti over the 2080 Ti was about 55%.
Ada is an increase in core count, architectural improvements and a node shrink; gains should be higher than last gen. Here are my TSE predictions:

2080 Ti -> 3080 Ti: ~55% uplift.
3090 -> 4090: ~60% uplift.
3080 -> 4080: ~62% uplift.
3070 -> 4070: ~60% uplift.

Your estimate is about what I would say too. Each generation is about a 50% increase, give or take a few percent. But there's no way it will be a 100% increase; it's not logical.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Your estimate is about what I would say too. Each generation is about a 50% increase, give or take a few percent. But there's no way it will be a 100% increase; it's not logical.
For the 3090 Ti to the 4090 Ti/Titan it's much, much higher (not 100%, but much higher gen-on-gen).
It kind of makes sense, because the 3090 Ti was basically an overclocked 3090, whereas the 4090 Ti isn't an overclocked 4090; there are ~2000 extra CUDA cores and it's using the latest 24Gbps memory chips.

Down the stack, yeah, 50-60% should be the norm and within what usually happens, Turing notwithstanding.
 

//DEVIL//

Member
For the 3090 Ti to the 4090 Ti/Titan it's much, much higher (not 100%, but much higher gen-on-gen).
It kind of makes sense, because the 3090 Ti was basically an overclocked 3090, whereas the 4090 Ti isn't an overclocked 4090; there are ~2000 extra CUDA cores and it's using the latest 24Gbps memory chips.

Down the stack, yeah, 50-60% should be the norm and within what usually happens, Turing notwithstanding.
Assuming this is correct, look at the graph posted above: everything down to the 4070 is 100% more powerful than its 3000-series counterpart in TSE.

There is no way that's correct.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Assuming this is correct, look at the graph posted above: everything down to the 4070 is 100% more powerful than its 3000-series counterpart in TSE.

There is no way that's correct.
Whoever made that graph isn't very good at maths and/or doesn't know how to present uplifts.
And they're probably overestimating TSE for these new cards.
In real-world examples it'll be quite different.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Anyway, it's surprising that no one here is mentioning the Nvidia A2000.
Is it expensive? Yeah, it's $700 ($520 on Amazon), but it's 3060 power in the size of a low-profile 1650 (although it has less VRAM, at 6GB).
Not only that, but the power consumption is so fucking LOW that it can run off PCIe slot power alone, with 3060 performance. Nvidia should be marketing and pushing LP GPUs like this more; I hadn't even heard of it until this guy's video.


Hahahaha, ~$700 for a below-RTX 2060-level card.
It's literally a 1660S with RT cores.
For gaming, Nvidia's worst GeForce cards beat it.

You can get an RTX 2060 for $250.
 

Sanepar

Member
Actually, a 3070 will play 4K60 just fine. Sure, you may have to tweak the settings a bit, but it can be done.

Also, for the life of me I still don't understand why people insist on playing games on ultra settings, unless it's a dick-measuring contest. I thought gamers understood that ultra settings are for benchmarks and screenshots only? There are plenty of videos out there proving that point.


If it's to play on lower settings, I'll just go for consoles.
 

//DEVIL//

Member
If it's to play on lower settings, I'll just go for consoles.
You would say that, but no. If COD is anything to go by, the max output on consoles is 120fps at FHD, with medium graphics settings compared to a PC.

The 3070 with DLSS on Quality can get you 4K 120fps at ultra settings.
If you go medium settings, you'll get a locked 144 frames with DLSS on Quality, and 100 frames at native 4K.

So no, consoles aren't even close to PC in terms of graphics.

They are severely underpowered. I mean, the Matrix demo at 30fps FHD with all the blur should have given you a hint :/
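For context on why DLSS Quality makes "4K120" plausible here: DLSS renders internally at a reduced resolution and upscales. The Quality and Performance scale factors below are Nvidia's published per-axis ratios; the Balanced figure is approximate:

```python
# DLSS renders internally at a lower resolution and upscales. Per-axis scale
# factors: Quality = 1/1.5 and Performance = 1/2 are Nvidia's published ratios;
# Balanced (~0.58) is approximate. This is why "4K with DLSS Quality" costs
# far less GPU time than native 4K.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 1 / 2}

def internal_res(width: int, height: int, mode: str) -> tuple:
    """Internal render resolution before DLSS upscales to the target."""
    s = DLSS_SCALE[mode]
    return round(width * s), round(height * s)

print(internal_res(3840, 2160, "Quality"))      # (2560, 1440)
print(internal_res(3840, 2160, "Performance"))  # (1920, 1080)
```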
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Finally some good news.
Not only that, but it looks like my predicted 4080 Ti is at least in the works.
I am finally truly excited for Ada.
I ain't coughing up for a 4090 now that I know a 20GB Ti is likely on its way.

AD102 was just too big a chip for them not to cut it up and fit at least one more card in there.


RTX 4080 - 16GB GDDR6X
RTX 4080 Ti - 20GB GDDR6X
RTX 4090 - 24GB GDDR6X
RTX 4090 Ti - 48GB GDDR6X
 

tusharngf

Member
rumored specs

NVIDIA GeForce RTX 4080 Series Preliminary Specs:

| Spec | RTX 4080 Ti | RTX 4080 | RTX 3080 Ti | RTX 3080 |
| GPU Name | Ada Lovelace AD102-250? | Ada Lovelace AD103-300? | Ampere GA102-225 | Ampere GA102-200 |
| Process Node | TSMC 4N | TSMC 4N | Samsung 8nm | Samsung 8nm |
| Die Size | ~450mm² | ~450mm² | 628.4mm² | 628.4mm² |
| Transistors | TBD | TBD | 28 Billion | 28 Billion |
| CUDA Cores | 14848 | 9728? | 10240 | 8704 |
| TMUs / ROPs | TBD / 232? | TBD / 214? | 320 / 112 | 272 / 96 |
| Tensor / RT Cores | TBD / TBD | TBD / TBD | 320 / 80 | 272 / 68 |
| Base Clock | TBD | TBD | 1365 MHz | 1440 MHz |
| Boost Clock | ~2600 MHz | ~2500 MHz | 1665 MHz | 1710 MHz |
| FP32 Compute | ~55 TFLOPs | ~50 TFLOPs | 34 TFLOPs | 30 TFLOPs |
| RT TFLOPs | TBD | TBD | 67 TFLOPs | 58 TFLOPs |
| Tensor TOPs | TBD | TBD | 273 TOPs | 238 TOPs |
| Memory Capacity | 20 GB GDDR6X | 16 GB GDDR6X | 12 GB GDDR6X | 10 GB GDDR6X |
| Memory Bus | 320-bit | 256-bit | 384-bit | 320-bit |
| Memory Speed | 21.0 Gbps? | 21.0 Gbps? | 19 Gbps | 19 Gbps |
| Bandwidth | 840 GB/s | 672 GB/s | 912 GB/s | 760 GB/s |
| TBP | 450W | 320W | 350W | 320W |
| Price (MSRP / FE) | $1199 US? | $699 US? | $1199 US | $699 US |
| Launch (Availability) | 2023? | July 2022? | 3rd June 2021 | 17th September 2020 |

NVIDIA GeForce RTX 4070 Series Preliminary Specs:

| Spec | RTX 4070 Ti | RTX 4070 | RTX 3070 Ti | RTX 3070 |
| GPU Name | AD104-400? | AD104-400? | Ampere GA104-400 | Ampere GA104-300 |
| Process Node | TSMC 4N | TSMC 4N | Samsung 8nm | Samsung 8nm |
| Die Size | ~300mm² | ~300mm² | 395.2mm² | 395.2mm² |
| Transistors | TBD | TBD | 17.4 Billion | 17.4 Billion |
| PCB | NVIDIA PG141-SKU331 | NVIDIA PG141-310 SKU341 | NVIDIA PG141 | NVIDIA PG142 |
| CUDA Cores | ~7680 | ~7680 | 6144 | 5888 |
| TMUs / ROPs | TBD / 160 | TBD / 160 | 192 / 96 | 184 / 96 |
| Tensor / RT Cores | TBD / TBD | TBD / TBD | 192 / 48 | 184 / 46 |
| Base Clock | TBD | TBD | 1575 MHz | 1500 MHz |
| Boost Clock | ~2.6 GHz | ~2.5 GHz | 1770 MHz | 1730 MHz |
| FP32 Compute | ~40 TFLOPs | ~38 TFLOPs | 22 TFLOPs | 20 TFLOPs |
| RT TFLOPs | TBD | TBD | 42 TFLOPs | 40 TFLOPs |
| Tensor TOPs | TBD | TBD | 174 TOPs | 163 TOPs |
| Memory Capacity | 12 GB GDDR6X? | 12 GB GDDR6X? | 8 GB GDDR6X | 8 GB GDDR6 |
| Memory Bus | 192-bit | 192-bit | 256-bit | 256-bit |
| Memory Speed | 21 Gbps | 21 Gbps | 19 Gbps | 14 Gbps |
| Bandwidth | 504 GB/s | 504 GB/s | 608 GB/s | 448 GB/s |
| TBP | ~400W | 285W? | 290W | 220W |
| Price (MSRP / FE) | $599 US? | $499 US? | $599 US | $499 US |
| Launch (Availability) | 2022 | 2022 | 10th June 2021 | 29th October 2020 |
Source https://wccftech.com/nvidia-geforce...-4070-at-285w-tbp-tamer-powerful-gaming-gpus/
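The bandwidth rows in these tables follow directly from memory speed and bus width (bandwidth in GB/s = per-pin speed in Gbps × bus width in bits / 8), which is a quick way to sanity-check leaked specs:

```python
# Sanity-checking the bandwidth rows in the tables above. For GDDR6/GDDR6X:
#   bandwidth (GB/s) = memory speed (Gbps per pin) * bus width (bits) / 8
def bandwidth_gbs(speed_gbps: float, bus_bits: int) -> float:
    return speed_gbps * bus_bits / 8

# (card, speed in Gbps, bus in bits, GB/s figure from the table)
rows = [
    ("RTX 4080 Ti", 21.0, 320, 840),
    ("RTX 4080",    21.0, 256, 672),
    ("RTX 4070",    21.0, 192, 504),
    ("RTX 3070",    14.0, 256, 448),
]
for name, speed, bus, expected in rows:
    assert bandwidth_gbs(speed, bus) == expected, name
    print(f"{name}: {bandwidth_gbs(speed, bus):.0f} GB/s")
```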
 

Sanepar

Member
rumored specs (RTX 4080 / RTX 4070 preliminary spec tables quoted above)
If the 4080 is at least 60% faster in gaming than the 3080, I'll jump in.
 

nkarafo

Member
These power requirements are getting ridiculous. After having to deal with GPUs being 2x more expensive than normal for two years, now we have to pay as much in power bills if we buy a new-gen card.

Even the 4060 is probably going to be a 220W+ card :messenger_neutral:

I'm thinking of switching to consoles... I like efficiency.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
rumored specs (RTX 4080 / RTX 4070 preliminary spec tables quoted above)

Not like this.
If the 70 Ti is AD104-based and on the same bus as the 4070, expect it to sell horribly.
It needs to be on AD103, else that extra $100 won't do much to sway people towards it... considering you'd be within spitting distance of an FE 4080.

aw man, I dont wanna wait for 2023 for a 4080Ti

I think we're gonna have to.
My only choices going into this generation were:
320-bit 20GB card - 4080 Ti
Skip the generation
RTX 4090

In that order.

So if that 4080 Ti has any chance, any chance, of coming out before August 2023, I'm holding out.
It's the perfect upgrade IMO for 3080-class cards.
The 4090 is likely gonna cost an arm and a leg, but performance and FOMO had a chance of forcing my hand; seeing that Nvidia is at least thinking about an 80 Ti takes the 4090 off the table.
 

CuNi

Member

Not like this.
If the 70 Ti is AD104-based and on the same bus as the 4070, expect it to sell horribly.
It needs to be on AD103, else that extra $100 won't do much to sway people towards it... considering you'd be within spitting distance of an FE 4080.

I think we're gonna have to.
My only choices going into this generation were:
320-bit 20GB card - 4080 Ti
Skip the generation
RTX 4090

In that order.

So if that 4080 Ti has any chance, any chance, of coming out before August 2023, I'm holding out.
It's the perfect upgrade IMO for 3080-class cards.
The 4090 is likely gonna cost an arm and a leg, but performance and FOMO had a chance of forcing my hand; seeing that Nvidia is at least thinking about an 80 Ti takes the 4090 off the table.

I dunno. I think I'll sit out the 4000 gen and see if power draw comes down with the 5000 gen, along with RTX improvements etc.
The 3080 has no issues pushing current games well past 120fps at 1080p, and many even further than that, and so far I'm not that hyped for ray tracing.
RTX I/O hasn't even released yet, on top of that.
 

DenchDeckard

Gold Member

Not like this.
If the 70 Ti is AD104-based and on the same bus as the 4070, expect it to sell horribly.
It needs to be on AD103, else that extra $100 won't do much to sway people towards it... considering you'd be within spitting distance of an FE 4080.

I think we're gonna have to.
My only choices going into this generation were:
320-bit 20GB card - 4080 Ti
Skip the generation
RTX 4090

In that order.

So if that 4080 Ti has any chance, any chance, of coming out before August 2023, I'm holding out.
It's the perfect upgrade IMO for 3080-class cards.
The 4090 is likely gonna cost an arm and a leg, but performance and FOMO had a chance of forcing my hand; seeing that Nvidia is at least thinking about an 80 Ti takes the 4090 off the table.

Yeah, I think you're right and I might just hold out... if the 4090 is like $1,399 though... I might bite.
 

Sanepar

Member
I'm really disappointed with the 4080's performance. 15,000 in TSE would be a 45% improvement over my 6900 XT. I hope the 7800 XT can do at least 17,000 in TSE and have decent ray tracing.
 
That post is from way before the delay and the second delay.
The 3090 Ti paper-launched in April; Ada was supposed to come out in July.
Waiting ~3 months to pay the same amount of money or less for double the performance would be quite, err, something.
But I can understand why people jump on the latest greatest thing immediately.

I'm still hopeful that there's an AD102 320-bit RTX 4080 Ti that comes with 20GB of VRAM. I don't trust Nvidia NOT to charge $2,000 for a GPU.
Nvidia saturating the market ends up cannibalizing their own products, which leads to weird price drops at retailers and on eBay.

RTX 3060 Ti
RTX 3070
RTX 3070 Ti
RTX 3080 Ti 16G - cancelled

were basically all fighting for the same customers.

RTX 3080
RTX 3080 12G
RTX 3080 20G - cancelled
RTX 3080 Ti

Again, these were basically all fighting each other, to the point RTX 3080 12Gs cost the same as RTX 3080 Tis, so Nvidia cancelled that shit ASAP.
Why does Nvidia overcharge hundreds for their GPUs? I'm looking to build a PC, but I may just go AMD out of principle. I can accept a $2,000 GPU if it's 3x the performance, but not when it's barely 2x, so what gives?
 

Neo_game

Member
Actually, a 3070 will play 4K60 just fine. Sure, you may have to tweak the settings a bit, but it can be done.

Also, for the life of me I still don't understand why people insist on playing games on ultra settings, unless it's a dick-measuring contest. I thought gamers understood that ultra settings are for benchmarks and screenshots only? There are plenty of videos out there proving that point.



Obviously there are more benefits to PC, but IMO if someone is happy with low or medium settings, they should simply get a console instead.

I still find it crazy that people are calling the RTX 4070 a mid-range card when it's going to perform better than a 3080 🤷‍♂️
 