
Rumor: NVIDIA GeForce RTX 4070 Graphics Card Specs, Performance, Price & Availability (300W + 36TF performance)

Hezekiah

Banned
I have a 3080 and I'm happy with it. Seeing no next-gen-only games coming this year, buying any card higher than a 3080 is pointless. Once next-gen-only games are out that actually use the GPU power, I will probably upgrade. I need a card that will run Unreal Engine 5 games at ultrawide 2K with 120 frames, everything ultra, and from the look of it not even the new-gen cards will do such a thing (if the Matrix demo is anything to go by).

The main game I actually play the most is Call of Duty, and MW2 is a cross-gen game, meaning even the 3070 is more than fine for 2K gaming.

Software isn't really keeping up with hardware advancement, sadly. We're about two years behind when it comes to the software-to-hardware ratio.

Pretty much what Devil says. Got a 3080; can't get excited about these cards when there's nothing mind-blowing to run on them. Can't see me upgrading until the year after at the earliest; there's simply no need.

Although I will be upgrading my laptop once these 240Hz OLEDs drop, that's a different thread.
If I had a 3080 I would absolutely be holding off until the 5000 series, since there's no upcoming software to take advantage of these cards.
 
My 3080 Ti is enough till at least the pro consoles come out... I'll upgrade after that, and only if the pro consoles can match or go beyond it.
I'm quite positive the pro consoles won't touch the base 3080, let alone the Ti.
Edit-
The PS5 and Xbox Series X are in mobile 3070 territory, and not the Ti.
 
Last edited:

rofif

Gold Member
then buy that… the 4060 will likely fall into that range.
It doesn't work like that.
You can clearly see that GPU power draw is shifting to around 250-350W.
Previously, high-end video cards were less power hungry.
The 1070 Ti was 150W and the 1080 was 180W... and that was already a lot.

You can't reasonably ask someone to give up their hobby by capping them to a low-end graphics card. You should be asking the market leaders to innovate and really shrink the dies rather than just expanding them.
It's not a challenge to make a 3000W GPU. It's just a matter of die size and materials.
 
Pretty much what Devil says. Got a 3080; can't get excited about these cards when there's nothing mind-blowing to run on them. Can't see me upgrading until the year after at the earliest; there's simply no need.

Although I will be upgrading my laptop once these 240Hz OLEDs drop, that's a different thread.
I also am excited for when we can make that thread!
 
It doesn't work like that.
You can clearly see that GPU power draw is shifting to around 250-350W.
Previously, high-end video cards were less power hungry.
The 1070 Ti was 150W and the 1080 was 180W... and that was already a lot.

You can't reasonably ask someone to give up their hobby by capping them to a low-end graphics card. You should be asking the market leaders to innovate and really shrink the dies rather than just expanding them.
It's not a challenge to make a 3000W GPU. It's just a matter of die size and materials.
Not disagreeing with how insane the power requirements are getting, but the 4060 should still be really good, maybe as good as a 3080.

300 watts is my absolute limit for cards though; I would never get these 450W+ space heaters.
 

Rudius

Member
I'm interested in the laptop GPUs below 100W and, hopefully, at the same price as current-gen mobile mid-range.
 

Dream-Knife

Member

This doesn't make any sense.
A 3070 Ti is 5% slower than a 3080, so you think a 4070, a new gen, will be on par with a 3080? Come on!

The 4070 will be on par with a 3090 Ti, probably a bit (~5%) faster.

The 4080 will be 20-30% faster than a 3090 Ti.

The 3090 Ti is a 40 TFLOP GPU; the 4080 is 50. So just for that it will probably be ~25% faster, and on top of that there are the new arch features.
3070 Ti to 3080 is 20%.
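As an aside, taking the quoted TFLOP figures at face value (rumored, rounded numbers, not benchmarks), the raw-throughput gap is simple arithmetic:

```python
# Raw FP32 TFLOP comparison using the figures quoted above.
# Paper TFLOPs do not translate 1:1 into game performance.
tflops_3090_ti = 40.0  # quoted figure for the 3090 Ti
tflops_4080 = 50.0     # rumored figure for the 4080

uplift = tflops_4080 / tflops_3090_ti - 1
print(f"Raw-TFLOP uplift: {uplift:.0%}")  # prints "Raw-TFLOP uplift: 25%"
```

Architectural changes and clocks can push real-world results above or below that paper number.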
 

clarky

Member
I'm interested in the laptop GPUs below 100W and, hopefully, at the same price as current-gen mobile mid-range.
Yeah, waiting on the mobile side myself. Coming from a current 2080 Max-Q and a 60Hz OLED, a 4080 mobile and a 240Hz OLED should be a decent bump next year.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I really doubt there will be that huge a difference in TF between the RTX 4080 and 4090 (50 TF vs 90 TF).

I also think that even though the RTX 4080 will be a beast, it's still a better choice to have a 1440p/144Hz monitor than 4K/60Hz.

The 80 Ti and 90s are gonna be AD102s, so it's back to the old days of the 102s being massively more performant than the xx80s.
The 1080 Ti was a huge upgrade over the 1080.
Ampere is where things got weird, with the xx80 being a 102 chip.

So yeah, I totally expect the 80 Ti, 90 and 90 Ti to be a huge upgrade over the xx80.
For those of us with 3080s, we basically have to wait for the 80 Ti or buy a 90.
 


Pagusas

Elden Member
It doesn't work like that.
You can clearly see that GPU power draw is shifting to around 250-350W.
Previously, high-end video cards were less power hungry.
The 1070 Ti was 150W and the 1080 was 180W... and that was already a lot.

You can't reasonably ask someone to give up their hobby by capping them to a low-end graphics card. You should be asking the market leaders to innovate and really shrink the dies rather than just expanding them.
It's not a challenge to make a 3000W GPU. It's just a matter of die size and materials.
No thanks, I 100% disagree with you.

We have game consoles and lower-powered GPUs for people who want a balanced power/thermal/performance ratio, but I love that the market has said "you choose what you want; you want to water cool, push the limits and get a 390W card with a 115% power limit override? Here you go." I want them to push the limits and price it high as an enthusiast option. People who care about power draw shouldn't be buying a high-end enthusiast GPU, just like people who care about MPG shouldn't be buying a Lamborghini.
 
Last edited:

clarky

Member
I know I should wait for the next round of cards, but it's hard to wait when I've already waited 5 years.
If you're coming from a 1070/1080 there's no reason to wait, or get a used 3080 for less when these drop (that's what I would do).
 
Last edited:

Dream-Knife

Member
Ok guys, keep dreaming that a 4070 will be at 3080 Ti perf.

I know people paid a lot for current gen, but perf will be there with these new cards, and availability will be better.
What does that have to do with the 3080 being 20% above the 3070 Ti?

I'm just going by the TFLOP measurement. My 3080 is clocked at 2085MHz, giving it 36.29 TF, which from the leak appears to be where the 4070 sits.
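For anyone wondering where that 36.29 TF comes from: peak FP32 TFLOPs for these cards is just CUDA cores × clock × 2 (one fused multiply-add counts as two floating-point ops). A quick sketch, assuming the 3080's 8704 CUDA cores and the 2085MHz clock mentioned above:

```python
def fp32_tflops(cuda_cores: int, clock_mhz: float) -> float:
    """Peak FP32 throughput: 2 FLOPs (one fused multiply-add) per core per cycle."""
    return 2 * cuda_cores * clock_mhz / 1e6  # MFLOPs -> TFLOPs

# RTX 3080: 8704 CUDA cores at the 2085 MHz overclock quoted above
print(round(fp32_tflops(8704, 2085), 2))  # 36.3
```

The same formula applied to rumored core counts and clocks is where most of the leaked TFLOP figures in this thread come from.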
 
This is going to be like going from Kepler to Maxwell. And ray tracing will be faster on the lower-end Lovelace cards, even vs. the 3090.
 
Offtopic: I'm planning to get the C2 48" since GAF told me it's the best TV for gaming. Will 1440p be enough at 1.5 meters?
1440p on a 4K screen is fine, and the only way you would notice a difference is if a 4K display were sitting right next to you running native 4K content.

But saying the C2 is the best for gaming is quite a vague statement. Having tried the C2 in my home and owning a 77" C9, I can assure you that in 90% of my use cases I would rather game on anything else.

In perfect low-light conditions then maybe, yeah, but for me OLED simply is not bright enough.
 

Klik

Member
The 80 Ti and 90s are gonna be AD102s, so it's back to the old days of the 102s being massively more performant than the xx80s.
The 1080 Ti was a huge upgrade over the 1080.
Ampere is where things got weird, with the xx80 being a 102 chip.

So yeah, I totally expect the 80 Ti, 90 and 90 Ti to be a huge upgrade over the xx80.
For those of us with 3080s, we basically have to wait for the 80 Ti or buy a 90.
Well, if the difference between the 4080 and 4090 is about 25-30% in games, with prices around $700 for the 4080 and $1000 for the 4090, I can see many people opting for the 4090, including myself. But I think prices will be around $900/$1400.

This time I really need to buy a good GPU for VR.
 
Last edited:

rofif

Gold Member
No thanks, I 100% disagree with you.

We have game consoles and lower-powered GPUs for people who want a balanced power/thermal/performance ratio, but I love that the market has said "you choose what you want; you want to water cool, push the limits and get a 390W card with a 115% power limit override? Here you go." I want them to push the limits and price it high as an enthusiast option. People who care about power draw shouldn't be buying a high-end enthusiast GPU, just like people who care about MPG shouldn't be buying a Lamborghini.
What are you disagreeing with? The facts?
And putting consoles in the comparison only serves to show how PAINFULLY POWER HUNGRY these GPUs are.
The PS5 draws 200 watts at most.
An RTX 2080 alone is around 250W... and where's the CPU, motherboard, SSD, fans?
 
VR seems like an endless pit of resources, so even if I did have a 3090 it wouldn't be enough. Hopefully a 4080 can do stuff like Hitman 3 with ray tracing. If not, I'll settle for games like RE8 being crisp and smooth.
 

Pagusas

Elden Member
What are you disagreeing with? The facts?
And putting consoles in the comparison only serves to show how PAINFULLY POWER HUNGRY these GPUs are.
The PS5 draws 200 watts at most.
An RTX 2080 alone is around 250W... and where's the CPU, motherboard, SSD, fans?
I'm disagreeing with the idea that the current trend is bad. You can still 100% build a low-powered machine that's very powerful; I've built several SFF builds that run great within a very small power profile. But now we also have the option to go crazy and build 800W+ beasts of machines, and I love it.
 

winjer

Member
What are you disagreeing with? The facts?
And putting consoles in the comparison only serves to show how PAINFULLY POWER HUNGRY these GPUs are.
The PS5 draws 200 watts at most.
An RTX 2080 alone is around 250W... and where's the CPU, motherboard, SSD, fans?

Comparing current-gen consoles to a 2080 is not the best comparison.
For one, the 2080 is made on the 12nm process node, which was just an improvement on the 16nm node.
Then the RTX 2080 has more hardware in it, like better RT units and full tensor units, resulting in a bigger chip.

If we are going to compare the PS5 to a PC GPU, it would be closer to a 6600XT, and in that case the difference is not that big.
For power consumption, the 6600XT does have an advantage in having just a 128-bit bus, while the PS5 has a 256-bit bus that is more power hungry.
This is the real power usage of a 6600XT, removing other components.

 

rofif

Gold Member
Comparing current-gen consoles to a 2080 is not the best comparison.
For one, the 2080 is made on the 12nm process node, which was just an improvement on the 16nm node.
Then the RTX 2080 has more hardware in it, like better RT units and full tensor units, resulting in a bigger chip.

If we are going to compare the PS5 to a PC GPU, it would be closer to a 6600XT, and in that case the difference is not that big.
For power consumption, the 6600XT does have an advantage in having just a 128-bit bus, while the PS5 has a 256-bit bus that is more power hungry.
This is the real power usage of a 6600XT, removing other components.

You are still comparing a GPU alone to a whole system.
And yes, the PS5 does compare to a 2070 Super or 2080.
 

winjer

Member
You are still comparing a GPU alone to a whole system.
And yes, the PS5 does compare to a 2070 Super or 2080.

The 6600XT is comparable to a 2070 Super and it only uses 165W.
A Zen 2 CPU at 3.5GHz will probably use around 35W.
Add fans, SSD, keyboard+mouse, and motherboard, and it goes a bit over 200W.
Not that much above a PS5. Your argument is only valid when comparing with GPUs made on old process nodes.
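The back-of-the-envelope system total above can be sketched as a simple sum (the component wattages are the rough estimates from the post, not measurements):

```python
# Rough full-system power budget using the estimates from the post above.
components_w = {
    "RX 6600 XT (GPU)": 165,
    "Zen 2 @ 3.5 GHz (CPU)": 35,
    "fans/SSD/motherboard/peripherals": 15,  # loose guess for the rest
}
total_w = sum(components_w.values())
print(f"Estimated system draw: ~{total_w} W")  # ~215 W, vs roughly 200 W for a PS5
```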
 
Last edited:

Reallink

Member
Wuuut? You expect a new gen of cards to be the same thing as the old one? A 3090 is 10% faster at 1080p and 15% faster at 4K than a 3080. What you posted here has never happened since graphics cards were invented on this planet. Not once, in the entire history of dedicated graphics cards.

The 2080 literally traded blows with the 1080 Ti, which was, uncoincidentally, also a GPU series designed during a crypto bubble plagued by multi-year shortages and price hikes. Beginning to see the trend here?
 
Last edited:

Kuranghi

Gold Member
Gotta save up for that 4080. Fuck, it's probably gonna be 9 hundo, fml. I can't wait until the 50 series because I'm addicted to native 4K now, and this 1080 won't cut it for that by the end of 2022, probably. I've already had to abandon 60fps for 4K for the most part, and when it's 1620p60 or 1080p60 it just makes me want to wait and play the game later.
 

Sho_Nuff

Neo Member
Got a 3070 Ti for $700 two months ago. I can't really complain, since every game runs smoothly maxed out. Still might consider this GPU if it really comes out at $599.
 
What's the point in 100+ teraflop GPUs when AAA games will be designed around a 4 TFLOP Series S, and no one's going to build a AAA PC-exclusive game around them?

Isn't the Unreal Engine 5 Matrix demo also CPU-bound rather than GPU-bound?
 

winjer

Member
What's the point in 100+ teraflop GPUs when AAA games will be designed around a 4 TFLOP Series S, and no one's going to build a AAA PC-exclusive game around them?

Isn't the Unreal Engine 5 Matrix demo also CPU-bound rather than GPU-bound?

Games on PC are not meant to be played at just 1080p or lower, with low-medium graphics settings, no RT, at 30 fps.
Imagine real 4K, 120-360 fps, RT at high settings, and higher quality settings all around.
 
What's the point in 100+ teraflop GPUs when AAA games will be designed around a 4 TFLOP Series S, and no one's going to build a AAA PC-exclusive game around them?

Isn't the Unreal Engine 5 Matrix demo also CPU-bound rather than GPU-bound?
Games are not going to be designed around the Series S.

They're going to be cut down to work on the Series S.

Also, if you want native 4K60 or even 120, you've gotta go PC. PC also has much faster CPUs already, so I'm not sure what your point with UE5 is.
 
Last edited: