
Next-Gen NVIDIA GeForce RTX 4090 With Top AD102 GPU Could Be The First Gaming Graphics Card To Break Past 100 TFLOPs

Kenpachii

Member
EVGA and MSI are known to raise the power limits on their cards for overclockers.
But realistically, average power draw in gaming won't be in the 500W range.

[chart: average gaming power draw]

Who gives a shit about average gaming draw? You buy the card for its peak gaming performance. I don't buy a 3080 to play Heroes of the Storm or some other CPU-bottlenecked game. I buy it to push demanding games to the max, and I already needed more performance from the day I bought it. 3090 owners, especially of the premium models, buy this stuff to push framerates to the max.

This thing hits 500W even in Furmark, of all benchmarks (so the stock BIOS is able to get there). Whether the card throttles or limits itself I don't know. I do know JayzTwoCents (or whatever that guy is called) runs the card, like many do, on a 999W power limit BIOS, which means it can go up big time, and in games it also shows the card going way over 500W this way.


[charts: peak gaming power draw; maximum power draw]


Anyway, the point is that aftermarket cards, especially from EVGA, will be 1000W models or thereabouts, and it will be laughable. Even the stock 600W is laughable, and the 500W from AMD is laughable too.

Honestly, I wanted to upgrade to the 4000 series, but after seeing these numbers and the lackluster lower-tier versions of the card, I will probably be sitting on this 3080 for a few more years to come, as the 350W this card pushes is already a joke.

Not to forget the 2080 Ti was a 250W card; now we're hitting 600W.
 
Last edited:

Kenpachii

Member
That's after a pretty significant overclock using a custom BIOS though, right? IIRC even the 3090 FTW3 only consumed 350 Watts out of the box.

I think 1000 Watt cards are far-fetched, and the reason is something you've already touched on in your post: you can't reliably cool a card drawing that much power using conventional means, and that includes most water cooling solutions.

Yeah, a custom BIOS, which most enthusiasts probably want to use for max performance. And yeah, 1000W: it will be interesting to see how they cool it. I would not be shocked if we get hybrid coolers combining air with an AIO water loop. It could very well be that we already see that stock from Nvidia this time.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Who gives a shit about average gaming draw? You buy the card for its peak gaming performance. I don't buy a 3080 to play Heroes of the Storm or some other CPU-bottlenecked game. I buy it to push demanding games to the max, and I already needed more performance from the day I bought it. 3090 owners, especially of the premium models, buy this stuff to push framerates to the max.

This thing hits 500W even in Furmark, of all benchmarks (so the stock BIOS is able to get there). Whether the card throttles or limits itself I don't know. I do know JayzTwoCents (or whatever that guy is called) runs the card, like many do, on a 999W power limit BIOS, which means it can go up big time, and in games it also shows the card going way over 500W this way.


[charts: peak gaming power draw; maximum power draw]


Anyway, the point is that aftermarket cards, especially from EVGA, will be 1000W models or thereabouts, and it will be laughable. Even the stock 600W is laughable, and the 500W from AMD is laughable too.

Honestly, I wanted to upgrade to the 4000 series, but after seeing these numbers and the lackluster lower-tier versions of the card, I will probably be sitting on this 3080 for a few more years to come, as the 350W this card pushes is already a joke.

Not to forget the 2080 Ti was a 250W card; now we're hitting 600W.

They don't need to hit 500W to hit the same framerates, is the point.
Upping the power limits is literally for people climbing the 3DMark charts; those extra 10 points matter to them, which is why you see EVGA/MSI cards with power limits outta here.
But when you look at the framerates versus, say, a Gpro, they are basically scoring the same.
For "normal" people, the move would likely be to undervolt the card and still hit the exact same framerates (I'm assuming this power consumption is concerning for some).
Upping the power limit doesn't do much for the average gamer.
I legit haven't even booted Furmark since like the 1070... since then, synthetic benches have been pretty pointless for me... same with Intel chips and people constantly running Prime95 on them to show off how hot they get.
If it's a gaming build, game on it.
 

Hezekiah

Banned
At this point, we don't know how the SMs are configured in Ada Lovelace. But if it's anything like Ampere, then those TFLOPs are not that impressive.
Remember that Ampere replaced the INT units with more FP32 units. This increased the TFLOP count, but performance per TFLOP decreased.
Example: the 2070 and 3060 are almost identical in rasterization performance, yet the 2070 is an 8.9 TFLOP card while the 3060 is a 14 TFLOP card.
On the AMD side, for example, a 6600 is only 4% slower than a 2070, with 8.9 TFLOPs.

The thing to keep in mind is that TFLOPs are no longer a good representation of real-world performance.
Yep, anybody expecting a ~2x jump in performance is going to be in for a nasty shock.

Still hoping to get a 4080 after missing out on the 3000 series, but I'm not paying £1,000 for it.
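(For reference, the arithmetic behind those TFLOP figures is just shaders × clock × 2 FP32 ops per clock. A quick sketch; the boost clocks below are rough real-world values assumed for illustration, not official spec sheet numbers.)

```python
# Theoretical FP32 TFLOPs = 2 ops/clock (FMA) * shader cores * boost clock (GHz) / 1000
def tflops(shaders: int, boost_ghz: float) -> float:
    return 2 * shaders * boost_ghz / 1000

# Approximate boost clocks; real cards vary with load and cooling.
print(f"RTX 2070: {tflops(2304, 1.93):.1f} TFLOPs")  # ~8.9
print(f"RTX 3060: {tflops(3584, 1.95):.1f} TFLOPs")  # ~14.0
print(f"RX 6600:  {tflops(1792, 2.49):.1f} TFLOPs")  # ~8.9
```

Same formula for all three, very different frames per TFLOP, which is exactly the point: Ampere's doubled FP32 inflates the number without doubling game performance.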
 

Hezekiah

Banned
Nvidia has pushed the GPU harder than anyone and should be applauded for bringing things to market ahead of everyone else.
They were first with Mesh Shaders and VRS.
They were first with Ray Tracing.
They were first with ML upscaling.
AMD and now Intel are playing catch-up.

I actually think GPU advancements have moved too fast for developers to actually adopt them.
We have no games with Mesh Shaders on the market.
Ray Tracing is only in its beginning stages.
Sampler Feedback Streaming isn't being used on the Xbox Series front.
VRS has only been touched on slightly, and with the advancements in VRS 2.0 it promises a lot of upside.
DLSS and other upscaling methods are also still maturing and not yet fully utilised. We have the lower INT4 and INT8 capabilities on the Xbox Series consoles which, heading two years into the console generation, have yet to be touched.
Game engines haven't caught up to what's already available.
Is this partly because game development takes so long nowadays? Developers don't want to spend a significant amount of time updating or creating new engines, so they don't always take advantage of the very latest performance features.
 

kikkis

Member
Is this partly because game development takes so long nowadays? Developers don't want to spend a significant amount of time updating or creating new engines, so they don't always take advantage of the very latest performance features.
Not really. Consoles are the dominant revenue platform for AAA games, and the minimum spec used to be the Xbox One and PS4.
 

kikkis

Member
I just don't get why people bitch and moan about the power consumption. It's a powerful card, it uses lots of energy. That's like bitching about a new Lamborghini using more gas than a Toyota Corolla. Like no shit. If you want less performance and less power they still make cards for you.
I don't mind it either, but it's somewhat depressing that performance per watt doesn't seem to be increasing much anymore.
 

GymWolf

Member
A German tech site recently tested how well three popular PC cases handled 450+ Watt GPUs in terms of thermals, and only one of them passed. The other two had serious issues getting rid of all the heat, causing temperature spikes in other components and increased case/CPU fan noise.

Some of these new cards will supposedly use up to 600 Watts. Unless Nvidia and AMD make water cooling mandatory, people are going to start running into all sorts of issues.
This is pretty scary to read.
 

YCoCg

Member
I just don't get why people bitch and moan about the power consumption.
Bruh, electricity prices in the UK alone have doubled or tripled for most people; having a new GPU suck up 3-4x more power to play games doesn't feel like a justifiable option. Of course the high end is always for the extreme buyers, but gassing over 500W on the GPU alone seems crazy when just 10 years ago 300W was considered too much and too hot (and most people settled on the roughly 120-150W of the 60 series).

That's like bitching about a new Lamborghini using more gas than a Toyota Corolla.
It'd be more like trading in one Lambo for a newer model that IS faster, but sucks up twice as much fuel in general use.
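(To put rough numbers on the bill side: a back-of-envelope sketch, where the tariff and daily hours are assumptions for illustration, not anyone's actual figures.)

```python
# Daily cost of the GPU's power draw alone, at an assumed ~£0.34/kWh
# (roughly where UK prices landed after the increases) and 3 hours of gaming a day.
def daily_cost_gbp(gpu_watts: float, hours: float = 3.0, price_per_kwh: float = 0.34) -> float:
    return gpu_watts / 1000 * hours * price_per_kwh

print(f"150W card: £{daily_cost_gbp(150):.2f}/day")  # ~£0.15
print(f"500W card: £{daily_cost_gbp(500):.2f}/day")  # ~£0.51
```

Small per day, but the 3-4x multiplier compounds over a year, on top of everything else that got more expensive.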
 

Pagusas

Elden Member
Bruh, electricity prices in the UK alone have doubled or tripled for most people; having a new GPU suck up 3-4x more power to play games doesn't feel like a justifiable option. Of course the high end is always for the extreme buyers, but gassing over 500W on the GPU alone seems crazy when just 10 years ago 300W was considered too much and too hot (and most people settled on the roughly 120-150W of the 60 series).


It'd be more like trading in one Lambo for a newer model that IS faster, but sucks up twice as much fuel in general use.

Which is still silly; you don't buy a Lambo worrying about its fuel efficiency.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
A German tech site recently tested how well three popular PC cases handled 450+ Watt GPUs in terms of thermals, and only one of them passed. The other two had serious issues getting rid of all the heat, causing temperature spikes in other components and increased case/CPU fan noise.

Some of these new cards will supposedly use up to 600 Watts. Unless Nvidia and AMD make water cooling mandatory, people are going to start running into all sorts of issues.

I fear nothing!
 

Midn1ght

Member
Bruh, electricity prices in the UK alone have doubled or tripled for most people; having a new GPU suck up 3-4x more power to play games doesn't feel like a justifiable option. Of course the high end is always for the extreme buyers, but gassing over 500W on the GPU alone seems crazy when just 10 years ago 300W was considered too much and too hot (and most people settled on the roughly 120-150W of the 60 series).


It'd be more like trading in one Lambo for a newer model that IS faster, but sucks up twice as much fuel in general use.
Exactly. Not only will these cards significantly raise our bills, but what's happening in the EU right now with energy prices is pure insanity.
I've always been OK with paying more to play on high-end hardware, but with the likely inflated prices, the scalper shenanigans, and how power-hungry these will be, I might just call it a day.
 

YCoCg

Member
Which is still silly; you don't buy a Lambo worrying about its fuel efficiency.
Well, true, but if the fuel costs made such a massive jump you'd be expecting something serious, right? For 500W+ this thing had better be running shit at 4K120 with full ray tracing features!
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
You should. That was one of the cases they tested and it didn't do well.
[gif: "I wanna see the receipts"]


3x140 in, 2x140 out.
The vertical mounts become a breathing point.

Pfft... I'll be laughing in my Enthoo, cuz if it can't do it, then basically no case can.
 

Pagusas

Elden Member
Well, true, but if the fuel costs made such a massive jump you'd be expecting something serious, right? For 500W+ this thing had better be running shit at 4K120 with full ray tracing features!
100%, and it sounds like it likely will be. But it's also important to remember these cards pull max power only in gaming, at full load, so even a 1000-watt card will barely make an impact on a normal gamer's power bill; we're talking a few dollars a year. I'd prefer the flagship cards not be power-restricted. If that's something someone worries about, they can get a mid-range card that's more geared towards thermal/power balance. Flagships should be 100% thermally bound only.

Even right now, on a 550-watt modded 3090, posting on this board with an active MS Teams video group chat going and Adobe Premiere running in the background, I'm pulling 44 watts on the 3090. These chips are good at staying low-power when they aren't at 100% load. And honestly, it took A LOT of work to force this 3090 to draw as much power as it does; Nvidia very much intentionally power-limited these cards.
 
Last edited:
I meant to say if they can fix their RT implementation. Not RDNA.

Their RT implementation doesn't need a "fix"; it's just a weaker solution for accelerating rays than what Nvidia has now. And if you think about it, it's actually not that bad: it's almost on par with Turing while using much less hardware. Naturally, the next version will accelerate more things and be better because of this. AMD was not wrong to offer a more "generic" solution that is flexible for developers. Just look at what Insomniac did with RX 6700 (non-XT) tier RT hardware.
 
[gif: "I wanna see the receipts"]


3x140 in, 2x140 out.
The vertical mounts become a breathing point.

Pfft... I'll be laughing in my Enthoo, cuz if it can't do it, then basically no case can.

Your case is actually the reason they decided to test this in the first place: they used it for their 3090 Ti review and noticed the card was unusually loud. After ruling out issues with the card itself, they realized that the heat it produced simply wasn't being moved out of the case fast enough. They were only running the stock fan configuration at first (three 140mm Noctuas, two in one out), but adding more fans made no discernible difference. The only thing that helped was removing the side panel.

Out of the three cases they tested, only the Fractal Design Torrent performed without issues.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not

Your case is actually the reason they decided to test this in the first place: they used it for their 3090 Ti review and noticed the card was unusually loud. After ruling out issues with the card itself, they realized that the heat it produced simply wasn't being moved out of the case fast enough. They were only running the stock fan configuration at first (three 140mm Noctuas, two in one out), but adding more fans made no discernible difference. The only thing that helped was removing the side panel.

Out of the three cases they tested, only the Fractal Design Torrent performed without issues.
Jesus, a 12900K + a 3090 Ti.

I fear nothing!
None of them, except the be quiet! at its slowest setting, even reaches thermal throttling?
60MHz isn't exactly the end of the world.



Granted, it is an alarming report considering that 600W GPUs might be on the horizon, and I was actually looking to upgrade to a Raptor Lake i7.


P.S. Top-mounted PSUs with bottom-mounted fans might be making a comeback sooner than anyone expected.


P.P.S. I should have gotten the P600S... but the Evolv X looks so, so good!
 
I fear nothing!
None of them, except the be quiet! at its slowest setting, even reaches thermal throttling?
Yeah, the GPU might not be throttling in the Evolv X, but they note that it reaches noise levels most people would consider unacceptable. The overall noise level of the case is almost four times that of the Fractal. And that's at "just" 450 Watts.
 
Is this partly because game development takes so long nowadays? Developers don't want to spend a significant amount of time updating or creating new engines, so they don't always take advantage of the very latest performance features.
I think so. They appear to redo an engine only after a couple of games, which is as long as 3 to 4 years each. Some studios like IW or id Tech might redo theirs between each game, but you're still looking at quite a while between updates.
I think they also wait for others to jump in before they do.
 
Does anyone honestly think the 7900 XT will hit near 92 TF if it only has 12,288 shader cores, as per the new leaks? Without overclocking, I mean?

Because you'd need something like 3.6 GHz to do that, and even with chiplet benefits and 5nm/6nm power consumption improvements, AND architectural redesign improvements, that high a clock on that big a die area will surely creep past 450 watts, maybe near 500 😲
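(That clock figure falls straight out of the usual FP32 formula; a quick sketch, assuming the standard 2 ops per shader per clock and taking the leaked numbers at face value.)

```python
# Clock needed for 92 TFLOPs from 12,288 shaders at 2 FP32 ops per clock.
target_tflops = 92
shaders = 12_288
required_ghz = target_tflops * 1_000 / (2 * shaders)
print(f"{required_ghz:.2f} GHz")  # ~3.74 GHz, slightly above the ~3.6 GHz ballpark
```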

What is SIMD2? What does this slide mean for future GeForce GPUs?
[slide image]

Well, SIMD is Single Instruction, Multiple Data, but I have not seen it referred to as SIMD squared. That's new to me.

EVGA and MSI are known to raise the power limits on their cards for overclockers.
But realistically, average power draw in gaming won't be in the 500W range.

[chart: average gaming power draw]

Based on this, where do you think the 7900 XT will land in terms of typical power consumption? I saw some people claiming around 375-380 watts, but that was with the 15,360 shader core/2.5 GHz leak and its suspected performance.

Do you think a 12,288 shader core/3.6 GHz 7900 XT can limit typical power consumption to around 425 watts or so, or come in lower?

On the other hand, now that the GPU has so many flops, Tensor cores might not even be needed anymore, as the GPU might have enough calculation power to do everything itself. I can only guess here, but I don't think the tensor cores have a bright future. I think they were just a "bridge" technology, used while the GPU itself was not yet powerful enough to also do the things the tensor cores are used for now. So far, tensor cores are only used for a minority of tasks. E.g., with DLSS 1.x we already saw a "just shaders" DLSS, which wasn't bad at all.

I think that's probably going to hold even more true as graphics pipelines move away from relying so much on fixed function and more on programmable mesh shading. At least, it seems that way.
 
Last edited:

winjer

Gold Member
What is SIMD2? What does this slide mean for future GeForce GPUs?
[slide image]

Generalized Matrix Instruction Set for Accelerating Tensor Computation.
It seems to be similar to SIMD, but adapted to instructions on matrices, meaning it saves execution time by not having to fetch the instruction again when it's the same, just the data.
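(A loose analogy of why amortizing one instruction over a whole matrix tile helps; illustrative Python only, not how the actual instruction set is exposed.)

```python
import numpy as np

# Scalar view: one multiply-add "instruction" per element triple, so
# instruction fetch/decode overhead scales with M * N * K.
def scalar_mma(A, B, C):
    M, K = A.shape
    _, N = B.shape
    for i in range(M):
        for j in range(N):
            for k in range(K):
                C[i, j] += A[i, k] * B[k, j]
    return C

# Matrix view: a single "instruction" covers an entire tile's worth of
# fused multiply-adds (D = A @ B + C), the way tensor-style MMA ops do.
def matrix_mma(A, B, C):
    return A @ B + C

A, B = np.random.rand(16, 16), np.random.rand(16, 16)
C = np.zeros((16, 16))
assert np.allclose(scalar_mma(A, B, C.copy()), matrix_mma(A, B, C))
```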
 
Which is still silly; you don't buy a Lambo worrying about its fuel efficiency.
False equivalence.
A Lamborghini is $300,000. Most people could never dream of affording one of those.

Most GPUs cost less than most smartphones. Even the highest-end halo GPUs have historically been under $1,000. It wasn't until Turing that GPUs started getting priced stupidly high.

The likelihood is that most people here could afford a $3,000 PC. You probably choose not to because it's not a sensible investment, but you could afford it. I guarantee most people here would get laughed out of a Lambo dealership. The scale of the purchase is different.

Also, it's not just about the cost of energy, though for a lot of people it will make a big difference. Especially when rumours also put the 4070 chip, AD104, at 300+ watts.

But the bigger problem is actually heat. A 600W GPU is a 600W space heater. If you have the (mis)fortune of owning a 12900KS, which is 250W, plus other expensive, thirsty components, then once you add PSU efficiency losses you're looking at a PC that draws ~1kW.

That's a 1kW space heater warming your room. If you live in a hot climate without A/C, good luck. If you have A/C, you can bet your ass it'll be working overtime keeping your room cold, and if it's an older unit, it'll be failing to do that.
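(The ~1kW figure checks out; a rough sketch, where the "everything else" wattage and PSU efficiency are assumptions.)

```python
# Rough wall-socket draw for the build described above.
gpu_w = 600            # rumoured flagship GPU
cpu_w = 250            # 12900KS under load
rest_w = 100           # assumed: board, RAM, drives, fans, pumps
psu_efficiency = 0.90  # assumed: a decent 80 Plus Gold unit at this load

wall_draw_w = (gpu_w + cpu_w + rest_w) / psu_efficiency
print(f"~{wall_draw_w:.0f} W at the wall")  # ~1056 W, essentially all of it dumped into the room as heat
```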

People who spend $2000 on a GPU will no doubt care about having a giant radiator heating their gaming space in their McMansions.
 

OZ9000

Banned
I just don't get why people bitch and moan about the power consumption. It's a powerful card, it uses lots of energy. That's like bitching about a new Lamborghini using more gas than a Toyota Corolla. Like no shit. If you want less performance and less power they still make cards for you.
This is the dumbest comparison.

Electricity prices have tripled.

Graphics card prices have shot up by 100-200% for no reason whatsoever.

And no one wants a fucking blast furnace in their room.

Cards such as the RTX 2080 Super or Ti offered fantastic performance per watt. There is absolutely zero fucking point in offering a high-end card if it costs 10 bucks a day to run and makes you sweat like a pig, just to play vidya games.
 

Xdrive05

Member
DF made a comment in a recent video that, more or less, ever since the Xbox One X days the most meaningful way to get more performance has been to up the die size and power consumption. I think they ascribed the 4090 rumors to this phenomenon holding true. If you want these kinds of gains, be prepared to pay for them on your electric bill while wearing sweaty undies in your gaming room.

As per usual, the XX90 series is the weirdo luxury card that almost no one will own, because who did you murder for that kind of F U money? And the 4090 is looking to be even more so.
 

Edder1

Member
Just here to remind everyone that 30 and 40 series TFLOPs do not scale 1:1 with the GPU architectures prior to them. TFLOPs are largely irrelevant from the 30 series onwards and don't represent a fair point of comparison, especially if we compare the TFLOPs of these GPUs to consoles. NX Gamer has a really good video breaking this down.
 

Hollowpoint5557

A Fucking Idiot
This is the dumbest comparison.

Electricity prices have tripled.

Graphics card prices have shot up by 100-200% for no reason whatsoever.

And no one wants a fucking blast furnace in their room.

Cards such as the RTX 2080 Super or Ti offered fantastic performance per watt. There is absolutely zero fucking point in offering a high-end card if it costs 10 bucks a day to run and makes you sweat like a pig, just to play vidya games.
This is the dumbest fucking post.
So because YOU don't want to push the medium forward, for your own reasons, everyone else should be held back?
I want faster cards. I don't care about electricity prices. I don't care if it makes a room hot.
For people like you, they already make cards that suit your needs. For people like me, I want them to push the envelope and move the industry forward.
 
Last edited:

OZ9000

Banned
This is the dumbest fucking post.
So because YOU don't want to push the medium forward, for your own reasons, everyone else should be held back?
I want faster cards. I don't care about electricity prices. I don't care if it makes a room hot.
For people like you, they already make cards that suit your needs. For people like me, I want them to push the envelope and move the industry forward.
Ramping up the power requirements to 1200W to get a faster GPU is NOT pushing the industry forward.
 
Last edited:

EverydayBeast

thinks Halo Infinite is a new graphical benchmark
For me, new graphics cards make you feel old. OK, I'll change my card, but far too many times the motherboard comes up with an error.
 

DukeNukem00

Banned
Ramping up the power requirements to 1200W to get a faster GPU is NOT pushing the industry forward.

Nobody is ramping anything up to 1200W. Didn't a tweet just get posted saying it has the same power draw we've had since 2020 and claiming double the performance? Why are people obsessing over the leaked power draw of the halo, absurd card that maybe five people in the world will buy, when every other model will be the same as it's always been, with various amounts of power required?
 

TrackZ

Member
Much as I want to migrate 100% to consoles for simplicity and accessibility, hmm, I'm probably buying this... :messenger_expressionless:

I'm getting used to seeing games fully maxed at higher FPS, and my 83" G2 with G-Sync should be here this week.
 

Knightime_X

Member
I'm still wondering why we don't have video RAM in the 64GB range or even higher.
I think that alone would solve most people's problems for certain things.
 
Last edited:
[gif: "I wanna see the receipts"]


3x140 in, 2x140 out.
The vertical mounts become a breathing point.

Pfft... I'll be laughing in my Enthoo, cuz if it can't do it, then basically no case can.
Eh, got me one of those glass cubes (disabled basically all the RGB).
[case photo]


5x120mm in
5x120mm out

Wish it was all 140mm though.
 
Last edited:

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Eh, got me one of those glass cubes (disabled basically all the RGB).
[case photo]


5x120mm in
5x120mm out

Wish it was all 140mm though.
What an interesting-looking case.
Thermals must be nice.

But looking at the latest rumors, I don't think I'll need to worry about the Evolv X.
I ain't getting no 4090.
 