
New Nvidia RTX 4000 Ada Lovelace Cards Announced | RTX 4090 ($1,599) October 12th | RTX 4080 ($1,199)

DaGwaphics

Member
The prices will just go up and up every gen won't they, fook it all

Depends on how this gen actually sells down the line. It will be interesting to see how the 4000 series compares to the 3000 series on the Steam hardware survey in a year or two. The 4080 is still widely available in most stores; that's not something that happens within the first few months when Nvidia cards are truly popular with the masses.
 

Razvedka

Banned
Depends on how this gen actually sells down the line. It will be interesting to see how the 4000 series compares to the 3000 series on the Steam hardware survey in a year or two. The 4080 is still widely available in most stores; that's not something that happens within the first few months when Nvidia cards are truly popular with the masses.
Without price cuts I don't see the 4000 series doing much of anything at all.
 

DaGwaphics

Member
Without price cuts I don't see the 4000 series doing much of anything at all.

The 4090 will do okay, maybe even better than the 3090. The rest of the lineup will be the real test.

Hopefully the majority of consumers reject the trend of next to zero fps per $ improvement between gens. GPUs are going to be scary expensive in a few generations if they don't.

No one would ever stand for this in the CPU market; not sure why people are being sheep with the GPUs.
 
Last edited:

HeisenbergFX4

Gold Member
The 4090 will do okay, maybe even better than the 3090. The rest of the lineup will be the real test.

Hopefully the majority of consumers reject the trend of next to zero fps per $ improvement between gens. GPUs are going to be scary expensive in a few generations if they don't.

No one would ever stand for this in the CPU market; not sure why people are being sheep with the GPUs.
I think we have entered the era of early adopters paying a premium just to have them and there is no going back

It's a great time to be looking at 3080 systems for people wanting to upgrade 1000-series Nvidia card based machines

I saw an open-box HP Omen at Best Buy last weekend, a 5800X/3080, for $1,199

Thing looked like it rolled down the side of a mountain but it was guaranteed to work
 
i'm reading up on undervolting before my 4080 comes. not sure if i understand how to do it properly or if it's really worth it but it seems like a lot of people recommend it. i get that less voltage = less energy/heat = better lifespan and possibly better performance.

is it worth doing it on 4000 cards?

apparently i can still overclock and undervolt...i don't understand. i mean to overclock don't i need to give it more voltage?

any good guides with photos or a video tutorial? i'm reading some guides and i can't follow them.
 

GHG

Member
i'm reading up on undervolting before my 4080 comes. not sure if i understand how to do it properly or if it's really worth it but it seems like a lot of people recommend it. i get that less voltage = less energy/heat = better lifespan and possibly better performance.

is it worth doing it on 4000 cards?

apparently i can still overclock and undervolt...i don't understand. i mean to overclock don't i need to give it more voltage?

any good guides with photos or a video tutorial? i'm reading some guides and i can't follow them.

Less voltage = less heat = higher sustained boost frequencies

You can also take advantage of there being less heat by overclocking the memory slightly as well.

Guide for the 4090 but the principles and process will be the same for the 4080:




I'm still running my 4090 stock for now to make sure it's firing on all cylinders while I'm still in the return/exchange period but will probably undervolt it in the next couple of weeks.
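
For anyone who wants to sanity-check an undervolt like this, here's a rough sketch (my own, not from the guide) of logging clocks, temperature and power once per second with nvidia-smi, so a stock run and an undervolted run of the same benchmark can be compared. The query fields are standard nvidia-smi ones; the file name and duration are just placeholders.

# log_gpu.py - sample SM clock, temperature and power draw once per second via nvidia-smi
# (assumes nvidia-smi is on PATH and a single GPU; with multiple GPUs you get one line each)
import csv, subprocess, time

FIELDS = "clocks.sm,temperature.gpu,power.draw"

def sample():
    out = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    return [v.strip() for v in out.split(",")]

def log(path="gpu_log.csv", seconds=300):
    start = time.time()
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["time_s", "sm_clock_mhz", "temp_c", "power_w"])
        while time.time() - start < seconds:
            writer.writerow([round(time.time() - start, 1), *sample()])
            time.sleep(1)

if __name__ == "__main__":
    log()

Run it once at stock and once after undervolting while looping the same benchmark, then compare average clocks against average power and temperature.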
 
Last edited:

SABRE220

Member
There is legit no competition for Nvidia... AMD's best effort is more than a generation behind in RT and trading blows with the 4080 in rasterization while losing out in power efficiency... and that's not even going into the massive edge Nvidia has in software tech. It has really come to the point where there is no point in AMD; it's not even close on an engineering level... might as well pray Intel comes out swinging, because that's our only chance.

From the days of the great 8800 GT to this... man, I think I'm going to skip PC gaming for a while, it's depressing.
 
Last edited:

DaGwaphics

Member
I think we have entered the era of early adopters paying a premium just to have them and there is no going back

It's a great time to be looking at 3080 systems for people wanting to upgrade 1000-series Nvidia card based machines

I saw an open-box HP Omen at Best Buy last weekend, a 5800X/3080, for $1,199

Thing looked like it rolled down the side of a mountain but it was guaranteed to work

That certainly could be a possibility.

Maybe they will surprise us and the 4050/4060 line won't be marked up as badly. This series seems to offer quite the performance boost over the 3000 series; maybe the lower-end cards will still perform well even though they've been cut down considerably relative to the 80/90 cards.

Either that or the new trend will be the majority of consumers waiting on the next gen before buying the last-gen cards at a discount. 😀
 

Mister Wolf

Member
There is legit no competition for Nvidia... AMD's best effort is more than a generation behind in RT and trading blows with the 4080 in rasterization while losing out in power efficiency... and that's not even going into the massive edge Nvidia has in software tech. It has really come to the point where there is no point in AMD; it's not even close on an engineering level... might as well pray Intel comes out swinging, because that's our only chance.

From the days of the great 8800 GT to this... man, I think I'm going to skip PC gaming for a while, it's depressing.

AMD doesn't know what they're doing.
 

Reallink

Member
The 4090 will do okay, maybe even better than the 3090. The rest of the lineup will be the real test.

Hopefully the majority of consumers reject the trend of next to zero fps per $ improvement between gens. GPUs are going to be scary expensive in a few generations if they don't.

No one would ever stand for this in the CPU market; not sure why people are being sheep with the GPUs.

The 4090 already has and will continue to blow the 3090 out of the water. It's likely sitting around 200K as we speak, while the 3090's lifetime sales were around 600K-700K. The 4090 would have already passed that if they could even begin to meet demand in North America. So not only are people not rejecting the prices, huge percentages of them are successfully being upsold 2 or 3 tiers higher. Several times the opposite of rejection.
 
Last edited:

DaGwaphics

Member
The 4090 already has and will continue to blow the 3090 out of the water. It's likely sitting around 200K as we speak, while the 3090's lifetime sales were around 600K-700K. The 4090 would have already passed that if they could even begin to meet demand in North America. So not only are people not rejecting the prices, huge percentages of them are successfully being upsold 2 or 3 tiers higher. Several times the opposite of rejection.

Clearly you are just joking because you'd have to see the final sales numbers before you could even begin to make an assumption that wild. Have they really grown the footprint for that price class or are they just selling through it quicker? If they do sell more, how much more? Even doubling things would leave the card in a relatively insignificant position overall.

Not to mention the 4090 is the one card launched or announced in this series so far that is near the same price as its predecessor while greatly increasing the fps per $ as one would expect; it sits outside everything else announced and rumored so far. Since it overpowers the 3090 Ti by about the margin one would expect for a generational upgrade, it is even cheaper this time around. Basically, the 4090 was priced to sell for its position in the market.

It's a big leap to go from that fact to thinking that all the buyers looking in lower price categories are going to jump to spending $300, $400 or $500 more than last gen to get the performance boost they expect gen over gen anyway. Especially in a struggling economy. If Nvidia can pull one over on consumers that well, it will be quite the windfall for them. It could absolutely happen though; it will be interesting to watch how it plays out.

These Nvidia cards seem solid; the negative reactions aren't about the products but the pricing. The performance numbers Nvidia released for the 4070 Ti/4080 12GB looked good. At $599 it would be a really great gen-over-gen upgrade from the 3070 Ti. At $699, maybe you can give them a pass because of inflation. At $899 people are going to complain about it.
 
Last edited:

Reallink

Member
Clearly you are just joking because you'd have to see the final sales numbers before you could even begin to make an assumption that wild. Have they really grown the footprint for that price class or are they just selling through it quicker? If they do sell more, how much more? Even doubling things would leave the card in a relatively insignificant position overall.

Not to mention the 4090 is the one card launched or announced in this series so far that is near the same price as its predecessor while greatly increasing the fps per $ as one would expect; it sits outside everything else announced and rumored so far. Since it overpowers the 3090 Ti by about the margin one would expect for a generational upgrade, it is even cheaper this time around. Basically, the 4090 was priced to sell for its position in the market.

It's a big leap to go from that fact to thinking that all the buyers looking in lower price categories are going to jump to spending $300, $400 or $500 more than last gen to get the performance boost they expect gen over gen anyway. Especially in a struggling economy. If Nvidia can pull one over on consumers that well, it will be quite the windfall for them. It could absolutely happen though; it will be interesting to watch how it plays out.

These Nvidia cards seem solid; the negative reactions aren't about the products but the pricing. The performance numbers Nvidia released for the 4070 Ti/4080 12GB looked good. At $599 it would be a really great gen-over-gen upgrade from the 3070 Ti. At $699, maybe you can give them a pass because of inflation. At $899 people are going to complain about it.
Nvidia confirmed 130k sold a month ago, so it's not unreasonable to assume they must be knocking on 200k's door by now (and quite likely exceeding it). Coupled with every 4090 still selling out instantly, it's all but guaranteed demand is several times higher than 200k. In another 2 years, it's unquestionably going to dwarf 600-700k.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Nvidia confirmed 130k sold 2 or 3 weeks ago, so it's not unreasonable to assume they must be knocking on 200k's door by now. Coupled with every 4090 still selling out instantly, it's all but guaranteed demand is several times higher than 200k. In another 2 years, it's unquestionably going to dwarf 600-700k.
Kind of puts into perspective why games do not target these GPUs more than what brute-forcing allows them to do with ease. We are talking about 1 million units over maybe 3 years?!?
 

//DEVIL//

Member
Was at Canada Computers today, here in Canada.

Saw the 4080 Strix in stock plus other cards. The 4080 Strix is $2,200 CAD. That's how much I paid for my FE 4090.

Man, ASUS have lost their minds, and it's annoying that no one is giving them hell for it.

It’s ok not OK for Nvidia to sell the card for 100$ or 200$ more than what it should, but an AIB card to sell a card 500$ US higher than reference / FE card is ok ? You don’t get 5 frames boost over these cards .

Fuck asus and their greed
 

Reallink

Member
Kind of puts into perspective why games do not target these GPUs more than what brute-forcing allows them to do with ease. We are talking about 1 million units over maybe 3 years?!?

Yes and no. If you expand "high end" to encompass the desktop 3070 class and higher, you'd be looking at 10+ million cards, and that's just the 3000 series. That demographic probably accounts for >50% of "Premium" PC game sales, as they're the hardest-core, biggest spenders, so there's a pretty significant market.
 

DaGwaphics

Member
Nvidia confirmed 130k sold 2 or 3 weeks ago, so it's not unreasonable to assume they must be knocking on 200k's door by now. Coupled with every 4090 still selling out instantly, it's all but guaranteed demand is several times higher than 200k. In another 2 years, it's unquestionably going to dwarf 600-700k.

Possibly, but by how much? And again, the point I was making was that I hoped that consumers would remain tepid towards cards that represent a near zero fps per $ gain over what was already on the market. The 4090 doesn't fit that bill anyway, so, I'm not sure how the 4090 figures in at all.

It will be interesting to see what they do with pricing, whether they just move everything upward (likely the worst case scenario) or they increase the performance and price gaps between each line of cards (the better way to fill out the lineup if they are determined to increase the net prices).

Last time the 3070 Ti was only about 14% off the 3080; it looks further away this time, and the 4060 Ti should be further from the 4070 as well. If there are some gaps in the pricing between these groups, the 4060/4050 lines might not be that bad in price to performance, and those are the primary cards they sell anyway.
 
Last edited:

skneogaf

Member
Damn, this was amazing. I've watched plenty of repair videos but I've never seen anyone resolder a GPU like that, using stencils and such.

In other news, I'm ramping up to build a $4000+ PC, I'm currently in the Stone Age with an aging 4770k CPU and a 1080ti. I'm sure a 4090 is overkill for my monitor (3440x1440 g-sync 120hz) but I don't care, I want to build something that is good for several years. It's upsetting how hard it is to get a 4090 but eh, I'll do what's necessary.

I registered with nvidia and they sent me a direct link to buy one through the geforce experience application.
 

Reallink

Member
Possibly, but by how much? And again, the point I was making was that I hoped that consumers would remain tepid towards cards that represent a near zero fps per $ gain over what was already on the market. The 4090 doesn't fit that bill anyway, so, I'm not sure how the 4090 figures in at all.

It will be interesting to see what they do with pricing, whether they just move everything upward (likely the worst case scenario) or they increase the performance and price gaps between each line of cards (the better way to fill out the lineup if they are determined to increase the net prices).

Last time the 3070 Ti was only about 14% off the 3080; it looks further away this time, and the 4060 Ti should be further from the 4070 as well. If there are some gaps in the pricing between these groups, the 4060/4050 lines might not be that bad in price to performance, and those are the primary cards they sell anyway.

If they can somehow maintain their current price-to-performance hierarchy all generation (which a seemingly collusive AMD appears content to let them do), the 4090 will likely capture the entire (historical) XX80-and-higher demographic, which will be several million units by the time it's replaced. When there's no sensible alternative, a LOT of people will choose to bite the bullet and spend the money; it's the only card that even begins to make sense. They'll make more selling 3 million $1600+ cards than they would selling 10 million $300-$600 cards.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Yes and no. If you expand "high end" to encompass the desktop 3070 class and higher, you'd be looking at 10+ million cards, and that's just the 3000 series. That demographic probably accounts for >50% of "Premium" PC game sales, as they're the hardest-core, biggest spenders, so there's a pretty significant market.
10 million cards? Even if it were true… in 3+ years, that's still small numbers, and you are including a wide variety of GPU features and performance profiles there, which makes the problem even worse.

Now look at how many users you get targeting just the new HD twins one year in (so two very close profiles), not to mention what you get when you add Xbox One and PS4.
 
Last edited:

DaGwaphics

Member
If they can somehow maintain their current price-to-performance hierarchy all generation (which a seemingly collusive AMD appears content to let them do), the 4090 will likely capture the entire (historical) XX80-and-higher demographic, which will be several million units by the time it's replaced. When there's no sensible alternative, a LOT of people will choose to bite the bullet and spend the money; it's the only card that even begins to make sense. They'll make more selling 3 million $1600+ cards than they would selling 10 million $300-$600 cards.

I think you are wildly overestimating the number of users willing to jump past $1k on GPUs, but we'll see. Ultimately, I think a good number of historical 80-series buyers drop to a lower card or just sit the generation out. Selling fewer products at a higher margin is a typical business tactic, but it wouldn't prove to be a smart one for Nvidia IMO. Their position is largely built on being the brand everyone is using; the buyers they aren't selling to are going to end up somewhere else, either going with another PC GPU vendor or switching to console (the worst case scenario for Nvidia, since those customers would now be investing in a completely separate ecosystem). Do that for long enough and you aren't the brand everyone is using after a certain point.

We'll see what they do with the 4070 and below, since those are the cards that actually matter in the bigger discussion (especially the 60 and 50 lines).
 

Captn

Member
I'm getting my hands on the Gigabyte 4090 OC for $2,500 CAD flat out.

Read some reviews seems like a good card.

Should I worry about the power cables that come with it? I have an EVGA 1000W fully modular power supply.

Will this work without fear of melting cables?

Anyone have that version who could share thoughts?

Thnx
 

Captn

Member
No.



Yes.



It's a good card. The software to control RGB is trash. That's my only complaint.
Thanks for the confirmation. As for the RGB software, I don't really care about it anyway, never did. It can flash any color it wants as long as it's performant lol 😁

What is the max wattage you're pulling from it?

Also, how is the overclock on it? Temps? Have you tried undervolting it?

Thnx
 
Last edited:

Reallink

Member
I think you are wildly overestimating the number of users willing to jump past $1k on GPUs, but we'll see. Ultimately, I think a good number of historical 80-series buyers drop to a lower card or just sit the generation out. Selling fewer products at a higher margin is a typical business tactic, but it wouldn't prove to be a smart one for Nvidia IMO. Their position is largely built on being the brand everyone is using; the buyers they aren't selling to are going to end up somewhere else, either going with another PC GPU vendor or switching to console (the worst case scenario for Nvidia, since those customers would now be investing in a completely separate ecosystem). Do that for long enough and you aren't the brand everyone is using after a certain point.

We'll see what they do with the 4070 and below, since those are the cards that actually matter in the bigger discussion (especially the 60 and 50 lines).

Spoiler alert: they'll all be $100-$200 more than their predecessors for 20-30% gains. Most people will happily pay the premium, while those who can't afford to will begrudgingly buy the tier below. End result: they're all still buying Nvidia. PC gamers are never going to consoles because they have to have their weird shit like stretched-res esports settings, Stretch Armstrong 120-degree FOVs, and playing character action games with a mouse and keyboard.
 
Last edited:

DaGwaphics

Member
Spoiler alert: they'll all be $100-$200 more than their predecessors for 20-30% gains. Most people will happily pay the premium, while those who can't afford to will begrudgingly buy the tier below. End result: they're all still buying Nvidia. PC gamers are never going to consoles because they have to have their weird shit like stretched-res esports settings, Stretch Armstrong 120-degree FOVs, and playing character action games with a mouse and keyboard.

If they bump the bottom of the stack by $100-$200, that should certainly leave the door wide open to competitors. But they definitely do have a crazy strong brand. 🤷‍♂️
 

Reallink

Member
If they bump the bottom of the stack by $100-$200, that should certainly leave the door wide open to competitors. But they definitely do have a crazy strong brand. 🤷‍♂️

Who, Intel? They're 2 or 3 years behind. The 7900s have already demonstrated that AMD is content to be the also-ran and simply match Nvidia's new gouging price-to-performance hierarchy.
 
Last edited:

DaGwaphics

Member
Who, Intel? They're 2 or 3 years behind. The 7900s have already demonstrated that AMD is content to be the also-ran and simply match Nvidia's new gouging price-to-performance hierarchy.

The 7600 line from AMD will be monolithic, and it's been reported that they are even making the new chips drop-in compatible with existing 6600-series boards (to save AIBs money). I think if they take an aggressive shot with pricing anywhere in the lineup, it will be with these. They've done it before and have achieved mainstream success in this range.

Plus, if they bump the 4050 by a hundred, it will be directly competing with the Arc 750/770, which would likely outperform it in modern titles. Though it's hard to say what impact that could have; I suspect Intel isn't really producing all that many of those to begin with.
 
Last edited:

MidGenRefresh

*Refreshes biennially
Thanks for the confirmation. As for the RGB software, I don't really care about it anyway, never did. It can flash any color it wants as long as it's performant lol 😁

See, that's the problem. I don't want it to flash any RGB. That's why I use the RGB software. It does the job but it's another piece of shit app that needs to start at Windows launch.

What is the max wattage you're pulling from it?

No idea and I don't know how to check. But I'm on an 850W PSU and it works perfectly fine. Zero crashes or freezes since I got it. Funnily enough, with my previous card (3080) my system crashed now and then, like once every week or two.

Also, how is the overclock on it? Temps? Have you tried undervolting it?

I don't know, I'm a bloody casual. I just want a card that allows me to play any game at max settings and this one does the trick.
 

GHG

Member
No idea and I don't know how to check. But I'm on an 850W PSU and it works perfectly fine. Zero crashes or freezes since I got it. Funnily enough, with my previous card (3080) my system crashed now and then, like once every week or two.

That would be due to transient spikes. Those are far more controlled and infrequent on the 4xxx series than they were on the 3xxx series.
 

Captn

Member
See, that's the problem. I don't want it to flash any RGB. That's why I use the RGB software. It does the job but it's another piece of shit app that needs to start at Windows launch.



No idea and I don't know how to check. But I'm on an 850W PSU and it works perfectly fine. Zero crashes or freezes since I got it. Funnily enough, with my previous card (3080) my system crashed now and then, like once every week or two.



I don't know, I'm a bloody casual. I just want a card that allows me to play any game at max settings and this one does the trick.

Perfect thank you for the info!
 

Captn

Member
I finally installed the beast Gigabyte OC 4090.

Undervolted the card with Afterburner to 0.950V

2730MHz with +1450 on memory.

Power limit at 85%

Getting a locked 120fps in The Witcher 3 with DLSS 3 activated on Quality and everything else maxed out at 4K.

Temps are 54 degrees Celsius. Fans are not turning.

Power usage: 300 watts

Really impressed here. Wow!
 
Anyone have a tutorial and values to undervolt an RTX 4080?
would appreciate this too!

i got my 4080 and new PSU all set up and will start messing about with it tomorrow. usually i'd get a new GPU and OC the shit out of it but i want to try undervolting.

my plan currently is to run some benchmarks/games to see what frequency, voltage, temperatures, and power consumption it gets with stock settings.

as i understand it i would then need to go into MSI afterburner and have a look at the freq/voltage curve. say the card boosts to 2520 with 1.050V then i should try flattening the curve at 2520 with 1.025V or 1.000V to see if it can run. if no crashes then lower it by another 0.025/0.050V and test again. once there is instability raise the voltage up by 0.030-0.050V

after that i don't know what to do. i see people saying they increase memory frequency or that they overclock AND under volt. i'm confused.
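
For what it's worth, the procedure described above is basically a step-down search for the lowest stable voltage at a fixed clock. A purely illustrative helper sketch follows (the curve point still has to be set by hand in MSI Afterburner; the numbers are the example ones from the post, not recommendations):

# undervolt_search.py - illustrative helper for the manual step-down procedure described above.
# It only suggests the next voltage to try; you still set the curve point in Afterburner, and
# "stable" means your benchmark/game ran without crashes or artifacts.
TARGET_CLOCK_MHZ = 2520   # example boost clock from the post
START_V = 1.025           # first undervolt attempt
STEP_V = 0.025            # drop per stable run
MARGIN_V = 0.030          # added back once instability is found
MIN_V = 0.850             # don't bother going lower than this

v = START_V
last_stable = None
unstable = None
while v >= MIN_V:
    ans = input(f"Set {TARGET_CLOCK_MHZ} MHz @ {v:.3f} V in Afterburner, run your test. Stable? [y/n] ")
    if ans.strip().lower().startswith("y"):
        last_stable = v
        v = round(v - STEP_V, 3)
    else:
        unstable = v
        break

if last_stable is None:
    print("Even the first point was unstable; raise the starting voltage and retry.")
elif unstable is None:
    print(f"Reached the floor without a crash; settle on {last_stable:.3f} V and re-test for longer.")
else:
    suggestion = min(last_stable, round(unstable + MARGIN_V, 3))
    print(f"Suggested point: {TARGET_CLOCK_MHZ} MHz @ {suggestion:.3f} V, then re-test for a few hours.")

The memory overclock is a separate second pass: once the core voltage/clock point is stable, nudge the memory offset up in small steps and back off at the first sign of artifacts or lost performance.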
 

GHG

Member
https://github.com/PrimeO7/How-to-u...X3D-Guide-with-PBO2-Tuner/blob/main/README.md

Do this

If your motherboard allows direct pbo2 adjustment to the 5800x3d, then you’re lucky, they are rare. Otherwise follow the guide above.

I did a -35 on all cores. (It's a lottery, but the 5800X3D is known for an easy -30 undervolt.) Had a drastic reduction in power and heat. I'm running the 5800X3D in an SFFPC Meshlicious case with a mere Noctua NH-L12S.

Just bumping to say thank you Buggy Loop.

Got PBO2 all set up, -25 on each core and it's resulted in a drop in temps of ~10C. Got it all automated so it does it at start-up. Bonus, the performance is slightly better as well:

Before:

dUaoAlh.png

After:

FAWdGNZ.png


That saves me needing to upgrade my AIO for now at least.
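
For anyone curious how the start-up automation can be done: one way is a scheduled task that launches the tuner with the per-core offsets at logon. A rough sketch below; the exe path and the "one offset per core" argument format are assumptions on my part, so check the linked PrimeO7 guide for the exact syntax your version of PBO2 Tuner expects.

# schedule_pbo2.py - rough sketch: create a Windows scheduled task that re-applies the
# offsets at every logon. Run once from an elevated prompt.
# NOTE: the install path and the "-25 per core" argument format are assumptions, not gospel.
import subprocess

TUNER = r'"C:\Tools\PBO2 tuner.exe"'     # hypothetical install location
OFFSETS = " ".join(["-25"] * 8)          # one offset per core on an 8-core 5800X3D

subprocess.run([
    "schtasks", "/Create",
    "/TN", "PBO2 undervolt",
    "/TR", f"{TUNER} {OFFSETS}",
    "/SC", "ONLOGON",
    "/RL", "HIGHEST",                    # offsets need admin rights to apply
    "/F",                                # overwrite the task if it already exists
], check=True)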
 
Last edited:

MikeM

Member
I finally installed the beast Gigabyte OC 4090.

Undervolted the card with Afterburner to 0.950V

2730MHz with +1450 on memory.

Power limit at 85%

Getting a locked 120fps in The Witcher 3 with DLSS 3 activated on Quality and everything else maxed out at 4K.

Temps are 54 degrees Celsius. Fans are not turning.

Power usage: 300 watts

Really impressed here. Wow!
Is that with RT? If so, damn man.
 

//DEVIL//

Member
I finally installed the beast Gigabyte OC 4090.

Undervolted the card with Afterburner to 0.950V

2730MHz with +1450 on memory.

Power limit at 85%

Getting a locked 120fps in The Witcher 3 with DLSS 3 activated on Quality and everything else maxed out at 4K.

Temps are 54 degrees Celsius. Fans are not turning.

Power usage: 300 watts

Really impressed here. Wow!
undervolting and power limit? it doesn't work like that bud. pick either lol.
 

Captn

Member
undervolting and power limit? it doesn't work like that bud. pick either lol.
Been working since day one 2 weeks ago bud.. Loll

Maybe expanding on the reasons why it does not work like that would have been a better, more informative and more respectful way of replying, but I guess that's too much to expect these days 🙄
 

//DEVIL//

Member
Been working since day one 2 weeks ago bud.. Loll

Maybe expanding on the reasons why it does not work like that would have been a better, more informative and more respectful way of replying, but I guess that's too much to expect these days 🙄
I don't know how to explain it, but you are kinda doing the same thing, just worse in terms of limiting.

By undervolting (it's what I do to my card), I tell the card to run at a max of 900mV at a specific MHz speed. But when you also lower the power target, you are kinda doing the same thing in a different way; it's like you are lowering the max power of the card and undervolting at the same time.

Let's say the power limit stops the card from running higher than 2500MHz, but your undervolt point is set at 2700MHz. You are not getting that 200MHz because you are already limiting your card. Get it? This is just an example.
 

Captn

Member
I don't know how to explain it, but you are kinda doing the same thing, just worse in terms of limiting.

By undervolting (it's what I do to my card), I tell the card to run at a max of 900mV at a specific MHz speed. But when you also lower the power target, you are kinda doing the same thing in a different way; it's like you are lowering the max power of the card and undervolting at the same time.

Let's say the power limit stops the card from running higher than 2500MHz, but your undervolt point is set at 2700MHz. You are not getting that 200MHz because you are already limiting your card. Get it? This is just an example.

Yeah, I get it, but does it impair my card as it is now? Am I wasting energy or potentially better performance?

Always wondered: if power and voltage are related, why do they have 2 separate sliders to work with?

What are your settings exactly?

I'd like to be able to do 4K max settings close to 120fps in every game while trying not to go above 350 watts of power draw.

Thnx for the detailed answer.
 

//DEVIL//

Member
Yeah, I get it, but does it impair my card as it is now? Am I wasting energy or potentially better performance?

Always wondered: if power and voltage are related, why do they have 2 separate sliders to work with?

What are your settings exactly?

I'd like to be able to do 4K max settings close to 120fps in every game while trying not to go above 350 watts of power draw.

Thnx for the detailed answer.
You have 2 options.

If you go as low as a 60% power target, you lose a max of 10% performance (yes, this video card is that good). I believe at a 70% power target you lose only 6%. (I think it's 310 watts at 60% with not much of a performance loss, but I prefer 70% as it's still around 350 watts.)

So you can either go that route, and not worry about games crashing on you, or undervolt and keep the power target at 100%.

Then do a benchmark with any game you have, and see what gives you better performance at the desired temperature.

In my case, for example, my undervolt is 875mV at a speed of 2475MHz. I didn't try to go higher (even though maybe I should have, but it never crashed in any game for me at this speed, so I was like fuck it, it's good enough for me), and to me the performance is more than what I need and my temp is never higher than 55C for my FE card (yours should be lower, maybe, I dunno).

Remember, it's the mV that increases your GPU temp; the higher you go, the higher your temp is, and vice versa. That's why 875mV is a good spot for me.
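
Rough back-of-the-envelope for why the voltage is what matters most for heat: dynamic power scales roughly with frequency times voltage squared, so a voltage drop cuts power disproportionately. Illustrative numbers only (the "stock" point is an assumption, the undervolt is the one quoted above), and it ignores static and board power:

# power_scaling.py - illustrative only: dynamic power ~ f * V^2
stock = {"clock_mhz": 2745, "volts": 1.050}   # assumed stock-ish operating point
uv    = {"clock_mhz": 2475, "volts": 0.875}   # the undervolt described above

def rel_power(point):
    return point["clock_mhz"] * point["volts"] ** 2

ratio = rel_power(uv) / rel_power(stock)
print(f"Estimated dynamic power vs stock: {ratio:.0%}")   # roughly 63% in this example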
 
Last edited:

Captn

Member
You have 2 options.

If you go as low as a 60% power target, you lose a max of 10% performance (yes, this video card is that good). I believe at a 70% power target you lose only 6%. (I think it's 310 watts at 60% with not much of a performance loss, but I prefer 70% as it's still around 350 watts.)

So you can either go that route, and not worry about games crashing on you, or undervolt and keep the power target at 100%.

Then do a benchmark with any game you have, and see what gives you better performance at the desired temperature.

In my case, for example, my undervolt is 875mV at a speed of 2475MHz. I didn't try to go higher (even though maybe I should have, but it never crashed in any game for me at this speed, so I was like fuck it, it's good enough for me), and to me the performance is more than what I need and my temp is never higher than 55C for my FE card (yours should be lower, maybe, I dunno).

Remember, it's the mV that increases your GPU temp; the higher you go, the higher your temp is, and vice versa. That's why 875mV is a good spot for me.

Good food for thought... Thnx

What's your memory at? Did you OC it?
 
Last edited:

Captn

Member
New DLSS 3.0 frame generation version available.

Apparently it gives somewhat of a boost in Witcher 3, according to its Nexus Mods section, as well as less stuttering according to the comments.

Will try soon.

Only for 4000 series!
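
If anyone hasn't swapped DLSS DLLs before: the usual method is to back up the frame generation DLL in the game folder and drop the newer one from the mod page over it. A minimal sketch, assuming the commonly used file name (nvngx_dlssg.dll) and a Witcher 3 style install path; adjust both for your setup:

# swap_dlssg.py - minimal sketch of the usual DLL-swap method; paths and filename are assumptions
from pathlib import Path
import shutil

GAME_BIN = Path(r"C:\Games\The Witcher 3\bin\x64_dx12")   # hypothetical install path
NEW_DLL = Path(r"C:\Downloads\nvngx_dlssg.dll")           # the updated frame generation DLL

target = GAME_BIN / "nvngx_dlssg.dll"
backup = GAME_BIN / "nvngx_dlssg.dll.bak"

if not backup.exists():
    shutil.copy2(target, backup)     # keep the original so you can roll back
shutil.copy2(NEW_DLL, target)
print(f"Replaced {target} (backup at {backup})")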

 
Last edited:

amc

Member
I'm sat here with a 4090 in its box, delivered a couple of hours ago. Have my previous build which is good to go bar my PSU; it's an RM750x. Got an RM1000x coming tomorrow.

I'm itching to just throw the sucker in regardless. Only one more sleep, I'll be patient.
 

amigastar

Member
New DLSS 3.0 frame generation version available.

Apparently it gives somewhat of a boost in Witcher 3, according to its Nexus Mods section, as well as less stuttering according to the comments.

Will try soon.

Only for 4000 series!

I wish someone would make it work with the RTX 2000/3000 series. Or that Nvidia hadn't made it exclusive to the 4000 series.
 
Last edited: