HerjansEagleFeeder
Member
Sounds like NVIDIA's plan is working, then. Bizarrely, in the UK, ever since NVIDIA's announcements the price of 3000-series GPUs has actually shot up. They were selling for £500-520. Now? £580-620.
When I noticed that the Founders 4090 is the same size as my 3070 Ti Suprim X, I stopped worrying about the size. It fits nicely in an old Corsair Obsidian 550D.
No idea about the partner cards though.
Are you sure that's right? The 4090 is bigger than any other card; it's bigger than a 3090 Ti.
200 dollars for a 3090? About a day left on this one.
Bet it goes for a little over $600.
And here's one near me (Atlanta) going for $220. Check Craigslist, people!
https://atlanta.craigslist.org/atl/sop/d/kennesaw-nvidia-rtx-3090/7526981188.html
I tend to agree. Of course you can ask the seller clever questions beforehand, but I wouldn't bet my life on him answering truthfully. Even professional, non-private sellers.
Asking questions and expecting the truth in a GPU deal with nerds?
Aren't second-hand GPUs too risky?
You have no idea how they were treated, the amount of crazy overclocking they had to sustain, how many FurMark tests they had to endure, etc.
On the other hand, prices are still highway robbery here in Europe...
Realistically, as long as they weren't mining you should be okay.
But the concern is with the memory chips, because they can and will run hot nonstop if a GPU was mining 24/7, and that reduces their lifespan and can even kill a module.
I have purchased three used GPUs and all have worked well without issues (GTX 970, RTX 2070 Super, RTX 2080 Ti).
Pre-mining and pre-GDDR6X, buying second-hand GPUs was a non-issue.

I think there is greater risk with the 3000 series, however. So many used 3080s are flooding eBay, and I'm absolutely certain the vast majority have been used for mining. I will often ask directly whether the product has been used for mining and be met with zero response.
Damn the nostalgia. My EVGA GTX 570 was the first card I ever RMA'd (I eventually did a 980 as well down the road), but I RMA'd it because my own irresponsible overclocking caused issues. I didn't realize other people had problems with the same card; I figured I was just new to everything and had made silly mistakes lol.

GTX 500-series and older could be dicey, with overclocks and heat actually fucking up GPUs eventually, but from there onwards you were pretty safe.
I personally killed a GTX 570 with overclocks and heat: random artifacts and rain showers -> green lines everywhere -> nothing but green -> dead.
It was old anyway, and it had been my OC beast for years already, so don't feel bad for it; he went down like a champ.
In my home country something like 90% of our LAN-meet PCs would have second-hand GPUs from the West or Asia (I'm originally from the SADC region).
So yeah, before mining started burning out memory components it was pretty safe; Nvidia's protections made it nigh impossible to kill a GPU without actually going out and trying.
Now even RX 570s can come with faulty memory modules.... 3000-series cards, especially those with GDDR6X, are especially vulnerable to owners not giving a fuck about the temperatures of the memory modules and cooking them.
The good ol' days of overclocking for tangible gains.
LoL... they are crazy... PC gaming is dead for me.
Long live consoles!!!
This

You can build a PC that will shit on a console for under $1200. Just because they are releasing expensive next-gen top-end cards doesn't mean PC gaming is dead. I just built a 12400 / 3060 Ti rig similar to the one below for my buddy and it does 140-180 FPS in Warzone at 1440p. The best you will do on a console is 1080p at 120.
I agree with you on some points, but bragging about 8 GB GPUs is not going to be healthy going forward.
We know these consoles have a GPU in them really no better than a 2060 Ti, and the CPU is roughly the same as an R7 3700X. Just make sure you build a PC at that level or above and you'll play every single game with ease for the entire console generation, since no matter what, that hardware level will at worst be the baseline.
Edit: So maybe, after looking, an R7 2700X + RTX 2080 as your baseline will put you on par with the consoles, and anything above that is gravy.
There will never be a single game that comes out for the next-gen PS5 and Series X that you won't be able to play on a 2700X/2080, even if it looks or performs better on the console, which in most cases it probably won't. You are looking at this from a purely technical perspective and wanting like-for-like performance, whereas I am looking at it from the perspective that having that as your baseline means you will have a good experience going forward.
8 GB GPUs are in a problematic situation, and next-gen games will only rub more salt in the wound. After seeing how Forza, Spider-Man and other DX12 titles interact with GPUs, I can safely say that I cannot take 8, 10 or 12 GB at face value going forward. Most games will only use a maximum of 80-85% of your total VRAM budget: think of 8 GB as 6.4 GB (the 80% rule), 10 GB as more like 8 GB, and 12 GB as 9.6 GB. Considering both consoles can FULLY allocate an entire 10 GB as video memory to games (6 GB goes to system RAM and CPU operations), there is no way in hell 8 GB GPUs will be able to "keep up" with consoles in terms of fidelity, especially texture quality, going forward.
On the CPU side, there are already a lot of games where even a 3700X cannot match PS5-equivalent performance, so the 2700X comparison just doesn't hold up at this point.
It is not ideal to build PCs that merely "match" the consoles in current games. Things will go super sour once actual next-gen games hit the market; both VRAM requirements and CPU requirements will skyrocket.
The only way to properly future-proof yourself on texture quality / console-equivalent textures going forward is an 11-12 GB GPU. With anything less you will have to make huge compromises on texture quality for minor savings in VRAM consumption, which defeats the whole purpose of having a lot of grunt (such as a 3070 Ti).
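The "80% rule" above is a rough heuristic from this thread, not an official figure from NVIDIA or anyone else; a quick sketch of the arithmetic, just to show where the 6.4 / 8 / 9.6 GB numbers come from:

```python
# Sketch of the "80% rule" described above. The 0.80 fraction is the
# poster's rule of thumb for how much of a card's total VRAM games will
# actually commit; it is an assumption, not a measured or official value.

def usable_vram(total_gb, usable_fraction=0.80):
    """Estimate the VRAM budget a game can realistically use."""
    return total_gb * usable_fraction

for total_gb in (8, 10, 12):
    print(f"{total_gb} GB card -> ~{usable_vram(total_gb):.1f} GB usable")
```

By this reckoning even a 12 GB card lands just under the 10 GB that the consoles can dedicate entirely to video memory, which is the crux of the argument.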
Yeah, I double-checked the spec sheets on the Nvidia and MSI websites, and the 4090 is actually smaller than my 3070 Ti, unless they're lying.
The 3000 series throttle when they reach unsafe temperatures, and miners in general would rather improve the cooling or reduce clocks and voltages to keep profits high. I think the number one killer of cards is thermal cycling, and mining cards did not see much of that lol. I think most 3000-series mining cards will be fine.
P.S. That being said, I would only buy from a local seller who lets me test the memory with MSI Afterburner, or from an online marketplace that offers a full refund (like eBay or MercadoLibre).
Yup, so now you get why I'm not too worried. It's not that there is lots of space left, but I just have an old Corsair Obsidian 550D, so I can't see why people with new cases would have issues. Unless new cases are smaller?

Wow, that's crazy. Maybe the Suprim is just huge. I have a 3080 Gaming Trio X and I don't think it's that big. I'll double-check, thanks for the heads up.
Reducing the core clocks and voltage doesn't do anything to prevent the memory from running hot; hell, at some point you can't even reduce the memory clocks further, because then you are losing more hashrate than it's worth to even run the card.
And a lot of manufacturers don't put thermal pads on the backplate to help cool the memory modules.
Except for, I think, EVGA, pretty much all GPU partners use the core temperature to dictate what the GPU as a whole does, so the memory could be melting itself, but the core is "fine", so the GPU assumes everything is all good.
Thermal cycling can kill the core; GDDR6X literally just fries itself from running at those high temps for so long, even sustained.
Common "knowledge" in mining circles was that it's fine at ~100 degrees... but no one, literally no one, knows how long the memory modules stay happy at such high temps.
So that's why mining GPUs are hit and miss on the memory modules; the next time they hit high temps could be the last.
You could just be gaming for an hour or whatever, and that's the straw that broke the camel's back.
I would not worry about chips running near Tjmax for one or two years just because some miner got a lemon lol (that is like 1/5th to 1/10th of industry standards); maybe if you want to use your card for 10 years or something lol.
https://www.newegg.com/asus-geforce-rtx-4090-rog-strix-rtx4090-o24g-gaming/p/N82E16814126593
I had forgotten that the non FE cards would be more expensive. Yep… my 3090 lives on for now.
They have the ___ model at MSRP, then the ___ OC model for $1,699. I forget the 3000-series details, but the OC bump always tends to be minimal, like 50 or 100 MHz, and the non-OC ones were never available. It is bullshit that only really works if demand > supply and everyone colludes. It's stupid and only makes the FEs more popular. I mean, if they thought their tiny OC was really marketable, why wouldn't they make the base model the OC one and say "look, this thing runs 3% faster than the FE at the same price"? So I suspect the mandate is that the MSRP ones also need to run at spec.

Looks like we found the price mandate EVGA threw their hissy fit about; Nvidia are clearly requiring AIBs to sell at least one model at FE MSRP. There's no chance in hell they're doing this of their own accord. They'll probably ship one unit of those and tens of thousands of the $1,999 ones. They probably also placed a $1,999 price cap on AIBs' esoteric 4090 bullshit.
In the absence of DLSS 3.0, yes, the performance increase isn't too impressive.

I'm seeing some YouTube vids on how these are not performing as well as Nvidia claimed??
Wait till the mid-gen refresh; even the "next-gen consoles" struggle with Dying Light 2.
DLSS 3 Frame Generation is what Nvidia was using to get those 2-4x claims.
The 4080 12G is almost certainly going to have strict review embargoes, because it's not going to look that impressive.
Yes. Every time? Especially when we are talking about GPUs that cost as much as $1,599 while the entire console costs $500.

Shame these fuckers are gonna sell out at $2,500 probably. And holy shit, have you ever seen a console generation get completely outclassed like this in two short years?
I think everyone, including the reviewers, should just agree to stop calling that card the 4080 12GB and simply call it what it really is: the 4070.
That alone would instantly put into perspective how overpriced it is.
I'm fucking sorry? So now reviews can't even share results if they are bad?
If you don't think your product is gonna be hot, put an embargo on releasing the results.
Isn't the whole point of a GPU review to have precise benchmarks?
Oh, OK, I was thinking of an embargo as hiding negative results.
The reviewers will review it either way, but at least you can save face for a day or two during your launch window.
Hell, Digital Foundry already have the FPS results for the RTX 4090 vs the 3090... they just aren't allowed to tell us what those numbers are until whenever the embargo lifts.
I fully agree.
We should just call it a 4070 so we don't have to keep distinguishing between it and the true 4080.
Tinfoil hat time!!!!!!
The reason there is no 4080 12G Founders Edition is because Nvidia had already made the chassis with 4070 badges on them.
When greed took over, they released a new vBIOS to AIBs that makes the 4070 show up as a 4080 12G.
They can't fit the AD104 chip and PCB in the 4080 chassis, so no Founders Edition.
The rumored 10GB card that was supposed to be the 4060 Ti still has an AD104, so it shares a PCB with the AD104 in the 4080 12G; it will have a Founders Edition because they don't have to change the badging on the chassis, they'll just repurpose the ones that were supposed to go to the AD104-400.
Okay.
On topic:
If my predictions and the leaks (which so far have been correct on chips, CUDA counts and VRAM) hold up, the "baby" AD102 is much better value than even the 4080 16G.
Assuming it's $200 more expensive, you are getting a lot more performance from that chip than from the 16G.
Going up another $200 to the 4090, however, isn't netting you that much more performance.
I think I can hold out for a while, until we see if that 20G ever becomes a thing.
AD102-300 - $1600
AD102-250 - $1400
AD103-300 - $1200
AD104-400 - $900
AD106-300 - What are you doing....go buy a 3080
AD106-210 - You just really hate 3070s?
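The value argument above can be put into perf-per-dollar terms. The prices are the guesses from this post; the relative-performance numbers below are invented placeholders purely for illustration (no benchmarks exist yet), just to show how the value falls off at the top of the stack:

```python
# Hypothetical perf-per-dollar sketch for the lineup guessed above.
# Prices come from the post; the rel_perf values are made-up
# placeholders for illustration only, NOT benchmark results.

cards = [
    # (name, guessed price in USD, hypothetical relative performance)
    ("AD102-300 (4090)",      1600, 100),  # baseline = 100
    ("AD102-250 (baby AD102)", 1400, 90),
    ("AD103-300 (4080 16G)",  1200, 75),
    ("AD104-400 (4080 12G)",   900, 60),
]

for name, price, rel_perf in cards:
    value = rel_perf / price * 1000  # perf points per $1000
    print(f"{name:24s} ${price:5d}  {value:5.1f} perf / $1000")
```

With those placeholder numbers the baby AD102 comes out ahead of the 4080 16G on perf-per-dollar, which is the shape of the argument being made, but the real ranking depends entirely on benchmarks nobody has yet.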