BlackM1st
> In the off-chance you're not joking... Nvidia does not launch new generations every year.
I'm not in a rush, really.
(Off: Having a good flat screen is a bigger problem to me than just buying the latest Ti from Nvidia for it.)

> I'm not in a rush, really.
Flat? Why not cuuuuurved if you don't mind?

> I need 2in1 for work (design/CAD/typing/work with text) and play.
Ahhh, definitely understandable!

> Ahhh definitely understandable!
Apparently it is not. Found only 1, lmao: the ASUS PG32UQX. So yeah, graphics card "problems" are nothing compared to "flat screen" problems ;D
5700XT is 40 CU's. Regarding RT: it didn't look nice, but perhaps it says something about performance:
AMD's first ray tracing demo
Everything is so shiny o_O If you're interested in explanations of the demo: www.neogaf.com
I expect 2080Ti +30% or more performance, which should not be far from said card, although NV might choose to name it differently, so that the "but Ti is untouchable" myth created in the post-Maxwell era still lives..
And it is based on basic math:
2080Ti = 5700XT + 36-46%
5700XT is a 250mm2 36CU chip on inferior node and slower RAM.
We know from Sony that RDNA2 can cross 2GHz with reasonable power consumption, so clocks won't be lower.
We know AMD aims at 50% higher perf/W vs the existing Navi cards, closing the only remaining gap they had vs NV.
"Big Navi" is expected to be a 505mm2 chip with faster memory and 80CUs.
On top of it, Ampere is 8nm Samsung (which is more of a 10nm on steroids) while AMD enjoys TSMC 7nm.
Things stack in AMD's favor, so I expect a lot of "but DLSS" and more aggressive "black screen" trolling.
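A quick sketch of that napkin math. All inputs are the rumours/claims from above, and the scaling-efficiency factor is my own hypothetical fudge: doubling CUs never doubles real-world performance, and the claimed +50% perf/W is what would let an 80 CU part do this without a silly power budget.

```python
# Napkin math from the post above. All inputs are rumours/claims, and
# SCALING_EFFICIENCY is a hypothetical fudge factor: doubling CUs
# never doubles real-world performance.

XT_5700 = 1.00              # baseline: RX 5700 XT (40 CU) performance
RTX_2080TI = 1.41           # 2080 Ti ~= 5700 XT +36-46% (midpoint)

CU_RATIO = 80 / 40          # rumoured 80 CUs vs the 5700 XT's 40
SCALING_EFFICIENCY = 0.85   # assumed fraction of the extra CUs realised

big_navi = XT_5700 * (1 + (CU_RATIO - 1) * SCALING_EFFICIENCY)

print(f"Big Navi vs 5700 XT:  {big_navi:.2f}x")               # 1.85x
print(f"Big Navi vs 2080 Ti:  {big_navi / RTX_2080TI:.2f}x")  # 1.31x
```

On those assumptions you land right around the "2080Ti +30%" figure.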
> 5700XT is 40 CU's.
True, although it does not change much.
> After the awful experience with the RX 5700 XT I'm not sure if I should even bother with AMD GPUs.
People should RMA faulty cards instead of writing it off to drivers.
> People should RMA faulty cards instead of writing it off to drivers.
I did that.
> What am I supposed to do then?
That's a fucked up situation indeed. I avoid getting into it by being picky about the retailers I buy from.
> Should have never left the factory to begin with in that state.
Agreed.
> That's a fucked up situation indeed. I avoid getting into it by being picky about retailers I buy from.
I sold the card to someone but made it extremely clear that black screens happen often.
Since I got back into PC gaming in 2013 I started out with a rig that had an i7 4770K and a GTX 780 in it; eventually I upgraded the same rig with a GTX 1060. I sold that rig last year and now I have a new rig with a Ryzen 3700X and a Sapphire Nitro+ RX 5700. The reason I went with AMD for my graphics card this time was bang for the buck, along with features I liked about the card. It was between this card and the lower tier 2070 Super with no overclock. There was about $70 between the cards at the time, and I decided to go with the Nitro+. I am not upset that I went with this card, because it is arguably the best 1440p-4K card (on some titles) available. I was kind of burned by Nvidia in the transition between the 780 and the 1060. It seems they stopped supporting the 780 early in favor of their newest graphics architecture, which forced me to upgrade sooner than I wanted to. It almost felt like they purposely capped my 780, which, up until the 1060 was released, was a pretty good 1080p card.
Sapphire is one of the best AMD card makers, and they also include great software with the card that works well alongside AMD's official software and feature set. The software is called Trixx (stupid name, I know). It has a feature called Trixx Boost that works alongside AMD's sharpening technology: it lets you use a lower custom resolution, between 70% and 95% of true 4K or true 1440p, and then apply AMD's sharpening on top for some pretty good results. This feature beats the original DLSS 1.0 that Nvidia offers. Even though DLSS 2.0 beats Trixx Boost, as of now Trixx Boost supports most games, whereas DLSS 2.0 is only supported by select games built from the ground up with the feature in mind.
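The resolution side of Trixx Boost is simple to sketch: render at a percentage of native, then upscale and sharpen. The function name and the even-pixel rounding below are mine, purely illustrative:

```python
# Sketch of a "Trixx Boost"-style custom render resolution:
# render at 70-95% of native, then upscale + sharpen in the driver.

def boost_resolution(native_w: int, native_h: int, scale: float) -> tuple:
    """Return a reduced render resolution for a given scale factor."""
    if not 0.70 <= scale <= 0.95:
        raise ValueError("Trixx Boost exposes scales from 70% to 95%")
    # keep dimensions even, as GPUs/displays prefer
    w = int(round(native_w * scale)) // 2 * 2
    h = int(round(native_h * scale)) // 2 * 2
    return (w, h)

# e.g. rendering at 85% of 4K and letting sharpening clean up the upscale:
print(boost_resolution(3840, 2160, 0.85))   # -> (3264, 1836)
```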
Anyway, I am a bang for the bucks kind of guy, and I have never owned a top of the line Nvidia Ti card anyway, so the next purchase for my graphics card will not be based on brand loyalty, but the price and feature set of the card at the time of purchase.
> AMD does such a better job of supporting their cards in the long run than Nvidia does, and their software utility is also leagues better. The Nvidia control panel makes me feel like I'm running Windows 95, and GeForce Experience is such a joke.
Nvidia's control panel has been there for like 20 years now. How about a graphical overhaul?
If true, they just need to sell Big Navi for $600 and hear the screams from Jensen...
At a similar price nobody is gonna choose AMD over Nvidia.
> Thread title suggests AMD will match Nvidia's 3080 Ti, but the article suggests it won't. Maybe worth clarifying the OP?
I was confused too, but then I looked closely and the thread name is missing "Ti", so it's the classic 3080...
> Not going to happen, it will likely be 700-1000 is my guess.
I'm just trying to think what AMD can offer to get people who aren't brand loyal on their side with a GPU that costs the same as Nvidia's with similar power...
People seem to have been buying the 5700 XT in good quantities since it launched, no? I would say that discounts your second comment.
As far as performance goes, it is hard to say, as we hear differing rumours from both AMD and Nvidia all the time. I'm going to throw out a prediction that I think will be roughly in line with where we end up, give or take a little. I think Big Navi will either match or exceed the 3080 in performance and likely sit somewhere between the 3080 and 3090. I don't think it will match/beat the 3090, but if it sits between the 3080 and 3090 then I think that would be an amazing win for AMD. They would be competing at the high end again.
Could be just wishful thinking on my part though, we will have to see how it all pans out.
AMD is behind in ray tracing, DLSS and power draw. They need to catch up AND have a competitive price.
Big Navi hype has begun.
> I'd be surprised to be honest. I'm expecting it to be at best 3070 or 3070 Super (in the future) level. I just hope it's not as power hungry as Ampere.
It should beat the 2080Ti comfortably; I doubt a 3070 would do that (and not cost an arm and a leg... card names are, after all, arbitrary, you could call the 2080 a 2070 Super... oh wait...)
> DLSS is something AMD can't do at all. While they can use the tech they created for the consoles for ray tracing, DLSS requires a technology that AMD has not shown they have. Tensor cores are not just extra GPU cores put to the side. They process FP16 x2 (which gets put back together as FP32 and sent out as such, but can't be inputted as FP32) so quickly that they have their own FLOP rate. Half the number of Tensor cores by themselves could do nearly double the FLOP rate of CUDA cores or CUs. This is what allows the AI to supersample so quickly and accurately. Some people have run Control at 580p and had DLSS upscale it so perfectly to 1440p that you can't tell the difference. Until AMD creates something akin to Tensor cores, don't expect anything like that.
Is their situation that bad?
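For scale, the tensor-core throughput claim above can be sanity-checked on paper using Nvidia's published 2080 Ti figures (4352 CUDA cores, 544 tensor cores, ~1545 MHz reference boost). These are theoretical peaks only; real workloads land well below them:

```python
# Rough theoretical throughput math for a 2080 Ti (public spec numbers;
# real-world rates are lower and workload dependent).

cuda_cores = 4352
tensor_cores = 544
boost_clock_hz = 1.545e9

# FP32 on CUDA cores: one FMA (2 ops) per core per clock
fp32_flops = 2 * cuda_cores * boost_clock_hz
print(f"FP32 (CUDA):   {fp32_flops / 1e12:.1f} TFLOPS")    # 13.4

# FP16 on tensor cores: 64 FMAs (128 ops) per tensor core per clock
tensor_flops = 128 * tensor_cores * boost_clock_hz
print(f"FP16 (tensor): {tensor_flops / 1e12:.1f} TFLOPS")  # 107.6

print(f"ratio: {tensor_flops / fp32_flops:.0f}x")          # 8x
```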
Doesn't change anything; AMD doesn't need to be able to compete on every segment, including ones where the prices are ridiculous. Nvidia created this segment that includes $999 GPUs, it makes no sense for the vast majority of people anyway, and AMD simply isn't in it right now.
What really matters is how they'll be able to compete with the likes of a 3060 or 3070 at most, as well as AMD's answers to RT and DLSS.
> Is their situation that bad?
I mean, not having DLSS is gonna be freaking huge...
It is hard to find the part of the statement that is not wrong, chuckle.
1) GPUs are inherently great at ML inference
2) Benchmarks have shown that Volta doesn't do better with Tensor cores in convolutional inference.
3) AMD supports DirectML (that's a generic ML inference API, could be used for... AI stuff lol)
4) AMD has beaten DLSS 1.0 (which is also ML based) using old school algorithmic approach (FidelityFX, oh it runs on NV GPUs too)
5) Last but not least, curiously AMD supports AI Upscaling
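Point 4's "old school algorithmic approach" is basically adaptive sharpening. Here's a toy 1D sketch in the spirit of CAS: an unsharp mask whose strength drops where local contrast is already high. The function name and weighting are mine; AMD's actual FidelityFX CAS is a per-pixel compute shader and considerably more involved:

```python
# Toy 1D "contrast adaptive sharpening" in the spirit of AMD's CAS.
# Sketch of the idea only, not AMD's actual shader.

def cas_like_sharpen(row, strength=0.5):
    """row: list of floats in [0, 1]; returns a sharpened copy."""
    out = []
    for i, px in enumerate(row):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        # adapt: the higher the local contrast, the less extra sharpening
        contrast = max(px, left, right) - min(px, left, right)
        amount = strength * (1.0 - contrast)
        # unsharp-mask style high-pass component
        detail = px - (left + right) / 2
        out.append(min(1.0, max(0.0, px + amount * detail)))
    return out
```

Note how a hard edge (local contrast near 1) gets almost no extra sharpening; that's the "adaptive" part that keeps CAS from producing halos.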
> Technically speaking, AMD already has image sharpening software built into its drivers. The good part is that devs don't have to specially insert it like they do DLSS, and it runs on all models rather than having to be an RTX card, but I haven't seen anyone compare FidelityFX to DLSS.
That is wrong.
Fidelity FX is an external toolkit that developers can integrate at will.
It runs on AMD, NV, Intel.

> And cost just as much as the Radeon VII.
What do you expect the 3080 (that is allegedly beaten by "BIG NAVI") to cost?
I'm going by what I've read from Nvidia's site and articles covering DLSS. I'm aware of FidelityFX and its AI upscaling, but I don't see it talked about as much, so I assumed it wasn't as good. Thanks for the information; got any links to these benchmarks?
I didn't know that until you told me. I've only heard AMD bring up Fidelity, so I assumed it was their software. Was not aware it was open source, that's interesting to know. I wonder how well it works with APUs.
Edit: It does A LOT more than I thought it did (I thought it was just an AI super sampler). Why doesn't anyone talk about this more? Can it be implemented into consoles too?
We will just file this under "no shit"
Seriously, AMD is no match for Intel or Nvidia. The only way AMD can win is by shrinking their chips, but once the big guys shrink, they're simply more efficient, more powerful and better designed. For instance, AMD at 7nm is only slightly better than Intel at 14nm+ (repeated). Intel will destroy them at 10nm. Nvidia will destroy them at 8nm.
Timestamped for you.
AMD open sources most stuff on the gaming effects side.
AMD GPUOpen
Discover your best graphics performance by using our open source tools, SDKs, FidelityFX effects, and tutorials. (gpuopen.com)
E.g. a lot of the hair you see in major titles is derived from TressFX (Square Enix, Guerrilla Games, etc.)
The SSSR (stochastic screen-space reflections) recently showcased in Marvel's Avengers was AMD's.
The sharpening everyone is harping on about (even the one now in Nvidia's CP) is from AMD (CAS).
> What do you expect 3080 (that is allegedly beaten by "BIG NAVI") to cost?
$700, just like the 2080. Remember this?