
AMD RX 6000 'Big Navi' GPU Event | 10/28/2020 @ 12PM EST/9AM PST/4PM UK

Antitype

Member


Lisa Su, why do you do this to us??


I don't get who the 6800XT is for. It's on par or maybe slightly better than the 3080 in rasterization, but way behind in ray tracing, and yet it costs the same price. They don't even offer any proprietary tech that could somehow entice you into going with their GPU over the competition. And their software stack is miserable compared to Nvidia's. So really, who is this for? People who only play old rasterized games? Makes no sense to me at this price point.
 

Kenpachii

Member
In what world does a 3080 run Metro Exodus with ray tracing at 1440p at 100 fps without DLSS? It doesn't even hold 60 fps on a 3080 without DLSS.

 

Rikkori

Member
I don't get who the 6800XT is for. It's on par or maybe slightly better than the 3080 in rasterization, but way behind in ray tracing, and yet it costs the same price. They don't even offer any proprietary tech that could somehow entice you into going with their GPU over the competition. And their software stack is miserable compared to Nvidia's. So really, who is this for? People who only play old rasterized games? Makes no sense to me at this price point.
Keep in mind 3080s barely exist at all, let alone at MSRP. We'll have to see how it actually pans out over the next 6 months. And remember, you can't expect the same level of performance from an Nvidia card past its first 2 years, etc. etc.



In what world does a 3080 run Metro Exodus with ray tracing at 1440p at 100 fps without DLSS? It doesn't even hold 60 fps on a 3080 without DLSS.


That's extreme though.
 

Kenpachii

Member
Keep in mind 3080s barely exist at all, let alone at MSRP. We'll have to see how it actually pans out over the next 6 months. And remember, you can't expect the same level of performance from an Nvidia card past its first 2 years, etc. etc.




That's extreme though.


I would assume they use max settings when benchmarking GPU performance, but we'll see.
 

llien

Member
Can we call "without DLSS" what it really is please? Without upscaling.

I don't think it'll lose, but it will certainly be behind Ampere; after all, Ampere is Nvidia's 2nd-gen RT tech.
You guys sound as if RDNA2 and Ampere Fermi2 had not happened.

I don't get who the 6800XT is for. It's on par or maybe slightly better than the 3080 in rasterization, but way behind in ray tracing, and yet it costs the same price.
6GB more RAM, lower power consumption, and the same arch as the consoles, for $50 less.
"It is the same price in Europe" is bullshit; you can't buy a 3080 in Europe (or anywhere else, for that matter).

The 3070 is there for you to have at 650 euros.
 

SantaC

Member
I don't get who the 6800XT is for. It's on par or maybe slightly better than the 3080 in rasterization, but way behind in ray tracing, and yet it costs the same price. They don't even offer any proprietary tech that could somehow entice you into going with their GPU over the competition. And their software stack is miserable compared to Nvidia's. So really, who is this for? People who only play old rasterized games? Makes no sense to me at this price point.
Since when is ray tracing a dealbreaker for PC gaming? People want to play at 60fps in 4K at ultra settings.
 
Can we call "without DLSS" what it really is please? Without upscaling.


You guys sound as if RDNA2 and Ampere Fermi2 had not happened.


6GB more RAM, lower power consumption, and the same arch as the consoles, for $50 less.
"It is the same price in Europe" is bullshit; you can't buy a 3080 in Europe (or anywhere else, for that matter).

The 3070 is there for you to have at 650 euros.

Its power consumption is roughly the same though, no? 300 vs 320? As if gamers give the slightest flying fuck about power consumption. I don't care about framerates, features, speed; I'm all about that power consumption. This is a non-aspect.

No matter what words you use for DLSS, it doesn't negate the feature, nor its effects, nor the fact that AMD is parroting Nvidia on this as well, after a ray tracing solution that's behind Nvidia's first gen of it.


If true, that definitely nails the coffin shut and buries AMD for another generation.


I got a higher framerate than them. A clean 100.

 

Dampf

Member


Lisa Su, why do you do this to us??

Let's wait here. You can't just take FPS numbers from other websites without knowing how AMD measured that result. Metro Exodus performs significantly worse in the in-game benchmark, so if AMD used the benchmark to gauge performance while the 3080 numbers are from a lighter region of the game, well... that would render that comparison completely pointless.
 

NoviDon

Member
AMD's first-gen ray tracing solution will be great, but not as great as Nvidia's second crack at it. It's what we expected... you can't expect AMD to jump ahead a generation and compete head to head in rasterization, whoop Nvidia's ass in perf/watt, whoop them at lower resolutions, AND match them in RT in one generation. We're saving the RT ass-whooping for RDNA 3 next year, babe 😘
 
DLSS is a TAA derivative that blurs shit.


Since Ampere sucks and being defensive.

Holy disingenuous reduction, Batman. It takes everything good about good TAA implementations and removes almost everything that could be bad about them. You want to obfuscate and bullshit people who are too lazy to Google for 2 seconds. And that's fine. I don't pretend that team red doesn't have some distinct, important advantages this go-around. It's almost as if you have an agenda... hmmmm
 
AMD's first-gen ray tracing solution will be great, but not as great as Nvidia's second crack at it. It's what we expected... you can't expect AMD to jump ahead a generation and compete head to head in rasterization, whoop Nvidia's ass in perf/watt, whoop them at lower resolutions, AND match them in RT in one generation. We're saving the RT ass-whooping for RDNA 3 next year, babe 😘


AMD has worse ray tracing than Nvidia had in their 1st attempt. They're not whooping anyone's ass pretty much ever in this department. Nvidia is more than one gen ahead in this, fully committed.

Holy disingenuous reduction, Batman. It takes everything good about good TAA implementations and removes almost everything that could be bad about them. You want to obfuscate and bullshit people who are too lazy to Google for 2 seconds. And that's fine. I don't pretend that team red doesn't have some distinct, important advantages this go-around. It's almost as if you have an agenda... hmmmm


That guy repeats the same bullshit even when presented with pictures and videos. He looks sideways at them and pretends it doesn't exist. He ignores every website and YouTube channel and proceeds to call it TAA that blurs.
 
Holy disingenuous reduction, Batman. It takes everything good about good TAA implementations and removes almost everything that could be bad about them. You want to obfuscate and bullshit people who are too lazy to Google for 2 seconds. And that's fine. I don't pretend that team red doesn't have some distinct, important advantages this go-around. It's almost as if you have an agenda... hmmmm
AMD has worse ray tracing than Nvidia had in their 1st attempt. They're not whooping anyone's ass pretty much ever in this department. Nvidia is more than one gen ahead in this, fully committed.




That guy repeats the same bullshit even when presented with pictures and videos. He looks sideways at them and pretends it doesn't exist. He ignores every website and YouTube channel and proceeds to call it TAA that blurs.
Never seen a fanboy as bad as him. He literally turns a blind eye to facts, and even when the entire thread he created laughs at him over the DLSS comparison in Death Stranding, he still doubles down. Delusion is a serious thing.
 
AMD's first-gen ray tracing solution will be great, but not as great as Nvidia's second crack at it. It's what we expected... you can't expect AMD to jump ahead a generation and compete head to head in rasterization, whoop Nvidia's ass in perf/watt, whoop them at lower resolutions, AND match them in RT in one generation. We're saving the RT ass-whooping for RDNA 3 next year, babe 😘
They have more cards releasing next year?
 

FireFly

Member
Well, there it is. I knew something smelled bad about the AMD conference due to what was shown, or rather what wasn't shown.

I'm also waiting to see DX 11 benchmarks since AMD carefully avoided them.
I doubt AMD has much to be concerned about with DirectX 11 titles, since they should have a big fillrate and geometry advantage.
 

waylo

Banned
DLSS is a TAA derivative that blurs shit.


Since Ampere sucks and being defensive.
This is coming from someone who looked at a DLSS screenshot from literally the first implementation and formed an unshakable opinion that completely ignores the fact that current DLSS is fantastic and, in some cases, quite literally better than native resolution. But if you want to continue to be ignorant on purpose to try and spread false info, you do you.
 

Ascend

Member
Its power consumption is roughly the same though, no? 300 vs 320? As if gamers give the slightest flying fuck about power consumption. I don't care about framerates, features, speed; I'm all about that power consumption. This is a non-aspect.
It's always funny to me that when it's nVidia, nobody cares about power consumption. But when it was the R9 290X, the R9 390X, the R9 Fury, the RX480, the RX580, the Vega 56/64, the Radeon VII and the 5700XT, suddenly power consumption became a primary concern. And yes, some of those had differences of less than 20W compared to nVidia...
 

diffusionx

Gold Member
We kind of had a hint at this when it looked like the Xbox could do 2080-ish levels of performance without RT, but it went down to 2060 Super territory with RT in WDL. That's based on only one game, but it was still literally all we had seen from AMD on RT to that point.
 

Ascend

Member
They have more cards releasing next year?
Next year will bring the 40 CU equivalent. And maybe by the end of next year we get RDNA3, which allegedly will have another 50% bump in performance per watt, plus improved ray tracing.

That's really weird because those Youtubers have had review units for about a week now according to a few of them.
Yeah but they're under NDA.
 
I remember telling people about my experiences with VRAM requirements on my own development side, as well as testing games on my PC. I declared that 10GB of VRAM was too little for this generation. Instead of hearing me out, I got backlash that "usage" doesn't mean "allocation" over and over again, to justify some invisible number that a person THINKS the graphics engine is using.

Bottom line: 16GB or more of VRAM is required for good, cohesive FPS and a streamlined pipeline. It'd be nice if people actually listened and asked questions instead of strong-arming my recommendations, only to be proven wrong by some random website. :messenger_beaming:


It'd be nicer if you provided evidence to back your claims. You simply haven't done that. Until you do, it's still speculation. Nobody is arguing that having more wouldn't be better/more ideal. What's been said is exactly what you wrote, but you have provided no proof that it's an actual issue. We haven't seen benchmarks to really support your argument either.
 
Same as every game or tech product these days. It's industry standard.

Nope. The 3080 Founders was a day before. The 6800 XT looks to be on par or even slightly above the 3080, even at 4K (although it'll be close there, and I'm in the minority gaming at 4K; 1440p is still the sweet spot). Why the lack of strength if you've got an undeniably strong hand? I think they're hung up on ray tracing performance (which isn't even that bad, from all accounts).
 

rofif

Can’t Git Gud
I don't get who the 6800XT is for. It's on par or maybe slightly better than the 3080 in rasterization, but way behind in ray tracing, and yet it costs the same price. They don't even offer any proprietary tech that could somehow entice you into going with their GPU over the competition. And their software stack is miserable compared to Nvidia's. So really, who is this for? People who only play old rasterized games? Makes no sense to me at this price point.
The only thing the 6800 XT has is VRAM, which is not needed yet, and it's hard to say if it will be needed in less than 2 years.
 

GreatnessRD

Member
I don't get who the 6800XT is for. It's on par or maybe slightly better than the 3080 in rasterization, but way behind in ray tracing, and yet it costs the same price. They don't even offer any proprietary tech that could somehow entice you into going with their GPU over the competition. And their software stack is miserable compared to Nvidia's. So really, who is this for? People who only play old rasterized games? Makes no sense to me at this price point.
RAGE MODE!!!!!

RAGE MODE!!!

RAGE MODE!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!!
 
Yeah, like blur, or loss of detail.
Oh wait.

But sure John. :messenger_beaming:


I'm wondering why you keep going with this demonstrably false narrative. Any sane person would be happy that a technique that improves image quality while boosting performance exists. You were presented with countless proofs from every angle; you just close your eyes and keep going with the same thing.
 
35 minutes to the start of November 18th 2020. GO GO INDEPENDENT BENCHMARKS AND REVIEWS!



It's because me love numbers long time.
 

ZywyPL

Banned
I'm wondering why you keep going with this demonstrably false narrative. Any sane person would be happy that a technique that improves image quality while boosting performance exists. You were presented with countless proofs from every angle; you just close your eyes and keep going with the same thing.

It's Lisa Su's account, that's why ;)
 