
AMD Publishes More Radeon RX 6900 XT, RX 6800 XT & RX 6800 RDNA 2 Graphics Card Benchmarks in 4K & WQHD

PhoenixTank

Member
Jul 13, 2017
1,399
1,552
705

Allegedly even that will be supported.
Surprisingly, the baseline boards will have the biggest "benefit".


Although, I am not certain if there is even more up-to-date information about that.



Wouldn't make sense to me if that doesn't include "full Zen 3 capability support"
That first image is from the Zen 2 launch, if the URL is to be believed. It was added and then removed pre-launch. You're going to be very disappointed if you're hoping the BIOS update next year will bring PCIe 4.0 to 400-series boards.

Based on the whole debacle with the 400 series originally being dropped, I wouldn't expect any extras like SAM on those boards.

Might come to Zen 2 down the line, but there's not much to go on there.
 

supernova8

Member
Jun 2, 2020
2,053
3,004
430
Don't forget that the 3070 is $499 MSRP (not saying you'll get it for that price, but that's what it is). If AMD releases a 6700 XT at $499 that cannot beat the 3070, NVIDIA will win the mid-range (again).
 

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,409
805
Great that they have brought out some monsters in pure rasterization. Now where are the benchmarks that matter? Ray tracing?!

Exactly. I think these benchmarks should have been shown already; they're only hiding them because they are really down on RT performance, which is the true value of the Nvidia boards.
 
  • Love
Reactions: DonJuanSchlong

DonJuanSchlong

Spice Spice Baby
Jul 15, 2020
3,215
8,225
735
Exactly. I think these benchmarks should have been shown already; they're only hiding them because they are really down on RT performance, which is the true value of the Nvidia boards.
Thank you! There's a reason they aren't showing their ray-tracing performance. If it were so supreme, they would boast it from the rooftops, right, Ascend?! I guess ray tracing is irrelevant in 2020 and beyond.
 

Poppyseed

Member
Feb 4, 2017
821
238
355
Thank you! There's a reason they aren't showing their ray-tracing performance. If it were so supreme, they would boast it from the rooftops, right, Ascend?! I guess ray tracing is irrelevant in 2020 and beyond.

As a 3080 owner, I find ray tracing mostly irrelevant anyway. AMD is right not to focus on it. When you can enable it without the precipitous FPS drop, then we can talk. Watch Dogs has the best implementation yet (supremely impressive in parts), and I'm still not particularly interested.
 

Elias

Member
Oct 24, 2020
413
838
280
As a 3080 owner, I find ray tracing mostly irrelevant anyway. AMD is right not to focus on it. When you can enable it without the precipitous FPS drop, then we can talk. Watch Dogs has the best implementation yet (supremely impressive in parts), and I'm still not particularly interested.

Doesn't Watch Dogs only use RTX reflections and completely forgo RTX shadows and GI?
 

Rickyiez

Member
Jan 20, 2020
706
935
430
Ermm, Watch Dogs: Legion's RT implementation is very lackluster. The puddles are quite static, and there's no RT GI or RT shadows. I would say Control is the best so far.
 
Dec 28, 2006
1,626
1,244
1,560
That's my expectation as well. I think we'll see 30% of the raster performance of a 3080 with RT when head-to-head without DLSS. I think AMD's DLSS solution will need to get here quickly to be able to compete, though.

As far as I'm aware, AMD's solution for RT is a little better than Turing but behind Ampere. At the very minimum, a 6800 XT would perform slightly above a 2080 Ti with RT turned on. Then if you factor in that the 6800 XT is more powerful at rasterization than the 2080 Ti, those gains should increase a little as well. No idea where you are getting 30% of the performance of a 3080 when the 2080 Ti is already way higher than that.

While I think the 3080 is going to be ahead of the 6800 XT in RT performance when all is said and done, this will be mostly noticeable in fully path-traced games such as Minecraft and Quake. I don't think the difference in performance is going to be anywhere near as massive as some people believe. I mean, already in hybrid rendering scenarios (99% of games) Ampere does not pull massively ahead of Turing in performance.

We have recently seen a leaked benchmark of Shadow of the Tomb Raider with RT for the 6800 (non-XT) at both 4K and 1440p. It shows the card performing quite admirably, beating out a 2080 Ti and 3070 with ray tracing turned on. Of course, this could have been faked, or could be an outlier for whatever reason and not represent performance across the board in multiple titles, so wait for real benchmarks.

But if this does represent the general performance level, then that is pretty impressive and would indicate that in real-world, hybrid rendering scenarios the performance of the AMD cards might be quite close to the 3000-series cards. Maybe a few FPS of difference, nothing monumental the way people are currently making out.

If the 6800 benchmark is real and representative of general RT performance, then what is especially impressive is that it got this performance on DXR 1.0, which was designed as a collaboration between MS and Nvidia around Nvidia's RT solution. AMD worked with MS on DXR 1.1, which is optimized for AMD's solution. So any RTX games currently out are optimized for Nvidia hardware/DXR 1.0.

I'm not saying this will necessarily make a difference in the end, as we haven't seen any DXR 1.1 games yet, and I don't know what kind of performance "loss" AMD has with DXR 1.0 compared to any hypothetical "gain" they might have with DXR 1.1, but it's just something worth taking into account.

Of course, all of this is without taking into account DLSS; when you turn that on, obviously the Nvidia cards pull ahead. AMD have mentioned we should hear more about Super Resolution soon, presumably before the launch of the cards. I've heard rumours it might launch in Dec/Jan in a driver update. Right now we don't really know much about it, so we will have to wait and see what happens. But for those shouting from the rooftops about DLSS being the be-all, end-all feature and that AMD need a similar solution, it looks like they will have one shortly.
 
  • Like
Reactions: Elias and Ascend

FireFly

Member
Aug 5, 2007
1,596
1,146
1,300
Don't forget that the 3070 is $499 MSRP (not saying you'll get it for that price, but that's what it is). If AMD releases a 6700 XT at $499 that cannot beat the 3070, NVIDIA will win the mid-range (again).
If the 6700 XT is slower, they won't release it for the same price. (Kind of sad that $500 is now considered mid-range.)
 
  • Thoughtful
Reactions: AquaticSquirrel

Elias

Member
Oct 24, 2020
413
838
280
It very much seems AMD is the king of rasterization, and may be competitive in RTX performance and have an answer for DLSS. In fact, this is what everyone should be hoping for, since it means real competition at every level of the high-end GPU market.
 
  • Like
Reactions: Ascend

thelastword

Banned
Apr 7, 2006
11,732
11,939
2,000
Don't forget that the 3070 is $499 MSRP (not saying you'll get it for that price, but that's what it is). If AMD releases a 6700 XT at $499 that cannot beat the 3070, NVIDIA will win the mid-range (again).
A 6800 is about 16-18% stronger than a 2080 Ti. The 2080 Ti is a stronger card than the 3070, and it has more RAM too, so the 6800 beats the 3070 even more so. Sony releasing a 6700 XT to beat the 2080 Ti by 5%, with 12GB of VRAM at $479-499, is going to be the obvious strategy... the 6700 XT will win in every scenario... I can see the 6700 XT coming with 48 CUs...
 
  • Like
Reactions: llien

MadYarpen

Member
Nov 12, 2015
1,590
755
620
Warsaw, Poland
A 6800 is about 16-18% stronger than a 2080 Ti. The 2080 Ti is a stronger card than the 3070, and it has more RAM too, so the 6800 beats the 3070 even more so. Sony releasing a 6700 XT to beat the 2080 Ti by 5%, with 12GB of VRAM at $479-499, is going to be the obvious strategy... the 6700 XT will win in every scenario... I can see the 6700 XT coming with 48 CUs...
Remember, the 6800 was benchmarked by AMD with SAM and Rage Mode enabled.
 
Dec 28, 2006
1,626
1,244
1,560
Sony releasing a 6700 XT to beat the 2080 Ti by 5%, with 12GB of VRAM at $479-499, is going to be the obvious strategy... the 6700 XT will win in every scenario... I can see the 6700 XT coming with 48 CUs...

Sony releasing?

As far as I'm aware, the 6700 XT will be based on the Navi 22 die, which has a maximum of 40 CUs. Given the smaller power draw and fewer CUs, this card can likely clock quite high, so potentially 2.3-2.5 GHz to edge out a little more performance.

All in all, though, at 40 CUs the 6700 XT will likely be a little weaker than the 3070 with its 46 SMs. Maybe they can close that gap with higher clocks and Infinity Cache? (The 3070 uses GDDR6, not 6X.) Hard to say, and obviously don't count any chickens before they hatch; maybe we will all be surprised by benchmarks. But looking just at the core counts, the 3070 would appear to have the advantage.

So expect the 6700 XT to be slightly cheaper ($50?) if they are on par, and significantly cheaper ($100+?) if the 6700 XT is noticeably weaker.
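Back-of-the-envelope math on those core counts, as a sketch: the 2.4 GHz RDNA 2 clock below is an assumption pulled from the 2.3-2.5 range above, and Ampere's dual-issue FP32 inflates its paper number, so treat these as rough upper bounds, not benchmarks.

```python
# Rough theoretical FP32 throughput from unit counts and clocks.
# Assumptions: 64 shaders per RDNA 2 CU, 128 FP32 lanes per Ampere SM,
# 2 ops per lane per clock (FMA). Clocks are guesses, not official boosts.
def tflops(units, lanes_per_unit, clock_ghz):
    return units * lanes_per_unit * 2 * clock_ghz / 1000

rx6700xt = tflops(40, 64, 2.4)    # 40 CUs at a hypothetical 2.4 GHz
rtx3070 = tflops(46, 128, 1.73)   # 46 SMs at the 3070's rated boost

print(f"6700 XT ~{rx6700xt:.1f} TFLOPS, 3070 ~{rtx3070:.1f} TFLOPS")
```

The paper gap looks large, but Ampere's doubled FP32 lanes share scheduling with INT work, which is exactly why higher clocks and Infinity Cache could narrow things in practice.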
 

thelastword

Banned
Apr 7, 2006
11,732
11,939
2,000
Remember, the 6800 was benchmarked by AMD with SAM and Rage Mode enabled.
I have not seen any benchmarks with Rage Mode, but yes, it does have SAM enabled. Who wouldn't, if they were on an X570 motherboard with the latest Ryzen 5000? It's free performance with such a combo. However, just note that if AMD loses frames when paired with an Intel CPU, so does Nvidia. AMD is not winning all these games by 11% because of SAM; without SAM, AMD still beats Nvidia in Forza 4.
 

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,409
805
As a 3080 owner, I find ray tracing mostly irrelevant anyway. AMD is right not to focus on it. When you can enable it without the precipitous FPS drop, then we can talk. Watch Dogs has the best implementation yet (supremely impressive in parts), and I'm still not particularly interested.


Best implementation goes to Crysis Remastered. Control has a better implementation too. And Metro's DLC boasts the most advanced area-lighting feature available for rendering yet.
 
Apr 11, 2016
1,423
1,648
510
As a 3080 owner, I find ray tracing mostly irrelevant anyway. AMD is right not to focus on it. When you can enable it without the precipitous FPS drop, then we can talk. Watch Dogs has the best implementation yet (supremely impressive in parts), and I'm still not particularly interested.


AMD is not focused on it? Both consoles have it and so does AMD; there's plenty of focus. The reason you haven't seen it in their presentation is that they will be pretty far behind Nvidia, even though they had two years to reverse-engineer Nvidia's cards and copy them. Ray tracing is going to be everywhere as 2021 unfolds. Hopefully we'll get rid of this "ray tracing doesn't matter/isn't impressive" nonsense. It's an expensive technique and will forever be expensive; you'll always drop the average framerate by a lot with it.
 

JCK75

Member
Apr 19, 2018
1,554
1,583
475
I'm so happy to see them finally score a solid comeback in their GPU division.
I'll be curious to see benchmarks a year from now, because in my experience AMD cards tend to really gain performance once their drivers mature.
 
  • Thoughtful
Reactions: AquaticSquirrel

thelastword

Banned
Apr 7, 2006
11,732
11,939
2,000
Exactly. I think these benchmarks should have been shown already; they're only hiding them because they are really down on RT performance, which is the true value of the Nvidia boards.
It's really a cycle... When AMD had not officially shown RDNA 2 yet, it was "they're hiding something, they can't compete with Nvidia, they will only get to 3070 performance at best." Note that all these comments were about rasterization. Now that AMD flogs NV in rasterization, the goalposts have shifted to ray tracing being "the only thing that matters"... in the six games it is in, for how many years since Turing launched?

Now, when AMD shows good ray-tracing performance and silences the detractors in ray tracing as they did with rasterization, what will be the next goal-shifting point?
 

regawdless

Banned
Sep 21, 2020
2,724
7,012
555
Just for my understanding:
These benchmarks were measured in combination with the new AMD CPUs.

So should we subtract around 10% of performance to get more or less the result on all other CPUs?

Or how should I interpret these results?
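For what it's worth, undoing an assumed SAM/Rage uplift is a division, not a subtraction, since the uplift was applied multiplicatively. A quick sketch, with the ~10% figure taken from the question above, not from AMD:

```python
# Strip an assumed combined SAM + Rage Mode uplift out of a published number.
# uplift=0.10 means the published FPS is ~10% higher than a non-SAM setup.
def without_uplift(published_fps, uplift=0.10):
    return published_fps / (1 + uplift)

print(round(without_uplift(88.0), 1))  # 88 FPS published -> ~80 FPS without SAM
```

Note that subtracting a flat 10% (88 - 8.8 = 79.2) slightly over-corrects; dividing by 1.10 is the consistent inverse.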
 

gspat

Member
Apr 23, 2016
460
236
390
I've started to look at it like Ryzen, Ryzen+, Ryzen 2...

People were all defensive, saying stuff like: the performance is OK, but it's still not competing at the high end, in single-core, AVX, etc.

They were saying this even just days before Ryzen 3.

Now, with leaks coming out, it's basically the same.

I'm just happy to see actual competition, even if we need to wait another year for RDNA 3 to really level the playing field.

Go team consumer!
 
  • Like
Reactions: Panajev2001a

Ascend

Member
Jul 23, 2018
3,537
5,154
585
It's really a cycle... When AMD had not officially shown RDNA 2 yet, it was "they're hiding something, they can't compete with Nvidia, they will only get to 3070 performance at best." Note that all these comments were about rasterization. Now that AMD flogs NV in rasterization, the goalposts have shifted to ray tracing being "the only thing that matters"... in the six games it is in, for how many years since Turing launched?

Now, when AMD shows good ray-tracing performance and silences the detractors in ray tracing as they did with rasterization, what will be the next goal-shifting point?
DLSS, obviously. And when AMD comes out with their own alternative, it will boil down to nitpicking screenshots at 100x zoom to find one pixel of difference, and then claiming that single pixel makes nVidia better.
 

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,409
805
It's really a cycle... When AMD had not officially shown RDNA 2 yet, it was "they're hiding something, they can't compete with Nvidia, they will only get to 3070 performance at best." Note that all these comments were about rasterization. Now that AMD flogs NV in rasterization, the goalposts have shifted to ray tracing being "the only thing that matters"... in the six games it is in, for how many years since Turing launched?

Now, when AMD shows good ray-tracing performance and silences the detractors in ray tracing as they did with rasterization, what will be the next goal-shifting point?

AMD doesn't flog Nvidia in rasterization. Come on, dude. Who cares about any FPS above 60 right now? That's not important. The big challenge has always been RT performance, as that's the thing that will push visuals further this generation. We can only go so far with static light-probe GI, fake cube maps, SSR, and all the other nifty screen-space tricks we've had to stomach for 7 years. If AMD wins in RT performance, I'll be completely surprised and they'll get mad props. But if they lose, I'd better not get lame responses like "RT isn't even important now." That would be moving the goalposts big time.
 

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,409
805
DLSS, obviously. And when AMD comes out with their own alternative, it will boil down to nitpicking screenshots at 100x zoom to find one pixel of difference, and then claiming that single pixel makes nVidia better.

I don't think DLSS is something that a software hack can remedy. I doubt it. Those Tensor cores and the entire algorithm took a lot of R&D. It's not some simple "driver" optimization that will take care of it in one fell swoop. Nope. People are already downplaying the importance of DLSS. Amazing.
 

spyshagg

Should not be allowed to breed
May 13, 2006
1,657
410
1,620
I don't think DLSS is something that a software hack can remedy. I doubt it. Those Tensor cores and the entire algorithm took a lot of R&D. It's not some simple "driver" optimization that will take care of it in one fell swoop. Nope. People are already downplaying the importance of DLSS. Amazing.

You will wait and see the results like everybody else.

The work being done by AMD doesn't end with Navi 2. Neither does Nvidia's. Speaking the way you do, with pseudo-definitive claims, is just an invitation to show naivety and eat a lot of crow.

If the rasterization figures AMD claims hold true come the embargo lift, everybody should be happy, because everyone wins. Only a "fanboy" loses in this war.
 

Ascend

Member
Jul 23, 2018
3,537
5,154
585
Who cares about any FPS above 60 right now? That's not important.
It was important when the 5700 XT reached well beyond 60 fps but nVidia's cards were 10 fps faster...

The big challenge has always been RT performance, as that's the thing that will push visuals further this generation. We can only go so far with static light-probe GI, fake cube maps, SSR, and all the other nifty screen-space tricks we've had to stomach for 7 years. If AMD wins in RT performance, I'll be completely surprised and they'll get mad props. But if they lose, I'd better not get lame responses like "RT isn't even important now." That would be moving the goalposts big time.
We would've learned from the best ;)
But seriously... the first DLSS iteration was crap. nVidia's first RTX implementation was also not exactly something to write home about (it technically still isn't). But you want to slam AMD if their first implementation isn't immediately better than nVidia's latest?

I don't think DLSS is something that a software hack can remedy. I doubt it. Those Tensor cores and the entire algorithm took a lot of R&D. It's not some simple "driver" optimization that will take care of it in one fell swoop. Nope. People are already downplaying the importance of DLSS. Amazing.
Just like flowy hair couldn't have been done without HairWorks (i.e. tessellation)?

Leaving this here.
 
  • Fire
Reactions: llien

llien

Member
Feb 1, 2017
10,387
8,329
945
Those Tensor cores and the entire algorithm took a lot of R&D.
1.0 likely did; rendering stuff at 16K resolution in a datacenter, grabbing high-res resources, etc., is a helluva venture.

Too bad it failed.

And 2.0 is just, uh, TAA on steroids, I guess.
The blurriness is so obvious that it hurts to see people keep hyping the shit out of 2.0.

Oh, I was asked which of the two images is DLSS upscaling from 1440p to 4K and which is native 4K, by people who spent an undisclosed sum on a 3080.

Which of the two is blurry, I asked myself:


Anyhow, I don't buy the BS of calling upscaling by its target resolution. AMD should not waste much time on it; just bribe the likes of DB and get on.
 

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,409
805
You will wait and see the results like everybody else.

The work being done by AMD doesn't end with Navi 2. Neither does Nvidia's. Speaking the way you do, with pseudo-definitive claims, is just an invitation to show naivety and eat a lot of crow.

If the rasterization figures AMD claims hold true come the embargo lift, everybody should be happy, because everyone wins. Only a "fanboy" loses in this war.

I only speak like everyone else does. I am an Nvidia fan because I like their business model and their drivers are superior. But I don't care which GPU is faster than which when comparing PCs. That's just stupid. I've never heard of a PC warrior war before.

In any case, even if AMD has great RT performance, I'd be very leery of buying their cards, as their drivers have always had problems with several AAA games during a generation. But I'll be happy for both GPUs to succeed, for sure.
 

sleepnaught

Member
Jan 12, 2017
1,077
331
400
Crazy how AMD is able to be competitive with both Intel and Nvidia simultaneously. Hats off to them. If they can come up with their own DLSS solution, I'll be team red all the way. But realistically, it's all going to come down to who has something in stock first!
 
  • Thoughtful
Reactions: Insane Metal

VFXVeteran

Professional Victim (Vetted)
Nov 5, 2019
5,460
12,409
805
1.0 likely did; rendering stuff at 16K resolution in a datacenter, grabbing high-res resources, etc., is a helluva venture.

Too bad it failed.

And 2.0 is just, uh, TAA on steroids, I guess.
The blurriness is so obvious that it hurts to see people keep hyping the shit out of 2.0.

Oh, I was asked which of the two images is DLSS upscaling from 1440p to 4K and which is native 4K, by people who spent an undisclosed sum on a 3080.

Which of the two is blurry, I asked myself:


Anyhow, I don't buy the BS of calling upscaling by its target resolution. AMD should not waste much time on it; just bribe the likes of DB and get on.

There is a lot that goes on in ML and image reconstruction. I took a two-day course on it when Nvidia gave it out at our company. It's not like a traditional upscale algorithm (which I've implemented several times before). Sometimes I wish a few of you guys would actually take courses on this stuff to understand the deeper aspects of it, instead of reading articles that try to portray the overall picture.

I haven't seen DLSS to be very blurry; it depends on the mode you set it to. In a game like Avengers, "balanced" mode seems to give a very good result while increasing FPS considerably. It all just depends on how many iterations you want to run for the ML.
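For contrast, here is roughly what the "traditional upscale algorithm" mentioned above boils down to: a plain nearest-neighbor resample with no temporal data and no learned model (a toy sketch, nothing to do with DLSS internals):

```python
# Toy nearest-neighbor upscale: each output pixel copies the closest input
# pixel. No motion vectors, no history buffer, no neural network -- which
# is exactly what separates classic upscaling from DLSS-style reconstruction.
def upscale_nearest(img, factor):
    h, w = len(img), len(img[0])
    return [[img[y // factor][x // factor]
             for x in range(w * factor)]
            for y in range(h * factor)]

small = [[1, 2],
         [3, 4]]
print(upscale_nearest(small, 2))
# Each source pixel becomes a 2x2 block; no new detail is invented.
```

A reconstruction approach instead feeds previous frames and motion vectors into a trained network to synthesize detail the single low-res frame never contained, which is why the two are not comparable pixel-for-pixel.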
 

Rentahamster

Rodent Whores
Jun 26, 2007
45,254
17,729
1,910
Best Coast
It's really a cycle.....When AMD had not officially shown RDNA 2 yet, "they were hiding something, they can't compete with Nvidia, they will only get to 3070 performance at best" Note all these comments were based on rasterization. Now that AMD flogs NV in rasterization, the goalposts have now shifted to raytracing "is the only thing that matters"......in the 6 games it is in for how many years since Turing launched?

Now when AMD shows good raytracing performance and silences the detractors in raytracing as they did with rasterization, what will be the next goal-shifting point made?
How about you wait to believe something until it's actually proven with evidence by independent third parties?
 

llien

Member
Feb 1, 2017
10,387
8,329
945
I haven't seen DLSS to be very blurry

Define "very".

Ars Technica showed an entire Death Stranding frame being blurred out when the mouse was moving quickly, as one would expect from a TAA derivative.

The pics I was asked about were in "quality" mode, which is the green way of not telling users "this image has been upscaled from 1440p to 4K," I guess.
 

llien

Member
Feb 1, 2017
10,387
8,329
945
How about you wait to believe something until it's actually proven with evidence by independent third parties?
People who used to (rightly so) not trust what J. Huang says should remember that Lisa is very different. You won't find a single lie. When it is X%, it is X%, no reservations.
 

Ascend

Member
Jul 23, 2018
3,537
5,154
585
Let's get back to more important matters....



 

Rentahamster

Rodent Whores
Jun 26, 2007
45,254
17,729
1,910
Best Coast
People who used to (rightly so) not trust what J. Huang says should remember that Lisa is very different. You won't find a single lie. When it is X%, it is X%, no reservations.
Corporations selling you a product are self-interested in promoting it in the best light possible. They present mostly the contexts in which they look good and leave out the contexts that make them look not so good.

Your duty as a consumer is to take marketing claims with a grain of salt until independent verification arrives. Take off your fanboy hat and realize that they ALL do this.

See Linus story time:

 

BluRayHiDef

Banned
Aug 21, 2015
4,000
5,322
830
DLSS, obviously. And when AMD comes out with their own alternative, it will boil down to nitpicking screenshots at 100x zoom to find one pixel of difference, and then claiming that single pixel makes nVidia better.

Like VFXVeteran said, I don't see how AMD's alternative can be as effective as DLSS if RDNA 2 doesn't have dedicated hardware for AI upscaling. At best, I think AMD may have components for this task in the CUs, but that wouldn't be as effective as cores designed solely for it, such as Turing and Ampere's Tensor cores. The CUs already have components for rasterization and ray tracing, so components for yet another task would stretch them thin in terms of their workloads.
 

DaGwaphics

Member
Dec 29, 2019
3,980
5,320
540
A 6800 is about 16-18% stronger than a 2080 Ti. The 2080 Ti is a stronger card than the 3070, and it has more RAM too, so the 6800 beats the 3070 even more so. Sony releasing a 6700 XT to beat the 2080 Ti by 5%, with 12GB of VRAM at $479-499, is going to be the obvious strategy... the 6700 XT will win in every scenario... I can see the 6700 XT coming with 48 CUs...
The 6700 XT/6700 is N22, right? So 40 CUs on that one?
 

llien

Member
Feb 1, 2017
10,387
8,329
945
Corporations selling you a product are self-interested...
That is not what you are trying to say.
What you wanted to say was "they are all the same."

No, they are not.

J. Huang finds lies (among other things) acceptable; L. Su does not.


Huang is, actually, quite notable for being bananas. The Ampere fiasco is largely due to him picking fights with TSMC.
 
Dec 28, 2006
1,626
1,244
1,560
I don't see how AMD's alternative can be as effective as DLSS if RDNA 2 doesn't have dedicated hardware for AI upscaling. At best, I think AMD may have components for this task in the CUs, but that wouldn't be as effective as cores designed solely for it, such as Turing and Ampere's Tensor cores. The CUs already have components for rasterization and ray tracing, so components for yet another task would stretch them thin in terms of their workloads.

That may not necessarily be the case. Back in 2019, Microsoft discussed their DirectML API and feature set.

Here is a discussion/demo of Super Resolution from 2019:


An interesting part I'll include below:



DLSS is great; however, the Tensor cores that run it seem to be based on the TensorFlow model, which according to MS was not designed for real-time, high-framerate scenarios. So their solution doesn't seem to need Tensor cores at all. It will be interesting to see how it eventually performs.

I don't know if AMD's version is simply this or some custom version, as AMD do mention their solution being cross-platform and open source, so I have no idea what will happen. Looking forward to seeing it in action. Something cool is that AMD/MS Super Resolution should also work on Nvidia GPUs.
 
  • Like
Reactions: llien and Ascend

BluRayHiDef

Banned
Aug 21, 2015
4,000
5,322
830
That may not necessarily be the case. Back in 2019, Microsoft discussed their DirectML API and feature set.

Here is a discussion/demo of Super Resolution from 2019:


An interesting part I'll include below:



DLSS is great; however, the Tensor cores that run it seem to be based on the TensorFlow model, which according to MS was not designed for real-time, high-framerate scenarios. So their solution doesn't seem to need Tensor cores at all. It will be interesting to see how it eventually performs.

I don't know if AMD's version is simply this or some custom version, as AMD do mention their solution being cross-platform and open source, so I have no idea what will happen. Looking forward to seeing it in action. Something cool is that AMD/MS Super Resolution should also work on Nvidia GPUs.
I'm pretty sure that Nvidia designed their implementation of the TensorFlow model to compensate for the fact that it wasn't originally intended for real-time use. Additionally, there's no way a software solution can ever match or beat a hardware solution, especially one designed exclusively for a single purpose.
 
Dec 28, 2006
1,626
1,244
1,560
I'm pretty sure that Nvidia designed their implementation of the TensorFlow model to compensate for the fact that it wasn't originally intended for real-time use. Additionally, there's no way a software solution can ever match or beat a hardware solution, especially one designed exclusively for a single purpose.

You do realize that DLSS, ray tracing, etc. are software features, right? They are programmed against an API and then accelerated via specialized hardware. RDNA 2 does not have Tensor cores, but it does have the ability to run lower-precision FP operations that ML applications can use, rather than full FP32.
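The lower-precision point can be illustrated in plain numpy: halving the precision halves the bytes moved per element, which is where much of the inference speedup comes from on GPUs with packed-math support (a generic illustration, not RDNA 2-specific code):

```python
import numpy as np

# A tiny "layer" evaluated in FP32 and FP16. Same math here, but the FP16
# weights occupy half the memory; hardware with packed FP16 can also issue
# twice the ops per clock on them.
w = np.ones((4, 4), dtype=np.float32)
x = np.ones(4, dtype=np.float32)

y32 = w @ x                                              # FP32 path
y16 = (w.astype(np.float16) @ x.astype(np.float16)).astype(np.float32)

print(y32.tolist(), y16.tolist())              # identical for these values
print(w.nbytes, w.astype(np.float16).nbytes)   # 64 bytes vs 32 bytes
```

The trade-off, of course, is reduced numeric range and precision, which is why ML inference tolerates FP16 while general shading sticks to FP32.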
 

BluRayHiDef

Banned
Aug 21, 2015
4,000
5,322
830
You do realize that DLSS, ray tracing, etc. are software features, right? They are programmed against an API and then accelerated via specialized hardware. RDNA 2 does not have Tensor cores, but it does have the ability to run lower-precision FP operations that ML applications can use, rather than full FP32.

Yes, but is there dedicated hardware to run that software? If not, then the CUs will have to compromise their rasterization and ray-tracing performance to do the AI upscaling, which will not be as effective as dedicated Tensor cores.