
Cringe Another DOA from AMD Radeon (SuperResolution)

M1chl

Currently Gif and Meme Champion
Dec 25, 2019
Prague, Czech Republic


SlimySnake

Member
Feb 5, 2013


It's DLSS 1.0. AMD doesn't have tensor cores, so they can't use them the way Nvidia does with Ampere cards to make DLSS 2.0 look so good. Pretty sure RDNA 3.0 cards will have something equivalent to tensor cores to do the heavy lifting.

BTW, what was their first DoA? Ray tracing? It's really not that common at the moment, and consoles seem to be doing just fine with it. Look at the UE5 results: AMD is doing better than Nvidia cards. AMD cards with low-level API support often outdo Nvidia cards that are priced much higher.

I think AMD has been doing just fine. They are both selling out. I'd say it's like MS vs Sony: PS5s might be selling more, but it doesn't mean Xbox consoles aren't selling out. They are both wildly successful and hardly DoA.
 

M1chl

Currently Gif and Meme Champion
Dec 25, 2019
Prague, Czech Republic
It's DLSS 1.0. AMD doesn't have tensor cores, so they can't use them the way Nvidia does with Ampere cards to make DLSS 2.0 look so good. Pretty sure RDNA 3.0 cards will have something equivalent to tensor cores to do the heavy lifting.

BTW, what was their first DoA? Ray tracing? It's really not that common at the moment, and consoles seem to be doing just fine with it. Look at the UE5 results: AMD is doing better than Nvidia cards. AMD cards with low-level API support often outdo Nvidia cards that are priced much higher.

I think AMD has been doing just fine. They are both selling out. I'd say it's like MS vs Sony: PS5s might be selling more, but it doesn't mean Xbox consoles aren't selling out. They are both wildly successful and hardly DoA.
The whole previous GPU series, like the 5700XT, lacked the features that were current back then, which nVidia had. The new cards are somewhat decent, but the last great one was something like the 7900XT, I believe. Nobody cares if you're stuck back in time; look at Intel, for example.
 

SlimySnake

Member
Feb 5, 2013
The whole previous GPU series, like the 5700XT, lacked the features that were current back then, which nVidia had. The new cards are somewhat decent, but the last great one was something like the 7900XT, I believe. Nobody cares if you're stuck back in time; look at Intel, for example.
But again, how many ray tracing games were out back in 2019 when the 5700XT came out? Metro, Battlefield V, and what else? Control came out a few months later, and then nothing until the 6000 series launched. DLSS 1.0 was shit, as we can now see from this AMD solution, which is pretty much the same thing. So it was three ray-traced games vs. dozens of others that offered equivalent or better performance for a better price, especially in games that support the Vulkan API.

Looking at the April Steam Spy charts, the 5700XT has a higher market share than the 2080, 3080, and 2080 Super.

 
  • Like
Reactions: M1chl

kinjx11

Banned
Jun 25, 2018
They called it FSR 1.0 for a reason. I think it will get better with time. Most importantly, it's usable on all GPUs, not exclusive to a certain model, so they've got that going for them.
 

Clear

Member
Feb 2, 2009
Feels like people are missing the point here.

The idea is simply to facilitate higher frame rates by rendering lower and then upscaling. It's not a solution for the highest fidelity; it's about mitigating fidelity loss versus performance gain.
That it's available on a wider range of hardware is the "win" versus DLSS 2, not whether it can match it in perceptual quality.

Also, if it's easy to integrate with existing engines, or at least less intrusive than DLSS 2, that's again a "win" for uptake.

It doesn't have to be better. Just good enough, and a more generally applicable solution.
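The frame-rate/fidelity trade being described here is easy to quantify. A minimal Python sketch, assuming FSR 1.0's published per-axis scale factors (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x), shows how much shading work each mode saves at a 4K output target:

```python
# How much GPU work a spatial upscaler can save: shaded pixels shrink
# with the square of the per-axis scale factor.
TARGET = (3840, 2160)  # 4K output

# FSR 1.0 quality-mode scale factors (per axis), per AMD's announcement.
MODES = {"Ultra Quality": 1.3, "Quality": 1.5, "Balanced": 1.7, "Performance": 2.0}

def render_resolution(target, scale):
    """Internal resolution the game actually renders at."""
    w, h = target
    return round(w / scale), round(h / scale)

for mode, scale in MODES.items():
    rw, rh = render_resolution(TARGET, scale)
    saved = 1 - (rw * rh) / (TARGET[0] * TARGET[1])
    print(f"{mode}: render {rw}x{rh}, ~{saved:.0%} fewer shaded pixels")
    # e.g. Performance mode renders 1920x1080, ~75% fewer shaded pixels
```

Whether the reclaimed GPU time turns into frames depends on the game, but the headroom on the table is substantial.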
 

RoboFu

Member
Oct 10, 2017
Let's wait until real people get their hands on it. Some of these images look shady to me.
 
  • Like
Reactions: Amiga

Reindeer

Member
Dec 29, 2019
From the screenshots AMD released, it looks absolutely terrible. Devs are better off using their own forms of reconstruction/upscaling built into their engines, at least until there's something better from AMD.
 

IntentionalPun

Ask me about my wife's perfect butthole
Aug 28, 2019
Let's wait until real people get their hands on it. Some of these images look shady to me.
It's from AMD's 4K video on their YouTube channel...

I agree we should wait for more, but this is how they chose to show it off.

What's "shady," if anything, is that they mostly compared unlike things: instead of being able to compare the same imagery, we get different sides of the screen with different lighting for native vs. FSR.
 

Kazekage1981

Member
Apr 7, 2019

I think it starts at 3:40 in his video.

I think they are releasing FidelityFX for developers to use now, so it will automatically look awesome on RDNA 3. I think it's a stepping stone for RDNA 3.
 
  • Triggered
  • LOL
Reactions: GHG and M1chl

RoboFu

Member
Oct 10, 2017
Why would AMD shoot themselves in the foot and release "shady" pictures of it in action?
It's obviously blown up from a smaller YouTube video image. Even the native side is blurry as hell.
 
Dec 14, 2008
Imagine thinking that just rendering at a lower resolution, scaling it up with a basic upscaling algorithm, and applying a generic ReShade-class sharpening filter would be as good as DLSS. This is even worse than checkerboard rendering or some other variant of TAA.

To the shock of literally no one, Nvidia wins again.
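For reference, the "basic upscaler plus sharpening filter" pipeline being mocked here really is only a few lines. A toy 1-D Python sketch below uses nearest-neighbour repetition plus an unsharp mask; this is a simplification, not AMD's actual EASU/RCAS passes, which are edge-adaptive:

```python
def upscale_nearest(row, factor=2):
    # Nearest-neighbour upscale of a 1-D scanline: repeat each sample.
    return [v for v in row for _ in range(factor)]

def sharpen(row, amount=0.5):
    # Unsharp mask: push each sample away from its local average,
    # then clamp back into the displayable [0, 1] range.
    out = []
    for i, v in enumerate(row):
        left = row[max(i - 1, 0)]
        right = row[min(i + 1, len(row) - 1)]
        blurred = (left + v + right) / 3.0
        out.append(min(1.0, max(0.0, v + amount * (v - blurred))))
    return out

scanline = [0.0, 0.0, 1.0, 1.0]  # a hard edge at half resolution
print(sharpen(upscale_nearest(scanline)))
```

The clamp is what keeps the sharpening overshoot from becoming visible halos; without it, the samples on either side of the edge would ring past 0 and 1, which is exactly the artifact cheap sharpening filters are criticized for.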
 
  • Like
Reactions: Neys

M1chl

Currently Gif and Meme Champion
Dec 25, 2019
Prague, Czech Republic
Imagine thinking that just rendering at a lower resolution, scaling it up with a basic upscaling algorithm, and applying a generic ReShade-class sharpening filter would be as good as DLSS. This is even worse than checkerboard rendering or some other variant of TAA.

To the shock of literally no one, Nvidia wins again.
I didn't follow this development, so for me it was a slight shock...
 

PhoenixTank

Member
Jul 13, 2017
We kinda have a thread 🤷‍♂️
 

M1chl

Currently Gif and Meme Champion
Dec 25, 2019
Prague, Czech Republic

ethomaz

is mad because DF didn't do a video on a video of a video of a video on PS5
Mar 19, 2013
Brazil
It indeed looks bad.
 

SlimySnake

Member
Feb 5, 2013
Your own post with YouTube videos shows Ampere faster than RDNA 2 in Unreal 5 at 4K.
Yeah, the $1,500, 35 TFLOPs Ampere card is outperforming a $1,000, 23 TFLOPs AMD card at 4K.

But at 1440p, ALL Ampere Nvidia cards are underperforming.

The $579 6800, a 17 TFLOPs GPU, is almost on par with the $700 MSRP 3080, a 30 TFLOPs GPU.
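The headline TFLOPs figures being thrown around come from simple arithmetic: shader units x 2 FMA ops per clock x boost clock. A quick Python sketch, using launch specs from memory (treat the core counts and clocks as assumptions, not gospel):

```python
def peak_tflops(shaders, boost_ghz):
    # Peak FP32 throughput: each shader unit retires one fused
    # multiply-add (2 ops) per clock at the boost frequency.
    return shaders * 2 * boost_ghz / 1000

# Launch MSRPs and specs as commonly reported at the time (assumed).
cards = {
    "RX 6800  ($579)":  (3840, 2.105),
    "RTX 3080 ($699)":  (8704, 1.710),
    "RTX 3090 ($1499)": (10496, 1.695),
}
for name, (shaders, clock) in cards.items():
    print(f"{name}: ~{peak_tflops(shaders, clock):.1f} TFLOPs")
```

This also shows why the number is a poor cross-vendor yardstick: it counts theoretical ALU throughput only, and says nothing about how well each architecture keeps those units fed.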
 

Reindeer

Member
Dec 29, 2019
Last person to believe.
It's kinda sad how he's still hyping it up after we all saw how bad it looks. I guess it should be expected coming from him, since he's been downplaying DLSS and hyping up AMD's solution for a long time.
 
  • Like
Reactions: M1chl

Reindeer

Member
Dec 29, 2019
Yeah, the $1,500, 35 TFLOPs Ampere card is outperforming a $1,000, 23 TFLOPs AMD card at 4K.

But at 1440p, ALL Ampere Nvidia cards are underperforming.

The $579 6800, a 17 TFLOPs GPU, is almost on par with the $700 MSRP 3080, a 30 TFLOPs GPU.
Why are you bringing up Ampere TFLOPs and comparing them to RDNA2 when we all know they don't scale 1:1? There have been numerous videos and articles debunking this teraflop comparison between different architectures.
 

SlimySnake

Member
Feb 5, 2013
Why are you bringing up Ampere Tflops and comparing them to RDNA2 when we all know that they don't scale 1:1? There's been numerous videos and articles done debunking this teraflop comparison between different architectures.
Because Nvidia is marketing them and using them to overprice their cards?

The fact of the matter is that the UE5 demo runs better on AMD cards that are priced lower. Does this mean they run every game better? No. But AMD is by no means DoA and is doing very well keeping up with Nvidia.
 
Apr 11, 2016
Yeah, the $1,500, 35 TFLOPs Ampere card is outperforming a $1,000, 23 TFLOPs AMD card at 4K.

But at 1440p, ALL Ampere Nvidia cards are underperforming.

The $579 6800, a 17 TFLOPs GPU, is almost on par with the $700 MSRP 3080, a 30 TFLOPs GPU.


You do know that site doesn't actually test what it shows, right? They most likely use an algorithm to approximate results. I would suggest not using that site for benchmarks, as it's not accurate.

You have actual videos of the Unreal engine running on both a 3090 and a 6900XT. Why look at something else? You can put it this way: a $1,500 card outperforms a $1,000 one. But looking at those results, a $700 3080 would be on par with the $1,000 6900XT.

These cards are only fully flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.
 
  • Triggered
  • Like
Reactions: Neys and Buggy Loop

Reindeer

Member
Dec 29, 2019
Because Nvidia is marketing them and using them to overprice their cards?

The fact of the matter is that the UE5 demo runs better on AMD cards that are priced lower. Does this mean they run every game better? No. But AMD is by no means DoA and is doing very well keeping up with Nvidia.
These companies false-advertise all the time, but I'm not sure it's a good idea to use their propaganda as some form of comparison. Some people here don't understand that teraflops between these architectures don't scale 1:1 and could therefore be misled.
 

SlimySnake

Member
Feb 5, 2013
These cards are only fully flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.
We will have to agree to disagree. This demo is pretty much our first taste of next-gen engines, and if it's struggling to do native 1440p at 60 fps, then clearly there is a bottleneck here, and that bottleneck is GPU horsepower.

You talked about the videos I posted in the other thread, and they clearly show how low the CPU and VRAM usage is, so the bottleneck is the GPU.
 

SlimySnake

Member
Feb 5, 2013
These companies false-advertise all the time, but I'm not sure it's a good idea to use their propaganda as some form of comparison. Some people here don't understand that teraflops between these architectures don't scale 1:1 and could therefore be misled.
There are two other comparisons in my post. Pick whichever one you think fits better: the price point, which gives AMD the edge, or the actual GPU benchmarks, which also give AMD the edge.
 

FireFly

Member
Aug 5, 2007
These cards are only fully flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.
It would be pretty bad if UE5 were CPU-bottlenecked at 60 FPS displaying a largely static scene.
 
  • LOL
Reactions: Andodalf
Apr 11, 2016
It would be pretty bad if UE5 were CPU-bottlenecked at 60 FPS displaying a largely static scene.

It's not one thing; it's probably all the components when running at low resolutions. Instead of relying fully on the GPU, you're tripping over the rest of the system on the way. There are reviews for these cards that don't even include 1080p anymore.
 

Kenpachii

Member
Mar 23, 2018


Both look blurry as fuck. Anyway, it looks fine to me in that screenshot. If it gives my 1080 Ti 40% more performance, I'm all for it.
 

Buggy Loop

Member
Jun 9, 2004
Quebec, Canada
You do know that site doesn't actually test what it shows, right? They most likely use an algorithm to approximate results. I would suggest not using that site for benchmarks, as it's not accurate.

You have actual videos of the Unreal engine running on both a 3090 and a 6900XT. Why look at something else? You can put it this way: a $1,500 card outperforms a $1,000 one. But looking at those results, a $700 3080 would be on par with the $1,000 6900XT.

These cards are only fully flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.

I didn't know sites faked results... that's shocking. Tons of YouTube videos do, though: just estimated performances, and they are cancer.
 

FireFly

Member
Aug 5, 2007
It's not one thing; it's probably all the components when running at low resolutions. Instead of relying fully on the GPU, you're tripping over the rest of the system on the way. There are reviews for these cards that don't even include 1080p anymore.
Well, if you are limited by something other than the GPU, then you shouldn't see GPU scaling, which we do see in those benchmarks at least. The reason for excluding 1080p in a GPU review is that the cards don't scale at all at that resolution, so the benchmark isn't telling you anything useful. But enable ray tracing and that can change completely: it's hardly useless to test Cyberpunk 2077 at 1080p with all ray tracing effects on, for example, since we see huge differences at that resolution between a 3080/3070/3060, etc.

Lumen is effectively a software form of ray tracing that is extremely heavy, and from what Epic have said, it is the main limiter on the framerate. (Think about how they are targeting 1080p on 2070/2080-class hardware in the consoles.)
 