
Another DOA from AMD Radeon (Super Resolution)

M1chl

Currently Gif and Meme Champion


[Attachments: time-stamped video clip and screenshots]
 

M1chl

Currently Gif and Meme Champion
Let's wait until we can try it ourselves, or at least until there are decent comparisons out there.
I've had my prediction for a pretty long time anyway.

And yes, after seeing that, I was just jerking off over my 3090FE box, totally worth it.
 

SlimySnake

Flashless at the Golden Globes


[Quoted attachments: time-stamped video clip and screenshots]

It's DLSS 1.0. AMD doesn't have tensor cores, so they can't use them the way Nvidia does with Ampere cards to make DLSS 2.0 look so good. Pretty sure RDNA 3.0 cards will have something equivalent to tensor cores to do the heavy lifting.

BTW, what was their first DOA? Ray tracing? It's really not that common at the moment, and the consoles seem to be doing just fine with it. Look at the UE5 results: it's doing better than on Nvidia cards. AMD cards with low-level API support often outdo Nvidia cards that are priced much higher.

I think AMD has been doing just fine. They are both selling out. I'd say it's like MS vs. Sony: PS5s might be selling more, but it doesn't mean Xbox consoles aren't selling out. They are both wildly successful and hardly DOA.
 

M1chl

Currently Gif and Meme Champion
It's DLSS 1.0. AMD doesn't have tensor cores, so they can't use them the way Nvidia does with Ampere cards to make DLSS 2.0 look so good. Pretty sure RDNA 3.0 cards will have something equivalent to tensor cores to do the heavy lifting.

BTW, what was their first DOA? Ray tracing? It's really not that common at the moment, and the consoles seem to be doing just fine with it. Look at the UE5 results: it's doing better than on Nvidia cards. AMD cards with low-level API support often outdo Nvidia cards that are priced much higher.

I think AMD has been doing just fine. They are both selling out. I'd say it's like MS vs. Sony: PS5s might be selling more, but it doesn't mean Xbox consoles aren't selling out. They are both wildly successful and hardly DOA.
The whole previous GPU series, like the 5700 XT, was missing features that were current back then and that Nvidia already had. The new cards are somewhat decent, but the last great one was something like the 7900 XT, I believe. Nobody cares if you're stuck in the past, like Intel, for example.
 

SlimySnake

Flashless at the Golden Globes
The whole previous GPU series, like the 5700 XT, was missing features that were current back then and that Nvidia already had. The new cards are somewhat decent, but the last great one was something like the 7900 XT, I believe. Nobody cares if you're stuck in the past, like Intel, for example.
But again, how many ray tracing games were out back in 2019 when the 5700 XT came out? Metro, Battlefield V, and what else? Control came out a few months later. And then nothing until the 6000 series launched. DLSS 1.0 was shit, as we can now see from this AMD solution, which is pretty much the same thing. So it was three ray-traced games vs. dozens of others that offered equivalent or better performance for a better price, especially in games that support the Vulkan API.

Looking at the April Steam Spy charts, it looks like the 5700 XT has a higher market share than the 2080, 3080, and 2080 Super.

 

kinjx11

Banned
They called it FSR 1.0 for a reason; I think it will get better with time. Most importantly, it's usable on all GPUs, not exclusive to a certain model, so they've got that going for them.
 

Clear

CliffyB's Cock Holster
Feels like people are missing the point here.

The idea is simply to facilitate higher frame-rates by rendering lower and then upscaling. It's not a solution for the highest fidelity; it's about mitigating fidelity loss versus performance gain.
That it's available on a wider range of hardware is the "win" versus DLSS 2, not whether it can match it in perceptual quality.

Also, if it's easily integrable with existing engines, or at least less intrusive than DLSS 2, again it's a "win" for uptake.

It doesn't have to be better. Just good enough and a more generally applicable solution.
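The arithmetic behind that trade-off is straightforward: shading cost scales roughly with the number of pixels rendered, so the internal render scale sets the savings. A minimal sketch (the 67% scale factor here is illustrative, not a claim about any actual FSR mode):

```python
def pixel_savings(target_w, target_h, render_scale):
    """Fraction of pixel-shading work saved vs. rendering natively,
    assuming cost scales with the number of pixels shaded."""
    native = target_w * target_h
    internal = int(target_w * render_scale) * int(target_h * render_scale)
    return 1 - internal / native

# e.g. a 4K output rendered internally at 67% per axis
saving = pixel_savings(3840, 2160, 0.67)
print(f"{saving:.0%} fewer pixels shaded")  # roughly 55%
```

That is where the frame-rate headroom comes from; the upscaler then spends a comparatively small, fixed cost turning the internal image back into the display resolution.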
 

RoboFu

One of the green rats
Let's wait until real people get their hands on it. Some of these images look shady to me.
 

Reindeer

Member
From the screenshots AMD released, it looks absolutely terrible. Devs are better off using their own forms of reconstruction/upscaling built into their engines, at least until there's something better from AMD.
 

IntentionalPun

Ask me about my wife's perfect butthole
Let's wait until real people get their hands on it. Some of these images look shady to me.
It's from AMD's 4K video on their YouTube channel...

I agree we should wait, but this is how they chose to show it off.

What's "shady", if anything, is that they mostly compared unlike things: instead of being able to compare the same imagery, we get different sides of the screen with different lighting for native vs. FSR.
 


I think it starts at 3:40 in the video.

I think they are releasing FidelityFX for developers to use now, so it will automatically look awesome on RDNA 3. I think it's a stepping stone for RDNA 3.
 
Imagine thinking that just rendering at a lower resolution, scaling it up with a basic upscaler algorithm, and applying a generic ReShade-class sharpening filter would be as good as DLSS. This is even worse than checkerboard rendering or some other variant of TAA.

To the shock of literally no one, Nvidia wins again.
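For what it's worth, the pipeline that post describes (a basic spatial upscale followed by a generic sharpening pass) can be sketched in a few lines of Python. This illustrates only that description, not AMD's actual algorithm; the nearest-neighbour filter and sharpen amount are made up for the example:

```python
def upscale_nearest(img, factor=2):
    """Nearest-neighbour upscale of a 2D grayscale image."""
    return [[img[y // factor][x // factor]
             for x in range(len(img[0]) * factor)]
            for y in range(len(img) * factor)]

def sharpen(img, amount=0.5):
    """Unsharp-mask style pass: push each pixel away from the mean of
    its 4 neighbours (edges clamped), then clamp to 0..255."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(h):
        for x in range(w):
            blur = (img[max(y - 1, 0)][x] + img[min(y + 1, h - 1)][x] +
                    img[y][max(x - 1, 0)] + img[y][min(x + 1, w - 1)]) / 4
            out[y][x] = min(255, max(0, round(img[y][x] + amount * (img[y][x] - blur))))
    return out

low_res = [[10, 200], [200, 10]]             # tiny 2x2 "render"
display = sharpen(upscale_nearest(low_res))  # 4x4 "output", edges boosted
```

Whether this family of technique holds up against a temporal, motion-vector-fed reconstruction like DLSS 2.0 is exactly what the thread is arguing about.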
 

M1chl

Currently Gif and Meme Champion
Imagine thinking that just rendering at a lower resolution, scaling it up with a basic upscaler algorithm, and applying a generic ReShade-class sharpening filter would be as good as DLSS. This is even worse than checkerboard rendering or some other variant of TAA.

To the shock of literally no one, Nvidia wins again.
I hadn't followed this development, so for me it was a slight shock...
 


SlimySnake

Flashless at the Golden Globes
Your own post with YouTube videos shows Ampere faster than RDNA 2 in Unreal 5 at 4K.
Yeah, the $1,500, 35-TFLOPs Ampere card is outperforming a $1,000, 23-TFLOPs AMD card at 4K.

But at 1440p ALL Ampere Nvidia cards are underperforming.

[1440p benchmark chart]


The $579 6800, a 17-TFLOPs GPU, is almost on par with the $700-MSRP 3080, a 30-TFLOPs GPU.
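Taking the post's own figures at face value (MSRPs and paper TFLOPs, which admittedly don't translate 1:1 across architectures), the spec gap is much larger than the price gap:

```python
# Figures as quoted in the post above; illustrative only.
amd_price, amd_tflops = 579, 17   # RX 6800
nv_price, nv_tflops = 700, 30     # RTX 3080

print(f"TFLOPs ratio: {nv_tflops / amd_tflops:.2f}x")  # 1.76x on paper
print(f"Price ratio:  {nv_price / amd_price:.2f}x")    # 1.21x
```

If raw TFLOPs predicted performance, the 3080 would be far ahead at every resolution; the dispute in this thread is precisely that the benchmarks don't show that.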
 

Reindeer

Member
Yeah, the $1,500, 35-TFLOPs Ampere card is outperforming a $1,000, 23-TFLOPs AMD card at 4K.

But at 1440p ALL Ampere Nvidia cards are underperforming.

[1440p benchmark chart]


The $579 6800, a 17-TFLOPs GPU, is almost on par with the $700-MSRP 3080, a 30-TFLOPs GPU.
Why are you bringing up Ampere TFLOPs and comparing them to RDNA 2 when we all know they don't scale 1:1? There have been numerous videos and articles debunking teraflop comparisons between different architectures.
 

SlimySnake

Flashless at the Golden Globes
Why are you bringing up Ampere TFLOPs and comparing them to RDNA 2 when we all know they don't scale 1:1? There have been numerous videos and articles debunking teraflop comparisons between different architectures.
Because Nvidia is marketing them and using them to overprice their cards?

The fact of the matter is that the UE5 demo is running better on AMD cards that are priced lower. Does this mean they run every game better? No. But AMD is by no means DOA and is doing very well keeping up with Nvidia.
 
Yeah, the $1,500, 35-TFLOPs Ampere card is outperforming a $1,000, 23-TFLOPs AMD card at 4K.

But at 1440p ALL Ampere Nvidia cards are underperforming.

[1440p benchmark chart]


The $579 6800, a 17-TFLOPs GPU, is almost on par with the $700-MSRP 3080, a 30-TFLOPs GPU.


You do know that site doesn't actually test what it shows, right? They most likely use an algorithm to approximate results. I would suggest not using that site for benchmarks, as it's not accurate.

There are actual videos of Unreal Engine running on both a 3090 and a 6900 XT. Why look at anything else? You can put it this way: a $1,500 card outperforms a $1,000 one. But looking at those results, a $700 3080 would be on par with the $1,000 6900 XT.

These cards are only really flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.
 

Reindeer

Member
Because Nvidia is marketing them and using them to overprice their cards?

The fact of the matter is that the UE5 demo is running better on AMD cards that are priced lower. Does this mean they run every game better? No. But AMD is by no means DOA and is doing very well keeping up with Nvidia.
These companies falsely advertise all the time, but I'm not sure it's a good idea to use their propaganda as a basis for comparison. Some people here don't understand that teraflops between these architectures don't scale 1:1 and could therefore be misled.
 

SlimySnake

Flashless at the Golden Globes
These cards are only really flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.
We will have to agree to disagree. This demo is pretty much our first taste of next-gen engines, and if it's struggling to do native 1440p at 60 fps, then clearly there is a bottleneck here, and that bottleneck is GPU horsepower.

You talked about the videos I posted in the other thread, and they clearly show how low the CPU and VRAM usage is, so the bottleneck is the GPU.
 

SlimySnake

Flashless at the Golden Globes
These companies falsely advertise all the time, but I'm not sure it's a good idea to use their propaganda as a basis for comparison. Some people here don't understand that teraflops between these architectures don't scale 1:1 and could therefore be misled.
There are two other comparisons in my post. Pick whichever one you think fits better: the price point, which gives AMD the edge, or the actual GPU benchmarks, which also give AMD the edge.
 
It would be pretty bad if UE5 were CPU-bottlenecked at 60 fps while displaying a largely static scene.

It's not one thing; it's probably all the components when running at low resolutions. Instead of relying fully on the GPU, you're tripping over the rest of the system along the way. There are reviews for these cards that don't even include 1080p anymore.
 

Kenpachii

Member
[FSR comparison screenshot]


Both look blurry as fuck. Anyway, it looks fine to me in that screenshot. If it gives my 1080 Ti 40% more performance, I am all for it.
 

Buggy Loop

Member
You do know that site doesn't actually test what it shows, right? They most likely use an algorithm to approximate results. I would suggest not using that site for benchmarks, as it's not accurate.

There are actual videos of Unreal Engine running on both a 3090 and a 6900 XT. Why look at anything else? You can put it this way: a $1,500 card outperforms a $1,000 one. But looking at those results, a $700 3080 would be on par with the $1,000 6900 XT.

These cards are only really flexed at 4K, where other bottlenecks are eliminated. It's of little importance what they do at 1080p or 1440p.

I didn't know sites faked results... that's shocking. Tons of YouTube videos do, though, just estimating performance, and they are cancer.
 

FireFly

Member
It's not one thing; it's probably all the components when running at low resolutions. Instead of relying fully on the GPU, you're tripping over the rest of the system along the way. There are reviews for these cards that don't even include 1080p anymore.
Well, if you are limited by something other than the GPU, then you shouldn't see GPU scaling, which we do see in those benchmarks at least. The reason for excluding 1080p in a GPU review is that the cards don't scale at all at that resolution, so the benchmark isn't telling you anything useful. But suddenly enable ray tracing and that can change completely. It's hardly useless to test Cyberpunk 2077 at 1080p with all ray tracing effects on, for example, since we see huge differences at that resolution between a 3080/3070/3060, etc.

Well, Lumen is effectively a software form of ray tracing that is extremely heavy, and from what Epic has said, it is the main limiter on the framerate. (Think about how they are targeting 1080p on the 2070/2080-class hardware in the consoles.)
 