
AMD FidelityFX Super Resolution (FSR) review roundup

llien

Member







Overall findings:
  • quite good at Ultra Quality, close to DLSS 2 give or take
  • much worse at lower quality settings
  • runs not only on the announced GPUs, but also on much older hardware
  • very easy to integrate into a game
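For reference, the quality tiers discussed throughout the thread map to fixed per-axis scale factors in FSR 1.0 (Ultra Quality 1.3x, Quality 1.5x, Balanced 1.7x, Performance 2.0x, per AMD's published presets). A quick sketch of the internal render resolutions this implies (the helper function is mine, not AMD's API):

```python
# FSR 1.0 renders at a reduced internal resolution per axis, then
# upscales (EASU) and sharpens (RCAS) to the display resolution.
# Scale factors below are AMD's published FSR 1.0 quality presets.
FSR_SCALE = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_resolution(display_w, display_h, mode):
    """Internal resolution FSR upscales from for a given quality mode."""
    s = FSR_SCALE[mode]
    return round(display_w / s), round(display_h / s)

# e.g. at a 4K output target:
for mode in FSR_SCALE:
    print(mode, render_resolution(3840, 2160, mode))
```

So at 4K output, Ultra Quality renders at roughly 2954x1662 while Performance drops all the way to 1920x1080, which is why the findings above diverge so sharply between the top and bottom tiers.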

Recommended for Ampere users, sponsored by Nvidia:




Relevant:

 

Rikkori

Member
Friendly reminder that no publication talking about these technologies and comparing them is honest if it doesn't mention how DLSS fails at reconstructing some RT details like reflections before it starts waxing poetic about the magic of cloud A.I. and how FSR is "nothing but a dumb upscaler". Yes, I am in fact talking about the charlatans at Digital Foundry, but they're far from alone.

[attached screenshots: RT reflection detail comparison]
 

llien

Member
I bet this version of FSR was a knee-jerk reaction to DLSS 2.0 eating their lunch, and FSR 2.0 will get a longer baking time to actually give us a good option
Exactly what about "quality ultra" settings (from any review, bar the DF bovine feces) renders it "not a good option"?
 

Rikkori

Member
oh I see we are already in idiot town... get your tinfoil hats everyone!

are you saying the video uses fake images? because if not then it clearly looks worse than every other modern upscaling tech of any modern engine
They don't use fake images, but they only show you when it's worse, not when it's better; that's why it's misleading. It's as if they only showed you the ghosting artifacts of DLSS but never the benefits for texture detail and the like.

What do I mean? He tries to shill TAAU as a better alternative to FSR (and only shows FSR Performance, which we know is the worst version, not Ultra Quality as well), but a less Nvidia-sponsored outlet tested that out in an actually released game that's not in alpha and found that FSR is still better than even TAAU (also on UE4). So the conclusion of an honest man should be that the efficacy of each solution depends. Instead he only shows you the negative. Why? You'll have to figure out on your own why an Nvidia-sponsored outlet wants to show AMD in a worse light.

The TAAU vs FSR test in Godfall (through config tweaks):
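For context: in UE4 titles, TAAU is typically forced through config cvars rather than an in-game toggle. A minimal sketch of the kind of Engine.ini tweak such a test relies on, using the standard UE4 console variables (values are illustrative and not verified against Godfall specifically):

```ini
[SystemSettings]
; Switch the upscale path to TAAU (temporal upsample to output resolution)
r.TemporalAA.Upsampling=1
; Internal render resolution per axis; 77% roughly matches FSR Ultra Quality (1/1.3)
r.ScreenPercentage=77
```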
 
Last edited:

alucard0712_rus

Gold Member
Friendly reminder that no publication talking about these technologies and comparing them is honest if it doesn't mention how DLSS fails at reconstructing some RT details like reflections before it starts waxing poetic about the magic of cloud A.I. and how FSR is "nothing but a dumb upscaler". Yes, I am in fact talking about the charlatans at Digital Foundry, but they're far from alone.

[attached screenshots: RT reflection detail comparison]
Do you realize that DLSS operates at lower than native resolution? Of course those effects will be lower resolution...
However, maybe in the future DLSS will process half-res/quarter-res textures separately and those "artifacts" will be gone.
 

Rikkori

Member
Do you realize that DLSS operates at lower than native resolution? Of course those effects will be lower resolution...
However, maybe in the future DLSS will process half-res/quarter-res textures separately and those "artifacts" will be gone.
Yes I do, do you? Do you know what the word 'reconstruction' means? The point is those effects don't get treated at all, not that they're lower quality than native, of course they would be.
 

IntentionalPun

Ask me about my wife's perfect butthole
Kills their tech. (e.g. G-Sync)

They keep selling licenses for G-Sync compatible displays.. so not sure why you think the tech is killed.

My point was it's their strategy. I'd much prefer open solutions; but they go proprietary to sell hardware, and to sell licenses to their tech. It works for them; to deny that is delusional.
 

llien

Member
They keep selling licenses for G-Sync compatible displays.. so not sure why you think the tech is killed.
Why it is killed, mm, let me think.
Perhaps because VRR (open and free for all) is what AMD's "FreeSync" uses, and that useless G-Sync chip is nowhere to be found.

My point was it's their strategy.
My point is it undermines their own tech.
It was this kind of behavior by NV that killed OpenGL.
 

alucard0712_rus

Gold Member
Yes I do, do you? Do you know what the word 'reconstruction' means? The point is those effects don't get treated at all, not that they're lower quality than native, of course they would be.
So what's the problem then? You say no one mentioned it? Alex from DF has mentioned it several times, as well as the fact that AF (texture quality in the distance) will be lower if you use DLSS.
 

01011001

Banned
They don't use fake images, but they only show you when it's worse, not when it's better; that's why it's misleading. It's as if they only showed you the ghosting artifacts of DLSS but never the benefits for texture detail and the like.

What do I mean? He tries to shill TAAU as a better alternative to FSR (and only shows FSR Performance, which we know is the worst version, not Ultra Quality as well), but a less Nvidia-sponsored outlet tested that out in an actually released game that's not in alpha and found that FSR is still better than even TAAU (also on UE4). So the conclusion of an honest man should be that the efficacy of each solution depends. Instead he only shows you the negative. Why? You'll have to figure out on your own why an Nvidia-sponsored outlet wants to show AMD in a worse light.

The TAAU vs FSR test in Godfall (through config tweaks):


the linked video shows almost exactly the same thing as DF's video; the only difference is that he goes into how TAA produces more shimmer but a sharper image.

and if you go down to extreme cases he also thinks TAA looks noticeably better

So I really don't get what the issue is here? TAA shimmers a bit more but is sharper... wow, what a revelation.

also, even in the best-case scenario the difference is that FSR is slightly less sharp but also shimmers slightly less... and that's the best case compared to what a modern engine just comes with.

which again leads to the conclusion that FSR in its current form is basically useless, because most games already have a comparable or better solution, especially when it comes to 1/4-res upscaling. and even at ~70% to ~80% it's nothing to write home about
 

FingerBang

Member
I have watched a few of the videos above, and so far DF is the only one saying negative things about FSR. I don't think any of the videos above are made by "paid shills", and all the people who can't help themselves here should go eat a bag of dicks, but it's weird that they have such a different opinion.

Are any of the other videos above as negative as the DF one?
 
Last edited:

llien

Member
It doesn't matter if it is done for money or "favors", FingerBang.
DF was blatantly skewing reality in their "super early super preview" of 3080 trying to make it look faster than it was, and is now the only review bashing a tech by NV's competitor (and stakes are high here).

Do you think it is just a coincidence?
 

IntentionalPun

Ask me about my wife's perfect butthole
Why it is killed, mm, let me think.
Perhaps because VRR (open and free for all) is used on that 'FreeSync" by AMD, and that useless G-Sync chip is nowhere to be found.

It's.. not.. nowhere to be found.. they are still selling G-Sync licenses for compatible monitors (basically a logo lol), and G-Sync chips for fully supported monitors. TVs too.. not entirely sure why you think their tech is nowhere to be found. Go buy an LG CX TV in 2021 and you are giving nVidia some money.

Which is the point, to make money... their strategy.. AMD makes no money off of FreeSync.

My point is it undermines their own tech.
It was this kind of behavior by NV that killed OpenGL.

You have not made your point well at all, as you are just making shit up as usual.
 

assurdum

Banned
It doesn't matter if it is done for money or "favors", FingerBang.
DF was blatantly skewing reality in their "super early super preview" of 3080 trying to make it look faster than it was, and is now the only review bashing a tech by NV's competitor (and stakes are high here).

Do you think it is just a coincidence?
I could remember wrong, but he is always a lot more critical of AMD than he is of Nvidia.
 
Last edited:

BabyYoda

Banned
GN mentioned that it runs on pretty much anything (although not officially supported).


Microsoft announced XSeX will have it.
I don't see why Sony wouldn't.
If it is supported on my GTX 670, I'd love to experiment; I suspect it will be a waste of time though. I know, time for an upgrade... man, this card has served me well though, it's old reliable!
 

Soltype

Member
It's.. not.. nowhere to be found.. they are still selling g-sync licenses for compatible monitors (basically a logo lol), and g-sync chips for fully supported monitors. TV's too.. not entirely sure why you think their tech is nowhere to be found. Go buy an LG CX TV in 2021 and you are giving nVidia some money.

Which is the point, to make money... their strategy.. AMD makes no money off of FreeSync.



You have not made your point well at all, as you are just making shit up as usual.
I thought they stopped making monitors with the module?
 
Last edited:

ethomaz

Banned
I thought they stopped making monitors with the module?
They didn't... they have 3 separate tiers of G-Sync, of which 2 use the G-Sync module.
Monitors with G-Sync modules have launched in 2021, and nVidia said it is coming to TVs too.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
I thought they stopped making monitors with the module?
Honestly not sure; but when I researched G-Sync monitors like 3 months ago, a bunch of the recommended ones had the module. But maybe they weren't made particularly recently.

Quick google suggests some of the top monitors released in 2021 don't have it though. edit: Nevermind, like ethomaz says, I just found the highest-end monitor ASUS announced recently and it has the latest processor.. nVidia has expanded that if anything, as now they have a special processor that does G-Sync w/ HDR at peak brightness.... it's also not considered "useless" by non-AMD fanboys, as pretty much anyone sane agrees it's superior to FreeSync/VRR.

Either way, they are continuing to make money with it.. and expanding into TVs in 2021, 6 years after they introduced G-Sync.
 
Last edited:

ethomaz

Banned
Honestly not sure; but when I researched G-Sync monitors like 3 months ago, a bunch of the recommended ones had the module. But maybe they weren't made particularly recently.

Quick google suggests some of the top monitors released in 2021 don't have it though. edit: Nevermind, like ethomaz says, I just found the highest-end monitor ASUS announced recently and it has the latest processor.. nVidia has expanded that if anything, as now they have a special processor that does G-Sync w/ HDR at peak brightness.... it's also not considered "useless" by non-AMD fanboys, as pretty much anyone sane agrees it's superior to FreeSync/VRR.

Either way, they are continuing to make money with it.. and expanding into TVs in 2021, 6 years after they introduced G-Sync.
It's just that premium monitors see way fewer releases than budget ones... so people miss these... you see like one or two releases every few months... sometimes you even go 6 months without a premium monitor being released.
 
Last edited:

Reindeer

Member
Why are people with old GPUs so happy? Those GPUs are only going to be good for 1080p, and we can all see how terrible FSR is at 1080p and below. It's better than nothing, but at this point it's best to just upgrade your GPU when/if prices come back to normal.
 

IntentionalPun

Ask me about my wife's perfect butthole
Anyways not trying to derail about nVidia.

It's nice for devs to have options; particularly smaller devs who aren't dev'ing their own solutions who might be using engines that don't have them.

Whether this gets widely adopted is still an open question, but it's nice that anyone with just about any GPU will be able to use the option.

I don't even like DLSS personally and really hate what I'm seeing here with the softening effect of FSR, so I'll just continue to output native resolution lol
 

Soltype

Member
Honestly not sure; but when I researched G-Sync monitors like 3 months ago, a bunch of the recommended ones had the module. But maybe they weren't made particularly recently.

Quick google suggests some of the top monitors released in 2021 don't have it though. edit: Nevermind, like ethomaz says, I just found the highest-end monitor ASUS announced recently and it has the latest processor.. nVidia has expanded that if anything, as now they have a special processor that does G-Sync w/ HDR at peak brightness.... it's also not considered "useless" by non-AMD fanboys, as pretty much anyone sane agrees it's superior to FreeSync/VRR.

Either way, they are continuing to make money with it.. and expanding into TVs in 2021, 6 years after they introduced G-Sync.

It's just that premium monitors see way fewer releases than budget ones... so people miss these... you see like one or two releases every few months... sometimes you even go 6 months without a premium monitor being released.
Glad they stuck with the tech, shame more monitors don't use it.
 

Skifi28

Member
Why are people with old GPUs so happy? Those GPUs are only going to be good for 1080p, and we can all see how terrible FSR is at 1080p and below. It's better than nothing, but at this point it's best to just upgrade your GPU when/if prices come back to normal.
And because that could take months if not a couple of years, people are happy. You answered your own question there.
 

IntentionalPun

Ask me about my wife's perfect butthole
Name a single TV with GS chip.

I'm not just talking about the chips.. they make money off of anything certified as G-Sync compatible.

And not only is that not dead, it's expanding to TVs this year... I don't know if there are any TVs w/ chips coming.. but the chip isn't dead, as it's going into monitors, and the tech as a way to make a profit in licensing/software isn't going anywhere either.

And G-Sync absolutely does outdo FreeSync, particularly with the proprietary chip.

You are one strange character friend.
 
Last edited:
Why are people with old GPUs so happy? Those GPUs are only going to be good for 1080p, and we can all see how terrible FSR is at 1080p and below. It's better than nothing, but at this point it's best to just upgrade your GPU when/if prices come back to normal.
You've given the exact reason why people with old GPUs are happy: they can get more out of their aging graphics card until new GPU prices return to some form of normality.
 

IntentionalPun

Ask me about my wife's perfect butthole
I'd ask for receipts, but given your post history in this very thread, I'd better not.


but G-SYNC tends to offer a more polished and consistent platform for a premium (which means that the best monitors around are mostly G-SYNC), while capabilities can vary wildly between the wide range of FreeSync models found in every price range.

Biggest difference tends to be ghosting at lower FPS. It's not like the chip is doing nothing; I'm not saying FreeSync is terrible.. or bad.. it's great. G-Sync just outdoes it a bit.

I'm not gonna spend much time on "receipts"... I researched all of this when deciding to buy a monitor/graphics card last year... as a normal consumer.. who has no allegiance to any one brand.

And you said their chip is nowhere to be found.. that's.. making shit up, bro.
 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
All of these people are putting out videos.. and you can look for yourself and decide if it's something you like.

I dislike the heavily softened image I'm seeing; I don't quite understand why someone would look at that and proclaim it's as good or better than native.. but I am not going to call them a shill.. I just don't value their opinion particularly highly since at the very least, our tastes in what makes a better image differ.

YouTubers don't have to be shills to put out something overly positive that isn't analyzed all that well, either.

Instead of judging things by how many people have X opinion, judge for yourself.
 

Turk1993

GAFs #1 source for car graphic comparisons
It’s amazing how people become paid shills as soon as they criticize something you like.
So it seems like either you love FSR or you're a shill. It looked pretty shit at everything but Ultra. Where's my paycheck?
All of these people are putting out videos.. and you can look for yourself and decide if it's something you like.

I dislike the heavily softened image I'm seeing; I don't quite understand why someone would look at that and proclaim it's as good or better than native.. but I am not going to call them a shill.. I just don't value their opinion particularly highly since at the very least, our tastes in what makes a better image differ.

YouTubers don't have to be shills to put out something overly positive that isn't analyzed all that well, either.

Instead of judging things by how many people have X opinion, judge for yourself.

This guy is delusional; everything he praises about AMD GPUs, Nvidia does better. RT, DLSS, G-Sync and overall performance, Nvidia is a gen ahead, but he likes to call people paid shills when they choose the superior tech. He talks trash about RTX, G-Sync and DLSS, but when AMD comes with an inferior version of those he praises it like it's the best thing ever. Even if it looks worse (like now with FSR) he still keeps downplaying Nvidia's tech and believes that AMD trumped Nvidia lol. Don't take him too seriously, he is the guy that told me a year ago that a 5700 XT was a better GPU than a 2070 Super and that DLSS and ray tracing were not gonna make much difference in current-gen games. Look how that turned out lol.
 

Kangx

Member
*quite good at ultra quality, close to DLSS 2 give or take

This finding from the OP has me scratching my head after watching Digital Foundry.
Is this even true?
Did Alex say the same thing? He even questioned whether it should exist, at least at anything below Ultra.
My takeaway from this is that Ultra is just OK. Not anywhere near DLSS.
 

Armorian

Banned
*quite good at ultra quality, close to DLSS 2 give or take

This finding from the OP has me scratching my head after watching Digital Foundry.
Is this even true?
Did Alex say the same thing? He even questioned whether it should exist, at least at anything below Ultra.
My takeaway from this is that Ultra is just OK. Not anywhere near DLSS.

Of course llien thinks it's close to DLSS, and no one can change that. In reality it's like this:

Native =<> DLSS (depends on the game, can be better or worse) >> Temporal Upsampling >>> FSR >> Upscaling (bilinear)
 

Kenpachii

Member
DF's review is weird; why pit Performance, the absolute worst version, against TAAU when even AMD says to only use that mode if you absolutely need the performance in critical situations?

It's misleading as shit and just them shitting on FSR for absolutely no reason.

I checked all the reviews out, and frankly this is the conclusion I get out of it.

Ultra Quality is solid, gives a nice performance increase, and if you need a more stable framerate you can push lower settings when needed. It's a good DLSS 1.0-level solution as I see it, maybe even better, and DLSS 1 was pretty darn bad.

Then about DLSS vs FSR: DLSS 2 is superior, but does it matter when neither is available in the other's games? Not really. It's clear AMD tries to avoid direct comparisons with DLSS.

Good first start for AMD.
 
Last edited:

alucard0712_rus

Gold Member
This guy is delusional; everything he praises about AMD GPUs, Nvidia does better. RT, DLSS, G-Sync and overall performance, Nvidia is a gen ahead, but he likes to call people paid shills when they choose the superior tech. He talks trash about RTX, G-Sync and DLSS, but when AMD comes with an inferior version of those he praises it like it's the best thing ever. Even if it looks worse (like now with FSR) he still keeps downplaying Nvidia's tech and believes that AMD trumped Nvidia lol. Don't take him too seriously, he is the guy that told me a year ago that a 5700 XT was a better GPU than a 2070 Super and that DLSS and ray tracing were not gonna make much difference in current-gen games. Look how that turned out lol.
That's what I'm not getting too. They like to pretend that NVIDIA is some evil power, lol. Nvidia is years ahead of everyone for many reasons. Why such hate? Price? Sure. But I've got a 2070S instead of a 1080 Ti and I'm very happy about it. If I had any AMD GPU I would not be able to enjoy the games that I now enjoy (Minecraft RTX, Cyberpunk 2077 RTX, No Man's Sky VR with DLSS. Don't laugh at me, those are just a few).
 

CitizenZ

Banned
Isn't the best quote, "Simply turn down the graphics and settings"? Everything I saw looked very bad, and I'm not zooming in to count pixels.
 

Merkades

Member
Of the two I have watched.

Ignore - Level One Techs. They sound like paid shills; everything AMD is just awesome, Nvidia is evil. Also the video is out of focus, so don't try to adjust (if you watch it).

Worth Watching - Gamer's Nexus. They offer praise and concerns. Seems fairly reasonable.
 