
AMD FidelityFX Super Resolution (FSR) launches June 22nd

SilentUser

Member
Just watched the video in 4K. Honestly, FSR made Godfall really blurry on the GTX 1060. Will need to see more or perhaps the results will be far better on newer cards or AMD cards.
 
Kudos to AMD for demonstrating 1440p too. 4K is pretty easy to reconstruct given the massive amount of detail present, but 1440p and especially 1080p are much harder.

As expected, quality is not nearly as good as DLSS, but wide-ranging hardware support is a good thing. Here's the full picture in a bit better quality:

[Image: AMD-Fidelity-FX-Super-Resolution3-pcgh.jpg]

Didn't expect much to begin with, but this looks like the lowest setting on DLSS, and it's apparently Quality mode according to the text in the bottom right corner.

[GIF: No Way Abandon Thread]


Let's wait till the reviews are out for real comparisons.
 
Reminds me of DLSS in that anything less than 1440p/quality mode looks a bit wank :messenger_grinning_sweat:

Some speculate Nvidia artificially limits DLSS availability, using the tensor cores as an excuse. If this FidelityFX is comparable, it will be shameful for them.

Interesting that AMD have managed to provide something comparable to DLSS 1.0 without the need for proprietary tech, so it's hardware agnostic, and it's open source so devs can implement and iterate on it as much as they want. I certainly wasn't expecting them to show GTX 1060 performance in their announcement, and I wasn't expecting official support stretching back to Vega either. Sure, it's no DLSS 2.0, but I'm almost convinced everyone outside of RTX owners will overlook that for now, given it's something they can use
 

Papacheeks

Banned
Didn't expect much to begin with, but this looks like the lowest setting on DLSS, and it's apparently Quality mode according to the text in the bottom right corner.

[GIF: No Way Abandon Thread]


Let's wait till the reviews are out for real comparisons.

Let's also understand this is the first release and use of it, while DLSS has been out for years. So, similar to DLSS 1.0, which was not great, it's going to take time for developers to implement this.

Context is key: this software is brand new. Here's what NVIDIA's first version looked like relative to DLSS 2.0:

 

DonkeyPunchJr

World’s Biggest Weeb
Didn't expect much to begin with, but this looks like the lowest setting on DLSS, and it's apparently Quality mode according to the text in the bottom right corner.

[GIF: No Way Abandon Thread]


Let's wait till the reviews are out for real comparisons.
WTF that image is atrocious. I’m surprised they’d even show it off in this state. Guess I’ll wait for some more detailed analysis when the review sites get their hands on it, but I have zero expectations now.
 

Great Hair

Banned
The 4K60 video has to be poorly encoded. The whole GTX 1060 part of the video is laggy, with artifacts/tearing.

[Images: 1dUtQZ6.png, RWv605Q.png, eDGxNtD.png]
 

HeisenbergFX4

Gold Member
I know I just woke up and maybe my eyes aren't focusing great yet, but this looks blurry as shit to gain some frames.

Maybe my eyes will get better as the day goes on :)
 

octiny

Banned
Let's also understand this is the first release and use of it, while DLSS has been out for years. So, similar to DLSS 1.0, which was not great, it's going to take time for developers to implement this.

By that metric, you would think AMD would have foreseen the obvious issues. If it is to be improved, AMD will need to implement real-time motion vectors (like DLSS 2.0), which will nullify any developer profiles from the first iteration of FSR. So it's not so much the developers, but the underlying technology that needs to be improved. Developers would need to go back and implement newer profiles to allow FSR 2.0 to see real-time motion vectors within the game engine (still waiting on a SOTTR DLSS 2.0/2.1 update that will never come :messenger_loudly_crying: ). How much of a performance hit it would be compared to DLSS 2.0 remains to be seen. Unfortunately, it just looks like AMD rushed this out so they could say they have "something".

https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors
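
To make the motion-vector point concrete, here's a minimal numpy sketch of the difference (a toy with hypothetical function names, not AMD's or NVIDIA's actual code): a spatial upscaler only ever touches the current frame, while a temporal one needs per-pixel motion vectors that only the game engine can supply.

```python
import numpy as np

def spatial_upscale(frame, scale):
    # Spatial-only upscaling (FSR 1.0 style): needs nothing but the current
    # frame, so it bolts onto any game. Nearest-neighbour stands in here for
    # the real edge-adaptive filter + sharpening pass.
    return frame.repeat(scale, axis=0).repeat(scale, axis=1)

def temporal_upscale(frame, history, motion, scale):
    # Temporal reconstruction (DLSS 2.0 style): last frame's *output* is
    # reprojected along per-pixel motion vectors and blended in, which is
    # exactly the data only the game engine can provide.
    h, w = history.shape
    ys, xs = np.indices((h, w))
    prev_y = np.clip(ys - motion[..., 0], 0, h - 1).astype(int)
    prev_x = np.clip(xs - motion[..., 1], 0, w - 1).astype(int)
    reprojected = history[prev_y, prev_x]
    return 0.9 * reprojected + 0.1 * spatial_upscale(frame, scale)

low_res = np.random.rand(540, 960)     # 540p input frame
history = np.random.rand(1080, 1920)   # previous 1080p output
motion = np.zeros((1080, 1920, 2))     # per-pixel motion vectors from the engine
print(temporal_upscale(low_res, history, motion, 2).shape)  # (1080, 1920)
```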
 

truth411

Member
I think you guys are missing the biggest beneficiaries here. It's consoles. Isn't Godfall a game that favors AMD, since it's on consoles? (only a timed exclusive for PS5)

A 59% speedup using FSR Ultra Quality is a huge plus for consoles.

Expect both Microsoft and Sony first and second party studios to implement this.

Heck, with Unreal Engine 5 combined with FSR (Ultra Quality mode), I now think Ninja Theory can actually get close to that Hellblade 2 trailer!
(Which was a totally fake CGI video)
Exciting times!
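
For reference on where figures like that come from, the published FSR 1.0 per-axis scale factors make the pixel math easy to check. A quick back-of-the-envelope (the "ideal fps" column assumes frame time scales with pixels shaded, which real games won't fully match):

```python
# FSR 1.0 quality modes and their published per-axis scale factors.
MODES = {"Ultra Quality": 1.3, "Quality": 1.5,
         "Balanced": 1.7, "Performance": 2.0}

target_w, target_h = 3840, 2160  # 4K output
for mode, s in MODES.items():
    rw, rh = round(target_w / s), round(target_h / s)
    frac = (rw * rh) / (target_w * target_h)
    # Idealized speedup assumes cost scales with pixels shaded; real
    # gains are smaller once CPU or bandwidth limits kick in.
    print(f"{mode:>13}: renders {rw}x{rh} "
          f"({frac:.0%} of native pixels, ~{1/frac:.2f}x ideal fps)")
```

At Ultra Quality that works out to roughly 59% of native pixels shaded for an idealized ~1.69x; measured gains will differ per game.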
 

mrmeh

Member
The damage control on the green side is at full throttle.
I hope it can bring almost all users a great alternative to use and enjoy more years of gaming.

:messenger_grimmacing: I only have AMD products (Series/PC) and so far it looks a bit shit... It benefits everyone if it can boost performance without tanking image quality, but after that video I'm not confident.
 
I'm still waiting on homie to explain why this is a godsend, and how DLSS should be illegal and banned in all countries. It's not like he hasn't read this thread.
I think you guys are missing the biggest beneficiaries here. It's consoles. Isn't Godfall a game that favors AMD, since it's on consoles? (only a timed exclusive for PS5)

A 59% speedup using FSR Ultra Quality is a huge plus for consoles.

Expect both Microsoft and Sony first and second party studios to implement this.

Heck, with Unreal Engine 5 combined with FSR (Ultra Quality mode), I now think Ninja Theory can actually get close to that Hellblade 2 trailer!
(Which was a totally fake CGI video)
Exciting times!
What are you talking about? Why not just Google things to get answers about stuff you don't have a clue on? The Hellblade 2 trailer was captured in-engine, NOT CGI. This was already confirmed almost a year ago. It was captured at 24fps to give it a cinematic look (which it nails).
 

SF Kosmo

Al Jazeera Special Reporter
There's a reason they aren't showing detail shots. This is not going to be a DLSS killer. It's going to be closer to checkerboard rendering or other non-AI reconstruction techniques.

I still think it's great that there's something hardware independent, and I really do think it will be good enough in practical cases to get from, say, 1440p to 4K. But it's not going to be like DLSS where you can drop to like 720p native and still get something pretty decent.
 

truth411

Member
I'm still waiting on homie to explain why this is a godsend, and how DLSS should be illegal and banned in all countries. It's not like he hasn't read this thread.

What are you talking about? Why not just Google things to get answers about stuff you don't have a clue on? The Hellblade 2 trailer was captured in-engine, NOT CGI. This was already confirmed almost a year ago. It was captured at 24fps to give it a cinematic look (which it nails).
Fine, it was still in no way representative of how the game would have looked. It was BS.
 
Fine, it was still in no way representative of how the game would have looked. It was BS.
Just about every game shown in-engine is pretty much BS, as gameplay will never match that fidelity. So definitely agreed there. Hopefully we get more samples, as this can benefit a lot of people. And if people don't like it, there's an on/off switch.
 

Buggy Loop

Member
Well, I'm sad now. Even as a new 3060 owner, I kind of wished they would have hit a home run and started an aliasing war with Nvidia. It would have benefited everyone, especially since Nvidia owners would get access to both solutions anyway.

But yea, it’s about as expected.
 
Why are so many people saying this has serious ramifications for the consoles? I’d have to disagree on the PS5 side anyway, as this looks just like an average implementation of checkerboarding or some other upscaling technique already in use on said machines for years.

I think Insomniac’s temporal injection method is already leagues ahead of this from what we are seeing, for example. I highly doubt this will be used on many AAA console games, especially Sony first party titles, unless it takes a significant step up with version 2.0.

In fact, Sony should take Insomniac’s temporal injection and add it to the PS5 SDK for all developers to use on PS5 titles, if they wish.
 

Kenpachii

Member
Reminds me of DLSS in that anything less than 1440p/quality mode looks a bit wank :messenger_grinning_sweat:



Interesting that AMD have managed to provide something comparable to DLSS 1.0 without the need for proprietary tech, so it's hardware agnostic, and it's open source so devs can implement and iterate on it as much as they want. I certainly wasn't expecting them to show GTX 1060 performance in their announcement, and I wasn't expecting official support stretching back to Vega either. Sure, it's no DLSS 2.0, but I'm almost convinced everyone outside of RTX owners will overlook that for now, given it's something they can use

They pretty much also used CAS in Cyberpunk. If this is even at DLSS 1.0 level, who knows what could happen in the future.

By supporting such a huge range of hardware, even as far back as Vega, this will only reinforce people investing in their products. If I was Nvidia, I would be mighty afraid for the next series of cards that come out. You can already see them, in their presentations, pushing into more stuff than just gaming, because they will most likely heavily need it in the upcoming year.

Not gonna lie, if that presentation is even remotely real and you can get 2x the performance at minimal quality loss on a 1080 Ti, Nvidia has some explaining to do.
 

DonkeyPunchJr

World’s Biggest Weeb
Well, I'm sad now. Even as a new 3060 owner, I kind of wished they would have hit a home run and started an aliasing war with Nvidia. It would have benefited everyone, especially since Nvidia owners would get access to both solutions anyway.

But yea, it’s about as expected.
Same. I own a 3080 but I was still rooting for AMD here. A platform-agnostic alternative to DLSS, with almost as good image quality, and that didn’t require developer effort to add support, would’ve benefited everybody.

But this just looks bad. I can’t imagine anybody with a high end GPU wanting this. Seems like it’ll only be a desirable choice if you have a weaker GPU paired with a display that is way out of its league.
 

PhoenixTank

Member
Not expecting miracles here. A little dubious of the few "might as well just run it at a lower resolution" posts. Surely it has to be doing something more than CAS, the older image-sharpening tech?

We need better footage. 3 weeks to go I guess?

Definitely a welcome surprise to see it running on Nvidia GPUs too. 1080Ti gang holding strong apparently.
 

CrustyBritches

Gold Member
It will be interesting to see how FSR stacks up against CAS+DRS, 1440p upscaled to 4K, Checkerboard, TSR, etc.

First comparison should be vs 1440p to 4K + sharpening filter.
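
A minimal sketch of that baseline, assuming nothing about FSR's internals (a bilinear zoom plus an unsharp mask via scipy; the function name is made up for illustration):

```python
import numpy as np
from scipy import ndimage

def naive_baseline(frame, scale=1.5, amount=0.5):
    # The bar FSR has to clear: bilinear upscale + unsharp-mask sharpening.
    # If FSR can't visibly beat this, it's just a render-scale slider.
    up = ndimage.zoom(frame, (scale, scale), order=1)   # order=1 -> bilinear
    blur = ndimage.gaussian_filter(up, sigma=1.0)
    return np.clip(up + amount * (up - blur), 0.0, 1.0)

frame_1440p = np.random.rand(1440, 2560)   # stand-in for a rendered frame
print(naive_baseline(frame_1440p).shape)   # 1440p -> 4K: (2160, 3840)
```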
 
Open source, not locked to the newest generation of cards, and can be used on non-AMD GPUs? GG guys. My 5500 XT just screamed in happiness.
 

GHG

Member
They have a point though.

How so? They have just announced a refresh that adds additional options to the product line. At this stage there's nothing they or any other chip manufacturer can do to fix the supply issues, so all they can do is carry on as they normally would have. People who want/get the Ti variants get more performance, and those who don't want that can still hunt down the normal models.

They are having a cry about nothing, but that's expected considering it's Nvidia. At this stage all there is to do is discuss the launch and see what happens regarding how it affects supply. They are making a bunch of negative assumptions in their coverage that are yet to be confirmed.

Why should Nvidia send review samples to a YouTube channel that consistently profits from controversy as a result of misrepresenting their products and intentions?

Are you serious? They're right. There's a chip shortage; those cards won't improve the current situation, they'll make it worse. They're unnecessary at best.
They criticize EVERYONE, not only Nvidia. You obviously don't follow the channel, and you don't give a shit about the current situation if the only thing you care about is the good name of a multi-billion-dollar company.

They don't criticise everyone with the same level of energy; this is well known.

Your agenda is the problem. You didn't even see the video. The contents reflect the exact sentiment seen on the Nvidia subreddit and every forum I visit. It's called common sense, and you lack any of it.

[GIF: Bored Blah Blah Blah]
 

Amiga

Member
FSR is an effective upscaling technique that works on many GPUs and consoles. It's not as good as DLSS 2 or as bad as DLSS 1. Having it is better than not having it, and it makes a difference. Moore's Law is Dead said FSR matters for what it does for mid-to-low budget PC gaming. Not everything is about the top 1%.
 

assurdum

Banned
Can anyone explain the difference between this and the other AMD solution used in UE5? Because I'm kind of confused... should it potentially be better?
 

Rikkori

Member
So this will make PS5/XSX even more super?
No. The issue is that it's not a good enough generalised solution to replace an individual game's own solution. By that I mean a studio can tweak their own TAA to look great for the particular game they're developing without having to worry about how it might look in other games (something FSR/DLSS have to do, because they're not bespoke to each game), which will yield much better image quality at similar performance. For example, what Insomniac does with their Temporal Injection, or indeed the industry's best, The Division 2's solution, which is on par with DLSS or better without requiring tensor cores. So when AMD puts forth a solution that's as weak as simply lowering the render scale, it makes no real difference for consoles, and not for indies either (because TSR, with UE5, is better, and Unity will probably have something similar too), and then on PC you simply have one more reason to buy an RTX GPU instead of a Radeon.

So, in the end, this is honestly just a waste of software dev time in service of marketing. AAA studios with their own engines will have their own solutions, those on UE5 will have TSR, and so there are very few left who would even benefit in the least from this. It's a big whoop.
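
As a rough illustration of what "tweak their own TAA" means in practice, here's a toy numpy/scipy sketch of a neighbourhood-clamp TAA resolve with per-game tuning knobs (all names and values hypothetical; real resolves also reproject history along motion vectors first):

```python
import numpy as np
from scipy import ndimage

# Hypothetical per-title tuning knobs: a studio shipping its own engine can
# dial these against its own content, which a generic FSR/DLSS pass can't.
GAME_TAA = {"history_weight": 0.92,  # how much of last frame survives
            "clamp_radius": 0.05,    # looser = more ghosting, less flicker
            "sharpen": 0.3}          # post-resolve sharpening strength

def taa_resolve(current, history, cfg=GAME_TAA):
    # Clamp history toward the current frame's 3x3 neighbourhood range to
    # suppress ghosting, blend, then sharpen with an unsharp mask.
    lo = ndimage.minimum_filter(current, size=3) - cfg["clamp_radius"]
    hi = ndimage.maximum_filter(current, size=3) + cfg["clamp_radius"]
    clamped = np.clip(history, lo, hi)
    w = cfg["history_weight"]
    out = w * clamped + (1 - w) * current
    blur = ndimage.gaussian_filter(out, sigma=1.0)
    return out + cfg["sharpen"] * (out - blur)

cur, hist = np.random.rand(1080, 1920), np.random.rand(1080, 1920)
print(taa_resolve(cur, hist).shape)  # (1080, 1920)
```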
 

assurdum

Banned
No. The issue is that it's not a good enough generalised solution to replace an individual game's own solution. By that I mean a studio can tweak their own TAA to look great for the particular game they're developing without having to worry about how it might look in other games (something FSR/DLSS have to do, because they're not bespoke to each game), which will yield much better image quality at similar performance. For example, what Insomniac does with their Temporal Injection, or indeed the industry's best, The Division 2's solution, which is on par with DLSS or better without requiring tensor cores. So when AMD puts forth a solution that's as weak as simply lowering the render scale, it makes no real difference for consoles, and not for indies either (because TSR, with UE5, is better, and Unity will probably have something similar too), and then on PC you simply have one more reason to buy an RTX GPU instead of a Radeon.

So, in the end, this is honestly just a waste of software dev time in service of marketing. AAA studios with their own engines will have their own solutions, those on UE5 will have TSR, and so there are very few left who would even benefit in the least from this. It's a big whoop.
Jeez, it's quite early; I don't understand all this skepticism about it. It's really unnecessary. If you look at what TAA was at the beginning, you wouldn't have bet a penny on how good it's become now.
 

Bo_Hazem

Banned
Thanks AMD for giving your old Nvidia GPU a few more years of life:

[Image: j7tSQAz.jpg]

Wow! That's as generous as AMD FreeSync. Another reason to support AMD.

I feel like if I buy an RX 6800 XT it's gonna last me the next 7 years at 1440p. Am I too optimistic or?

Of course you will. Go ahead if it's available within your reach. This was the only missing feature from AMD's cards, aside from somewhat lower performance with RT.
 

Bo_Hazem

Banned
[Image: fsrcxkuo.png]


This is “quality” mode, the native is 1440p.

Wonder what the base resolution is for that?

edit: keep in mind this isn’t the “ultra” quality setting, I’d be most curious about that one.

I know you've been critical of DLSS 2.0 artifacts; did you notice anything with AMD's solution? And which one do you prefer?
 
No. The issue is that it's not a good enough generalised solution to replace an individual game's own solution. By that I mean a studio can tweak their own TAA to look great for the particular game they're developing without having to worry about how it might look in other games (something FSR/DLSS have to do, because they're not bespoke to each game), which will yield much better image quality at similar performance. For example, what Insomniac does with their Temporal Injection, or indeed the industry's best, The Division 2's solution, which is on par with DLSS or better without requiring tensor cores. So when AMD puts forth a solution that's as weak as simply lowering the render scale, it makes no real difference for consoles, and not for indies either (because TSR, with UE5, is better, and Unity will probably have something similar too), and then on PC you simply have one more reason to buy an RTX GPU instead of a Radeon.

So, in the end, this is honestly just a waste of software dev time in service of marketing. AAA studios with their own engines will have their own solutions, those on UE5 will have TSR, and so there are very few left who would even benefit in the least from this. It's a big whoop.

[Reaction GIF]
 

Papacheeks

Banned
By that metric, you would think AMD would have foreseen the obvious issues. If it is to be improved, AMD will need to implement real-time motion vectors (like DLSS 2.0), which will nullify any developer profiles from the first iteration of FSR. So it's not so much the developers, but the underlying technology that needs to be improved. Developers would need to go back and implement newer profiles to allow FSR 2.0 to see real-time motion vectors within the game engine (still waiting on a SOTTR DLSS 2.0/2.1 update that will never come :messenger_loudly_crying: ). How much of a performance hit it would be compared to DLSS 2.0 remains to be seen. Unfortunately, it just looks like AMD rushed this out so they could say they have "something".

https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors

Thing is, since it's open source, it's not like it's hard to make profiles once it's implemented. Because that's what these are, local profiles. There's no waiting on samples from NVIDIA's database for the reconstruction.

The fact this works across a large spectrum of cards shows its more practical use in all games going forward, not just the select few that Nvidia makes profiles for, where it has to update its database with game data for its AI algorithm to reconstruct the image.

And again, FSR only being available in a couple of titles shows how few people actually have it yet. The tools are also for console, so I would assume developers will share their findings as time goes on and update profiles. We are now getting updates to games that launched in 2020. So I don't understand the issue.

It's also only a matter of time before people hack drivers and create fixes, which then get shared or make their way to the actual developers of the game.

DLSS may be awesome tech, but its reliance on NVIDIA's proprietary hardware limits its use and practicality, which is why we still only have a handful of titles with great uses of it. AMD's version, though not great right now, shows promise as developers get more time with it and can update as needed.

It's a faster, more open system compared to NVIDIA's.
 

octiny

Banned
Thing is, since it's open source, it's not like it's hard to make profiles once it's implemented. Because that's what these are, local profiles. There's no waiting on samples from NVIDIA's database for the reconstruction.

The fact this works across a large spectrum of cards shows its more practical use in all games going forward, not just the select few that Nvidia makes profiles for, where it has to update its database with game data for its AI algorithm to reconstruct the image.

And again, FSR only being available in a couple of titles shows how few people actually have it yet. The tools are also for console, so I would assume developers will share their findings as time goes on and update profiles. We are now getting updates to games that launched in 2020. So I don't understand the issue.

It's also only a matter of time before people hack drivers and create fixes, which then get shared or make their way to the actual developers of the game.

DLSS may be awesome tech, but its reliance on NVIDIA's proprietary hardware limits its use and practicality, which is why we still only have a handful of titles with great uses of it. AMD's version, though not great right now, shows promise as developers get more time with it and can update as needed.

It's a faster, more open system compared to NVIDIA's.

That's a lot of assumptions. I won't hold my breath.

Quality over quantity is what I see the end game being.

Time will tell.
 

SlimySnake

Flashless at the Golden Globes
Nowhere near X1X.

The X1X can brute-force RDR2 at native 4K and 30 fps.

We tried the same X1X settings at 4K and it only rendered 18-20 fps. That's nearly 50% slower than the One X.

Desktop GPUs can't match console GPUs properly; a console GPU will always work more efficiently.
That's not entirely true. The 580 and the 1060 were competitors for the same market for years; both Nvidia and AMD saw them that way. And the 580 is a 6 TFLOPS GPU that offers identical performance to the X1X.


Pretty much identical performance.

Here is a comparison of both cards in several games. Nearly identical performance in all but one game, Watch Dogs. Even RDR2.




The reason games like RDR2 perform well on the X1X GPU, and by extension the 580, is their use of low-level APIs. Pretty much all Vulkan games run better on the 580, but newer games without Vulkan support run better on the 1060.

Also, the 580 had 8GB of VRAM, which likely helped it at higher resolutions like native 4K. But that's pretty much moot, because no one is playing games at native 4K and sub-30 fps on PC. They are both 1080p/60 fps cards.

Here is an article that goes into more detail.

 

Kuranghi

Member
[Image: fsrcxkuo.png]


This is “quality” mode, the native is 1440p.

Wonder what the base resolution is for that?

edit: keep in mind this isn’t the “ultra” quality setting, I’d be most curious about that one.

Sorry to be blunt, but this comparison is terrible. You can't compare it like that, because different parts of the frame are moving at different speeds, the viewport isn't uniformly sharp, and for many other reasons related to the geometry of the image at different sample points.

We need to wait for proper screenshot or image comparisons. I'm not writing this to attack you; I'm writing it to make others who just see this and think that's the difference aware of why it's an unfair comparison.
 