
AMD FidelityFX Super Resolution (FSR) launches June 22nd

Papacheeks

Banned
That's a lot of assumptions. I won't hold my breath.

Quality over quantity is what I see the end game being.

Time will tell.

There is no quality over quantity here. One is basically open, the other is closed. One is being used on consoles, which profile-wise can run on PC, so the config work developers do on the console versions of games can be reused on PC.

So in my opinion, in practical use more people will use this outside of the games that are currently contracted with NVIDIA.
 

CamHostage

Member
Video and screens so far did not make a marry-me first impression; it significantly blurs the image in motion, from what I can tell. I'd like to see more, though.

AMD oddly made a lot of mistakes in this showcase (they showed split-screen instead of mirroring or dual-screening the footage... and WTF, every shot seemed to have all the good parts of the scene, like leafy trees and ornate pillars and inviting entryways, on the native left, while the FSR right side was half the time a wall or an empty ravine?!) Looking forward to seeing FSR in a better light next time.

Let's also understand this is the first release and use of it, and DLSS has been out for years. So, similar to DLSS 1.0, which was not great, it's going to take time for developers to implement this.

Granted, it's a first release, but I would say everybody was expecting AMD to leapfrog into competition with NVIDIA, with an FSR that competed with DLSS 2.0, not 1.0 (since AMD is so late to the game and took their time, but also since the bar of quality has been set high). I think it makes sense why people might have expected more from this announcement.

I think you guys are missing the biggest beneficiaries here. It's consoles.

Maybe, although there are a lot of good choices for display on consoles. This was kind of sold as the DLSS challenger for the consoles that were soon to come.

But speaking of consoles... there was no mention of consoles, right? Nothing in that section of the presentation, nothing I saw in the larger AMD keynote, and nothing about consoles on the FSR website. They talk about it being open to "almost any graphics card", they talk about it being made for a wide range of products ("desktop and laptop devices", in this case), they say it works on over 100 commercially available products... then they say nothing about PS5 or Xbox Series X/S, which AFAIK was the first community that was anticipating the FSR announcement.

So what is the story with FidelityFX Super Res on consoles? Is it not in the first-phase rollout, is it not happening, or is AMD just not able to say Xbox or PlayStation in its show for whatever contractual reasons (but they could have said "console platforms" and been fine if that were the case), and it's still coming as previously rumored?
 
Last edited:

DeaDPo0L84

Member
AMD stays firmly behind Nvidia when it comes to tech. It's good they're making progress for those who are die hard AMD fans and maybe will help things on the console side but overall Nvidia stays winning.
 

octiny

Banned
There is no quality over quantity here. One is basically open, the other is closed. One is being used on consoles, which profile-wise can run on PC, so the config work developers do on the console versions of games can be reused on PC.

So in my opinion, in practical use more people will use this outside of the games that are currently contracted with NVIDIA.

I'm talking about the underlying tech, which equates to the quality. Do you believe FSR will implement real-time motion vector calculations within the profiles? If not, then, like I said: quality over quantity. AMD may eventually have more games supporting FSR, but if the quality is trash (for high-end users), I couldn't care less.
 

Papacheeks

Banned
Video and screens so far did not make a marry-me first impression; it significantly blurs the image in motion, from what I can tell. I'd like to see more, though.

AMD oddly made a lot of mistakes in this showcase (they showed split-screen instead of mirroring or dual-screening the footage... and WTF, every shot seemed to have all the good parts of the scene, like leafy trees and pillars and entryways, on the native left, while the FSR right side was half the time a wall or an empty ravine?!) Looking forward to seeing FSR in a better light next time.



Okay, it's a first release, but I would say everybody was expecting AMD to leapfrog into competition with NVIDIA (since it is so late to the game, but also since the bar of quality has been set), with an FSR that competed with DLSS 2.0, not 1.0. There are certainly things to look forward to with FSR; this isn't the end of the conversation or of the battle for the best visual output (this being open beyond the typical specialized devices is an interesting new wrinkle), but I think it makes sense why people might have expected more from this announcement.



Maybe, although there are a lot of good choices for display on consoles. This was kind of sold as the DLSS challenger for the consoles that were soon to come.

But speaking of consoles... there was no mention of consoles, right? Nothing in that section of the presentation, nothing I saw in the larger AMD keynote, and nothing about consoles on the FSR website. They talk about it being open to "almost any graphics card", they talk about it being made for a wide range of products ("desktop and laptop devices", in this case), they say it works on over 100 commercially available products... then they say nothing about PS5 or Xbox Series X/S, which AFAIK was the first community that was anticipating the FSR announcement.

So what is the story with FidelityFX Super Res on consoles? Is it not in the first-phase rollout, is it not happening, or is AMD just not able to say Xbox or PlayStation in its show for whatever contractual reasons (but they could have said "console platforms" and been fine if that were the case), and it's still coming as previously rumored?

When it comes to drivers and a new architecture that, third-party-wise, no game developers had been working with until the last year or so, why is everyone expecting the moon? NVIDIA had tons of time to make DLSS, and to improve upon it with an unlimited R&D budget. Radeon was in a bad spot just a year or so ago.

So all of this has had a bunch of thought put into how they would roll it out. Developers are now targeting RDNA 2 in their engines. Going forward it's going to take time to get a lot of developers using this, but since it's open source, there are going to be a lot of engines that straight up design around it for consoles and then carry those profile/config setups over to PC.

Imagine FSR done similarly to checkerboarding, or in combination with it. The difference here is that we now have consoles running RDNA 2. To say they won't be able to carry those findings over to PC, and vice versa, is not thinking long term.

DLSS is a great tech and gives truly incredible results. It also hinges on the investment in NVIDIA's AI, and on developers basically signing up to implement it so the reconstruction can happen. I would rather have a solution that is universal in the drivers, so that even if a developer doesn't supply profiles, you could force it through the drivers and come up with some interesting results.
 
Last edited:

RaZoR No1

Member
I still hope that someday we will be able to activate it at the driver level, even if that would mean using something like NVIDIA Inspector for AMD.
Having such wide GPU support is really great, even if the result is not as good as DLSS.
AMD's FSR supports more NVIDIA GPUs than NVIDIA's own DLSS 😅

I am curious to see how this will work on consoles, and whether we get the choice between the modes, or to activate/deactivate it.
 
Video and screens so far did not make a marry-me first impression; it significantly blurs the image in motion, from what I can tell. I'd like to see more, though.

AMD oddly made a lot of mistakes in this showcase (they showed split-screen instead of mirroring or dual-screening the footage... and WTF, every shot seemed to have all the good parts of the scene, like leafy trees and pillars and entryways, on the native left, while the FSR right side was half the time a wall or an empty ravine?!) Looking forward to seeing FSR in a better light next time.

I agree with all of this. I'm hopeful but want independent analysis.
 

FingerBang

Member
They don't criticise everyone with the same level of energy; this is well known.
I disagree. It's one of my favorite channels; I watch all of their videos. They criticize everyone, and the fact that some companies are more worthy of criticism than others doesn't make them biased. They have complained about the price of AIB graphics cards, about AMD not releasing non-X CPUs, and about Intel's 11th-gen i9 being ridiculous, but praised Intel for their i5/i7 CPUs. They don't think RT is a big deal that should push you towards one card or the other, but they have said all along that NVIDIA offers the better value thanks to DLSS.

They're spot on this time. These cards are bad. And I'm not salty because I can't buy one, I was able to buy a 3080 FE at retail price a few months ago.
 

CamHostage

Member
When it comes to drivers and a new architecture that, third-party-wise, no game developers had been working with until the last year or so, why is everyone expecting the moon? NVIDIA had tons of time to make DLSS, and to improve upon it with an unlimited R&D budget.

Not the moon, just a competitive product.

AMD is doing things differently from NVIDIA (being open and multi-hardware), so it makes sense that they're on a different timeline, but DLSS debuted in September 2018 and 2.0 has been out for over a year now. AMD took their time coming to market for their own reasons, which may be good overall, but they missed the console launch (and, as I mentioned, they frustratingly omitted saying anything about consoles, which would at least have felt like a slingshot of momentum to me, since DLSS is never going to be on Xbox or PlayStation), and they're coming way late to the competition. AMD now has their work cut out for them.

But again, I'm eager to see more and get a clearer understanding. (...And I'd like to see a demo that isn't full of wonky problems and YouTube compression crinkle, one that excites when you look at the video, not just the stats.)
 

Papacheeks

Banned
Not the moon, just a competitive product.

AMD is doing things differently from NVIDIA (being open and multi-hardware), so it makes sense that they're on a different timeline, but DLSS debuted in September 2018 and 2.0 has been out for over a year now. AMD took their time coming to market for their own reasons, which may be good overall, but they missed the console launch (and, as I mentioned, they frustratingly omitted saying anything about consoles, which would at least have felt like a slingshot of momentum to me, since DLSS is never going to be on Xbox or PlayStation), and they're coming way late to the competition. AMD now has their work cut out for them.

But again, I'm eager to see more and get a clearer understanding. (...And I'd like to see a demo that isn't full of wonky problems and YouTube compression crinkle, one that excites when you look at the video, not just the stats.)

In terms of where their products are, you have to remember AMD's R&D was split when they started Ryzen. A lot of their big engineering teams were all-in on the CPU line. Now that their CPUs have been taking names for the past couple of years, their engineering teams are working on Radeon.

In terms of how long it took to get to V-Cache, chiplets took 3+ years. We only got RDNA 2 late last year. So it's going to take time, just like it took time for developers to start targeting the Zen architecture. Now we are seeing CPUs like my first-gen 1600X doing really well in newer titles.

So give it some time. By this time next year I think the landscape of how games perform and look will be pretty different. I imagine we will start seeing more profiles as more third-party games release on console.
 

Kuranghi

Member
Video and screens so far did not make a marry-me first impression; it significantly blurs the image in motion, from what I can tell. I'd like to see more, though.

AMD oddly made a lot of mistakes in this showcase (they showed split-screen instead of mirroring or dual-screening the footage... and WTF, every shot seemed to have all the good parts of the scene, like leafy trees and pillars and entryways, on the native left, while the FSR right side was half the time a wall or an empty ravine?!) Looking forward to seeing FSR in a better light next time.

Yes, this is a terrible way to compare it, imo; many, many factors make the comparison hard to evaluate. I will wait for something where, as far as possible, all other factors are equal. I agree the presentation setup itself was bad too, on top of this.

Any attempt to compare the same parts of the image here won't work properly, because the geometry (I don't mean the scene, I mean the geometry of the frame) isn't comparable due to post-processing, FoV artifacts, and other factors.

I'm excited to see a proper comparison, though, but people shouldn't be passing final judgment based on anything in this thread, imo.
 

Rikkori

Member
I see the same kind of denial from the AMD/console crowd today as I saw from all the NV fanboys when DLSS was first shown. Even when we told them it was worse than lowering the render scale and adding sharpening, they didn't want to accept it, even though it was clearly affected by the oil-painting effect typical of AI reconstruction; but eventually NV gave them DLSS 2.0, so now they could attack DLSS 1.0 without feeling like they were betraying their GPU waifu. Will history repeat? First you'd have to be optimistic about an FSR 2.0, I guess; let's see. :messenger_tears_of_joy:
 

IntentionalPun

Ask me about my wife's perfect butthole
Sorry to be blunt, but this comparison is terrible. You can't compare it like that, because different parts of the frame are moving at different speeds, the viewport isn't uniformly sharp, and there are many other reasons related to the geometry of the image at different sample points.

We need to wait for proper screenshot or image comparisons. I'm not writing this to attack you; I'm writing it to make others who see this and think that's the difference aware of why it's an unfair comparison.
It's from the marketing image they chose... and it's incredibly blurry right at the "line" they chose to split this same image with. In fact, if anything, the two pillars look closer than right at the line:

Original Image

 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole
I know you've been critical of DLSS 2.0 artifacts; did you notice anything with AMD's solution? And which one do you prefer?
Well, they barely showed anything here; nothing moving much, for instance... it's hard to judge anything but the obvious blurriness in the 1440p stuff.

The 4K stuff looked great, mind you... but there just wasn't much to show... and the "FSR" side of the screen in that comparison had different lighting... kind of... annoying, lol. The scene with the 5 different "slices" was even harder to compare, because the native slice on the far left was at the edge of the viewport and some sort of camera lens filter is applied to darken it.
 
Last edited:

llien

Member
There is no magic.
"Image reconstruction" my buttons.

The good outcome of this tech would be that people would stop bitching about "but no DLSS".

Another good point would be game devs saying "fuck proprietary shit" and going FSR, so that people who really want to play with upscaling can have it on a much wider set of hardware.

Interesting that AMD have managed to provide something comparable to DLSS 1.0
FidelityFX CAS (not to be confused with SR) was widely accepted as giving a way better render, so "comparable to 1.0", what the heck do you mean?

DLSS 2.0 and 1.0 are two very different beasts: the first one was a true (but failed) AI solution; 2.0 is a glorified TAAU.
 
Last edited:

sendit

Member
AMD stays firmly behind Nvidia when it comes to tech. It's good they're making progress for those who are die hard AMD fans and maybe will help things on the console side but overall Nvidia stays winning.

This is true. I member when AMD was consistently firmly behind Intel.
 
Last edited:

Kuranghi

Member
It's from the marketing image they chose... and it's incredibly blurry right at the "line" they chose to split this same image with. In fact, if anything, the two pillars look closer than right at the line:

Original Image


Thanks for the reply; that is indeed blurry and disappointing. I thought the pillars were pulled from the video. Hopefully 4K and/or the "ultra quality" preset work better, but I wouldn't hold my breath.

What a bizarre marketing image to release if you want to drum up excitement; it's obviously disappointing even without maximising it on my display, let alone zooming.

I'll wait for more information/comparisons before I condemn it fully, just in case something weird is going on, but it's pretty baws looking at that. Why is the geometry of the native image so distorted?
 

DeaDPo0L84

Member
This is so delusional it hurts.

6900XT beats 3090 at newest games, despite:
1) having less transistors
2) consuming less power
3) using MUCH slower VRAM

All that merely 1 year after 5000 series.

Does AMD compete when it comes to offering a solution as good as DLSS 2.0, and in ray tracing performance? 'Cause the stuff you mentioned means nothing to me in real-world scenarios. I play games hoping for the best performance while utilizing ray tracing and DLSS 2.0 when possible; I don't care what's under the hood to achieve those results.
 

Soltype

Member
This is just the first gen of the technology; I'm sure it'll get better.
DLSS 2.0 and 1.0 are two very different beasts: the first one was a true (but failed) AI solution; 2.0 is a glorified TAAU.
I thought 2.0 still uses deep learning, just in different areas?
 

IntentionalPun

Ask me about my wife's perfect butthole
Thanks for the reply; that is indeed blurry and disappointing. I thought the pillars were pulled from the video. Hopefully 4K and/or the "ultra quality" preset work better, but I wouldn't hold my breath.

What a bizarre marketing image to release if you want to drum up excitement; it's obviously disappointing even without maximising it on my display, let alone zooming.

I'll wait for more information/comparisons before I condemn it fully, just in case something weird is going on, but it's pretty baws looking at that. Why is the geometry of the native image so distorted?

Sorry, I actually misunderstood a post from another forum (they were actually complaining there were NO marketing images released, lol); it IS from the video, you were right.

It still just looks bad... and I couldn't find any paused frames that looked much better when I found the scene in the video.

It is the 1440p render, though, and we don't know the source resolution.
 
Last edited:

llien

Member
Does AMD compete when it comes to offering a solution as good as DLSS 2.0 and Ray tracing performance?
AMD beats NV at RT (whatever the reason to care about a tech that is still in a "maybe in the future" promising state years after release) in: WoW RT, Dirt 5, Fortnite.

DLSS 2 assessments are pathetic and I don't want to go that way, if you like fake resolutions, go for it.

But we were talking about tech.

Stuff introduced by AMD just recently:
1) Anti-Lag (copy-pasted by NV)
2) SAM (again, you've guessed it)
3) Infinity Cache (allowed them to opt for cheaper, slower VRAM while still rocking performance)

FidelityFX toolkit is ubiquitous, even green sponsored games like CP2077 use it.
 

llien

Member
I thought 2.0 still uses deep learning just in different areas?
1.0 used datacenter training on each game's higher-resolution assets (and what not; and, mind you, IT DID MAKE PERFECT SENSE to try that, nothing silly about it, but it didn't work).

2.0 is TAA (90%) plus some neural-network-based static (same across all games) processing.
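For anyone wondering what "glorified TAAU" actually means mechanically: temporal upscalers reproject the previous frame's accumulated image with per-pixel motion vectors, then blend in the new low-resolution sample. Below is a deliberately toy sketch of that idea; all names, the nearest-neighbour upscale, and the fixed blend weight are my own simplifications, not NVIDIA's or AMD's actual pipeline (real implementations add sub-pixel jitter, history clamping, and learned or heuristic blend factors):

```python
import numpy as np

def temporal_upscale(low_res_frame, history, motion_vectors, blend=0.9):
    """Toy temporal accumulation: reproject last frame's accumulated
    image using per-pixel motion vectors, then blend in the new
    low-resolution sample. Illustrative only."""
    h, w, _ = history.shape
    # Build a pixel coordinate grid and shift it by the motion vectors
    ys, xs = np.mgrid[0:h, 0:w]
    src_x = np.clip(xs - motion_vectors[..., 0], 0, w - 1).astype(int)
    src_y = np.clip(ys - motion_vectors[..., 1], 0, h - 1).astype(int)
    reprojected = history[src_y, src_x]
    # Nearest-neighbour upscale of the new frame to output resolution
    scale = h // low_res_frame.shape[0]
    upscaled = low_res_frame.repeat(scale, axis=0).repeat(scale, axis=1)
    # Blend: mostly accumulated history (temporal detail), a little new sample
    return blend * reprojected + (1 - blend) * upscaled
```

The point of the sketch is the dependency: without motion vectors from the engine, a purely spatial upscaler (which is what the FSR demo appears to be) has only the current frame to work with, which is the crux of the quality debate in this thread.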
 
Last edited:

Kuranghi

Member
Sorry, I actually misunderstood a post from another forum (they were actually complaining there were NO marketing images released, lol); it IS from the video, you were right.

It still just looks bad... and I couldn't find any paused frames that looked much better when I found the scene in the video.

It is the 1440p render, though, and we don't know the source resolution.

Cheers for the update, but yeah, it's still not great regardless. It will probably look even worse in less compressed shots, I think.

If the source resolution isn't 720p or lower, I'd be disappointed by that level of quality loss, but given the performance difference I think it's 900-1080p.
 

IntentionalPun

Ask me about my wife's perfect butthole
the first one is true (but failed) AI solution.

You really need to stop talking about ML, bro lol

Training on the actual source material made it a bad solution, not a good one (and a really odd one at that).. and.. all ML involves neural networks...

It's also incredibly unlikely DLSS 2.0 uses a "static" model; they are likely continuing to train on more imagery to build a better one over time.
 
Last edited:

GreenAlien

Member
Looks pretty good to me... but I am still at 1080p, so I might just not see what you guys are seeing :messenger_fearful:



 
Last edited:

IntentionalPun

Ask me about my wife's perfect butthole


I think it starts at 3:40. I think FidelityFX is a stepping stone needed for RDNA 3 to be ready.


This video is one of the bigger jokes I've ever seen... why in the world did anyone use this guy as a source pre-console-launch?

Everyone already knew AMD's solution would be supported all over the place... why is he acting like that's a surprise, other than click-baity nonsense?
 
Last edited:
This video is one of the bigger jokes I've ever seen... why in the world did anyone use this guy as a source pre-console-launch?

Everyone already knew AMD's solution would be supported all over the place... why is he acting like that's a surprise, other than click-baity nonsense?

:/

I dunno, thought it was interesting when he mentioned RDNA3 and Fidelity FX..
 

Md Ray

Member
How about my factory OC'd 6800 (Powercolor Red Dragon)?
As long as games are being made for Xbox Series S (which I assume will be for the next 7 years minimum), people with current RDNA 2 GPUs on PC needn't worry about anything.

Even a hypothetical, less powerful RX 6500 non-XT with 20-22 CUs will last as long as Series S... Provided it has 6+ GB of VRAM.
 

Soltype

Member
You really need to stop talking about ML bro lol

Training on the actual source material made it a bad solution, not a good one (and a really odd one at that).. and.. all ML involves neural networks...

It's also incredibly unlikely DLSS 2.0 uses a "static" model; they are likely continuing to train on more imagery to build a better one over time.
So both DLSS 1.0 and 2.0 use tensor cores?
 

IntentionalPun

Ask me about my wife's perfect butthole
:/

I dunno, thought it was interesting when he mentioned RDNA3 and Fidelity FX..
Sorry... not trying to sound aggressive towards you... this video is just so bad; like the TMZ of tech videos. He's making up things nobody ever said (while saying things like "I think NVIDIA said this"), and still talking about DLSS 1.0, which isn't being used anymore, lol.

I mean, whatever, his theory that you mentioned is fine and all... but it's all going to boil down to quality, and what they showed wasn't really promising so far, so we have no clue how much it'll get used. Lots of devs have their own solutions, Unreal Engine has a solution, etc. All things that work on various graphics cards.

DLSS certainly has competition, but at this point it's also way ahead in quality.
 
Last edited:
Sorry... not trying to sound aggressive towards you... this video is just so bad; like the TMZ of tech videos. He's making up things nobody ever said (while saying things like "I think NVIDIA said this"), and still talking about DLSS 1.0, which isn't being used anymore, lol.

I mean, whatever, his theory that you mentioned is fine and all... but it's all going to boil down to quality, and what they showed wasn't really promising so far, so we have no clue how much it'll get used. Lots of devs have their own solutions, Unreal Engine has a solution, etc. All things that work on various graphics cards.

DLSS certainly has competition, but at this point it's also way ahead in quality.
I agree.
 
Turns out "The Power of AI" actually does do something in DLSS. Who would have seen this one coming?

Oh, anyone who has been paying attention to what deep learning actually is for the past 5 years.
 

IntentionalPun

Ask me about my wife's perfect butthole
Looks pretty good to me... but I am still at 1080p, so I might just not see what you guys are seeing :messenger_fearful:



It doesn't look bad.

Thing is... as I always tell people... neither does just rendering at 1440p on a 4K monitor (or a 1440p monitor), which would give you an even better perf boost.

I swear these upscaling techniques just don't do shit in most games once you hit a high enough resolution, lol... this game barely has any fine detail at all, for instance.

/gets off "4K is pointless for most games" soapbox
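The "even better perf boost" claim above is just pixel-count arithmetic; here is a quick back-of-the-envelope check (a rough approximation, since real games are not purely pixel-bound):

```python
# Rough pixel-count arithmetic behind the "just render at 1440p" point.
# Shading cost scales very roughly with pixels rendered per frame.

def pixels(width, height):
    return width * height

native_4k = pixels(3840, 2160)   # 8,294,400 px
qhd       = pixels(2560, 1440)   # 3,686,400 px

# 1440p shades only ~44% of the pixels of native 4K
ratio = qhd / native_4k
print(f"1440p is {ratio:.1%} of 4K's pixel count")  # -> 44.4%
```

So a straight 1440p render skips over half the pixel work of native 4K, which is why it competes with upscalers on raw performance; the upscaler's job is to win back the image quality that plain scaling loses.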
 
Last edited:

Dream-Knife

Banned
As long as games are being made for Xbox Series S (which I assume will be for the next 7 years minimum), people with current RDNA 2 GPUs on PC needn't worry about anything.

Even a hypothetical, less powerful RX 6500 non-XT with 20-22 CUs will last as long as Series S... Provided it has 6+ GB of VRAM.
What does that have to do with the Series S, though? All of these cards are more powerful than any of the current-gen consoles.

Are you saying AMD cards are more future-proof due to VRAM?
 
Last edited:
but it is also the most brutal example I can get

it's the biggest GOTY-contender game the PC has received in the entirety of the generation

I can get you more examples though, if you like... but then it would be Horizon Zero Dawn, AC Valhalla and such, and you would say it's old and it's normal, and this discussion would go on forever...

there's no proper discrepancy between consoles and desktop GPUs. It is clear; I'm not saying the X1X is magically faster. The X1X has the advantage of years and years of development; developers have mastered it and manage to extract more efficient performance out of it.

that the Series X is equal to a 6700 XT these days is irrelevant, because the SX will be rivaling a 6800 XT by the time next-gen GPUs arrive (the same applies to PS5 as well; they're already doing a mighty fine job in RE Village against their desktop counterparts)
I think it's a question of expectations: if you lower the details to console level and target 30fps (sometimes less), just like the consoles, you can still play on a 750 Ti like you would on a base PS4 (probably even better than on a PS4 in some games)... now, if you expect to always play at 60fps on PC, then yeah.

But a lot of the arguments about consoles ranking higher against PC hardware over time are just strange.
 