
AMD FidelityFX Super Resolution (FSR) launches June 22nd

Dampf

Member
fsrcxkuo.png


This is “quality” mode; the native resolution is 1440p.

Wonder what the base resolution is for that?

edit: keep in mind this isn’t the “ultra” quality setting, I’d be most curious about that one.
Kudos to AMD for demonstrating 1440p too. 4K is pretty easy to reconstruct given the massive amount of detail present, but 1440p and especially 1080p are much harder.
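For reference, FSR 1.0 uses fixed per-axis scale factors per quality mode (AMD's published figures: 1.3x Ultra Quality, 1.5x Quality, 1.7x Balanced, 2.0x Performance), so the internal resolution can be worked out directly. A quick sketch, assuming those factors:

```python
# Internal render resolution per FSR 1.0 quality mode,
# using AMD's published per-axis scale factors.
FSR_MODES = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def base_resolution(out_w, out_h, mode):
    """Return the internal (pre-upscale) resolution for a given output size."""
    scale = FSR_MODES[mode]
    return round(out_w / scale), round(out_h / scale)

# Quality mode at a 2560x1440 output renders internally at roughly 1707x960.
print(base_resolution(2560, 1440, "Quality"))
```

So the screenshot above, if it really is Quality mode at 1440p, would be reconstructed from roughly a 960p internal image.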

As expected, quality is not nearly as good as DLSS, but wide range hardware support is a good thing. Here's the full picture in a bit better quality:

AMD-Fidelity-FX-Super-Resolution3-pcgh.jpg
 
nowhere near x1x

x1x can brute-force rdr 2 at native 4k and 30 fps

we tried the same x1x settings at 4k and it only rendered 18-20 fps. that makes the one x nearly 50% faster

desktop gpus can't match console gpus properly, a console gpu will always work more efficiently
The 1060 runs RDR2 at 12 fps at max details. Xbox-tier settings should easily push it to 30 fps. You can't even match every setting to Xbox tier, because some Xbox settings are lower than the lowest available setting on PC. So, in a gif:

Will Ferrell Reaction GIF
 

yamaci17

Member
The 1060 runs RDR2 at 12 fps at max details. Xbox-tier settings should easily push it to 30 fps. You can't even match every setting to Xbox tier, because some Xbox settings are lower than the lowest available setting on PC. So, in a gif:

Will Ferrell Reaction GIF



i don't really think those settings would push the 1060 from 15-20 fps to 30 fps...

There are only a few of them, and we don't know which ones, but they don't make much of a difference to image quality either.

7AdtYmq.png
 

yamaci17

Member
Damn, the quality loss is big; it's like the whole screen is covered in motion blur :/ Better than nothing, I guess, but I'm not sure I'd prefer FSR over ordinary CBR or DRS.
checkerboarding is possibly superior to FSR in this state tbh

just render stuff at 1920x2160 and checkerboard it away... re:village does this fantastically and everyone is content with the IQ. no point in pushing blurry upscaling...
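A quick sanity check on the numbers behind that: a 1920x2160 checkerboard pass shades exactly half the pixels of a native 3840x2160 frame each frame. Illustrative arithmetic only, not Capcom's actual implementation:

```python
# Pixel counts: native 4K vs. the half-width checkerboard buffer above.
# Checkerboarding shades half of the final pixels each frame and
# reconstructs the rest from neighbouring samples / the previous frame.
native_4k = 3840 * 2160
cb_pass = 1920 * 2160

shading_ratio = cb_pass / native_4k
print(native_4k, cb_pass, shading_ratio)  # 8294400 4147200 0.5
```

That halving of shading work is why checkerboarding is such a popular console technique despite its reconstruction artifacts.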
 
nowhere near x1x

x1x can brute-force rdr 2 at native 4k and 30 fps

we tried the same x1x settings at 4k and it only rendered 18-20 fps. that makes the one x nearly 50% faster

desktop gpus can't match console gpus properly, a console gpu will always work more efficiently


Red Dead 2 runs disproportionately worse on Pascal, so it's not a good game for gauging console performance. Use a Radeon card instead; a 580 outperforms the Xbox One X. It's the same with Valhalla and Nvidia. You can't use games that are defective on certain cards/vendors to gauge console performance.
 



i don't really think those settings would push the 1060 from 15-20 fps to 30 fps...

There are only a few of them, and we don't know which ones, but they don't make much of a difference to image quality either.

7AdtYmq.png

Ah, I see where the disparity comes from: RDR2 just runs badly on Nvidia cards. The 1060's Radeon equivalent, the RX 580, should be able to run RDR2 at 4K30 at Xbox settings. No special sauce required.
 

SoraNoKuni

Member
It's going to get refined, obviously. Great news: if the quality setting is used alongside devs' own TAA implementations, this is going to be great for most people, and most people actually game on consoles or cheap GPUs.

Nice job AMD.
 

yamaci17

Member
Ah, I see where the disparity comes from: RDR2 just runs badly on Nvidia cards. The 1060's Radeon equivalent, the RX 580, should be able to run RDR2 at 4K30 at Xbox settings. No special sauce required.

this is the case with a lot of pc ports in recent years, it's not specific to rdr 2. i can find more examples

that's why i argued in terms of the 1060 specifically, i have no grudge against amd cards

this is why I also suggest people get rdna2 gpus instead of ampere gpus if they want longevity. in the long run, if you have an archaic architecture that differs a lot from the consoles, you get absurdities like this where you suddenly perform 30% slower with "supposedly" equivalent gpus

i think this isn't the main topic of the thread, so better not to stray. thank you for the discussion
 
this is the case with a lot of pc ports in recent years, it's not specific to rdr 2. i can find more examples

that's why i argued in terms of the 1060 specifically, i have no grudge against amd cards

this is why I also suggest people get rdna2 gpus instead of ampere gpus if they want longevity. in the long run, if you have an archaic architecture that differs a lot from the consoles, you get absurdities like this where you suddenly perform 30% slower with "supposedly" equivalent gpus
It really depends on the game, though. Games like GTA5, Fortnite, Ass Creed Origins and Odyssey (weirdly, not Valhalla), Jedi: Fallen Order, etc. run better on Nvidia hardware.

The "fine wine" argument with regards to AMD cards definitely has some truth to it though.
 

Alexios

Cores, shaders and BIOS oh my!
Oh sweet, I can't get DLSS on my 1080, so this will do. How's it gonna work on June 22nd? Will a game release/update with it then in its own settings, or is it some universal thingie you can somehow enable?

Also, that one image is blurry because it's not ultra quality mode, and it also has motion blur enabled on both sides; that's pretty clearly visible. It probably works out better with motion blur off, and that's the first setting to kill anyway :D
 

mrmeh

Member
The second in-game video has the game at native 1440p, which means the super-resolution side is probably rendering at native 1080p or less. So maybe it will look better at ultra quality, pushing 1440p to 4K.

...Because, to be fair (and I have a Series X, so I want this to work well), that second video made the tech look like dogshit.
 
The second in-game video has the game at native 1440p, which means the super-resolution side is probably rendering at native 1080p or less. So maybe it will look better at ultra quality, pushing 1440p to 4K.

...Because, to be fair (and I have a Series X, so I want this to work well), that second video made the tech look like dogshit.
Pushing native 1440p to 4K with DLSS looks basically perfect and often better than native 4K. It's a very high bar to beat.
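For context on how high that bar is: 1440p to 4K is a 1.5x upscale per axis, meaning the reconstruction has to generate 2.25x the pixels it was given. (This matches what DLSS calls Quality mode at a 4K output.)

```python
# Per-axis and total-pixel scale factors for a 1440p -> 4K upscale.
src_w, src_h = 2560, 1440
dst_w, dst_h = 3840, 2160

axis_scale = dst_w / src_w                        # 1.5x per axis
pixel_scale = (dst_w * dst_h) / (src_w * src_h)   # 2.25x the pixels

print(axis_scale, pixel_scale)  # 1.5 2.25
```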
 

Rikkori

Member
This doesn't even look worth using, sadly. I didn't expect much from them but I honestly fail to see the point of it at all, compared to even just lowering the render res. Basically game over for AMD on PC.

Chloe Kim Reaction GIF by Togethxr
 

SantaC

Member
This doesn't even look worth using, sadly. I didn't expect much from them but I honestly fail to see the point of it at all, compared to even just lowering the render res. Basically game over for AMD on PC.

Chloe Kim Reaction GIF by Togethxr
lol this is version 1.00

it will get improved upon, obviously.
 

Supmate

Neo Member
I just saw the presentation and I have a question.

They demoed Resident Evil Village on a laptop GPU using DX12. Does this mean Resident Evil Village could potentially run at 1440p/60 on Series S and 4K/60 on Series X with FSR enabled from June 22nd, instead of the 45 fps that Capcom published?

For Series S in particular, that would explain a lot about why it has this 1440p mode that ran poorly.
 

Md Ray

Member
I just saw the presentation and I have a question.

They demoed Resident Evil Village on a laptop GPU using DX12. Does this mean Resident Evil Village could potentially run at 1440p/60 on Series S and 4K/60 on Series X with FSR enabled from June 22nd, instead of the 45 fps that Capcom published?

For Series S in particular, that would explain a lot about why it has this 1440p mode that ran poorly.
A couple of things...

They demoed RE8 on a laptop GPU that's more powerful than the Series S GPU.

Series S already uses a form of upscaling, checkerboard rendering, to reach 1440p in both the RT and non-RT modes. So FSR wouldn't make much of a difference, I'd imagine.

Series S had perf issues only in the RT mode, wildly fluctuating between 30 and 60 fps. Without RT it's a locked 60 fps, mostly.

XSX/PS5 also use CB rendering, but to output a 4K image.
 

yamaci17

Member
A couple of things...

They demoed RE8 on a laptop GPU that's more powerful than the Series S GPU.

Series S already uses a form of upscaling, checkerboard rendering, to reach 1440p in both the RT and non-RT modes. So FSR wouldn't make much of a difference, I'd imagine.

Series S had perf issues only in the RT mode, wildly fluctuating between 30 and 60 fps. Without RT it's a locked 60 fps, mostly.

XSX/PS5 also use CB rendering, but to output a 4K image.
yes, series s already uses an upscaling tech, rendering at 1280x1440 and upscaling. and it seems to do a very good job (unlike the PC version, where we got a worse implementation of checkerboarding)

i bet re8 village checkerboarding on consoles would look better than FSR as it stands

FSR could help Valhalla's performance mode on both Xbox consoles, though. But DLSS Quality mode at 1080p looks very bad compared to native 1080p (open to discussion, but I don't think upscaling techniques are meant to work effectively at such low resolutions), so i don't know how FSR's 720p to 1080p would look if it's worse than DLSS
 

assurdum

Banned
There's no "improving" this without a complete re-design, which won't happen for all the obvious reasons; it's also why they didn't do it differently in the first place. The fact that this doesn't use motion vectors at all makes it DOA.
Uh. Algorithms can always improve over time, in both efficiency and precision.
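To make the motion-vector point concrete: without them, an upscaler can only work spatially, from the single current frame. Below is a toy sketch of that class of technique, a plain bilinear upscale followed by an unsharp mask. This is NOT AMD's actual EASU/RCAS algorithm, just an illustration of what a purely spatial pass looks like:

```python
# Toy spatial upscale-and-sharpen pass (illustrative only, not FSR).
# Works on a grayscale image stored as a list of rows of numbers.

def bilinear_upscale(img, scale):
    """Upscale a 2D grayscale image by `scale` using bilinear filtering."""
    h, w = len(img), len(img[0])
    out_h, out_w = round(h * scale), round(w * scale)
    out = []
    for y in range(out_h):
        sy = min(y / scale, h - 1)
        y0 = int(sy); y1 = min(y0 + 1, h - 1); fy = sy - y0
        row = []
        for x in range(out_w):
            sx = min(x / scale, w - 1)
            x0 = int(sx); x1 = min(x0 + 1, w - 1); fx = sx - x0
            top = img[y0][x0] * (1 - fx) + img[y0][x1] * fx
            bot = img[y1][x0] * (1 - fx) + img[y1][x1] * fx
            row.append(top * (1 - fy) + bot * fy)
        out.append(row)
    return out

def sharpen(img, amount=0.5):
    """Unsharp mask using a 4-neighbour box blur (edges clamped)."""
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            blur = (img[max(y - 1, 0)][x] + img[min(y + 1, h - 1)][x] +
                    img[y][max(x - 1, 0)] + img[y][min(x + 1, w - 1)]) / 4
            row.append(img[y][x] + amount * (img[y][x] - blur))
        out.append(row)
    return out

low = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 0, 1, 1]]
up = sharpen(bilinear_upscale(low, 1.5))  # 4x4 -> 6x6
print(len(up), len(up[0]))
```

A temporal method (DLSS, checkerboarding) additionally reprojects samples from previous frames using motion vectors, which is where most of the recovered detail comes from; a spatial pass like the above can only sharpen what is already in the frame.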
 

Moriah20

Member
nowhere near x1x

x1x can brute-force rdr 2 at native 4k and 30 fps

we tried the same x1x settings at 4k and it only rendered 18-20 fps. that makes the one x nearly 50% faster

desktop gpus can't match console gpus properly, a console gpu will always work more efficiently

You are comparing different architectures, and some settings are actually lower than "low" on the X1X, so you can't get a truly direct match.

The RX 580 gets 25-28 fps at similar settings to the X1X. That's very close, because RDR 2 does NOT run at a locked 30 fps on the X1X; it drops in a lot of places/sequences, especially in Saint Denis.

Pascal in particular underperforms in RDR 2.

Console GPUs tend to be more efficient, but the difference is much smaller now than it ever was.
 

Supmate

Neo Member
yes, series s already uses an upscaling tech, rendering at 1280x1440 and upscaling. and it seems to do a very good job (unlike the PC version, where we got a worse implementation of checkerboarding)

i bet re8 village checkerboarding on consoles would look better than FSR as it stands

FSR could help Valhalla's performance mode on both Xbox consoles, though. But DLSS Quality mode at 1080p looks very bad compared to native 1080p (open to discussion, but I don't think upscaling techniques are meant to work effectively at such low resolutions), so i don't know how FSR's 720p to 1080p would look if it's worse than DLSS
Thanks guys, just asking the question because the presentation stated it has to be incorporated into the engine, and it obviously is on PC for RE Village. The engine is DX12, and we know techniques are largely transferable between Xbox and PC. So I suppose Series S and X will potentially have FSR ready to go when it's introduced?

I don't think Valhalla has a game core engine?
 

Rudius

Member
I hope this allows some more "1080p" 60 games on my 1650 laptop. I'm interested in the Microsoft games and a few PC exclusives, since the next gen stuff I'll play on PS5.
 

Rudius

Member
Oh wow, that's very impressive.

I would like to see how well it does in Cyberpunk. DLSS on quality takes my 1440p high settings with RT on from 5 fps to 30+ fps. Balanced and performance go up to 60 fps, but they look way too blurry.
Some speculate Nvidia artificially limits DLSS availability, using the tensor cores as an excuse. If this FidelityFX is comparable, it will be shameful for them.
 

FingerBang

Member
These guys...



Nvidia should blacklist them again. Morons.


Are you serious? They're right. There's a chip shortage; those cards won't improve the current situation, they'll make it worse. They're unnecessary at best.
They criticize EVERYONE, not just Nvidia. You obviously don't follow the channel, and you don't give a shit about the current situation if the only thing you care about is the good name of a multi-billion-dollar company.
 

Papacheeks

Banned
No, it's on a per-game basis. Devs have to go back and implement this in their games, just like they do with DLSS.

The way it's implemented seems the better way of doing this. No database needed. It's open to anyone, unlike DLSS, which only works on specific cards from the RTX 2000 series onward. The fact that this works on a GTX 1060 is really telling about whose solution is more practical.

This is also a first iteration; it took a whole revamp by Nvidia to get DLSS where it is now, and DLSS has been out for years.

I think we'll get more games a lot faster with this than with DLSS currently, since it's also usable on consoles. The profiles they create will carry over to PC and vice versa.
 