
AMD’s Equivalent to DLSS, FidelityFX Super Resolution Is Coming to PS5 & Xbox Series Consoles

LordOfChaos

Member
This is going to be really nice for the 9th-gen consoles. I thought it was a waste to target native 4K when 1440p plus a next-gen reconstruction upscaler would look pretty much identical unless you were going cross-eyed pixel peeping; you can always spend those GPU resources somewhere else that makes more visual impact.
 

Krisprolls

Banned
It's advertised as black magic more than it really is. And make sure to read the comments:



DLSS is useful if you want to play at 4K with a weak card. It is not better than native. It does not make RT more useful, and just like pretty much all other proprietary nVidia tech, it will end up either dead, or being replaced by an open source alternative.


Yes, I don't know why it's praised like the second coming of Jesus. It's a nice reconstruction technique, nothing more. It makes the image quality worse; it's a compromise... the kind of compromise PC gamers told us was bad for years. But now it's OK to pay $2000 and get suboptimal image quality because it's the latest Nvidia buzzword. Go figure.

Fast forward to 2.0 and, ooh, so cool.
Except it's basically TAA with some NN post-processing on top, with all the strengths and weaknesses of... (ta-daa)... surely you've guessed it: TAA.

Yes, that's hilarious. Of course reconstruction to 4K looks pretty good, but that's true for most reconstruction techniques used today; it's nothing specific to DLSS. Heck, that's why nobody plays at native 4K, it's a waste.
 

//DEVIL//

Member
It's advertised as black magic more than it really is. And make sure to read the comments:



DLSS is useful if you want to play at 4K with a weak card. It is not better than native. It does not make RT more useful, and just like pretty much all other proprietary nVidia tech, it will end up either dead, or being replaced by an open source alternative.

So much salt in here. DLSS is no joke? Increasing frame rates with, most of the time, better-than-native image quality is bad? Okay, lol.

Go play Cyberpunk on a 6800 XT with ray tracing on. You enjoy those 30 frames, MAYBE, while I'm enjoying 60. The same will go for Control and many others.

Pathetic, really. It's not even funny.
 

Ascend

Member
So much salt in here. DLSS is no joke? Increasing frame rates with, most of the time, better-than-native image quality is bad? Okay, lol.

Go play Cyberpunk on a 6800 XT with ray tracing on. You enjoy those 30 frames, MAYBE, while I'm enjoying 60. The same will go for Control and many others.

Pathetic, really. It's not even funny.
haters gonna hate GIF
 

MrSec84

Member
It might not be possible, depending on how it is implemented. For example, if it uses VRS in any way, shape, or form, that automatically rules out the 5700 XT.
Software-based VRS has been implemented in the latest Call of Duty by Treyarch and Raven Software, and machine learning support is present from RDNA1 onwards. From a software perspective, programmers could write tools that pack smaller integers into larger registers (e.g., four 4-bit values into one 16-bit register, as sketched below), which would work on older GPU hardware like the PS4 Pro and Xbox One X, along with any AMD GPU.

I don't see any reason why Nvidia, AMD, or of course Intel in the future couldn't make this work on a whole host of older and future graphics cards, and APUs for that matter.
Newer tech would just be much more efficient at it because the hardware is built with it in mind.
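
For what it's worth, here's the kind of packing I mean as a toy C++ sketch; my own illustration of the general idea, not any actual AMD API or shader code:

```cpp
#include <cstdint>
#include <cstdio>

// Toy illustration: four 4-bit integers stored in one 16-bit value
// (4 x 4 = 16), the kind of packing a shader could do in software on
// hardware without native sub-16-bit types. Not any real AMD API.
uint16_t pack4x4(uint8_t a, uint8_t b, uint8_t c, uint8_t d) {
    return static_cast<uint16_t>((a & 0xF) | ((b & 0xF) << 4) |
                                 ((c & 0xF) << 8) | ((d & 0xF) << 12));
}

uint8_t unpack4(uint16_t packed, int lane) {  // lane in 0..3
    return (packed >> (lane * 4)) & 0xF;
}

int main() {
    uint16_t w = pack4x4(1, 2, 3, 4);
    for (int lane = 0; lane < 4; ++lane)
        printf("lane %d = %d\n", lane, unpack4(w, lane));
}
```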
 

marquimvfs

Member
Meh at those "super resolution techniques". I wanna run the new games at full resolution and quality. And shame on everyone who thinks DLSS is better than native; maybe it's time to see an ophthalmologist.
 

LordOfChaos

Member
Meh at those "super resolution techniques". I wanna run the new games at full resolution and quality. And shame on everyone who thinks DLSS is better than native; maybe it's time to see an ophthalmologist.

Not better... but 3840x2160 is 2.25x as many pixels to fill as 2560x1440 (8,294,400 vs. 3,686,400), and with good upscaling it looks very damn close unless you're pixel peeping up close. Does it look 2.25x better, or would I rather spend that GPU power elsewhere? For me it's the latter.
 

hlm666

Member
so much salt in here. DLSS is no joke ? increasing the frames with most of the time better than native image quality is bad ? okay lol.

go play cyberpunk with 6800 xt ray tracing on. you enjoy that 30 frames MAYBE while i am enjoying at 60 frames. The same will go for Control and many others.

pathatic really it's not even funny.

Gonna need a time machine to get that 6800 running with DXR on; the good news is that while you're in the future, you can also bring back Super Resolution.
 
Last edited:

marquimvfs

Member
Not better... but 3840x2160 is 2.25x as many pixels to fill as 2560x1440 (8,294,400 vs. 3,686,400), and with good upscaling it looks very damn close unless you're pixel peeping up close. Does it look 2.25x better, or would I rather spend that GPU power elsewhere? For me it's the latter.
If you need those cheats to play at the full resolution of your TV/monitor, maybe you should choose one with a smaller resolution to begin with. Native is always best...
 
Last edited:

LordOfChaos

Member
If you need those cheats to play at the full resolution of your TV/monitor, maybe you should choose one with a smaller resolution to begin with. Native is always best...

4K is a standard TV resolution, 1440p isn't, not much choice there, but that doesn't mean we have to squander twice as much console GPU power on 4K when 1440p with upscaling looks nearly as good from a distance. Even on PC, if it's a big performance saver with marginal difference vs. native quality, I don't see why you're so fussed about it, especially for ray tracing, which can drag every card down. 1440p with DLSS quality upscaling is still going to look better on a 4K monitor than 1440p on a native 1440p monitor.
 

//DEVIL//

Member
Gonna need a time machine to get that 6800 running with DXR on; the good news is that while you're in the future, you can also bring back Super Resolution.
What? The 6800 can run ray tracing in Cyberpunk; it's the lack of DLSS that kills it.

Do you know what's funny? I own a 6800 XT and love it; wouldn't trade it for anything else. I don't want a 10 GB card that will be EOL in a couple of months when the 3080 Ti is out.

But I do give credit where it's due: AMD needs to step up their game soon.

Do you know how I got my 6800 XT MSI Trio? Some idiot wanted to trade me his card for my 3070 FTW3 (without me paying an extra cent).

How can I say no to such a deal? lol
 

hlm666

Member
What? The 6800 can run ray tracing in Cyberpunk; it's the lack of DLSS that kills it.

Do you know what's funny? I own a 6800 XT and love it; wouldn't trade it for anything else. I don't want a 10 GB card that will be EOL in a couple of months when the 3080 Ti is out.

But I do give credit where it's due: AMD needs to step up their game soon.

Do you know how I got my 6800 XT MSI Trio? Some idiot wanted to trade me his card for my 3070 FTW3 (without me paying an extra cent).

How can I say no to such a deal? lol

When did the update from AMD/CDPR come that enabled DXR? It was disabled by AMD at launch and I hadn't seen anything about it since. Got some benchmark links? Curious to see if it's any better than other heavy RT games.
 

//DEVIL//

Member
When did the update from AMD/CDPR come that enabled DXR? It was disabled by AMD at launch and I hadn't seen anything about it since. Got some benchmark links? Curious to see if it's any better than other heavy RT games.
I don't even own the game. I was talking in general, based on how DLSS performs. I didn't know DXR isn't enabled on Cyberpunk. WTF?

But anyway, Control has ray tracing enabled on the 6800/XT... and yeah, the performance is horrible.
 

marquimvfs

Member
4K is a standard TV resolution, 1440p isn't, not much choice there,
Yes, you're right. But see, that's part of the problem. That's why I still play on a 1080p panel, and people in this very forum call me stupid for it.

but that doesn't mean we have to squander twice as much console GPU power on 4K
Yes, it does...

1440p with upscaling looks nearly as good from a distance.
Depends on the distance you're talking about. It's not nearly as good as you're painting it.
I don't see why you're so fussed about it, especially for ray tracing, which can drag every card down.
That's another part of the problem. Instead of improving the raw power of their cards to make them truly 4K-ready, Nvidia deliberately keeps investing in gimmicks that people insist on treating as the new holy grail, just like every other proprietary crap they invented before. The problem this time is that, while none of the players on the market had the raw power to do proper ray tracing, they truly invested in it thinking it's the future. Maybe it is, once it can be done properly; not now. Not when none of the high-end cards and consoles on the market can play games at an acceptable performance in the new "standard resolution".
That is one cause of the fuss; the other is people who insist on defending those practices.
1440p with DLSS quality upscaling is still going to look better on a 4K monitor than 1440p on a native 1440p monitor.
Lol, no. Not even in your sweetest wet dream, especially in motion, considering a 1440p and a 4K panel of comparable IQ.
 
Last edited:

Ivan

Member
This makes even more sense on consoles IMO... Can't wait to see the results of good teams focusing on this.
 

skit_data

Member
Temper your expectations, people: if it comes to consoles, it will not be equivalent to Nvidia's DLSS. It may be a nice solution that produces "good enough" results for the majority of people, but anything closely resembling DLSS requires a lot of silicon that we know for a fact does not exist in any of the consoles' APUs.
 

Elios83

Member
AMD's solution is basically a smart sharpening filter. It is platform agnostic and not machine learning based.
There are no dedicated tensor cores in current AMD GPUs, so in any case, even if with different degrees of efficiency, there would be a cost associated with using the feature; presumably it has been calculated to be far less than the cost of rendering at the higher resolution FidelityFX is trying to approximate.
I hope this is a success, but right now there are too many unknowns, including how many artifacts this solution creates in games.
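
If it really is in the family of AMD's existing CAS filter, the core trick is contrast-adaptive sharpening. Here's a simplified, single-channel C++ sketch of the idea; my own illustration, not AMD's actual shader, which works on full RGB and is considerably more refined:

```cpp
#include <algorithm>

// Simplified single-channel sketch of contrast-adaptive sharpening:
// sharpen each pixel against its 4 neighbours, dialing the effect back
// where local contrast is already high. Illustration only.
// 'amount' is assumed to be in [0, 1].
float sharpenPixel(const float* img, int w, int h, int x, int y, float amount) {
    auto at = [&](int px, int py) {
        return img[std::clamp(py, 0, h - 1) * w + std::clamp(px, 0, w - 1)];
    };
    float c = at(x, y);
    float n = at(x, y - 1), s = at(x, y + 1);
    float e = at(x + 1, y), o = at(x - 1, y);

    float lo = std::min({c, n, s, e, o});
    float hi = std::max({c, n, s, e, o});
    float contrast = hi - lo;  // ~0 in flat areas, ~1 on hard edges

    // Negative ring weight, kept above -0.25 so the kernel stays
    // normalizable; low-contrast areas get the strongest sharpening.
    float k = -amount * 0.2f * (1.0f - contrast);
    float result = (c + k * (n + s + e + o)) / (1.0f + 4.0f * k);
    return std::clamp(result, 0.0f, 1.0f);
}
```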
 
Last edited:

MonarchJT

Banned
AMD's solution is basically a smart sharpening filter. It is platform agnostic and not machine learning based.
There are no dedicated tensor cores in current AMD GPUs, so in any case, even if with different degrees of efficiency, there would be a cost associated with using the feature; presumably it has been calculated to be far less than the cost of rendering at the higher resolution FidelityFX is trying to approximate.
I hope this is a success, but right now there are too many unknowns, including how many artifacts this solution creates in games.
Of course, Elios, of course. :) I would like to find some of your posts about PS4 Pro FP16.
 

llien

Member
go play cyberpunk with 6800 xt ray tracing on.
I remember comparing RT on vs. off in CP2077.
And... no, it wasn't like one would easily pick one over the other.
Sometimes "on" looked outright bad (and was countered by imaginary "daylight" coming through the doors).

In these pics, I could not remember which one had RT on (I only remembered that it was always on the same side):

BQ5l4be.png


rQSNcLt.png


And then it struck me, on the second screenshot:

It's the blurry one


So why have it on at all?
 
Last edited:

FireFly

Member
That's another part of the problem. Instead of improving the raw power of their cards to make them truly 4K-ready, Nvidia deliberately keeps investing in gimmicks that people insist on treating as the new holy grail, just like every other proprietary crap they invented before. The problem this time is that, while none of the players on the market had the raw power to do proper ray tracing, they truly invested in it thinking it's the future. Maybe it is, once it can be done properly; not now. Not when none of the high-end cards and consoles on the market can play games at an acceptable performance in the new "standard resolution".
That is one cause of the fuss; the other is people who insist on defending those practices.
The last generation of consoles extensively used temporal upscaling, and this generation it's being used in both ray tracing and non-ray tracing modes. UE5 looks to be natively targeting 1440p even with a non-traced solution for GI, since geometry loads scale with the number of pixels. So even if ray tracing hardware wasn't included with consoles, we would still be seeing developers target non-native resolutions.

My suggestion is that developers do this, not because someone is pointing a gun to their collective heads, but because they feel it represents the best visual tradeoff. In that context, having a cleaner looking upscaling solution available hardly seems like a terrible thing.
 
Last edited:

Hunnybun

Member
Yes, I'm a strong proponent of reconstruction. But the brand-based fervour about certain solutions is almost reaching mass hysteria. Reconstruction has been a reality since last gen; there are options already, some better than others. It's splitting hairs.

I suppose it depends on how good reconstruction is. I freely admit that checkerboarding is fine for me. I've got a 65 inch OLED and I can't honestly tell the difference between that and native 4k.

But DLSS gets a better result than that even from 25% of the pixels (1920x1080 reconstructed to 3840x2160), so if this new AMD solution got something close to that, then we could be looking at a significant performance boost for the same IQ.

If that allows us to have, say, RT at 60fps and "4k" then that would be pretty awesome.
 

Shmunter

Member
I suppose it depends on how good reconstruction is. I freely admit that checkerboarding is fine for me. I've got a 65 inch OLED and I can't honestly tell the difference between that and native 4k.

But DLSS gets a better result than that even from 25% of the pixels (1920x1080 reconstructed to 3840x2160), so if this new AMD solution got something close to that, then we could be looking at a significant performance boost for the same IQ.

If that allows us to have, say, RT at 60fps and "4k" then that would be pretty awesome.
Can't disagree with more efficiency. If we can get better results from a lower res, all the better. We'll see how it plays out.
 

Deleted member 17706

Unconfirmed Member
I remember comparing RT on vs. off in CP2077.
And... no, it wasn't like one would easily pick one over the other.
Sometimes "on" looked outright bad (and was countered by imaginary "daylight" coming through the doors).

In these pics, I could not remember which one had RT on (I only remembered that it was always on the same side):

BQ5l4be.png


rQSNcLt.png


And then it struck me, on the second screenshot:

It's the blurry one


So why have it on at all?

Wow... you're still at this?
 

Armorian

Banned
I remember comparing RT on vs. off in CP2077.
And... no, it wasn't like one would easily pick one over the other.
Sometimes "on" looked outright bad (and was countered by imaginary "daylight" coming through the doors).

In these pics, I could not remember which one had RT on (I only remembered that it was always on the same side):





And then it struck me, on the second screenshot:

It's the blurry one


So why have it on at all?

Something to learn here:

 

Zathalus

Member
Temper your expectations, people: if it comes to consoles, it will not be equivalent to Nvidia's DLSS. It may be a nice solution that produces "good enough" results for the majority of people, but anything closely resembling DLSS requires a lot of silicon that we know for a fact does not exist in any of the consoles' APUs.
This is simply not true. DLSS 1.9 ran on standard shaders and was not that far off, quality-wise, from DLSS 2.0. AMD's equivalent can at least be better than DLSS 1.9, as it can take advantage of the lower-precision INT capabilities of RDNA.

It may not beat DLSS 2.0, but it can come damn close.
 

99Luffy

Banned
Go play Cyberpunk on a 6800 XT with ray tracing on. You enjoy those 30 frames, MAYBE, while I'm enjoying 60. The same will go for Control and many others.

Pathetic, really. It's not even funny.
Do you know what's funny? I own a 6800 XT and love it; wouldn't trade it for anything else. I don't want a 10 GB card that will be EOL in a couple of months when the 3080 Ti is out.

Do you know how I got my 6800 XT MSI Trio? Some idiot wanted to trade me his card for my 3070 FTW3 (without me paying an extra cent).

How can I say no to such a deal? lol
Your multiple personalities are fighting each other.
 

RockOn

Member
*RDNA2

RDNA1 does not support INT instructions.
Yes, FidelityFX Super Resolution is coming to PC/Series S/X & PS5. AMD openML is open source. PS5 is RDNA 2 with enhanced CUs (support for mixed integer is standard). Hardware-accelerated machine learning will be supported with updated PS5 APIs (GNM & GNMX).
 

marquimvfs

Member
My suggestion is that developers do this, not because someone is pointing a gun to their collective heads, but because they feel it represents the best visual tradeoff. In that context, having a cleaner looking upscaling solution available hardly seems like a terrible thing.
That's a statement I can agree with, especially in the case of consoles and lower-end GPUs. But that shouldn't be the case for high-end GPUs. Imagine if, from now on, every game needed those gimmicks to run at full quality and effects even on high-end GPUs... and the fanboys couldn't stop repeating "hur dur DLSS is better than native, must have DLSS".
 

//DEVIL//

Member
Your multiple personalities are fighting each other.
Nope, not really. I love my card; I have nothing against it. But AMD's software is lacking; no shame in admitting the truth. And if I have X amount of money to spend on a video card, without question I would go the Nvidia route for whatever that X money can get me.

But when that X money gets me a 3070 (because I can't get a 3080 thanks to the stupid prices) and someone offers me a 6800 XT that is on the same level as a 3080... I will take it.

Does AMD's software suck? Yeah, it does. But that doesn't mean their card is shit.
 

Omni_Manhatten

Neo Member
Yes, FidelityFX Super Resolution is coming to PC/Series S/X & PS5. AMD openML is open source. PS5 is RDNA 2 with enchanced CUs(support for mixed integer is standard)Hardware accelerated Machine Learning will be supported with updated PS5 APIs(GNM & GNMX)
Well, Cerny only said they have the same support for accelerated RT hardware; he never once said they accelerated the hardware for ML. It was Microsoft who made that claim. The algorithms Sony can use for Super Resolution can be trained to run effectively at FP16 with an API from AMD. Devs will be able to use the same ones across the board to avoid having to use different methods, and that saves money and time. If they want to use DirectML from Xbox, they will need to use that API, which will have its algorithms trained through the work with the Azure and Nvidia collaboration. You can look into the AI work they did, and it makes sense why Xbox would want to combine that tech into its next-gen systems. AMD SR aims to be a one-stop solution for all devs. Imagine they use the AMD version across all consoles for a predictable effect within a budget.
 

MrSec84

Member
*RDNA2

RDNA1 does not support INT instructions.
RDNA1 has it in the whitepaper for the architecture: some of the CUs can scale below 16-bit precision on the integer side, down to INT8 and INT4.


"Some variants of the dual compute unit expose additional mixed-precision dot-product modes in the ALUs, primarily for accelerating machine learning inference. A mixed-precision FMA dot2 will compute two half-precision multiplications and then add the results to a single-precision accumulator. For even greater throughput, some ALUs will support 8-bit integer dot4 operations and 4-bit dot8 operations, all of which use 32-bit accumulators to avoid any overflows."
 

Rikkori

Member
My bet is it will be TAA reconstruction à la The Division 2's, plus sharpening, and maybe tier-2 VRS as an added bonus rather than mandatory. People don't understand what the tensor cores are actually used for in DLSS 2, and the answer is: something that can be replaced by a manually tuned model requiring no tensor cores at all (in the end it's just the history-clamping part). People talking about A.I. up and down don't understand how little it is actually being used.
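
For anyone wondering what "the clamping part" refers to: TAA-style reconstruction reprojects the previous frame's result and clamps that history sample to the local neighbourhood of the current frame to reject stale, ghosting-prone data. A bare-bones single-channel C++ sketch of that step; my own illustration, not any shipping implementation:

```cpp
#include <algorithm>

// Bare-bones single-channel TAA resolve: blend the reprojected history
// sample with the current frame, after clamping history to the min/max
// of the current pixel's 3x3 neighbourhood. The clamp is the step a
// trained model can replace or refine; the rest is plain TAA.
float taaResolve(const float* cur, int w, int h, int x, int y,
                 float history, float blend /* e.g. 0.9f */) {
    float c = cur[y * w + x];
    float lo = c, hi = c;
    for (int dy = -1; dy <= 1; ++dy)
        for (int dx = -1; dx <= 1; ++dx) {
            float v = cur[std::clamp(y + dy, 0, h - 1) * w +
                          std::clamp(x + dx, 0, w - 1)];
            lo = std::min(lo, v);
            hi = std::max(hi, v);
        }
    // Heavy weight on (clamped) history accumulates samples over time.
    return blend * std::clamp(history, lo, hi) + (1.0f - blend) * c;
}
```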
 

vkbest

Member
God of War used an impressive checkerboard upscale, but the technology feels old now. The Red Dead 2 checkerboard was atrocious. Refinement is necessary. Expected now, even.

"Sharpening" reminds me of the old cathode days when sharpening made everything worse than when you started, so I just get nervous whenever I see references.

That being said, DLSS 2.0 has minor issues with ghosting, but it's otherwise pretty astonishing. But it isn't coming to PS5 or XSX. Ever. I get that. Dedicating the entirety of the CPU to an alternative is counterintuitive. I get that too. There's got to be a half-decent parity option in the pipeline, though.
RDR 2 is not using checkerboard rendering.
 

Armorian

Banned
At what? Asking a person who hyped RT in CP77 to show what he was hyped about; was that what you asked?

I showed you the DF video where they analyze the RT effects in this game. I hope you watched it, because the differences are quite big.
 

Deleted member 17706

Unconfirmed Member
I showed you the DF video where they analyze the RT effects in this game. I hope you watched it, because the differences are quite big.

I think he's going to stick to his low-res screenshots and continue to bask in ignorance.
 

ErRor88

Member
Jason Ronald on ML in the Series X. Also, I believe that when he says "it's beyond", he's referring to DirectML being used for AI, super-resolution, and other things.



Full interview:
 