
AMD FidelityFX Super Resolution may launch in spring

assurdum

Banned
RX 6800: 60 CUs... 512 GB/s
RX 6800 XT: 72 CUs... the same 512 GB/s
The PS5 isn't as powerful as the 6800 XT... and the XSX has quite a bit more bandwidth than the 6800 XT with far fewer CUs (52 vs 72).
Your friend told you bullshit.
AMD GPU has infinity cache. Don't forget it.
 

MonarchJT

Banned
AMD GPU has infinity cache. Don't forget it.
I don't think Infinity Cache (however much it can alleviate the bandwidth pressure) can make up for a 20 CU difference (and still lower bandwidth):
PS5: 36 CUs, no Infinity Cache, 448 GB/s
XSX: 52 CUs, no Infinity Cache, 560 GB/s
6800 XT: 72 CUs, Infinity Cache, 512 GB/s

Your friend told you the XSX doesn't have enough bandwidth?
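A quick back-of-the-envelope on the raw memory bandwidth available per CU, using the numbers quoted in the thread (this deliberately ignores Infinity Cache hit rates, clock speeds, and everything else that actually matters):

```python
# Raw memory bandwidth per compute unit for each part.
# Figures are the ones quoted in the thread.
gpus = {
    "PS5":     {"cus": 36, "bw_gbps": 448},
    "XSX":     {"cus": 52, "bw_gbps": 560},
    "6800 XT": {"cus": 72, "bw_gbps": 512},
}

for name, g in gpus.items():
    per_cu = g["bw_gbps"] / g["cus"]
    print(f"{name}: {per_cu:.1f} GB/s per CU")
# PS5: 12.4, XSX: 10.8, 6800 XT: 7.1 GB/s per CU
```

On these raw numbers the 6800 XT has the least bandwidth per CU, which is exactly the gap Infinity Cache is meant to paper over.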
 

assurdum

Banned
I don't think Infinity Cache (however much it can alleviate the bandwidth pressure) can make up for a 20 CU difference (and still lower bandwidth):
PS5: 36 CUs, no Infinity Cache, 448 GB/s
XSX: 52 CUs, no Infinity Cache, 560 GB/s
6800 XT: 72 CUs, Infinity Cache, 512 GB/s

Your friend told you the XSX doesn't have enough bandwidth?
Because you don't understand how CUs work. A high CU count is pointless if the CUs aren't fed with enough data. The PS5 already has more data per CU than the Series X, is faster at processing it, and its custom cache frees up bandwidth. Yeah, maybe more CUs will give the Series X a nod in performance, but I doubt it will be enormous considering they run slower and with less data.
 

MonarchJT

Banned
Because you don't understand how CUs work. A high CU count is pointless if the CUs aren't fed with enough data. The PS5 already has more data per CU than the Series X, is faster at processing it, and its custom cache frees up bandwidth. Yeah, maybe more CUs will give the Series X a nod in performance, but I doubt it will be enormous considering they run slower and with less data.
Bored... it really must be for this reason that in the latest DF tests the GPU showed up to a 36% advantage despite its supposed lack of bandwidth. Tell your friend to re-study how CUs and GPUs work, and you should follow him. I tried to be nice, but you keep on with "because you don't know" and "because you don't understand" while the evidence clashes with everything you write. Bye, last reply.
 

assurdum

Banned
Bored... it really must be for this reason that in the latest DF tests the GPU showed up to a 36% advantage despite its supposed lack of bandwidth. Tell your friend to re-study how CUs and GPUs work, and you should follow him. I tried to be nice, but you keep on with "because you don't know" and "because you don't understand" while the evidence clashes with everything you write. Bye, last reply.
Ignore list then. Goodbye
 
Because you don't understand how CUs work. A high CU count is pointless if the CUs aren't fed with enough data. The PS5 already has more data per CU than the Series X, is faster at processing it, and its custom cache frees up bandwidth. Yeah, maybe more CUs will give the Series X a nod in performance, but I doubt it will be enormous considering they run slower and with less data.
Stop you're killing me, I can't breathe 😂
 

ToTTenTranz

Banned
The difference is, and I'm sure someone will gladly correct me if I'm wrong, that the Series S|X has a built-in hardware solution for ML whereas Sony's doesn't. This could be a factor in the next few years of Digital Foundry console warring.

For normal people, comparison threads have always been pointless. If you have to compare screens side by side and view videos in meticulous detail to see how they differ, then it really doesn't matter which one you play on.
AFAIK the PS5 uses RDNA2 ALUs that should support the same 4x INT8 and 8x INT4 throughput as both the Xbox series and Navi 2x PC GPUs.

The only features Microsoft claimed exclusivity on in the console space were Sampler Feedback Streaming, which seems like the same IO granularity that Cerny mentioned in Road to PS5, and VRS.
The PS5 may not use Microsoft's specific VRS method but they definitely have hardware support for other forms of foveated rendering, considering the sheer number of patents that Cerny et al registered on the subject throughout the past 4 years or so.
 

99Luffy

Banned
I don't think AMD needs something that's DLSS 2.0 quality. They just need something on the same level as, or a bit better than, CBR. We've all seen the DF videos, and yes, DLSS does look great, but at the end of the day I don't think I've seen anyone complain about Miles Morales' image quality. If CBR were as universal on PC as it is on console, I doubt anyone would care about DLSS.
It should also be mentioned that DLSS Quality, despite leveraging tensor cores (which take up something like 20% of the GPU die space), actually performs about 25% worse than native 1440p. All those tensor cores, and the shaders are still taking a 25% hit? Hmm.
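Taking the quoted 25% figure at face value, the gap is easy to frame as frame time: DLSS Quality at 4K renders internally at roughly 1440p plus a fixed reconstruction pass, so comparing against native 1440p isolates what that pass costs. A sketch with a hypothetical 10 ms baseline:

```python
# Hypothetical frame-time sketch using the thread's "25% worse" figure.
native_1440p_ms = 10.0                     # assume native 1440p takes 10 ms/frame
dlss_quality_ms = native_1440p_ms * 1.25   # "25% worse" => 12.5 ms
reconstruction_cost_ms = dlss_quality_ms - native_1440p_ms
print(f"{reconstruction_cost_ms} ms spent on the reconstruction pass")  # 2.5 ms
```

So the "hit" isn't the shaders slowing down; it's a roughly fixed upscale cost added on top of the lower-resolution render.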
 

JimboJones

Member
I don't think AMD needs something that's DLSS 2.0 quality. They just need something on the same level as, or a bit better than, CBR. We've all seen the DF videos, and yes, DLSS does look great, but at the end of the day I don't think I've seen anyone complain about Miles Morales' image quality. If CBR were as universal on PC as it is on console, I doubt anyone would care about DLSS.
It should also be mentioned that DLSS Quality, despite leveraging tensor cores (which take up something like 20% of the GPU die space), actually performs about 25% worse than native 1440p. All those tensor cores, and the shaders are still taking a 25% hit? Hmm.
DLSS 2.0 has shown better image quality than CBR, but if AMD were able to implement CBR at the driver level I'd be pretty happy.
 

YCoCg

Member
CBR wouldn't be as much of a performance gain though; it only halves one axis and reconstructs, whereas DLSS operates on both. If they can do an advanced version of CBR that takes both axes into account and then cleans up the image further, I'm sure most would be happy.
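The one-axis-vs-both-axes point can be put in numbers. For a 3840x2160 target, checkerboard rendering typically shades a 1920x2160 buffer (half the pixels), while DLSS Quality renders 2560x1440 (two thirds of each axis, so about 44% of the pixels):

```python
target = (3840, 2160)
cbr = (1920, 2160)           # checkerboard: one axis halved
dlss_quality = (2560, 1440)  # DLSS Quality: both axes scaled by 2/3

def pixel_fraction(res, full=target):
    """Fraction of the target's pixels actually rendered."""
    return (res[0] * res[1]) / (full[0] * full[1])

print(f"CBR renders {pixel_fraction(cbr):.0%} of target pixels")            # 50%
print(f"DLSS Quality renders {pixel_fraction(dlss_quality):.0%}")           # 44%
```

Scaling both axes wins even more at DLSS's Performance preset (1920x1080, i.e. 25% of the pixels), which is where the larger framerate gains come from.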
 

MonarchJT

Banned
AFAIK the PS5 uses RDNA2 ALUs that should support the same 4x INT8 and 8x INT4 throughput as both the Xbox series and Navi 2x PC GPUs.

The only features Microsoft claimed exclusivity on in the console space were Sampler Feedback Streaming, which seems like the same IO granularity that Cerny mentioned in Road to PS5, and VRS.
The PS5 may not use Microsoft's specific VRS method but they definitely have hardware support for other forms of foveated rendering, considering the sheer number of patents that Cerny et al registered on the subject throughout the past 4 years or so.

From the xbox architect himself

"We knew that many inference algorithms need only 8-bit and 4-bit integer precisions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."

So apart from the sensational sound of it (that's clear), is he talking nonsense and have they added nothing? Because talking to some of you, it seems like whatever was customized on the Series X is "irrelevant"... "software"... "PR"...
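For what it's worth, the TOPS figures in the Goossen quote line up with the Series X's FP32 rating multiplied by RDNA 2's packed-math rates (4x for INT8, 8x for INT4). A quick sanity check, assuming the commonly cited 12.15 TFLOPS FP32 peak:

```python
fp32_tflops = 12.15           # Series X peak FP32 compute (commonly cited figure)
int8_tops = fp32_tflops * 4   # 4x throughput with packed INT8 ops
int4_tops = fp32_tflops * 8   # 8x throughput with packed INT4 ops
print(int8_tops, int4_tops)   # 48.6 and 97.2, matching the quoted ~49 / 97 TOPS
```

So the quoted numbers describe packed integer math on the shader ALUs rather than separate dedicated ML hardware.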



Instead, this was from Leonardi, the lead graphics engineer, regarding the ALUs:

I'm sorry to bring up a tweet from Blue Nugroho (I don't like him)... but I remember that when Leonardi wrote this, it was talked about everywhere from Eurogamer to Beyond3D.
Obviously, having created a media case, Leonardi tried to fix it by immediately releasing another statement (because obviously it was putting the PS5 in a bad light), in which he said that the GPU in the PS5 was unique and had additions and deductions... such as ray tracing (added), but was ultimately part of the RDNA 2 generation. Yet there is no sign as of today that it supports ML acceleration at the hardware level beyond the FP16 already present in the PS4 Pro.

"RDNA 2 is a commercial theme to simplify the market, otherwise GPUs with completely random features would come out and it would be difficult for the average user to choose,"
"For example, support for ray tracing is not present in any AMD GPU currently on the market. (...) The PlayStation 5 GPU is unique, it cannot be classified as RDNA 1, 2, 3 or 4."

 
Unfortunately I can't find anything about Sony, the PS5, or whether they have an API for ML (if anyone finds something, please post it), but here is an interesting video for those who have never seen it. I think MS co-developed the super-resolution tech so they could release it inside their own DX12U, and collaborated with AMD to make it compatible with all RDNA 2 GPUs in any case. From an article:

"While AMD clearly announced the feature as part of FidelityFX technology, Tom Warren claims that this technology will be open and cross-platform. It is hard to imagine NVIDIA supporting FidelityFX supersampling, so we assume that this technology will be part of Microsoft's DirectML technology.

Xbox Series X/S also features ML inference acceleration. The AI 'tensor' cores require a very small area of the die, while providing a 3x-10x performance improvement, a slide from Microsoft claimed.

DirectML super-resolution is a Microsoft technology that was demonstrated back in 2019 during the Game Developers Conference. It can provide a higher framerate and lower latency compared to TensorFlow, which was not designed for real-time super-resolution."



XSX/XSS doesn't have Tensor cores.

They have mixed precision packed math for INT ops.

It's a standard feature for RDNA2.

It's also considerably less performant than NVidia's Tensor cores at AI inferencing. It's not even close.
 

MonarchJT

Banned
XSX/XSS doesn't have Tensor cores.

They have mixed precision packed math for INT ops.

It's a standard feature for RDNA2.

It's also considerably less performant than NVidia's Tensor cores at AI inferencing. It's not even close.
It doesn't have tensor cores... it seems that's not standard, at least not on consoles.
 

samjaza

Member
Every TV already has DLSS. It is called motion smoothing and it is far better than DLSS
Have you used DLSS? It is different from the basic hardware interpolation done on TVs. A more direct comparison would be Nvidia NGX, but I don't know of any video players that use it.
 

MonarchJT

Banned
AMD doesn't have a single GPU with anything resembling Tensor cores, so I don't know why anyone would think MS magically has Tensor cores in the XSX/XSS.
It doesn't have tensor cores, I wrote that already... it seems that while full RDNA 2 supports it in the ALUs, the PS5 may not have it, at least if we trust the PS5's principal graphics engineer... for this reason MS is shouting their support for it from the rooftops, and for this reason it is not standard on consoles.
 

M1chl

Currently Gif and Meme Champion
I don't think Infinity Cache (however much it can alleviate the bandwidth pressure) can make up for a 20 CU difference (and still lower bandwidth):
PS5: 36 CUs, no Infinity Cache, 448 GB/s
XSX: 52 CUs, no Infinity Cache, 560 GB/s
6800 XT: 72 CUs, Infinity Cache, 512 GB/s

Your friend told you the XSX doesn't have enough bandwidth?
That bandwidth is for the whole system, not just the GPU, which needs to be taken into consideration.
 

M1chl

Currently Gif and Meme Champion
DDR4-3200 caps at what, 25.6 GB/s? DDR4-4400... 35 GB/s. It's easy to understand that there is enough bandwidth for CPU+GPU...
Sure, but that's not the only thing you have to account for. Another is how many operations you can start at any given time, because you only have so many lanes available at once, and that sort of thing. This is why consoles are bound in something as essential as 16x AF, which has been standard on PC for 20 years and on consoles is still seldom used, even though its effect is pretty big.
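Those DDR4 figures follow directly from transfer rate times bus width; a single DDR4 channel is 64 bits (8 bytes) wide:

```python
def ddr_bandwidth_gbps(mt_per_s, bus_bits=64, channels=1):
    """Peak DDR bandwidth: transfers/s x bytes per transfer x channels."""
    return mt_per_s * (bus_bits / 8) * channels / 1000  # MT/s -> GB/s

print(ddr_bandwidth_gbps(3200))               # 25.6 GB/s, single channel
print(ddr_bandwidth_gbps(4400))               # 35.2 GB/s, single channel
print(ddr_bandwidth_gbps(3200, channels=2))   # 51.2 GB/s, dual channel
```

Note these are single-channel peaks; a typical dual-channel PC doubles them, which is the configuration the CPU actually sees.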
 

quest

Not Banned from OT
It doesn't have tensor cores, I wrote that already... it seems that while full RDNA 2 supports it in the ALUs, the PS5 may not have it, at least if we trust the PS5's principal graphics engineer... for this reason MS is shouting their support for it from the rooftops, and for this reason it is not standard on consoles.
Thanks for your couple of posts. I'm really shocked Sony skipped INT8 and INT4 support to save some die space. But I guess when these designs were drawn up, ML and DLSS were not a thing, though the engineers at Microsoft had ideas for ML even back then.
 

perkelson

Member
Have you used DLSS? It is different from the basic hardware interpolation done on TVs. A more direct comparison would be Nvidia NGX, but I don't know of any video players that use it.

It is the same thing. But instead of taking frames 1:1 and adding frames in between based on the previous and next image, DLSS takes a low-res render, blows it up to high res, and displays that image, giving you extra FPS. Both are image-processing techniques, and just because one uses AI doesn't make it much different from normal image processing.

The main benefit of DLSS is supposed to be a higher framerate, not higher resolution. Which means in the end both achieve the same thing, but DLSS can't double, triple, or quadruple the framerate at will like TV motion smoothing can.

So it is better to run actual 4K at 30 fps and interpolate it to 120 fps than to run the game at 1440p and blow it up to 4K for that "smooth" 50 fps.

The only problem here is that TV content is not latency sensitive, and the video decoders/encoders in TVs are super slow because they don't need to be fast and snappy, so they add an extra 30-100 ms of latency. With proper GPU handling it should be only minimally slower than the Vsync option, as I doubt producing 2-4 or even 16 interframes between real frames should take longer than 1-2 ms.
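One caveat to the paragraph above: even with an instant interpolator, interpolation has a latency floor that processing speed can't remove, because placing a frame between two real frames means waiting for the later one to exist. A sketch, assuming a 30 fps source and the post's optimistic 2 ms interpolator:

```python
source_fps = 30
frame_interval_ms = 1000 / source_fps  # ~33.3 ms between real frames

# To interpolate between frame N and N+1 you must hold frame N
# until N+1 arrives, so display lags by at least one source interval,
# plus whatever the interpolator itself takes.
interpolator_cost_ms = 2  # optimistic GPU-side estimate from the post
min_added_latency_ms = frame_interval_ms + interpolator_cost_ms
print(f"{min_added_latency_ms:.1f} ms minimum added latency")  # ~35.3 ms
```

That one-frame wait, not decoder speed alone, is why interpolation suits video better than games, while upscalers like DLSS add no such wait since they only use the current (and past) frames.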
 
Finally we'll get to see whether AMD's solution lives up to the hype and how it can compete with DLSS.
It would also be interesting to know whether the consoles will use it, and whether the Xbox Series X|S will be able to take advantage of the modifications made to the CUs of its GPU to support machine learning.


Skeptical about this.

Didn't we already get confirmation from AMD that their Super Resolution solution won't be based on machine learning?


Recently, Rick Bergman hinted at a new RDNA 2 GPU feature dubbed FSR, or FidelityFX Super Resolution, which happens to be AMD's response to Nvidia's DLSS technique, though it will work in an entirely different way.

Many have been wondering whether AMD will also use a similar AI approach for this super-sampling technique, but the answer to this is a "no".
 

IntentionalPun

Ask me about my wife's perfect butthole
Skeptical about this.

Didn't we already get confirmation from AMD that their Super Resolution solution won't be based on machine learning?

That guy is assuming it won't use ML because it's cross-platform. I think, at least; he never explains why he so confidently says "no."

Read the next sentence, lol: "Even though AMD has not shared any technical details on its upscaling tech yet".
 

IntentionalPun

Ask me about my wife's perfect butthole
Should also be mentioned that DLSS quality despite leveraging tensor cores(which take up like 20% of the gpu die space) actually performs 25% worse than native 1440p. All those tensor cores and the shaders are still taking a 25% hit?? hmm.

I don't think there is any mystery to this; DLSS 2.0 is largely based on traditional upscaling techniques. It uses ML to improve the end result.

It's why they no longer train on individual games, because all they are doing is final enhancements to images.
 

RockOn

Member
From the xbox architect himself

"We knew that many inference algorithms need only 8-bit and 4-bit integer precisions for weights and the math operations involving those weights comprise the bulk of the performance overhead for those algorithms," says Andrew Goossen. "So we added special hardware support for this specific scenario. The result is that Series X offers 49 TOPS for 8-bit integer operations and 97 TOPS for 4-bit integer operations. Note that the weights are integers, so those are TOPS and not TFLOPs. The net result is that Series X offers unparalleled intelligence for machine learning."

So apart from the sensational sound of it (that's clear), is he talking nonsense and have they added nothing? Because talking to some of you, it seems like whatever was customized on the Series X is "irrelevant"... "software"... "PR"...



Instead, this was from Leonardi, the lead graphics engineer, regarding the ALUs:

I'm sorry to bring up a tweet from Blue Nugroho (I don't like him)... but I remember that when Leonardi wrote this, it was talked about everywhere from Eurogamer to Beyond3D.
Obviously, having created a media case, Leonardi tried to fix it by immediately releasing another statement (because obviously it was putting the PS5 in a bad light), in which he said that the GPU in the PS5 was unique and had additions and deductions... such as ray tracing (added), but was ultimately part of the RDNA 2 generation. Yet there is no sign as of today that it supports ML acceleration at the hardware level beyond the FP16 already present in the PS4 Pro.

"RDNA 2 is a commercial theme to simplify the market, otherwise GPUs with completely random features would come out and it would be difficult for the average user to choose,"
"For example, support for ray tracing is not present in any AMD GPU currently on the market. (...) The PlayStation 5 GPU is unique, it cannot be classified as RDNA 1, 2, 3 or 4."


That's nonsense; INT4/INT8 has been in RDNA since v1.1. MS added fuck-all. The PS5, which is RDNA 2, and other RDNA 2 GPUs have all the features that AMD added, which includes
[attached image: 2srHZfd.png]
accelerators (e.g. for machine learning), all part of the new enhanced CUs and ALUs.
 

MonarchJT

Banned
That's nonsense; INT4/INT8 has been in RDNA since v1.1. MS added fuck-all. The PS5, which is RDNA 2, and other RDNA 2 GPUs have all the features that AMD added, which includes
[attached image: 2srHZfd.png]
accelerators (e.g. for machine learning), all part of the new enhanced CUs and ALUs.
I'm sure you know more about the PS5's CUs and ALUs than a PS5 principal graphics engineer.
 

RockOn

Member
I'm sure you know more about the PS5's CUs and ALUs than a PS5 principal graphics engineer.
Try reading the RDNA 1 / RDNA 2 whitepapers; it's in there in black and white for all to read. Yet some still insist the PS5 is RDNA 1 (which is complete bullshit). If the PS5 were RDNA 1, then why does AMD mention all the consoles (including the PS5) in their RDNA 2 material on their website? Exactly: because the PS5 is RDNA 2.

Seems like you're all lapping up the MS buzzwords and MS PR bullshit.
 

MonarchJT

Banned
Try reading the RDNA 1 / RDNA 2 whitepapers; it's in there in black and white for all to read. Yet some still insist the PS5 is RDNA 1 (which is complete bullshit). If the PS5 were RDNA 1, then why does AMD mention all the consoles (including the PS5) in their RDNA 2 material on their website? Exactly: because the PS5 is RDNA 2.

Seems like you're all lapping up the MS buzzwords and MS PR bullshit.
No one is saying the PS5 is RDNA 1, lol; the PS5 is a mix of things. I'm going by a lot of things: the devkit timing, the AMD leak, the die shot itself, the analyses of the die shot made by experts who have been doing them for years like Locuza, Leviathan and all the others, Digital Foundry, David Cage's words from Quantic Dream, the words of the PS5 principal graphics engineer Leonardi, and above all the fact that no one launching a console would gloss over something that could bring hype like machine learning, now that people know what it is thanks to DLSS. Especially if those glossing over it are the same people who until recently were shouting about FP16 and rapid packed math. You are free to believe what you want, of course.
 

Rikkori

Member
So we either get Testosterone + FSR or we get nothing.
No... either the name is Test or it has FSR. Of course, he could be lying or playing some other semantics game; after all, it doesn't say "only one of the following two sentences is true". 🤷‍♂️
 

Ascend

Member
No... either the name is Test or it has FSR. Of course, he could be lying or playing some other semantics game; after all, it doesn't say "only one of the following two sentences is true". 🤷‍♂️
Actually, that's exactly what it says. What else can the statement "only one sentence is true" mean when there are two sentences below it?

And note that one of the sentences is that it does NOT feature FSR... So if we assume that only one sentence can be true, this means:

Testosterone: true -> No FSR: false = Testosterone + FSR
No FSR: true -> Testosterone: false = No FSR + No testosterone
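The case split above is just a two-variable consistency check; it can be enumerated mechanically. Here T and F are hypothetical propositions standing for "it is named Testosterone" and "it features FSR":

```python
from itertools import product

# Sentence 1: "it is named Testosterone"  -> t
# Sentence 2: "it does NOT feature FSR"   -> not f
for t, f in product([True, False], repeat=2):
    s1, s2 = t, not f
    if s1 != s2:  # "only one sentence is true" = exactly one holds
        print(f"Testosterone={t}, FSR={f}")
# Prints the two consistent worlds:
#   Testosterone=True, FSR=True
#   Testosterone=False, FSR=False
```

Which reproduces the conclusion: either the name is Testosterone and it has FSR, or neither.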
 

ToTTenTranz

Banned

How could FSR "come in a driver" if FSR isn't driver-enforceable?

It's a tech that will be enabled in a number of games coming out in the future, as it becomes part of FidelityFX. "Launching FSR" in a driver would be just a marketing move.
 

Ascend

Member
Any chance of this coming to Ryzen APU?
It depends on which architectures it works on. If it works on GCN as well, it should work on all APUs. If it's only for RDNA, that's another story. I think they would love to market it on laptops too, though, so I expect it to work at least on Vega.
 

Kataploom

Gold Member
It depends on which architectures it works on. If it works on GCN as well, it should work on all APUs. If it's only for RDNA, that's another story. I think they would love to market it on laptops too, though, so I expect it to work at least on Vega.
Hope so, I'm on a Vega 8 till I get a better GPU someday... unless that day comes soon; I'm feeling tempted by the second-hand market.
 