
AMD FidelityFX Super Resolution may launch in spring

Reallink

Member
Exactly. The AI part of it now is for the 'fit' of the data - in theory you don't even need tensor cores to see that benefit, you just need the right model for clamping, which is usually done manually for TAA reconstruction already. In practice it doesn't always work so well because not all studios are created equal, but we can see that Ubisoft Massive can do this as well as or better than DLSS 2.0 with The Division 2's TAA. Not to mention that DLSS doesn't always work out so well either, being unable to reconstruct ray-traced reflections in many cases (CP2077, WD:L).

Where Nvidia wins, as always, is marketing. I said it from day 1, back when they had the shitty DLSS 1.0: they win, because people are stupid and want to deceive themselves anyway, so if you just tell them the AI is that good, they'll believe it.

DLSS 1.0 was universally panned though, by anyone Nvidia marketing hoped to reach. The overwhelming consensus was definitely hugely negative, to the point FidelityFX CAS was generally believed to be better, particularly on 1440p > 4K. This is likely the primary reason AMD were caught with their pants down, forced to unveil and sell cards with promises of nondescript features that were nearly a year out.
 
Last edited:

quest

Not Banned from OT
The difference is, and I'm sure someone will gladly correct me if I'm wrong, that Series S|X have a built-in hardware solution for ML whereas Sony doesn't. This could be a factor in the next few years of Digital Foundry console warring.

For normal people, comparison threads have always been pointless. If you have to compare screens side by side and view videos in meticulous detail to see how they differ, then it really doesn't matter which one you play.
We don't know what Sony included for ML, since they are treating the PS5 like a Kremlin secret. I think Sony has INT8 and INT4 like Microsoft. Microsoft might have added something to help process the ML, like extra registers or something - we don't know. The biggest advantage Microsoft will have is pure grunt: roughly 18% more raw processing power they can throw at any ML task to complete it faster.
 

llien

Member
if with DLSS a 3060 will happily play a game at 4K 60fps?
Dude.
Does it not feel a bit silly when you call upscales from 1440p or even 1080p "4K"?

So ML is a thing and will definitely happen on SX/SS and PS5?
ML is a nice buzzword, but what they really, really mean (and what really happens on gaming GPUs) is neural network inference.
"Learning" is the much, MUCH more computationally expensive part that normally happens in datacenters, sometimes on single workstations - but either way, it's not something that can be done at runtime.
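To make that distinction concrete, here's a toy sketch (layer sizes are made up purely for illustration): inference through an already-trained network is just a couple of matrix multiplies and an activation, which is why it's cheap enough to run per-frame on a gaming GPU, while training would need gradients and many passes over data:

```python
import numpy as np

# Toy two-layer network with fixed, already-"learned" weights.
# Inference is just matmuls plus an activation: no gradients,
# no optimizer state, no training loop.
rng = np.random.default_rng(0)
W1 = rng.standard_normal((8, 16))
W2 = rng.standard_normal((16, 4))

def infer(x):
    """Forward pass only -- the cheap part that runs on gaming GPUs."""
    h = np.maximum(x @ W1, 0.0)  # ReLU hidden layer
    return h @ W2

y = infer(rng.standard_normal((1, 8)))
print(y.shape)  # (1, 4)
```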
 

llien

Member
promises of nondescript features that were nearly a year out.
Enabled on a whopping, eh... how many games are out there with that glorified upscaling, by the way?

I've forgotten - how does checkerboard rendering qualify? Is it not an excellent upscaling technique?
Although no "AI" buzzwords were used, I see.

Do you know what it takes to train an AI to give answers to this simple equation:

y = sin(x)

or perhaps for this one, let's be generous:

y = ax^3+bx^2+c

No? Oh well.
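For what it's worth, fitting y = sin(x) is a standard toy exercise; here is a minimal sketch with one hidden layer and plain gradient descent (the layer width, learning rate, and step count are arbitrary choices, nothing authoritative):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(-np.pi, np.pi, size=(256, 1))
y = np.sin(x)

# One hidden layer of 32 tanh units, trained by full-batch gradient descent.
W1 = rng.standard_normal((1, 32)) * 0.5
b1 = np.zeros(32)
W2 = rng.standard_normal((32, 1)) * 0.1
b2 = np.zeros(1)
lr, n = 0.05, len(x)

for _ in range(8000):
    h = np.tanh(x @ W1 + b1)          # hidden activations
    err = h @ W2 + b2 - y             # prediction error
    dh = (err @ W2.T) * (1 - h ** 2)  # backprop through tanh
    W2 -= lr * (h.T @ err) / n
    b2 -= lr * err.mean(axis=0)
    W1 -= lr * (x.T @ dh) / n
    b1 -= lr * dh.mean(axis=0)

mse = float(np.mean((np.tanh(x @ W1 + b1) @ W2 + b2 - y) ** 2))
print(mse)  # small after training: the net approximates sin(x)
```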
 

assurdum

Banned
Hot Chips 2020, during the presentation of the Series X SoC... the slides list "machine learning inference acceleration", where they clearly point to image upscaling. On the second slide, under hardware SoC innovation, it clearly says "machine learning acceleration".
No offence, but you should read point by point what was actually said at Hot Chips 2020. I assure you there isn't any special hardware inside the Series X for ML or whatever they named it. It's all done by the CUs, just as on the PS5.
 
Last edited:

Greeno

Member
No offence, but you should read point by point what was actually said at Hot Chips 2020. I assure you there isn't any special hardware inside the Series X for ML or whatever they named it. It's all done by the CUs, just as on the PS5.

But there actually is. Maybe the Hot Chips slides were not clear on the matter, but Microsoft did end up clearly mentioning it in this article:


by saying "we have gone even further introducing additional next-generation innovation such as hardware accelerated Machine Learning capabilities for better NPC intelligence, more lifelike animation, and improved visual quality via techniques such as ML powered super resolution".
 

THE DUCK

voted poster of the decade by bots
If this ends up supported by either console, it really could be a game changer in terms of overall performance. Guess we will see!
 

Schmick

Member
Dude.
Does it not feel a bit silly when you call upscales from 1440p or even 1080p "4K"?
It does not matter whether it is full 4K or not - it's the end result that counts, and across the board DLSS is generally praised. So no, I don't think it's silly. What's silly is chasing full 4K at the cost of detail and FPS when there's no need for it. DLSS is now at version 2.0, a marked improvement over 1.0, and it's only going to get better.
 

Mister Wolf

Member


The bar has been set - when 25% of 4K looks better and runs better than 65% of 4K that's also using a fancy sharpening filter.
 
Last edited:

assurdum

Banned
But there actually is. Maybe the Hot Chips slides were not clear on the matter, but Microsoft did end up clearly mentioning it in this article:


by saying "we have gone even further introducing additional next-generation innovation such as hardware accelerated Machine Learning capabilities for better NPC intelligence, more lifelike animation, and improved visual quality via techniques such as ML powered super resolution".
Please at least care to read what you quoted, because it doesn't mention anything specific inside the Series X hardware built just for ML or super resolution, as in, for example, an Nvidia GPU. It's all part of the RDNA2 CUs, nothing more, nothing less. I invite you, like many other people who persist with this, to try to understand what you read: saying MS specifically worked to make the CUs better suited to ML acceleration doesn't mean they rebuilt the hardware inside the CUs; it simply means they updated their software to improve ML acceleration via the CUs. I thought that was obvious from their panels.
 
Last edited:

MonarchJT

Banned
Please at least care to read what you quoted, because there isn't anything inside the Series X hardware specifically built just for ML or super resolution, as in, for example, an Nvidia GPU. It's all part of the RDNA2 CUs, nothing more, nothing less. I invite you, like many other people, to try to understand what you read, because saying MS specifically worked to adapt the CUs for ML acceleration doesn't mean they rebuilt the CUs in hardware; it simply means they updated their libraries to work better with the CUs.
My God, did you read what I wrote? They made the CUs compatible with FP8, INT4 and INT8. It's clear to everyone that any advantage the Xbox may have over the PlayStation bothers you, but at this point you're being totally ridiculous. Stop that.
 

assurdum

Banned
My God, did you read what I wrote? They made the CUs compatible with FP8, INT4 and INT8. It's clear to everyone that any advantage the Xbox may have over the PlayStation bothers you, but at this point you're being totally ridiculous. Stop that.
And this is a completely different argument. CUs compatible with FP8 are not exactly the same as exclusive dedicated hardware for ML and super resolution. Though I doubt the PS5 lacks it too, considering it's hardware backward compatible with the Pro. But that's another story.
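To illustrate what low-precision "compatibility" actually buys for inference: network weights can be quantized to 8-bit integers and the dot products run in integer math with little accuracy loss, which is where the throughput gain comes from. A rough sketch of a common symmetric quantization scheme (nothing console-specific, just an illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
w = rng.standard_normal(64).astype(np.float32)  # fp32 "trained" weights
x = rng.standard_normal(64).astype(np.float32)  # an input vector

# Symmetric per-tensor quantization of the weights down to int8.
scale = float(np.abs(w).max()) / 127.0
w_q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)

# On hardware the dot product runs in int8 with int32 accumulation;
# here we rescale back to float to compare against the fp32 result.
ref = float(w @ x)
dequant = w_q.astype(np.float32) * scale
err = abs(ref - float(dequant @ x))
print(err)  # quantization barely moves the result
```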
 
Last edited:

MonarchJT

Banned
And this is a completely different argument. Though I doubt the PS5 lacks it too, considering it's hardware backward compatible with the Pro.
The PS4 Pro only had FP16, which is not the same thing, and there's still no proof it's there in the PS5, since it isn't necessary for backward compatibility at all (at the time nobody used low precision). In any case there's no proof of compatibility with FP16... nor FP8, INT4 or INT8. Stop that.
 
Last edited:

assurdum

Banned
The PS4 Pro only had FP16, which is not the same thing, and there's still no proof it's there in the PS5, since it isn't necessary for backward compatibility at all (at the time nobody used low precision). In any case there's no proof of compatibility... nor FP16, FP8, INT4 or INT8. Stop that.
You really think having FP8 is the same as having dedicated hardware for ML acceleration? No offence, but do you have a single clue what you're talking about? And what does the PS5 have to do with what I was saying earlier? Honestly, I doubt it's missing FP8 compatibility, but even if it were, it isn't that important.
 
Last edited:

MonarchJT

Banned
You really think having FP8 is the same as having dedicated hardware for ML acceleration? No offence, but do you have a single clue what you're talking about?
This is the answer you deserve:
I don't know either whether there are tensor cores for it, but if I'm supposed to trust Cerny about the fantastic super-duper-fast I/O of the PS5, and games still load faster on the XSX, why can't I trust Microsoft saying they have hardware for it?
 
Last edited:

assurdum

Banned
This is the answer you deserve:
I don't know either whether there are tensor cores for it, but if I'm supposed to trust Cerny about the fantastic super-duper-fast I/O of the PS5, and games still load faster on the XSX, why can't I trust Microsoft saying they have hardware for it?
Trust whatever you want, but with all respect, don't talk about things you barely understand. MS has never said the Series X has hardware dedicated exclusively to ML. And faster I/O doesn't necessarily mean faster game loads. But I doubt you really care to understand what we're talking about.
 
Last edited:
Is the ML on the chips for Series X/S like the hidden dual GPU on the One? I believe MisterXMedia was the #1 proponent of that rumor... jk boys, just messing with y'all!

In all seriousness... is there anything stopping Xbox from coming up with their own res/AI/scaling solution that would take advantage of that hardware? Is there a good chance Xbox/MS is already working on their own solution? Who knows when AMD will have their version ready, or whether it will be good, aka on par with DLSS. Last I heard, MS has some pretty good software guys. I mean, don't both consoles already use some kind of scaling? They seem pretty good so far.
 
Last edited:

MonarchJT

Banned
Trust whatever you want, but with all respect, don't talk about things you barely understand. MS has never said the Series X has hardware dedicated exclusively to ML.
lol. After "things you barely understand" you don't deserve an answer.
MS said it, and it's in the SoC... warrior, go to sleep.

I won't continue this convo - go search for yourself.
There's an interview directly from the Xbox hardware architect.
 
Last edited:

assurdum

Banned
lol. After "things you barely understand" you don't deserve an answer.
MS said it, and it's in the SoC... warrior, go to sleep.

I won't continue this convo - go search for yourself.
There's an interview directly from the Xbox hardware architect.
Man, you sell FP8 as if it were a sort of tensor core. Clearly you read the MS panels (quite vague and imprecise, to be honest) as if they were the holy grail of graphics tech, but you don't reflect at all on the real meaning of the things they're talking about. It's clear as daylight. And you feel offended when someone tries to explain that it isn't exactly as you claimed. :/
 
Last edited:

Genx3

Member
The Xbox Series X/S has hardware customizations to accelerate ML, which increased the size of the GPU, as stated by MS engineers.

Those are facts.
 

assurdum

Banned
and fact can be backed:

I give up. Seriously. Thank God... Sony fanboys are supposed to be the worst here? The level of absurdity I read every time we talk about tech stuff - I've lost count. And every time it's the same videos, the same panels, the same arguments completely misinterpreted; the Series X hardware doesn't have a dedicated chip for ML.
 
Last edited:
ML rendering is specific to Xbox; the competition is essentially using the same GPU, however, and could/will come up with a comparable ML solution. In fact, the onus of such a solution may be on AMD to provide. Still, the competition should find it perplexing that Microsoft has from the onset built ML rendering into the DX framework for Series X from the ground up, whereas the competition is left depending on their own coding knowledge and AMD's GPU implementation.

This shouldn't be a problem in the long run, however.
 
Last edited:

MonarchJT

Banned
I give up. Seriously. Thank God... Sony fanboys are supposed to be the worst here? The level of absurdity I read every time we talk about this stuff - I've lost count. And every time the same videos, the same panels, which never once said the Series X hardware has a dedicated chip for whatever you're trying to spread. Unbelievable.
If the PS5 presentation contains no reference or words from Sony or Cerny about VRR, VRS, mesh shaders and trillions of other things, they still exist, because "they can have their own version of everything!!!" - but when the official Hot Chips 2020 presentation of the hardware SoC innovation (made for those who understand) clearly says "machine learning acceleration" on an official slide, it clearly doesn't exist and we're inventing everything... because we must believe you.
 
Last edited:

assurdum

Banned
If the PS5 presentation contains no reference or words from Sony or Cerny about VRR, VRS, mesh shaders and trillions of other things, they still exist, because "they can have their own version of everything!!!" - but when the presentation of the hardware SoC innovation clearly says "machine learning acceleration" on an official slide, it clearly doesn't exist and we're inventing everything... because we must believe you.
And why do you keep bringing up the PS5? Why... you really can't live without the console war, can you?
 
Last edited:
You really think having FP8 is the same as having dedicated hardware for ML acceleration? No offence, but do you have a single clue what you're talking about? And what does the PS5 have to do with what I was saying earlier? Honestly, I doubt it's missing FP8 compatibility, but even if it were, it isn't that important.

Honestly, he's a troll who will shill for free. 95% of his posts are promoting imaginary Xbox features and spreading PS5 FUD.
 

MonarchJT

Banned
I can talk about possible deficiencies that really hurt me but are not related to the PS5, believe me.
I sincerely hope so, and I sincerely hope no deficiency afflicts you - but if you want us to take you seriously, stop always assuming the worst for one side and the best for the other. It's clear we have different preferences, but we can discuss what both camps offer us in a civil manner.
 

assurdum

Banned
Can you go to PM?
Data is presented for everybody to decide for himself, and your overly extreme biased opinions are also pretty clear.
I don't know, maybe I can't explain it better - maybe that's my limitation - but honestly I don't know how else to explain that the hardware inside the Series X most of you are talking about is not the same as having, let's say, fully dedicated hardware resources for ML processing. It's a very minimal customisation. That's my last post on the argument.
 
Last edited:

martino

Member
I don't know, maybe I can't explain it better - maybe that's my limitation - but honestly I don't know how else to explain that the hardware inside the Series X most of you are talking about is not the same as having, let's say, fully dedicated hardware resources for ML processing. It's a very minimal customisation. That's my last post on the argument.
It's a small area cost, but I wouldn't call 3 to 10x performance gains for those kinds of operations small...
 

assurdum

Banned
It's a small area cost, but I wouldn't call 3 to 10x performance gains for those kinds of operations small...
I wouldn't take that 3-to-10x estimate too seriously. It's very easy to push a manipulative narrative when we don't know what such a statement really means. I would laugh in the face of Sony or anyone else who made it without concrete data to prove it.
 
Last edited:

martino

Member
I wouldn't take that 3-to-10x estimate too seriously. It's very easy to push a manipulative narrative when we don't know what such a statement really means. I would laugh in the face of Sony or anyone else who made it without concrete data to prove it.
You're right - they added it for nothing but PR.
 
And this is a completely different argument. CUs compatible with FP8 are not exactly the same as exclusive dedicated hardware for ML and super resolution. Though I doubt the PS5 lacks it too, considering it's hardware backward compatible with the Pro. But that's another story.
But my friend, even if the PS5 does have it, it only has 36 CUs while the Series X has 52 CUs - what do you think that means?
 

assurdum

Banned
But my friend, even if the PS5 does have it, it only has 36 CUs while the Series X has 52 CUs - what do you think that means?
What exactly are you trying to argue? Why do you think 36 CUs are not enough? I don't know, maybe with more CUs it could have some performance advantage - it's hard to quantify - but I don't think it will cause problems on the PS5.
 
Last edited:

Reallink

Member
Enabled on a whopping, eh... how many games are out there with that glorified upscaling, by the way?

I've forgotten - how does checkerboard rendering qualify? Is it not an excellent upscaling technique?
Although no "AI" buzzwords were used, I see.

Do you know what it takes to train an AI to give answers to this simple equation:

y = sin(x)

or perhaps for this one, let's be generous:

y = ax^3+bx^2+c

No? Oh well.
I'm not even sure what you're trying to say here. The Pro's checkerboard upscaling was great, but zero PC games support it, and it's objectively inferior to DLSS 2.0 (see Death Stranding). Idgaf how the end results are achieved, but if I'm paying $800-$1000 for a GPU, I want at least some illusion of future-proof features. Nvidia's two or three dozen games are certainly a lot more concrete than AMD's magician gesture toward some as-yet ethereal super resolution.
 

assurdum

Banned
I don't argue, my friend; let's just say the odds are stacked against the PS5.
Ok, let me reverse the argument. A friend tells me 560 GB/s for 52 CUs is not enough to feed them all properly - never mind the particular RAM configuration: Infinity Cache isn't there, custom cache neither. You think there will be zero consequences for ML performance?
 
Last edited:

MonarchJT

Banned
Ok, let me reverse the argument. A friend tells me 560 GB/s for 52 CUs is not enough to feed them all properly - never mind the particular RAM configuration. You think there will be zero consequences for ML performance?
Your friend is objectively wrong.
 

assurdum

Banned
your friend is objectively wrong
Oh no, it's not. The Series X should have at minimum the same bandwidth per CU as the PS5 to offer a proportionate leap in performance. It's simple math. And the CUs on the PS5 run at higher frequencies. Keep in mind I'm not even invested in the argument, but there's a reason the PS5 is designed the way it is. Does that mean the Series X can't offer better performance? No, but the gap may not be as big as the CU count would suggest.
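For reference, the per-CU arithmetic with the public headline numbers (560 GB/s is the Series X's fast 10 GB pool; 448 GB/s is the PS5's published bandwidth, a figure not quoted above) works out as:

```python
# Headline public specs: Series X fast 10 GB pool vs PS5's unified pool.
xsx_bw_gbps, xsx_cus = 560.0, 52   # GB/s, active CUs
ps5_bw_gbps, ps5_cus = 448.0, 36

print(round(xsx_bw_gbps / xsx_cus, 2))  # 10.77 GB/s per CU
print(round(ps5_bw_gbps / ps5_cus, 2))  # 12.44 GB/s per CU
```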
 
Last edited:

MonarchJT

Banned
Oh no, it's not. The Series X should have at minimum the same bandwidth per CU as the PS5 to offer a proper leap. It's simple math.
The RX 6800 has 60 CUs... 512 GB/s.
The RX 6800 XT has 72 CUs... the same 512 GB/s.
The PS5 isn't as powerful as the 6800 XT... and the XSX has more bandwidth than the 6800 XT with far fewer CUs (52 vs 72).
Your friend told you total bullshit.
 
Last edited: