
AMD Working on an AI-powered FSR

winjer

Gold Member

AMD CTO Mark Papermaster confirmed that AMD is working on a new upscaling technology that leverages AI. A key technological difference between AMD FSR and competing solutions such as NVIDIA DLSS and Intel XeSS has been AMD's remarkable restraint in implementing AI in any part of the upscaler's pipeline. Unlike FSR, both DLSS and XeSS use AI deep neural networks (DNNs) to overcome temporal artifacts in their upscalers. AMD's Radeon RX 7000 series GPUs and Ryzen 7000 CPUs are its first with accelerators or ISA extensions that speed up AI workloads; and with the RX 7000 series capturing a sizable install base, AMD is finally turning to AI for the next generation of its FSR upscaling tech. Papermaster highlighted his company's plans for AI in upscaling technologies in an interview with No Priors.

To a question by No Priors on exploring AI for upscaling, Papermaster responded: "2024 is a giant year for us because we spent so many years in our hardware and software capabilities for AI. We have just completed AI-enabling our entire portfolio, so you know cloud, edge, PCs, and our embedded devices, and gaming devices. We are enabling gaming devices to upscale using AI and 2024 is a really huge deployment year." In short, Papermaster walked the interviewer through the two-step, hardware-first process by which AMD is getting into AI.

[GIF: "it's about time" (StarCraft, Blizzard Entertainment)]


The question now is which GPUs will be supported. If AMD enables a DP4a path like XeSS does, RDNA2 GPUs could also use this tech, including the PS5 and Xbox Series.
If not, it could mean support only for RDNA3 or later GPUs, including the PS5 Pro.
 

Bojji

Member





The question now is which GPUs will be supported. If AMD enables a DP4a path like XeSS does, RDNA2 GPUs could also use this tech, including the PS5 and Xbox Series.
If not, it could mean support only for RDNA3 or later GPUs, including the PS5 Pro.

The PS5 doesn't support DP4a (just like the 5700 XT), as far as I know.

 

winjer

Gold Member
The PS5 doesn't support DP4a (just like the 5700 XT), as far as I know.


The thing is, it has never been confirmed by official sources.
Since the PS5 is a mix of RDNA1 and RDNA2, it may well be the case that it lacks DP4a support.
But that still leaves the Series consoles and the whole RDNA2 range on PC.
 

Bojji

Member
The thing is, it has never been confirmed by official sources.
Since the PS5 is a mix of RDNA1 and RDNA2, it may well be the case that it lacks DP4a support.
But that still leaves the Series consoles and the whole RDNA2 range on PC.

Of course, but the same was said about Mesh Shaders and VRS, and now we know for sure that it doesn't support them.

It's RDNA1 with ray tracing.
 

Loxus

Member
I'm confused. Aren't AI cores needed for this?

From the article:
AMD spent 2022-23 introducing ISA-level AI enablement for Ryzen 7000 desktop processors and EPYC "Genoa" server processors. For notebooks, it introduced the Ryzen 7040 series and 8040 series mobile processors with NPUs (accelerated AI enablement), and it gave its Radeon RX 7000 series RDNA 3 GPUs AI accelerators.
 

Xyphie

Member
The PS5 GPU not supporting DP4a has been visible in AMD's LLVM code on GitHub for years. Some PS fans are in super cope mode about it though, because they can't accept that PS5 GPU = Navi 10 with RT cores.

We know with 100% certainty that gfx1013 is the PS5 GPU, and anyone can see in line 218 that gfx1013 has the same feature set as gfx1010 (Navi 10, 5700 XT et al), while gfx1011 and gfx1012 do have the dot-product features (the "dot1-insts" through "dot7-insts" features).
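
For anyone unfamiliar with what DP4a actually does: it multiplies four packed pairs of 8-bit integers and accumulates the results into a 32-bit integer in a single instruction, which is what makes int8 inference paths (like XeSS's DP4a mode) cheap. A minimal scalar sketch of what the instruction computes, purely for illustration (not AMD's or anyone's actual implementation):

```cpp
#include <cstdint>
#include <cstdio>

// Scalar illustration of DP4a semantics: four packed signed 8-bit products
// accumulated into a 32-bit integer. GPUs with DP4a (or the RDNA "dot" target
// features mentioned above) do this in one instruction; GPUs without it have
// to fall back to wider (e.g. 16-bit) math, which is slower.
int32_t dp4a(uint32_t a_packed, uint32_t b_packed, int32_t acc)
{
    for (int i = 0; i < 4; ++i)
    {
        int8_t a = static_cast<int8_t>((a_packed >> (8 * i)) & 0xFF);
        int8_t b = static_cast<int8_t>((b_packed >> (8 * i)) & 0xFF);
        acc += int32_t(a) * int32_t(b);
    }
    return acc;
}

int main()
{
    uint32_t a = 0x04030201; // packed bytes 1, 2, 3, 4
    uint32_t b = 0x08070605; // packed bytes 5, 6, 7, 8
    std::printf("%d\n", dp4a(a, b, 0)); // 1*5 + 2*6 + 3*7 + 4*8 = 70
}
```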
 

winjer

Gold Member
I played a bit of RoboCop on a 6800 and TSR performed just slightly worse than FSR2 but looked much better. They clearly fucked up; TSR isn't using any AI and it's definitely better. I think that Insomniac's reconstruction is also better than FSR.

Epic has a head start of many years in creating temporal upscalers.
Remember that UE4 already had TAAU in 4.19, back in early 2018, before DLSS2 and way before FSR2.

Also, TSR has a nice trick up its sleeve. In more recent versions of UE5, r.TSR.History.ScreenPercentage is set to 200 by default.
This means the image is reconstructed at a much higher resolution and then downscaled afterwards. It's probably the main reason why it looks better than FSR2, but also the reason why it's heavier.
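
If anyone wants to play with this, these are standard UE5 console variables (set via the in-game console or DefaultEngine.ini); the values below are only illustrative, and defaults differ between engine versions:

```ini
[SystemSettings]
; 4 = TSR (Temporal Super Resolution) in UE5
r.AntiAliasingMethod=4
; render at roughly 67% of the output resolution
r.ScreenPercentage=67
; accumulate the TSR history at 2x the output resolution, then downscale
; (the "trick" described above; higher quality, but heavier)
r.TSR.History.ScreenPercentage=200
```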
 

AJUMP23

Gold Member
I was listening to a podcast that said MS is working on providing devs a single line of code that allows them to use FSR or DLSS without all the effort they have to put in now for support. It would be nice if it just happened, and developers did not have to invest many resources into supporting these features.
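
Just to illustrate the idea (hypothetical names only, not Microsoft's or NVIDIA Streamline's actual API): such a layer would boil down to the game handing over the inputs every temporal upscaler already needs, with a runtime-selected backend (DLSS, FSR, XeSS) doing the rest behind one interface:

```cpp
#include <cstdio>
#include <memory>

// Inputs common to DLSS, FSR2/3 and XeSS: color, depth, motion vectors, jitter.
struct UpscaleInputs {
    const float* color = nullptr;
    const float* depth = nullptr;
    const float* motionVectors = nullptr;
    float jitterX = 0.f, jitterY = 0.f;
    int renderWidth = 0, renderHeight = 0;
    int outputWidth = 0, outputHeight = 0;
};

// One interface; each vendor would ship its own backend behind it.
struct IUpscaler {
    virtual ~IUpscaler() = default;
    virtual void evaluate(const UpscaleInputs& in, float* output) = 0;
};

// Stand-in backend; a real plugin would wrap DLSS, FSR or XeSS here.
struct NullUpscaler : IUpscaler {
    void evaluate(const UpscaleInputs& in, float*) override {
        std::printf("upscaling %dx%d -> %dx%d\n",
                    in.renderWidth, in.renderHeight,
                    in.outputWidth, in.outputHeight);
    }
};

// Runtime vendor/GPU detection would go here.
std::unique_ptr<IUpscaler> createBestAvailableUpscaler() {
    return std::make_unique<NullUpscaler>();
}

int main() {
    auto upscaler = createBestAvailableUpscaler();
    UpscaleInputs in{};
    in.renderWidth = 1280;  in.renderHeight = 720;
    in.outputWidth = 3840;  in.outputHeight = 2160;
    upscaler->evaluate(in, nullptr); // the "one line" the game calls per frame
}
```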
 

winjer

Gold Member
If it cannot be open source then so be it.

It not being open source might even have a secondary advantage.
Today, most devs just build the FSR2 code directly into their games.
But if they are forced to use a .dll, like XeSS and DLSS do, then it means users can also update these files for better image quality.

Open source is the "I'm vegan!" of the software world. Nobody cares.

Admittedly, AMD's obsession with open source is a bit strange.
But open source software has given us a ton of great stuff.
Linux is the basis for most of the internet's infrastructure. Could you imagine how bad things would be if we had to rely on Windows?
Chromium is also open source, and it's the basis for most browsers today: Chrome, Brave, Thorium, Opera, etc.
 

Buggy Loop

Member
I was listening to a podcast that said MS is working on providing devs a single line of code that allows them to use FSR or DLSS without all the effort they have to put in now for support. It would be nice if it just happened, and developers did not have to invest many resources into supporting these features.

So Nvidia Streamline (open-source!)


But not made by Nvidia, because while Intel signed on and "Hardware vendor #3" was clearly Nvidia inviting AMD to join, AMD played hard to get.
Maybe Microsoft can do it.

This could have happened 2 years ago if AMD wasn't so fucking stubborn.

"Bu bu but while Nvidia's container is open-source, DLSS it not open-source!"

[GIF: Iron Man eye roll]


It not being open source might even have a secondary advantage.
Today, most devs just build the FSR2 code directly into their games.
But if they are forced to use a .dll, like XeSS and DLSS do, then it means users can also update these files for better image quality.

Indeed

Admittedly, AMD's obsession with open source is a bit strange.
But open source software has given us a ton of great stuff.
Linux is the basis for most of the internet's infrastructure. Could you imagine how bad things would be if we had to rely on Windows?
Chromium is also open source, and it's the basis for most browsers today: Chrome, Brave, Thorium, Opera, etc.

Yes, of course I'm exaggerating a bit by putting all open source into one basket and calling it useless. What I mean is that if you sell hardware that is always benchmarked on visual quality and performance, and you're hindering your product by ignoring potential solutions because you insist on being open source, then it's a dumb move.

AMD has to put open source to the side. Start by having a good solution first and then think about open source later. Intel did (will?) the same with XeSS, as it is promised to be open source some day.
 

AJUMP23

Gold Member
So Nvidia Streamline (open-source!)


But not made by Nvidia, because while Intel signed on and "Hardware vendor #3" was clearly Nvidia inviting AMD to join, AMD played hard to get.
Maybe Microsoft can do it.

This could have happened 2 years ago if AMD wasn't so fucking stubborn.

"Bu bu but while Nvidia's container is open-source, DLSS it not open-source!"

[GIF: Iron Man eye roll]


Nvidia probably sees open source as a way to capture market share.
 

Fbh

Member
Oh no. Now devs will be like
"With this with can put 4K raytraced reflections on the eyes of the background animals and insects in the environment, then we just render the game at 240p and upscale!!"
 
Oh no. Now devs will be like
"With this with can put 4K raytraced reflections on the eyes of the background animals and insects in the environment, then we just render the game at 240p and upscale!!"
Maybe these new games will finally be able to hold a steady 30FPS in those town areas.
 

amigastar

Member
That's what I had planned 👍
Well, choosing a graphics card is rather easy: there are the RTX 4070 Super and RTX 4070 Ti Super (depends on how much you want to spend),
or you can go the high-end route with the 4080 and up.
CPU-wise I don't know, since I don't follow current CPU development.
 
Well, choosing a graphics card is rather easy: there are the RTX 4070 Super and RTX 4070 Ti Super (depends on how much you want to spend),
or you can go the high-end route with the 4080 and up.
CPU-wise I don't know, since I don't follow current CPU development.
I was planning on a 4080 Super and an Intel i7 14700K.

.. as I'm reading, I'm wondering if I need to upgrade this CPU if I want to game at 4K. Thoughts?
 

Rentahamster

Rodent Whores
I was planning on a 4080 Super and an Intel i7 14700K.

.. as I'm reading, I'm wondering if I need to upgrade this CPU if I want to game at 4K. Thoughts?

It depends on what game you're playing, but generally speaking, 4K gaming is more dependent on your GPU than your CPU, since the GPU is the limiting factor most of the time. When you look at 4K gaming benchmarks, the FPS hardly changes between mid and high-tier CPUs.

 

JohnnyFootball

GerAlt-Right. Ciriously.
Better quality FSR would be really good. It's by far the worst upscaler of the 3 currently.
Indeed. XeSS is surprisingly good and it should get even better as it becomes more tailored to . AMD should hire Intel's engineers.

FSR isn't great but FSR3 is noticeably better.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I was planning on a 4080 Super and an Intel i7 14700K.

.. as I'm reading, I'm wondering if I need to upgrade this CPU if I want to game at 4K. Thoughts?
Good luck finding a 4080 Super at MSRP. They're all back to 4080 prices of $1200+

You're better off just getting a 4070 Ti at that point. I've seen quite a few still at the $799 price point.
 

hlm666

Member
I was planning on a 4080 Super and an Intel i7 14700K.

.. as I'm reading, I'm wondering if I need to upgrade this CPU if I want to game at 4K. Thoughts?
Unless you're hunting extreme framerates, your CPU choice should be fine. If you get CPU bound, you don't have enough settings turned up ;) Jokes aside, the higher the resolution, the less chance you're going to be CPU bound.
 

SlimySnake

Flashless at the Golden Globes
As long as the PS5 Pro supports it, I'm good. Sucks that the PS5 might not have it, but that's ok; it's a 3-year-old console at this point. If you want the latest and greatest, pay up.

I was able to trade in my PS4 for $250 and got a PS4 Pro for just $150. Upgrading consoles isn't really that expensive.
 

Fafalada

Fafracer forever
Intel did (will?) the same with XeSS, as it is promised to be open source some day.
I call BS on that one. Intel has open-sourced plenty of things in the past; XeSS is clearly not one of them. If they had any plans for it, it would have happened already.

Of course, but the same was said about Mesh Shaders and VRS, and now we know for sure that it doesn't support them.
Well, no. Mesh Shaders we now know for sure are just paper-ware, where the actual hw acceleration is shared between PS5/XSX. Meanwhile, the lack of VRS was confirmed years ago (practically since launch).
Discussion of DP4a has been largely avoided, not that it's strictly needed here; worst case you run the algorithm at half the speed with 16-bit math, so it would still work for certain workloads even without it.
 

Bojji

Member
Do the Series consoles support it?

Most likely yes. Even the Radeon 5500 had it; it's just that the first line of RDNA1 GPUs was very weird. Surprisingly, the PS5 is using the first version, but it has RT hardware...

The Xbox GPU is feature complete with RDNA2, aside from Infinity Cache.

Edit: Hardware Unboxed tests of XeSS showed that it can be much slower than FSR2 on RDNA GPUs, so developers most likely won't use it on Xbox.

This updated FSR version will most likely make use of RDNA3's AI hardware.
 

YCoCg

Member
The situation with FSR has been a letdown, really. The idea of it being open meant they were hoping people would contribute to it and make it better over time, but that didn't really happen, and the last build has stayed at 2.3 for months now. Maybe they reached the limit of what could be done without AI? Perhaps that's why they've changed direction.
 

Bojji

Member
The situation with FSR has been a letdown, really. The idea of it being open meant they were hoping people would contribute to it and make it better over time, but that didn't really happen, and the last build has stayed at 2.3 for months now. Maybe they reached the limit of what could be done without AI? Perhaps that's why they've changed direction.

TSR shows that software can definitely have better results than FSR2.

Bout time they're replacing a software solution with what is seemingly a hardware one.

I bet that games will have "FSR (some number here)" that will have different quality levels depending on whether you are using older GPUs or RDNA3+.
 
TSR shows that software can definitely have better results than FSR2.



I bet that games will have "FSR (some number here)" that will have different quality levels depending on whether you are using older GPUs or RDNA3+.

It should. FSR didn't help me go AMD. Hell, it made me trade my old 1080 Ti for a newer Nvidia card. FSR was basically unusable at 1440p.
 