
What happened to AMD FSR 3.0?

Spyxos

Member
About eight months ago, AMD showed FSR 3: there was a short video of a few seconds, and since then no video presentation. It was briefly talked about again in March 2023, but again no one saw it. And this for something that is supposed to come out this year.

There is very little information about it. We don't even know if Nvidia cards will be supported this time. Without a DLSS 3 alternative from AMD, I can't even consider their cards. It seems to me like they just started development and promised a release for 2023, even though they don't have anything yet.

[Image: AMD FSR 3 logo]

[Image: AMD FidelityFX Super Resolution 3 diagram, 2023-03-27]


 

Buggy Loop

Member
They were likely reacting to Nvidia's announcement and were nowhere near ready.

It makes me chuckle, the people who said AMD would have a rapid solution because they're good at motion interpolation for videos 😂

Also, a big part of what makes frame gen usable is Nvidia Reflex, and AMD's solution is not even close to Reflex. Under scrutiny, it's also probably very hard to match frame gen's quality; it's harder to achieve than many believe, and if AMD is stubborn about not going with ML, even tougher.
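To put rough numbers on the Reflex point, here's a back-of-envelope sketch in Python; every figure is an illustrative assumption, not a measured value from either vendor:

base_fps = 60.0
frame_time_ms = 1000.0 / base_fps      # ~16.7 ms between real frames

# Interpolation can't show the in-between frame until the *next* real
# frame exists, so each real frame is presented roughly one frame-time
# late, plus the cost of generating the fake frame (hypothetical 3 ms).
interp_cost_ms = 3.0
added_ms = frame_time_ms + interp_cost_ms

# A Reflex-style scheduler trims the CPU render queue; assume it claws
# back ~10 ms (again, purely hypothetical).
reflex_savings_ms = 10.0
print(f"net added latency: ~{added_ms - reflex_savings_ms:.1f} ms at {base_fps:.0f} fps base")

The takeaway: without something playing Reflex's role, the whole frame-time penalty lands on the player, which is why the latency tech matters as much as the interpolation itself.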
 

SolidQ

Member
My thoughts: they're trying to make it work on as much hardware as possible, which is why the delay, but they should have started with the 7xxx series only, then made it work for everyone.
Once the market is 80%+ hardware with AI-capable cards (RX 7xxx/RTX 2xxx+), I think they're going to make an open AI upscaling solution.
 

poppabk

Cheeks Spread for Digital Only Future
Obviously it was in reaction to DLSS 3, but it appears to be harder to implement well without dedicated hardware. Considering how poor FSR 2 is compared to DLSS 2 I'm not holding my breath on this being any good.
FSR2 is good enough in quality mode. Frame interpolation can't be that hard to do; the Quest 2 has its own version that seems to work decently overall. My guess is that the problem is similar to FSR2: if you have 1440p at 60 FPS plus, they have it working, but it becomes dogshit when you drop below that.
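On the "can't be that hard" part, here's a minimal sketch of naive motion-compensated midpoint interpolation (assuming OpenCV; the helper function is hypothetical). Real game frame gen uses engine motion vectors, occlusion handling, and UI masking that this toy version lacks, and that gap is where most of the difficulty lives:

import cv2
import numpy as np

def midpoint_frame(prev_bgr, next_bgr):
    """Warp the previous frame halfway along dense optical flow."""
    prev_gray = cv2.cvtColor(prev_bgr, cv2.COLOR_BGR2GRAY)
    next_gray = cv2.cvtColor(next_bgr, cv2.COLOR_BGR2GRAY)
    # Dense per-pixel motion from prev -> next (Farneback's method)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    h, w = prev_gray.shape
    grid_x, grid_y = np.meshgrid(np.arange(w), np.arange(h))
    # For each output pixel, pull color from halfway back along the flow
    map_x = (grid_x - 0.5 * flow[..., 0]).astype(np.float32)
    map_y = (grid_y - 0.5 * flow[..., 1]).astype(np.float32)
    return cv2.remap(prev_bgr, map_x, map_y, cv2.INTER_LINEAR)

At high base frame rates the motion vectors are short and the artifacts hide; at low frame rates they grow and disocclusions smear, which fits the intuition that it falls apart below roughly 1440p/60.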
 

rofif

Can’t Git Gud
Hopefully they skipped it and are working on FSR 4 (just like FSR 1 was some random stupid fake upscaling and FSR 2 was their DLSS).
 

Holammer

Member
When will AMD release their next round of GPUs? Maybe they're holding on to the tech so it can be part of the marketing push.
I wanna see FSR3 in Yuzu!
 

shamoomoo

Member
They were likely reacting to Nvidia's announcement and were nowhere near ready.

It makes me chuckle, the people who said AMD would have a rapid solution because they're good at motion interpolation for videos 😂

Also, a big part of what makes frame gen usable is Nvidia Reflex, and AMD's solution is not even close to Reflex. Under scrutiny, it's also probably very hard to match frame gen's quality; it's harder to achieve than many believe, and if AMD is stubborn about not going with ML, even tougher.
I get that developing software can become complicated, but how is FSR3 a reaction? Frame generation and upscaling had been a thing, at least in theory, before Turing ever existed. Also, frame gen and upscaling only came about to sell Nvidia's 2000 series, because no one was going crazy for frame-gen upscaling before then.
 

JCK75

Member
I mean, I do wish it would release, because I'm only interested in upscaling technology for handheld gaming devices... And let's be honest, the ones worth using are not using Nvidia.
 

RoboFu

One of the green rats
It's going to be that big of a difference! :messenger_beaming:

It's probably all marketing... My guess is that unless it requires new "cores", it will be launched alongside something like the Starfield launch.
 

Buggy Loop

Member
Why do some people have such a huge hate boner for AMD? I've even seen people say they want AMD to fail. Like, how does that benefit you? You think a market where Nvidia has even less competition is going to be good?

Cause they're not competing.

The reason the 4080 is such heavily cut-down silicon and still costs so much is that even AMD's flagship is not a threat to that fleecing.

AMD embraced the fleecing. So why do they need a free pass? They're no longer the unbeatable value, not when you compare their software stack with Nvidia's.

Intel is the only contender.
 

winjer

Gold Member
AMD shot themselves in the foot by announcing FSR3 before it was ready.
I bet Intel is also working on a frame generation tech, but is keeping quiet so as not to generate false expectations and bad press.
 

Sleepwalker

Member
Why do some people have such a huge hate boner for AMD? I've even seen people say they want AMD to fail. Like, how does that benefit you? You think a market where Nvidia has even less competition is going to be good?

People are just tired of incompetence. Having AMD so far behind the competition doesn't help anyone and doesn't improve competition. I don't really see the same "hate boner" for Intel; in fact, I see a lot of people hoping they can surpass AMD and actually compete.
 

SmokedMeat

Gamer™
Tired of this fake-frames hype; now new cards get compared using fake frames against older cards with normal frames.
Improve normal performance first, then think about fake frames.

Agreed. It's a big part of Nvidia's marketing when trying to show a jump in performance over the previous gen.

I'm interested in seeing how AMD's fake-frame solution works out, but ultimately it's not that important to me. We shouldn't need fake frames to get a game to run well.
 

StereoVsn

Member
Cause they're not competing.

The reason the 4080 is such heavily cut-down silicon and still costs so much is that even AMD's flagship is not a threat to that fleecing.

AMD embraced the fleecing. So why do they need a free pass? They're no longer the unbeatable value, not when you compare their software stack with Nvidia's.

Intel is the only contender.
That's not entirely true. The 7900xt was running for $700-750 during the latest sales, and it's generally a faster card than the 4070ti, sometimes significantly so.

The 7900xtx was going for $800-900, and it's on par with the 4080.

DLSS 2 is better than FSR 2 in general, but the difference is a lot less pronounced in "quality" mode. DLSS 3 is situational due to latency, motion artifacts, and support in games.

Personally, if I didn't have the 3080ti I got at the height of the stupid crypto boom (couldn't find a 3080 to buy), I would have gotten a 7900xtx considering the price/performance parameters.
 

BennyBlanco

aka IMurRIVAL69
Why do some people have such a huge hate boner for AMD? I've even seen people say they want AMD to fail. Like, how does that benefit you? You think a market where Nvidia has even less competition is going to be good?

1. They had a chance to stick it to Nvidia last year and pull the rug out from under them by pricing their GPUs with a reasonable profit margin, and they didn't.

2. They keep doing these partnerships where they pay to cockblock everyone from using DLSS, and are now doing it on the biggest game of the year, Starfield.

3. FSR is ass. Console gamers are even getting shafted here, like with Jedi Survivor using FSR to upscale from sub-720p and looking like dogshit.

4. Just anecdotally, their CPUs used to run like nuclear reactors. The new ones are good, so I hear, but I'll never buy one because the only AMD product I ever owned had absurdly bad thermals no matter what I did.

I want them to be good, but they're just always doing things that piss me off. I'm rooting for Intel to come on strong in the GPU market and challenge Nvidia, because I don't think AMD has the chops. They're always like 5 steps behind, and their market share is in the toilet because of it.
 

LordOfChaos

Member
Even MetalFX and XeSS, the new kids on the block, have beaten AMD on image quality.

FSR 3 and frame gen are somewhat different from that, but still: AMD, the only solution provider without dedicated hardware accelerating this (Apple has the Neural Engine, Intel has XMX cores, Nvidia of course has Tensor cores), seems to be lagging behind even the new efforts here.

It doesn't sound like they caved on dedicated hardware with RDNA 4; they'll still be trying to pump upscaling, ray tracing, and other new techniques through CU compute. I wonder if that'll keep them behind.
 

Rentahamster

Rodent Whores
They're probably having a hard time making it work on all GPUs. IMO they should just limit it to their own hardware first if it's that difficult.

I wish they'd at least implement something like universal asynchronous time warp first.
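For anyone unfamiliar, here's a toy sketch of the rotation-only part of asynchronous time warp: re-project the last rendered frame with the newest head rotation (pinhole-camera assumption, and the timewarp function is hypothetical; a real compositor also predicts pose, handles both eyes, and runs on a high-priority GPU queue):

import numpy as np
import cv2

def timewarp(last_frame, yaw_rad, fov_deg=90.0):
    """Re-project the last frame for a small head rotation (yaw only)."""
    h, w = last_frame.shape[:2]
    f = (w / 2) / np.tan(np.radians(fov_deg) / 2)  # focal length, pixels
    K = np.array([[f, 0, w / 2],
                  [0, f, h / 2],
                  [0, 0, 1.0]])
    c, s = np.cos(yaw_rad), np.sin(yaw_rad)
    R = np.array([[c, 0, s],
                  [0, 1, 0],
                  [-s, 0, c]])                     # rotation since last render
    H = K @ R @ np.linalg.inv(K)                   # image-space homography
    return cv2.warpPerspective(last_frame, H, (w, h))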
 

SABRE220

Member
Even MetalFX and XeSS, the new kids on the block, have beaten AMD on image quality.

FSR 3 and frame gen are somewhat different from that, but still: AMD, the only solution provider without dedicated hardware accelerating this (Apple has the Neural Engine, Intel has XMX cores, Nvidia of course has Tensor cores), seems to be lagging behind even the new efforts here.

It doesn't sound like they caved on dedicated hardware with RDNA 4; they'll still be trying to pump upscaling, ray tracing, and other new techniques through CU compute. I wonder if that'll keep them behind.
They are so behind it's quite frankly embarrassing... Nvidia gave them openings to actually capitalize on, but they have been absolutely lazy in terms of innovation and R&D, doubling down on their aging pipeline while their competitors leave them in the dust. Worse still, they have the balls to keep price gouging with their underwhelming offerings, burning through any goodwill they had left from their glory days. Imagine Intel developing their first dedicated GPU and embarrassing AMD's newest arch in RT pipeline, machine learning capabilities, and image reconstruction... this is unacceptable considering their past pedigree, and I'm still trying to figure out why Lisa Su gets plaudits when she literally ran their GPU tech into the shitter just to get their CPU side running.
 

LordOfChaos

Member
They are so behind it's quite frankly embarrassing... Nvidia gave them openings to actually capitalize on, but they have been absolutely lazy in terms of innovation and R&D, doubling down on their aging pipeline while their competitors leave them in the dust. Worse still, they have the balls to keep price gouging with their underwhelming offerings, burning through any goodwill they had left from their glory days. Imagine Intel developing their first dedicated GPU and embarrassing AMD's newest arch in RT pipeline, machine learning capabilities, and image reconstruction... this is unacceptable considering their past pedigree, and I'm still trying to figure out why Lisa Su gets plaudits when she literally ran their GPU tech into the shitter just to get their CPU side running.

I could see Intel being a huge threat to their GPUs over the next few years. This is just the first spin; give it 2-3 more generations and they could be pretty great, out-AMDing AMD on pricing and value while actually having good RT and upscaling hardware to offer for that money.

AMD's turnaround in CPUs since Zen has been pretty remarkable, though. But it's long been the case that when either their CPUs or their GPUs are excelling, the other is floundering; AMD has rarely been able to hit on both at the same time.
 

FireFly

Member
I'm still trying to figure out why Lisa Su gets plaudits when she literally ran their GPU tech into the shitter just to get their CPU side running.
RDNA 2 was a big improvement over previous generations, and even RDNA 1 was decent outside of lacking features. It's with RDNA 3 that they really dropped the ball.
 

night13x

Member
AMD being AMD. They budget-price their cards for a reason: they know they can't actually compete with Nvidia's tech. Not trying to piss on AMD (consoles and mini consoles have been a godsend under AMD, along with their excellent CPUs), but they definitely need a good bit more time to cook if they're going to actually compete with Nvidia in a timely fashion.
 

Buggy Loop

Member
That's not entirely true. The 7900xt was running for $700-750 during the latest sales, and it's generally a faster card than the 4070ti, sometimes significantly so.

The 7900xtx was going for $800-900, and it's on par with the 4080.

DLSS 2 is better than FSR 2 in general, but the difference is a lot less pronounced in "quality" mode. DLSS 3 is situational due to latency, motion artifacts, and support in games.

Personally, if I didn't have the 3080ti I got at the height of the stupid crypto boom (couldn't find a 3080 to buy), I would have gotten a 7900xtx considering the price/performance parameters.

The 4080 has hit the $1k mark too.

DLSS is better than FSR in general? How about always. Only one of the two can claim to be better than native.

Not that I would buy either, but if I'm spending close to a grand: first, I'm not touching the AMD reference cards; the 4080 coolers are actually very good across the whole stack. Less power consumption, less noise overall, frame gen, better RT, for virtually the same rasterization when you take a big enough pool of games. They will also perform better with DirectStorage GPU decompression. And VR, and resale value, for professionals or just gaming in general. Early adopters of RDNA 3 had shit VR performance, and it was only fixed like a month ago. How is that acceptable? Fine Wine™ indeed: start broken and eventually fix it.

Intel was expected to have to catch up on drivers, since this was their first entry, and they have done quite a job catching up. I expect better from a company whose origins, ATI, were in the business before Nvidia was.

RDNA 3 just ain't it. I feel like something got fucked up along the line and they had to deliver for the end of the fiscal quarter. Their previous gen was actually way more competitive.
 

StereoVsn

Member
The 4080 has hit the $1k mark too.

DLSS is better than FSR in general? How about always. Only one of the two can claim to be better than native.

Not that I would buy either, but if I'm spending close to a grand: first, I'm not touching the AMD reference cards; the 4080 coolers are actually very good across the whole stack. Less power consumption, less noise overall, frame gen, better RT, for virtually the same rasterization when you take a big enough pool of games. They will also perform better with DirectStorage GPU decompression. And VR, and resale value, for professionals or just gaming in general. Early adopters of RDNA 3 had shit VR performance, and it was only fixed like a month ago. How is that acceptable? Fine Wine™ indeed: start broken and eventually fix it.
Oh, I didn't see the $1K deal for the 4080; that's pretty good.

For DLSS vs FSR at 4K quality, i.e. rendering at 1440p, it's actually pretty decent. You can see comparisons in the latest Hardware Unboxed videos. FSR really starts falling apart at lower rendering resolutions.

For coolers, the $900 cards I saw were custom and had good reviews, so I don't think that would be an issue.

For normal gaming the 7900xtx is really not a bad card or deal. I haven't messed with VR for over a year, so I haven't followed the driver issues with it, but for most people it wouldn't be a huge deal. Still not something that should have occurred.
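For reference, the render-resolution arithmetic behind the quality-mode point above, using FSR 2's published per-axis scale factors (DLSS 2's ratios are similar):

# FSR 2 per-axis scale factors, from AMD's documentation
modes = {"Quality": 1.5, "Balanced": 1.7,
         "Performance": 2.0, "Ultra Performance": 3.0}

for out_w, out_h in [(3840, 2160), (1920, 1080)]:
    for name, ratio in modes.items():
        w, h = round(out_w / ratio), round(out_h / ratio)
        print(f"{out_w}x{out_h} {name}: renders at {w}x{h}")
# 4K Quality renders at 2560x1440, which holds up; 1080p Performance
# renders at 960x540, which is where FSR visibly falls apart.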
 
FSR 3 will turn up in Starfield, in my opinion. AMD will use that game to showcase their new tech in a game that will only support FSR2 and FSR3, and will not support DLSS or XeSS, since they won't want their tech shown as being in any way inferior to the competition.
 

lmimmfn

Member
DLSS3: the one you use when you get 20 FPS with ray tracing, to make it seem reasonable even though it's all fake, with lag.

While it's nice to mess about with (Cyberpunk with Overdrive ray tracing), it would have zero influence on my graphics card choice.
 

BennyBlanco

aka IMurRIVAL69
DLSS3: the one you use when you get 20 FPS with ray tracing, to make it seem reasonable even though it's all fake, with lag.

While it's nice to mess about with (Cyberpunk with Overdrive ray tracing), it would have zero influence on my graphics card choice.

Input lag? DLSS3 only works with Nvidia Reflex.
 

Mister Wolf

Gold Member
DLSS3: the one you use when you get 20 FPS with ray tracing, to make it seem reasonable even though it's all fake, with lag.

While it's nice to mess about with (Cyberpunk with Overdrive ray tracing), it would have zero influence on my graphics card choice.

I used DLSS3 in Jedi Survivor with raytracing to take a 55-70fps experience all the way up to a mostly locked 120fps. It was exquisite.
 