
What happened to AMD FSR 3.0?

Buggy Loop

Member
FSR 3 will turn up in Starfield in my opinion. AMD will use that game to showcase their new tech, since it will only support FSR2 and FSR3 and not DLSS or XeSS; they won't want their tech shown as being in any way inferior to the competition.

Probably. Why release FSR 3 now (even though it's late) when you have a possible release in two months alongside the biggest blockbuster of the year?

I don't know if keeping the other tech out would help them, though. Their tech would have to step up massively to impress, and I have my doubts. The backlash potential is off the charts.
 
AMD is good at copying Nvidia's homework sometimes, but with FSR they simply don't have any institutional knowledge in AI and are hopelessly outclassed by DLSS.

Even Intel got XeSS into a good shape compared to AMD FSR.
 

Eotheod

Member
Both companies are shit, and this is coming from someone with an RX 6800 XT. NVIDIA fleece consumers without giving much back for the fleecing, and AMD suck at innovating on pipeline features like RT and FSR.

The real loser is the consumer. The GPU market is fucked and it sucks. At least Intel have bothered to show up, and there's some hope that they'll innovate enough.
 
Last edited:
Both companies are shit, and this is coming from someone with an RX 6800 XT. NVIDIA fleece consumers without giving much back for the fleecing, and AMD suck at innovating on pipeline features like RT and FSR.
AMD's issue isn't even their lack of features, IMO. I mean, that's an issue, but the bigger problem is that they refuse to compete on price. They're happy being part of this duopoly system where the margins are jacked up and there's no real price competition. There's a small discount, because of both the lack of features and the fact that their architecture is plainly inferior to Nvidia's, but they'll never really try to compete.

Frame Generation is great though, and I hope AMD manages something comparable, but I have doubts because their knockoff versions of Nvidia technologies are always inferior. I'd love to see a good open-source version of Nvidia's proprietary shit, but I doubt it will happen.
 
Last edited:

Barakov

Member
About 8 months ago, AMD showed FSR 3. There was a short video clip lasting a few seconds, and since then there has been no video presentation at all. It was briefly talked about again in March 2023, but again nobody saw anything. That's remarkable for something that is supposed to come out this year.

There is very little information about it. We don't even know if Nvidia cards will be supported this time. Without a DLSS 3 alternative from AMD, I can't even consider these cards. It seems to me like they just started development and promised a release for 2023, even though they don't have anything yet.

[Image: AMD FSR 3]


[Image: AMD FidelityFX Super Resolution 3 diagram, March 2023]



What happened? AMD.
 

yamaci17

Member
Tired of this fake frames hype; now new cards get compared using fake frames against older cards with normal frames.
Improve normal performance first, then think about fake frames.
DLSS 3 has ramifications beyond improving regular GPU-bound performance.

It became a tool to overcome / brute force horrendously CPU-bottlenecked games to hit high framerates, which sadly has nothing to do with GPU power. A Plague Tale: Requiem's barebones market scene caps a 5800X3D at a comical 70-85 FPS. If you want a 100+ FPS experience, the only way to get it is through DLSS 3. Trust me, that's the primary reason NVIDIA developed DLSS 3: they're aware that CPUs and CPU-bound optimizations cannot keep up with their behemoth GPUs anymore. DLSS 3 both uses the idle headroom on the GPU (creating the illusion of higher GPU utilization) and increases the framerate on paper. It's a win-win tech for NVIDIA.

Even a 13400F won't be able to hold 60 FPS when paired with a midrange GPU such as a 4060 Ti, due to horrendous CPU bottlenecks.



For me, DLSS 3 is just that. I don't really care about GPU-bound improvements, as DLSS 2 can provide plenty of those, and you can always claw back immense rasterization performance with tweaked settings while still being maybe 97% there visually. But if you're CPU-bottlenecked, turning the game into a potato will at best yield a funny 10-15% of extra CPU performance. Because of that, DLSS 3 is the best tool if you're CPU-bottlenecked around 45-70 FPS, which is where most games have started to land.

It took like 5-6 years to knock a 2500K off the defining 60 FPS experience on PC, yet somehow a brand new i5-13400F cannot even deliver a locked 60 FPS experience in brand new titles. It sounds really fishy, especially in relation to DLSS 3's timely arrival. But what can I say, good luck to everyone.
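To put rough numbers on that (purely illustrative arithmetic on my part, not anything resembling NVIDIA's actual implementation): frame generation inserts roughly one interpolated frame per real frame, so the displayed framerate about doubles while the CPU-limited real framerate, and the input sampling tied to it, stays where it was.

def with_frame_generation(cpu_limited_fps):
    # Illustrative only: real frames stay CPU-limited, the display rate roughly doubles
    rendered_fps = cpu_limited_fps        # real frames the CPU can actually feed
    displayed_fps = rendered_fps * 2      # about one generated frame per real frame
    return rendered_fps, displayed_fps

# e.g. the ~75 FPS CPU-bound figure quoted above for A Plague Tale: Requiem
print(with_frame_generation(75))   # (75, 150): reads as 150 FPS, still simulates at 75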
 
Last edited:

smbu2000

Member
Maybe they figured that nobody really cares about fake frames, and that support for it is still low even with DLSS 3. Better to work on improving FSR 2 and not worry about fake frames with higher latency.
 

nemiroff

Gold Member
Maybe they figured that nobody really cares about fake frames, and that support for it is still low even with DLSS 3. Better to work on improving FSR 2 and not worry about fake frames with higher latency.
IIRC Tom's Hardware tested Cyberpunk with DLSS3 and the graphs showed that latency was lower than native, even with Frame Gen enabled.
 

Danknugz

Member
People are just tired of incompetence. Having AMD so far behind the competition doesn't help anyone and doesn't improve competition. I don't really see the same "hate boner" for Intel; in fact I see a lot of people hoping they can surpass AMD and actually compete.
Careful, there's definitely a strange cult-like AMD thing out there, which is understandable considering they always undercut the competition on price, but shamefully it's basic logic that they rally against. I've always claimed that AMD cards run hot and cut corners, and they apparently pay their devs less, because their drivers are horrible; that's one of the ways they manage to undercut the competition.

I always seem to get flamed / downvoted by what looks like an automated botnet every time I mention this or anything similar, but I keep getting proven right, I guess. AMD boards were burning up just this year, and things like this thread's topic keep happening all the time.
 

Spyxos

Member
IIRC Tom's Hardware tested Cyberpunk with DLSS3 and the graphs showed that latency was lower than native, even with Frame Gen enabled.
That can't be right. I also tried it in Cyberpunk with the Nvidia overlay and there was a clear difference with DLSS 3.
 

Reallink

Member
IIRC Tom's Hardware tested Cyberpunk with DLSS3 and the graphs showed that latency was lower than native, even with Frame Gen enabled.
DLSS 3 has lower latency than no DLSS (i.e. running the game at native res), but 3.0 is definitely higher than DLSS 2.
 
Last edited:

lmimmfn

Member
Input lag? DLSS3 only works with Nvidia Reflex
Emm, you do realise that there is no input possible on generated frames, so if the real framerate is 30 FPS but with DLSS 3 it's 60 FPS, input is still only sampled at 30 FPS, i.e. every 33.3 ms.

Nvidia Reflex or not, input is capped at the actual real framerate, so yeah, lag!
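As a quick sanity check on those numbers, this is just the 1000 / fps arithmetic:

def frame_time_ms(fps):
    # time between frames; for real frames this is also the input sampling interval
    return round(1000.0 / fps, 1)

print(frame_time_ms(30))   # 33.3 ms between real, input-carrying frames
print(frame_time_ms(60))   # 16.7 ms between displayed frames after frame generation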
 
Last edited:

lmimmfn

Member
I used DLSS3 in Jedi Survivor with raytracing to take a 55-70fps experience all the way up to a mostly locked 120fps. It was exquisite.
Some like it, some don't, I'm not here to tell anyone anything regarding the experience.
Input lag at 55-70 FPS would be minimal / not noticeable unless you're in your teens lol. For me, anyway, ~16.6 ms of lag vs ~8.3 ms (if the 120 FPS were real) is difficult to differentiate.
 

GreatnessRD

Member
AMD's CEO got told to hold it off for another year at their family reunion. Didn't want to upset the price-fixing balance, you see.
That's the kind of fam reunion I'm trying to sit in. We make all da monies.

And FSR3 was a rushed announcement. Hopefully they'll have it ready by Q1 '24. I'd rather they keep fixing the drivers, so I'm not so fussed about FSR. I'm not huge into upscaling. FSR2 is more than adequate. I hate that folks are making it out as if it were DLSS1 or FSR1. While it isn't DLSS3, it is by no means toilet water.
 

yamaci17

Member
Emm, you do realise that there is no input possible on generated frames, so if the real framerate is 30 FPS but with DLSS 3 it's 60 FPS, input is still only sampled at 30 FPS, i.e. every 33.3 ms.

Nvidia Reflex or not, input is capped at the actual real framerate, so yeah, lag!
GPU-bound 30-35 FPS will have an enormous latency of around 120-130 ms by default.

Reflex by itself cuts that down to the 40-60 ms range. DLSS 3 adds some latency on top of that, so you still end up somewhere in the middle.

GPU-bound 60 FPS will also have more latency than Reflex-ed 30-35 FPS, in most cases.

You're simply disregarding GPU-bound input latency, which adds an immense amount of additional input lag and which Reflex solves. Reflex is not magically reducing input lag out of thin air; its main target is the GPU-bound pressure latency that has existed for a long time. Most people, AMD and NVIDIA users included, played with this heavy GPU-bound latency for years without any problems. The people who did have a problem with latency often used frame caps to reduce it. You can cut down this GPU-bound input lag by introducing a framecap that pushes GPU utilization below 90%, but there will be scenes that demand more GPU power and end up causing input lag again. Naturally, it wasn't an elegant solution, and it only worked if your GPU was so powerful for a given game that you could be sure it would always have plenty of headroom for your chosen framecap.
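As a toy sketch of that framecap approach (my own illustration, assuming GPU load scales roughly linearly with framerate; the function name and the 80 FPS figure are made up for the example):

def suggested_frame_cap(worst_case_gpu_fps, target_utilization=0.9):
    # Cap below the worst-case GPU-limited framerate so utilization stays under ~90%
    return int(worst_case_gpu_fps * target_utilization)

print(suggested_frame_cap(80))   # 72: heavy scenes should then stay below full GPU load

The catch described above still applies: if a scene drops the GPU-limited framerate below your cap, the GPU saturates again and the GPU-bound latency comes back.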

Here's an actual case analysis.

Witcher 3 at 66 FPS with GPU-bound settings has 70-75 ms of input latency:
[screenshot]



And GPU-bound 34 FPS has an enormous input latency of around 130 ms (as I predicted):


[screenshot]



Reflex massively cuts it down to the 50 ms range:

[screenshot]



So in the end, Reflex'ed 34 FPS has less input latency than non-Reflex'ed GPU bound 66 FPS.

You can always get even lower latency by using Reflex on top of GPU-bound 66 FPS, that's true. But that wouldn't even be an option if NVIDIA had never introduced Reflex to begin with. If you were fine with 70+ ms of GPU-bound latency at 60-70 FPS, you should technically still be fine at 40 FPS with Reflex interpolated up to the same 60-70 FPS, because both have nearly the same input latency. If, however, you feel entitled to the new low input latency you can achieve at an actual 60-70 FPS, then it is just that: entitlement. But here's what you get if you Reflex the GPU-bound 65-70 FPS; you get the lowest-latency experience possible:

[screenshot]


NVIDIA, if they wanted, could have opted not to give Reflex to everyone and only made it available alongside DLSS 3. If they had done that, DLSS 3 would always come out with a net input latency gain, no one would understand how it happens, and none of this discourse would take place.

So by default, people played for years at 50-80 FPS with higher input latency than a 30 FPS cap would have. As I said, you don't even need Reflex. Here's the same scene with a 30 FPS cap and no Reflex:

30 FPS cap, no reflex (even lower latency than reflex-ed 34 FPS)

[screenshot]


And Reflex does nothing to the input latency here, as most of the GPU-bound pressure has already been eliminated by the frame cap:

[screenshot]



So in short, the input latency chart (highest to lowest):

Low framerate, no Reflex > high framerate, no Reflex (the default experience) > low framerate + DLSS 3 + Reflex > low framerate + Reflex > high framerate + Reflex (20 ms)

I'm not bringing DLSS 2 into any of this; the latency reduction from DLSS 2 is simply the magic of having a higher framerate.
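Pulling the rough figures from the screenshots above into one place (same Witcher 3 scene; these are the quoted ballpark numbers, not fresh measurements):

# Approximate input latencies quoted in this post, highest to lowest
latencies_ms = [
    ("34 FPS, GPU-bound, no Reflex", 130),
    ("66 FPS, GPU-bound, no Reflex", 70),   # quoted as "70-75 ms"
    ("34 FPS + Reflex", 50),
    ("66 FPS + Reflex", 20),
]
# 34 FPS + DLSS 3 + Reflex isn't measured here; per the text it lands between the 70 ms and 50 ms rows
for label, ms in sorted(latencies_ms, key=lambda x: x[1], reverse=True):
    print(f"{ms:>4} ms  {label}")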
 
Last edited:

ToTTenTranz

Banned
AMD is good at copying Nvidia's homework sometimes, but with FSR they simply don't have any institutional knowledge in AI

I guess it was just pure luck and coincidence that AMD's MI300 compute + AI accelerators are powering what will be the world's next fastest supercomputer for deep learning applications.

You should send a formal letter of complaint to the Lawrence Livermore National Laboratory for choosing a compute hardware supplier that doesn't have "any institutional knowledge in AI".
If you properly lay out your coherent reasoning, I'm sure they'll swiftly switch their supplier to Nvidia instead.
 

Allandor

Member
AMD is good at copying Nvidia's homework sometimes, but with FSR they simply don't have any institutional knowledge in AI and are hopelessly outclassed by DLSS.

Even Intel got XeSS into a good shape compared to AMD FSR.
Not really. AMD builds things that even old Nvidia cards can benefit from. Nvidia only builds things that only their latest cards can run (well... at least they say that old cards can't do it).
I really hope those techs get better at not creating artifacts. I still don't use DLSS because the picture gets much blurrier in motion and artifacts appear all over the place.
 
AMD is good at copying Nvidia's homework sometimes, but with FSR they simply don't have any institutional knowledge in AI and are hopelessly outclassed by DLSS.

Even Intel got XeSS into a good shape compared to AMD FSR.

I dunno, FSR 2.2 lags behind DLSS but it's still a fine solution for the most part. Oftentimes it's near impossible to tell them apart. And it being hardware agnostic is a big win as well.

Hopefully FSR 3.0 closes that gap further. As others have suggested, it wouldn't surprise me to see it debut with Starfield.
 
Last edited:

Clear

CliffyB's Cock Holster
I remember when people just used to judge games on how they felt to play, not what proprietary tech they use...
 

Clear

CliffyB's Cock Holster
Well, frame generation does feel much better to play.

Should it or upscaling ever really be necessary? Ultimately this tech is there to fix performance issues when the hardware is unable to keep up natively.
 

Skifi28

Member
Should it or upscaling ever really be necessary? Ultimately this tech is there to fix performance issues when the hardware is unable to keep up natively.
Well, in an ideal world there wouldn't be a need for techniques that fake anything; everything would be fully path traced. But we're not quite there yet, so every little bit helps.
 

Clear

CliffyB's Cock Holster
Unfortunately game devs are now designing games with upscaling in mind. It didn't take long for upscaling to become a crutch for everybody in the industry.

I somewhat agree, but this sort of upscaling technique makes sense when deployed on consoles, where there's a fixed global limit on how much performance is available.

On PC this isn't the case, and it seems to me that users are creating part of the problem by expecting every game to run well at maxed-out settings whenever these features are available; basically setting up "problem" scenarios where native performance starts to suffer, then demanding that the upscaling tech fix it!

Sorry, but it seems to me that you can't treat this tech like a "magic bullet". It's useful, but realistically it's always going to have limits and downsides.
 