SilentUser
Member
Just watched the video in 4K. Honestly, FSR made Godfall really blurry on the GTX 1060. Will need to see more or perhaps the results will be far better on newer cards or AMD cards.
These guys...
Nvidia should blacklist them again. Morons.
Kudos to AMD for demonstrating 1440p too. 4K is pretty easy to reconstruct given the massive amount of detail present, but 1440p and especially 1080p are much harder.
As expected, quality is not nearly as good as DLSS, but wide range hardware support is a good thing. Here's the full picture in a bit better quality:
Some speculate Nvidia artificially limits DLSS availability, using the tensor cores as an excuse. If this FidelityFX is comparable, it will be shameful for them.
Didn't expect much to begin with, but this looks like the lowest setting on DLSS, and it's apparently Quality mode according to the text in the bottom right corner.
Let's wait till the reviews are out for real comparisons.
WTF, that image is atrocious. I'm surprised they'd even show it off in this state. Guess I'll wait for some more detailed analysis when the review sites get their hands on it, but I have zero expectations now.
Let's also understand this is the first release and use of it, while DLSS has been out for years. So, similar to DLSS 1.0, which was not great, it's going to take time for developers to implement this.
The damage control on the green side is at full throttle.
I hope it can give almost all users a great alternative and let them enjoy more years of gaming.
I think the biggest beneficiaries here are consoles. Isn't Godfall a game that favors AMD, since it's on consoles? (Only a timed exclusive for PS5.)
A 59% speedup using FSR Ultra Quality is a huge plus for consoles.
Expect both Microsoft and Sony first- and second-party studios to implement this.
Heck, with Unreal Engine 5 combined with FSR (Ultra Quality mode), I now think Ninja Theory can actually get close to that Hellblade 2 trailer!
(Which was a totally fake CGI video.)
Exciting Times!
Fine, it was still in no way representative of how the game would have looked. It was BS.

I'm still waiting on homie to explain why this is a godsend, and how DLSS should be illegal and banned in all countries. It's not like he hasn't read this thread.
What are you talking about? Why not just Google search things to get the answers of things you don't have a clue on? Hellblade 2 trailer is captured in engine, NOT CGI. This has already been confirmed almost a year ago. It was captured at 24fps to give the cinematic look (which it nails).
Just about every game shown in engine is pretty much BS, as gameplay will never represent that fidelity. So definitely agreed there. Hopefully we get more samples, as this can benefit several people. And if people don't like it, there's an on/off switch.
Reminds me of DLSS in that anything less than 1440p/quality mode looks a bit wank
Interesting that AMD have managed to provide something comparable to DLSS 1.0 without the need for proprietary tech, so it's hardware agnostic, and it's open source, so devs can implement and iterate on it as much as they want. I certainly wasn't expecting them to show GTX 1060 performance in their announcement; wasn't even expecting official support stretching back to Vega either. Sure, it's no DLSS 2.0, but I'm almost convinced everyone outside of RTX owners will overlook that for now, given it's something they can use.
Same. I own a 3080 but I was still rooting for AMD here. A platform-agnostic alternative to DLSS, with almost as good image quality, and that didn't require developer effort to add support, would've benefited everybody.

Well, I'm sad now. Even as a new 3060 owner, I kind of wished they would have hit a home run and started an aliasing war with Nvidia. Would have benefited everyone, especially since Nvidia owners would get access to both solutions anyway.
But yeah, it's about as expected.
They have a point though.
Are you serious? They're right. There's a chip shortage, those cards won't improve the current situation, they'll make it worse. They're unnecessary at best.
They criticize EVERYONE, not only Nvidia. You obviously don't follow the channel and don't give a shit about the current situation; the only thing you care about is the good name of a multi-billion-dollar company.
Your agenda is the problem. You didn't even watch the video. The contents reflect the exact sentiment seen on the Nvidia subreddit and all the forums I visit. It's called common sense, and you lack any of it.
No. The issue is that it's not a good enough generalised solution to replace an individual game's own solution. By that I mean a studio can tweak their own TAA to look great for the particular game they're developing without having to worry about how it might look in other games (something FSR/DLSS have to do, because they're not bespoke to each game), which will yield much better image quality and similar performance; for example, what Insomniac does with their Temporal Injection, or indeed the industry's best, The Division 2's solution, which is on par with DLSS or better without requiring tensor cores. So when AMD puts forth a solution that's as weak as simply lowering the render scale, it makes no real difference for consoles, and not for indies either (because TSR is better, with UE5, and Unity will probably have something similar too), and on PC you simply have one more reason to buy an RTX GPU instead of a Radeon.

So this will make PS5/XSX even more super?
Jeez, it's quite early; I don't understand all the skepticism about it. It's really unnecessary. If you look at what TAA was at the beginning, you wouldn't have bet a penny on how good it's become now.
So, in the end, this is honestly just a waste of software dev time in service of marketing. AAA studios with their own engines will have their own solutions, those on UE5 will have TSR, and so there are very few left who would even benefit in the least from this. It's a big whoop.
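To make the "simply lowering the render scale" point concrete: a spatial-only upscaler works from one low-res frame at a time, so the most it can do is interpolate and then sharpen. Here's a toy sketch in that spirit (plain bilinear resampling plus an unsharp mask); this is not AMD's actual algorithm, just an illustration of the class of technique, with all names and parameters invented for the example.

```python
# Toy "upscale then sharpen" pass. Conceptually in the family of spatial
# upscalers (e.g. FSR 1.0's edge-adaptive upscale + sharpening), but this
# is a hypothetical minimal sketch, NOT AMD's implementation.
import numpy as np

def bilinear_upscale(img: np.ndarray, scale: int) -> np.ndarray:
    """Upscale a 2D grayscale image by an integer factor via bilinear interpolation."""
    h, w = img.shape
    ys = (np.arange(h * scale) + 0.5) / scale - 0.5
    xs = (np.arange(w * scale) + 0.5) / scale - 0.5
    y0 = np.clip(np.floor(ys).astype(int), 0, h - 1)
    x0 = np.clip(np.floor(xs).astype(int), 0, w - 1)
    y1 = np.clip(y0 + 1, 0, h - 1)
    x1 = np.clip(x0 + 1, 0, w - 1)
    fy = np.clip(ys - y0, 0.0, 1.0)[:, None]   # vertical blend weights
    fx = np.clip(xs - x0, 0.0, 1.0)[None, :]   # horizontal blend weights
    top = img[y0][:, x0] * (1 - fx) + img[y0][:, x1] * fx
    bot = img[y1][:, x0] * (1 - fx) + img[y1][:, x1] * fx
    return top * (1 - fy) + bot * fy

def sharpen(img: np.ndarray, amount: float = 0.5) -> np.ndarray:
    """Simple unsharp mask: boost the difference from a 4-neighbour average."""
    blur = img.copy()
    blur[1:-1, 1:-1] = (img[:-2, 1:-1] + img[2:, 1:-1] +
                        img[1:-1, :-2] + img[1:-1, 2:]) / 4
    return np.clip(img + amount * (img - blur), 0.0, 1.0)

low_res = np.random.default_rng(0).random((270, 480))  # stand-in 480x270 frame
high_res = sharpen(bilinear_upscale(low_res, 4))       # "1080p" output
print(high_res.shape)  # (1080, 1920)
```

The key limitation the post is arguing about is visible in the structure: no frame history and no motion data ever enters the function, so no amount of sharpening can recover detail that was never rendered.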
Thanks AMD for giving your old Nvidia GPU a few more years of life:
I feel like if I buy an RX 6800 XT it's gonna last me the next 7 years at 1440p. Am I too optimistic, or...?
This is “quality” mode, the native is 1440p.
Wonder what the base resolution is for that?
edit: keep in mind this isn’t the “ultra” quality setting, I’d be most curious about that one.
By that metric, you would think AMD would have foreseen the obvious issues. If it is to be improved, AMD will need to implement real-time motion vectors (like DLSS 2.0), which will nullify any developer profiles from the first iteration of FSR. So it's not so much the developers, but the underlying technology, that would need to be improved. Developers would need to go back and implement newer profiles to allow FSR 2.0 to see real-time motion vectors within the game engine (still waiting on a SOTTR DLSS 2.0/2.1 update that will never come). How much of a performance hit it would be compared to DLSS 2.0 remains to be seen. Unfortunately, it just looks like AMD rushed this out so they could say they have "something".
https://www.anandtech.com/show/15648/nvidia-intros-dlss-20-adds-motion-vectors
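For anyone wondering why motion vectors are such a big deal for the 2.0-style approach described above: they let the upscaler re-project last frame's output to where those pixels are now, so detail accumulates over many frames instead of coming from a single low-res image. A toy sketch, assuming a simple exponential history blend; this is a hypothetical illustration, not NVIDIA's or AMD's actual pipeline, and all function names here are made up.

```python
# Toy temporal accumulation with per-pixel motion vectors: a minimal sketch
# of the general idea behind motion-vector-based reconstruction, NOT a real
# DLSS/FSR implementation.
import numpy as np

def reproject(history: np.ndarray, motion: np.ndarray) -> np.ndarray:
    """Fetch each output pixel from where it was last frame (nearest-neighbour warp).

    motion[..., 0] / motion[..., 1] are per-pixel (dy, dx) offsets in pixels.
    """
    h, w = history.shape
    yy, xx = np.mgrid[0:h, 0:w]
    src_y = np.clip(yy - motion[..., 0].round().astype(int), 0, h - 1)
    src_x = np.clip(xx - motion[..., 1].round().astype(int), 0, w - 1)
    return history[src_y, src_x]

def temporal_accumulate(current, history, motion, alpha=0.1):
    """Blend warped history with the current frame (exponential accumulation)."""
    return alpha * current + (1 - alpha) * reproject(history, motion)

rng = np.random.default_rng(1)
frame = rng.random((4, 4))                # stand-in for the current render
history = np.zeros((4, 4))                # accumulated output, starts empty
motion = np.zeros((4, 4, 2))              # static camera: zero motion vectors
for _ in range(50):                       # with a static scene, output converges
    history = temporal_accumulate(frame, history, motion)
print(np.allclose(history, frame, atol=1e-2))  # True
```

This also shows why the post says motion vectors would "nullify" first-generation FSR integrations: the engine has to export `motion` per pixel, which is a different integration contract from a purely spatial pass.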
Thing is, since it's open source, it's not like it's hard to make profiles once it's implemented. Because that's what these are: local profiles. There's no waiting on samples from NVIDIA in their database for the reconstruction.
The fact this works across a large spectrum of cards shows its more practical use in all games going forward, and not in a select few that NVIDIA makes profiles for and has to update their database with given data for their AI algorithm to reconstruct the image.
And again, FSR only being available for a couple of titles shows how many people actually have it. The tools are also for consoles, so I would assume developers will share their findings as time goes on and update profiles. We're now getting updates to games that launched in 2020, so I don't understand the issue.
Also, it's only a matter of time before people hack drivers and create fixes, which then get shared or make their way to the actual developers of the game.
DLSS may be an awesome tech, but its use and practicality, because of its reliance on NVIDIA's proprietary hardware, is why we still only have a handful of titles with great uses of it. AMD's version, though not great right now, shows promise as developers get more time with it and can update as needed.
It's a faster, more open system, as opposed to NVIDIA's.
That's not entirely true. The 580 and the 1060 were competitors in the same market for years; both Nvidia and AMD saw them like that. And the 580 is a 6 TFLOPS GPU that offers identical performance compared to the X1X.

Nowhere near the X1X. The X1X can brute-force RDR2 at native 4K and 30 fps. We tried the same X1X settings at 4K and it only rendered 18-20 fps; that's nearly 50% slower than the One X. Desktop GPUs can't match console GPUs properly; a console GPU will always work more efficiently.
Hopefully DF gets to pit this against DLSS 1.0