If you only watch totally not paid shills, also known as DF, I guess so.
(Replying to: "so basically, in its current form, it's basically useless")
On some of their hardware.
(Replying to: "Making something popular that only works on their hardware...")
Kills their tech. (e.g. G-Sync)
(Replying to: "sells their hardware.")
Exactly what about "quality ultra" settings (from any review, bar the DF bovine feces) renders it "not a good option"?
(Replying to: "I bet this version of FSR was a knee-jerk reaction to DLSS 2.0 eating their lunch, and FSR 2.0 will get a longer baking time to actually give us a good option")
They don't use fake images, but they only show you when it's worse, not when it's better, that's why it's misleading. It's like if they'd only show you the ghosting artifacts of DLSS but never the benefits for texture detail et al.
(Replying to: "oh I see we are already in idiot town... get your tinfoil hats everyone!")
Are you saying the video uses fake images? Because if not, then it clearly looks worse than every other modern upscaling tech of any modern engine.
(Replying to: "If you only watch totally not paid shills, also known as DF, I guess so.")
Do you realize that DLSS operates at lower than native resolution? Of course those effects will be lower resolution...
(Replying to: "Friendly reminder that no publication talking about these technologies and comparing them is honest if it doesn't mention how DLSS fails at reconstructing some RT details like reflections, before waxing poetic about the magic of the cloud/A.I. and how FSR is "nothing but a dumb upscaler". Yes, I am in fact talking about the charlatans at Digital Foundry, but they're far from alone.")
Yes I do, do you? Do you know what the word 'reconstruction' means? The point is those effects don't get treated at all, not that they're lower quality than native, of course they would be.
(Replying to: "Do you realize that DLSS operates at lower than native resolution? Of course those effects will be lower resolution...")
However, maybe in the future DLSS will process half-res/quarter-res textures separately and those "artifacts" will be gone.
Kills their tech. (e.g. G-Sync)
Why is it killed? Mm, let me think.
(Replying to: "They keep selling licenses for G-Sync compatible displays.. so not sure why you think the tech is killed.")
My point is it undermines their own tech.
(Replying to: "My point was it's their strategy.")
So what's the problem then? You say no one mentioned it? Alex from DF has mentioned it several times, as well as the fact that AF (texture quality in the distance) will be lower if you're using DLSS.
(Replying to: "Yes I do, do you? Do you know what the word 'reconstruction' means? The point is those effects don't get treated at all, not that they're lower quality than native, of course they would be.")
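For anyone wondering what the "texture quality in the distance" point actually refers to: when an upscaler renders below display resolution, mip selection is driven by the lower internal resolution, so distant textures pick blurrier mip levels unless the engine applies a negative texture LOD bias. The snippet below is only a rough sketch of that arithmetic, using the log2(render/display) rule of thumb from upscaler integration guides; the resolutions and presets are illustrative, not taken from any specific game.

// Minimal sketch (not engine code): why texture detail can drop when an
// upscaler renders below display resolution, and the usual mitigation.
// The log2(render/display) LOD-bias formula follows common integration
// guidance; exact constants vary per title and engine.
#include <cmath>
#include <cstdio>

int main() {
    const double displayWidth = 3840.0;        // 4K output as an example
    const double scales[] = {1.0, 1.5, 2.0};   // native, a "quality" preset, a "performance" preset

    for (double s : scales) {
        double renderWidth = displayWidth / s;
        // Without a bias, mip selection is based on the lower render resolution,
        // so distant textures use blurrier mips than they would at native.
        double mipBias = std::log2(renderWidth / displayWidth); // negative when upscaling
        std::printf("scale %.1fx: render %4.0f px wide, suggested LOD bias %+.2f\n",
                    s, renderWidth, mipBias);
    }
    return 0;
}

At the 2x preset this comes out to about -1.0; whether and how a given game actually applies such a bias is up to the integration, which is why distant texture detail can vary from title to title.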
They don't use fake images, but they only show you when it's worse, not when it's better, that's why it's misleading. It's like if they'd only show you the ghosting artifacts of DLSS but never the benefits for texture detail et al.
What do I mean? He tries to shill TAAU as a better alternative to FSR (and only shows FSR Performance, which we know is the worst mode, but never Ultra Quality), but an outlet less sponsored by Nvidia tested that in an actually released game that's not in alpha and found that FSR is still better than even TAAU (also on UE4). So the conclusion of an honest man should be that the efficacy of each solution depends. Instead he only shows you the negative. Why? You'll have to figure out on your own why an Nvidia-sponsored outlet wants to show AMD in a worse light.
The TAAU vs FSR test in Godfall (through config tweaks):
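To make the apples-to-apples point concrete: a fair TAAU vs FSR comparison has to feed both techniques the same internal resolution. Below is only a rough sketch of how AMD's published FSR 1.0 preset scale factors map to the equivalent screen percentage a UE4 TAAU config tweak would use; the cvar names in the comment (r.TemporalAA.Upsampling, r.ScreenPercentage) are the usual way TAAU is enabled through config edits, but the exact settings used in the Godfall test mentioned above are not reproduced here.

// Rough sketch of how the two techniques line up on input resolution at 4K output.
// FSR 1.0 per-dimension scale factors are AMD's published presets; in UE4, TAAU is
// typically driven by r.TemporalAA.Upsampling=1 plus r.ScreenPercentage.
#include <cstdio>

struct Mode { const char* name; double scale; };

int main() {
    const double outW = 3840.0, outH = 2160.0;
    const Mode modes[] = {
        {"Ultra Quality", 1.3},
        {"Quality",       1.5},
        {"Balanced",      1.7},
        {"Performance",   2.0},
    };
    for (const Mode& m : modes) {
        double inW = outW / m.scale, inH = outH / m.scale;
        double screenPct = 100.0 / m.scale; // screen percentage that gives TAAU the same input
        std::printf("%-13s -> %4.0f x %4.0f input (~%.0f%% screen percentage)\n",
                    m.name, inW, inH, screenPct);
    }
    return 0;
}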
GN mentioned that it runs on pretty much anything (although not officially supported).
(Replying to: "OK, all they need to do is add 600 series support and I'll be set for another generation")
Why is it killed? Mm, let me think.
Perhaps because VRR (open and free for all) is what's used for that "FreeSync" by AMD, and that useless G-Sync chip is nowhere to be found.
My point is it undermines their own tech.
It was this kind of behavior by NV that killed OpenGL.
I could be remembering wrong, but he is always a lot more critical of AMD than he is of Nvidia.
(Replying to: "It doesn't matter if it is done for money or "favors" FingerBang")
DF was blatantly skewing reality in their "super early super preview" of 3080 trying to make it look faster than it was, and is now the only review bashing a tech by NV's competitor (and stakes are high here).
Do you think it is just a coincidence?
If it is supported on my GTX670, I'd love to experiment, I suspect it will be a waste of time though. I know, time for an upgrade... man this card has served me well though, it's old reliable!
(Replying to: "GN mentioned that it runs on pretty much anything (although not officially supported).")
Microsoft announced XSeX will have it.
I don't see why Sony wouldn't.
I thought they stopped making monitors with the module?
(Replying to: "It's.. not.. nowhere to be found.. they are still selling g-sync licenses for compatible monitors (basically a logo lol), and g-sync chips for fully supported monitors. TV's too.. not entirely sure why you think their tech is nowhere to be found. Go buy an LG CX TV in 2021 and you are giving nVidia some money.")
Which is the point, to make money... their strategy.. AMD makes no money off of FreeSync.
You have not made your point well at all, as you are just making shit up as usual.
They didn't... they have 3 separate tiers of G-Sync, where 2 use the G-Sync module.
(Replying to: "I thought they stopped making monitors with the module?")
Honestly not sure; but when I researched G-Sync monitors like 3 months ago a bunch of the recommended ones have the module. But maybe they weren't made particularly recently.
(Replying to: "I thought they stopped making monitors with the module?")
It's just that premium monitors see way fewer releases than budget ones... so people miss these... you see like one or two releases every few months... sometimes you even have 6 months without a premium monitor being released.
(Replying to: "Honestly not sure; but when I researched G-Sync monitors like 3 months ago a bunch of the recommended ones have the module. But maybe they weren't made particularly recently.")
Quick google suggests some of the top monitors released in 2021 don't have it though.
Edit: Nevermind, like ethomaz says, I just found the highest end monitor ASUS announced recently and it has the latest processor.. nVidia has expanded that if anything, as now they have a special processor that does G-Sync w/ HDR at peak brightness.... it's also not considered "useless" by non-AMD fanboys, as pretty much anyone sane agrees it's superior to Free-Sync/VRR.
Either way, they are continuing to make money with it.. and expanding into TVs in 2021, 6 years after they introduced G-Sync.
Glad they stuck with the tech, shame more monitors don't use it.
(Replying to: "It's just that premium monitors see way fewer releases than budget ones... so people miss these... you see like one or two releases every few months... sometimes you even have 6 months without a premium monitor being released.")
And because that could take months if not a couple of years, people are happy. You answered your own question there.
(Replying to: "Why are people with old GPUs so happy? Those GPUs are only going to be good for 1080p and we can all see how terrible FSR is at 1080p and below. It's better than nothing, but at this point it's best to just upgrade GPU when/if prices come back to normal.")
Name a single TV with GS chip.
(Replying to: "TV's too..")
Oh boy.
(Replying to: "pretty much anyone sane agrees")
I'm not just talking about the chips..
(Replying to: "Name a single TV with GS chip.")
Damn it, boy...
(Replying to: "they make money")
I'd ask for receipts, but given your post history in this very thread, I'd better not.
(Replying to: "And G-Sync absolutely does outdo FreeSync, particularly with the proprietary chip.")
You've given the exact reason why people with old GPUs are happy. They can get more out of their aging graphics card until new GPU prices return to some form of normality again.
(Replying to: "Why are people with old GPUs so happy? Those GPUs are only going to be good for 1080p and we can all see how terrible FSR is at 1080p and below. It's better than nothing, but at this point it's best to just upgrade GPU when/if prices come back to normal.")
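The 1080p complaint mostly comes down to input pixel counts: the lower the output resolution, the fewer real pixels the upscaler has to work from. The numbers below are a rough sketch using AMD's published FSR 1.0 scale factors with a 1920x1080 output as an example; the quality judgement itself is of course subjective.

// Quick arithmetic behind the "terrible at 1080p" point.
// Scale factors are AMD's published FSR 1.0 presets (Ultra Quality .. Performance).
#include <cstdio>

int main() {
    const double outW = 1920.0, outH = 1080.0;
    const double scales[] = {1.3, 1.5, 1.7, 2.0};
    for (double s : scales) {
        double inW = outW / s, inH = outH / s;
        double share = (inW * inH) / (outW * outH) * 100.0; // % of output pixels actually rendered
        std::printf("%.1fx: %4.0f x %3.0f rendered (%.0f%% of the output pixels)\n",
                    s, inW, inH, share);
    }
    return 0; // e.g. Performance mode upscales 1080p from roughly 960x540
}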
I'd ask for receipts, but given your post history in this very thread, I'd better not.
but G-SYNC tends to offer a more polished and consistent platform for a premium (which means that the best monitors around are mostly G-SYNC), while capabilities can vary wildly between the wide range of FreeSync models found in every price range.
I"m pretty sure only one reviewer in that list disliked it.So it seems like either you love fsr or you a shill.
It’s amazing how people become paid shills as soon as they criticize something you like.
So it seems like either you love FSR or you're a shill. Looked pretty shit at everything but ultra. Where's my paycheck?
All of these people are putting out videos.. and you can look for yourself and decide if it's something you like.
I dislike the heavily softened image I'm seeing; I don't quite understand why someone would look at that and proclaim it's as good or better than native.. but I am not going to call them a shill.. I just don't value their opinion particularly highly since at the very least, our tastes in what makes a better image differ.
YouTubers don't have to be shills to put out something overly positive and not all that well analyzed, either.
Instead of judging things by how many people have X opinion, judge for yourself.
*quite good at ultra quality, close to DLSS 2 give or take
This finding from the OP has me scratching my head after watching Digital Foundry.
Is this even true?
Did Alex say the same thing? He even questioned whether it exists, well, for anything below ultra anyway.
My takeaway from this is that ultra is just OK. Not anywhere near DLSS.
That's what I'm not getting either. They like to pretend that NVIDIA is some evil power, lol. Nvidia is years ahead of anyone for many reasons. Why such hate? Price? Sure. But I've got a 2070S instead of a 1080 Ti and I'm very happy about it. If I had any AMD GPU I would not be able to enjoy the games that I now enjoy (Minecraft RTX, Cyberpunk 2077 RTX, No Man's Sky VR with DLSS... don't laugh at me, those are just a few).
(Replying to: "This guy is delusional, everything he praises about AMD GPUs Nvidia does better. RT, DLSS, G-Sync and overall performance, Nvidia is a gen ahead, but he likes to call people paid shills when they choose the superior tech. He talks trash about RTX, G-Sync and DLSS, but when AMD comes out with an inferior version of those he praises it like it's the best thing ever. Even if it looks worse (like now with FSR) he still keeps downplaying Nvidia's tech and believes that AMD trumped Nvidia lol. Don't take him too seriously, he is the guy that told me a year ago that a 5700 XT was a better GPU than a 2070 Super and that DLSS and ray tracing were not gonna make much difference in current gen games. Look how that turned out lol.")
This is why DF quit GAF. But the shitty fanboys never learn, so enjoy having DF posting on fucking Era instead of here.
(Replying to: "It's amazing how people become paid shills as soon as they criticize something you like.")