
Former Ubisoft dev says that Techland intentionally made AMD FSR look worse in Dying Light 2

KyoZz

Gold Member
A few days ago, Reddit user TheHybred, a former Ubisoft developer, claimed that Techland intentionally made AMD FSR look worse in Dying Light 2.
TheHybred shared the values for AMD FSR Ultra Quality, so we tested them to find out whether these accusations were valid.

As the dev said:
"This game is an NVIDIA sponsored title so FSR is missing the ultra quality preset along with having the sharpness value lowest as it can be to make the technology look bad"

So, as you may have guessed, we benchmarked the new values that TheHybred provided.
For our tests, we used an Intel Core i9-9900K with 16GB of DDR4 at 3800MHz and an NVIDIA RTX 3080, running Windows 10 64-bit with the GeForce 511.65 driver.
Below you can find some comparison screenshots. AMD FSR Quality is on the left, NVIDIA DLSS is in the middle, and AMD FSR Ultra Quality is on the right:



As we can easily see, AMD FSR Ultra Quality looks sharper and better than NVIDIA DLSS Quality. However, it also runs noticeably slower than DLSS Quality. Here are some benchmarks at 1440p/Ultra/Ray Tracing.



As we can see, AMD FSR Ultra Quality runs faster than native resolution. And while its image quality is better than DLSS Quality's, its performance is not that great.
Still, the values that TheHybred shared can indeed enable AMD FSR Ultra Quality Mode.

In order to enable AMD FSR Ultra Quality, you’ll have to follow this guide:
Open the video.scr file that is located at Documents/dying light 2/out/settings. Then, use the following settings. Once you make these changes, save the file, make it “Read Only”, and then launch the game.
  • Scale3D (0.77)
  • FSR (1.000000)
  • Upscaler (3)
  • Upscaling (3)
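The Scale3D value of 0.77 lines up with AMD's published FSR 1.0 quality modes, where Ultra Quality upscales by a 1.3x per-axis factor. A small sketch of that relationship (the preset scale factors come from AMD's public FSR 1.0 documentation; `render_scale` is a hypothetical helper, and the assumption that `Scale3D` is simply the reciprocal of the preset factor is mine, not Techland's):

```python
# FSR 1.0 per-axis scale factors, as published by AMD for each quality mode.
FSR_PRESETS = {
    "Ultra Quality": 1.3,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
}

def render_scale(preset: str) -> float:
    """Fraction of output resolution rendered internally for a given preset.

    Assumption: video.scr's Scale3D is 1 / (FSR scale factor), rounded
    to two decimals, which is how 0.77 would correspond to Ultra Quality.
    """
    return round(1.0 / FSR_PRESETS[preset], 2)

for name in FSR_PRESETS:
    print(f"{name}: Scale3D = {render_scale(name)}")
```

Running this prints 0.77 for Ultra Quality, matching the value in the guide above, which is at least consistent with the claim that these config values genuinely enable the missing preset.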

The former Ubisoft dev concluded:
"AMD needs to quit forbidding devs to not include DLSS in their sponsored titles and NVIDIA should not intentionally try to make their tech look worse than it is, this just hurts gamers and no one falls for the tricks to begin with so what’s the point?"


SlimySnake

Member
"AMD needs to quit forbidding devs to not include DLSS in their sponsored titles and NVIDIA should not intentionally try to make their tech look worse than it is, this just hurts gamers and no one falls for the tricks to begin with so what’s the point?"

This is all so petty lol

No one comes out of this looking good.
 

ethomaz

Banned
He said he has no proof and it's just a theory.

From the tests, FSR UQ seems to have a lower framerate, and that is more probably the reason for choosing not to include it in the game options.
 

Stuart360

Gold Member
Nice theory, but sometimes this kind of tech isn't implemented well. It looks pretty bad in HZD on PC, with the foliage especially being a shimmering mess, even on the top setting. On GOW, though, it looks awesome.
 

Hugare

Member
Every major release is sponsored by Nvidia; that doesn't mean they've botched FSR on purpose.

Considering how poorly optimized the game is, how even DLSS was bugged at launch, and how the game has a flickering problem on the PS5 that anyone who plays it for 5 minutes would notice, it's probable that the FSR implementation was rushed like everything else tech-wise in this game.
 
Here are some further statements:

I am a game developer (former Ubisoft) and I have also worked with FSR's open-source code before. Everything wrong with FSR was deliberately and manually changed, because the code is modified.

Furthermore, NVIDIA had a presentation where they compared FSR against DLSS, and there they matched quality preset vs quality preset, so they compared by internal resolution and not performance (which is disingenuous), and it also looked like the sharpness was neutered there too. So the exact same thing NVIDIA demonstrated is happening in a game they sponsor. Maybe it's a coincidence, but altering something's code in an unfavorable way is still more than forgetfulness.

Let's not forget they're using an RT algorithm made for tensor cores for certain ray-tracing effects, one they didn't disable for AMD cards, which tanks performance drastically more than it should. But again, as a former dev at Ubisoft, which partners with AMD, I know very well the practice of intentionally limiting a competitor's product (card or features) to make our sponsor look better, so this is not some conspiracy, nor is it specific to NVIDIA.

No one can be certain this is because of the sponsorship, but that's not the point of the post. The point is that you can mitigate a lot of these limitations by tweaking the config, and I'm showing that here to help people. This was just a brief theory as to why (one backed by experience and common sense); it's still just my belief, and you don't have to agree. I just know what was in our contractual obligations when partnering with AMD (can't discuss), and I really don't think NVIDIA is any different.


This user is the creator and admin of r/optimizedgaming.
 

Concern

Member
This is so petty and pathetic at the same time. All the while, we got 1080p/60fps on our new consoles lol.
 

elliot5

Member
LMAO

64 fps average: "performance not that great"

DLSS at 72 fps average: "incredible"

Even though image quality is different. Man, these tech sites are just sad sometimes.
A ~13% uplift for comparable image quality (which may look better in motion) seems worth it
 

M1chl

Currently Gif and Meme Champion
Wonder what the shilling budget for AMD is, but I get it, it's the only way of keeping them relevant.
 

Gaelyon

Member
64 fps with improved image quality vs 72 fps can be a valuable option. At the very least, it should be left to the player to decide.
 