
NVIDIA RTX 4090 cannot run Remnant 2 with 60fps at Native 4K/Ultra

Draugoth

Gold Member


Remnant 2 is powered by Unreal Engine 5, meaning that it’s one of the first games using Epic’s latest engine. Moreover, the game supports DLSS 2/3, as well as XeSS and FSR 2.0. NVIDIA previously claimed that the game would only support DLSS 2, so the inclusion of DLSS 3 Frame Generation is a pleasant surprise.

For the initial 4K tests, the game was benchmarked in the starting area and in the Ward 13 safe zone. The second area features numerous NPCs, so, in theory, it should provide reliable results for the rest of the game.

As you can see, the NVIDIA GeForce RTX 4090 cannot run Remnant 2 with 60fps at native 4K and Ultra Settings:

[Image: Remnant 2 4K benchmark results]

At native 4K/Ultra, the RTX 4090 pushes an average of 40fps. Enabling DLSS 2 Quality gets you a constant 60fps at 4K/Ultra, and enabling DLSS 3 Frame Generation on top of that brings an additional 45-50% performance boost.
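As a quick sanity check on those numbers (a back-of-the-envelope sketch only; it assumes the 45-50% Frame Generation uplift applies on top of the DLSS Quality result, which the article doesn't state explicitly):

```python
# Back-of-the-envelope check of the reported 4K/Ultra figures.
# Assumption: the DLSS 3 Frame Generation uplift (45-50%) stacks on top of
# the DLSS 2 Quality result; the article does not spell this out.

native_fps = 40          # reported native 4K/Ultra average
dlss_quality_fps = 60    # reported with DLSS 2 Quality

print(f"Native 4K/Ultra: ~{native_fps} fps")
for uplift in (0.45, 0.50):
    fg_fps = dlss_quality_fps * (1 + uplift)
    print(f"DLSS Quality + FG at +{uplift:.0%}: ~{fg_fps:.0f} fps")

# Prints roughly 87 and 90 fps, i.e. comfortably above 60.
```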

The game also suffers from some weird visual artifacts. You can clearly see the artifacts on the blades of grass in the video we’ve captured (while moving the camera). These artifacts are usually caused by an upscaling technique. However, even without any upscaler, the game still has these visual glitches/bugs.


To its credit, the game can be a looker. Gunfire has used a lot of high-quality textures, and there are some amazing global illumination effects. However, its main characters are nowhere close to what modern-day triple-A games can achieve. And, while the game looks miles better than its predecessor, it does not justify its ridiculously high GPU requirements.

Via DSOGaming
 

Eotheod

Member
Why can't the pipeline be made more efficient when using UE5? It seems like it gets caught up processing everything and just decides to chug, that or badly optimised assets. The problem with modern dev thinking is more polygons = totally better model, when that clearly isn't the case.

I've seen far better model usage through indie games that utilise the shape and style of the model to cut back on unnecessary polygons.
 

Oublieux

Member
It is an upscaler which means it will never be better than native.
Not true (in the sense that it’s just an upscaler)… a plain upscaler cannot introduce new pixels or information to an image. DLSS and FSR take the image and then fill it in with new pixels when reconstructing it at a higher resolution, taking a “best guess” approach (machine-learned in DLSS’s case). This is why you’ll sometimes see the image resolve details better with DLSS and FSR vs. native.
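For reference, here's roughly what the quality presets render internally behind a 4K output (a sketch only; the scale factors below are the commonly published per-axis values for DLSS/FSR 2 and can vary per title):

```python
# Rough internal render resolutions behind a 3840x2160 output, using the
# commonly published per-axis scale factors for DLSS/FSR 2 presets.
# Exact factors can vary per title, so treat these as approximate.

OUTPUT_W, OUTPUT_H = 3840, 2160

PRESETS = {
    "Quality":     2 / 3,   # ~66.7% per axis
    "Balanced":    0.58,
    "Performance": 0.50,
}

for preset, scale in PRESETS.items():
    w, h = round(OUTPUT_W * scale), round(OUTPUT_H * scale)
    print(f"{preset:>12}: renders ~{w}x{h}, reconstructs to {OUTPUT_W}x{OUTPUT_H}")
```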
 

Leonidas

Member
Yeah, let's just brute force the shitty code with all the tricks that are available. Because DLSS was made for shitty developers to ensure their games run decently. No need to optimize games anymore, am I right?
Better than running the game at sub-60 FPS/Ultra.

We won't know how well the game is optimized till someone like DF analyzes it.

One easy way to optimize would be turning down to console settings, I'm sure the consoles ain't running this one at Ultra...
 

Leonidas

Member
Basically developers refuse to optimize anymore. There is no thought or work being put into optimization because they have the DLSS/FSR crutch.
If a 2080 performs worse than consoles at console settings then I would agree with you, but we have yet to see what console settings look like.

The only way I'd agree the game is unoptimized is if the RTX 2080 or RX 7600 performs 15+% worse than the consoles, at console settings.
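Put as a simple check (just a sketch of that rule of thumb; the fps values below are made-up placeholders, not measurements):

```python
def looks_unoptimized(pc_fps: float, console_fps: float, threshold: float = 0.15) -> bool:
    """True if the PC card lands more than `threshold` below the console's
    frame rate at matched (console) settings."""
    return pc_fps < console_fps * (1 - threshold)

# Hypothetical numbers for illustration only:
print(looks_unoptimized(pc_fps=48, console_fps=60))  # True  (20% slower)
print(looks_unoptimized(pc_fps=55, console_fps=60))  # False (~8% slower)
```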
 

StereoVsn

Member
If a 2080 performs worse than consoles at console settings then I would agree with you, but we have yet to see what console settings look like.

The only way I'd agree the game is unoptimized is if the RTX 2080 or RX 7600 performs 20% or more worse than the consoles, at console settings.
I am not saying they are optimizing particularly well for consoles either. Witness FFXVI and its shitty FSR-based image quality.

Edit: Meant XVI lol 😂
 

Holammer

Member
How does it fare with High/Very High settings, or whatever the notch below Ultra is called?
Some devs treat Ultra as something requiring crazy future tech, maybe it's one of those.
 

Reallink

Member
Either the game is the most comically unoptimized mess ever released or UE5 is all smoke and mirrors and will prove to be an infamous dog of an engine outside of perhaps Epic's own first-party titles. A game that looks worse than Destiny 1 absolutely should not be struggling to hit 40fps on a 100-teraflop card.
 

ZehDon

Gold Member
Running on my machine (i9 10900k, RTX3090, 64GB RAM) at 1440p Ultrawide all settings at Ultra without DLSS, I hover around 30FPS. With DLSS at Quality it pushes way over sixty. The only complaint I have is that the game doesn't seem to play nice with my GSYNC monitor, for some reason. Can't quite figure out why, but rather than 60FPS with input lag, I have to uncap the framerate and put up with the tearing.
 

Gaiff

SBI’s Resident Gaslighter
We are now in upside down world where people with $1600 GPUs are fine playing games at sub-native while people with $500 consoles are demanding native. This is what Digital Foundry does to a motherfucker...
Console gamers rightfully demand native when their base resolutions are sub-HD. A 720p base resolution upscaled to 1080p ends up looking worse than PS4 games in terms of image quality. In many respects, we've gone backwards.

They expect at least native 1080p and I can't blame them.
 
We are now in upside down world where people with $1600 GPUs are fine playing games at sub-native while people with $500 consoles are demanding native. This is what Digital Foundry does to a motherfucker...
it's a fn woke 🤡 site, it causes many issues and stupid system fighting
 
Oh look, another game that needs a new GPU to run effectively.

It's almost like there's some underlying collusion to force everyone to upgrade their GPU.
I mean, if nobody on anything below a 3090 can run current AAA games at playable frame rates they'll have to upgrade or be left behind.

No, that's not possible, there's no way that would happen.

It must be these developers that have just gotten lazy and are only making games that can be brute forced by 4090s to run at acceptable FPS...that's what is really going on.

Either way, if you don't have a 40 series, prepare to lower your settings and start getting used to sub-60fps.

I, of course, am being somewhat sarcastic... but there's certainly something not quite right about the sheer volume of games that are releasing as an unoptimised mess and/or with minimum requirements that shouldn't be as demanding as they are, considering how average some of them look.
 