
NVIDIA RTX 4090 cannot run Remnant 2 with 60fps at Native 4K/Ultra

Minsc

Gold Member
4k Ultra is yesterday's news. How does it run it at 8k Ultra on this rip-off card? Probably not even 30 fps. And shit, 16k Ultra probably can't even get 10fps, I imagine that's because of the paltry 32GB RAM, they should have gone for at least 512GB of RAM so then 16k Ultra would be possible. Don't even get me started on 32k or 64k Ultra settings. Can't even launch the game on those.

Not even sure why you'd not just wait for PS5 Pro, it'll be able to run it in performance mode at 8k.
 

Bernoulli

M2 slut
People that play in 4K, is it on a PC or TV? I'm happy with my 1080p screen and don't see the need to change.
I have a 4K HDR TV but I still prefer the 24-inch screen.
 
I don’t see what the big deal is. 4K at Ultra settings is the most demanding thing you can ask of a GPU and will naturally be difficult to push at 60+ FPS. I don’t know much about these devs, but like most they probably targeted consoles and PC was an afterthought.

If you’re on PC and going for performance you likely aren’t playing at 4K unless it is some single player showpiece/tech demo.
 

KyoZz

Tag, you're it.
We are now in upside down world where people with $1600 GPUs are fine playing games at sub-native while people with $500 consoles are demanding native. This is what Digital Foundry does to a motherfucker...
Please don't take a few people on GAF as a generality.
Remnant 2 is just a shitty unoptimized mess and the devs should be ashamed. End of story.
 

MikeM

Member
Sure. Keep using the worst PC port in years, equivalent to Cyberpunk console ports on release, to make a point.

Why not use Ghostwire Tokyo, Forspoken and Returnal PC versions instead?

BTW, I'll just wait for the DF analysis comparing consoles to the PC version; the same story is gonna repeat: stutters, but still way better IQ and more stable frame rates than on consoles with an equivalent GPU, since CPU and RAM bottlenecks aren't present on PC.
Stutters and more stable framerate seem like an odd pairing.
 

Gorgon

Member
People that play in 4K, is it on a PC or TV? I'm happy with my 1080p screen and don't see the need to change.
I have a 4K HDR TV but I still prefer the 24-inch screen.

Most gameplay on big TVs is probably done by console players, at least in relative numbers. And historically this makes total sense, of course. But it's becoming increasingly popular for PC players to go the couch route. Personally, what I can say is that I will probably return to PC next gen (because I miss mods and nothing else) and leave consoles behind (or at least the Xbox), but I certainly won't go back to sitting in a chair in front of a monitor.
 
Not really.

Crap-running PC versions have been a constant for the last 25 years.

Apart from a few select developers, using their own engines, smaller teams have taken the attitude that PC gamers will simply throw money at the problem to brute force their way past it with more powerful hardware.
PC gamers could easily brute force PS4 games; 60fps has been a standard for PC gamers since forever.
 

phant0m

Member
Another poorly optimized PC game then?

No? Is this whole thread just a troll? If a 4090 can’t run it @ 4K maxed out with no supersampling it’s a bad port?

I’m playing Remnant 2 early access with a 3080 and it runs fine? I mean, I guess if I blew like $1500 on a 4090 I’d probably be upset too, but honestly the game is fine guys:

  • 1440p/Ultra/DLSS Balanced: holds 72fps locked (half rate of 144Hz monitor)
  • 1440p/Ultra/DLSS Quality: holds 60fps locked
  • 1440p/High/DLSS Balanced: easily reaches 100 fps; I leave it unlocked because of G-Sync
  • 1440p/High/native: holds 60fps locked
 

Kataploom

Gold Member
Stutters and more stable framerate seem like an odd pairing.
A game that constantly falls below its target frame rate is different from one that meets or exceeds it but throws 100 ms stutters every few seconds or minutes for the first few hours of gameplay.
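A rough sketch with made-up frame times (toy numbers, not real captures) of why the two cases feel different even when the averages look close:

```python
# Toy frame-time traces in milliseconds, invented purely to illustrate the point.
steady_below_target = [18.0] * 600                 # ~55 fps on every single frame
locked_with_hitches = [16.7] * 598 + [100.0] * 2   # 60 fps with two 100 ms stutters

def avg_fps(frametimes_ms):
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    worst = sorted(frametimes_ms)[-max(1, len(frametimes_ms) // 100):]  # slowest 1% of frames
    return 1000.0 / (sum(worst) / len(worst))

for name, ft in [("steady below target", steady_below_target),
                 ("locked with hitches", locked_with_hitches)]:
    print(f"{name}: avg {avg_fps(ft):.1f} fps, 1% low {one_percent_low_fps(ft):.1f} fps")
# steady below target: avg 55.6 fps, 1% low 55.6 fps
# locked with hitches: avg 58.9 fps, 1% low ~22 fps
```

The hitchy trace wins on average fps, but its worst frames are what you actually feel.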
 

phant0m

Member
4K, Ultra, UE5, and a small developer infamous for delivering mediocre gfx/vfx..*

Yeah I wonder what could go "wrong", lol..

Just lower the settings and use DLSS.


*Gunfire Games are known for making pretty good games though
The visual difference between High and Ultra is pretty insignificant unless you’re comparing side-by-side screencaps. The performance difference is easily 10 if not 15 fps.

At 4K/High/DLSS balanced a 4090 should easily crush 100+ fps on this. But no, let’s make it run in the most extreme configuration possible (4K/Ultra/no DLSS) and whine instead.
 

yamaci17

Member
The whole point of buying the best of the best is to run stuff without compromise.

..and that applies to all consumer products
they can technically up the shadow resolution by 36x, buckle the 4090 down to 5 fps and call it an ultra setting.

who determines the compromise line?

ac valhalla has ugly draw distance despite everything at "max/ultra". but yeah, you can then simply say you get 4k ultra 60 fps on a 2080ti (nearly). so now you think you didn't compromise on quality?

there's no universal way of labeling something low, med, high or ultra.

hogwarts legacy had 4 texture options at launch,
low: 3000 mb of budget
medium: 3500 mb of budget
high: 4100 mb of budget
ultra: 5000 mb of budget

and then they went and made the old low the new high. now:

low: 1200 mb
medium: 1800 mb
high: 3000 mb
ultra: 5000 mb

for some other user, such as one with a 16 gb 4080, a 5 gb ultra texture budget is still a compromise. you could have a 10 gb texture cache and get extremely detailed textures even at an extreme view distance. however, the option simply is not there.
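to make it concrete, here's a tiny sketch using the budgets quoted above (the MB figures are the ones from this post, not official values): the labels moved, the top number didn't, and "ultra" is still just whatever pool size the devs stopped at.

```python
# Texture streaming budgets (MB) as quoted above; only the labels really moved,
# since the patched "high" is literally the launch "low" budget.
launch_budgets  = {"low": 3000, "medium": 3500, "high": 4100, "ultra": 5000}
patched_budgets = {"low": 1200, "medium": 1800, "high": 3000, "ultra": 5000}

assert patched_budgets["high"] == launch_budgets["low"]

# "ultra" is still just a number someone picked. A hypothetical 16 GB card could
# hold a much larger texture pool; the option simply isn't exposed in the menu.
vram_mb = 16 * 1024
print(f"'ultra' pool: {patched_budgets['ultra']} MB, "
      f"VRAM left untouched on a 16 GB card: ~{vram_mb - patched_budgets['ultra']} MB")
```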

or imagine Game A calling 1/8 resolution dithered ray tracing reflections "epic mega max ray tracing". will you simply be happy that you get a high framerate with their "label"?

then imagine Game B calling full resolution ray tracing reflections "medium" and 2x resolution ray tracing reflections "high". now what happens in relation to the game above?

so you didn't compromise on image quality in Game A because you could run the "epic mega max ray tracing" label with high framerates, but you did with Game B?

we don't even know what ultra settings entail. for all we know, there could be one crucial setting that hampers framerates by 40% with barely any image quality difference. there's a reason most gtx 1000 and midrange 2000 users do not bitch as much as 3000 and 4000 users do. they simply set their games to sane medium/high settings and get great performance across most titles.

this even includes last of us part 1, which runs great at a med/high mix of settings on a 1070/2060. but once you push ultra, even a 3070 buckles at 1080p. that's how it is.

spiderman's remastered pc version literally calls an obnoxiously bad ray tracing resolution "high". from their perspective, it is high. so we have to take their word for it? and they labeled a setting "very high" because it makes reflections somewhat bearable. worse still, you could get even higher reflection quality, and there are indeed cards that can handle it, but they won't expose it. why would they? run-of-the-mill users should be happy that they play at "very high" ray tracing without a compromise and get great performance while doing it.

so it is how it is, huh?

people have to stop thinking in "preset" terms. just focus on what is on the screen. if high and ultra look 99% the same and one is 50% more performant, then I hope devs present epic settings that take 250% more performance for a 0.5% improvement in image quality, just to mess with you guys who refuse to compromise on an image quality bar that somehow you believe is determined by some random dev.
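a quick back-of-the-envelope sketch with the made-up numbers from this post (high vs ultra looking ~99% the same for a 50% cost, plus the joke 250%-for-0.5% "epic" preset), just to show what that trade looks like in fps terms:

```python
# Hypothetical numbers taken from the post above; the point is the ratio, not the values.
base_fps = 100.0  # pretend "high" runs at a round 100 fps

presets = {
    # name: (extra frame-time cost vs. high, visible image-quality gain vs. high)
    "ultra": (0.50, 0.01),    # ~50% more expensive, looks ~1% different
    "epic":  (2.50, 0.005),   # 250% more expensive for a 0.5% improvement
}

for name, (extra_cost, gain) in presets.items():
    fps = base_fps / (1.0 + extra_cost)  # more frame time per frame -> fewer frames per second
    print(f"{name}: ~{fps:.0f} fps, {extra_cost / gain:.0f}% extra cost per 1% of visible gain")
# ultra: ~67 fps, 50% extra cost per 1% of visible gain
# epic:  ~29 fps, 500% extra cost per 1% of visible gain
```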
 

SHA

Member
[image]
 

phant0m

Member
The whole point of buying the best of the best is to run stuff without compromise.

..and that applies to all consumer products

Diminishing returns applies to all consumer goods (particularly electronics/hardware) as well. “I spent twice as much on my GPU” does not translate to twice the frame rate, twice the resolution or anything else and never has.

The extreme ends of gaming, in both presets and actual hardware, have never really been optimized for. In some cases, the devs will flat out say “our ultra/extreme/insane preset is meant for the capabilities of future hardware”.

While I like the ability to tweak game settings on my PC to my heart’s content, I’ve found over the last several years that just letting GFE optimize my graphics settings for my hardware yields the best results when I just want to play the game.
 

TrebleShot

Member
Wtf, this is utter bullshit. So NVIDIA releases the most expensive GFX card in centuries.

Devs start relying on that super expensive card just for their games to run at a minimum of 60fps, so gaming at that threshold becomes stupidly expensive.

This is pathetic. PC gaming is in a right state.
 

GymWolf

Member
Thank god I bought a GPU with framegen, it brute-forces smoothness in almost everything :lollipop_grinning_sweat:

The game looks good, but not so good that it should bring a 4090 to its knees without DLSS.
 

GymWolf

Member
How does it fare with with High/Very High settings or whatever the one notch lower than Ultra is called?
Some devs treat Ultra as something requiring crazy future tech, maybe it's one of those.
Shadows on normal instead of ultra save you a lot of frames without much loss.

At least this is what I read on reeee; I still have to try it because I haven't had much of a problem yet.
 

phant0m

Member
How does it fare with with High/Very High settings or whatever the one notch lower than Ultra is called?
Some devs treat Ultra as something requiring crazy future tech, maybe it's one of those.
i can't speak for a 4090 but my 3080 easily holds 72 fps @ 1440p/High/DLSS Quality

the bigger issue is not using DLSS.
as the devs said, they designed and optimized the game around upscaling tech, so i don't know why we're whining about performance when we disable that.

but i suppose "RTX 4090 gets 80+ fps @ 4K/Ultra/DLSS Balanced" doesn't really make for much of a headline, does it?
 

night13x

Member
We might as well accept that going forward a lot of devs will rely on DLSS 2/3 instead of making sure their games are very well optimized. In fact it probably pleases their masters (aka Nvidia) to push current gen card sales.
 

Pey.

Member
I played the prologue of Remnant 2 at 1440p @ Max Settings with an RTX 4060 (a $300 card) and I get a solid 60 FPS experience, and would get more if not for frame generation forcing vsync on for some reason. I think some people confuse DLSS 2/3 with console upscalers, and that's the main problem. They are not the same at all. It has been proven (and you can see for yourself in the image comparison) that DLSS 2 Quality looks even better than native, so you get improved visuals and can literally double the performance in some games with DLSS 3. I know FG is exclusive to RTX 40, but it's not like the gains aren't worth it.

I did a full review of the 4060 with lots of videos and image comparisons, and FG is truly a game changer. Being able to play games like Cyberpunk 2077, Sackboy, APT: Requiem and Jedi: Survivor at 1440p @ Max Settings at 60 FPS says a lot about how much a proper upscaling technology can make the difference between a game being playable and being fully enjoyable.

Here is an image comparison between 1440p Native (27 FPS), DLSS 2 Quality (46 FPS) and DLSS 3 Quality (60 FPS+): https://www.pcmrace.com/wp-content/uploads/2023/07/remnant24060.html

 

mrcroket

Member
People often underestimate the power of consoles and overestimate the power of their own hardware. You find many people saying that their game runs better than on console at 1440p with DLSS, when in reality they are rendering at 960p.
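For reference, a small sketch using the commonly quoted DLSS input-resolution scale factors (roughly 0.67 for Quality, 0.58 for Balanced, 0.5 for Performance); treat the exact factors as approximate, but they show where the 960p figure comes from:

```python
# Commonly quoted DLSS per-axis scale factors; the GPU renders at the internal
# resolution and the upscaler outputs at the display resolution.
DLSS_SCALE = {"Quality": 2 / 3, "Balanced": 0.58, "Performance": 0.5, "Ultra Performance": 1 / 3}

def internal_res(width, height, mode):
    scale = DLSS_SCALE[mode]
    return round(width * scale), round(height * scale)

print(internal_res(2560, 1440, "Quality"))   # (1707, 960)  -> "1440p DLSS Quality" renders at 960p
print(internal_res(3840, 2160, "Balanced"))  # (2227, 1253) -> "4K DLSS Balanced" renders at ~1250p
```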
 