
NVIDIA RTX 4090 cannot run Remnant 2 at 60fps at Native 4K/Ultra

Kataploom

Gold Member
Trolling?
No, people shit on PC ports, but most of the time if the PC version is bad, all versions are bad day one, with PC still coming out ahead on IQ and framerate thanks to brute force. When the PC version feels unoptimized, the console versions end up being like 720p or 800p upscaled with FSR, like Jedi Survivor and Forspoken.

The one notable exception has been TLOU, which is literally the worst PC port in ages... That thing can only be brute forced due to how bad it is, and even that's only the case now, because glitches and bugs were commonplace for weeks after launch.

So if this game struggles at 4K on a card that is probably as powerful as PS6 will be, I wouldn't even expect 1080p native on performance mode on consoles.
 

EDMIX

Member
smh, dumb Nvidia, it's cause the 4090 is too WEAK for how demanding and complex Remnant 2 is, so we need a 5090... also we need a PS6 and Series ABC to really, like really, really support this game.

This game wasn't even made, like, for sure for this generation, that's why it has problems: it's Nvidia, MS and Sony's fault, nothing to do with the developer.....

/s
 

Kakax11

Banned
[image]
 

Ywap

Member
The stuttering Unreal Engine is ruining gaming. Good thing Sony and Ubi aren't using it yet.
 

S0ULZB0URNE

Member
No, people shit on PC ports, but most of the time if the PC version is bad, all versions are bad day one, with PC still coming out ahead on IQ and framerate thanks to brute force. When the PC version feels unoptimized, the console versions end up being like 720p or 800p upscaled with FSR, like Jedi Survivor and Forspoken.

The one notable exception has been TLOU, which is literally the worst PC port in ages... That thing can only be brute forced due to how bad it is, and even that's only the case now, because glitches and bugs were commonplace for weeks after launch.

So if this game struggles at 4K on a card that is probably as powerful as PS6 will be, I wouldn't even expect 1080p native on performance mode on consoles.
If the game is using the super fast data streaming tech of the PS5 to optimize that version, then we can't compare it to the PC, which uses no such tech.

PS5 has been used to show off UE5 on a few occasions.
Maybe it's using some of those optimizations?
I am assuming XBSX as well.


PS6 will put out better visuals than the 4090.
 

S0ULZB0URNE

Member
smh, dumb Nvidia, it's cause the 4090 is too WEAK for how demanding and complex Remnant 2 is, so we need a 5090... also we need a PS6 and Series ABC to really, like really, really support this game.

This game wasn't even made, like, for sure for this generation, that's why it has problems: it's Nvidia, MS and Sony's fault, nothing to do with the developer.....

/s
Runs great on PS5 🙈
 

Kakax11

Banned
Turing launched in 2018 and the PS5 in 2020; the PS6 is slated for 2028, so it isn't far-fetched to think it will have better performance than a 2022 card lol. The PS5 is faster than the 1080, which was 4 years old at the time; the 4090 will be 6 years old.

Nah, it's far-fetched because AMD is meh while Nvidia is big boi stronk; the PS5 was slower than the 2080 when it came out, so yeah.
 

Kataploom

Gold Member
If the game is using the super fast data streaming tech of the PS5 to optimize that version, then we can't compare it to the PC, which uses no such tech.

PS5 has been used to show off UE5 on a few occasions.
Maybe it's using some of those optimizations?
I am assuming XBSX as well.


PS6 will put out better visuals than the 4090.
If you keep saying so I'll start believing you DO believe it, like, for real lol.

BTW, this game is probably gonna be fixed soon.
 

Kataploom

Gold Member
Turing launched in 2018 and the PS5 in 2020; the PS6 is slated for 2028, so it isn't far-fetched to think it will have better performance than a 2022 card lol. The PS5 is faster than the 1080, which was 4 years old at the time; the 4090 will be 6 years old.
The PS5 GPU is somewhere around a 2060 Super to 2080 (best case) though; if the same increment applies next GPU gen, we'll probably get a GPU in the PS6 that's weaker than a 4090, or on par at most.
 

Hoddi

Member
Oh look, another game that needs a new GPU to run effectively.

It's almost like there's some underlying collusion to force everyone to upgrade their GPU.
I mean, if nobody on anything below a 3090 can run current AAA games at playable frame rates they'll have to upgrade or be left behind.

No, that's not possible, there's no way that would happen.

It must be these developers that have just gotten lazy and are only making games that can be brute forced by 4090s to run at acceptable FPS...that's what is really going on.

Either way, if you don't have a 40 series prepare to lower your settings and start getting used to sub 60fps.

I, of course, am being somewhat sarcastic...but there's certainly something not quite right about the sheer volume of games that are releasing as an unoptimised mess and/or with minimum requirements that shouldn't be as demanding as they are, considering how average some of them look.
I don't see how a game developer benefits from pushing nvidia GPUs. They're just gonna sell fewer copies if their game doesn't run well.

If anyone would be getting kickbacks, then it would be Epic Games since they created UE5. And I seriously doubt that.
 
I don't see how a game developer benefits from pushing nvidia GPUs. They're just gonna sell fewer copies if their game doesn't run well.

If anyone would be getting kickbacks, then it would be Epic Games since they created UE5. And I seriously doubt that.
Not if enough people upgrade, and not if they are being helped by going straight to game pass, being partnered with Nvidia or AMD, or going free to play.

I agree, it doesn't necessarily make sense even with those possibilities, so the question is why are we now seeing game after game after game being unable to run well on anything but the most expensive GPU - and even then it's not always enough?

Think Forspoken, TLOU, Dead Space Remake, Gollum, Hogwarts Legacy, Gotham Knights, Jedi Survivor, and now Remnant 2, to name just a few that have had performance issues, stuttering, poor optimisation, memory leaks, excessive VRAM requirements, etc...
 
Not if enough people upgrade, and not if they are being helped by going straight to game pass, being partnered with Nvidia or AMD, or going free to play.

I agree, it doesn't necessarily make sense even with those possibilities, so the question is why are we now seeing game after game after game being unable to run well on anything but the most expensive GPU - and even then it's not always enough?

Think Forspoken, TLOU, Dead Space Remake, Gollum, Hogwarts Legacy, Gotham Knights, Jedi Survivor, and now Remnant 2, to name just a few that have had performance issues, stuttering, poor optimisation, memory leaks, excessive VRAM requirements, etc...
I assume it's a combination of the diverse set of PC configurations, the less optimised software environment, and in many cases the fact that dynamic shader compilation needs to be accounted for on PC while it can be precompiled on console (a toy sketch of that last point is below).

Also doesn't help that Nvidia especially are gimping the memory on graphics cards to encourage upgrades, and to delineate gaming cards from professional cards.

Developers want to be able to sell to as many people as possible (potentially unless they have a deal with Nvidia or AMD), but there are a lot of obstacles to overcome when releasing a game on PC and if there are additional obstacles on top of that then... fuck it.
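A toy sketch of that shader-compilation point, in Python rather than anything engine-specific, with a made-up ~50 ms compile cost just to show the shape of the problem: compiling a pipeline the first time a frame needs it lands the cost mid-frame (the hitch), while precompiling during a load screen, the way a fixed-spec console or a shipped shader cache can, pays it up front.

```python
import time

# Toy cost model only: pretend every shader takes ~50 ms to compile.
def compile_shader(name: str) -> str:
    time.sleep(0.05)
    return f"compiled:{name}"

cache: dict[str, str] = {}

def draw(shader: str) -> float:
    """Return how long this 'frame' spent waiting on shader compilation."""
    start = time.perf_counter()
    if shader not in cache:          # on-demand path: the hitch lands mid-frame
        cache[shader] = compile_shader(shader)
    return time.perf_counter() - start

# On demand: the first frame that needs a new shader stalls, later frames don't.
print(f"first use: {draw('water'):.3f}s, second use: {draw('water'):.3f}s")

# Precompiled: pay the cost on a load screen instead, then frames stay smooth.
for s in ("fire", "smoke", "fur"):
    cache[s] = compile_shader(s)
print(f"after precompile: {draw('fire'):.3f}s")
```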
 

Hoddi

Member
Not if enough people upgrade, and not if they are being helped by going straight to game pass, being partnered with Nvidia or AMD, or going free to play.

I agree, it doesn't necessarily make sense even with those possibilities, so the question is why are we now seeing game after game after game being unable to run well on anything but the most expensive GPU - and even then it's not always enough?

Think Forspoken, TLOU, Dead Space Remake, Gollum, Hogwarts Legacy, Gotham Knights, Jedi Survivor, and now Remnant 2, to name just a few that have had performance issues, stuttering, poor optimisation, memory leaks, excessive VRAM requirements, etc...
It's a good question but I don't know the answer to that. These issues only really started when games transitioned to DX12 so maybe that's the reason. nvidia's DX12 driver also supposedly has some issues with overhead and maybe that's compounding those DX12 issues. Or maybe it's just nvidia's DX12 driver itself that's the problem?

Hell, maybe nvidia is sandbagging performance to sell more GPUs. I don't even slightly believe that but it's still more plausible than a game developer sabotaging their own game to please GPU vendors. Or maybe UE5 is simply very performance intensive?
 

Kataploom

Gold Member
[image: benchmark chart]


PS5 version runs at higher settings than the 2070 Super pictured here.
Sure. Keep using the worst PC port in years, equivalent to Cyberpunk console ports on release, to make a point.

Why not use Ghostwire Tokyo, Forspoken and Returnal PC versions instead?

BTW, I'll just wait for the DF analysis comparing consoles to the PC version; the same story is gonna repeat: stutters, but way better IQ and more stable frame rates than on consoles with an equivalent GPU, since the CPU and RAM bottlenecks aren't present on PC.
 

S0ULZB0URNE

Member
Sure. Keep using the worst PC port in years, equivalent to Cyberpunk console ports on release, to make a point.

Why not use Ghostwire Tokyo, Forspoken and Returnal PC versions instead?

BTW, I'll just wait for the DF analysis comparing consoles to the PC version; the same story is gonna repeat: stutters, but way better IQ and more stable frame rates than on consoles with an equivalent GPU, since the CPU and RAM bottlenecks aren't present on PC.
Those games aren't in the same league of optimization as TLOU on PS5.

So 18fps lower on a 2070 Super with lower settings.
If a 2070 Super ran PS5 equivalent settings it would likely be a 25-30fps PS5 advantage.

Oh and stutters are a deal breaker.
 

Gaiff

SBI’s Resident Gaslighter
PS6 will put out better visuals than the 4090.
Imagine bragging about a system coming out 5-6 years after the 4090 putting out better visuals. It better fucking do that since that's almost an entire console generation.

What's next? The next Xbox will put out better visuals than the PS5?

Those games aren't in the same league of optimization as TLOU on PS5.

So 18fps lower on a 2070 Super with lower settings.
If a 2070 Super ran PS5 equivalent settings it would likely be a 25-30fps PS5 advantage.

Oh and stutters are a deal breaker.
System warriors not operating on logic. What else is new?
 

S0ULZB0URNE

Member
Imagine bragging about a system coming out 5-6 years after the 4090 putting out better visuals. It better fucking do that since that's almost an entire console generation.

What's next? The next Xbox will put out better visuals than the PS5?


I swear, system warriors don't operate on logic.
Imagine jumping in a conversation without knowing all the facts.
So if this game struggles at 4K on a card that is probably as powerful as PS6 will be, I wouldn't even expect 1080p native on performance mode on consoles.
 

Kataploom

Gold Member
Those games aren't in the same league of optimization as TLOU on PS5.

So 18fps lower on a 2070 Super with lower settings.
If a 2070 Super ran PS5 equivalent settings it would likely be a 25-30fps PS5 advantage.
What do you know? One can judge the code quality of a bad port by its results, as evidenced by the TLOU PC port, but to compare TLOU vs Returnal on PS5 you literally have to know insider details from both development processes. Do you?

But following your point, similarly, TLOU isn't in the same league of optimization and quality as even an average game on PC. Stop using it as an example of anything but mismanagement and the devs' lack of experience on the platform.

Same for Remnant 2. A bad port can (and usually does) get fixed tho.
 

S0ULZB0URNE

Member
What do you know? One can judge the code quality of a bad port by its results, as evidenced by the TLOU PC port, but to compare TLOU vs Returnal on PS5 you literally have to know insider details from both development processes. Do you?

But following your point, similarly, TLOU isn't in the same league of optimization and quality as even an average game on PC. Stop using it as an example of anything but mismanagement and the devs' lack of experience on the platform.

Same for Remnant 2. A bad port can (and usually does) get fixed tho.
You're right, TLOU isn't on the level; it's on another level than most games, including most PC games.

Remnant 2 plays great for me 🫡
Keep blaming the ports when this is the norm on PC; it keeps happening and will continue all generation.
 

draliko

Member
It's simply a bad PC port; shit happens, on PC and on consoles, especially with small devs.. (TLOU is a rare case, luckily)... Anyway, some of you guys are gullible for sure... I can see why people still send money to a Nigerian prince... Do you also believe reptilians rule the world?
 

Bojji

Member
The 2080 Ti is a 16.5TF FP32 GPU at its 1900MHz gaming boost (normal clock during gaming) and can be OC'd to 2100MHz, aka a real 18.3TF.
An OC'd 2080 Ti is almost at the level of a stock 3080.

It's not. It can run some games better thanks to 1GB of additional VRAM, but you can't match 3080 performance without some extreme OC or some shit.
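For reference, the napkin math behind those TF figures, assuming the 2080 Ti's 4352 FP32 CUDA cores and 2 FLOPs per core per clock (the core count isn't stated in the post itself):

```python
# Rough FP32 throughput: shader cores x 2 FLOPs per clock x clock speed.
# Assumes the RTX 2080 Ti's 4352 CUDA cores.
def fp32_tflops(cuda_cores: int, clock_ghz: float) -> float:
    return cuda_cores * 2 * clock_ghz / 1000.0

print(f"{fp32_tflops(4352, 1.9):.1f} TF")  # ~16.5 TF at the ~1900 MHz gaming boost
print(f"{fp32_tflops(4352, 2.1):.1f} TF")  # ~18.3 TF at a 2100 MHz overclock
```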
 

DenchDeckard

Moderated wildly
4K native with every setting on ultra.. that should push a system. See how much they can rein it in with updates.
 

Bojji

Member
4K native with every setting on ultra.. that should push a system. See how much they can rein it in with updates.

But this game isn't looking that good; it's not even using RT, and probably no Lumen or Nanite. It runs bad for what it's rendering. Looks like UE5 won't be much better than UE4: CPU utilisation is still piss poor and shader cache stuttering is still a problem, fucking Epic...
 
Oh look, another game that needs a new GPU to run effectively.

It's almost like there's some underlying collusion to force everyone to upgrade their GPU.
I mean, if nobody on anything below a 3090 can run current AAA games at playable frame rates they'll have to upgrade or be left behind.

No, that's not possible, there's no way that would happen.

It must be these developers that have just gotten lazy and are only making games that can be brute forced by 4090s to run at acceptable FPS...that's what is really going on.

Either way, if you don't have a 40 series prepare to lower your settings and start getting used to sub 60fps.

I, of course, am being somewhat sarcastic...but there's certainly something not quite right about the sheer volume of games that are releasing as an unoptimised mess and/or with minimum requirements that shouldn't be as demanding as they are, considering how average some of them look.
We've always had incompetent developers. And we always will.
 
I assume it's a combination of the diverse set of PC configurations, the less optimised software environment, and in many cases the fact that dynamic shader compilation needs to be accounted for on PC while it can be precompiled on console.

Also doesn't help that Nvidia especially are gimping the memory on graphics cards to encourage upgrades, and to delineate gaming cards from professional cards.

Developers want to be able to sell to as many people as possible (potentially unless they have a deal with Nvidia or AMD), but there are a lot of obstacles to overcome when releasing a game on PC and if there are additional obstacles on top of that then... fuck it.
Nvidia are playing stupid games, offering more and more variants - often with insufficient VRAM and often at ridiculous prices.

They have a near monopoly, being the better-known manufacturer (at least in some PC gamers' eyes).
 
It's a good question but I don't know the answer to that. These issues only really started when games transitioned to DX12 so maybe that's the reason. nvidia's DX12 driver also supposedly has some issues with overhead and maybe that's compounding those DX12 issues. Or maybe it's just nvidia's DX12 driver itself that's the problem?

Hell, maybe nvidia is sandbagging performance to sell more GPUs. I don't even slightly believe that but it's still more plausible than a game developer sabotaging their own game to please GPU vendors. Or maybe UE5 is simply very performance intensive?
I agree, it doesn't seem plausible...but what can the reason be then?

All of a sudden, none of these developers can make a game that runs well, often on far superior hardware to that available even a few years ago?

They can offer their product to people with a 4090 and it still can't run at a stable 60fps at high settings, often struggling at 1080p, much less 1440 or 4k.

I don't understand it, and I'm naturally a cynic as it is.
 

Techies

Member
I agree, it doesn't seem plausible...but what can the reason be then?

All of a sudden, none of these developers can make a game that runs well, often on far superior hardware to that available even a few years ago?

They can offer their product to people with a 4090 and it still can't run at a stable 60fps at high settings, often struggling at 1080p, much less 1440 or 4k.

I don't understand it, and I'm naturally a cynic as it is.
The problem is 4K; flat screen gaming honestly doesn't need it for the average user. Most people with 4K own televisions, and people sit a distance away from them and play on a controller.

When you go to 1440p the fps increase is drastic. Upscaling and 1440p is the best combination and doesn't murder the fps.

Normal people like me most likely have a 1080p 144Hz display.

The game will run well for the majority of people.
 

Dr.D00p

Gold Member
Basically developers refuse to optimize anymore. There's no thought or work being put into optimization because they have the DLSS/FSR crutch.

Not really.

Crap running PC versions have been a constant for the last 25yrs.

Apart from a few select developers using their own engines, smaller teams have taken the attitude that PC gamers will simply throw money at the problem and brute force their way past it with more powerful hardware.
 

hyperbertha

Member
It seems DLSS, rather than providing a gateway to higher realms of being such as 4K 144Hz, will now be used as a crutch by devs to get back to baseline, with anything without DLSS being unplayable.
 

nemiroff

Gold Member
4K, Ultra, UE5, and a small developer infamous for delivering mediocre gfx/vfx..*

Yeah I wonder what could go "wrong", lol..

Just lower the settings and use DLSS.


*Gunfire Games are known for making pretty good games though
 

theclaw135

Banned
Not really.

Crap running PC versions have been a constant for the last 25yrs.

Apart from a few select developers using their own engines, smaller teams have taken the attitude that PC gamers will simply throw money at the problem and brute force their way past it with more powerful hardware.

PC developers are deathly afraid of making games that ARE that demanding. They know how many people will whine about how the game is "unoptimized".
 

zeomax

Member
What does this chart even mean?
It shows which looks better, native resolution or DLSS.
For example, Call of Duty in native 4K looks slightly better than in 4K DLSS quality mode.
Death Stranding looks way better in 4K DLSS quality mode than in native 4K.
(+) = almost the same or slightly better
(++) = better quality
(+++) = way better quality
(tie) = no visual difference between native and DLSS
 

CuNi

Member
In the beginning, games had to be super optimized for the hardware to push new gameplay...
Then, as hardware evolved faster than people could optimize for it, it was used to brute force advancements...
As hardware innovation slowed down, new solutions like AI and Upscaling had to be invented to continue the already established standards of brute forcing...

What I am saying is, if someone were to chart it, it would scientifically show that over a long enough period of time, all games culminate in Visual Novels.
Thank you for coming to my TED talk.
 

nemiroff

Gold Member
In the beginning, games had to be super optimized for the hardware to push new gameplay...
Then, as hardware evolved faster than people could optimize for it, it was used to brute force advancements...
As hardware innovation slowed down, new solutions like AI and Upscaling had to be invented to continue the already established standards of brute forcing...

What I am saying is, if someone were to chart it, it would scientifically show that over a long enough period of time, all games culminate in Visual Novels.
Thank you for coming to my TED talk.

I'm not really trying to dispute your post, just adding some big points, so to speak..

When it comes to Remnant 2, the developer is tiny in the big picture, so there's not much to extrapolate into the overarching narrative.

What happened when it comes to processing power isn't a secret, and it's not about optimization or "brute forcing": we've talked about the impact of 4K at length, after 1080p had been the standard for many years.

[image]

1080p to 4K alone is 2,073,600 pixels vs 8,294,400 pixels.. It is/was a giant leap worth generations of GPU power (hence also partly why many decided to land on a more sober 1440p).
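A quick back-of-the-envelope check of those numbers, assuming standard 16:9 resolutions:

```python
# Pixels pushed per frame at common 16:9 resolutions.
resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")   # 2,073,600 / 3,686,400 / 8,294,400

# 4K is exactly 4x the pixels of 1080p and 2.25x the pixels of 1440p.
print(3840 * 2160 / (1920 * 1080), 3840 * 2160 / (2560 * 1440))  # 4.0 2.25
```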

That's not the whole story of course, because in addition video games over the years have become complex, wide, and deep to a degree that beggars belief*


*It's funny, but around the time the Atari 2600 (~160x192 resolution btw) was the big thing, the biggest games of the era took, on average, one or two 9-to-5 engineers about two or three months to complete.
 