
Crysis Remastered is the perfect benchmark for the next-gen consoles

VFXVeteran

Banned
All,

I had to reflect on the DF review of Crysis Remastered on the next-gen consoles, so I went and installed the PC version with the new DLSS patch. After playing for a while, it dawned on me what a true generational power delta should mean.

My claim is that a reasonable jump in power should manifest itself when a game with a dated, unoptimized graphics engine runs on a new platform, showing how far the hardware can brute-force its way through all the unoptimized code in the rendering engine.

We know this company didn't go through and optimize the code for any modern graphics hardware. It has the following shortcomings:

1) No PBR materials

2) Very low polygon count

3) No tessellation routines

4) No true volumetric FX

The game does, however, spam high-res texture assets, and the lighting engine was reworked with brute-force algorithms. Here is the list of things weighing down the GPU/CPU:

1) Nearly every object has a hit-box, which requires hit-detection code (CPU-limited)

2) The lighting engine now uses ray tracing (hardware-accelerated on RTX cards only)

3) Spammed 8K textures (color and normal maps)

4) RT reflections

5) Reworked high-res transparency/alpha FX

The mere fact that they had to cut so many graphics features to get the consoles to run at a target 60 FPS is jarring, to say the least. Even on an RTX 3090 with all of the team's graphics enhancements activated, it ran well above 40 FPS at native 4K but short of 60. It wasn't until the DLSS patch, which uses the hardware Tensor cores to approximate the final image, that the RTX boards were able to run at 60 FPS with all features enabled.

My take is that this benchmark gives the gamer a "looking glass" into how fast the next-gen consoles can brute-force raw pixels onto the screen, and it shows me that the consoles are grossly underpowered. With reviews of new games coming down the pike more and more rapidly, we are looking at last gen all over again. I'm very disappointed by this, as I feel MS/Sony need to shoot for a higher bar, suck up the costs, and price the consoles accordingly (~$1K or more). It is quite clear that Nvidia should be the chosen platform for the GPU, and I'm dumbfounded that they both continue to rely on AMD for their heart transplant each generation.

FYI, here is a video of a massive fight with explosions and transparency polygons all over the place. Even with DLSS enabled, the framerate drops to the mid-to-high 40s.

 

Andodalf

Banned
This would mean something if it were a native game, but this is just enhanced BC. We aren't seeing next-gen IPC improvements and hardware features here.
 

VFXVeteran

Banned
This would mean something if it were a native game, but this is just enhanced BC. We aren't seeing next-gen IPC improvements and hardware features here.
We are seeing some next-gen features (8k textures, RT GI). I'm not marketing this game as the de facto standard for how pretty a next-gen game can look. On the contrary, the code is ridiculously unoptimized and that's the point.
 
Didn't Nvidia fuck over both Microsoft and Sony on their GPUs in the past, while Nintendo is fine with the toaster leavings that Nvidia bequeaths upon them?
 

VFXVeteran

Banned
They aren’t underpowered if you understand the market.

A PlayStation designed around a $1K price point would be dope, but they'd rather just bring some games to the PC platform and sell a PS5 Pro down the line.
You gotta pay for power. The PS5 Pro would still be expensive if it's going to be powerful enough to brute-force crappy game engines.
 

VFXVeteran

Banned
Don't both the PS5 and Xbox Series versions run in back-compat mode? How can that be a reasonable example of power?
If I took a game engine and I wanted it to render several billion triangles with a PBR shader attached to them and a crappy algorithm to iterate through the triangles, it should render pretty damn fast. If it doesn't, then there is a problem.
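To be concrete about what I mean by brute-forcing a crappy iteration, here's a toy sketch (my own illustration in plain C, nothing to do with Crytek's actual code) of a dumb linear walk over a huge triangle list, with a clamped N·L term standing in for a real PBR shader. Modern hardware chews through loops like this on raw throughput alone:

/* Toy sketch, not the game's code: brute-force walk over a big triangle list
   with a simple Lambert term standing in for a PBR shader. The triangle count
   and data are made up. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

typedef struct { float nx, ny, nz; } Tri;         /* just a face normal for this toy */

int main(void)
{
    const size_t count = 50000000;                /* 50M triangles, arbitrary number */
    Tri *tris = malloc(count * sizeof(Tri));
    if (!tris) return 1;

    for (size_t i = 0; i < count; ++i) {          /* fill with dummy data */
        tris[i].nx = 0.0f; tris[i].ny = 1.0f; tris[i].nz = 0.0f;
    }

    const float lx = 0.3f, ly = 0.9f, lz = 0.3f;  /* fixed light direction */
    double accum = 0.0;
    clock_t t0 = clock();

    for (size_t i = 0; i < count; ++i) {          /* the "crappy" linear iteration */
        float ndotl = tris[i].nx * lx + tris[i].ny * ly + tris[i].nz * lz;
        accum += (ndotl > 0.0f) ? ndotl : 0.0f;   /* clamped Lambert term */
    }

    double ms = 1000.0 * (double)(clock() - t0) / CLOCKS_PER_SEC;
    printf("%zu triangles shaded in %.1f ms (checksum %.1f)\n", count, ms, accum);
    free(tris);
    return 0;
}

Obviously a GPU feeding vertex data through a real shader is a different beast, but the principle is the same: a dumb loop is limited only by raw throughput, so more hardware should translate almost directly into more speed.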
 
You are completely missing the point.
I'm not missing the point; I don't think you understand what point you are trying to make. The PS5 and Series X are running the last-gen versions of the game in back-compat mode, meaning they are not native to the hardware and are leaving a TON of performance on the table. On top of that, this is a bad port to begin with. Whatever you think these performance numbers indicate, they don't.
 

SkylineRKR

Member
This game is a piece of shit that is indicative of nothing in my book. There are also good games, like God of War, essentially being brute-forced to 60 FPS on PS5. Crysis is a PS4 Pro game on PS5 with just the resolution and framerate upped somewhat (not even sure about the former, though), but still shit. It has all the last-gen quirks, doesn't utilize the SSD at all, and chokes when it saves checkpoints.

Personally, I wouldn't mind $1K consoles; I use a console a lot and buy one every 6 years or so (the Pro was an exception, since it was the first mid-gen refresh). If they can cram something like a high-end Nvidia card in there, it's worth it. The problem, though, is that you severely limit your market. And even a high-end video card will eventually be surpassed; I'd hope the price of the console drops accordingly then.
 
Last edited:

VFXVeteran

Banned
I'm not missing the point; I don't think you understand what point you are trying to make. The PS5 and Series X are running the last-gen versions of the game in back-compat mode, meaning they are not native to the hardware and are leaving a TON of performance on the table. On top of that, this is a bad port to begin with. Whatever you think these performance numbers indicate, they don't.
"The dumb developer doesn't know what he's talking about.."

Carry on.
 

VFXVeteran

Banned
Let me be clear so people aren't confused.

This thread is NOT about showing off what the next-gen consoles can do with excellent optimized code with a complete streamlined pipeline built around the consoles.

It's about how powerful the next-gen consoles are at brute-forcing rasterization of a crappy, unoptimized engine. The exact opposite of how people are viewing this thread.
 
Last edited:

Chukhopops

Member
If I took a game engine and I wanted it to render several billion triangles with a PBR shader attached to them and a crappy algorithm to iterate through the triangles, it should render pretty damn fast. If it doesn't, then there is a problem.
I want to say that I'm impressed by your patience in explaining things thread after thread when half (if not more) of the replies you get are from idiots completely missing the point you are making. Even when you're not making any comparison between consoles you still get console-war posts; anyone else would have given up by now.
 

Venom Snake

Gold Member
" I'm very disappointed in this fact as I feel that MS/Sony needs to shoot for a higher bar, suck up the costs and price the consoles accordingly (~$1k or more). It is quite clear that Nvidia should be the chosen platform for the GPU and I'm dumbfounded that they both continue to rely on AMD for their heart transplant each generation."

Don't bother with consoles if you don't understand what they are for.
 

mrqs

Member
If I took a game engine and I wanted it to render several billion triangles with a PBR shader attached to them and a crappy algorithm to iterate through the triangles, it should render pretty damn fast. If it doesn't, then there is a problem.
Yes, but both versions run in back-compat mode. Assassin's Creed Unity runs badly with its framerate unlocked even on the PS5. Back-compat mode has some restrictions on what can be achieved, because as far as I know they don't have access to all the available power.
 

JackMcGunns

Member
Unreal Engine 5 will be a good benchmark. Right now Sony has a deal with Epic, which will unfortunately skew any objective comparison between PS5/XSX, but we can at least see what to expect out of both. I'm sure AAA games using UE5 will blow away Crysis visually.
 
Last edited:

RoadHazard

Gold Member
Let me be clear so people aren't confused.

This thread is NOT about showing off what the next-gen consoles can do with excellent optimized code with a complete streamlined pipeline built around the consoles.

It's about how powerful the next-gen consoles are at brute-forcing rasterization of a crappy, unoptimized engine. The exact opposite of how people are viewing this thread.

BC games can't even use all the available power. So no.
 

VFXVeteran

Banned
Unreal Engine 5 will be a good benchmark. Right now Sony has a deal with Epic, which will unfortunately skew any objective comparison, but we can at least see what to expect out of PS5/XSX. I'm sure AAA games using UE5 will blow away Crysis visually.
Absolutely. This thread isn't about this crappy iteration of the Crysis remake. It's more that these old-ass games should be eaten up by modern hardware.
 

VFXVeteran

Banned
BC games can't even use all the available power. So no.
The PC is using this same engine (which is console-specific), and if you look at my video, the GPU usage is in the high 90s. That goes against your statement. If the consoles weren't using their available power, we'd see higher FPS and more graphics features, not a struggle to maintain FPS with graphics features dumbed down to roughly PS3 level.
 

Andodalf

Banned
We are seeing some next-gen features (8k textures, RT GI). I'm not marketing this game as the de facto standard for how pretty a next-gen game can look. On the contrary, the code is ridiculously unoptimized and that's the point.
In that way it's a total worst case, sure, but it's also slightly handicapped compared to the newest GPUs, which have better driver support for this title than the last-gen BC codebase does.
 
"The dumb developer doesn't know what he's talking about.."

Carry on.
I didn't call you dumb lmao. I just don't see how drawing conclusions from how a bad port runs through software-enabled backwards-compatibility mode (at least in the case of the Xbox) is indicative of anything. I'm not going to run older programs through Rosetta on a new M1 Mac and then draw conclusions because the non-native software didn't perform as well as it could.
 

RoadHazard

Gold Member
The PC is using this same engine (which is console-specific), and if you look at my video, the GPU usage is in the high 90s. That goes against your statement. If the consoles weren't using their available power, we'd see higher FPS and more graphics features, not a struggle to maintain FPS with graphics features dumbed down to roughly PS3 level.

I'm not saying the game isn't using the power available; I'm saying that not all the power the console has IS available in BC mode. Either you don't understand how this works, or you're deliberately misunderstanding it to make the consoles look bad.

But we all know this thread is just yet another attempt from you to show everyone how inferior the consoles are compared to even the weakest of PCs, as that is your favorite activity on this forum.
 
Last edited:

Shane89

Member
No, it's not.
And for a simple reason: it's the previous-gen version with no caps. It's just a beautiful, impressive, badly programmed game running on powerful consoles.

Oh, and btw, high GPU usage means literally nothing. I can write 10 lines of code that use 99% of your GPU for nothing.
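For example, something like this little fragment shader (just a throwaway illustration of mine, nothing from the game) will pin GPU usage near 100% on a fullscreen quad while producing essentially nothing:

#version 330 core
// Throwaway fragment shader: render a fullscreen quad with this and GPU
// usage pegs at ~99% while the visible output is basically a flat color.
out vec4 fragColor;

void main()
{
    float x = gl_FragCoord.x * 0.001;
    for (int i = 0; i < 100000; ++i)        // pointless per-pixel busy work
        x = sin(x) * cos(x) + 0.5;
    fragColor = vec4(vec3(x * 1e-6), 1.0);  // keep x live so the loop isn't optimized away
}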
 
Last edited:

VFXVeteran

Banned
I didn't call you dumb lmao. I just don't see how drawing conclusions from how a bad port runs through software-enabled backwards-compatibility mode (at least in the case of the Xbox) is indicative of anything. I'm not going to run older programs through Rosetta on a new M1 Mac and then draw conclusions because the non-native software didn't perform as well as it could.
If we all loaded up an early last-gen game like AC: Black Flag, no matter how unoptimized the code is, the new GPU should run it 2-3x faster than the older hardware. That's my point. If it were purely BC mode, then the FPS wouldn't change and neither would the resolution; it would be locked to the original spec.
 
Last edited:

II_JumPeR_I

Member
The remaster is a bad benchmark. It uses a Frankenstein version of CryEngine from the 360/PS3 version of Crysis instead of being based on a current version of CE.
It will be interesting to see how they handle the ports/remasters of Crysis 2 and 3. Hopefully better.
 
If we all loaded up an early last-gen game like AC: Black Flag, no matter how unoptimized the code is, the new GPU should run it 2-3x faster than the older hardware. That's my point. If it were purely BC mode, then the FPS wouldn't change and neither would the resolution; it would be locked to the original spec.
In theory, yes.

The problem is software-enabled backwards compatibility. We don't know how much performance is being left on the table. It's basically emulation, as I understand it, which in and of itself requires cycles to execute the code, versus on PC where the game runs natively and has access to MUCH faster hardware.

Correct me if I'm wrong. I like analysis like yours; I'm just missing the smoking-gun part where there's a correlation. I'm just not seeing it.
 
Last edited:

Mithos

Member
If we all loaded up an early last-gen game like AC: Black Flag, no matter how unoptimized the code is, the new GPU should run it 2-3x faster than the older hardware. That's my point. If it were purely BC mode, then the FPS wouldn't change and neither would the resolution; it would be locked to the original spec.

But it probably will only run the game at PS4 speeds (unless there is a patch to unlock it).
I remember that at release the game ran at only 900p/30 FPS on PS4 and later got a patch to bring it up to 1080p/30 FPS (not even sure if there ever was a PS4 Pro patch).

So if there is no PS5-specific patch for this game, it is probably running in PS4 BC mode (CPU at 1.6 GHz and GPU at 800 MHz).
 
Last edited:

chaseroni

Member
But don't games running in back compat have a slight difference in clock speed? Maybe not drastic, but I thought they clocked down a few TFLOPS. I'm not sure we're seeing the full "brute force power" here.
 

Shmunter

Member
BC modes run in a restricted hardware mode, more so on PS5 than XsX. Hardware is being held back from flexing.
 

SkylineRKR

Member
But don't games running in back compat have a slight difference in clock speed? Maybe not drastic, but I thought they clocked down a few TFLOPS. I'm not sure we're seeing the full "brute force power" here.

That's the thing. When the PS5 brute-forces a PS4 release it downclocks to Pro levels, but previously unlocked framerates usually lock to 60. That's why the quality mode in MonHun is locked to 60 when played on PS5, just like the framerate mode, which renders the latter option moot.

Load times are roughly halved (they are also faster if you put an SSD into a PS4), but it doesn't take full advantage of the SSD, and it doesn't seem to do much more. If BC used all the raw power and brute-forced the entire thing, you would have a much better benchmark. There might be stability issues and some games might not even boot, but for others we would see bigger improvements. Resolutions usually can't be chosen on console, though, so we don't know how far we could push it.

There were a few games, like Star Ocean 4, that have PC-like graphics options. Those would be better benchmarks, but that game is better optimized than Crysis to begin with.
 

ZywyPL

Banned
Isn't the game still mostly utilizing just a single thread (DX11)? That's a terrible solution given how CPU-bound the game is...
 

kaosridder

Neo Member
To my understanding, consoles are not meant to brute-force anything. That's the whole point of having just one hardware iteration that developers can get fully familiar with, which is why games late in a console's cycle are more or less running the hardware at its limits. Brute-forcing is a PC thing, where you can endlessly upgrade, partly because you want to brute-force whatever you put into it.
 

VFXVeteran

Banned
BC modes run in a restricted hardware mode, more so on PS5 than XsX. Hardware is being held back from flexing.
There is no evidence of this. GoT is also running in BC mode and it's rendering at native 4K @ 60 FPS, significantly higher than the PS4 iteration.

Here's another theory: if they have already made 8K texture assets, there is nothing to change in the code to get those textures to load in BC mode. A texture lookup in a shader language is as simple as:

vec4 color = texture(colorMap, vec2(u, v));   // colorMap is whatever texture is currently bound to the sampler

It's very easy to point that sampler at the 8K stash, just like the PC version gets. Yet they aren't using them.

Another example is texture filtering mode:

glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);  // trilinear minification
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);                // magnification has no mip levels

You can easily tell the GPU to use 16x anisotropic filtering; it's a couple of lines of code. Yet the anisotropic filtering is so low it matches the PS4's 4x setting.
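For reference, that change is basically just the following (standard desktop GL shown here, assuming the GL_EXT_texture_filter_anisotropic extension; the actual console APIs will differ):

GLfloat maxAniso = 0.0f;
glGetFloatv(GL_MAX_TEXTURE_MAX_ANISOTROPY_EXT, &maxAniso);            // query the hardware cap
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAX_ANISOTROPY_EXT,
                maxAniso < 16.0f ? maxAniso : 16.0f);                 // request 16x, clamped to the cap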

More evidence that they did something to the codebase is the three rendering modes; they literally SET the rendering modes and the lighting mode (for RT). Also, FPS drops are seen across ALL platforms, not just the consoles. If I took my PC and set the graphics settings to those of the console, it would run at 120 FPS. A clear indication of brute-forcing slow, unoptimized algorithms.
 

VFXVeteran

Banned
It is a BC game lol
Even the settings are tied to last gen consoles.
It's not pure BC. If it were, there would be no graphics modes for the consoles and there wouldn't need to be a specific release for the next-gen consoles. It would just play like the PS4/One X, all the way down to the FPS and resolution, and the performance numbers would match 1:1 with the older hardware.
 

ethomaz

Banned
It's not pure BC. If it were, there would be no graphics modes for the consoles and there wouldn't need to be a specific release for the next-gen consoles. It would just play like the PS4/One X, all the way down to the FPS and resolution, and the performance numbers would match 1:1 with the older hardware.
It is pure BC.
Not just the code; the game runs in BC mode and has all the BC limitations.

At least for PS5 there are 3 BC modes:

PS4 BC mode.
PS4 Pro BC mode.
PS5 patched BC mode.

It is just a patched-BC-mode game... it can only use features that exist on PS4/PS4 Pro.
 
Last edited:

VFXVeteran

Banned
It is pure BC.
Not just the code; the game runs in BC mode and has all the BC limitations.
Dude, do you know what BC means for a console? I would love for you to educate me. Tell me how code would look in BC mode.
 
Last edited:

VFXVeteran

Banned
Yes... on PS5 the APU works as a PS4/PS4 Pro APU.
And what does that mean, bro? Does that mean I can't change the resolution, which is just switching the graphics state to a higher res? Or is that NOT allowed in BC mode? Did you help design the BC mode on the PS5?
 