
Crysis Remastered is the perfect benchmark for the next-gen consoles

Allandor

Member
What an odd thread. Crysis Remastered on consoles is not indicative of anything. It's a poorly optimized game engine from over a decade ago with a hack job of RT extensions bolted onto it. On top of that, the consoles are running in BC mode, so none of the features of RDNA 2 are present, such as the IPC gains and hardware-accelerated RT (as confirmed by MS themselves). To add to that, there is software overhead from the emulation taking place.

If the game had actually been developed on the modern CryEngine, as a native title utilizing the RT acceleration in RDNA 2, then sure, use it as a benchmark if you wish. If that were the case, however, the game would be performing significantly better on both consoles.

Your comment against AMD is also odd; going with Nvidia for the PS4/Xbox One generation would have been a terrible idea, as Kepler was simply worse than GCN, and only stayed ahead of GCN on the PC due to AMD's lackluster drivers and DX11 performance at the time. Low-level APIs such as DX12 and Vulkan with async compute clearly demonstrated that GCN was the superior architecture.

Even if both consoles could have had a single SoC combining an AMD CPU and an Nvidia GPU this generation, that would not have led to better console performance. AMD and Nvidia are very close in normal rasterization, with Nvidia leading the way in RT performance, but Nvidia trails AMD in both performance per watt and die size. Even accounting for the difference between 8nm and 7nm, going Nvidia would have just resulted in a weaker GPU for normal rasterization and slightly better RT performance, as consoles are limited by die size and power consumption.

This thread feels like a thinly disguised attempt to once again discredit consoles (weirdly enough AMD as well).
The game engine is not even GCN-optimized. It is just so old. Without using new features and techniques, it just tried to brute-force its way through with code that was optimized to reach 30fps on 2007 hardware.
 
Last edited:

Shut0wen

Member
Didn't nvidia fuck over both microsoft and sony on their gpus in the past while Nintendo is fine with the toaster leavins that nvidia bequeaths upon them?
Yes, they fucked Microsoft over by making the original Xbox 25% less powerful, and the PS3 is a fucking nightmare to code for. I'm guessing they undercut Nintendo as well, because it's still less powerful than an Nvidia Shield, but then again Nintendo are cheap and want to make money on hardware, while Nvidia gets a big slice of the pie too, unlike the deals they made with Microsoft and Sony.
 
Last edited:
If by perfect you mean a worst-case scenario, sure. I mean, my gosh, the amount of cobbled-together legacy code from two different versions, and only one or two threads out of 16 feeding the GPU...

Like I said before, wait for native software. First up is Ratchet and clank.
 

ss_lemonade

Member
Well, let's go find some Vega 64/1080 benchmarks and we can compare, since that's the level of performance backwards compatibility mode is expected to deliver.
[two screenshots: in-game performance overlays]


Here's a 1024x768 (latest patch) and a 720p (some older patch) shot from my PC. Max settings w/o raytracing. Probably not enough info, but maybe you could come up with some conclusions on why it's demanding? Specs below:

oc'ed 1080 Ti
3700x
32GB 3200
 

Neo_game

Member
SI have done a poor job with Crysis Remastered. It is a demanding and unoptimized game on PC as well. The console versions are basically running the previous-gen version in BC mode with unlocked FPS. Having said that, I think people should keep their expectations in check. RT is obviously going to be limited, and 1440p with DRS seems to be the ideal resolution from a graphics and performance point of view.
 

Elog

Member
I understand what you are trying to achieve, but this application does not do that, and it is a bad proxy for several reasons.

- The game is running in BC mode to some extent (the extent is unknown), which limits the hardware in unknown ways on all next-generation systems (and, in addition, to varying degrees depending on the system).
- A lot of functionality with dedicated transistor real estate has been added to the hardware in recent years - this real estate is not being used properly, which means newer hardware underperforms relative to old hardware (i.e. performance relative to GPU transistor count).
- GPU utilisation % does not capture this, since a unit that is fully scheduled but waiting to receive a piece of information, or waiting to hand off a completed task, counts as fully utilised even though next to no transistors are doing useful work (see the small CPU-side sketch at the end of this post).

In other words, what you end up trying to do is measure the performance of a sword in combat by measuring how well the sword can gut an animal (as a proxy). It is not a meaningless proxy, but it is far from a perfectly scaling one.

In my opinion we really need those next generation engines for this analysis (e.g. Frostbite, UE5).
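To make the utilisation point concrete with a purely hypothetical CPU-side analogy (nothing console-specific, just an illustration): a thread that spin-waits on data reads as "100% busy" to any utilisation counter while delivering no work, and GPU occupancy metrics have the same blind spot.

```cpp
// Hypothetical CPU-side sketch, not console profiling code: the consumer below
// is fully scheduled and reads as ~100% utilised while it spins, yet it does
// no useful work until the data it waits for arrives.
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

int main() {
    std::atomic<bool> data_ready{false};
    unsigned long long wasted_spins = 0;

    std::thread producer([&] {
        std::this_thread::sleep_for(std::chrono::milliseconds(500)); // stand-in for a slow fetch
        data_ready.store(true, std::memory_order_release);
    });

    // Fully scheduled, fully "busy", zero results produced during the wait.
    while (!data_ready.load(std::memory_order_acquire))
        ++wasted_spins;

    producer.join();
    std::printf("iterations burned while 'utilised' but effectively idle: %llu\n", wasted_spins);
    return 0;
}
```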
 
Last edited:

Zathalus

Member
BC mode doesn't mean the console can't reach faster FPS. You should know that.

That is exactly what it means: the hardware is abstracted behind a software layer that mimics the previous-gen GCN GPU, which prevents the software from taking advantage of IPC improvements and newer hardware features such as RT hardware acceleration. You can read more about this here if you were not aware of it. As per the article:

Eurogamer said:
There may be some consternation that Series X back-compat isn't a cure-all to all performance issues on all games, but again, this is the GPU running in compatibility mode, where it emulates the behaviour of the last generation Xbox - you aren't seeing the architectural improvements to performance from RDNA 2, which Microsoft says is 25 per cent to the better, teraflop to teraflop. And obviously, these games are not coded for RDNA 2 or Series X, meaning that access to the actual next-gen features like variable rate shading or mesh shaders simply does not exist.

This thread is not about the ideal condition. You took the phrase of the title and misunderstood it entirely.

With every generation of new graphics GPUs, you can reload an old game and the new GPU's sheer power will push the FPS and resolution higher by default. Without coding a single line. I expected to see this on consoles due to their better hardware. I'm disappointed and frustrated - not trolling the consoles.

It does not have to be the ideal condition; it does, however, need to be indicative of what the consoles can truly achieve. Consoles are not PCs: simply throwing more hardware at a game does not scale performance cleanly unless the game is coded to be aware of the new hardware features of the underlying GPU. The equivalent in the PC space would be having a 3090 run games through an abstraction layer that emulates the behaviour of a Maxwell-era GPU, then taking that as gospel for how the 3090 performs. Or running an RT benchmark that does not touch the RT cores and claiming that is how the 3090 performs in RT.

AMD is simply far behind Nvidia. Did you even watch Jensen's speech at GTC? They are way ahead, and not just in RT. Their AI, drivers, vision, influence and overall product quality are better. No one on here, if given the choice, would choose AMD over Nvidia for the PS6/X2 consoles. Their reputation is that good.

You utterly ignored my point and offered Jensen's speech at GTC as a rebuttal? What kind of elementary-school level of debating is this? You claimed that Nvidia would have been the better choice this generation as well as last generation; I pointed out this was simply not true. GCN was better than Kepler, and Ampere is not better than RDNA2 in rasterization performance while being worse in terms of die area and performance per watt, two areas which are obviously more important for consoles than anything else. Going Nvidia this generation would not have led to a more powerful console, considering the constraints on die cost and power consumption that both the PS5 and XSX face.

As for the next generation, that is years away; we have no idea how the competition between the two GPU manufacturers will play out. AMD is currently behind on RT performance, but the company has way, way more money to pump into GPU R&D now thanks to the stellar success of Epyc and Ryzen. Multi-chip module (MCM) designs are also going to be a game changer over the next few years, and I'd bet on the company that can better leverage that technology. Furthermore, you make it sound as if Nvidia has always been ahead of the competition, when a rudimentary knowledge of the history between the two shows this is simply not true.

Finally, riddle me this: why is AMD the only GPU manufacturer powering the first exascale supercomputers? I thought Nvidia was so far ahead?

It's not a disguise at all. I am as invested in the consoles as anyone else on these boards. I'm just a realist and see things without bias. I do love my PS5, btw.

Oh, come off it. Anyone looking at your post history can clearly see you have a massive bias towards PC; you constantly shit on consoles, and your last ban was even due to that. This thread is a testament to you trying to once again downplay consoles. You could have chosen any number of native XSX/PS5 games, but your choice was a poorly coded, decade-old BC game, held up as some golden standard for how underpowered the consoles are.

This is coming from somebody who mostly games on PC; a console warrior I am not.
 
Last edited:

Dream-Knife

Banned
To most people "next gen" is higher polygon counts.

Other than the upgrade from trash tier textures on PS4 and Xb1, most people wouldn't notice the difference in generations, much less an old game running at a higher frame rate.
 

Azurro

Banned
We are seeing some next-gen features (8k textures, RT GI). I'm not marketing this game as the de facto standard for how pretty a next-gen game can look. On the contrary, the code is ridiculously unoptimized and that's the point.

I don't have all the details, but it seems Crysis remastered is just backwards compatibility, right?

You are comparing a PC game optimized to run on RTX cards to an executable that thinks it's running on a PS4 Pro. Hardly fair, is it?
 

Whitecrow

Banned
GCN TFs != RDNA2 TFs

And as you said, it's poorly optimized.
Also, it's not using mesh shaders or the Geometry Engine. And obviously, the engine doesn't give a crap about having an SSD.

So no, it can't be used as a benchmark. Native PS5 games still have a lot to get from the hardware that brute-forced BC games will never get.
 
I'm arguing that the next-gen consoles can't render a constant 60 FPS even when the resolution has dropped all the way down to 1080p. To me that shows a rasterization limitation. No need to argue with me like I don't know what I'm saying. We are all watching the same video that DF put out. And even they complain about it. Why does it seem like I'm saying something new?
Which console runs best?
 

raul3d

Member
BC mode doesn't mean the console can't reach faster FPS. You should know that.

[..]

With every generation of new graphics GPUs, you can reload an old game and the new GPU's sheer power will push the FPS and resolution higher by default. Without coding a single line. I expected to see this on consoles due to their better hardware. I'm disappointed and frustrated - not trolling the consoles.
Software written for PCs has a lot of abstraction built in since it is required to run on a variety of different hardware configurations. Software for consoles does not need this.

The BC argument is not that the consoles cannot reach higher FPS in BC than the base system (depending on the BC mode, the IPC and frequency gains help automatically), but that all the remaining hardware capabilities that were not present in the PS4/PS4 Pro are not available to the software. It is like testing Crysis Remastered on an RTX card without the game realizing it is running on an RTX card: it would leave large portions of the hardware unused and keep it running legacy shader profiles.

In addition, you have a lot more control over hardware utilization on a console, which could prevent BC software from even using all hardware resources. For example, PS4 software only saw 6 CPU cores, while the PS5 should expose >14 virtual cores (see the sketch below for how that plays out).
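A purely hypothetical, PC-style illustration of that point (console SDKs expose cores differently; the numbers and the hardware_concurrency() probe are assumptions for the sketch): a game that sizes its worker pool from whatever core count the environment reports at launch never creates threads for cores the BC mode hides.

```cpp
// Hypothetical sketch, not console SDK code: the worker pool is sized from the
// core count the platform reports. If a BC environment reports the old
// console's 6 usable cores, the new machine's extra cores stay idle simply
// because the binary never asks for them.
#include <cstdio>
#include <thread>
#include <vector>

int main() {
    // e.g. 6 under a BC profile vs 14+ natively (assumed figures)
    unsigned reported = std::thread::hardware_concurrency();
    std::printf("spawning %u workers\n", reported);

    std::vector<std::thread> workers;
    for (unsigned i = 0; i < reported; ++i)
        workers.emplace_back([i] { std::printf("worker %u up\n", i); /* game jobs would run here */ });
    for (auto& w : workers)
        w.join();
    return 0;
}
```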
 
Last edited:
I am a bit disappointed by the consoles tbh, as I expected them to blow my PC away with all that special console magic, and the only game I have that arguably runs better on console compared to my PC is Valhalla.

I know it is early days and that you have to be realistic about things. The new consoles are admittedly absurd value for money, especially with GPUs costing the earth right now.

I would have paid £1,000 for a better PS or Xbox but I'm clearly in a huge minority there.
 

Elog

Member
I'm arguing that the next-gen consoles can't render a constant 60 FPS even when the resolution has dropped all the way down to 1080p. To me that shows a rasterization limitation. No need to argue with me like I don't know what I'm saying. We are all watching the same video that DF put out. And even they complain about it. Why does it seem like I'm saying something new?
The problem is that we do not know what bottlenecks and constraints are introduced by all that partial BC code. Your statement assumes the cost is evenly distributed across the rendering pipeline (i.e. that it scales, in other words that there is linearity); we do not know that. Looking at the performance, I would actually argue the opposite: it looks as if the partial BC code introduces serious bottlenecks in some areas but not in others, since the limitations that pop up do not align with what we observe in true native titles.
 

SkylineRKR

Member
I am a bit disappointed by the consoles tbh, as I expected them to blow my PC away with all that special console magic, and the only game I have that arguably runs better on console compared to my PC is Valhalla.

I know it is early days and that you have to be realistic about things. The new consoles are admittedly absurd value for money, especially with GPUs costing the earth right now.

I would have paid £1,000 for a better PS or Xbox but I'm clearly in a huge minority there.

I wouldn't mind it either if it means it's going to suffice for 5+ years, unlike the PS4 and X1 (and consoles before them), which lagged behind after two years or so, forcing me to upgrade to a Pro anyway, which was another 400 bucks and still kinda underpowered. Basically, if they released a normal SKU at a competitive price and an ultra-enthusiast console for 1000 bucks, I'd buy the latter.
 

Allandor

Member
[two screenshots: in-game performance overlays]


Here's a 1024x768 (latest patch) and a 720p (some older patch) shot from my PC. Max settings w/o raytracing. Probably not enough info, but maybe you could come up with some conclusions on why it's demanding? Specs below:

oc'ed 1080 Ti
3700x
32GB 3200
This shows really well how demanding and unoptimized the game is.
CPU core 16 is completely the limiter here. The GPU isn't even being put under pressure at all. I'm even surprised that the GPU in your screenshots runs at such high frequencies ^^

It must have something to do with the physics engine in that game. It just doesn't scale well with framerate, view distance and so on; more or less everything on screen is "physically" active, which means there are many, many different states that need to be calculated (and in the correct order).
Normally you wouldn't design a physics engine that way, but it was OK at the time, with the view distances the game had when it was released. Adding more trees, grass and particles (everything the wind, the player and so on can interact with) does not make it any better.
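Roughly what that pattern looks like in code, as a deliberately simplified and hypothetical sketch (none of this is actual CryEngine source, and step_physics/submit_draw_calls are made-up names): when every active object is stepped in order on one thread, the serial section grows with scene density, one core pegs at 100%, and draw submission, and the GPU behind it, simply waits.

```cpp
// Simplified, hypothetical sketch of a serial physics step; not CryEngine code.
// More vegetation/particles => a longer serial loop => one core becomes the
// frame-time limiter while the GPU waits for work.
#include <cmath>
#include <vector>

struct PhysObject { float pos = 0.f, vel = 0.f; };

void step_physics(std::vector<PhysObject>& objs, float dt) {
    // Order-dependent updates: each object's new state can feed later ones,
    // which is what keeps this on a single thread.
    for (auto& o : objs) {
        o.vel += std::sin(o.pos) * dt;  // stand-in for wind/collision response
        o.pos += o.vel * dt;
    }
}

int main() {
    std::vector<PhysObject> scene(2'000'000);  // denser scene = longer serial step
    for (int frame = 0; frame < 60; ++frame) {
        step_physics(scene, 1.0f / 60.0f);
        // submit_draw_calls(scene);  // (hypothetical) GPU work only starts after this returns
    }
    return 0;
}
```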
 

Razvedka

Banned
"The dumb developer doesn't know what he's talking about.."

Carry on.
He's right though.


Edit: not even going to get into this, others have already covered what I was going to say a million times better. As an aside, you have some very interesting fixations. Between this and your poor OpSec, it's sometimes odd knowing how to process your threads and the things you say or do, no offense. You definitely aren't lacking in confidence and unique takes though, always a fun read.

Edit edit: I'm not saying you're dumb, I'm just agreeing with his argument.
 
Last edited:

Spukc

always chasing the next thrill
game looks and runs like poop
was a cool gfx showcase when it originally came out,
now it's being dunked on by the likes of TLOU2 or other Playstation™ exclusives
and nobody should honestly give a fuck about this game or its devs anymore.
 
Last edited:
It's basically emulation as I understand it
Then you don't understand it. The code runs natively on the CPU (x86 is x86, it's all the same and explicitly designed to be compatible), and runs a hell of a lot faster than it does on last-gen; it just can't make use of any new instruction sets added specifically to the newer consoles. GPU code equally goes through hardware abstraction (API and driver), just like on PC, so that they can replace the GPU at any time with an equivalent or more powerful one and won't have to recompile the game to add support for the new hardware revision.
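To make the CPU half of that concrete (a hedged sketch; the AVX2 probe and the GCC/Clang builtins are just an illustrative example, not anything a console title actually does): a binary only exercises a newer instruction set if it was compiled for it or explicitly checks for it at runtime, and a last-gen executable does neither for features that didn't exist when it shipped.

```cpp
// Hedged illustration: runtime CPU feature probing on x86 with GCC/Clang
// builtins. Old binaries contain no such probes for instructions that
// post-date them, so they run natively but never touch the new silicon.
#include <cstdio>

int main() {
#if (defined(__GNUC__) || defined(__clang__)) && (defined(__x86_64__) || defined(__i386__))
    __builtin_cpu_init();
    bool has_avx2 = __builtin_cpu_supports("avx2");
    std::printf("AVX2 path available: %s\n", has_avx2 ? "yes" : "no");
    // A game would branch into an AVX2-optimised routine only when this is true.
#else
    std::printf("feature probing not shown for this compiler/architecture\n");
#endif
    return 0;
}
```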
 

ethomaz

Banned
Then you don't understand it. The code runs natively on the CPU (x86 is x86, it's all the same and explicitly designed to be compatible), and runs a hell of a lot faster than it does on last-gen; it just can't make use of any new instruction sets added specifically to the newer consoles. GPU code equally goes through hardware abstraction (API and driver), just like on PC, so that they can replace the GPU at any time with an equivalent or more powerful one and won't have to recompile the game to add support for the new hardware revision.
BC on both consoles is emulation.

On Xbox it is software emulation.
On PS5 it is hardware emulation.

There are advantages and disadvantages to both, but they are emulation.
 
Last edited:

SkylineRKR

Member
game looks and runs like poop
was a cool gfx showcase when it originally came out,
now it's being dunked on by the likes of TLOU2 or other Playstation™ exclusives
and nobody should honestly give a fuck about this game or its devs anymore.

Game also plays like poop, 60fps or not. Try to ADS an enemy that's moving. Or those fucking aliens dashing away every few hits. Gameplay is horrible. AI is pretty dumb (though not the dumbest AI around). I had almost no fun playing it, save for some chapters during the first half. But the entire last third is absolutely awful. I wanted to beat it at that point, so I rushed through. You have this kick-ass suit, but you're so weak. All you can do is a lousy melee or a dumb grab. A higher jump, super armor that makes you slightly less weak, and a super-fast run that lasts for a whopping second. Then you're standing still or moving at a slow pace, waiting for the slow recharge to fill again. And during the last third? You don't need any suit powers; there is no use for stealth anymore. The gameplay loop is shit.

On PS5 it's in a playable state, which is much better than it was on PS4. But I still regret spending money on this trash. I liked Crysis once. Dunno why, actually; it certainly didn't age well.
 
Last edited:
This benchmark shows me that the consoles are grossly underpowered. With reviews of new games coming down the pike more rapidly, we are looking at last gen all over again. I'm very disappointed by this, as I feel MS/Sony need to shoot for a higher bar, suck up the costs and price the consoles accordingly (~$1k or more). It is quite clear that Nvidia should be the chosen platform for the GPU, and I'm dumbfounded that they both continue to rely on AMD for their heart transplant each generation.


So you want them to use 3080-level hardware in there and price them at $1000+?

Do you know the mass-market prices of gaming hardware? How few people, even hardcore gamers, choose cards costing that much? Check the Steam charts. It's minuscule.

Nvidia doesn't do APUs. What exactly would they get by going Nvidia? Don't say DLSS, because it is hardware-based, which consoles will never have. Software-based upscalers are the best compromise for consoles.
 

Spukc

always chasing the next thrill
So you want them to use 3080-level hardware in there and price them at $1000+?

Do you know the mass-market prices of gaming hardware? How few people, even hardcore gamers, choose cards costing that much? Check the Steam charts. It's minuscule.

Nvidia doesn't do APUs. What exactly would they get by going Nvidia? Don't say DLSS, because it is hardware-based, which consoles will never have. Software-based upscalers are the best compromise for consoles.
lol at even saying the 3080 is around 1k, it's a 3k card
 
BC on both consoles is emulation.

On Xbox it is software emulation.
On PS5 it is hardware emulation.

There are advantages and disadvantages to both, but they are emulation.
No. They aren't. The PS5 / XSX aren't even remotely powerful enough to emulate the previous gen consoles. They no more emulate PS4 / Xbone than the Wii emulated GC games.
 

ss_lemonade

Member
This shows really well how demanding and unoptimized the game is.
CPU core 16 is completely the limiter here. The GPU isn't even being put under pressure at all. I'm even surprised that the GPU in your screenshots runs at such high frequencies ^^

It must have something to do with the physics engine in that game. It just doesn't scale well with framerate, view distance and so on; more or less everything on screen is "physically" active, which means there are many, many different states that need to be calculated (and in the correct order).
Normally you wouldn't design a physics engine that way, but it was OK at the time, with the view distances the game had when it was released. Adding more trees, grass and particles (everything the wind, the player and so on can interact with) does not make it any better.
I don't know if that's it, because the environmental physics (to my eyes) is a bit of a downgrade from the original. Some examples:





The same scenes and resolutions as my screenshots would give me about 10-20 fps more in the original, with even less GPU and CPU utilization. The remaster, though, is clearly displaying more geometry and a significantly better LOD setup, which is hard to match even with a custom config in the original, so it's not all negatives.
 
It is emulation.

The hypervisor emulates the old hardware on Xbox.
The APU emulates the old APU on PS5.
Clearly you have no idea what emulation is or how it works. You can't emulate hardware in the hypervisor. And unless you're going to tell me that the APU in the PS5 is an FPGA, that isn't emulation either.
Lying to the game and denying access to additional cores and memory resources is not emulation. Nothing about that is emulation.
 

ethomaz

Banned
Clearly you have no idea what emulation is or how it works. You can't emulate hardware in the hypervisor. And unless you're going to tell me that the APU in the PS5 is an FPGA, that isn't emulation either.
Lying to the game and denying access to additional cores and memory resources is not emulation. Nothing about that is emulation.
Virtual Machines or Hypervisors are emulators lol

"A hypervisor (or virtual machine monitor, VMM, virtualizer) is a kind of emulator; it is computer software, firmware or hardware that creates and runs virtual machines."

I mean you have no idea what emulation is.

BTW, AMD made the PS5 APU emulate the PS4 and PS4 Pro... it is called hardware emulation.
When running in these emulated modes, the game only sees what they want it to see... in this case a PS4 or PS4 Pro APU... there is a third emulated mode, the PS4/PS4 Pro APU overclocked (it uses the PS5 clocks, yet it is still emulating the old hardware).
 
Last edited:

FireFly

Member
You won't be able to figure out the rendering settings from the consoles right now unless you ask one of the developers. Maybe we should do that instead?
These benchmarks show that on high settings at least, the Vega 64 is averaging 60 FPS, but with frequent drops into the 50s and below. (It's running 2048x1152, which is roughly 14% more pixels than 1080p, but it is an OC version as well.)

Here's a 1024x768 (latest patch) and a 720p (some older patch) shot from my PC. Max settings w/o raytracing. Probably not enough info, but maybe you could come up with some conclusions on why it's demanding? Specs below:

oc'ed 1080 Ti
3700x
32GB 3200
Well, at max settings the game becomes very CPU-limited. It would be interesting to test at, say, 1080p medium, to see if it ever dips below 60.
 

Zuzu

Member
A $1000 console wouldn’t appeal to the mass market. Without the mass market there probably wouldn’t be the investment in gaming that there is now, so we wouldn’t have the number or diversity of games that we have now, or games with the same production values and quality. Because of this, mid-range consoles and graphics cards are a necessary requirement for the health of the gaming industry, and we’ll always have them for as long as we have gaming on local hardware.
 
Last edited:
Virtual Machines or Hypervisors are emulators lol

"A hypervisor (or virtual machine monitor, VMM, virtualizer) is a kind of emulator; it is computer software, firmware or hardware that creates and runs virtual machines."

I mean you have no idea what emulation is.

BTW, AMD made the PS5 APU emulate the PS4 and PS4 Pro... it is called hardware emulation.
When running in these emulated modes, the game only sees what they want it to see... in this case a PS4 or PS4 Pro APU... there is a third emulated mode, the PS4/PS4 Pro APU overclocked (it uses the PS5 clocks, yet it is still emulating the old hardware).
Virtual machines are not emulation...what the actual fuck are you smoking?
 

AgentP

Thinks mods influence posters politics. Promoted to QAnon Editor.
Virtual machines are not emulation...what the actual fuck are you smoking?
They can be. If I have a program that only runs on Win XP, MS offers an XP VM for Win 10. This is how MS does emulation on Xbox: the game thinks it is running on native HW & SW, which is all done via the hypervisor.
 

ethomaz

Banned
Virtual machines are not emulation...what the actual fuck are you smoking?
You really have no clue what you are talking about.

The hypervisor in the Series X is set up with an emulated CPU/GPU of the old consoles to run games in BC modes.

Like any software emulation, things can be changed, so BC on Xbox can be tweaked very easily, while the hardware emulation in the PS5 can't be changed... you have 3 modes, and all of them have the same limitations as the PS4/PS4 Pro APUs.

MS's option is more flexible, while the PS5 can't change the modes already hardware-emulated inside the silicon.
 
Last edited:

VFXVeteran

Banned
I wouldn't mind it either if it means it's going to suffice for 5+ years, unlike the PS4 and X1 (and consoles before them), which lagged behind after two years or so, forcing me to upgrade to a Pro anyway, which was another 400 bucks and still kinda underpowered. Basically, if they released a normal SKU at a competitive price and an ultra-enthusiast console for 1000 bucks, I'd buy the latter.
That's exactly my point! Put out much more powerful hardware and don't worry about a mid-gen refresh trying to catch up because the early gen hardware is subpar.
 