
God of War on PC - Digital Foundry Tech Review

rodrigolfp

Member
Ok, but there's still a limit to what you can do in a game's settings. If someone buys it 5 years later, it will still have the same assets. To be fair, even BC on Xbox doesn't work the same way it does on PC. You can't buy an X1 title and just choose higher settings on the XSX unless the devs make it possible. The same goes for other consoles with BC.
Not sure what your point is, but yes, there are always limits at some point, even using tweaks and mods. Xbox BC also sucks compared to PC BC.
 

MasterCornholio

Gold Member
Not sure what your point is, but yes, there are always limits at some point, even using tweaks and mods. Xbox BC also sucks compared to PC BC.

Basically, that BC works the way it does on consoles because developers only focus on a platform with fixed hardware. In order to obtain higher settings, they have to modify the game so you can access them on a newer platform. It's why the PS5 version of FF7R is much better than the PS4 version. With PC, since developers have a wide variety of hardware to develop for, it's up to the user to choose the settings they want to get the game running how they want on their hardware. Unfortunately, consoles are all about simplicity, so those options are not given to the end user.
 
It'll always be dumb that the obvious has to be stated, but it never fails: people come in here with drive-by comments that make no sense.

People saying "the differences aren't that noticeable lulz" pretend as if people are running out to build a PC exclusively for God of War. How about it's just another way to play an amazing game, one that just so happens to now be the best way to play it, no matter how big or small the differences may be?

Also, assuming such an argument is as dumb as assuming PC players are gonna rush out to buy a PS5 to play Ragnarok; that also isn't happening. Cause again, PC players care about improvements even when, to you, they may be insignificant.

It will be funny, though, when Ragnarok launches on PS5, cause I can guarantee you any graphical improvement will be gushed over as the greatest achievement in videogame entertainment history.

Until the port comes to PC, that is...
 

Alexios

Cores, shaders and BIOS oh my!
AMD's super resolution seems ace @ Ultra @ 1440p. I can't tell the difference from native, and it edges the game to 60fps on my aging rig with Ultra textures and High everything else (though I think I'll set everything to Ultra and lock to 30fps instead, but still use this, as it really looks the same). Maybe it's a tiny bit softer in the distance, but without really losing detail; if anything, it maybe gives a slightly less aliased look (probably even more unnoticeable if you use film grain and such, which I don't), but maybe not even that. I took a bunch of screens to compare and couldn't tell them apart, so I need to redo them 🤦‍♂️

Edit: I skimmed the video, and going by the timestamps they don't even test AMD's thingie, only DLSS? Come on guys, not everyone jumped to the RTX series. I have a 1080, and Nvidia locks me out of that stuff while AMD doesn't, and, again, it really does look the same as native @ Ultra. I guess it's possible there's some artifacting in motion and/or if effects like heat distortion and motion blur are applied on top of the image, but again, are you really going to notice that while moving and playing, rather than taking a snapshot of the action and checking it out later?

Edit: got the screens proper this time. Can anyone really tell which is which in these stills, never mind while simply playing the game? Note I have film grain off, but motion blur is at the default 10, and the camera isn't 100% immobile even when you're standing still.

1440p Native with Ultra textures and High everything else:

1440p Ultra AMD SR with Ultra textures and High everything else:

1440p Native with Ultra (and +) settings:

1440p Ultra AMD SR with Ultra (and +) settings:

Bonus shots showing some bad LOD regardless of settings: the lighting on that door goes flat just a few feet away, then pops back in closer up (or maybe it's the polygonal/normal-mapped/tessellated relief design that goes flat, so the lighting naturally follows; I dunno).

 
Last edited:

rodrigolfp

Member
Basically, that BC works the way it does on consoles because developers only focus on a platform with fixed hardware. In order to obtain higher settings, they have to modify the game so you can access them on a newer platform. It's why the PS5 version of FF7R is much better than the PS4 version. With PC, since developers have a wide variety of hardware to develop for, it's up to the user to choose the settings they want to get the game running how they want on their hardware. Unfortunately, consoles are all about simplicity, so those options are not given to the end user.
What you call simplicity, I call lack of freedom and options; too many limitations.
 
Last edited:

Buggy Loop

Member
AMD's super resolution seems ace @ Ultra @ 1440p. I can't tell the difference from native, and it edges the game to 60fps on my aging rig with Ultra textures and High everything else (though I think I'll set everything to Ultra and lock to 30fps instead, but still use this, as it really looks the same). Maybe it's a tiny bit softer in the distance, but without really losing detail; if anything, it maybe gives a slightly less aliased look (probably even more unnoticeable if you use film grain and such, which I don't), but maybe not even that. I took a bunch of screens to compare and couldn't tell them apart, so I need to redo them 🤦‍♂️

Edit: I skimmed the video, and going by the timestamps they don't even test AMD's thingie, only DLSS? Come on guys, not everyone jumped to the RTX series. I have a 1080, and Nvidia locks me out of that stuff while AMD doesn't, and, again, it really does look the same as native @ Ultra. I guess it's possible there's some artifacting in motion and/or if effects like heat distortion and motion blur are applied on top of the image, but again, are you really going to notice that while moving and playing, rather than taking a snapshot of the action and checking it out later?

Because it's useless and of no interest to DF; I'm pretty sure they've stated this on the forums. Just like including an mCable in the discussion would be meaningless.

You have a 1080? You have Nvidia's Image Scaling, which does the equivalent of AMD's FSR at the driver level rather than on a game-by-game support basis, and it works on Pascal cards. They can't give you AI reconstruction without tensor cores.
 
Last edited:

Ev1L AuRoN

Member
I am surprised that a 6 TFLOPS 580 and its Nvidia equivalent are struggling to hit 60 fps at 1080p using PS4 settings. They are over 3x more powerful than the 1.8 TFLOPS PS4, without even taking into consideration the IPC gains; they have access to more RAM and a way better CPU than the shitty 1.6 GHz and 2.1 GHz Jaguar CPUs in the PS4.

To me, this is just a bad port, just like Horizon. I tried running Horizon at 1080p using original settings last night, and it topped out at like 85 FPS on an RTX 2080. That's an 11.5 TFLOPS card running the game with an 8-core, 16-thread CPU that can hit 5.1 GHz. And all I get is less than 3x more performance? Absolutely ridiculous.

It's clear they did the bare minimum porting this. It's a straight port. Back in the day, a 3.5 TFLOPS GTX 970 would run every PS4 game at 1080p 60 fps despite its controversial 3.5 GB of split VRAM. Today I can run it at native 4K 90 fps (the capped max) at ultra settings and have like 25-30% of the GPU still left to be utilized. And that port was trashed by everyone as one of the worst PC ports ever.

Unlike the PS3 and PS5, there is no secret sauce in the PS4. The only thing I can think of is the extra ACEs Cerny added that helped with asynchronous compute, but every Polaris card, including the 580, should have the same number of ACEs.
To be fair, the game runs at 30fps with drops on PS4, and that is a huge distinction in the CPU department. As for your teraflops argument, things don't scale that linearly on computers: the RX 580 may have 3x the compute power, but it doesn't have 3x the memory bandwidth, and resolution and frame rate don't scale linearly in requirements either.

AMD is notoriously famous for low performance in DX11 titles; if someone is to blame here, it is AMD. I, for one, love DX11 games: no shader cache compilation, no spikes in frame time, and, having an Nvidia GPU, great performance all around.
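The bandwidth point above can be sketched with a toy roofline model: attainable throughput is the lesser of peak compute and memory bandwidth times arithmetic intensity, so a bandwidth-bound frame sees far less than the raw TFLOPS ratio. This is a minimal sketch, not a real GPU simulation; the spec numbers are the public figures quoted in the thread, and the intensity values are illustrative assumptions.

```python
def attainable_gflops(peak_gflops, bandwidth_gbs, intensity_flop_per_byte):
    """Roofline model: performance is capped by whichever is lower,
    raw compute or memory bandwidth * arithmetic intensity."""
    return min(peak_gflops, bandwidth_gbs * intensity_flop_per_byte)

ps4 = {"peak": 1840, "bw": 176}    # ~1.84 TFLOPS, 176 GB/s GDDR5
rx580 = {"peak": 6170, "bw": 256}  # ~6.17 TFLOPS, 256 GB/s GDDR5

# High arithmetic intensity (compute-bound): speedup tracks TFLOPS.
# Low arithmetic intensity (bandwidth-bound): speedup tracks GB/s.
for label, intensity in [("compute-bound", 100.0), ("bandwidth-bound", 4.0)]:
    a = attainable_gflops(ps4["peak"], ps4["bw"], intensity)
    b = attainable_gflops(rx580["peak"], rx580["bw"], intensity)
    print(f"{label}: RX 580 / PS4 speedup = {b / a:.2f}x")
# compute-bound speedup ~3.35x, bandwidth-bound only ~1.45x
```

Under these assumptions, a bandwidth-bound pass gains only about 1.45x (256/176), which is one plausible reason a "3x more powerful" card doesn't deliver 3x the frame rate.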
 

Ev1L AuRoN

Member
AMD's super resolution seems ace @ Ultra @ 1440p. I can't tell the difference from native, and it edges the game to 60fps on my aging rig with Ultra textures and High everything else (though I think I'll set everything to Ultra and lock to 30fps instead, but still use this, as it really looks the same). Maybe it's a tiny bit softer in the distance, but without really losing detail; if anything, it maybe gives a slightly less aliased look (probably even more unnoticeable if you use film grain and such, which I don't), but maybe not even that. I took a bunch of screens to compare and couldn't tell them apart, so I need to redo them 🤦‍♂️

Edit: I skimmed the video, and going by the timestamps they don't even test AMD's thingie, only DLSS? Come on guys, not everyone jumped to the RTX series. I have a 1080, and Nvidia locks me out of that stuff while AMD doesn't, and, again, it really does look the same as native @ Ultra. I guess it's possible there's some artifacting in motion and/or if effects like heat distortion and motion blur are applied on top of the image, but again, are you really going to notice that while moving and playing, rather than taking a snapshot of the action and checking it out later?

Edit: got the screens proper this time. Can anyone really tell which is which in these stills, never mind while simply playing the game? Note I have film grain off, but motion blur is at the default 10, and the camera isn't 100% immobile even when you're standing still.

1440p Native with Ultra textures and High everything else:

1440p Ultra AMD SR with Ultra textures and High everything else:

1440p Native with Ultra (and +) settings:

1440p Ultra AMD SR with Ultra (and +) settings:

Bonus shots showing some bad LOD regardless of settings: the lighting on that door goes flat just a few feet away, then pops back in closer up (or maybe it's the polygonal/normal-mapped/tessellated relief design that goes flat, so the lighting naturally follows; I dunno).

I think the game's native scaling probably does a better job than FSR. I mean, why waste time on a feature that doesn't do anything new or better than what we already have?
 

Kenpachii

Member
AMD's super resolution seems ace @ Ultra @ 1440p. I can't tell the difference from native, and it edges the game to 60fps on my aging rig with Ultra textures and High everything else (though I think I'll set everything to Ultra and lock to 30fps instead, but still use this, as it really looks the same). Maybe it's a tiny bit softer in the distance, but without really losing detail; if anything, it maybe gives a slightly less aliased look (probably even more unnoticeable if you use film grain and such, which I don't), but maybe not even that. I took a bunch of screens to compare and couldn't tell them apart, so I need to redo them 🤦‍♂️

Edit: I skimmed the video, and going by the timestamps they don't even test AMD's thingie, only DLSS? Come on guys, not everyone jumped to the RTX series. I have a 1080, and Nvidia locks me out of that stuff while AMD doesn't, and, again, it really does look the same as native @ Ultra. I guess it's possible there's some artifacting in motion and/or if effects like heat distortion and motion blur are applied on top of the image, but again, are you really going to notice that while moving and playing, rather than taking a snapshot of the action and checking it out later?

Edit: got the screens proper this time. Can anyone really tell which is which in these stills, never mind while simply playing the game? Note I have film grain off, but motion blur is at the default 10, and the camera isn't 100% immobile even when you're standing still.

1440p Native with Ultra textures and High everything else:

1440p Ultra AMD SR with Ultra textures and High everything else:

1440p Native with Ultra (and +) settings:

1440p Ultra AMD SR with Ultra (and +) settings:

Bonus shots showing some bad LOD regardless of settings: the lighting on that door goes flat just a few feet away, then pops back in closer up (or maybe it's the polygonal/normal-mapped/tessellated relief design that goes flat, so the lighting naturally follows; I dunno).


More data is always nice. However, FSR in general goes from worse than native to far worse than native across titles. DLSS is interesting to see how it weighs up against native and checkerboarding; I would check which DLSS preset mirrors CB to see what Nvidia GPU you need to compete with the PS5 version. What FSR does isn't of much interest after that.
 

amscanner

Neo Member
Anyone else finding too much sharpening with DLSS? I'm playing at 4K DLSS Quality, and a clear image is a good thing, but it seems to kill the atmosphere in the game a little bit.
 

rofif

Gold Member
It'll always be dumb that the obvious has to be stated, but it never fails: people come in here with drive-by comments that make no sense.

People saying "the differences aren't that noticeable lulz" pretend as if people are running out to build a PC exclusively for God of War. How about it's just another way to play an amazing game, one that just so happens to now be the best way to play it, no matter how big or small the differences may be?

Also, assuming such an argument is as dumb as assuming PC players are gonna rush out to buy a PS5 to play Ragnarok; that also isn't happening. Cause again, PC players care about improvements even when, to you, they may be insignificant.

It will be funny, though, when Ragnarok launches on PS5, cause I can guarantee you any graphical improvement will be gushed over as the greatest achievement in videogame entertainment history.

Until the port comes to PC, that is...
The second argument is not invalid. People do rush out to buy consoles for exclusives.
I got a 360 for Gears of War and GTA4 initially.
I got a PS4 for Bloodborne and The Last Guardian. Those decisions were made only after learning those games were not coming to PC.
 

Brofist

Member
It'll always be dumb that the obvious has to be stated, but it never fails: people come in here with drive-by comments that make no sense.

People saying "the differences aren't that noticeable lulz" pretend as if people are running out to build a PC exclusively for God of War. How about it's just another way to play an amazing game, one that just so happens to now be the best way to play it, no matter how big or small the differences may be?

Also, assuming such an argument is as dumb as assuming PC players are gonna rush out to buy a PS5 to play Ragnarok; that also isn't happening. Cause again, PC players care about improvements even when, to you, they may be insignificant.

It will be funny, though, when Ragnarok launches on PS5, cause I can guarantee you any graphical improvement will be gushed over as the greatest achievement in videogame entertainment history.

Until the port comes to PC, that is...
It's almost like the exclusivity is what they are fans of, not even the game itself. Notice the crickets when GoW for PC was said to have full DualSense support.
 
Last edited:

ZywyPL

Gold Member
No FOV setting is kind of a joke; Horizon had it.

They didn't spend so much time figuring out the perspective and camera distance just to put in a FOV slider later. There are GDC presentations from SSM engineers about all the road it took to get this aspect just right, which was driven by getting the game's combat right. FOV in GoW is basically a gameplay design choice, so I'm personally not surprised it's not there in the PC version.
 

MonarchJT

Banned
The game isn’t running through BC on the PC though.
My gosh, why don't you give it up, master? There is not always an excuse for everything. It is also okay to accept things as they are and see the reality of the facts. We have a version of the game that has a number of improvements and simply runs better, no matter which "port mode" it uses, while the same game, no matter what and how, runs worse and without improvements on the PS5. The fault is certainly not the players'; it lies with the software or the console design, if it cannot exploit the hardware's potential or if, unlike other hardware, it needs different code to run. Things are as they are and will continue to be unless Cerny or Sony change the way the PS5 performs BC.
 
Last edited:

GymWolf

Gold Member
They didn't spend so much time figuring out the perspective and camera distance just to put in a FOV slider later. There are GDC presentations from SSM engineers about all the road it took to get this aspect just right, which was driven by getting the game's combat right. FOV in GoW is basically a gameplay design choice, so I'm personally not surprised it's not there in the PC version.
In the meantime, 99.99% of PC games have a FOV slider that usually improves the gameplay, because you can see more shit on screen, and many people always hated how close the camera is to Kratos (or Aloy, but you can fix that on PC).

No offense, but on console the close-up camera is mostly there to save performance; maybe for GoW it is also an artistic/gameplay choice because of the one-shot camera thing, but on PC you have to give the player a choice. That's what PC is for. If people then discover that playing with a wider FOV is worse, they can still go back to the close-up camera.

When mods come out to adjust the FOV and the game plays even better, this whole discussion is gonna look pretty darn funny.
 
Last edited:

ZywyPL

Gold Member
because you can see more shit on screen and many people always hated how close the camera is to Kratos

Timestamped:



This whole video sadly summarizes everything I hate in the new GoW. Dunno why SSM decided to change the gameplay so radically; "if it ain't broke, don't fix it," as they say. Other than the title, there's nothing left from the good old series.
 

rofif

Gold Member
Timestamped:



This whole video sadly summarizes everything I hate in the new GoW. Dunno why SSM decided to change the gameplay so radically; "if it ain't broke, don't fix it," as they say. Other than the title, there's nothing left from the good old series.
Trying new stuff is great.
We already have the ideal old God of War format in GOW3; it would be hard to beat anyway.
So why not try something new?

Btw, the GOW3 remaster desperately needs a 4K patch...
 
Last edited:

GymWolf

Gold Member
Timestamped:



This whole video sadly summarizes everything I hate in the new GoW. Dunno why SSM decided to change the gameplay so radically; "if it ain't broke, don't fix it," as they say. Other than the title, there's nothing left from the good old series.
Yeah, the opposite for me; I was never a super fan of GoW before the 2018 iteration.

I don't even want a very far away camera, just one a bit zoomed out compared to the vanilla setting. I kinda like the close camera even if the gameplay gets worse.
 

ZywyPL

Gold Member
Trying new stuff is great.

Of course, but if you change a game that radically, basically making a completely new game, why not change the title and protagonist as well? That's all that's missing here: ditching the old name and Kratos in favor of a completely new series, and no one would bat an eye.
 
DLSS is oversharpened. Not a single mention of it from the Nvidia-sponsored cuck. FSR not even tested. And people ask why Digital Foundry sucks?
 
Last edited:

rofif

Gold Member
Of course, but if you change a game that radically, basically making a completely new game, why not change the title and protagonist as well? That's all that's missing here: ditching the old name and Kratos in favor of a completely new series, and no one would bat an eye.
Sure, I think that's fair enough.
They could just make it a new IP, and I wonder how much it would lose.
 
Anyone else finding too much sharpening with DLSS? I'm playing at 4K DLSS Quality, and a clear image is a good thing, but it seems to kill the atmosphere in the game a little bit.

Yep, it's my main issue with DLSS. There's a dev version of DLSS you can use to turn off the sharpening, but it has a watermark that you can't remove. The game looks way better with sharpening off.
 
WTF, dude. You can't understand the difference between concepts?
In-game native support for DualSense is a different concept from the connection interface.
And an in-game option to change FOV is a different concept from a monitor's aspect ratio.

For me, native support would mean the controller has 100% of its functionality (the triggers, vibration, etc.) wirelessly. That's not the case: you can only get it with a wire, in the same way that Metro and Valhalla have been doing for a while. So much so that you need to disable the PlayStation controller profile in the Steam controller panel. I wouldn't call that native support.


And yes, I understand that changing FOV is different from changing aspect ratio. However, I found, playing and doing tests, that if you change the aspect ratio to 21:9, even if your TV or monitor is 16:9, you get a gain in FOV at the cost of black bars at the top and bottom of the image. Do the test.
Here, I believe I simply needed to elaborate better on what I wanted to say the first time.
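The 21:9 trick above has a simple geometric reading: if the engine holds the vertical FOV fixed and letterboxes the wider frame (Hor+ scaling), the implied horizontal FOV grows with the aspect ratio. A minimal sketch of that conversion, assuming Hor+ behavior; the 40-degree vertical FOV is an illustrative value, not God of War's actual setting.

```python
import math

def horizontal_fov(vertical_fov_deg, aspect):
    """Horizontal FOV implied by a fixed vertical FOV at a given
    render aspect ratio (standard Hor+ conversion)."""
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect))

vfov = 40.0  # illustrative vertical FOV
h_16_9 = horizontal_fov(vfov, 16 / 9)
h_21_9 = horizontal_fov(vfov, 21 / 9)
print(f"16:9 horizontal FOV: {h_16_9:.1f} deg")  # ~65.8 deg
print(f"21:9 horizontal FOV: {h_21_9:.1f} deg")  # ~80.7 deg
```

So under these assumptions, forcing 21:9 on a 16:9 display trades vertical screen area (the black bars) for roughly 15 extra degrees of horizontal view, which matches the "gain in FOV" described above.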
 

winjer

Member
For me, native support would mean the controller has 100% of its functionality (the triggers, vibration, etc.) wirelessly. That's not the case: you can only get it with a wire, in the same way that Metro and Valhalla have been doing for a while. So much so that you need to disable the PlayStation controller profile in the Steam controller panel. I wouldn't call that native support.


And yes, I understand that changing FOV is different from changing aspect ratio. However, I found, playing and doing tests, that if you change the aspect ratio to 21:9, even if your TV or monitor is 16:9, you get a gain in FOV at the cost of black bars at the top and bottom of the image. Do the test.
Here, I believe I simply needed to elaborate better on what I wanted to say the first time.

So you really don't understand these concepts. OK then.
 

Cryio

Member
The port is a POS. An AAA, DX11-only game in 2022, with no DX12 or Vulkan? Performance is abysmal during loading transitions and when looking at areas with some draw distance on AMD GPUs. Settings, FSR, resolution: none of it matters if you don't have a top-of-the-line CPU.

I have a 5700 XT + R5 3600, and God of War is the worst-performing PC port I've seen in a long while from a AAA game. It's Batman: Arkham Knight or Dishonored 2 at launch kind of bad.

People kept blasting Final Fantasy 7 Remake as "a really bad port". Hell nah. That game runs like a dream at 120 fps on DirectX 11 and 12 on tons of GPUs and CPUs, unlike God of War.
 
Last edited:
Flawless Widescreen includes a FOV slider for this game now:

www.flawlesswidescreen.org



When you increase the FOV, it also disables the depth of field.

I was about to post showing that photo mode already has an in-game FOV slider, so why can't we use one during gameplay?

Photo mode at ~100 FOV [82 + a little zoomed out]:


Meanwhile, in game, this is the 21:9 aspect ratio and it still looks like a bad joke [16:9 is completely unplayable]:
 

MasterCornholio

Gold Member
My gosh, why don't you give it up, master? There is not always an excuse for everything. It is also okay to accept things as they are and see the reality of the facts. We have a version of the game that has a number of improvements and simply runs better, no matter which "port mode" it uses, while the same game, no matter what and how, runs worse and without improvements on the PS5. The fault is certainly not the players'; it lies with the software or the console design, if it cannot exploit the hardware's potential or if, unlike other hardware, it needs different code to run. Things are as they are and will continue to be unless Cerny or Sony change the way the PS5 performs BC.

I've already accepted that the PC version is better. It's Sony's fault for not making a native PS5 version for PS5 owners. We all know that if they wanted to, the game on PS5 could be better, instead of just having a simple frame rate unlock.

Also, these same issues affect the Series consoles, in case you didn't notice. Not every X1 game runs at higher settings on the XSX unless the developer programs it to. What is true is that it's easier to implement improvements into BC on the Series consoles. But neither console works like PC, where all you do is adjust the settings to obtain the desired results.

It's more the fault of how consoles are designed than anything else. PC is vastly different, since it covers a wide selection of hardware.
 
Last edited:

yamaci17

Gold Member
Forced sharpening is not welcome.

Make it a toggle, have it on by default, and the problem is solved. No need to force it upon people who don't like it. I made my peace with TAA being mandatory in games, but now they're literally forcing sharpeners upon people. These are post-processing effects; they distort the natural beauty of the game's image. I respect people who like the sharpened look; I guess that is why they just force it on and forget about it. I hate it. I hate it with all my conscious being. I HATE THE DAMN SHARPENING.
 
Last edited:
Forced sharpening is not welcome.

Make it a toggle, have it on by default, and the problem is solved. No need to force it upon people who don't like it. I made my peace with TAA being mandatory in games, but now they're literally forcing sharpeners upon people. These are post-processing effects; they distort the natural beauty of the game's image. I respect people who like the sharpened look; I guess that is why they just force it on and forget about it. I hate it. I hate it with all my conscious being. I HATE THE DAMN SHARPENING.

Yep. There are already ways to enable sharpening for those who want it, through GFE or ReShade. Forced sharpening sucks, and it usually looks like shit too.
 
Last edited: