
God of War’s director would love to see the game on the PC, but this decision is above his paygrade

It would've been lovely to see God of War on PC with enhanced graphics. But it's one of Sony's most famous exclusives, so they won't port it to PC.
 

psorcerer

Banned
The entire point of hardware abstraction layers is that they are hardware agnostic.

Yes, it's true for PC abstractions.
But consoles don't need it.

P.S. You can open this old presentation: http://twvideo01.ubm-us.net/o1/vault/gdceurope2013/Presentations/825424RichardStenson.pdf
If you study it, you will see quite a lot of hardware-specific things that are allowed in PSSL alone.
Yes, almost all of it is available in Vulkan, but only as GCN-specific extensions, i.e. they will not work on Nvidia at all.
And all of these extensions are performance-oriented, except PRT, which is more of a new architecture you need to build around.
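To give a rough idea of the difference (a minimal C++ sketch, not production code; VK_AMD_shader_ballot is just one example of a vendor-specific extension), on PC a Vulkan renderer has to probe for these extensions at runtime and keep a fallback path, while a console title can simply assume the GCN features exist:

#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Returns true if the physical device advertises the given extension.
// On a console the hardware is fixed, so this kind of runtime check is
// unnecessary; the GCN-specific features are simply part of the platform API.
bool HasDeviceExtension(VkPhysicalDevice gpu, const char* name)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);

    std::vector<VkExtensionProperties> props(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, props.data());

    for (const auto& p : props)
        if (std::strcmp(p.extensionName, name) == 0)
            return true;
    return false;
}

// Usage: only take the vendor-specific path when the extension is present,
// otherwise fall back to a portable (usually slower) code path.
// if (HasDeviceExtension(gpu, "VK_AMD_shader_ballot")) { /* GCN path */ }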
 
Yes, it's true for PC abstractions.
But consoles don't need it.
Oh really. How is Sony going to achieve 100% PS4 backwards compatibility on the PS5? The PS5, if I'm not mistaken, has a totally different GPU architecture, does it not? I'll say it again. The entire point of hardware abstraction layers is that they are hardware agnostic. If you know exactly what GPU you're working with, you're better off going bare metal. As for your point about extensions...kinda null. Both AMD and NVIDIA provide hardware-specific extensions of their own for the APIs on PC that are...guess what...performance-oriented. That goes for OpenGL, DirectX, and Vulkan. Developers don't necessarily have to use them, just like on console, but they do exist.
 

DeepEnigma

Gold Member
Oh really. How is Sony going to achieve 100% PS4 backwards compatibility on the PS5? The PS5, if I'm not mistaken, has a totally different GPU architecture, does it not? I'll say it again. The entire point of hardware abstraction layers is that they are hardware agnostic. If you know exactly what GPU you're working with, you're better off going bare metal. As for your point about extensions...kinda null. Both AMD and NVIDIA provide hardware-specific extensions of their own for the APIs on PC that are...guess what...performance-oriented. That goes for OpenGL, DirectX, and Vulkan. Developers don't necessarily have to use them, just like on console, but they do exist.

The ISA is still GCN even if the microarchitecture is RDNA. Just like the ISA on CPUs is x86/64 across different microarchitectures, and nVidia has also been using the same ISA across different architectures. All for compatibility.
 
The ISA is still GCN even if the microarchitecture is RDNA. Just like the ISA on CPUs is x86/64 across different microarchitectures, and nVidia has also been using the same ISA across different architectures. All for compatibility.
While that may be true, that doesn't mean you can go bare metal and expect it to work on any GPU model other than the one you develop for. It might work. That's about as much as you can say. Not only that, but it'll never work on a GPU of a different architecture. That's why you need hardware abstraction. It basically makes it a non-factor, at a small cost to performance. The PS4 and Xbox One were built with backwards compatibility in mind. They could slap an NVIDIA GPU in the PS5, with the appropriate drivers of course, and PS4 games would still work.
They are not used in the PC world
Of course they are. For example, for months Sonic Ether's path-traced GI shaders worked on NVIDIA but not AMD cards. Do you know why? Because NVIDIA provides an extension to the OpenGL API (NV_geometry_shader4 specifically) that lets geometry shaders accept quads, or perhaps more accurately, automatically tessellates them into tris, and AMD doesn't. On AMD cards geometry shaders only support tris. It does work now because SE found a way around it; if I recall correctly, he axed the geometry shaders.

This idea that you have that extensions aren't used on PC...it's just demonstrably wrong. Of course specific extensions are going to be different between the platforms but that doesn't change the fact that extensions exist on PC.
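For what it's worth, here is roughly how a PC renderer gates a path like that at runtime (a minimal C++ sketch, assuming a core-profile context with a loader like GLEW already initialised; the quad/triangle choice is just illustrative, not SE's actual code):

#include <GL/glew.h>
#include <cstring>

// Scan the driver's extension list (core-profile style) for a given name.
bool HasGLExtension(const char* name)
{
    GLint count = 0;
    glGetIntegerv(GL_NUM_EXTENSIONS, &count);
    for (GLint i = 0; i < count; ++i)
    {
        const char* ext =
            reinterpret_cast<const char*>(glGetStringi(GL_EXTENSIONS, i));
        if (ext && std::strcmp(ext, name) == 0)
            return true;
    }
    return false;
}

// Usage: NVIDIA exposes GL_NV_geometry_shader4 (quad inputs for geometry
// shaders); without it you would fall back to feeding the shader triangles.
// bool quadInputs = HasGLExtension("GL_NV_geometry_shader4");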
 
You can read the RDR2 for PC OT to get an idea of how "good" the performance of a "console-first" title looks on PC.
Running at far higher settings than the console version, targeting twice the framerate, at native 4K. Set the graphical settings at, or close to, console quality and the console's performance will get shat on from such a great height that fanboys won't even know what happened.
 
No. No and No.
Yes. Yes and Yes.
I've been gaming on PC for long enough to know that if you target console quality the consoles get destroyed. Every. Single. Time. More often than not you can push past 60FPS when the console you're "emulating" struggles to even hold 30.
 

psorcerer

Banned
I've been gaming on PC for long enough to know that if you target console quality the consoles get destroyed. Every. Single. Time. More often than not you can push past 60FPS when the console you're "emulating" struggles to even hold 30.

Numbers say otherwise.
RDR2 barely runs at 1080p/60fps on top PC GPUs with X1X settings. Sorry to break it to you.
 
Numbers say otherwise.
RDR2 barely runs at 1080p/60fps on top PC GPUs with X1X settings. Sorry to break it to you.
Ultra (or a custom mix that is visually near-indistinguishable) is not X1X settings. Sorry to break it to you. 80FPS average also isn't "barely 60FPS".
 

psorcerer

Banned
Ultra (or a custom mix that is visually near-indistinguishable) is not X1X settings. Sorry to break it to you. 80FPS average also isn't "barely 60FPS".

I'm not sure you understand how to measure the performance of a modern game properly.
Hint: "average" fps is not it.
Horrible frame pacing and microfreezes never show up in "average" numbers.
 
I'm not sure you understand how to measure the performance of a modern game properly.
Hint: "average" fps is not it.
Horrible frame pacing and microfreezes never show up in "average" numbers.
When people talk about poor frame pacing they're generally talking about issues with, for example, a wonky 30FPS cap where the frametimes rapidly waver between 50.1ms and 16.7ms. This is down to the cap and is generally a non-issue on PC, where you can force driver-level vsync or use RTSS to limit the framerate if the in-game options are wonky. As for microstutters...well they didn't provide 1% and 0.1% lows...so neither of us has that data. However you should know that 0.1% lows are rarely down to GPU bottlenecks, and are usually down to asset streaming or momentary CPU load spikes.
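For context, the 1% and 0.1% lows people quote are usually derived from a frametime capture something like this (a minimal C++ sketch of one common convention, averaging the slowest X% of frames; some tools report the percentile frametime instead):

#include <algorithm>
#include <functional>
#include <vector>

// "X% low" FPS: average the slowest X% of frame times and convert to FPS.
double PercentLowFps(std::vector<double> frametimesMs, double percent)
{
    if (frametimesMs.empty()) return 0.0;

    // Sort so the slowest frames come first.
    std::sort(frametimesMs.begin(), frametimesMs.end(), std::greater<double>());

    size_t n = static_cast<size_t>(frametimesMs.size() * percent / 100.0);
    if (n == 0) n = 1;

    double sumMs = 0.0;
    for (size_t i = 0; i < n; ++i) sumMs += frametimesMs[i];

    return 1000.0 / (sumMs / n);   // average slow frametime -> FPS
}

// Usage with frame times (in ms) from an OCAT/PresentMon-style capture:
// double low1  = PercentLowFps(times, 1.0);   // 1% low
// double low01 = PercentLowFps(times, 0.1);   // 0.1% low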
 

psorcerer

Banned
When people talk about poor frame pacing they're generally talking about issues with, for example, a wonky 30FPS cap where the frametimes rapidly waver between 50.1ms and 16.7ms. This is down to the cap and is generally a non-issue on PC, where you can force driver-level vsync or use RTSS to limit the framerate if the in-game options are wonky. As for microstutters...well they didn't provide 1% and 0.1% lows...so neither of us has that data. However you should know that 0.1% lows are rarely down to GPU bottlenecks, and are usually down to asset streaming or momentary CPU load spikes.

I have tested it myself with OCAT.
The Vulkan renderer is full of stutters. You can get up to 500 dropped frames per minute.
The DX12 renderer is much better, but frame times are all over the place: the 99.9th percentile is around 25 ms at 1080p.
It doesn't really matter which resolution and settings, because the PC's problems aren't about brute-forcing the game; they are in the architecture itself.
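(To be clear about what that 99.9% figure means: it's a percentile frame time, roughly computed like this; a sketch of the idea, not OCAT's actual code.)

#include <algorithm>
#include <vector>

// 99.9th-percentile frame time: the value that 99.9% of frames are faster than.
double PercentileFrametimeMs(std::vector<double> frametimesMs, double percentile)
{
    if (frametimesMs.empty()) return 0.0;

    size_t idx = static_cast<size_t>(frametimesMs.size() * percentile / 100.0);
    if (idx >= frametimesMs.size()) idx = frametimesMs.size() - 1;

    // Partially sort so the element at idx lands in its sorted position.
    std::nth_element(frametimesMs.begin(), frametimesMs.begin() + idx,
                     frametimesMs.end());
    return frametimesMs[idx];
}

// e.g. PercentileFrametimeMs(times, 99.9) returning ~25 ms means the worst
// 0.1% of frames take 25 ms or longer, even if the "average" fps looks fine.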
 
I have tested it myself with OCAT.
The Vulkan renderer is full of stutters. You can get up to 500 dropped frames per minute.
The DX12 renderer is much better, but frame times are all over the place: the 99.9th percentile is around 25 ms at 1080p.
Ignoring the issue you had with the Vulkan renderer... On what hardware? At what settings? Did you test in a wide variety of in game situations? Did you take the time to verify there was not some issue on your end? Where's the graph? Can others confirm your results? Did you try instituting a sane cap and seeing if the frametimes leveled out? This "data" is useless without context.
 
If he doesn't understand why it isn't then it's good that it's above his pay grade.
While I probably agree with you on the reason there are exclusives for each platform, I am pretty sure Cory understands it all too well, but he would still love to see a version of God of War that runs flawlessly in glorious 21:9 at 120+ FPS...

Probably in the same way Nintendo staff (not PR) reacted to seeing Super Mario 64 and other N64 titles in higher resolution back in the day.
 

Xmengrey

Member
While I probably agree with you on the reason there are exclusives for each platform, I am pretty sure Cory understands it all too well, but he would still love to see a version of God of War that runs flawlessly in glorious 21:9 at 120+ FPS...

Probably in the same way Nintendo staff (not PR) reacted to seeing Super Mario 64 and other N64 titles in higher resolution back in the day.

Exclusivity between consoles may matter.
Between console and PC, it does not.
The reason you have exclusives for consoles is that they want you to invest in the ecosystem.
PC gamers will never invest in a console ecosystem. They will never buy third-party titles there, just the exclusives.
Hardware sales also run on very thin margins, or sometimes make no money at all.

PC gamers mainly buy games digitally, not physically; in fact, I believe most gaming PCs don't even have disc drives.
Console gamers are mainly physical purchasers, so the incentive to buy a console and invest in the ecosystem is there.
 