
IGNxgame: Uncharted: Legacy of Thieves Collection PC vs. PS5 Performance Review

ACESHIGH

Banned
SlimySnake I am all for devs taking advantage of console hardware.
But when porting something to PC, they should do a good job. After all, we are still paying full price for these games.

If a console release were as broken as these ports, they would fix it on the double.
One thing I'll give the studios working on these ports is that eventually their games get patched and left in a somewhat decent state. They had lots of back-and-forth communication on the Steam forums about issues to manage customers' expectations.

Let's hope Sony straightens Iron Galaxy out so that the game is fixed. It's one of their flagship IPs after all, and it's selling like dog shit. Sure as hell the port quality has something to do with it.
 

SlimySnake

Flashless at the Golden Globes
No doubt that the Uncharted 4 port is a bad one, making the comparison more pointless.
By that logic, all comparisons are pointless. How many has Alex done? AC Valhalla, Hitman 3, Call of Duty, Deathloop. He's constantly comparing PS5 ports of third-party games that may or may not have been optimized to take full advantage of the PS5 hardware, but he has no problem doing those comparisons. Notice how he doesn't do an FPS comparison in this video? Why? Just because it doesn't fit his narrative that a PS5 is a 2060? No one is going to look at the Spider-Man and Uncharted benchmarks and say that the PS5 is equivalent to a 3070, but Alex had no problem claiming the 2060 is equivalent to a PS5 literally days after launch.

At the end of the day, all this proves is that every game is different, and using one game to determine the GPU power of the PS5 is ridiculous. The PS5 GPU will punch above its weight in AMD-sponsored titles, and we are now seeing that it clearly punches above even higher-tflops AMD GPUs in exclusives made by Sony first-party studios.

P.S. I would love to see this game running on a Radeon 7850 or the GTX 750 Ti that DF tried for so many years to present as a PS4 alternative.
 

SlimySnake

Flashless at the Golden Globes
SlimySnake I am all for devs taking advantage of console hardware.
But when porting something to PC, they should do a good job. After all, we are still paying full price for these games.

If a console release were as broken as these ports, they would fix it on the double.
One thing I'll give the studios working on these ports is that eventually their games get patched and left in a somewhat decent state. They had lots of back-and-forth communication on the Steam forums about issues to manage customers' expectations.

Let's hope Sony straightens Iron Galaxy out so that the game is fixed. It's one of their flagship IPs after all, and it's selling like dog shit. Sure as hell the port quality has something to do with it.
I guess I am a bit behind on the issues. I know it's not performing as well as it should, but I wasn't aware of any bugs beyond the motion blur issue Alex pointed out in this video. What else is broken?

I doubt it's like HZD, which had crashes, bugs, and stutters galore that took 6 months to fix. I doubt motion blur would take that long.
 

SlimySnake

Flashless at the Golden Globes
Good point and I just checked as you asked... 1-2 cores are 100% loaded only during game load, but during gameplay, all 16 threads are almost evenly loaded in a CPU-intensive scene like this.

Check it out:
FPS: 86
[screenshot]


Let's see the same scene on PS5:
FPS: 117
Is this on 1440p or 1080p mode?

Testing at higher resolutions should reduce the CPU overhead somewhat. In my testing, I wasn't able to use native 4K because my LG CX can't correctly show the FPS below 60 fps, but at 1440p, where the PS5 tops out at 90 fps, I saw the 3070 stay around 80-85 fps, like the spot at the very start of Chapter 6 (the auction level). Wish NX Gamer had done more native 4K framerate tests, but you can probably use his Madagascar level benchmarks that show the PS5 around 45-50 fps max, since it isn't CPU-bound in that scene like it is in his chase sequence 4K footage.
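If anyone wants to sanity-check that kind of per-core reading on their own rig, the simplest thing is to log utilization while the benchmark scene plays out. A rough sketch, assuming Python with the psutil package installed; the one-second interval and five-minute duration are just placeholder values:

```python
# Rough per-core utilization logger to run alongside a benchmark capture.
# Assumes `pip install psutil`; interval/duration are arbitrary example values.
import time
import psutil

INTERVAL_S = 1.0   # sample once per second
DURATION_S = 300   # log roughly five minutes of gameplay

psutil.cpu_percent(percpu=True)  # prime the counters (first call returns zeros)
end = time.time() + DURATION_S
while time.time() < end:
    time.sleep(INTERVAL_S)
    per_core = psutil.cpu_percent(percpu=True)
    print(time.strftime("%H:%M:%S"), " ".join(f"{c:5.1f}" for c in per_core))
```

If one or two entries sit near 100% while the rest idle, you're looking at a main-thread bottleneck rather than an overall CPU limit.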
 

ACESHIGH

Banned
I guess I am a bit behind on the issues. I know it's not performing as well as it should, but I wasn't aware of any bugs beyond the motion blur issue Alex pointed out in this video. What else is broken?

I doubt it's like HZD, which had crashes, bugs, and stutters galore that took 6 months to fix. I doubt motion blur would take that long.

It's worse than HZD: beyond the motion blur issues there are broken cutscenes, stuttering and more. HZD/Days Gone had acceptable performance on console-equivalent hardware. This one is far from it, which is the main criticism of this game.

The thing is that most of these outlets like DF never point those issues out because they test games on overpowered hardware and are happy as long as they get more than 60 FPS. Very low standards.
 

rofif

Can’t Git Gud
Nick dropped a video.
2:20 - every cutscene transition is pre-rendered at 30fps... OUCH, WHAT?!
It's real-time on PS5 I think, so this means PC I/O is really lagging behind if they can't switch that quickly.
 

Md Ray

Member
Is this on 1440p or 1080p mode?
Using the same settings as PS5 in Performance+ mode, so 1080p.

Testing at higher resolutions should reduce the CPU overhead somewhat. In my testing, I wasn't able to use native 4K because my LG CX can't correctly show the FPS below 60 fps, but at 1440p, where the PS5 tops out at 90 fps, I saw the 3070 stay around 80-85 fps, like the spot at the very start of Chapter 6 (the auction level). Wish NX Gamer had done more native 4K framerate tests, but you can probably use his Madagascar level benchmarks that show the PS5 around 45-50 fps max, since it isn't CPU-bound in that scene like it is in his chase sequence 4K footage.
I was purely benching the CPU performance there. I'll do native 4K framerate tests for you, can you link me to NXG's Madagascar level benchmark? Is it on his channel or IGN?
 

SlimySnake

Flashless at the Golden Globes
Using the same settings as PS5 in Performance+ mode, so 1080p.


I was purely benching the CPU performance there. I'll do native 4K framerate tests for you, can you link me to NXG's Madagascar level benchmark? Is it on his channel or IGN?
It's in the video in the OP, around halfway through.
 

Md Ray

Member
Nick dropped a video.
2:20 - every cutscene transition is pre-rendered at 30fps... OUCH, WHAT?!
It's real-time on PS5 I think, so this means PC I/O is really lagging behind if they can't switch that quickly.

It's the same on PS5. Those brief pre-rendered cutscenes run at 30fps in 40fps mode and likely in other HFR modes as well.

Source:
Digital Foundry said:
Also the brief interstitial videos that sometimes bridge real-time cutscenes and gameplay play back at a straight 30fps
Timestamped:
 

rofif

Can’t Git Gud
It's the same on PS5. Those brief pre-rendered cutscenes run at 30fps in 40fps mode and likely in other HFR modes as well.

Source:

Timestamped:

oh this sucks ass. Even 60fps mode has 30fps transitions?
I played 4k30, so I wouldn't know. That's why I didn't say for sure
 

Md Ray

Member
It's in the video in the OP, around halfway through.
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.


EDIT: In short, PS5 is performing like an RTX 3070 at 4K, even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked giving the GPU 512 GB/s of mem bandwidth. And the PS5 with 448 GB/s (shared between GPU & CPU) is able to match and exceed the 3070's perf.
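For anyone wondering where those numbers come from, peak GDDR6 bandwidth is just bus width times effective data rate. A quick back-of-the-envelope, assuming the 3070's 256-bit bus at the stock 14 Gbps and the memory OC pushing it to roughly 16 Gbps:

```python
# Peak GDDR6 bandwidth = (bus width in bytes) x (effective data rate per pin, in Gbps)
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 14.0))  # 448.0 GB/s - stock RTX 3070, and also the PS5's figure
print(bandwidth_gb_s(256, 16.0))  # 512.0 GB/s - 3070 with memory OC'd to ~16 Gbps
```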
 
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.

Nice comparison. So using those very GPU-limited settings, the PS5 (GPU) is basically performing like a 3070 in the cutscene, and even a bit better during gameplay, with consistently lower low FPS on PC (so it's performing like a... 3070 Ti during gameplay?).

No wonder DF didn't want to make a direct comparison, as it would not have been good for the narrative they have been pushing since the beginning! And obviously their high-end 5GHz CPU would not have helped their lowly 2060 Super here.
 
That first PS5 version of COD has alpha effects on consoles reduced to 1/4 resolution compared to the PC version, which is much easier on the ROPs and overall bandwidth. Furthermore, the latest GeForce patch also provided some strong performance improvements in some games.

In COD Warzone, for instance, fps improved by as much as 44% in some cases.




The 2080 Ti also outperforms the 3070 Ti and 3070 in some instances in Spider-Man. Some guys on Beyond3D also posted screenshots where the 2080 Ti outperformed the PS5 by 40%+ in some scenes.

There was something strange going on with the VRAM, and I'm guessing the BVH structure being extremely heavy on the CPU, as stated by Nixxes, has something to do with the performance inconsistencies.

PCs and the PS5 have different configurations, so different bottlenecks will occur. The PS5 seems to be doing excellently during rapid streaming but is ostensibly still bandwidth-constrained, with even in-house devs still opting for lower AF when 16x AF has been free on PC for years. Those 1:1 comparisons are often flawed because, as I said, different scenes will hit different areas differently (whew, that's a lot of different). It's far more complicated than just PS5 GPU>2070>2080S etc. The PS5 is the sum of all its parts, not just a GPU that can be isolated.

The Beyond3D forum users were using better CPUs and sometimes even doing things like lowering texture quality, hair quality, or scene density.
 
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.


EDIT: In short, PS5 is performing like an RTX 3070 at 4K, even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked giving the GPU 512 GB/s of mem bandwidth. And the PS5 with 448 GB/s (shared between GPU & CPU) is able to match and exceed the 3070's perf.

Hey, I just subbed. I love the benchmarks and how quick you are to find things. Will you test The Last of Us, Miles, and other games when they come out?
 

SlimySnake

Flashless at the Golden Globes
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.


EDIT: In short, PS5 is performing like an RTX 3070 at 4K, even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked giving the GPU 512 GB/s of mem bandwidth. And the PS5 with 448 GB/s (shared between GPU & CPU) is able to match and exceed the 3070's perf.

Amazing. Thanks man!
 
As usual with most PS5 vs PC comparisons, they use any means they can to twist the comparison in order to make the PS5 look bad, by:

- Using a cherry-picked short scene or frame instead of a bigger scene average.
- Using a high-end 5GHz CPU + 2060 Super in CPU-limited scenes and claiming they are comparing against the PS5 GPU only (and not the CPU + GPU + API combination, as for instance NXGamer rightfully says).
- When the PS5 still beats, say, a 3070, simply refusing to compare them, citing x or y reason (which doesn't prevent them from making comparisons in plenty of other analyses).
- And finally, when a comparison is possible, refusing to compare PC against PS5 uncapped when the PS5 has an uncapped VRR mode (like in Spider-Man).

This has been the case in almost all of Alex's PC vs PS5 comparisons to date, notably Death Stranding, God of War and all the Uncharted remasters.
I want them to retest Death Stranding but have the GPUs paired with a 3700X instead of the 12900K they initially used.
 

Gaiff

Gold Member
The Beyond3D forum users were using better CPUs and sometimes even doing things like lowering texture quality, hair quality, or scene density.
No, they used equivalent settings and you're there too. Please don't lie. I'm talking specifically about the opening cutscene on the 2080 Ti.
 

rodrigolfp

Haptic Gamepads 4 Life
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.


EDIT: In short, PS5 is performing like an RTX 3070 at 4K, even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked giving the GPU 512 GB/s of mem bandwidth. And the PS5 with 448 GB/s (shared between GPU & CPU) is able to match and exceed the 3070's perf.

Why is it blurry af in fidelity mode?
 

Gaiff

Gold Member
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.


EDIT: In short, PS5 is performing like an RTX 3070 at 4K, even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked giving the GPU 512 GB/s of mem bandwidth. And the PS5 with 448 GB/s (shared between GPU & CPU) is able to match and exceed the 3070's perf.

Interestingly, my 2080 Ti outperforms your 3070 by 10-15% in those scenes.
 
No, they used equivalent settings and you're there too. Please don't lie. I'm talking specifically about the opening cutscene on the 2080 Ti.
Interesting. A 2080 Super comparison has the PS5 ahead in similar scenarios. This guy is using an i7-9700K, which is 4.9 GHz but only 8 cores and 8 threads, and my PS5 is roughly 15-20 fps ahead. I figured the CPU was bottlenecking the 2080 Super, but then I saw another 3070 Ti benchmark, and the PS5 delivers roughly the same performance at 1440p High settings despite the 3070 Ti being paired with a 5900X.

Pretty impressive showing for the PS5 here. Either that, or this game is really poorly optimized for PC lol.





NX Gamer's results, at least in this game, might not be that far off if those 2080 Super and 3070 Ti benchmarks are any indication. I found a 3080 1440p benchmark but need to run through that level on my PS5 to see how it fares. I am unable to find any 2070 Super benchmarks on YouTube that aren't using DLSS.


I saw the thread; they said some of the settings don't matter.
 
Have they even started working on patching this game? What a shit show.

Hopefully the PC version of The Last of Us is handled by a more competent developer. Let's see how Sackboy turns out this week.
Sackboy can't be benchmarked until they give the PS5 version VRR support; it's still capped to 60.
 

Gaiff

Gold Member
I saw the thread; they said some of the settings don't matter.
Don't know who the "they" you're referring to is. The 2080 Ti used equivalent settings and was massively ahead of the PS5.

As I said, it varies a lot, and trying to isolate only the GPU on a PS5 is a tall task. It's difficult to tell where the bottleneck even is since there are no monitoring tools like on PC. The best analysts can do is guesswork.
 
Death Stranding on PS5 performs on par with a 3060, even a 3070 in some scenes. Shouldn't people complain that it was a bad PC version because the game runs so well comparatively on PS5? So the outcome is about the same as with Uncharted, actually.
It's in between a 3070 and a 3070 Ti in that game when using a 3700X as the CPU.
 

Gaiff

Gold Member
Likely due to it having more VRAM.
I was thinking perhaps that, because the game states it uses 7.5GB at those settings, but your 3070 doesn't appear to hit any sort of VRAM limit, so it's curious. The lower bound is 7% and the upper bound is 16% in terms of the differential in favor of my 2080 Ti.
What CPU do you have?
9900K, so a notch better for gaming. Could be the reason.
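To be clear about how those 7%/16% figures are derived: it's just the frame-rate ratio minus one. The fps values below are made-up placeholders rather than the actual captures:

```python
# Percentage differential between two average frame rates.
# The fps numbers here are illustrative placeholders, not the real benchmark data.
def diff_pct(fps_a: float, fps_b: float) -> float:
    return (fps_a / fps_b - 1) * 100

print(f"{diff_pct(75.0, 70.0):.1f}%")  # ~7.1% - a lower-bound-sized gap
print(f"{diff_pct(58.0, 50.0):.1f}%")  # 16.0% - an upper-bound-sized gap
```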
 
Using the same settings as PS5 in Performance+ mode, so 1080p.


I was purely benching the CPU performance there. I'll do native 4K framerate tests for you, can you link me to NXG's Madagascar level benchmark? Is it on his channel or IGN?
I can link both; the video on his channel tests more of the 4K mode.
 
There you go. PS5 vs PC 4K benchmark with settings as close as possible to PS5's Fidelity Mode (i.e. High). Keep in mind the LOD and motion blur bug still remains on the PC version.

I tried to mimic/match NXG's gameplay as best as I can and 4K is still being processed by YT.


EDIT: In short, PS5 is performing like an RTX 3070 at 4K, even outperforms the 3070 a touch in some scenarios!
It's also worth mentioning that my 3070 is not running at stock speeds. Along with the core clock, the memory is also overclocked giving the GPU 512 GB/s of mem bandwidth. And the PS5 with 448 GB/s (shared between GPU & CPU) is able to match and exceed the 3070's perf.

So in simple terms, it's performing like a 3070 Ti in gameplay sequences.
 

Md Ray

Member
Hey, I just subbed. I love the benchmarks and how quick you are to find things. Will you test The Last of Us, Miles, and other games when they come out?
Thank you! I definitely am planning on taking a look at all those games you mentioned and more. Can't wait to test out The Last of Us in particular when it comes out.
Amazing. Thanks man!
You're welcome! What do you think? A 3700X+3070 PC is falling short in the 120fps mode and the 4K mode compared to the PS5. Does this indicate poor CPU+GPU optimization by Iron Galaxy?
 
Nice comparison. So using those very GPU-limited settings, the PS5 (GPU) is basically performing like a 3070 in the cutscene, and even a bit better during gameplay, with consistently lower low FPS on PC (so it's performing like a... 3070 Ti during gameplay?).

No wonder DF didn't want to make a direct comparison, as it would not have been good for the narrative they have been pushing since the beginning! And obviously their high-end 5GHz CPU would not have helped their lowly 2060 Super here.
Could theoretically be 1-2% over the 3070 Ti at points even during gameplay; it wouldn't shock me. Seems you need a 3080 or above to consistently equal or beat the PS5 here.
 

SlimySnake

Flashless at the Golden Globes
Thank you! I definitely am planning on taking a look at all those games you mentioned and more. Can't wait to test out The Last of Us in particular when it comes out.

You're welcome! What do you think? A 3700X+3070 PC is falling short in the 120fps mode and the 4K mode compared to the PS5. Does this indicate poor CPU+GPU optimization by Iron Galaxy?
It's falling short in the 1440p comparisons I did as well. As for why, I think it's simply due to how these games were built on the PS4, around the PS4's strengths and weaknesses. They might not be properly threading the CPU or even the GPU tasks. The 3070 has an insane number of shader processors compared to the PS5; even though the GPU utilization is in the 90s, it's possible the GPU isn't being fed fast enough. Or the game just prefers AMD cards. But then again, NX Gamer's 16 tflops 6800 is outperforming the PS5 by only around 15-20%, so the PS5 is definitely punching above its weight even against AMD GPUs.

This is not the first time we've seen this. We saw this with GOW, with Spider-Man, Death Stranding and, to a lesser extent, Horizon. I remember pulling up NX Gamer's PS4 Pro 60 fps mode footage, and it was averaging 56 fps with a shit-tier Jaguar CPU and a 4.2 tflops GPU. BETTER than the 1060 and AMD's 6 tflops 580, which are roughly equivalent to each other. This is what happens when you finally see console games ported to PC: we see just how much third-party games are held back by not being developed on consoles first.

9900K, so a notch better for gaming. Could be the reason.
Yeah, that's a badass CPU. I bet the 5.0 GHz clock speed is helping with the performance, since NX Gamer (or was it DF?) said that the game likes faster clocks. I saw the same thing in the Matrix demo: my i7-11700KF, which was kinda shit in terms of thermals and wattage, all of a sudden turned into a beast compared to the equivalent AMD CPUs, which topped out at 4.4 GHz. The unlocked power and higher clocks really helped me hit higher FPS in those games. A lot of these console games are single-threaded, and they benefit more from higher clock speeds than from more threads and cores.
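Back-of-the-envelope way to see why clocks matter more than cores here: if most of a frame's CPU time sits on one thread, a clock bump speeds up the whole frame, while extra cores only help the part that's actually parallel. Toy Amdahl-style numbers below; the 70/30 split, the 12 ms baseline and the clock figures are purely illustrative assumptions:

```python
# Toy model: per-frame CPU time = serial part + parallel part / core count,
# with everything scaling inversely with clock speed.
# The 70% serial fraction, 12 ms baseline and clock numbers are made-up examples.
def frame_time_ms(base_ms: float, serial_frac: float, cores: int, clock_scale: float) -> float:
    serial = base_ms * serial_frac
    parallel = base_ms * (1.0 - serial_frac)
    return (serial + parallel / cores) / clock_scale

BASE_MS = 12.0  # hypothetical single-core CPU cost per frame at baseline clocks
print(frame_time_ms(BASE_MS, 0.7, 8, 1.0))        # ~8.85 ms on 8 cores
print(frame_time_ms(BASE_MS, 0.7, 16, 1.0))       # ~8.63 ms - doubling the cores barely helps
print(frame_time_ms(BASE_MS, 0.7, 8, 4.9 / 4.4))  # ~7.95 ms - an ~11% clock bump helps more
```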

No wonder DF didn't want to make a direct comparison, as it would not have been good for the narrative they have been pushing since the beginning! And obviously their high-end 5GHz CPU would not have helped their lowly 2060 Super here.
Alex is so sad lmao.
 
It's falling short in the 1440p comparisons I did as well. As for why, I think it's simply due to how these games were built on the PS4, around the PS4's strengths and weaknesses. They might not be properly threading the CPU or even the GPU tasks. The 3070 has an insane number of shader processors compared to the PS5; even though the GPU utilization is in the 90s, it's possible the GPU isn't being fed fast enough. Or the game just prefers AMD cards. But then again, NX Gamer's 16 tflops 6800 is outperforming the PS5 by only around 15-20%, so the PS5 is definitely punching above its weight even against AMD GPUs.

This is not the first time we've seen this. We saw this with GOW, with Spider-Man, Death Stranding and, to a lesser extent, Horizon. I remember pulling up NX Gamer's PS4 Pro 60 fps mode footage, and it was averaging 56 fps with a shit-tier Jaguar CPU and a 4.2 tflops GPU. BETTER than the 1060 and AMD's 6 tflops 580, which are roughly equivalent to each other. This is what happens when you finally see console games ported to PC: we see just how much third-party games are held back by not being developed on consoles first.


Yeah, that's a badass CPU. I bet the 5.0 GHz clock speed is helping with the performance, since NX Gamer (or was it DF?) said that the game likes faster clocks. I saw the same thing in the Matrix demo: my i7-11700KF, which was kinda shit in terms of thermals and wattage, all of a sudden turned into a beast compared to the equivalent AMD CPUs, which topped out at 4.4 GHz. The unlocked power and higher clocks really helped me hit higher FPS in those games. A lot of these console games are single-threaded, and they benefit more from higher clock speeds than from more threads and cores.


Alex is so sad lmao.
PS4 secret sauce:messenger_relieved:

But really, Sony put their best team on the field in their home stadium. As a PC gamer I wouldn't be upset; tentpole exclusives are few and far between these days.
 

Gaiff

Gold Member
PS4 secret sauce:messenger_relieved:

But really, Sony put their best team on the field in their home stadium. As a PC gamer I wouldn't be upset; tentpole exclusives are few and far between these days.
The performance on the PS5 is fine, but the performance on PC is inexcusable. This is what annoys me with the PC environment; many developers just bank on powerful hardware brute-forcing through shitty performance. As they say, necessity is the mother of invention, and console development is a prime example of that.

I remember back in the days of the SNES when they took advantage of things like Mode 7 to fake 3D graphics. The stuff developers did to achieve their goals within the constraints of the hardware at the time was nothing short of brilliant.

For Uncharted 4, Iron Galaxy just looked at the most common GPU, which is the 1060, and aimed for that. Except the 1060 is 2x the power of the PS4, so it shouldn't run similarly. I can agree that Naughty Dog did a better job at optimizing, but when the PS5 is matching a GPU that is theoretically 50% faster, the dev did a shit job at porting.
 

SlimySnake

Flashless at the Golden Globes
For Uncharted 4, Iron Galaxy just looked at the most common GPU, which is the 1060, and aimed for that. Except the 1060 is 2x the power of the PS4, so it shouldn't run similarly. I can agree that Naughty Dog did a better job at optimizing, but when the PS5 is matching a GPU that is theoretically 50% faster, the dev did a shit job at porting.
I don't disagree. I made similar points back when GOW launched. However, one must remember that these are simply ports. They were never going to go and retool their engine to support multithreading; the GOW porting studio admitted as much.

There is also other stuff under the hood that we simply don't know about. Everyone tried to pass off the PS4 as an underpowered PC, but that's clearly not the case. It might not be full of secret sauce like the Cell, but these latest benchmarks prove that the PS4 and now the PS5 do have something that's boosting performance or letting the consoles punch above their weight. After all, the PS5 got the same treatment, no? I doubt they went in and re-engineered everything to take advantage of the PS5 I/O and then removed all those changes in the PC version. Pretty sure these ports are based on the PS5 version of the game.
 

Gaiff

Gold Member
A 1060 is getting only 30fps at 1080p? Holy Gabe!
No, but it fails to maintain 60 even at 1080p/low settings. The PS4 is pretty much locked to 30fps so there's probably a bit of headroom to go higher but not high enough to reasonably hit the 60fps mark. The 1060 is more than twice as powerful as the OG PS4 and even faster than the PS4 Pro so with a halfway competent CPU, it should laugh at 60fps/1080p/low settings.
There is also other stuff under the hood that we simply don't know about. Everyone tried to pass off the PS4 as an underpowered PC, but that's clearly not the case. It might not be full of secret sauce like the Cell, but these latest benchmarks prove that the PS4 and now the PS5 do have something that's boosting performance or letting the consoles punch above their weight. After all, the PS5 got the same treatment, no? I doubt they went in and re-engineered everything to take advantage of the PS5 I/O and then removed all those changes in the PC version. Pretty sure these ports are based on the PS5 version of the game.
Definitely but there are levels to this. The Horizon port manages to maintain 60fps+ easily at 1080p/low. It drops to the low 50's at medium which is PS4 level. Uncharted LL tanks to the 40's at low/1080p on a 1060. Furthermore, the settings in Horizon actually make a difference. Anything above Medium in Uncharted on PC is barely noticeable. The scaling is poor and some effects are still broken.

Here's hoping it gets better over time. HZD was also in a rough shape initially.
 

SlimySnake

Flashless at the Golden Globes
No, but it fails to maintain 60 even at 1080p/low settings. The PS4 is pretty much locked to 30fps so there's probably a bit of headroom to go higher but not high enough to reasonably hit the 60fps mark. The 1060 is more than twice as powerful as the OG PS4 and even faster than the PS4 Pro so with a halfway competent CPU, it should laugh at 60fps/1080p/low settings.

Definitely but there are levels to this. The Horizon port manages to maintain 60fps+ easily at 1080p/low. It drops to the low 50's at medium which is PS4 level. Uncharted LL tanks to the 40's at low/1080p on a 1060. Furthermore, the settings in Horizon actually make a difference. Anything above Medium in Uncharted on PC is barely noticeable. The scaling is poor and some effects are still broken.

Here's hoping it gets better over time. HZD was also in a rough shape initially.
OK, that's pretty bad. I think GOW and HZD are roughly on par, since they both have PS5 setting presets and average in the mid 50s at 1080p on a 1060. So U4 being worse is a bit concerning.

GOW and HZD oddly didn't favor AMD cards IIRC. Uncharted does. I wonder if this is a driver issue.
 

ACESHIGH

Banned
I don't disagree. I made similar points back when GOW launched. However, one must remember that these are simply ports. They were never going to go and retool their engine to support multithreading; the GOW porting studio admitted as much.

There is also other stuff under the hood that we simply don't know about. Everyone tried to pass off the PS4 as an underpowered PC, but that's clearly not the case. It might not be full of secret sauce like the Cell, but these latest benchmarks prove that the PS4 and now the PS5 do have something that's boosting performance or letting the consoles punch above their weight. After all, the PS5 got the same treatment, no? I doubt they went in and re-engineered everything to take advantage of the PS5 I/O and then removed all those changes in the PC version. Pretty sure these ports are based on the PS5 version of the game.

The secret sauce is man-hours and the will to work hard. Lots of game studios and developers are slackers when it comes to PC ports, just relying on overpowered hardware. Now they have found another excuse: "just turn FSR or DLSS on."

As long as the game does not set your PC on fire, everything's good.

I will have to retool my spending habits and never spend more than 5 bucks on these kinds of hack jobs.

It's not that we are asking for PS4 performance and visuals on a 750 Ti, but it has to be reasonable.

Look at id with Doom: amazing performance and visuals across PC and several different consoles, all versions catering to their respective platforms' strengths. Or The Coalition/Playground/Turn 10.
Look how amazingly FH3 and Gears 5 run on a weak Xbox One and how well they also run on budget PCs. Ridiculous scalability.
That's how a PC port should be made. I don't mind bugs at release; those can be ironed out down the line. But such low performance is inexcusable.
 