
VGTech: Watch Dogs Legion 60fps Mode PS5 vs Xbox Series X|S Frame Rate Comparison

phil_t98

#SonyToo
No need to spin it. Also, you have it on the first page too. Btw, of course you don't see the jaggy double yellow lines, because they're blurred on the XSX version.




Look who's talking. What a hypocrite!!

Cut the crap with VRR. NXGamer, Digital Foundry and VGTech don't analyze games or do comparisons with VRR. They analyze games in their pure state, as they are.

There is no spin on it. If the draw distance is better on PS5 then so be it. Do you not see the jaggies also?
 
There is no spin on it. If the draw distance is better on PS5 then so be it. Do you not see the jaggies also?

I see it. It is visible on the XSX version too, on the double white lane on the left. You don't see it after the puddles because it is blurred, a.k.a. texture filtering. You have better examples on the first page.
 
All the non-believers of 60 fps: try 30 and 60 in this game. I swear, I want to puke at 30; the game sucks, but my god does it feel like a brand new game at 60fps.
I think this 30-to-60 fps jump in games is bigger than 720p to 1080p. It's borderline unplayable at 30 when you have seen it running at 60.
 

Md Ray

Member
it is true that gears 5 has excellent optimization, so it's not surprising that all systems run it fine; same for forza horizon 4

xbox studios might fine-tune their own games to hit 1080p 60 / 4k 60 for the s and x consoles respectively, but multiplatform games clearly target 1300/1440p 60 fps for ps5/xbox. in this scenario, series s never gets away with 1080p 60 fps

at some point, i'm fairly certain xbox studios will also target 1440p 60 fps for the big ones. otherwise, we will be having games that look like gears 5 (let's be honest, it's not that impressive graphically) for the entire generation (even the last of us 1 remaster looks leaps and bounds better than gears 5, and that's saying a lot, since that game was practically designed to run on a ps3). i will give forza horizon 4 props though
It's true. If you think about it, the game was originally designed for a cut-down HD 7790 (the Xbone GPU) to run at 1080p, so it's a lot easier to run on GTX 1060-level GPUs at 1620p-1800p at 60fps than most games. IIRC, I played the entirety of Gears 5 at near 4K 60fps on an OC'd GTX 970 when the game launched.
 

dcmk7

Banned
Why? It is a "next-gen" console after all. There is no need to ignore it in game comparisons. Otherwise, go to DF, NXG and VGTech and tell them not to include the XSS in comparisons. After all, MS said the difference in games between the XSS and XSX would only be in resolution. Let's see whether MS was right about that. So far, no.
This is a good post. Not sure why people are defending lows of 675p in 2021. There is nothing next-gen about that figure.

I would personally exclude the XSS from any next-gen comparisons, since the majority of its games are compromised versions, so it's all a bit misleading.
 

MonarchJT

Banned
Well, btw, it seems a trend is setting in; at this point we should start to always expect slightly better performance on XSX.
 

Lysandros

Member
Looking at it again, there seems to be up to a 30% resolution advantage for XSX at points; that's quite significant, I think. Isn't this the same engine as AC Valhalla, where PS5 had higher resolution bounds? I remain perplexed.
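For perspective, "30% more resolution" here means total pixels, not vertical lines, and the arithmetic makes the gap sound bigger than it looks on screen. A quick back-of-the-envelope in Python, with illustrative resolutions rather than VGTech's measured figures:

```python
# The arithmetic behind a "30% resolution advantage".
# These resolutions are illustrative, NOT VGTech's measured figures.
ps5_w, ps5_h = 1920, 1080    # hypothetical PS5 dynamic-res low point
xsx_w, xsx_h = 2190, 1232    # hypothetical XSX low point, same aspect ratio

ps5_pixels = ps5_w * ps5_h   # 2,073,600
xsx_pixels = xsx_w * xsx_h   # 2,698,080

print(f"pixel advantage:  {(xsx_pixels / ps5_pixels - 1) * 100:.0f}%")  # ~30%
print(f"linear advantage: {(xsx_h / ps5_h - 1) * 100:.0f}%")            # ~14%
```

A 30% pixel gap works out to only about 14% more resolution on each axis, which is part of why it's so hard to spot without pixel counting.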
 

Riky

$MSFT
If the bigger consoles are dropping down to 1180p, then where do people think the XSS is going to go when it's less than a third as powerful?
At the end of the day there is a fidelity mode if you really care.
The next-gen part is 60fps, and actually the most consistent 60fps; no last-gen console can do that.
 

MonarchJT

Banned
Looking at it again, there seems to be up to a 30% resolution advantage for XSX at points; that's quite significant, I think. Isn't this the same engine as AC Valhalla, where PS5 had higher resolution bounds? I remain perplexed.
I think taking launch games as the definitive measure of performance has always been misleading, especially when multiple sources told us that the tools for Microsoft's consoles were really behind the competition.
 

assurdum

Banned
AF is free now. It hasn't been before, but it is now. 10-year-old graphics cards can do 16xAF with zero notable performance impact.

With this game, we even know that the AF settings are exactly the same between both XSX and PS5, since all the settings are hidden in the PC config file. It's 100% a GDK issue.
It's not free on console. And no, the AF isn't the same; the PC config file shows you the Series X setup as carried over to PC, not the actual AF setting in the game code that runs on the console.
Posting a laughing emoticon doesn't change the reality of the coding 🤷‍♂️
 

assurdum

Banned
Looking at it again, there seems to be up to a 30% resolution advantage for XSX at points; that's quite significant, I think. Isn't this the same engine as AC Valhalla, where PS5 had higher resolution bounds? I remain perplexed.
It's not the same engine. Ubisoft uses four different engines, maybe even more. WD and AC use two dedicated engines for sure, as does the Far Cry series.
 

KungFucius

King Snowflake
So a minimum resolution difference of 12000 pixels is "slightly lower", but a 0.02 fps increase in median framerate is "better performance"?
It's so funny seeing fanboys wanting a win for their plastic box
C'mon. Clearly the PS5 is performing better; they just had to cut the resolution to do so. Oh wait.

If this is the type of takeaway we are going to get, this gen is going to be really, really long.
 

assurdum

Banned
C'mon. Clearly the PS5 is performing better; they just had to cut the resolution to do so. Oh wait.

If this is the type of takeaway we are going to get, this gen is going to be really, really long.
It's funny how you blame others for waging a useless console war over meaningless differences, but meanwhile you feel legitimized to do the same because the mathematical gap is bigger, though it's almost impossible to spot without a pixel count. The level of hypocrisy in that attitude is really high.
 
Looking at it again, there seems to be up to a 30% resolution advantage for XSX at points; that's quite significant, I think. Isn't this the same engine as AC Valhalla, where PS5 had higher resolution bounds? I remain perplexed.

What's the point of an up-to-30% resolution advantage when the game on XSX tears more and has worse texture filtering? I don't expect the tearing to be "fixed" in this game on XSX, as they've chosen to let the frame tear rather than duplicate a frame.
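The tear-versus-duplicate trade-off mentioned there is a presentation policy: when a frame misses the display's refresh deadline, the engine either swaps mid-scan (a visible tear) or repeats the previous frame (a 33 ms judder). A rough sketch of that decision, not Ubisoft's actual code, with an illustrative frame time:

```python
# Rough sketch of the tear-vs-duplicate choice when a frame misses vblank.
# Not Ubisoft's actual code; the 18.2 ms frame time is illustrative.
VBLANK_MS = 1000 / 60  # ~16.7 ms refresh interval at 60 Hz

def present(frame_ms: float, allow_tearing: bool) -> str:
    if frame_ms <= VBLANK_MS:
        return "present on time"              # frame ready: no penalty
    if allow_tearing:
        return "swap mid-scan (tear line)"    # Legion's choice on XSX, per the post above
    return "repeat last frame (33 ms hitch)"  # classic vsync judder

print(present(18.2, allow_tearing=True))
print(present(18.2, allow_tearing=False))
```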

AF is free now. It hasn't been before, but it is now. 10-year-old graphics cards can do 16xAF with zero notable performance impact.

With this game, we even know that the AF settings are exactly the same between both XSX and PS5, since all the settings are hidden in the PC config file. It's 100% a GDK issue.

It's not free on console. And no, the AF isn't the same; the PC config file shows you the Series X setup as carried over to PC, not the actual AF setting in the game code that runs on the console.
Posting a laughing emoticon doesn't change the reality of the coding 🤷‍♂️

It is not free on console. Otherwise, neither the Xbone, the PS4 nor the XSS would have a problem with AF. And low AF is clearly visible in this game on XSS.

I think taking launch games as the definitive measure of performance has always been misleading, especially when multiple sources told us that the tools for Microsoft's consoles were really behind the competition.

Oh, tooolz excuse.
 
What's the point of an up-to-30% resolution advantage when the game on XSX tears more and has worse texture filtering? I don't expect the tearing to be "fixed" in this game on XSX, as they've chosen to let the frame tear rather than duplicate a frame.





It is not free on console. Otherwise, neither the Xbone, the PS4 nor the XSS would have a problem with AF. And low AF is clearly visible in this game on XSS.



Oh, tooolz excuse.
If the resolution is 30% higher, there is of course room to fix the framerate; it just needs a slightly more aggressive DR window.
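That is how most dynamic resolution (DR) schemes work: the engine watches GPU frame time and nudges the render scale to protect the 16.7 ms budget. A minimal sketch of the idea, with illustrative constants rather than the Disrupt engine's actual controller:

```python
# Minimal dynamic-resolution controller sketch. Constants are illustrative;
# this shows the general idea, not the Disrupt engine's implementation.
TARGET_MS = 1000 / 60            # 16.7 ms budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.6, 1.0  # clamp on the render scale
STEP = 0.05                      # how hard the controller reacts

scale = 1.0
for gpu_ms in [15.1, 17.8, 18.4, 16.9, 14.2]:  # fake per-frame GPU times
    if gpu_ms > TARGET_MS:                      # over budget: drop resolution
        scale = max(MIN_SCALE, scale - STEP)
    elif gpu_ms < TARGET_MS * 0.85:             # comfortably under: raise it
        scale = min(MAX_SCALE, scale + STEP)
    print(f"GPU {gpu_ms:.1f} ms -> render scale {scale:.2f}")
```

"More aggressive" just means a lower MIN_SCALE or a bigger STEP, trading pixels for a locked frame rate.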
 
IIRC, I played the entirety of Gears 5 at near 4K 60fps on an OC'd GTX 970 when the game launched.
 

assurdum

Banned
The funny thing is that if you could run the X1X version on Series X it would force 16x AF. It's obviously a setting in the GDK that is causing this.
You should reread what you said in this post. Of course you can have 16x AF for free on Series X if you use the same graphics setup as the X1X.
And he laughs. Can you use your brain for a second and think about the stupidity of your previous assumption?
 
If the resolution is 30% higher, there is of course room to fix the framerate; it just needs a slightly more aggressive DR window.

You mean up to 30%.

Irrelevant, the AF was the same in the 30fps mode at launch.

It was not. Check the DF comparison again. LOL at Bernd Lauert, who liked your post. So he knows jack shit, like you.

 

assurdum

Banned
Cut the resolution by 10% to get a 0.03% better frame rate. Totally worth it :messenger_winking:
It would surely be worth it for the vsync. At night it's very visible on Series X. Honestly, I don't understand this obsession with pushing more pixels on Series X at all costs when they can't even offer the same AF as the PS5. Why not improve other graphics settings at the same resolution as the PS5 instead? That would be more noticeable. It's really stupid and a waste of resources.
But apparently others are happy with something almost unnoticeable 🤷‍♂️
 
I no longer understand the point of comparison threads if such blatant trolling is allowed @Mod of War ... even when one console is objectively pushing far fewer pixels at the same performance (0.03%, and one also has VRR, which the other console doesn't), letting it slide creates a precedent for future comparisons, and it becomes impossible to understand anything.
I understand that not having the edge on performance can hurt someone's feelings, but these threads then lose their value completely. I don't know how many posts I've read with "another PS5 win" BS trolling in them

I've seen him drop this before.



:messenger_tears_of_joy:
 
40 tflops? We may not even see that in the PS6. We're in diminishing-returns territory with die shrinks and how much performance you can get in a console form factor. Most likely you'll get double the performance with Pro consoles (20-24 tflops), and that should be enough for 4K60 and 120fps at lower resolution.
What the heck are you talking about? The 3090 came out in 2020 and it can already do 36 TFs. AMD doesn't make a chip that big, and architectural differences give different TF results for them, but how in god's name can you think AMD will not have an affordable chip 7 years from now that delivers the 40 TFs we could almost achieve a year ago?

Did you think the behemoth that was the original Titan was the peak of graphics development or something? Because a dirt-cheap RX 480 (similar to what the Xbox One X has) that launched 3 years later outperformed it. There are several big graphics advancements in the pipeline that will likely release way before 2027, and chips containing these advancements will run circles around the bleeding-edge tech we had last year. By the time the PS6 comes, it should blow the crap out of a 3090, just like the PS5 does to the Nvidia Titan (2013).
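Those TF numbers come straight from shader count × 2 FP32 ops per clock (FMA) × clock speed, which anyone can sanity-check:

```python
# FP32 TFLOPS = shader units x 2 ops per clock (FMA) x clock in GHz / 1000.
# Shader counts and boost clocks below are the published specs.
def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

print(f"RTX 3090: {tflops(10496, 1.70):.1f} TF")        # ~35.7 TF
print(f"PS5:      {tflops(2304, 2.23):.1f} TF")         # ~10.3 TF
print(f"40 TF vs PS5: {40 / tflops(2304, 2.23):.1f}x")  # ~3.9x
```

Worth remembering that Ampere counts dual-issue FP32, so its TFs don't compare one-to-one with RDNA 2's; that's the "architectural differences" caveat above.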
 
It would surely be worth it for the vsync. At night it's very visible on Series X. Honestly, I don't understand this obsession with pushing more pixels on Series X when they can't even offer the same AF. It's really stupid.
What do you mean "they can't offer"? It's up to the game developer to fix the AF.

Also talking about Vsync...

[GIF: "What year is it?"]
 
What the heck are you talking about? The 3090 came out in 2020 and it can already do 36 TFs. AMD doesn't make a chip that big, and architectural differences give different TF results for them, but how in god's name can you think AMD will not have an affordable chip 7 years from now that delivers the 40 TFs we could almost achieve a year ago?

Did you think the behemoth that was the original Titan was the peak of graphics development or something? Because a dirt-cheap RX 480 (similar to what the Xbox One X has) that launched 3 years later outperformed it. There are several big graphics advancements in the pipeline that will likely release way before 2027, and chips containing these advancements will run circles around the bleeding-edge tech we had last year. By the time the PS6 comes, it should blow the crap out of a 3090, just like the PS5 does to the Nvidia Titan (2013).
Yep. 40 teraflops is actually lowballing it. It's only a 4x improvement over last gen; that's nothing.
 

BigLee74

Member
If the bigger consoles are dropping down to 1180p, then where do people think the XSS is going to go when it's less than a third as powerful?
At the end of the day there is a fidelity mode if you really care.
The next-gen part is 60fps, and actually the most consistent 60fps; no last-gen console can do that.
I wouldn't waste your time. It's the same folk shitting on the XSS here that do so in every other thread. They know exactly what the XSS is and is not capable of, yet express shock and concern every single time it doesn't match the big boys. They know exactly what they're doing and are just fishing for a rise. You can't reason with these kinds of people; best to just ignore them.

Meanwhile, in the real world, people that own them happily game on.
 


assurdum

Banned
What do you mean "they can't offer"? It's up to the game developer to fix the AF.

Also talking about Vsync...

[GIF: "What year is it?"]
What makes you think AF is fixable without a cost? Personally I don't know, but looking at the recent trend in multiplats, it seems more like a choice by the developers to prioritize resolution over AF or other stuff. For me it's a really stupid choice, especially when you can barely see a boost in IQ. You can do a lot more than just push pixels. Maybe they have no choice. I don't know.
 
What makes you think AF is fixable without a cost? Personally I don't know, but looking at the recent trend in multiplats, it seems more like a choice by the developers to prioritize resolution over AF or other stuff. For me it's a really stupid choice, especially when you can barely see a boost in IQ. You can do a lot more than just push pixels. Maybe they have no choice. I don't know.
Because the devs used the same AF setting as on the PS5. The result isn't the same though, which means it has something to do with the GDK, and the dev couldn't be bothered to fix it manually.
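For anyone wondering how people "know" the settings match: the claim upthread is that the PC build ships readable per-console quality presets in a config file. If you had two such dumps, diffing them is trivial; a minimal sketch, with made-up file names, section and keys (not the actual Watch Dogs Legion files):

```python
# Diff two hypothetical per-platform quality-preset dumps.
# File names, the [Quality] section and its keys are invented for
# illustration; they are NOT the actual Watch Dogs Legion config.
import configparser

def load_quality(path: str) -> dict:
    cfg = configparser.ConfigParser()
    cfg.read(path)
    return dict(cfg["Quality"]) if cfg.has_section("Quality") else {}

ps5 = load_quality("preset_ps5.ini")  # hypothetical dump
xsx = load_quality("preset_xsx.ini")  # hypothetical dump

for key in sorted(set(ps5) | set(xsx)):
    if ps5.get(key) != xsx.get(key):
        print(f"{key}: PS5={ps5.get(key)!r} XSX={xsx.get(key)!r}")
```

Of course, as the reply below points out, a preset file shipped with the PC build doesn't prove the console binary actually honors the same value.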
 

assurdum

Banned
Because the devs used the same AF setting as on the PS5. The result isn't the same though, which means it has something to do with the GDK, and the dev couldn't be bothered to fix it manually.
Oh God. I already told you: the PC config file is not trustworthy here, for practical reasons. The Series X game code is not tied to the PC config file in that respect. It could be outdated. I don't know how to explain it to you any better.
 
Oh God. I already told you: the PC config file is not trustworthy here, for practical reasons. The Series X game code is not tied to the PC config file in that respect. It could be outdated. I don't know how to explain it to you any better.
I'm not saying they're tied. But those are the settings the devs used.
 
I read people saying before: PS5 for the exclusives, Series X for the better multiplatforms. That aged really fast. The PS5 is enough, and anyone can see it now.
You say that, but I count way more 120fps games on Series X and S. It's BC-related of course, but still a thing. Also, Game Pass.
 