
VGTech: Watch Dogs Legion 60fps Mode PS5 vs Xbox Series X|S Frame Rate Comparison

Spinning that I was right? They said it had lower AF in the original comparison, like I said.

You said the AF is the same. Yet DF also said it was not. Yeah, "Xbox fan", a synonym for spinning.

 

Riky

$MSFT
Oh, my misunderstanding then. My bad. Then how the hell did you conclude I was wrong, when I also said a few times that in the launch comparison the AF was also lower on XSX than on PS5?

My point was originally that if you could run the X1X version on Series X it would force 16x AF, so that says to me it's a problem with the GDK.
 
Series consoles force 16x AF on Xbox One games, but you can't play the X1X version on Series X as it just prompts you to upgrade. On PS5 you can play either version.

It can't force 16x AF if the next-gen version isn't 16x AF at all. Hitman 3 on XSX has 8x, while on X1X it's 4x, for example.

Just checked DF's video of boost mode on XSX. No, there is no 16x AF in GTA V, Far Cry 4 or Sniper Elite. The blurriness is clearly visible.
 
Last edited:

Shmunter

Member
My point was originally that if you could run the X1X version on Series X it would force 16x AF, so that says to me it's a problem with the GDK.
The 16x AF on One games is surely due to the ample GPU and bandwidth overhead available on the Series X when executing software designed for half that.

Next-gen games are made to flex the Series X, and like any software, trade-offs are often made for balance.

Apples and oranges.
 
Last edited:

Reindeer

Member
What the heck are you talking about? The 3090 came out in 2020 and it can already do 36 TFs. AMD doesn't make a chip that big, and architectural differences give them different TF results, but how in god's name can you think AMD will not have an affordable chip 7 years from now that delivers the 40 TFs we could almost achieve a year ago?

Did you think the behemoth that was the original Titan was the peak of graphics development or something? Because a dirt-cheap RX 480 (similar to what the Xbox One X has) that launched 3 years later outperformed it. There are several big graphics advancements in the pipeline that will likely release way before 2027, and chips containing these advancements will run circles around the bleeding-edge tech we had last year. By the time the PS6 comes, it should blow the crap out of a 3090, just like the PS5 does to the Nvidia Titan (2013).
Here we go, another person I have to explain to why comparing Ampere (RTX 30 series) teraflops to consoles is a bad idea. Ampere teraflops do not scale 1:1, or anywhere close, against Turing (RTX 20 series) and RDNA2 (what consoles use). There have been videos and articles on how teraflop comparisons between different architectures are irrelevant at this point. All you have to do is look at the 3080 and 6800 XT being identical in performance, yet the 3080 has 10 more teraflops. Another example would be the 3070 and 2080 Ti, which are identical in performance, yet the 3070 shows 7 more teraflops. This is why you have to stick to RDNA2 if you're talking about consoles, and scale and compare teraflops from there.
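For anyone who wants to sanity-check the paper numbers themselves: the quoted teraflop figures are just shader units × 2 ops per clock (FMA) × clock speed. A minimal sketch using the commonly cited public specs (boost/locked clocks as published, not measured performance):

```cpp
#include <cstdio>

// Theoretical FP32 TFLOPS = shader units * 2 ops/clock (FMA) * clock in GHz / 1000.
// Paper figures only; as argued above, they don't compare across architectures.
static double tflops(int shader_units, double clock_ghz) {
    return shader_units * 2.0 * clock_ghz / 1000.0;
}

int main() {
    printf("RTX 3090 (Ampere): %.1f TF\n", tflops(10496, 1.695)); // ~35.6 TF
    printf("PS5 (RDNA2):       %.1f TF\n", tflops(2304,  2.23));  // ~10.3 TF
    printf("XSX (RDNA2):       %.1f TF\n", tflops(3328,  1.825)); // ~12.1 TF
}
```

Ampere's number balloons because each SM can issue FP32 on both of its datapaths, doubling the paper figure without doubling real game throughput, which is exactly why the cross-architecture comparison falls apart.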
 
Last edited:

yamaci17

Member
Here we go, another person I have to explain to why comparing Ampere (RTX 30 series) teraflops to consoles is a dumb idea. Ampere teraflops do not scale 1:1, or anywhere close, against Turing (RTX 20 series) and RDNA2 (what consoles use). There have been videos and articles on how teraflop comparisons between different architectures are irrelevant at this point. All you have to do is look at the 3080 and 6800 XT being identical in performance, yet the 3080 has 10 more teraflops. Another example would be the 3070 and 2080 Ti, which are identical in performance, yet the 3070 shows 7 more teraflops. This is why you have to stick to RDNA2 if you're talking about consoles, and scale and compare teraflops from there.
brilliant marketing by nvidia bro. as you can see, it really works. they will keep blabbering about the 36 tflops forever now

this is not nvidia's first prank tbh. they did this before:

[video comparison: a 4.2 tflops card vs a 2.1 tflops card]

everyone can observe how the card with 2x the flops performs

it's really impossible to convey these facts to people

theoretical maximum fp32 performance does not always translate to better gaming performance.

the same situation can be observed between the vega 64 and the gtx 1080. the vega 64 was a computational beast, with a maximum theoretical fp32 throughput of 12.66 tflops. the gtx 1080, at roughly 8.9 tflops, trounced the vega 64 for the entirety of the generation. only recently has the vega 64 seen a slight win against the 1080, and it's so slight that it's mostly irrelevant
 

yamaci17

Member
I have a dumb question.

What is anisotropic filtering and why should anyone care?

Genuine question by the way.
it makes the textures in the distance look sharper

consoles have a limited bandwidth budget, hence developers used to run 4x AF to save resources
pc gpus have the benefit of bandwidth reserved entirely to the gpu, hence running 16x AF usually seems cost free

people thought that with the beastly GPUs in this generation of consoles every game would force 16x AF, but the problem is, most console gamers still play on a couch at a distance from their TV, so most of them wouldn't notice the difference between 4x and 16x, and some developers will keep using 4x for better budget management

since we're early into the generation, i don't think there should be any budget limitations, but we may never know

some vendors can use their own AF algorithms; it seems like PS5 is doing something similar for some games and provides better AF quality across the board. but honestly, with a TV-couch setup, the 4x vs 16x difference may mostly be irrelevant.

we can observe a similar effect on nvidia and amd gpus:

driver-level AF usually gives better texture filtering compared to in-game filters

assassin's creed odyssey:

ubisoft's default 16x AF tends to look worse than driver-level AF implementations. so xbox needs to refine their AF if they care about it, but so far they've only forced 16x AF on backwards compat games
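To make the "4x vs 16x" numbers concrete: on the PC/Xbox side, AF is just per-sampler state that developers (or the driver) dial up or down. A minimal sketch using the public Direct3D 12 sampler description, which the GDK toolchain also builds on (the helper function name is mine, not from any SDK):

```cpp
#include <d3d12.h>

// Anisotropic filtering is configured per sampler. MaxAnisotropy caps how many
// extra texture taps the GPU may take for surfaces viewed at steep angles;
// the 4x/8x/16x figures in console comparisons map directly to this field.
D3D12_SAMPLER_DESC MakeAnisoSampler(UINT maxAniso /* e.g. 4, 8 or 16 */) {
    D3D12_SAMPLER_DESC desc = {};
    desc.Filter        = D3D12_FILTER_ANISOTROPIC;
    desc.AddressU      = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
    desc.AddressV      = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
    desc.AddressW      = D3D12_TEXTURE_ADDRESS_MODE_WRAP;
    desc.MaxAnisotropy = maxAniso;           // higher = sharper at angles, more bandwidth
    desc.MinLOD        = 0.0f;
    desc.MaxLOD        = D3D12_FLOAT32_MAX;  // allow the full mip chain
    return desc;
}
```

Each step up in MaxAnisotropy costs extra memory bandwidth per textured pixel, which is why a bandwidth-squeezed title might ship at 4x or 8x instead of 16x.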
 
Last edited:

Reindeer

Member
brilliant marketing by nvidia bro. as you can see, it really works. they will keep blabbering about the 36 tflops forever now

this is not nvidia's first prank tbh. they did this before:

[video comparison: a 4.2 tflops card vs a 2.1 tflops card]

everyone can observe how the card with 2x the flops performs

it's really impossible to convey these facts to people

theoretical maximum fp32 performance does not always translate to better gaming performance.

the same situation can be observed between the vega 64 and the gtx 1080. the vega 64 was a computational beast, with a maximum theoretical fp32 throughput of 12.66 tflops. the gtx 1080, at roughly 8.9 tflops, trounced the vega 64 for the entirety of the generation. only recently has the vega 64 seen a slight win against the 1080, and it's so slight that it's mostly irrelevant

It's the third time I've had to explain this to someone in this thread, something I thought wouldn't be an issue on a gaming enthusiast forum. Nvidia really did mess with people's heads with their marketing; you'll now have folks expecting 100 teraflops in the next consoles when the 40 series releases with a crazy amount of teraflops. AMD needs a way to counter-market this when their next GPUs release so that your average Joe is not misinformed. I thought this wouldn't be an issue once people saw that those teraflops don't translate into matching performance between different GPUs, but it seems like a lot of people aren't paying attention to that.
 
Last edited:

TrebleShot

Member
[image: anisotropic filtering comparison]

gives textures in the distance better detail when they are viewed at an angle; 16x is the highest value

it makes the textures in the distance look sharper

consoles have a limited bandwidth budget, hence developers used to run 4x AF to save resources
pc gpus have the benefit of bandwidth reserved entirely to the gpu, hence running 16x AF usually seems cost free

people thought that with the beastly GPUs in this generation of consoles every game would force 16x AF, but the problem is, most console gamers still play on a couch at a distance from their TV, so most of them wouldn't notice the difference between 4x and 16x, and some developers will keep using 4x for better budget management

since we're early into the generation, i don't think there should be any budget limitations, but we may never know

some vendors can use their own AF algorithms; it seems like PS5 is doing something similar for some games and provides better AF quality across the board. but honestly, with a TV-couch setup, the 4x vs 16x difference may mostly be irrelevant.

we can observe a similar effect on nvidia and amd gpus:

driver-level AF usually gives better texture filtering compared to in-game filters

assassin's creed odyssey:

ubisoft's default 16x AF tends to look worse than driver-level AF implementations. so xbox needs to refine their AF if they care about it, but so far they've only forced 16x AF on backwards compat games
Thank you, you learn something new every day.
So the PS5 has better filtering at this point and will give the impression of an overall more detailed and clearer image, even at lower resolutions, as more content will be visible in the background.

Interesting, thanks again lads.
 

mansoor1980

Member
Thank you, you learn something new every day.
So the PS5 has better filtering at this point and will give the impression of an overall more detailed and clearer image, even at lower resolutions, as more content will be visible in the background.

Interesting, thanks again lads.
yes, on the ps5 version distant textures at angles will appear less blurry despite the overall lower resolution, but who knows, maybe an xsx gdk update will improve anisotropy in the future
 
Last edited:

ethomaz

Banned
You can't possibly calculate that accurately knowing that the clock speeds are variable.

The PS5 is not running at peak GPU performance 100% of the time.
Yes, it runs at 2.23 GHz most of the time… only in some heavy workloads like AVX does it drop the clock a bit.
 

Lysandros

Member
Scrawny memory bandwidth on PS4. AF was the first thing on the chopping block.

The XOne ESRAM helped.
The bandwidth situation was clearly better on PS4 overall: 176 GB/s versus 68 GB/s of main RAM bandwidth. The 109 GB/s ESRAM was very tiny at only 32 MB. Xbox One games with better anisotropic filtering are quite few, and some of them were later fixed on PS4. There are actually far more games with better anisotropic filtering on PS4.
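If anyone wants to check those numbers, peak bandwidth is just bus width × effective data rate. A quick sketch using the publicly documented memory specs of both machines:

```cpp
#include <cstdio>

// Peak bandwidth (GB/s) = bus width in bits / 8 * effective data rate in GT/s.
static double bandwidth_gbs(int bus_bits, double data_rate_gtps) {
    return bus_bits / 8.0 * data_rate_gtps;
}

int main() {
    printf("PS4 GDDR5 (256-bit @ 5.5 GT/s):   %.0f GB/s\n", bandwidth_gbs(256, 5.5));   // 176
    printf("XB1 DDR3  (256-bit @ 2.133 GT/s): %.0f GB/s\n", bandwidth_gbs(256, 2.133)); // ~68
    // Xbox One's 32 MB ESRAM added ~109 GB/s on top (more with simultaneous
    // read/write), but only for whatever fits in those 32 MB.
}
```

The ESRAM helps only for data that fits in 32 MB, typically render targets; everything else, textures included, streams from the 68 GB/s DDR3.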
 
Last edited:

BbMajor7th

Member
Man, comparison wars ain't gonna be much to write home about this gen, are they? I suppose file sizes and load times will probably see the biggest deltas, but comparison screens of install sizes don't really scratch the itch...
 
Always the MS victim card. No, people do not dump on the XSX, as it is a great console.
The only pushback you hear is when some people try to push it as God-level HW, console-war around the forum gloating, and misrepresent its competition to make the gap appear as giant as they feel it has to be… only then do people get into the compromises both HW are making and into comparative analysis of each architecture's pros and cons… and months later, seeing how close both consoles are in third-party titles, there is some vindication in that.

It is not as a PS5 owner that I have an issue with the XSS, but as an XSX one: apparently being part of the club does not mean enjoying a console and its games, but agreeing with and praising its business strategy 🤷‍♂️.

I will keep saying it: as a console user who likes to keep the console model going, I think a digital-only XSX with half the storage (512 GB instead of 1 TB) could have sold for $349-399 and would have made the situation better for a lot of gamers. Apparently, going by Riky's analysis, the XSS version is the reason many Xbox Series games are much bigger than their PS5 equivalents, so if we only had the regular XSX and a cheaper, digital-only XSX with half the storage, games would apparently be smaller too. This is on top of making XSX support more complex (you need to target two HW profiles, not one).

Yet another reason the XSS is a solution that looked good to MS (before the PS5 DE was announced), but does not look good to, nor help, XSX users.

Gears 5 Hivebusters is a first-party cross-gen title (not sure why people expect the XSS to fare that much better when the XSX is pushed without having to worry about Xbox One, and much of its GPU is used for resolution-independent processing… but 🤷‍♂️) that targets between 1080p and 1440p at a stable 60 FPS, so ✅, but even there you have graphical settings differences (reflections toned down on anything but the XSX version).
Maybe you haven't been paying attention, but pretty much every bit of positive Xbox news gets greeted with negativity. I get it, Xbox is less popular worldwide, so it is what it is, but let's not pretend it doesn't happen. Look at how Jim Ryan's and Phil Spencer's cross-generational statements were received, even though they were basically doing the same thing.

I simply disagree with your opinion that MS would have been better served by a more expensive XSX without a drive versus the XSS. Price is the main reason. The SOC in the XSS is smaller, cheaper to produce, and has higher yields per wafer than the XSX SOC. If MS had gone with a discless XSX version, there would be even fewer Xbox consoles out there and they'd have an even smaller installed base for this generation. Plenty of people who couldn't find an XSX or PS5 have snagged an inexpensive XSS to tide them over, and that is perfectly fine with MS, seeing how it's a fantastic portal to Game Pass. When they take that chance, they are shocked by the amount of value they find. There are tons of videos on YouTube of happy XSS owners detailing their experiences. I'd trust their opinion over some dude fighting a lame console war who doesn't even have the box.

Your dismissal of Hivebusters is funny, seeing how so many other games on the XSS are held up as if the system is fatally flawed, yet when a game like Hivebusters comes out and runs beautifully, people ignore it. Of course the XSS runs at reduced graphical settings, that's the point! I swear, it's like people have no idea why the device that costs $200 less has lower graphical fidelity. Does the RTX 3060 run games at lower settings than the RTX 3090? It's the same premise with the XSS and XSX.

How is the framerate, seeing as Phil Spencer said that was the most important thing? How is the framerate in Watch Dogs? How is the framerate compared to the X1, the system the XSS replaced? Why make up arbitrary goals the system was never meant to compete with and ignore the basic premise of its existence? I can guess. MS has made plenty of mistakes in the past, and they probably will in the future, but the XSS isn't one of them.
 
Where did you get the info?
Can you please share the source, if you don't mind?
I just thought it was a sensible conclusion. Why release a Pro model three years into the generation only to sell it at a loss, when it's a premium model already aiming at a niche of the main install base to begin with?

There are similar premium models in console ecosystems of years past we can look at which also support the idea that the Pro was sold at least at break-even at launch: the NEC PC Engine CD, SEGA Mega-CD, the Nintendo/Panasonic Q, the Sony PSX (PS2-based), etc.
 
Here we go, another person I have to explain to why comparing Ampere (RTX 30 series) teraflops to consoles is a bad idea. Ampere teraflops do not scale 1:1, or anywhere close, against Turing (RTX 20 series) and RDNA2 (what consoles use). There have been videos and articles on how teraflop comparisons between different architectures are irrelevant at this point. All you have to do is look at the 3080 and 6800 XT being identical in performance, yet the 3080 has 10 more teraflops. Another example would be the 3070 and 2080 Ti, which are identical in performance, yet the 3070 shows 7 more teraflops. This is why you have to stick to RDNA2 if you're talking about consoles, and scale and compare teraflops from there.
You don't have to explain anything, because my original post literally already addressed TF differences between different architectures. I'm not gonna repeat all the words I already typed. Basically, 2020 tech will be ancient in 2027; the PS6 will trump the 3090 and AMD's current flagship. And 40 TFs, AMD TFs or not, will be affordable enough to be in a console. End of story.
 

Reindeer

Member
You don't have to explain anything, because my original post literally already addressed TF differences between different architectures. I'm not gonna repeat all the words I already typed. Basically, 2020 tech will be ancient in 2027; the PS6 will trump the 3090 and AMD's current flagship. And 40 TFs, AMD TFs or not, will be affordable enough to be in a console. End of story.
You say you understand the difference, yet you quoted the 36 teraflops of the 3090 as some kind of comparison point, which makes your whole comment a bit daft. And yet again you bring up the 3090 as some form of benchmark, the same card that is only equivalent to 23-24 teraflops of RDNA2 console GPUs.

As I already mentioned in the post you originally replied to, 20-24 teraflops is what I think PS5 Pro performance will be, basically a 3090 level of performance. It is for this reason your comment makes zero sense, as you are trying to claim that I said 3090-level performance won't be achievable with the PS6, even though I mentioned it should already be reached by the Pro consoles this gen.

It seems like you're very confused about teraflops and how they scale across different architectures, even though you claim otherwise. When I spoke about teraflops and my expectations for the Pro and next-gen consoles, I was speaking in terms of RDNA2 teraflops, because that's what consoles use; you brought up Ampere teraflops, which are far exaggerated and don't give a clear idea of those GPUs' performance relative to the console GPU architecture.
 
Last edited:

assurdum

Banned
I assume your uncle works at Ubisoft or something? :messenger_beaming:
I assume you are being deliberately stupid at this point, because I made 3 posts trying to explain to you how it could work, yet you still persist with stupid, childish argumentation. Believe whatever you want and carry on with your stupid console war 🤷‍♂️
 
Last edited:
What are you playing it on? I thought it looked great on PS5, especially considering it probably has the most advanced and impressive world simulation ever seen in a game. Every NPC is out there living a life unrelated to the actions of your character.
I'm on PS5. I mean, there is barely any AA at all, let alone LOD and draw distance. Valhalla is leagues ahead from a technical perspective. But that's my subjective observation. I am massively disappointed. But I don't mind; I only paid 20€ new, so fuck it.
 
Tools lol, just another excuse...

Am I doing it right?
Here it was a very specific setting not used by the devs, so they were actually not using the API correctly. It was not a general performance problem like on Xbox. The "tools" excuse in the case of the XSX is just a way of saying that their libraries are going to be better optimized (games will run better) in later versions of their SDK (which is always the case on all systems, consoles and PCs).
 
Last edited:

JackMcGunns

Member
Here it was a very specific setting not used by the devs, so they were actually not using the API correctly. It was not a general performance problem like on Xbox. The "tools" excuse in the case of the XSX is just a way of saying that their libraries are going to be better optimized (games will run better) in later versions of their SDK (which is always the case on all systems, consoles and PCs).


And do you have evidence that devs are purposely turning AF down on Xbox Series X because it is underpowered, below Xbox One AF capabilities, and that it's not a similar bug?
 

assurdum

Banned
And do you have evidence that devs are purposely turning AF down on Xbox Series X because it is underpowered, below Xbox One AF capabilities, and that it's not a similar bug?
No one said the Series X is underpowered, eh. Probably the game ran worse with higher AF, so they lowered it. That doesn't mean the impact on performance was dramatic, but it could help eliminate some annoying drops here and there. I'm not sure I understand what you're trying to say about Xbox One AF capabilities.
 
Last edited: