
[VG TECH] Tony Hawk's Pro Skater 1+2 PS5 & Xbox Series X|S Frame Rate Test

SkylineRKR

Member
There is no next-gen console game yet. Demon's Souls could qualify, but it's a near-identical remake of the original, down to the animations. So far the biggest improvements are fast load times and higher framerates without sacrificing resolution and fidelity as much as on last-gen systems. You pretty much get last-gen resolution at 120 Hz now, which is four times (or more) the fluidity of last-gen.

I think Rift Apart simply plays with the faster loading as well. The game combines a short loading screen with an in-game warp, which makes it seem like you change worlds on the fly. This of course makes the game impossible on PS4 Pro, as it would need to load that new world the slow way, defeating the purpose.
 

Concern

Member
You really seem to like the constructive and intellectual defense known as "NO U". I'm not saying it's childish, but even my 8-year-old nephew grew past it; maybe it's time for you to evolve to the level of my nephew.
Maybe it will help you stop complaining in each and every thread.
Now have a great day.

Take your own advice and stop jumping into other people's discussions/arguments or whatever it is. Especially when it doesn't concern you.

Good day
 

azertydu91

Hard to Kill
Take your own advice and stop jumping into other people's discussions/arguments or whatever it is. Especially when it doesn't concern you.

Good day
Can you tell me which of my own advice I should take? Like a quote or something, since you seem to just say random bullshit and complain; that's the whole of your contribution to this forum.
Seriously, maybe change your username to Complain; it would be a much better representation of what you bring here.

About things not concerning me... was the poster you replied to talking to you? Then it doesn't concern you, by your own logic. Or maybe you are way more invested than you are willing to admit, as everyone here has noticed.
It is quite funny how the more prone someone is to console warring, the harder they try to pass as neutral.
Grow up, it's only games and some people are allowed to criticize Xbox. Don't take it personally; you are not the brand you root for.
 

Zoro7

Banned
Can you tell me which of my own advice I should take? Like a quote or something, since you seem to just say random bullshit and complain; that's the whole of your contribution to this forum.
Seriously, maybe change your username to Complain; it would be a much better representation of what you bring here.

About things not concerning me... was the poster you replied to talking to you? Then it doesn't concern you, by your own logic. Or maybe you are way more invested than you are willing to admit, as everyone here has noticed.
It is quite funny how the more prone someone is to console warring, the harder they try to pass as neutral.
Grow up, it's only games and some people are allowed to criticize Xbox. Don't take it personally; you are not the brand you root for.
Just ignore him, dude. One of the biggest trolls on GAF. It ain't worth it.
 

azertydu91

Hard to Kill
Just ignore him, dude. One of the biggest trolls on GAF. It ain't worth it.
It at least shows his hypocrisy to some who might be duped by him. Knowing whom to trust on the internet is important; for example, I wouldn't trust Riky or Abel Empire. And those petty warriors always trying to pass as neutral are the worst, simply because they spread FUD more efficiently than anyone else.

Edit: But yeah, I'll block him. It's not the first time he's just shitposted while trying to pass as the moral compass here, and since he barely ever bothers to read threads and just complains, I don't think I'll miss anything from him.
 
Last edited:

Heisenberg007

Gold Journalism
Yes, it sends 4 GB/s of data every time a new chunk is loaded, which allows for higher detail with shorter loading times. It's still not comparable to UE5, mainly because what we've seen on UE5 couldn't run on an HDD. DeS could, with some tinkering.
To be fair, even UE5 could (and will) run on HDDs. It will on Nintendo Switch, old-gen consoles, and mobile devices.

The key here is the graphical fidelity at which a UE5 game will run on different devices. Could Demon's Souls run on a PS4? Probably, yes. But would it have the same graphical fidelity and high-quality textures? No. That wouldn't be possible even if the game had a 5-minute loading screen at the beginning.

That was the point I was making earlier. Demon's Souls' graphical fidelity is made possible by the PS5's SSD, because the game streams 4 GB/s of data to maintain that fidelity level.
 

yamaci17

Member
lol these threads are so funny to read lmaooo very funny :messenger_tears_of_joy:

I'll chime in with my own point of view on why TFLOPS between PS5 and SX does NOT matter.

Let's go with the GTX 1060 and 1080.

According to theoretical TFLOPS, the GTX 1060 has 4.4 TFLOPS and the GTX 1080 has 8.8 TFLOPS.

Wow, two times, eh? Well, how much of that "TFLOPS" translates to actual performance?

Let me tell you: there are some games where the GTX 1080 only surpasses a GTX 1060 by a freaking 50% margin.

And on average, it beats the 1060 by around 70%.

What does this tell us? Both GPUs are clocked similarly. The 1080 has two times more shaders, two times more SMs (Nvidia's equivalent of CUs), two times more TFLOPS on paper, yet performance does not scale linearly.

Performance will, most of the time, scale close to linearly with core clock, on the other hand.

The math is simple: Series X has 44% more CUs, right?
But PS5 has a 22% faster core clock.

Going by a similar analogy, it's clear that 44% more theoretical CU power may only scale to 25-30% more performance.

Offset that with the 22% faster core clock, and what do you get? Pretty equal systems. Interesting, huh?

The only place where Series X can shine would be scenarios where all the CUs are heavily loaded with a lot of parallelizable work.


This game is brutal.

At 1080p, the difference between the GTX 1060 and 1080 is only 29%. Can you believe that? It's clear that this game did not care about the 1080's "TFLOPS" or its extra "SMs".

At 1440p, the difference becomes a whopping 68%, and at 4K it becomes 73%.

The only scenario where Series X may get the full advantage of its 44% higher CU count would be when the game runs at native 4K, most likely. Even then, it will not scale performance by 44%.

More CUs and TFLOPS are good, but core clock scales better.

In the end, after these conclusions, from my point of view, PS5 and SX are near equal performance-wise at 1440p and below. In 4K 30 fps modes, I would guess PS5 might have to resort to 1800p for the reasons stated above.
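
A quick back-of-envelope sketch of the arithmetic above, purely as an illustration (it assumes the figures quoted in this post, and the 25-30% effective CU scaling is an estimate, not a measured value):

Code:
# Back-of-envelope check of the scaling argument above. All figures are the
# ones quoted in the post; the 25-30% effective CU gain is an assumption,
# not a measured number.

XSX_CU_ADVANTAGE = 0.44     # Series X has ~44% more CUs
PS5_CLOCK_ADVANTAGE = 0.22  # PS5's GPU clock is ~22% higher

# Assumption from the post: only part of the extra-CU advantage turns into
# real performance, while clock gains are treated as roughly linear.
for cu_efficiency in (0.57, 0.68):  # maps the 44% CU gap to ~25-30% real gain
    xsx_relative = 1.0 + XSX_CU_ADVANTAGE * cu_efficiency
    ps5_relative = 1.0 + PS5_CLOCK_ADVANTAGE
    ratio = xsx_relative / ps5_relative
    print(f"CU scaling efficiency {cu_efficiency:.0%}: XSX/PS5 = {ratio:.2f}")

# Prints ratios of roughly 1.03-1.06, i.e. "pretty equal" under these
# assumptions; raw TFLOPS alone (1.44 / 1.22) would suggest about 1.18.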
 
There was a time when someone decided that the XSS would give the PS5 a run for its money. Well, that narrative didn't live up to expectations.

Well, the cheapest PS5 is the DE and it's only $100 more than the XSS. Unless you highly value Game Pass, it's difficult for the XSS to look like good value compared to the PS5 DE. I would have preferred a cheaper discless XSX, for example.

Just my thoughts though.
 

SkylineRKR

Member
Well, the cheapest PS5 is the DE and it's only $100 more than the XSS. Unless you highly value Game Pass, it's difficult for the XSS to look like good value compared to the PS5 DE. I would have preferred a cheaper discless XSX, for example.

Just my thoughts though.

Absolutely, I happen to own both.

I first got my XSS and was happy with it. It was 299, I had Game Pass running, and the console is lightning fast. It was actually cheaper than a PS4 Pro or X1X over here. But then I got my PS5 DE, and since I had paid up front I barely registered that I only paid 100 more for it. It's a fully next-gen system with better graphics, a bigger SSD, just far more value overall. A discless XSX for 399 would've been perfect and also easier for MS and developers alike.
 
lol these threads are so funny to read lmaooo very funny :messenger_tears_of_joy:

I'll chime in with my own point of view on why TFLOPS between PS5 and SX does NOT matter.

Let's go with the GTX 1060 and 1080.

According to theoretical TFLOPS, the GTX 1060 has 4.4 TFLOPS and the GTX 1080 has 8.8 TFLOPS.

Wow, two times, eh? Well, how much of that "TFLOPS" translates to actual performance?

Let me tell you: there are some games where the GTX 1080 only surpasses a GTX 1060 by a freaking 50% margin.

And on average, it beats the 1060 by around 70%.

What does this tell us? Both GPUs are clocked similarly. The 1080 has two times more shaders, two times more SMs (Nvidia's equivalent of CUs), two times more TFLOPS on paper, yet performance does not scale linearly.

Performance will, most of the time, scale close to linearly with core clock, on the other hand.

The math is simple: Series X has 44% more CUs, right?
But PS5 has a 22% faster core clock.

Going by a similar analogy, it's clear that 44% more theoretical CU power may only scale to 25-30% more performance.

Offset that with the 22% faster core clock, and what do you get? Pretty equal systems. Interesting, huh?

The only place where Series X can shine would be scenarios where all the CUs are heavily loaded with a lot of parallelizable work.


This game is brutal.

At 1080p, the difference between the GTX 1060 and 1080 is only 29%. Can you believe that? It's clear that this game did not care about the 1080's "TFLOPS" or its extra "SMs".

At 1440p, the difference becomes a whopping 68%, and at 4K it becomes 73%.

The only scenario where Series X may get the full advantage of its 44% higher CU count would be when the game runs at native 4K, most likely. Even then, it will not scale performance by 44%.

More CUs and TFLOPS are good, but core clock scales better.

In the end, after these conclusions, from my point of view, PS5 and SX are near equal performance-wise at 1440p and below. In 4K 30 fps modes, I would guess PS5 might have to resort to 1800p for the reasons stated above.

Clocks do not scale better at all. I'm not sure where you are getting this. We have years of examples in the PC space. Some architectures benefit more, of course, but it is certainly not linear. In fact, the greatest gains always come from higher CU counts. You have literally got this backwards. If I overclock a 1060, I will not get 1080 levels of performance.
 

Lysandros

Member
A tie? Damn, that's another win for PS5!!! That 18% advantage for Xbox went extinct faster than the dodo. It'll only get worse for Xbox from here on out when the real next-gen games come out.
At a purely technical level PS5 wins this comparison; it's outperforming the XSX in this game. A tie can be argued based on 'perceptual parity', I guess.
 

Cyborg

Member
Why is this thread so hot? I thought the game was almost identical for both platforms. What's the issue now?
 

ethomaz

Banned
Clocks do not scale better at all. I'm not sure where you are getting this. We have years of examples in the PC space. Some architectures benefit more, of course, but it is certainly not linear. In fact, the greatest gains always come from higher CU counts. You have literally got this backwards. If I overclock a 1060, I will not get 1080 levels of performance.
That is not true in the PC space either.

Clock doesn't scale perfectly linearly, but the loss up to the limit of the architecture is very small... when you change the clock of an RDNA card from 1.8 GHz to 1.9 GHz it will basically scale with the increase in clock... but if you try to go to 2.1 GHz it won't scale close to linearly anymore and you will get a disproportionately small increase in performance.

There is an article from a few weeks ago comparing the RDNA 2 cards in all scenarios... the same GPU with different clocks and numbers of CUs.

The result was pretty clear: performance scales better with clocks than with CUs.
That has always been true in the PC world... including CPUs... CPU clock scales performance better than more CPU cores... a 2-core CPU clocked at 4 GHz runs better than a 4-core CPU at 2 GHz.

So he didn't get it backwards... the opposite... it's just that his example is not optimal, because there are several parts of the 1060 that give it bottlenecks compared with the 1080.

In the PC world the greatest gains come from clocks... it's just that clocks started to reach a physical silicon limit, which made the push for more units in the same package become a thing... it doesn't give the same gain as an increase in clock, but when you can't increase the clocks anymore you still get a big gain from more units.

Edit - The recent test results:


But any other test will show the same results with any CPU or GPU, as long as the clock increase stays below the physical limit.
An increase in clock gives better performance than an increase in units.
 
Last edited:

jroc74

Phone reception is more important to me than human rights
Why is this thread so hot? I thought the game was almost identical for both platforms. What's the issue now?
A mistake was made in a tweet about the game, which led to a wild thread. This analysis led to an even wilder thread in reverse.

Basically it went from "PS5 weaksauce, dur hur hur" to "omg what do I do now since that tweet was wrong??"

Some people aren't taking the updated info well.

The PS5 was listed as 1080p and XSX/XSS at 1440p in the tweet.

To save face and immediately deflect from the real issue, those people decided to bring up frame rates, conveniently forgetting that the huge issue was the 1080p resolution listed for PS5.
 
Last edited:
That is not true in the PC space either.

Clock doesn't scale perfectly linearly, but the loss up to the limit of the architecture is very small... when you change the clock of an RDNA card from 1.8 GHz to 1.9 GHz it will basically scale with the increase in clock... but if you try to go to 2.1 GHz it won't scale close to linearly anymore and you will get a disproportionately small increase in performance.

There is an article from a few weeks ago comparing the RDNA 2 cards in all scenarios... the same GPU with different clocks and numbers of CUs.

The result was pretty clear: performance scales better with clocks than with CUs.
That has always been true in the PC world... including CPUs... CPU clock scales performance better than more CPU cores... a 2-core CPU clocked at 4 GHz runs better than a 4-core CPU at 2 GHz.

So he didn't get it backwards... the opposite... it's just that his example is not optimal, because there are several parts of the 1060 that give it bottlenecks compared with the 1080.

In the PC world the greatest gains come from clocks... it's just that clocks started to reach a physical silicon limit, which made the push for more units in the same package become a thing.
So it doesn't scale in a linear fashion? You say yourself that moving from 1.8 GHz up to 2.1 GHz won't scale close to linearly. Higher clocks increase temps and power draw in a way that higher CU counts do not. In the PC world the greatest gains come from increased CU count, period (primarily architecture gains, of course, but we are talking within architectures here). The Nvidia stack is separated by CU, the AMD stack is separated by CU, the Intel stack is separated by cores and the AMD stack is separated by cores. In a typical GPU you are lucky to get a 10% perf boost from increased clocks.

In your reality, should I overclock my 3060 and expect 3090 perf?
 

ethomaz

Banned
So it doesn't scale in a linear fashion? You say yourself that moving from 1.8 GHz up to 2.1 GHz won't scale close to linearly. Higher clocks increase temps and power draw in a way that higher CU counts do not. In the PC world the greatest gains come from increased CU count, period (primarily architecture gains, of course, but we are talking within architectures here). The Nvidia stack is separated by CU, the AMD stack is separated by CU, the Intel stack is separated by cores and the AMD stack is separated by cores. In a typical GPU you are lucky to get a 10% perf boost from increased clocks.

In your reality, should I overclock my 3060 and expect 3090 perf?
Yeap... RDNA from 1.8 GHz to 2.1 GHz will scale badly in performance.
But from 1.8 GHz to 1.9 GHz it will be very close to linear.

But RDNA 2 changed that, and now the cards scale with clock nearly linearly up to 2.5 GHz or more.

It depends what you mean by greatest.
The best gain in PC performance is clock... it gives a better performance/increase curve than more CUs.
But if you mean greatest in terms of overall performance, well, a new core will give more performance than a 500 MHz increase in clock, because the new core effectively doubles the MHz of the chip... so even with the loss due to parallelism you get a big increase in performance.

But clock scales performance more linearly than an increase in units.

So if you have a CPU with 1 core running at 2 GHz... you will get better performance from a CPU with 1 core at 4 GHz than from 2 cores at 2 GHz.
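
The classic way to frame this is Amdahl's law: a clock increase speeds up everything, while extra cores only speed up the part of the work that can run in parallel. A minimal sketch (the parallel fractions below are illustrative assumptions, not measurements):

Code:
# Amdahl's-law style comparison of "2x clock" vs "2x cores" for a workload
# that is only partially parallel. The parallel fractions are illustrative
# assumptions, not measured values.

def speedup_from_cores(parallel_fraction, cores):
    """Amdahl's law: only the parallel part benefits from extra cores."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

def speedup_from_clock(clock_ratio):
    """A clock increase (ideally) speeds up serial and parallel work alike."""
    return clock_ratio

for p in (0.5, 0.8, 0.95):
    cores = speedup_from_cores(p, cores=2)   # 2 cores at the same clock
    clock = speedup_from_clock(2.0)          # 1 core at double the clock
    print(f"parallel fraction {p:.0%}: 2 cores -> {cores:.2f}x, "
          f"2x clock -> {clock:.2f}x")

# With 50% parallel code, 2 cores give only ~1.33x while 2x clock gives 2x;
# the gap narrows as the workload becomes more parallel (GPU workloads sit
# near the fully parallel end, which is why the real picture is messier).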
 
Last edited:
Yeap... RDNA from 1.8 GHz to 2.1 GHz will scale badly in performance.
But from 1.8 GHz to 1.9 GHz it will be very close to linear.

But RDNA 2 changed that, and now the cards scale with clock nearly linearly up to 2.5 GHz or more.

Again, the greatest gain in PC performance is clock... it gives a better performance/increase curve than more CUs.
That's not true and never has been. It's oversimplifying and ignoring many other variables.
 

ethomaz

Banned
That's not true and never has been. It's oversimplifying and ignoring many other variables.
That is true and has been proven dozens of times lol
Even today... I linked an extended article about it.

You even have that explained in Computer Science courses.

There are some myths created because silicon tech reached a limit in clocks... the closer you get to the limit, the more disproportionate the scaling curve gets, so overclockers basically work with small gains at higher frequencies because they have already crossed the physical clock limit.

The increase in performance versus increase in clock will be very linear as long as you are below the physical limitations of the chip.
 
Last edited:

Heisenberg007

Gold Journalism
I remember Bluepoint told John from Digital Foundry something like that. I don't think they were lying when they did.
Exactly, and they just shared that information casually; no reason to lie.

But it was an important insight, i.e., 4 GB/s of raw data is a lot, and that's where the PS5's SSD comes into play. With further optimization, Kraken + Oodle compression techniques, and by pushing the limits of the 5.5 GB/s ceiling, we will have even more beautiful-looking games on PS5 in the future.
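
For a rough sense of scale, here is a minimal sketch of what that bandwidth means per frame (5.5 GB/s is the quoted raw figure; the compression ratio is an illustrative assumption, not an official number):

Code:
# Rough sense of scale for streaming budgets on the PS5 SSD. 5.5 GB/s is the
# quoted raw throughput; the compression ratio is an illustrative assumption,
# not an official figure.

RAW_THROUGHPUT_GB_S = 5.5   # quoted raw SSD bandwidth
ASSUMED_COMPRESSION = 1.6   # hypothetical average Kraken/Oodle ratio

effective_gb_s = RAW_THROUGHPUT_GB_S * ASSUMED_COMPRESSION
for fps in (30, 60, 120):
    per_frame_mb = effective_gb_s * 1000 / fps
    print(f"{fps} fps: ~{per_frame_mb:.0f} MB of uncompressed data per frame")

# At 60 fps that is on the order of ~150 MB of asset data per rendered frame
# that could, in principle, be streamed in.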
 
That is true and has been proven dozens of times lol
Even today... I linked an extended article about it.

You even have that explained in Computer Science courses.

There are some myths created because silicon tech reached a limit in clocks... the closer you get to the limit, the more disproportionate the scaling curve gets, so overclockers basically work with small gains at higher frequencies because they have already crossed the physical clock limit.

The increase in performance versus increase in clock will be very linear as long as you are below the physical limitations of the chip.
I'm arguing clocks vs. count. To say clock is better is not true and never has been. It has always been determined by hundreds of variables.
 

ethomaz

Banned
I'm arguing clocks vs. count. To say clock is better is not true and never has been. It has always been determined by hundreds of variables.
It was and always has been better.

Unit counts started to become a thing after processors reached the physical clock limit of the architecture.
An increase in units gives a more disproportionate performance curve than an increase in clock.

The clock vs. performance curve is very linear until you reach the physical limit of the architecture.

That is where overclocking lives and where the false claims come from, because people don't understand what is happening.

Now of course, if you can't increase the clock, why not increase the unit count? There is no harm in more performance... it's just that the increase in performance is not close to linear like the clock gives, and it depends a lot on the parallelism coded at the software level to get good performance scaling... a clock increase doesn't rely on software.

BTW, recent tests.


But if you want, you can find tests on Google going back to archaic 486 or Pentium processors.
There was a lot of discussion when the first dual-core processors started to be released.
 
Last edited:

Lysandros

Member
It still boggles my mind how people can draw any conclusions about the performance of the Series X or PS5 during their launch window, in the middle of a pandemic.
Why shouldn't people draw 'any' conclusions? Sure, the conditions are far from ideal with the pandemic and the early cross-gen period, but the situation is the same for both parties. Now, I would consider a 'definitive conclusion' about the whole picture a mistake, but I think we have a sufficient sample size to form an early idea of the comparative performance of these systems.
 
Last edited:
It was and always has been better.

Unit counts started to become a thing after processors reached the physical clock limit of the architectures.
An increase in units gives a more disproportionate performance curve than an increase in clock.
I may be showing my age, but these arguments were all played out during the GHz wars. The power efficiency curve exists for a reason. When discussing parts at or near the top of their respective performance/efficiency curves, the performance increase from pushing them further doesn't scale in a linear fashion. Look at the Pentium 4's NetBurst. Architecture improvements or the addition of CUs/cores and logic is the only way to scale.
 

jroc74

Phone reception is more important to me than human rights
Why shouldn't people draw 'any' conclusions? Sure, the conditions are far from ideal with the pandemic and the early cross-gen period, but the situation is the same for both parties. Now, I would consider a 'definitive conclusion' about the whole picture a mistake, but I think we have a sufficient sample size to form an early idea of the comparative performance of these systems.
Thank you.
 

ethomaz

Banned
I may be showing my age, but these arguments were all played out during the GHz wars. The power efficiency curve exists for a reason. When discussing parts at or near the top of their respective performance/efficiency curves, the performance increase from pushing them further doesn't scale in a linear fashion. Look at the Pentium 4's NetBurst. Architecture improvements or the addition of CUs/cores and logic is the only way to scale.
Because years ago we reached a silicon limitation for clocks.

BTW, Pentium 4 / NetBurst is a good example of a different architecture... Intel had reached a clock limit with the Pentium III, so with NetBurst they chose to decrease the number of operations performed per clock cycle to break that limit and increase clocks.
What happened? Well, with fewer things being done per clock cycle, NetBurst failed... the Pentium III was faster than the Pentium 4 at the same clock.

But that doesn't change the fact that increasing the clock of these processors gives a better performance curve than increasing the core count, and that is true for both the Pentium III and the Pentium 4.

If there were no physical clock limitation, a 4 GHz Pentium III/4 would perform better than a dual-core 2 GHz Pentium III/4.
We know it isn't even possible to reach 4 GHz with a Pentium III, and even for the Pentium 4 that would already be past the point where clock scales close to linearly.
 
Last edited:

ethomaz

Banned
Just to add to the discussion.

This is an old benchmark with the 8800 GTX (since I showed a recent benchmark, I should also show an old one, to show that things didn't change).

Look at the scaling... see how the clock scales linearly with performance until around 1300 MHz, and after that things start to get weird? That is when it crosses the limit of the architecture... you can see the clock showing weird variation: no scaling, far from linear, a bit close to linear... once the clock crosses that limit it becomes unstable.

That is exactly why some people say clocks don't scale linearly, without having any idea what they're talking about.

oblivionshader575.png


A CU count increase is nowhere near as linear as the clock increase is up to 1300 MHz.
Not in the past, not today, and not in the future.

Clock wins for performance increase, but it has a limit to how far it can go... so increasing CU counts becomes the strategy when you reach that clock limit.

That said, neither Series X nor PS5 is near the limit where clock vs. performance starts to become unstable... RDNA 2 raised that ceiling to over 2.5 GHz.
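
For anyone who wants to eyeball this on their own numbers, here is a small helper that checks how linearly performance follows clock; the (MHz, fps) pairs are placeholders for illustration, not readings from the chart above:

Code:
# Small helper to check how linearly performance follows clock speed.
# The (MHz, fps) pairs below are placeholder values for illustration only,
# not readings from the benchmark chart above.

def scaling_efficiency(results):
    """For each step, compare the % fps gain to the % clock gain.
    1.0 means perfectly linear scaling; lower means diminishing returns."""
    base_clock, base_fps = results[0]
    for clock, fps in results[1:]:
        clock_gain = clock / base_clock - 1.0
        fps_gain = fps / base_fps - 1.0
        print(f"{base_clock} -> {clock} MHz: "
              f"efficiency = {fps_gain / clock_gain:.2f}")
        base_clock, base_fps = clock, fps

# Hypothetical run: near-linear at first, falling off past some limit.
scaling_efficiency([(1100, 50.0), (1200, 54.3), (1300, 58.6), (1400, 60.1)])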
 
Last edited:
Because years ago we reached a silicon limitation for clocks.

BTW, Pentium 4 / NetBurst is a good example of a different architecture... Intel had reached a clock limit with the Pentium III, so with NetBurst they chose to decrease the number of operations performed per clock cycle to break that limit and increase clocks.
What happened? Well, with fewer things being done per clock cycle, NetBurst failed... the Pentium III was faster than the Pentium 4 at the same clock.

But that doesn't change the fact that increasing the clock of these processors gives a better performance curve than increasing the core count, and that is true for both the Pentium III and the Pentium 4.

If there were no physical clock limitation, a 4 GHz Pentium III/4 would perform better than a dual-core 2 GHz Pentium III/4.
We know it isn't even possible to reach 4 GHz with a Pentium III, and even for the Pentium 4 that would already be past the point where clock scales close to linearly.
I think we are talking past each other. I am talking about parts at or near the top of their respective power efficiency curves and pushing beyond that. You are talking about running something at 2 GHz with 4 cores versus 4 GHz with 2 and seeing better perf with the latter. Pentium 4 failed due to non-linear returns from clock speed increases. Process technology couldn't keep up with designs being pushed past their nominal performance efficiency, causing them to be hot and power hungry.
 

ethomaz

Banned
I think we are talking past each other. I am talking about parts at or near the top of their respective power efficiency curves and pushing beyond that. You are talking about running something at 2 GHz with 4 cores versus 4 GHz with 2 and seeing better perf with the latter. Pentium 4 failed due to non-linear returns from clock speed increases. Process technology couldn't keep up with designs being pushed past their nominal performance efficiency, causing them to be hot and power hungry.
Yeap, you have limits on clock increases... cross that and you get unstable performance scaling.
Increasing the unit count, even though it's not as linear as a clock increase, still gives a lot of performance.

But my point is that a clock increase gives a more linear performance increase than a core count increase... that was true in the past and is still true today.
 
Last edited:

dcmk7

Banned
I kept being told framerate was king? So Xbox wins again then.

So framerate is more important now?

Because resolution was more important to you very recently.

Higher resolution and improved texture filtering, just better again.

The actual lowest point on PS5 is 54 fps as per the VGTech data, but with no VRR to compensate and at a lower resolution.
So yeah, the Xbox version is the best way to play.

Yep, higher resolution and smoothest performance, another one.

The above are only from recently; I didn't care to dig any deeper. I'm sure there are some hilarious ones from around the launch period.
Back then you were having regular meltdowns.

But it seems like the moving of the goalposts in these comparison threads isn't solely down to Sony fanboys.

Stuart360, you have strong opinions on this. Agree?
 
Last edited:

Stuart360

Member
So framerate is more important now?

Because resolution was more important to you very recently.







The above are only from recently; I didn't care to dig any deeper. I'm sure there are some hilarious ones from around the launch period.
Back then you were having regular meltdowns.

But it seems like the moving of the goalposts in these comparison threads isn't solely down to Sony fanboys.

Stuart360, you have strong opinions on this. Agree?
Strong opinions on what? Riky doing an obviously sarcastic post about the goalpost shifting in this very thread? I did the same kind of post (as banter), even though I already said PS5 wins this one due to the higher overall res, while the XSX framerate advantage will be almost impossible to notice.
Where were you lot for the last few comparisons anyway? Didn't hear a peep from a lot of you until this one. Will it be the same again with the Kingdom Hearts 3 comparison, which is already in the Next Gen Troll thread, and has XSX at a higher resolution with miles better AF?
 

Riky

$MSFT
So framerate is more important now?

Because resolution was more important to you very recently.







The above are only from recently; I didn't care to dig any deeper. I'm sure there are some hilarious ones from around the launch period.
Back then you were having regular meltdowns.

But it seems like the moving of the goalposts in these comparison threads isn't solely down to Sony fanboys.

Stuart360, you have strong opinions on this. Agree?

Just repeating what has been said over and over as the resolution gap widened. Hitman 3 having that huge constant advantage made people start clinging to minuscule framerate drops as their machine inevitably fell behind. So they must be disappointed now that they are dealing with inferior framerates.
 

Heisenberg007

Gold Journalism
Just repeating what has been said over and over as the resolution gap widened. Hitman 3 having that huge constant advantage made people start clinging to minuscule framerate drops as their machine inevitably fell behind. So they must be disappointed now that they are dealing with inferior framerates.
Both consoles are hitting 60 FPS 100% of the time and 120 FPS 99% of the time. There is no difference in frame rates. But there is a difference in resolution.
 
Last edited:

Stuart360

Member
Let him say what he wants... this gen is not going well for them.
Actually the last few comparisons have all been XSX, and whoever makes the Kingdom Hearts 3 comparison thread will have 'another one' for XSX.
We went from PS5 edging out almost all the launch games, to XSX getting a couple of ties in the next mini wave of games, to XSX getting more ties, and some wins, in the wave after that. What's it going to be like a year from now? Two years?
 

DJ12

Member
Actually the last few comparisons have all been XSX, and whoever makes the Kingdom Hearts 3 comparison thread will have 'another one' for XSX.
We went from PS5 edging out almost all the launch games, to XSX getting a couple of ties in the next mini wave of games, to XSX getting more ties, and some wins, in the wave after that. What's it going to be like a year from now? Two years?
And PS5 fans freely admit that, for the most part, playing last-gen games is better on Series X. When are you going to admit playing current-gen games is better on PS5?
 

Heisenberg007

Gold Journalism
Actually the last few comparisons have all been XSX, and whoever makes the Kingdom Hearts 3 comparison thread will have 'another one' for XSX.
We went from PS5 edging out almost all the launch games, to XSX getting a couple of ties in the next mini wave of games, to XSX getting more ties, and some wins, in the wave after that. What's it going to be like a year from now? Two years?
What about the KH3 comparison? (I'm not sure we have a next-gen version yet.)

But VGTech did a comparison in BC mode a few months ago. It's pretty much a locked 60 on XSX and PS5, with PS5 locked at a slightly lower resolution (because of BC mode and PS4 Pro settings). But the difference is pretty nominal: 1300p vs. 1400p.
 

SLB1904

Banned
And PS5 fans freely admit that, for the most part, playing last-gen games is better on Series X. When are you going to admit playing current-gen games is better on PS5?
Daddy Phil is making the tools. You just wait. Shit is boring at this point.
 

Stuart360

Member
What about the KH3 comparison? (I'm not sure we have a next-gen version yet.)

But VGTech did a comparison in BC mode a few months ago. It's pretty much a locked 60 on XSX and PS5, with PS5 locked at a slightly lower resolution (because of BC mode and PS4 Pro settings). But the difference is pretty nominal: 1300p vs. 1400p.
There is a new comparison vid in the troll thread. Higher res, miles better AF on XSX.
The problem is that a lot of you are strangely absent from the comparison threads for the games XSX wins. I suppose out of sight, out of mind.
 

Stuart360

Member
Daddy Phil is making the tools. You just wait. Shit is boring at this point.
Yes, it is boring, and it was Sony fans who started the whole 'XSX has bad tools' stuff in the troll thread, and it's them who keep bringing it up. I have barely seen an Xbox fan on here even mention tools, really.
 

Heisenberg007

Gold Journalism
There is a new comparison vid in the troll thread. Higher res, miles better AF on XSX.
The problem is that a lot of you are strangely absent from the comparison threads for the games XSX wins. I suppose out of sight, out of mind.
I haven't even seen that thread, and I checked the main Gaming Forum page twice at different times.

Either way, unless it's a proper next-gen version, what does it tell us about XSX's or PS5's power? We know that in BC mode PS5 just inherits PS4 Pro settings and Xbox Series X just inherits Xbox One X settings. BC comparisons just remind us that the One X was more powerful than the PS4 Pro. They don't tell us anything else.
 