
PS5 and Xbox Series X GPU TFLOPs Difference “Probably” Won’t Matter in the Long Run – Hellpoint Developer

I did notice that, if we check both the 5700 vs 5700 XT and the 2070 Super vs 2080 Super, the latter pair is the closer match to the next-gen consoles taking into account CU count and frequency, even though the next-gen consoles are RDNA 2.0 GPUs.
In terms of actual rendering results and capability, the 2070 Super to the 2080 Super would be a pretty apt comparison of where these GPUs will fall.

PS5 = 2070 Super
Series X = 2080 Super
 

Romulus

Member
You can't make an argument, our interaction ends here.

If you extrapolate the rendering load being put on the hardware? Yes, it feasibly could.

You're absolutely missing the point.

Then why even respond at all if I'm "absolutely" missing your point? Explain it, because you posting images of XB1 games doesn't add up to what we're talking about.
 
Then why even respond at all if I'm "absolutely" missing your point? Explain it, because you posting images of XB1 games doesn't add up to what we're talking about.
The point is to highlight the surplus in compute, to contextualize it so it's easier to understand than just this faceless 18% - XX% number. That with the available compute afforded to this GPU it could feasibly handle the entire workload of rendering out an Xbox One game like Forza Horizon 4 and a PlayStation 4 game like God of War at the same time, just with the extra power afforded to it over the PlayStation 5.

It puts things into perspective.
 
I see, well the question now is what price MS is going for, because if they can't match Sony then they won't have an advantage at all.
Well that's the kicker and the silver lining to the whole thing. The PlayStation 5 probably has a similar BoM if not even more expensive. I think Microsoft did their homework extremely effectively, they knew their storage solution would be more than ample to handle anything that could ever possibly be required of it and with this cost saving they were able to divert those funds into the rest of the compute engineering.

Sony's SSD is without a doubt the most expensive aspect in their system, and given the amount of money poured into that the rest of the system suffered computationally. Their cost offset is likely proportional, the extra money they put into that SSD is the money Microsoft put into their GPU, CPU and memory subsystem.

They can no doubt target exactly the same price, and actually, given Microsoft's financial footprint, they could possibly even go lower. Mike Ybarra put it best, just so you can understand the difference between these companies.

This was directly related to price matching for competition: basically Microsoft could lose money on this and it would be the equivalent of a rounding error, while for Sony it would actually have a real financial impact.

 

Redlight

Member
Yes it was. IGN round-table with the "you can't see a difference past 720p anyhow" and all the devs doing "parity, because of console wars and stuff" early on. Tons of sites shilling for the always-online DRM, etc.

What is this revisionism?
So, you found the downplaying of any power difference back then appalling? Surely you want to make sure that it never happens again?
 
D

Deleted member 775630

Unconfirmed Member
I was thinking of using machine learning to constantly improve AI within a game in response to how a player plays it. Obviously graphics are what sells games and casual players don't really want big challenges....
Better to offload that to the cloud, the learning part at least. Also not sure if playing versus one opponent is enough to learn. It's better to accumulate everyone's playstyle and train on that, but again in the cloud. So you don't necessarily need dedicated hardware for this on the system itself.
 
RTX 2070 Super - 9.062 TF
RTX 2080 Super - 11.15 TF

Difference is 2.088 TF



It's not a sizable difference.

when the gap is 2TF or even 1.8TF.

You can compare an RX 5700 vs RX 5700 XT, which is a 1.8 TF difference. Still, there's no sizable difference.




This is just about TF performance alone, and I just don't see why people think a 2TF difference between these two cards is a huge difference. It's only 18%, and like others have said, the PS4/XB1 had around a 40% difference.

You're getting a 10-12 FPS difference compared to the higher card. PS5 will have to lower the resolution *slightly* if it can't hit the 60fps or 30fps target while rendering at the highest resolution.

I see 20 FPS from 2070 Super to 2080 Super, I might be blind though
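For reference, the back-of-the-envelope math behind the 18% figure being thrown around, using the public console specs and the standard AMD formula (TFLOPs = CUs × 64 shaders per CU × 2 ops per clock × clock speed):

```python
# Back-of-the-envelope compute figures from the public console specs.
def tflops(cus: int, clock_ghz: float) -> float:
    # CUs * 64 shaders per CU * 2 ops per clock * clock in GHz, in TFLOPs
    return cus * 64 * 2 * clock_ghz / 1000

ps5 = tflops(36, 2.23)    # PS5: 36 CUs at up to 2.23 GHz (variable clock)
xsx = tflops(52, 1.825)   # Series X: 52 CUs at a fixed 1.825 GHz

gap = (xsx - ps5) / ps5 * 100
print(f"PS5 ~{ps5:.2f} TF, XSX ~{xsx:.2f} TF, gap ~{gap:.0f}%")
```

That lands at roughly 10.28 vs 12.15 TF, an ~18% gap relative to PS5, which is where the thread's 18% number comes from.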
 

Romulus

Member
I'm not talking about resolution. PS5 won't be nearly as capable as XSX when it comes to Raytracing.

But how do you know how discernible the differences will be onscreen to the human eye? It could be 30%+ more capable and not show much difference onscreen, if at all.
 

Romulus

Member
The point is to highlight the surplus in compute, to contextualize it so it's easier to understand than just this faceless 18% - XX% number. That with the available compute afforded to this GPU it could feasibly handle the entire workload of rendering out an Xbox One game like Forza Horizon 4 and a PlayStation 4 game like God of War at the same time, just with the extra power afforded to it over the PlayStation 5.

It puts things into perspective.

If you think so, I just don't see it translating onscreen, the same way PS4 games barely translated over Xbox One. The gap between the PS4 and Xbox One could have likely been enough to power Killzone 2, but whatever.
 
But how do you know how discernible the differences will be onscreen to the human eye? It could be 30%+ more capable and not show much difference onscreen, if at all.
Why wouldn't there be a discernible difference on screen? RT is configured just like any other part of the rendering pipeline, disable and enable, varying levels of quality etc.
 
It will be. RT scales with resolution.
This doesn't make any sense because the Series X could target a higher resolution and still push as much RT into a scene, depending on the resolution even more, or they could target the same resolution and push even more than that.

There's no way around this computational gulf. I really don't understand what the end game here is for posts like this, the system has higher raster capability and more RT hardware. How do you guys come to these logically incongruent conclusions?
 

Romulus

Member
Why wouldn't there be a discernible difference on screen? RT is configured just like any other part of the rendering pipeline, disable and enable, varying levels of quality etc.


Exactly my point. If you have high vs very high shadows, most people can't even tell unless it's side by side, and even then it's not much.
We haven't seen both side by side and little is known about Sony's. I just don't know how MS will have enough to set them apart once the entire image is constructed. I think if Sony even has something even remotely close, you won't be able to tell with zoom lenses from DF.
 
Exactly my point. If you have high vs very high shadows, most people can't even tell unless it's side by side, and even then it's not much.
We haven't seen both side by side and little is known about Sony's. I just don't know how MS will have enough to set them apart once the entire image is constructed. I think if Sony even has something even remotely close, you won't be able to tell with zoom lenses from DF.
Because they can construct the same image and still have a 20% surplus of raster compute to throw into graphical complexity, and they still have 44% more RT hardware on die.

None of this makes sense.
 

darkinstinct

...lacks reading comprehension.
Why is EVERYONE in the whole INDUSTRY - every dev, every journalist - trying HARD AS FUCK to downplay the huge power difference and relevance of power? Like, suddenly power doesn't matter?! wtf is going on? Right before the PS5 GDC talk the narrative was completely different. What is going on? This is crazy.

Wasn't even remotely like this during the Xbox One reveal. lol
Damage control. Xbox devs are still under strict NDA, PS devs as well, but Sony lifts restrictions seemingly for a couple of devs (notably the Quantum Error dev and the Hellpoint dev), possibly for those working on PS exclusives because they know those won't try to talk up XSX. Started in December with Randy Pitchford going crazy on Phil Spencer because he didn't believe XSX was powerful - and they are publishing that PS5 launch title/exclusive with raytracing. Most people in the industry don't have an agenda. They need all platforms to sell their games; being a console warrior hurts your bottom line.

Teraflops didn't magically become irrelevant. But hey, you know that every game you play on Xbox One (2013) is output at 1080p? And Crytek talked up Xbox One last Gen because they had an exclusive launch title. The roles are just completely reversed.
 

Romulus

Member
Because they can construct the same image and still have a 20% surplus of raster compute to throw into graphical complexity, and they still have 44% more RT hardware on die.

None of this makes sense.

Dynamic resolution. It'll hold the same settings and drop to a reconstructed image or lower resolution. Mark my words, you'll need DF to know the differences this gen, doing a 300% zoom hoping for a hint of an XSX advantage.
 
Dynamic resolution. It'll hold the same settings and drop to a reconstructed image or lower resolution. Mark my words, you'll need DF to know the differences this gen, doing a 300% zoom hoping for a hint of an XSX advantage.
Yeah I don't think so, because any of these methods which can be deployed on the PS5 can be applied all the same on Series X and still push things further.

You're greatly underselling all of this.
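For anyone unfamiliar, the dynamic resolution being argued about here is typically just a feedback loop on frame time. A minimal sketch of the idea (the thresholds, step size, and function name are illustrative, not any real engine's values):

```python
TARGET_MS = 16.7               # frame budget for 60 fps
MIN_SCALE, MAX_SCALE = 0.7, 1.0

def adjust_render_scale(scale: float, last_frame_ms: float) -> float:
    """Nudge the resolution scale toward whatever keeps the frame on budget."""
    if last_frame_ms > TARGET_MS:            # over budget: drop resolution
        scale -= 0.05
    elif last_frame_ms < TARGET_MS * 0.9:    # comfortably under: raise it
        scale += 0.05
    return max(MIN_SCALE, min(MAX_SCALE, scale))

# A GPU with ~18% less headroom simply settles at a slightly lower scale
# for the same scene; the quality settings themselves never change.
scale = adjust_render_scale(1.0, 19.0)   # slow frame, scale drops toward 0.95
```

Which is exactly why both sides of this argument can be partly right: the same loop runs on both boxes, it just converges at a different resolution.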
 
This doesn't make any sense because the Series X could target a higher resolution and still push as much RT into a scene, depending on the resolution even more, or they could target the same resolution and push even more than that.

There's no way around this computational gulf. I really don't understand what the end game here is for posts like this, the system has higher raster capability and more RT hardware. How do you guys come to these logically incongruent conclusions?

It makes perfect sense. RT applies to all of the pixels that need to be processed. So yes, in theory we are looking at a worst-case 15% reduction in resolution by PS5. The "ray tracing" per resolution will be roughly the same.
 

The gains are not that huge when comparing 2080 S vs 2070 S.
VRS and DirectML don't exist to target 4K directly; there's a reason there's such an investment in these kinds of technologies. They can come in at a lower resolution with perceptibly the same results, in the middle ground for a perfect balance of fidelity and performance.

Guess what happens with a 2070 Super and 2080 Super in the middle ground? The 2080 Super walks all over the 2070 Super by a wide margin.

It makes perfect sense. RT applies to all of the pixels that need to be processed. So yes, in theory we are looking at a worst-case 15% reduction in resolution by PS5. The "ray tracing" per resolution will be roughly the same.
And if it's not scaled the RT will be considerably disproportionate.

You guys can stop, there's no way to overcome this gap in rendering capability.
 

darkinstinct

...lacks reading comprehension.
I see, well the question now is what price MS is going for, because if they can't match Sony then they won't have an advantage at all.
Xbox One cost more to build than PS4. PS5 will cost more to build than XSX.
 

clem84

Gold Member
I predict that the best looking games on PS4 and XB1 will still look pretty respectable compared to next gen games. So do I think the TF difference between PS5 and XSX will matter? IMO it will be inconsequential.

At first I thought maybe multiplats could perform better on XSX. In that case the power difference would very much matter. But the thing is, there's no reason for that to happen. The only thing PS5 devs have to do is turn off a few effects, bump down the resolution a little bit and the PS5 game would perform the same, and the difference, when you're actually playing the game and not going over every frame with a magnifying glass, would most likely go unnoticed.

The PS5's SSD could make a difference though. I'm actually really looking forward to seeing if it's as significant as Sony claims it is. Really looking forward to the release of both systems.
 
VRS and DirectML don't exist to target 4K directly; there's a reason there's such an investment in these kinds of technologies. They can come in at a lower resolution with perceptibly the same results, in the middle ground for a perfect balance of fidelity and performance.

Guess what happens with a 2070 Super and 2080 Super in the middle ground? The 2080 Super walks all over the 2070 Super by a wide margin.

And if it's not scaled the RT will be considerably disproportionate.

You guys can stop, there's no way to overcome this gap in rendering capability.

"And if it's not scaled"... if it's not scaled and the two have the exact same resolution, then again, PS5 would have about a 15% reduction in ray tracing? Not a big deal.

Certainly nothing compared to the 120% massive, staggering, most powerfulest SSD gulf that has ever existed on consoles before. :)
 

Ar¢tos

Member
Better to offload that to the cloud, the learning part at least. Also not sure if playing versus one opponent is enough to learn. It's better to accumulate everyone's playstyle and train on that, but again in the cloud. So you don't necessarily need dedicated hardware for this on the system itself.
But then it's not as relevant. The game has to evolve towards your playstyle, not towards the average playstyle of a group.
When you encounter more than one enemy they usually act like they don't even see each other, attacking at the same time, or the non-attacking ones patiently await their turn. Learning AI could teach enemies to cooperate against your specific playstyle. Also, you often encounter enemies of different types together or in the same area. Games rarely turn enemies against each other, and when they do there is no pattern or logic behind it. Enemies could make a temporary partnership with the player (or each other) in these fights, and eventually all enemies of that species/type could become allies.
All this could be done in script, but it would evolve more naturally if it was done via learning AI (instead of "side with that one 3 times = instant allies").
 
"And if it's not scaled"... if it's not scaled and the two have the exact same resolution, then again, PS5 would have about a 15% reduction in ray tracing? Not a big deal.

Certainly nothing compared to the 120% massive, staggering, most powerfulest SSD gulf that has ever existed on consoles before. :)
Bringing up the SSD isn't going to do you any favors here, that shit is a joke. Secondly I'm not sure where you're getting 15% because there's a 44% deficit in physical RT hardware.
 
Bringing up the SSD isn't going to do you any favors here, that shit is a joke. Secondly I'm not sure where you're getting 15% because there's a 44% deficit in physical RT hardware.

Yup. It's such a joke that your boy Tom Warren says that the SSD will be the biggest game changer this gen :)

A 44% deficit in hardware when that hardware is running at over 20% lower frequency... so more like a 15% reduction in actual processing power. Not a big deal. If Microsoft wanted to use that 44% more hardware to create an actual 44% advantage they'd need to clock it a lot, lot higher, but they do not have the thermal budget to do so.
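For what it's worth, the ~15% figure falls out only if you assume RT throughput scales linearly with CU count × clock speed, which is a big assumption; nobody outside AMD has confirmed how the intersection hardware actually scales with frequency:

```python
# Assumption: RT throughput ~ CU count * clock in GHz. Unverified for RDNA 2.
xsx_rt = 52 * 1.825    # ~94.9 "CU-GHz"
ps5_rt = 36 * 2.23     # ~80.3 "CU-GHz"

deficit = (1 - ps5_rt / xsx_rt) * 100
print(f"PS5 deficit under this assumption: ~{deficit:.1f}%")
```

Under that assumption the gap is ~15%, mirroring the raw compute gap; if the RT units don't benefit from clock speed, the gap reverts to the 44% CU-count difference. Both camps in this thread are implicitly betting on one of those two scaling models.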
 

darkinstinct

...lacks reading comprehension.
It makes perfect sense. RT applies to all of the pixels that need to be processed. So yes, in theory we are looking at a worst case 15% reduction in resolution by PS5, The "ray tracing" per resolution will be roughly the same.
Even RTX struggles with 1080p. That drop in resolution doesn't exist, unless PS5 raytraces at 900p and XSX at 1080p, which are then upscaled to 1800p and 4K. But that means the difference in raytracing quality between them gets quadrupled as well. Also, AMD raytracing is in the shaders. You can't use them for raytracing and general shading; they have the same access to memory. You can use them for one or the other. With Sony having 44% fewer shader engines it's not a simple solution. XSX can run a game with 36 CUs and add 16 CUs of raytracing on top. Basically, PS5 games will compromise fidelity for raytracing; XSX won't.
 
Yup. It's such a joke that your boy Tom Warren says that the SSD will be the biggest game changer this gen :)

A 44% deficit in hardware when that hardware is running at over 20% lower frequency... so more like a 15% reduction in actual processing power. Not a big deal. If Microsoft wanted to use that 44% more hardware to create an actual 44% advantage they'd need to clock it a lot, lot higher, but they do not have the thermal budget to do so.
No one can articulate how that SSD provides some advantage that Microsoft's can't, or how it could possibly even load more into a scene if there's still an obligation to render it. Beyond loading it's a non-factor; not going over that again, that discussion is cooked.

We have no idea if the increase in frequency even has an effect on the RT cores; we have no idea what kind of impact it could have, or the efficiency of frequency versus raw hardware.

Even RTX struggles with 1080p. That drop in resolution doesn't exist, unless PS5 raytraces at 900p and XSX at 1080p, which are then upscale to 1800p and 4k. But that means the difference in raytracing quality between them gets quadrupled as well. Also AMD raytracing is in the shades. You can't use them for raytracing and general shading, they have the same access to memory. You can use them for one or the other. With Sony having 44 % less shader engines it's not a simple solution. XSX can run a game with 36 CU and add 16 CU with raytracing on top. Basically PS5 games will compromise fidelity for raytracing, XSX won't.
Xbox Series X is confirmed to run in parallel.

"For the Xbox Series X, this work is offloaded onto dedicated hardware and the shader can continue to run in parallel with full performance. In other words, Series X can effectively tap the equivalent of well over 25 TFLOPs of performance while ray tracing."
 

Romulus

Member
Yeah I don't think so, because any of these methods which can be deployed on the PS5 can be applied all the same on Series X and still push things further.

You're greatly underselling all of this.

Of course, but to what degree? You need quite a bit of power to even make a difference now. For one, most of the time devs won't even bother to push the more powerful system beyond quick and easy. That's the same with every console; what is so different here other than less of a power advantage? It sounds like I'm underselling it because you want MS to really pull off some unprecedented advantage like never before. It's not going to happen. The last time we'll ever see multiplatform differences that were huge was the original Xbox. Those days are long gone.
 

MurfHey

Member
These PS5 vs XSX arguments are getting really old and annoying. Stop with the schoolyard antics and just be happy a new gen is on the way. One may look better, one may load faster... there is literally no reason to argue over this. People are just trying to justify their purchase and defend it. We don't even have all the info yet, people... and even when we do, again, does it matter? One will still buy Xbox, another will still buy PlayStation.

You try to spit facts and argue with each other, but all it is is opinions and theories. We don't know the facts as much as you want to believe we do.

I still play my n64 daily and enjoy the heck out of it!
 

Romulus

Member
I predict that the best looking games on PS4 and XB1 will still look pretty respectable compared to next gen games. So do I think the TF difference between PS5 and XSX will matter? IMO it will be inconsequential.

At first I thought maybe multiplats could perform better on XSX. In that case the power difference would very much matter. But the thing is, there's no reason for that to happen. The only thing PS5 devs have to do is turn off a few effects, bump down the resolution a little bit and the PS5 game would perform the same, and the difference, when you're actually playing the game and not going over every frame with a magnifying glass, would most likely go unnoticed.

The PS5's SSD could make a difference though. I'm actually really looking forward to seeing if it's as significant as Sony claims it is. Really looking forward to the release of both systems.

And I think by the time they figure out how to push the hardware and RT, it'll be near the middle of the generation. XSX Pro/PS5 Pro will be here anyway. Not to mention, for much of that time Xbox devs will be forced to pull along the Xbox One as a boat anchor until they cut it loose.
 
Even RTX struggles with 1080p. That drop in resolution doesn't exist, unless PS5 raytraces at 900p and XSX at 1080p, which are then upscaled to 1800p and 4K. But that means the difference in raytracing quality between them gets quadrupled as well. Also, AMD raytracing is in the shaders. You can't use them for raytracing and general shading; they have the same access to memory. You can use them for one or the other. With Sony having 44% fewer shader engines it's not a simple solution. XSX can run a game with 36 CUs and add 16 CUs of raytracing on top. Basically, PS5 games will compromise fidelity for raytracing; XSX won't.

And PS5 can crank up the clocks over 20% higher on those CUs.

Basically, PS5 has a 15% reduction in resolution at worst. Not a big deal. Smallest difference in console history.

The 120% difference in SSD speeds will have a much more meaningful impact on what you see on screen (detail, LOD, etc).
 

Romulus

Member
And PS5 can crank up the clocks over 20% higher on those CUs.

Basically, PS5 has a 15% reduction in resolution at worst. Not a big deal. Smallest difference in console history.

The 120% difference in SSD speeds will have a much more meaningful impact on what you see on screen (detail, LOD, etc).

I honestly don't see any of these console advantages really shining unless it's an exclusive. Maybe load times on PS5 and some very slight resolution boosts on X.
 

Shmunter

Member
There's always a chance PS5 raytracing is better than XSX with some undisclosed custom GPU features. There were a few tweets in the past seemingly alluding to it. Now game devs with shovel-in-the-ground PS5 projects are beaming over the RT in their games. And MS has 1080p 30 fps Minecraft.

Believe in the power of the sauce, spicy, tangy, a little bit fruity!
 