
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

EGOMON

Member
As a PS fan I'm a bit disappointed with the PS5 specs, and kudos to the Xbox team; you can tell they were on track to deliver the goodness.

I'm still getting a PS5 day one, but this new PlayStation management needs to go; the only saving grace for them is the price.

Either way, it is going to be an awesome gen with huge potential in many areas, i.e. design, AI, world building and gameplay mechanics.
 
Last edited:

OsirisBlack

Banned
Any explanation on what happened? Bad sources?
On the hardware side I'm not even upset; it puts them in a pretty shit position for the entire generation though. They will be playing catch-up, and there is no sauce created that can span that gulf. XSeX is definitely more powerful. Also, clearing up a bunch of bullshit I've read: PS5 does constantly run in boost mode, but I don't think that is a good thing. (Own opinion.) Can't imagine sustaining that high a clock speed is going to be quiet after hours of playing anything. Never cared who won; will still be getting both systems for exclusives, but XSeX should be the multiplat king.
 

HeisenbergFX4

Gold Member
Btw, Matt from the other site was right on the money. He said XSX will have an edge, but nothing major, and when he was pushed he said 15% last year.

I don't follow that other place much, just hang out here :)

On the hardware side I'm not even upset; it puts them in a pretty shit position for the entire generation though. They will be playing catch-up, and there is no sauce created that can span that gulf. XSeX is definitely more powerful. Also, clearing up a bunch of bullshit I've read: PS5 does constantly run in boost mode, but I don't think that is a good thing. (Own opinion.) Can't imagine sustaining that high a clock speed is going to be quiet after hours of playing anything. Never cared who won; will still be getting both systems for exclusives, but XSeX should be the multiplat king.

That also concerns me about new RROD issues over time.
 
Last edited:
And the overclocked PS5 GPU is still 15% slower than Series X GPU in the best case scenario, so I'm not sure what your point is.
The GPUs use the same microarchitecture, but as Cerny said, a higher-clocked 36 CU GPU performs better than a lower-clocked 42 CU GPU because the higher clock speed has a trickle-down effect on other parts such as L1/L2 cache speeds and rasterization speed. Having a dedicated SPU-like chip for audio helps offload audio processing from the GPU. When AMD talked about SmartShift, they said it gives a free 10% performance boost over devices that do not use it. Shifting power in favor of the GPU or the CPU matters because not all games are created equal. We have seen games like Assassin's Creed Unity being more CPU-intensive than GPU-intensive, which was why the XB1 version actually performed better than the PS4 version.

Note how I actually paid attention to the fine details and even cited old games as examples. What do you have to back up your points? Unfounded assertions ad nauseam.
The gap isn't large enough to make such a large difference visually. For multiplats, the PS5 version can run at 75% of the resolution of the XSX version with Radeon Image Sharpening and it will look identical, or at least very close.
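For what it's worth, the on-paper arithmetic behind these percentages is easy to sketch (spec-sheet CU counts and clocks; the square-root resolution scaling is a simplification that assumes performance tracks TFLOPS linearly):

```python
def tflops(cus, clock_ghz):
    # 64 shaders per CU, 2 FLOPs per shader per cycle (FMA)
    return cus * 64 * 2 * clock_ghz / 1000.0

ps5 = tflops(36, 2.23)    # ~10.28 TF at the max boost clock
xsx = tflops(52, 1.825)   # ~12.15 TF at the fixed clock
gap = xsx / ps5 - 1.0     # ~0.18, the on-paper compute gap

# If pixel throughput scaled linearly with TFLOPS, a 2160p XSX target
# would map to roughly this PS5 vertical resolution (scales with sqrt):
ps5_vertical = 2160 * (ps5 / xsx) ** 0.5   # ~1990p
```

Real games rarely scale this cleanly (bandwidth, CPU, and fill-rate limits all intrude), which is exactly the argument being made above.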


You're making circular arguments. If I can simply reference my previous posts to rebut your Argument from Assertions ad nauseam, then you have provided absolutely zero points.

Also, the percent difference only refers to teraflops, but teraflops do not provide the entire picture of actual performance.

What rebuttals? You use a boost clock as a direct measurement against a fixed clock; that shows how silly your argument is.
You claimed that overclocking a 2080 does not make it perform close to a 2080Ti. Benchmarks showed that you are wrong. Note how you provided no benchmarks to back your claims.
 
Last edited:

CJY

Banned
For example, you can overclock a 4-core CPU, but it will still not be able to do the same workload as a 6- or 8-core CPU. You won't cut down the difference by much. Plus, having less RT hardware is different because of how much it impacts performance.

It completely depends on the developer and how they choose to multithread their application. It's part of the reason why the XSX has a lower clock speed when using SMT and all 16 threads. What you are saying is wildly inaccurate and grossly misrepresents the situation. Your knowledge and understanding are very suspect too.

Bottom line, it's entirely developer-dependent. You downplaying the important role developers play along every step of the chain boggles the mind.

Fact is, to use your example, you will indeed get through workloads faster by overclocking if the application is designed for 4 cores. Throwing more cores at the problem doesn't help a single jot if the application isn't designed to take advantage of more cores. Just saying it won't "cut down the difference by much" isn't saying anything, and it feels like you are continuously conflating PC-centrism with console-specific development.
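The cores-vs-clocks point can be illustrated with a toy model (all numbers hypothetical): if an application only scales to 4 threads, a higher-clocked 4-core part finishes the same work sooner than a lower-clocked 8-core part, because the extra cores sit idle.

```python
def wall_time(work_units, app_threads, cores, clock_ghz):
    # Work only spreads across the threads the app was designed for;
    # extra physical cores beyond that sit idle.
    usable = min(app_threads, cores)
    return work_units / (usable * clock_ghz)  # arbitrary time units

work = 1000.0
four_core_oc = wall_time(work, app_threads=4, cores=4, clock_ghz=4.0)  # 62.5
eight_core   = wall_time(work, app_threads=4, cores=8, clock_ghz=3.5)  # ~71.4
```

In this sketch the overclocked 4-core part wins outright; the moment the app is rewritten to use 8 threads, the relationship flips, which is why "it's entirely developer-dependent" is the right framing.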
 

xool

Member


No offense, but this "load stuff as you turn round" idea is becoming a meme and it's going to bite them on the ass. Try taking one (virtual) step back and to the left in the scene he just described: now both assets are in close view, and you don't have the memory to do it. This isn't going to work.
 
Last edited:

mitchman

Gold Member
Do you have examples of graphics card benchmarks showing this in practice? I remember that the Vega 64 did not perform as well as the TFLOPS would suggest compared to the 56, but it would be nice to see tests done with RDNA cards, or even Nvidia for that matter.
No, but I'm sure DF will try to replicate these conditions soon by under- and overclocking gfx cards.

Damn, that's crazy. Do you think PS5 can keep up with XSX performance-wise? I'm assuming resolutions will be 15% lower due to the GPU being 15% weaker.


Any comments on that? Thanks, man.
The 15% will likely be lower in practice due to other parts of the GPU being clocked significantly higher on the PS5. We will have to wait and see how much that will matter.
Good Old Gamer also confirmed my suspicion that at its base clock the PS5 GPU is around 9.7 TF. He also said the variable frequency solution is not a good one in the long term, because of the CPU and GPU having to constantly underclock and overclock.
Tell that to all PC GPUs and CPUs then. Variable clocks have been common for a long time.
 

Reindeer

Member
You're making circular arguments. If I can simply reference my previous posts to rebut your Argument from Assertions ad nauseam, then you have provided absolutely zero points.


You claimed that overclocking a 2080 does not make it perform close to a 2080Ti. Benchmarks showed that you are wrong. Note how you provided no benchmarks to back your claims.
Show me one benchmark where a 2080 performs the same as a 2080Ti. You're just making up stuff now. And read Osiris's comment above if you're still in denial about the power gulf.
 
Last edited:

Reindeer

Member
No, but I'm sure DF will try to replicate these conditions soon by under- and overclocking gfx cards.


The 15% will likely be lower in practice due to other parts of the GPU being clocked significantly higher on the PS5. We will have to wait and see how much that will matter.

Tell that to all PC GPUs and CPUs then. Variable clocks have been common for a long time.
No they haven't. DF already stated that consoles have always run at fixed clocks; the exception was the Switch, because it has to run in docked mode as well.
 
Last edited:

Show me one benchmark where a 2080 performs the same as a 2080Ti. You're just making up stuff now. And read Osiris's comment above if you're still in denial about the power gulf.
I can tell that you're just responding as fast as you can without actually thinking, because I already showed you a while ago:

Show the benchmarks. Here's what I found.

The 2080Ti has a +15% effective FPS advantage over the 2080 in games. When overclocking the 2080 by 110 MHz, average FPS improved by 11%. And you also have to take into account that the 2080Ti has 11GB of VRAM on a 352-bit bus while the 2080 only has 8GB of VRAM on a 256-bit bus.
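Taking those quoted percentages at face value (they are the poster's figures, not independently verified here), the residual gap after the overclock works out like this:

```python
ti_advantage = 1.15   # 2080Ti effective FPS relative to a stock 2080 (quoted)
oc_gain = 1.11        # overclocked 2080 relative to a stock 2080 (quoted)

# What remains of the Ti's lead once the 2080 is overclocked:
residual_gap = ti_advantage / oc_gain - 1.0   # ~0.036, i.e. ~3.6%
```

So under these numbers the overclock closes most, but not all, of the gap; the wider memory bus and extra VRAM on the Ti are a separate axis entirely.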
 
On the hardware side I'm not even upset; it puts them in a pretty shit position for the entire generation though. They will be playing catch-up, and there is no sauce created that can span that gulf. XSeX is definitely more powerful. Also, clearing up a bunch of bullshit I've read: PS5 does constantly run in boost mode, but I don't think that is a good thing. (Own opinion.) Can't imagine sustaining that high a clock speed is going to be quiet after hours of playing anything. Never cared who won; will still be getting both systems for exclusives, but XSeX should be the multiplat king.
That gulf? Do you mean the gulf in power? Man, it's just 17%. Right now the X1X has a 45% advantage over the Pro, and the PS4 had a 41% advantage over the X1. This is the closest two consoles have ever been.
 
Last edited:

MarkMe2525

Member
Nope, it was referring to their own take on the RDNA 2 cores that are in the PS5, and they compared those to another specialized core that is based (but not with complete feature parity!) on the GCN that was in their PS4. So pears to avocados.
PS5's RDNA 2 is just a custom RDNA 2, and again PS4's GCN is a custom GCN, neither of which you can find in an off-the-shelf PC part!


Well, if a game is bottlenecked by I/O, solving that bottleneck frees up cycles that can be used by the GPU to push FPS. So sticking in an SSD =/= higher fps directly, but indirectly, if you know I/O is the bottleneck, then yeah, SSD = higher fps. Star Citizen is a game that is highly bottlenecked by I/O, so if you have an HDD, oh boy, good luck getting it above 14 fps, but put in an M.2 drive and it is night and day. Your top-of-the-line graphics card is choked in the first instance; now it is pushing those frames as it's supposed to.
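A rough back-of-the-envelope for why the storage tier matters for streaming (the read speeds below are typical sequential figures, not measured numbers from any of these games): the asset data you can pull in per 30 fps frame differs by an order of magnitude between an HDD and a PCIe 4.0 NVMe drive.

```python
def mb_per_frame(read_mb_per_s, frame_ms=33.3):
    # Data a game can stream in during a single 30 fps frame.
    return read_mb_per_s * frame_ms / 1000.0

hdd   = mb_per_frame(150)    # ~5 MB/frame (far worse once seeks kick in)
sata  = mb_per_frame(550)    # ~18 MB/frame
nvme4 = mb_per_frame(5000)   # ~166 MB/frame (PCIe 4.0 class, raw reads)
```

When the world streams in faster than the renderer consumes it, the GPU stops stalling on missing assets, which is the indirect fps gain described above.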

Yeah, me too. It sounds similar to Vega's FP16 packed math and primitive shader stuff, which was largely crippled by other parts of the GPU and mostly unused for that reason. Now, it seems whatever was crippling it has been solved, and I have a feeling it has something to do with higher bandwidth, or with using the audio SPU as a coprocessor.


Yet! Maybe from now on, having PCIe 4.0 M.2 SSDs will be much more important going forward, even on the PC landscape. What if the previously weakest links (the consoles) end up generating a leap for PC gaming as well, pushing all enthusiast-grade rigs in this direction, with developers starting to target this hardware and expecting certain streaming speeds for their upcoming games?
Man.. I don't mean to be a dick but you're just wrong.
 

xool

Member
Yup, it's all down to cost and exclusives now.

First party, I would usually side with Sony, but damn if MS's recent change of tack hasn't got me hopeful. They now have a shit ton of first-party studios, many of which haven't shown a single thing of what they are doing for next gen yet. I remain hopeful that they have taken the last few years to prep for all this. Sony I will give the benefit of the doubt, because while they get a bit of stick for dad walking simulators... they are bloody well GOOD dad walking simulators.

But price...? Who knows. Actually who knows... PS5 could be priced at $399 but I don't actually see it. The parts are not THAT much cheaper, and that SSD is not gonna be cheap, that's for sure... I think the consoles will be on par price wise, but I really have this odd feeling MS will undercut them for some reason. I can't put my finger on it, I just have the tingles?
I don't see them going cheaper. The APU is probably smaller, but their cooling solution probably isn't. Everything else seems about equal. Sony's SSD may even be a little more (probably not, just more parallel) .. I think both companies will end up in very similar positions on price.

And yeah, a lot of MS's new studios are AA/A or B developers .. they need some cash and talent injections I think. I think most will do ok, but they don't have an Insomniac.
 

HeisenbergFX4

Gold Member
Good Old Gamer in his video said variable frequency will likely result in reduced performance after a few years because the clocks are always under stress.

I am a Sony fan first, and speaking as a PS gamer, I can say I can't wait to play games on the PS5.

That said, their approach does concern me, especially over time.

To me it felt like a knee-jerk reaction to the XSX being 12 TF, and they had to do whatever they could to reach double digits.
 

xool

Member
I have a question, I'm sorry if it was asked before...

I'm reading about Xbox Lockhart as a cheaper option coming out....

What about Xbox One X? They are killing it after less than 3 years?

Makes no sense to me, it is plenty fast for 1080p gaming
It's a slow CPU on an old node .. they could replace/retire it with something better and compatible at the same price .. probably ..
 

POak

Neo Member
As someone whose PlayStation consoles broke around the 2-year mark (i.e., after the warranty had expired, and despite all the care), I would like to ask the following - how's everyone feeling about console durability after both consoles' deep-dives? I know we are yet to see PS5's form factor, but going by what we have right now, which system seems more prepared to deal with heat issues, energy spikes, and the like?

I got a strange vibe from Cerny's discussion of PS4's heat issues, and how they would be handled in the coming generation. What are your thoughts on this, guys?
 

DaGwaphics

Member
No they haven't. DF already stated that consoles have always run at fixed clocks; the exception was the Switch, because it has to run in docked mode as well.

This. In "game mode" static clocks have been the norm, just to guarantee the same performance in different environments. On PC, you'll get more out of part X if ambient temperatures are low vs. high; that's generally not something you want on a console. Of course, this is exactly why Sony made a point of stating that the boost isn't thermal-based (because that would kill a lot of the advantages of a console). The PS5 approach appears very nuanced; it will be interesting to see how both systems fare in practice.
 

MarkMe2525

Member
That gulf? Do you mean the gulf in power? Man, it's just 17%. Right now the X1X has a 45% advantage over the Pro, and the PS4 had a 41% advantage over the X1. This is the closest two consoles have ever been.
That is the GPU at full throttle, which also means the CPU is being downclocked during that time. So by maxing out the GPU you create a CPU bottleneck, and by maxing out the CPU you create a GPU bottleneck. At least that is how I hear it. The PS5 cannot run the 3.5 GHz CPU and the 2.23 GHz GPU concurrently.
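A minimal sketch of how a shared CPU/GPU power budget of this kind could behave (all wattages are made up; Sony has not published the real budget or the SmartShift allocation algorithm):

```python
TOTAL_BUDGET_W = 200.0  # hypothetical total SoC power budget

def allocate(cpu_demand_w, gpu_demand_w, total=TOTAL_BUDGET_W):
    """Grant each side its demand if the sum fits the budget; otherwise
    scale both down proportionally (the part that forces a clock drop)."""
    demand = cpu_demand_w + gpu_demand_w
    if demand <= total:
        return cpu_demand_w, gpu_demand_w
    scale = total / demand
    return cpu_demand_w * scale, gpu_demand_w * scale

# Within budget: both run at full demand.
cpu_ok, gpu_ok = allocate(cpu_demand_w=40.0, gpu_demand_w=140.0)

# Over budget: both get trimmed, so neither holds its absolute max.
cpu_w, gpu_w = allocate(cpu_demand_w=60.0, gpu_demand_w=160.0)
```

The key property is that trimming only happens when both units demand peak power simultaneously, which matches the claim that max clocks are sustained "most of the time" but not concurrently under worst-case load.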
 

StreetsofBeige

Gold Member
As someone whose PlayStation consoles broke around the 2-year mark (i.e., after the warranty had expired, and despite all the care), I would like to ask the following - how's everyone feeling about console durability after both consoles' deep-dives? I know we are yet to see PS5's form factor, but going by what we have right now, which system seems more prepared to deal with heat issues, energy spikes, and the like?

I got a strange vibe from Cerny's discussion of PS4's heat issues, and how they would be handled in the coming generation. What are your thoughts on this, guys?
Based on what they've shown, MS seems confident in their kleenex-box design. And their recent Xbox One X system is quiet too.

PS4/Pro systems are loud, and based solely on the PS5 having this weird back-and-forth power struggle between CPU usage and a GPU with a boost clock, where both can't sustain max clocks at the same time, there's something weird going on here.

There have been BOM rumours that Sony spent good coin on the cooling system/fan for the PS5, so heat might be an issue.

We'll have to wait and see. We don't even have the PS5 form factor yet.
 
That is the GPU at full throttle, which also means the CPU is being downclocked during that time. So by maxing out the GPU you create a CPU bottleneck, and by maxing out the CPU you create a GPU bottleneck. At least that is how I hear it. The PS5 cannot run the 3.5 GHz CPU and the 2.23 GHz GPU concurrently.
Man, the downclock is 2%; it's nothing when it happens. It's just to keep the console from sounding like a jet engine. The GPU is between 10.08 and 10.28 TF, lol. You sound like the performance drops by 50%. It's 2% 😂
 

Handy Fake

Member
That is the GPU at full throttle, which also means the CPU is being downclocked during that time. So by maxing out the GPU you create a CPU bottleneck, and by maxing out the CPU you create a GPU bottleneck. At least that is how I hear it. The PS5 cannot run the 3.5 GHz CPU and the 2.23 GHz GPU concurrently.
I'd like to hear more about that; I'm a bit fuzzy as to whether that's the case.
Thinking about it, I'm very curious as to just how fast the internals can switch between GPU and CPU boosting. If they have custom architecture in there to remove any bottleneck from that, then that could be a very interesting aspect.
 

Reindeer

Member
I am a Sony fan first, and speaking as a PS gamer, I can say I can't wait to play games on the PS5.

That said, their approach does concern me, especially over time.

To me it felt like a knee-jerk reaction to the XSX being 12 TF, and they had to do whatever they could to reach double digits.
Yeah, this just proves to me that Cerny was told to work within a certain limit, and it probably was cheaper to go for a faster SSD than for a bigger APU.
 
Last edited:

Joey.

Member
E3 digital events, State of Plays, Inside Xbox's are going to be so important in the upcoming months.
We know that both consoles are beasts. Xbox obviously edges out in total power but will it matter? I guess we have to wait and see.
It's going to come down to price and the games. That's it. Well at least for me...
 
Yeah, this just proves to me that Cerny was told to work within a certain limit and it probably was cheaper to go for faster SSD than it was for a bigger APU.
Well, the SSD can be designed in-house, so the R&D costs will be absorbed elsewhere rather than in the BOM. The APU has to be bought. So, different methods.
 

Fake

Member
The reason you wouldn't get a PS5 day one is that it won't play all the games you can play right now on your PS4? And I thought the main selling point was next-gen games...🤷
I don't remember MS making RT mandatory on XSX.

I recommend you rewatch the video from yesterday. PS4 games are entering beta testing, and right now only the top 100 games will be BC day one. That gives me fewer reasons to get a PS5 day one.
Keep in mind I didn't get a PS4 day one either; I got the PS4 Slim. I was hoping for a next-gen way of doing BC. I could be happy with the few PS5 games while playing PS4 games as I waited for the proper next-gen games, but I'm not so sure now. What is the probability that one of those 100 games is mine?

I live in Brazil, and the top games here are Fortnite, Ark and GTA V. I play none of those.

Again, different people, different targets. For me it was a deal breaker. No more day one for me.

MS didn't make RT mandatory either, but only Sony hasn't released a tech demo about RT.
 
Last edited:

MarkMe2525

Member
Man, the downclock is 2%; it's nothing when it happens. It's just to keep the console from sounding like a jet engine. The GPU is between 10.08 and 10.28 TF, lol. You sound like the performance drops by 50%. It's 2% 😂
I don't know, man. It just doesn't sound right. Kinda feels like a 32MB eSRAM situation. I remember all of the Xbox fans that were active on the forums defending the Xbox One's 8GB of DDR3 with this. In the end, it caused complexity, which meant developers had trouble optimizing for it. I believed them all too at the time. That is what wishful thinking will do.
 

Darius87

Member
On the hardware side I'm not even upset; it puts them in a pretty shit position for the entire generation though. They will be playing catch-up, and there is no sauce created that can span that gulf. XSeX is definitely more powerful. Also, clearing up a bunch of bullshit I've read: PS5 does constantly run in boost mode, but I don't think that is a good thing. (Own opinion.) Can't imagine sustaining that high a clock speed is going to be quiet after hours of playing anything. Never cared who won; will still be getting both systems for exclusives, but XSeX should be the multiplat king.
and you are the king of bullshit riddles.
 
I don't know, man. It just doesn't sound right. Kinda feels like a 32MB eSRAM situation. I remember all of the Xbox fans that were active on the forums defending the Xbox One's 8GB of DDR3 with this. In the end, it caused complexity, which meant developers had trouble optimizing for it. I believed them all too at the time. That is what wishful thinking will do.
Sure. We will see the fruits soon enough, man, and if it falls behind the XSX in multiplatform games by more than 17%, then you will be correct. Otherwise it is minor, as Cerny said, and rarely happens.
 

Handy Fake

Member
I don't know, man. It just doesn't sound right. Kinda feels like a 32MB eSRAM situation. I remember all of the Xbox fans that were active on the forums defending the Xbox One's 8GB of DDR3 with this. In the end, it caused complexity, which meant developers had trouble optimizing for it. I believed them all too at the time. That is what wishful thinking will do.
I don't pretend to know anything more about technology than "turn it on and it makes a noise", but I'd have thought that eSRAM analogy would be more applicable to the SeX's slower 6GB pool of RAM?
 
That is the GPU at full throttle, which also means the CPU is being downclocked during that time. So by maxing out the GPU you create a CPU bottleneck, and by maxing out the CPU you create a GPU bottleneck. At least that is how I hear it. The PS5 cannot run the 3.5 GHz CPU and the 2.23 GHz GPU concurrently.

Cerny specified that both the CPU and GPU would run at max clocks most of the time. Video timestamp: around 37 min.

This quote gives an idea of how things work:

However, there is a twist and it's something we've covered before, that we can now see play out in real-time - Nintendo's 'boost mode'. This amounts to optimisations in how certain games selectively overclock the CPU to improve loading times. For example, when you die in Mario Odyssey, the screen fades to black and the game loads you back to the last checkpoint. There is a fairly quick turnaround in Odyssey but this is faster thanks to boost mode. During loading, the CPU gets upclocked temporarily to 1785MHz - a 75 per cent increase on the stock clock. Meanwhile, the GPU actually drops all the way down to 76.8MHz - a tenth of its usual speed. Nintendo is balancing thermals by overclocking one component to the max, while downclocking another to the bare minimum.
New Switch mod delivers real-time CPU, GPU and thermal monitoring - and the results are remarkable

Cerny also said a 10% drop in power can be achieved with a 2% drop in clock speed. That's not a 10% drop in performance.
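That ratio is plausible because dynamic power scales roughly with frequency times voltage squared, and a small clock reduction lets the chip step down voltage too. A worked example (the 4% voltage figure is illustrative, chosen to reproduce the quoted ~10%):

```python
def relative_power(freq_scale, volt_scale):
    # Dynamic CMOS power ~ C * f * V^2 (switched capacitance held constant)
    return freq_scale * volt_scale ** 2

# 2% lower clock, assumed ~4% lower voltage at that operating point:
p = relative_power(0.98, 0.96)   # ~0.903 -> ~10% power saved for 2% clock
```

Performance, by contrast, falls at most linearly with that 2% clock drop, which is the asymmetry the post is pointing at.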
 

llien

Member
if it falls behind the XSX in multiplatform games by more than 17%
How would you measure that? For starters, do we even have any numeric values, besides lower resolution, that the Xbone has vs the PS4?

The most unfortunate thing for me, personally, is Sony pushing those 36 CUs to cross that psychological 10TF mark.
I'd rather it stayed cool, quiet, and easier on components and power plants.
 
How would you measure that? For starters, do we even have any numeric values, besides lower resolution, that the Xbone has vs the PS4?

The most unfortunate thing for me, personally, is Sony pushing those 36 CUs to cross that psychological 10TF mark.
I'd rather it stayed cool, quiet, and easier on components and power plants.
By FPS and resolution. Those are hard numbers, man. If the XSX runs native 4K (2160p), the PS5 should be around 1800p. If it's lower, then we can say it's behind by more than the GPU power.

We don't know about their cooling system, which Cerny seemed proud of. We'll see.
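The pixel arithmetic behind that estimate, for reference: if pixel count scaled linearly with compute, a 17% gap actually maps to fewer pixels lost than a full 2160p-to-1800p drop.

```python
p2160 = 3840 * 2160          # 8,294,400 pixels at native 4K
p1800 = 3200 * 1800          # 5,760,000 pixels at 1800p

drop = 1 - p1800 / p2160     # ~0.31: 1800p is a ~31% pixel cut
equiv = p2160 / 1.17         # ~7.09M pixels if scaled by the 17% gap,
                             # i.e. nearer ~2000p at 16:9 than 1800p
```

So if multiplats land at 1800p or below, that would indeed suggest the deficit is larger than the raw compute numbers imply, as the post argues.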
 

MarkMe2525

Member
I don't pretend to know anything more about technology than "turn it on and it makes a noise", but I'd have thought that esRAM issue analogy would be more applicable to the SeX's slower 6gb pool of RAM?
Yeah, the XSX RAM setup is different for sure. It does seem very deliberate, and that it wasn't configured this way as a band-aid like the 32MB eSRAM. DF mentioned that MS stated they actually embedded a processor in the Xbox One X that monitored texture usage in RAM, and they designed their setup accordingly. But this conclusion is based on watching a bunch of YouTube and reading hardware breakdowns, so I could just be full of shit.
 

xool

Member
Can't imagine sustaining that high a clock speed is going to be quiet after hours of playing anything.
That also concerns me about new RROD issues over time.

It's such a high figure. 2.2GHz is over the (probably best-case) +10% speed-at-iso-power that TSMC promised for N7+ compared to the highest-clocked previous RDNA chips (the Radeon 5700XT boosts to 1.9GHz).

It sounds like a 250W TDP part .. are they even on N7? when is N6 due?

[edit - well we all saw the air ports on the dev kits ... that should've told something]
 
Last edited: