
Next-Gen PS5 & XSX |OT| Console tEch threaD


Bo_Hazem

Banned
Unfortunately no one knows shit about PS Now. I'm using it and it's great. This month I played Control and Wolfenstein 2, last month Uncharted: The Lost Legacy; they rotate big titles like Game Pass does, but on a smaller scale I guess.

It's still not available in my region, I think. I do use a US account, but I'm not sure if it'll work. I tend to buy a game if I really care about it. But the PS Now offering is very solid, especially the option to download games.
 

kyliethicc

Member
I think they'll add more RAM for 8K gaming; a chiplet 72 CU design is a big possibility and a cheap solution for them. They went with 36 CU to make it easier: just slap on 2x and you're good to go.
That would be amazing but VERY HOT. They’d need to wait until like 2023 when the next die shrink can be used. Whatever is less than 7nm. Like how the PS4 was 28nm. Then the PS4 Slim and Pro were 16nm. They can add more CUs with the die shrink, I think. My guess is no PS5 Pro until 2022 or 2023, along with the slim.
 

Bo_Hazem

Banned
That would be amazing but VERY HOT. They’d need to wait until like 2023 when the next die shrink can be used. Whatever is less than 7nm. Like how the PS4 was 28nm. Then the PS4 Slim and Pro were 16nm. They can add more CUs with the die shrink, I think. My guess is no PS5 Pro until 2022 or 2023, along with the slim.

Hence:

[image: yoVYPL8RxwyuMKq8uhCNKU-1200-80.jpg]


One on each side, separately cooled with a custom connection. Like true SLI/Crossfire, but not like the PC shit where you lose 25% of the first GPU to the power drain and only get 50% out of the second, making for a very laughable advantage for a gigantic investment.
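To put rough numbers on that complaint, here's a quick sketch using the percentages above exactly as claimed (they're the poster's figures, not a benchmark):

```python
# Back-of-envelope multi-GPU scaling, using the percentages claimed above
# (25% lost on GPU 1, only 50% usable from GPU 2). Illustrative only.

def effective_gpus(overhead_gpu1: float = 0.25, usable_gpu2: float = 0.50) -> float:
    """Effective GPU count for a 2-GPU PC setup, relative to a single GPU."""
    gpu1 = 1.0 - overhead_gpu1   # first GPU loses part of its throughput
    gpu2 = usable_gpu2           # second GPU only contributes a fraction
    return gpu1 + gpu2

print(effective_gpus())  # 1.25 -> only ~25% more performance for 2x the hardware
```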
 

kyliethicc

Member
Yea but if they wanted to show off the SSD, Spidey fighting off a thousand enemies as he swings through the city would be better
That’s not the point of the demo. It’s designed to show off the speed of the data streaming: the city can be streamed in fast enough to allow huge theoretical travel speeds for Spidey. It’s just a tech demo. Not about combat.
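As a rough illustration of why streaming speed caps traversal speed: all the numbers below (MB of unique assets per metre, drive speeds) are invented placeholders, not anything Sony or Insomniac have published.

```python
# Hypothetical back-of-envelope: how fast can Spidey move before the drive
# can't stream the city in quickly enough? All figures below are invented.

def max_traversal_speed(drive_gb_per_s: float, mb_per_metre: float) -> float:
    """Max sustainable speed (m/s) if every metre of travel needs fresh assets."""
    return (drive_gb_per_s * 1e9) / (mb_per_metre * 1e6)

# 5.5 GB/s raw SSD vs an assumed 50 MB of unique data per metre of city
print(f"SSD: {max_traversal_speed(5.5, 50):.0f} m/s")   # ~110 m/s
# ~0.1 GB/s HDD under the same assumption
print(f"HDD: {max_traversal_speed(0.1, 50):.0f} m/s")   # ~2 m/s
```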
 

ph33rknot

Banned
That’s not the point of the demo. It’s designed to show off the speed of the data streaming: the city can be streamed in fast enough to allow huge theoretical travel speeds for Spidey. It’s just a tech demo. Not about combat.
Sounds like the Sega Saturn infinite planes
 
I'm going to wait until Intel's and AMD's next generation of CPUs before I upgrade. However, I'm just curious about the i7-6950X (10 cores, 20 threads). How would it fare against the upcoming consoles' CPUs?

Again, it's very hard to say.

I'd guess that they'll be broadly similar using up to 8 cores (trying to take into account lower clock speeds and cache sizes on console Zen 2), after which the Intel should have an advantage. Then again, decompression and audio may need additional cores on PC.

I think it'd make a decent CPU for next gen, especially if you overclock to 4+ GHz ... but damn those things are expensive as hell, and there will be much better options!
 

Bo_Hazem

Banned
You're assuming that someone thinks the game The Initiative is working on must be good; in my case I simply wanted to clarify that my out-of-context sentence wasn't correctly interpreted.

For all we know it might be a big-budget AAAA flop. As someone else stated, a game being a AAA title doesn't guarantee quality. Yes, it will probably look great, but if it is another 3rd-person story game I won't be impressed myself. As I have stated a couple of times, while I enjoy Uncharted and have completed all of them except 4 (which I'm trying to beat now), for me the latest instalments are solid 7/10 games, as they lack innovation; this is my subjective opinion. Just FYI, games like GoW or Gears are for me just okay, good-looking games, but they lack something special that would make me want to finish them in one sitting or spend hundreds of hours playing them.

What type of games then?
 
But ESRAM was a band-aid to boost up the 40% weaker Xbone with its slow DDR3.

Not really. The esram was always part of the plan, before memory type and quantity was finalised. Lots of engineering work went into the esram that was entirely custom. Literally never used in that way anywhere else.

Unfortunately, while it was fast as fuck and maintained high levels of throughput even under adverse read/write conditions that would have hurt single-ported GDDR5, it was a tiny 32MB at a time when you needed a lot more, and if you went outside of it you hit "lol 68 GB/s shared with CPU".

Sony called it a lot better, overall. A lot.
 

V4skunk

Banned
Not really. The esram was always part of the plan, before memory type and quantity was finalised. Lots of engineering work went into the esram that was entirely custom. Literally never used in that way anywhere else.

Unfortunately, while it was fast as fuck and maintained high levels of throughput even under adverse read/write conditions that would have hurt single-ported GDDR5, it was a tiny 32MB at a time when you needed a lot more, and if you went outside of it you hit "lol 68 GB/s shared with CPU".

Sony called it a lot better, overall. A lot.
Yeah, PS4 is 40% more powerful than Xbone, yet people here are claiming dominance over a 15-20% power difference! All while downplaying a huge on-paper SSD performance difference and claiming the SSD won't do much for gaming.
Lol really.
 
Yeah, PS4 is 40% more powerful than Xbone, yet people here are claiming dominance over a 15-20% power difference! All while downplaying a huge on-paper SSD performance difference and claiming the SSD won't do much for gaming.
Lol really.

Well yeah, for the most part I expect the PS5 to be running games much like the XSX does, but at a slightly lower resolution or frame rate.

SSDs in general (both are fantastic solutions, but one is faster) are going to be huge.

I don't believe at all that either PS5 or XSX will be a bad or weak system in any way. But in the absence of meaningful divides, we gamers always seem to create them ... :/
 

CJY

Banned
Well yeah, for the most part I expect the PS5 to be running games much like the XSX does, but at a slightly lower resolution or frame rate.

SSDs in general (both are fantastic solutions, but one is faster) are going to be huge.

I don't believe at all that either PS5 or XSX will be a bad or weak system in any way. But in the absence of meaningful divides, we gamers always seem to create them ... :/
I think next gen we're going to be experiencing things so frequently, in so many games, where it's like "no way in hell would that be possible on PS4".

The novelty won't wear off fast either because we've been living with these IO bottlenecks since the PS1 and the optical drive basically. The shitness has become ingrained. I don't know how old you are, but I think there are so many people who don't remember/never experienced what gaming on cartridges was like at all. We're finally getting back to that, or at least somewhere close. It's gonna be amazing.
 

Dory16

Banned
Too much backward compatibility buzz. I just don't give a shit about backward compatibility, even with the PS4. Why the hell would I care about outdated games? If that were my goal, why would I even buy a PS5?

Some people care, but I don't expect more than 1%. Give me new games, new experiences, new game designs and things that aren't possible on previous consoles.
We care because we have a huge current-gen library and it's digital. We can't sell or trade it. At the very least we want to be able to keep it, and we are selling the current-gen consoles to upgrade the hardware. That's literally every gamer I know. Not that hard to understand, unless you're a console crusader out to justify anything and everything their fav manufacturer does/fails to do.
 
That's the joke we need to consider when some gurus talk about the big difference between 448 vs 560 GB/s while neglecting all the other important bottlenecks in the "off the shelf", bin-shaped console.
He was invited by fucking Atari to work there when he was 17 and still in school. This motherfucker has been around almost since the start of the industry; he's behind the production of, and consultation on, a lot of Sony exclusives. That man is serious business.
 

Imtjnotu

Member
So the ESRAM bandwidth of Xbox One was 204GB/s vs the PS4's GDDR5 at 176GB/s, but XBone only had 32MB of fast RAM. That's truly an insane difference looking at it now.
The problem with it, on both the Xbone and the 360, was that you had to pick and choose what exactly was going to sit in the ESRAM (eDRAM on the 360) and what was going into the DDR3. It was very last minute that Microsoft added one last bus for the CPU to actually access the ESRAM on Xbox One.

It wasn't in the original showing, but a couple of days later the new layout was shown with a black line just pointing from one to the other, sort of like how Sony had its Garlic and Onion(+) setup.

[image: 2537607-0728435398-XBox_.jpg]




Sony went the smarter route, just allowing one large pool and letting the CPU and GPU read from or write into whatever RAM they want.

[image: PS4-APU.png]
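To make the "pick and choose" point concrete, here's a toy sketch of the budgeting devs had to do on Xbox One: decide which render targets fit in the 32 MB ESRAM and which spill to DDR3. The buffer names, sizes and priorities are invented for illustration; real engines did this far more carefully, often tiling targets to squeeze them in.

```python
# Toy illustration of the Xbox One ESRAM problem: a 32 MB fast pool you must
# budget by hand, with everything else spilling to the slow shared DDR3.
# Buffer names/sizes are invented; a full 1080p G-buffer easily blows past 32 MB.

ESRAM_BYTES = 32 * 1024 * 1024

# (name, size in bytes, rough bandwidth priority: higher = more read/write traffic)
buffers = [
    ("depth_1080p",     1920 * 1080 * 4, 10),
    ("gbuffer_albedo",  1920 * 1080 * 4,  9),
    ("gbuffer_normals", 1920 * 1080 * 8,  9),
    ("hdr_lighting",    1920 * 1080 * 8,  8),
    ("shadow_map_4k",   4096 * 4096 * 4,  6),
]

# Greedy: keep the most bandwidth-hungry targets in ESRAM until it's full.
used = 0
for name, size, _prio in sorted(buffers, key=lambda b: -b[2]):
    if used + size <= ESRAM_BYTES:
        used += size
        pool = "ESRAM (fast, tiny)"
    else:
        pool = "DDR3 (68 GB/s, shared with CPU)"
    print(f"{name:16s} {size / 2**20:5.1f} MB -> {pool}")

print(f"ESRAM used: {used / 2**20:.1f} / 32.0 MB")
```

On PS4 the same buffers simply live in the one GDDR5 pool, which is exactly the point being made above.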
 

CJY

Banned
The problem with it, on both the Xbone and the 360, was that you had to pick and choose what exactly was going to sit in the ESRAM (eDRAM on the 360) and what was going into the DDR3. It was very last minute that Microsoft added one last bus for the CPU to actually access the ESRAM on Xbox One.

It wasn't in the original showing, but a couple of days later the new layout was shown with a black line just pointing from one to the other, sort of like how Sony had its Garlic and Onion(+) setup.

[image: 2537607-0728435398-XBox_.jpg]




Sony went the smarter route, just allowing one large pool and letting the CPU and GPU read from or write into whatever RAM they want.

[image: PS4-APU.png]
Woah, Xbone was one overly-engineered POS.
 
So the ESRAM bandwidth of Xbox One was 204GB/s vs the PS4's GDDR5 at 176GB/s, but XBone only had 32MB of fast RAM. That's truly an insane difference looking at it now.

Awesome thing about the esram was that it was dual ported, so it could both read and write at the same time. Perfect for banks of ROPs, which are always firing out read and write requests.

Meanwhile, a DRAM controller (e.g. DDR3 / GDDR5) has to wait until an in flight access is cleared before a change between read and write can be made. That can take a significant amount of time, and with frequent changes between r/w it can decimate effective bandwidth.

That's one of the reasons that memory latency is so high on graphics cards - scheduling accesses so you aren't constantly changing between read and write appears to be very important in maintaining throughput.

Anyway, long story short (as far as I know), when you were outside that inadequately sized tiny amount of esram, it was similar to being on the PS4 ... only with a small fraction of the PS4 BW to feed the GPU.

I suspect (though can't prove) this is why so many games on X1 have a corresponding reduction in resolution versus the PS4, but still end up with an inexplicably shit frame rate.
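A crude model of that read/write turnaround effect, if it helps. The burst length, penalty cycles and access patterns below are invented just to show the shape of the problem, not real GDDR5/DDR3 timings:

```python
# Crude model of why frequent read<->write switching hurts DRAM throughput.
# Burst/turnaround cycle counts and the access mixes are illustrative guesses.

def effective_bandwidth(peak_gb_s, accesses, burst_cycles=4, turnaround_cycles=8):
    """Peak bandwidth scaled down by time lost to read/write bus turnarounds.

    accesses: a string like "RRRWWR..." describing the request order.
    """
    busy = stalled = 0
    prev = None
    for op in accesses:
        if prev is not None and op != prev:
            stalled += turnaround_cycles   # bus sits idle while direction flips
        busy += burst_cycles
        prev = op
    return peak_gb_s * busy / (busy + stalled)

# ROP-style traffic that ping-pongs between reads and writes...
print(f"{effective_bandwidth(176, 'RW' * 50):.0f} GB/s")            # ~59 of 176 GB/s
# ...versus the same requests batched into long runs of reads, then writes
print(f"{effective_bandwidth(176, 'R' * 50 + 'W' * 50):.0f} GB/s")  # ~173 GB/s
```

A dual-ported ESRAM never pays that flip penalty, which is the ROP-friendliness described above.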
 

Imtjnotu

Member
Woah, Xbone was one overly-engineered POS.
Somehow 100 million in R&D went to shit with the RAM selection. They couldn't get GDDR5 cheap enough and didn't want straight DDR3, so they traded GPU die area for ESRAM. This is how we got the 1.3 TF GPU.

But the real fucking issue with the Xbone was forcing everyone to have a camera come standard with the unit. That decision alone could have been the difference-maker in going with GDDR5 instead of DDR3.
 

CJY

Banned
Awesome thing about the esram was that it was dual ported, so could both read and write at the same time. Perfect for banks of ROPs which are always firing out read and write requests.

Meanwhile, a DRAM controller (e.g. DDR3 / GDDR5) has to wait until an in flight access is cleared before a change between read and write can be made. That can take a significant amount of time, and with frequent changes between r/w it can decimate effective bandwidth.

That's one of the reasons that memory latency is so high on graphics cards - scheduling accesses so you aren't constantly changing between read and write appears to be very important in maintaining throughput.

Anyway, long story short (as far as I know), when you were outside that inadequately sized tiny amount of esram, it was similar to being on the PS4 ... only with a small fraction of the PS4 BW to feed the GPU.

I suspect (though can't prove) this is why so many games on X1 have a suitable reduction in BW over the PS4, but still end up with an inexplicably shit frame rate.
That's very interesting, thanks. I was wondering why the Xbone specs stated "204GB/s (102 In/102 Out)"
 
Somehow 100 million in R&D went to shit with the RAM selection. They couldn't get GDDR5 cheap enough and didn't want straight DDR3, so they traded GPU die area for ESRAM. This is how we got the 1.3 TF GPU.

From what I've read, it was the other way round. After their experience with the 360, MS continued to want a small pool of very fast memory.

This, and the requirement very early on for large amounts of memory (for TV TV TV TV SPORTS WATERCOOLER), necessitated DDR3 to be sure of having 8GB of affordable* RAM.

*That bit is important.

Sony didn't dedicate a large portion of the die to esram (leaving more for CUs), and also had the flexibility to alter total system memory relatively late on. This was a very smart move, and paid off like a boss.
 

PaintTinJr

Member
Variable frequencies which we know nothing about, except that they depend not on heat but on power consumption, which is determinable, and not the thermal throttling half the world rumbles about without understanding shit. The console is not 9.2 UNTIL proven otherwise, because the system is not a heat-based PC standard.
The difference is around 2 TF, which yeah is more than one PS4, but diminishing returns are a thing, ya know? When the power scale is so big each gen, every pair of next-gen consoles, unless basically identical, has like "one old gen" console of difference, which means nothing, unless a few fps or pixels are a big deal now. The ACTUAL DIFFERENCE is 15-20%; percentage is the correct way of looking at it because it takes proportions into account, while you don't, because you want to believe SeX is so superior because "one PS4 more bro".
I honestly can't bear people stretching facts; here I lose my politeness.

I think 15-20% with such little info is too generous and probably in the wrong direction. The reason the PS3 and PS4’s prowess in compute were important for comparison was because the Playstation consoles aligned or bettered their Xbox rival hardware in the other metrics, so the difference really looked usable. The PS4 and Xbox One aren’t very exotic (Esram is a bit exotic) so even without minute details, comparing effective capability was pretty straight forward. The PS3 was very exotic, but because all the IBM Cell SDK and design/development info was fully documented and available to all, it was very easy to compare in a fair way to a Tri-Core 2-way PPC chip (PowerMac or IBM entry web server).

At the moment TF doesn’t represent a realistic comparison of average throughput for XsX, because the memory setup and absence of I/O complex instinctively tell us that it is going to have a large difference between max and average TF real-work done. Whereas the PS5 info is talking about narrower CU count, high clock and constant energy use to push for optimal utilisation and work done.

So, even if the PS5 drops clock resulting in 9.2TF for optimum work done - it won’t, it will be just under 10TF in all likelihood – the Ps5 is going to get more real-work done in TF than XsX throughout the next-gen and almost certainly by some margin based on Cerny’s talk. I would certainly reconsider that view if Xbox were able to show the bandwidth graphs of Gears and RT Minecraft on XsX to prove they are getting close to 75% max memory bandwidth (0.75 * 560GB/s) and show they can sustain 75% or more utilisation of their 12 TF in gameplay, but it seems they are happier to have the assumed hardware victory for marketing, rather than a real victory – which is a shame because more people from Playstation ranks like me would probably buy both next-gen if the XsX is really the more impressive hardware.
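For reference, the arithmetic behind the percentages being argued about here. The TF figures are the publicly stated peaks; anything about sustained utilisation is exactly the unknown being debated above.

```python
# The arithmetic behind the numbers in this argument. Peak TF figures are the
# publicly stated ones; sustained utilisation is speculation either way.

XSX_TF = 12.15           # 52 CUs @ 1.825 GHz
PS5_TF = 10.28           # 36 CUs @ up to 2.23 GHz
PS5_TF_WORST = 9.2       # the downclocked figure people keep quoting

print(f"XSX over PS5 (peak clocks): {(XSX_TF / PS5_TF - 1) * 100:.1f}%")        # ~18.2%
print(f"XSX over PS5 (9.2 TF case): {(XSX_TF / PS5_TF_WORST - 1) * 100:.1f}%")  # ~32.1%

# The "75% of max bandwidth" target mentioned for XSX's fast memory pool:
print(f"0.75 * 560 GB/s = {0.75 * 560:.0f} GB/s")   # 420 GB/s
```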
 

ethomaz

Banned
So the ESRAM bandwidth of Xbox One was 204GB/s vs the PS4's GDDR5 at 176GB/s, but XBone only had 32MB of fast RAM. That's truly an insane difference looking at it now.
ESRAM runs at 109GB/s (it can reach 192GB/s with simultaneous write and read).
It works like a big cache, but for a cache to cover the slow speed of DDR3 for the GPU, it would need to be way more than 32MB.
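For anyone wondering where those figures come from, a quick sketch of the usual derivation. The 800 MHz and 853 MHz clocks are the reported launch-spec and final GPU clocks; the 128 bytes per cycle is implied by the quoted 102/109 GB/s numbers, and the combined figures quoted at the time sat below the naive doubling because read and write can't overlap on every cycle. Treat it as back-of-envelope, not an official spec.

```python
# Where the commonly quoted Xbox One ESRAM bandwidth figures come from.
# 128 bytes per cycle is implied by the published 102/109 GB/s one-way numbers;
# the realistic combined figures quoted at the time (~192-204 GB/s peak,
# ~140-150 GB/s in real workloads) sit below the naive read+write doubling.

BYTES_PER_CYCLE = 128   # per direction

def esram_one_way_gb_s(clock_mhz: float) -> float:
    return clock_mhz * 1e6 * BYTES_PER_CYCLE / 1e9

for clock in (800, 853):
    one_way = esram_one_way_gb_s(clock)
    print(f"{clock} MHz: {one_way:.0f} GB/s read or write, "
          f"{2 * one_way:.0f} GB/s if every cycle did both")
# 800 MHz: 102 GB/s read or write, 205 GB/s if every cycle did both
# 853 MHz: 109 GB/s read or write, 218 GB/s if every cycle did both
```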
 

ph33rknot

Banned
I think 15-20% with such little info is too generous and probably in the wrong direction. The reason the PS3 and PS4’s prowess in compute were important for comparison was because the Playstation consoles aligned or bettered their Xbox rival hardware in the other metrics, so the difference really looked usable. The PS4 and Xbox One aren’t very exotic (Esram is a bit exotic) so even without minute details, comparing effective capability was pretty straight forward. The PS3 was very exotic, but because all the IBM Cell SDK and design/development info was fully documented and available to all, it was very easy to compare in a fair way to a Tri-Core 2-way PPC chip (PowerMac or IBM entry web server).

At the moment TF doesn’t represent a realistic comparison of average throughput for XsX, because the memory setup and absence of I/O complex instinctively tells us that it is going to have a large difference between max and average TF real-work done. Where as the PS5 info is talking about narrower CU count, high clock and constant energy use to push for optimal utilisation and work done.

So, even if the PS5 drops clock resulting in 9.2TF for optimum work done - it won’t, it will be just under 10TF in all likelihood – the Ps5 is going to get more real-work done in TF than XsX throughout the next-gen and almost certainly by some margin based on Cerny’s talk. I would certainly reconsider that view if Xbox were able to show the bandwidth graphs of Gears and RT Minecraft on XsX to prove they are getting close to 75% max memory bandwidth (0.75 * 560GB/s) and show they can sustain 75% or more utilisation of their 12 TF in gameplay, but it seems they are happier to have the assumed hardware victory for marketing, rather than a real victory – which is a shame because more people from Playstation ranks like me would probably buy both next-gen if the XsX is really the more impressive hardware.
I'd say 15-20% without a thorough explanation is underestimating it.
 
I think 15-20% with such little info is too generous and probably in the wrong direction. The reason the PS3 and PS4’s prowess in compute were important for comparison was because the Playstation consoles aligned or bettered their Xbox rival hardware in the other metrics, so the difference really looked usable. The PS4 and Xbox One aren’t very exotic (Esram is a bit exotic) so even without minute details, comparing effective capability was pretty straight forward. The PS3 was very exotic, but because all the IBM Cell SDK and design/development info was fully documented and available to all, it was very easy to compare in a fair way to a Tri-Core 2-way PPC chip (PowerMac or IBM entry web server).

At the moment TF doesn’t represent a realistic comparison of average throughput for XsX, because the memory setup and absence of I/O complex instinctively tells us that it is going to have a large difference between max and average TF real-work done. Where as the PS5 info is talking about narrower CU count, high clock and constant energy use to push for optimal utilisation and work done.

So, even if the PS5 drops clock resulting in 9.2TF for optimum work done - it won’t, it will be just under 10TF in all likelihood – the Ps5 is going to get more real-work done in TF than XsX throughout the next-gen and almost certainly by some margin based on Cerny’s talk. I would certainly reconsider that view if Xbox were able to show the bandwidth graphs of Gears and RT Minecraft on XsX to prove they are getting close to 75% max memory bandwidth (0.75 * 560GB/s) and show they can sustain 75% or more utilisation of their 12 TF in gameplay, but it seems they are happier to have the assumed hardware victory for marketing, rather than a real victory – which is a shame because more people from Playstation ranks like me would probably buy both next-gen if the XsX is really the more impressive hardware.
It just drives me crazy that some people really think Cerny was explaining fucking thermal variability like a new paradigm, instead of thinking "hey, maybe I didn't get shit".
It is indeed a simplification starting from the opposite directions of the two GPUs; TFs are just an average of a single function. We don't have a solid application of SmartShift; if that 10% is even only half true, you're already seeing the difference get smaller and smaller. We don't know the real application of an SSD because no game is designed around them, we don't know how clocks scale because RDNA2 isn't out, we know almost nothing about the RT solutions on both or about those mystic cache scrubbers the PS5 has, and overall we don't have exact data even on audio management, OS, or APIs (especially regarding PS5); we don't even know if the PS5 still has something completely unknown to show. Even regarding RAM, people are already making calculations about 10 GB and such, while I find it kind of naive to try to calculate what the workload will be for next gen. Both Sony and MS are confident SSDs can make up for RAM anyway and explained how, but that should also tell you that neither is all that confident in the RAM alone.
People are going crazy over some TFs and bandwidth, but those are the most simplistic things and the least exciting.
 