
How much difference does the improved SSD + io make in series S/X and PS5?

PaintTinJr

Member
Good post. What is the source for the flash controller having 8 arm cores? (By the way these aren't the cores of the I/O complex of course.)
Didn't we have someone do a thread on here a week ago, showing a tweet of someone identifying the manufacturer and part number of the PS5's SSD flash controller chip from a photo? Which, when cross-referenced with the manufacturer's parts list, was an 8-core ARM chip IIRC.
 

Lysandros

Member
Didn't we have someone do a thread on here a week ago, showing a tweet of someone identifying the manufacturer and part number of the PS5's SSD flash controller chip from a photo? Which, when cross-referenced with the manufacturer's parts list, was an 8-core ARM chip IIRC.
Thanks, I see - I must have missed the post. Do we have any info about the core count of the XSX flash controller?
 

PaintTinJr

Member
Thanks, I see - I must have missed the post. Do we have any info about the core count of the XSX flash controller?
I'm not sure the controller specs of the Phison PS5019-E19T - which someone outed on a LinkedIn CV as being the XSX controller - were ever confirmed, but if that was accurate, I think it's dual-core ARM and DRAM-less.
 

JimRyanGOAT

Member
There's a technique called motion matching in animation that solves exactly the problem you're describing here. It was developed by Ubisoft and first used in For Honor. To my knowledge, the second game to use it was The Last of Us Part II. I don't know how many games have implemented it since, but Naughty Dog made brilliant use of this technique in TLOU2 - the animations were quite "next-gen" on a last-gen console. With the SSD, I'm sure they can iterate more and do even better anims.
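
For a rough idea of what the technique actually does, here's a toy Python sketch of the core loop (the feature layout and names are invented for illustration, not Ubisoft's or Naughty Dog's actual code):

[CODE=python]
# Motion matching in miniature: every frame, build a query from the
# character's current pose + desired future trajectory, then find the
# animation-database frame whose features match it best and continue
# playback from there. (Illustrative sketch; features are made up.)

import math

def feature_distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def best_match(query, database):
    """Brute-force nearest-neighbour search over all animation frames.
    Real implementations accelerate this with kd-trees or pruning."""
    best_i, best_d = 0, float("inf")
    for i, frame_features in enumerate(database):
        d = feature_distance(query, frame_features)
        if d < best_d:
            best_i, best_d = i, d
    return best_i

# Toy database: each frame is [hip_vel_x, hip_vel_z, future_dir_x, future_dir_z].
database = [
    [0.0, 0.0, 0.0, 0.0],  # idle
    [0.0, 2.0, 0.0, 1.0],  # jog forward
    [1.5, 1.5, 0.7, 0.7],  # jog diagonal
    [0.0, 4.0, 0.0, 1.0],  # sprint forward
]

# Player pushes the stick forward hard: the query wants fast forward motion.
query = [0.1, 3.8, 0.0, 1.0]
print("jump playback to frame", best_match(query, database))  # -> 3 (sprint)
[/CODE]

The expensive part is the per-frame search over a large animation database, which is where extra CPU headroom and faster streaming of bigger databases can help.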

Interesting

But that's proprietary tech, right?

Wonder what other sports games will use to take advantage of the better CPUs and faster SSDs

Do you remember Digital Molecular Matter from the PS3/Xbox 360 era?

The tech seemed so ahead of its time but due to politics it was never widely adopted. Seems so many devs just wanna play it safe using what they already know with incremental updates

 

PaintTinJr

Member
Allandor

After reading your reply, it feels like quoting it would be wrongly validating the strawman arguments you've made.

To claim someone using specific terms with technical meanings is using marketing buzzwords - without listing your supposed buzzwords - is disingenuous, because the terms were used by Cerny, and he's not marketing anything other than how smart and technically proficient he is. So please state "the buzzwords" you take issue with.

Your comment about the cache scrubbers and priority levels indicates you are in no way experienced in data comms. If you don't understand that lessening memory bandwidth contention in one area impacts everything that shares/contends for that bandwidth - such as the IO complex (and by extension the SSD controller and SSD), the CPU and the GPU - then it seems you're going to wave away whatever I write.
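
To make the contention point concrete, here's a toy model in Python (the 448GB/s figure is the PS5's quoted GDDR6 peak, used purely for scale; every traffic figure is a placeholder, not a measurement):

[CODE=python]
# Shared-bus contention in miniature: trimming one client's redundant
# traffic raises the headroom every other client on the bus sees.
# (Traffic figures are made up; only the 448 GB/s peak is the real spec.)

TOTAL_BW = 448.0  # GB/s, PS5's quoted GDDR6 peak

def gpu_headroom(clients):
    """Bandwidth left for the GPU after the other clients take their share."""
    return TOTAL_BW - sum(clients.values())

# Without scrubbers: assume whole GPU caches get flushed and refetched
# wholesale after SSD streaming overwrites memory (made-up figure).
before = {"cpu": 35.0, "io_complex": 9.0, "redundant_refetch": 20.0}
# With scrubbers: only the stale lines are evicted, so far less refetch.
after = {"cpu": 35.0, "io_complex": 9.0, "redundant_refetch": 2.0}

print(f"GPU headroom without scrubbers: {gpu_headroom(before):.0f} GB/s")  # 384
print(f"GPU headroom with scrubbers:    {gpu_headroom(after):.0f} GB/s")   # 402
[/CODE]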

The latency of the IO complex's ESRAM is what the CPU LLC and GPU LLC see - indirectly, through the GDDR6 latency - when they need data, not the latency hiding of the SSD flash controller or the connected SSD.

You've then made a further strawman about write speeds and wear and tear on the SSD NAND chips in the "use it sort of like RAM" scenario, when the data that needs to use the SSD as an extension of RAM is static 3D geometry and texture data that won't be written back to disk - data Epic, in their UE5 Nanite info, said makes up about 90% of a typical game scene when they described Nanite data as effectively immutable in game. To anyone capable of game programming watching the Road to PS5, it was obvious which data needed the streaming capability.
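
A minimal sketch of that read-only streaming idea in Python (class and function names invented for illustration): because the data is immutable, evicting a page is just dropping it - nothing is ever written back, so the scheme adds no NAND write wear.

[CODE=python]
# "SSD as an extension of RAM" for immutable assets, in miniature.
# Key property: pages are read-only, so eviction needs no write-back
# and the SSD only ever services reads. (Illustrative; names invented.)

from collections import OrderedDict

class ReadOnlyAssetCache:
    """LRU cache of resident asset pages backed by a read-only source."""

    def __init__(self, capacity_pages, read_page_from_ssd):
        self.capacity = capacity_pages
        self.read_page = read_page_from_ssd  # stands in for an SSD read
        self.resident = OrderedDict()        # page_id -> bytes, LRU order

    def get(self, page_id):
        if page_id in self.resident:
            self.resident.move_to_end(page_id)  # mark as recently used
            return self.resident[page_id]
        if len(self.resident) >= self.capacity:
            # Evict the least-recently-used page: just drop it.
            self.resident.popitem(last=False)
        data = self.read_page(page_id)          # the only SSD traffic: a read
        self.resident[page_id] = data
        return data

# Toy backing store standing in for the SSD.
cache = ReadOnlyAssetCache(capacity_pages=2,
                           read_page_from_ssd=lambda pid: f"page-{pid}".encode())
for pid in [1, 2, 1, 3, 2]:  # page 2 is evicted and re-read, never written
    cache.get(pid)
print(list(cache.resident))  # -> [3, 2]
[/CODE]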

As for the "PS5 processor kits", that testing doesn't reflect the PS5 scenario in any shape or from, where the PS5 is offloading all IO and audio off the CPU, or match up with Cerny's concerns that heavy CPU use could use up to 35GB of memory bandwidth (IIRC), or that the CPU is expected to be under utilised by wattage, as it is expected to redirect excess power to the GPU for increased rasterizing . It is also a huge leap to suggest that slightly defective PS5 APU boards go direct to AMD sell off.

Sony could easily use the boards with lesser defects for other things - as DSP boards in their £10K 8K flagship TVs, to mention just one re-bin use - and what was left as waste for AMD is probably highly defective in comparison to the PS5 CPU, too.

As for the GDDR6 latency on a PS5 system, how can you possibly suggest some defective kit has the same memory timings? The board may be defective because of the GDDR6, for all we know.
 

Boglin

Member
The last bolded part is exactly what we are talking about. The fast-paced movement of Spiderman was a game mechanic. So there, bandwidth was the bottleneck.
But that's a symptom of game design and game mechanics. Like I said, take Ratchet & Clank as an example. You could most likely achieve the very same game if you put loading screens on the rifts.
Ofc that would destroy the flow of the game etc., but that's not the point. Bandwidth only matters for graphics if you decide on a fast-paced game, which is a game-mechanic decision. If you aim for the best graphics, you'll always get your GPU close to 100% utilization. The only difference would be that on an HDD the game would be slow-paced, something like Death Stranding, whereas with an SSD you could go for a more fast-paced game like Spiderman, or even a flight sim, because you'd see constantly changing geometry.

But all of the above is a game-mechanic decision. It doesn't allow you to create better-looking games. It allows you to create equally good-looking games with different gameplay.
Correct me if I'm wrong, but you're saying that when comparing a fast-paced game to another equally fast-paced game, the one with better bandwidth can have better fidelity. Isn't allowing the same gameplay with better graphics the same as allowing better-looking games?

The fidelity will improve because the SSDs and I/O of the new consoles let them free up memory that would normally be reserved for assets needed in upcoming scenes. I can guarantee that having more usable memory for the current scene will help improve graphics fidelity, since the Xbox Series X has 9x the teraflops of the Xbox One yet only 2x the memory.

The 3GB version of the GTX 1060 can reach 100% GPU utilization, but that doesn't mean the 6GB version with more memory is useless. By freeing up reserved memory, the consoles can mimic having more memory too.
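
Back-of-envelope version of that point in Python (every number is a placeholder, loosely echoing Cerny's Road to PS5 example of keeping the next ~30 seconds vs ~1 second of gameplay resident in RAM):

[CODE=python]
# How a faster drive frees RAM for the *current* scene: the reserve you
# must hold for upcoming assets shrinks with the lookahead window.
# (Placeholder numbers, not measured console figures.)

def reserve_gb(lookahead_seconds, content_gb_per_second):
    """RAM held back for assets the player might need soon."""
    return lookahead_seconds * content_gb_per_second

USABLE_RAM = 13.5    # rough game-visible RAM in GB (assumption)
CONTENT_RATE = 0.15  # asset data consumed per second of play (made up)

hdd_reserve = reserve_gb(30, CONTENT_RATE)  # slow seeks force a long lookahead
ssd_reserve = reserve_gb(1, CONTENT_RATE)   # NVMe lets you stay ~1s ahead

print(f"HDD: {USABLE_RAM - hdd_reserve:.2f} GB left for the visible scene")
print(f"SSD: {USABLE_RAM - ssd_reserve:.2f} GB left for the visible scene")
# -> roughly 9.00 vs 13.35 GB with these made-up numbers
[/CODE]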
 

Shmunter

Member
The I/O certainly alleviates the need for buffering reserves, leaving more RAM usable by the system. It’s good for all systems, but the S is still smaller than the others, so you do need to scale down, or scale up, or just use the S as the baseline and call it a day.
 

Shmunter

Member
We still need to see RTX IO in practice. But the obvious difference on the surface between it and the PS5 is that assets arrive in PS5 RAM already decompressed, via the I/O complex. RTX IO will need to load the compressed asset into VRAM and decompress it on board. The GPU will need to reserve VRAM as a scratchpad for this and run a workload; latency will be a factor and there is a cost - still light years better than the CPU loading it into DDR, decompressing it there and then copying it to VRAM.
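
Structurally the difference is just where the scratch buffer and the decompression work live. A toy Python comparison (every figure is invented and both pipelines are heavily simplified - as noted above, RTX IO's real behaviour hasn't been shown yet):

[CODE=python]
# Two decompression paths, side by side. (Illustrative sketch only.)

def ps5_io_complex_path(compressed_gb, ratio):
    # Decompression happens in the I/O complex, in flight:
    # data lands in unified RAM already decompressed.
    return {
        "steps": ["SSD -> I/O complex (decompress) -> RAM"],
        "vram_scratch_gb": 0.0,             # no GPU-side scratch needed
        "resident_gb": compressed_gb * ratio,
    }

def rtx_io_path(compressed_gb, ratio):
    # GPU decompression: the compressed blob must land in VRAM first,
    # so you pay a temporary scratch allocation plus a GPU workload.
    return {
        "steps": ["SSD -> VRAM (compressed)", "GPU decompress workload"],
        "vram_scratch_gb": compressed_gb,   # transient scratchpad cost
        "resident_gb": compressed_gb * ratio,
    }

for name, path in [("PS5 I/O complex", ps5_io_complex_path(1.0, 2.0)),
                   ("RTX IO (GPU)", rtx_io_path(1.0, 2.0))]:
    print(name, "->", path)
[/CODE]

Either way it beats the old CPU route, which paid the scratch cost in system RAM plus a CPU workload, plus an extra copy over the bus.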
 

Md Ray

Member
Interesting

But that's proprietary tech, right?
Nah, don't think so. I don't know the full details, but if it were proprietary tech then Naughty Dog wouldn't have been able to use it after Ubisoft.
Do you remember Digital Molecular Matter from the PS3/Xbox 360 era?
I wasn't following along back then, I was just a kid. :p
The tech seemed so ahead of its time but due to politics it was never widely adopted. Seems so many devs just wanna play it safe using what they already know with incremental updates


But wow, that's exactly the kind of realistic destruction I expect to see from these new consoles. It's a shame we don't see stuff like this in games nowadays, thanks to the PS4/XB1's weak CPUs. It definitely seemed ahead of its time.
 

hlm666

Member
Interesting

But that's proprietary tech, right?

Wonder what other sports games will use to take advantage of the better CPUs and faster SSDs

Do you remember Digital Molecular Matter from the PS3/Xbox 360 era?

The tech seemed so ahead of its time but due to politics it was never widely adopted. Seems so many devs just wanna play it safe using what they already know with incremental updates
It wasn't adopted because of baked lighting and shadows - you can see in that video it's not casting shadows. If you put a bunch of objects like that into a game with baked, realistic lighting, the whole scene ends up looking incoherent: static objects have realistic shadows and AO etc. while the dynamic objects have none, or you bake the lighting for the dynamic objects and it's all wrong once they're moved/broken. So devs either went with more interactive worlds and worse-looking visuals, or with better visuals and static environments. Visuals won in the AAA space, obviously.
 