
What's next for Hard Drive/Storage/SSD Tech?

CuNi

Member
It’s been proven already with the likes of Ratchet (lots more on screen), the Spider-Man demo (fast swinging movement) and the original UE5 demo (8K texture streaming).

No, it hasn't been proven. Fast movement was possible before, and Ratchet looked good, but it's not "never before seen graphics".
It's just that none of this had ever been done on a console at the same time. Can you really say, 100%, that all of those feats are solely down to the SSD? No, obviously not.
We got two things at once with the new console gen: stronger consoles and SSDs. You can't attribute every upgrade in fidelity to one component when every piece of hardware is being pushed to its limits.
 

Sosokrates

Report me if I continue to console war
I'm still waiting for the SSD to be "key to next gen". Cerny is a smart dude, so I'm pretty sure there's truth in what he says.
 

Mercador

Member
johnnymnemonic-jonny-nemonic.gif
How much was it already? 320GB?
 

ChazAshley

Gold Member

Let's go
 

bitbydeath

Member
No, it hasn't been proven. Fast movement was possible before, and Ratchet looked good, but it's not "never before seen graphics".
It's just that none of this had ever been done on a console at the same time. Can you really say, 100%, that all of those feats are solely down to the SSD? No, obviously not.
We got two things at once with the new console gen: stronger consoles and SSDs. You can't attribute every upgrade in fidelity to one component when every piece of hardware is being pushed to its limits.
Explain Cyberpunk. That was a clear hardware-limitation situation; SSDs would have pushed it well beyond what it achieved.
 

CuNi

Member
Explain Cyberpunk. That was a clear hardware-limitation situation; SSDs would have pushed it well beyond what it achieved.

I see that completely differently.
Cyberpunk was anything but a hardware-limitation issue. If we're honest, it had the same problem Star Citizen has: it sounded good on paper, but they were overambitious and simply didn't account for roadblocks. They also rushed the release for reasons only they know. Cyberpunk is also what I'd call a "victim" of today's game development, where so many games get released in a state I'd describe as early access at best. People know the saying "too big to fail", but as recent games have shown, that only works for companies, not games. We've hit a point where so many big games fail around the same time that we really should take a good look and ask ourselves why. Hint: it's not because they were cross-gen.
 
No, it hasn't been proven. Fast movement was possible before, and Ratchet looked good, but it's not "never before seen graphics".
It's just that none of this had ever been done on a console at the same time. Can you really say, 100%, that all of those feats are solely down to the SSD? No, obviously not.
We got two things at once with the new console gen: stronger consoles and SSDs. You can't attribute every upgrade in fidelity to one component when every piece of hardware is being pushed to its limits.

Before file I/O subsystems (at both the hardware and software level) were significantly redesigned, if a game had lots of data the player MIGHT see but you weren't sure they WOULD see, you had to pre-cache big chunks of it in RAM to prevent stalls, hangs, or significant pop-in. In other words, only a portion of the RAM was actually serving what the player was immediately doing, and since you can only fit (and compress) so much data into RAM, that amount still acted as a hard limit.

With SSDs and the accompanying I/O hardware subsystems and file I/O software, RAM that would previously have been allocated as a cache is freed up, meaning games get more RAM capacity for immediately pertinent graphics and asset data. You no longer need to reserve 1 GB or whatever for data the player might reach 30 seconds from now; you can let that 1 GB serve the area the player is actually in, then quickly replace it with new data as it's needed. That wasn't possible before SSDs and, more importantly, revamped memory I/O subsystems and restructured file I/O.
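To put rough numbers on that budget shift, here's a minimal C++ sketch; the 12 GB budget and 4 GB speculative cache are made-up figures for illustration, not specs from any console:

```cpp
// Minimal sketch of the RAM-budget shift described above.
// All figures are illustrative assumptions, not real console specs.
#include <cstdio>

int main() {
    const double ram_budget_gb  = 12.0; // assumed RAM available to the game
    const double speculative_gb = 4.0;  // assets the player MIGHT need ~30 s from now

    // HDD era: speculative assets must already sit in RAM, or the player hits pop-in,
    // so only part of the budget serves the scene the player is actually in.
    const double hdd_scene_budget = ram_budget_gb - speculative_gb;

    // Fast SSD + revamped I/O: the same assets can be streamed in on demand,
    // so nearly the whole budget serves the current scene.
    const double ssd_scene_budget = ram_budget_gb;

    std::printf("HDD-style split: %.1f GB for the current scene, %.1f GB held back as cache\n",
                hdd_scene_budget, speculative_gb);
    std::printf("SSD-style split: %.1f GB for the current scene, streamed just-in-time\n",
                ssd_scene_budget);
    return 0;
}
```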

Basically, just go rewatch Mark Cerny's Road to PS5 if you want a short but simple overview of what these new technologies can actually do (and, in part, are already helping to enable: trackside detail in the new Forza Motorsport is much better than the extremely simple, basic detail in Forza 7 not just because of the GPU/CPU power increase, but because the memory subsystem and file I/O can refresh bigger chunks of RAM magnitudes more quickly than the previous generation of consoles could). A powerful GPU and CPU don't mean much if the bottleneck stalling your pipeline is an HDD barely doing 100 MB/s with no decompression support of any kind.
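To put that bottleneck in perspective, a quick back-of-the-envelope calculation in C++; the chunk size and throughput figures are ballpark assumptions, not measurements:

```cpp
// Back-of-the-envelope: how long does it take to refill a chunk of RAM
// at different storage speeds? Figures are ballpark assumptions only.
#include <cstdio>

struct Device {
    const char* name;
    double gb_per_s;
};

int main() {
    const double chunk_gb = 8.0; // assumed amount of asset data to swap in

    const Device devices[] = {
        {"Last-gen HDD (~100 MB/s, no decompression)", 0.1},
        {"NVMe SSD, raw (~5.5 GB/s)", 5.5},
        {"NVMe SSD + hardware decompression (~9 GB/s effective)", 9.0},
    };

    for (const Device& d : devices) {
        std::printf("%-55s -> %6.1f s to refill %.0f GB\n",
                    d.name, chunk_gb / d.gb_per_s, chunk_gb);
    }
    return 0;
}
```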

Anyway, to answer the OP's question: I personally don't think any of that stuff is the future of storage. The posters basically saying things will stay the same but the drives will get faster are probably correct. Other subtle parts will improve as well, but the real improvements will come from the technologies leveraging SSDs:

- True cache coherency over PCIe with some form of CXL (preferably 3.0)
- Standardization of decompression ICs for offloading decompression tasks from the CPU (will be vital for lower-powered devices)
- Opening up decompression IC access to peripherals other than just SSDs (microSD cards and USB drives, for example)
- Potentially leveraging some form of more advanced PNM (Processing-Near-Memory) for higher-tier SSDs, where the drive can have its own block of integrated RAM and processing logic to process the data before sending it over a CXL 3.0-layered PCIe link (taking the stress of decompression and data processing off the CPU/GPU, maybe having the decompression IC built into the SSD itself, and decentralizing the storage I/O process more or less completely from the device accessing the storage)
Those things, should they happen, are going to have a much more meaningful impact than just increasing capacity tenfold (which would likely bring worse performance and certainly less write endurance). I'm hoping these things, in particular serious PNM and especially PIM (Processing-In-Memory) architecture designs, become prevalent in at least one of the 10th-gen consoles, where, along with chiplets and a switch to better memories (HBM-based ones, maybe with some NVRAM thrown in for specific caches), they will help bring big performance increases without blowing those systems up into 400-watt monstrosities just to be competitive. But that is another type of conversation altogether. A rough sketch of the decompression-offload idea is below.
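For what the second bullet could look like from a game's point of view, here's a hypothetical C++ sketch; the DecompressionUnit type, its submit() call, and the sizes are all invented for illustration and don't correspond to any real console or PC API:

```cpp
// Hypothetical sketch of the "decompression IC" idea from the list above:
// instead of the CPU inflating every block, the request is handed to a dedicated
// unit and the CPU only sees a completion. The types and calls here are invented
// for illustration; they do not correspond to any real API.
#include <cstdint>
#include <cstdio>
#include <vector>

struct CompressedBlock {
    std::vector<std::uint8_t> bytes;  // compressed payload (contents irrelevant here)
    std::size_t uncompressed_size;    // how big it is once inflated
};

// Path 1: the CPU does the work, burning cycles per block.
std::vector<std::uint8_t> decompress_on_cpu(const CompressedBlock& block) {
    // Placeholder for a real codec (zlib, Kraken, ...); just allocates the output.
    return std::vector<std::uint8_t>(block.uncompressed_size);
}

// Path 2: a dedicated decompression unit does the work; the CPU submits the
// request and the data simply shows up in the destination buffer.
struct DecompressionUnit {
    void submit(const CompressedBlock& block, std::vector<std::uint8_t>& destination) {
        destination.resize(block.uncompressed_size); // stand-in for DMA into RAM
    }
};

int main() {
    const CompressedBlock block{std::vector<std::uint8_t>(1024), 4096};

    const std::vector<std::uint8_t> cpu_result = decompress_on_cpu(block);

    DecompressionUnit unit;
    std::vector<std::uint8_t> offload_result;
    unit.submit(block, offload_result);

    std::printf("CPU path: %zu bytes, offload path: %zu bytes\n",
                cpu_result.size(), offload_result.size());
    return 0;
}
```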
 

Sosokrates

Report me if I continue to console war
It's interesting that despite all the ultra-fast SSD hype, Epic, who have been making game engines for decades, only require about 300 MB/s for their latest engine to perform well.
 

CuNi

Member
Before file I/O subsystems (at both the hardware and software level) were significantly redesigned, if a game had lots of data the player MIGHT see but you weren't sure they WOULD see, you had to pre-cache big chunks of it in RAM to prevent stalls, hangs, or significant pop-in. In other words, only a portion of the RAM was actually serving what the player was immediately doing, and since you can only fit (and compress) so much data into RAM, that amount still acted as a hard limit.

With SSDs and the accompanying I/O hardware subsystems and file I/O software, RAM that would previously have been allocated as a cache is freed up, meaning games get more RAM capacity for immediately pertinent graphics and asset data. You no longer need to reserve 1 GB or whatever for data the player might reach 30 seconds from now; you can let that 1 GB serve the area the player is actually in, then quickly replace it with new data as it's needed. That wasn't possible before SSDs and, more importantly, revamped memory I/O subsystems and restructured file I/O.

Basically, just go rewatch Mark Cerny's Road to PS5 if you want a short but simple overview of what these new technologies can actually do (and, in part, are already helping to enable: trackside detail in the new Forza Motorsport is much better than the extremely simple, basic detail in Forza 7 not just because of the GPU/CPU power increase, but because the memory subsystem and file I/O can refresh bigger chunks of RAM magnitudes more quickly than the previous generation of consoles could). A powerful GPU and CPU don't mean much if the bottleneck stalling your pipeline is an HDD barely doing 100 MB/s with no decompression support of any kind.

Anyway, to answer the OP's question: I personally don't think any of that stuff is the future of storage. The posters basically saying things will stay the same but the drives will get faster are probably correct. Other subtle parts will improve as well, but the real improvements will come from the technologies leveraging SSDs:

- True cache coherency over PCIe with some form of CXL (preferably 3.0)
- Standardization of decompression ICs for offloading decompression tasks from the CPU (will be vital for lower-powered devices)
- Opening up decompression IC access to peripherals other than just SSDs (microSD cards and USB drives, for example)
- Potentially leveraging some form of more advanced PNM (Processing-Near-Memory) for higher-tier SSDs, where the drive can have its own block of integrated RAM and processing logic to process the data before sending it over a CXL 3.0-layered PCIe link (taking the stress of decompression and data processing off the CPU/GPU, maybe having the decompression IC built into the SSD itself, and decentralizing the storage I/O process more or less completely from the device accessing the storage)
Those things, should they happen, are going to have a much more meaningful impact than just increasing capacity tenfold (which would likely bring worse performance and certainly less write endurance). I'm hoping these things, in particular serious PNM and especially PIM (Processing-In-Memory) architecture designs, become prevalent in at least one of the 10th-gen consoles, where, along with chiplets and a switch to better memories (HBM-based ones, maybe with some NVRAM thrown in for specific caches), they will help bring big performance increases without blowing those systems up into 400-watt monstrosities just to be competitive. But that is another type of conversation altogether.

I never claimed SSDs have no benefit whatsoever.
I'm just saying that even with infinite bandwidth, we are still CPU- and GPU-limited.
You could have every needed asset ready the very instant it's requested and still wouldn't be able to render "infinite" detail, because at the end of the day the GPU still needs to render it all and the CPU still needs to handle things like AI, audio, etc.
So yes, SSDs do have an effect on, let's call it "detail density", but they are not doubling or tripling it. Most of the upgrade over last gen comes from the better CPU and GPU. The SSD obviously helps them with faster response times and faster delivery of requested data, but even 7 GB/s (I think that was the figure?) is nothing compared to RAM bandwidth, which is far higher still.
The latency issue could also have been addressed by simply having bigger RAM and stuffing more into it, since consoles have unified RAM. With an added loading screen to pre-fill the RAM, this would result in the very same gameplay experience. The only cases where it wouldn't be equal (or even faster) are loading saves from different parts of the game or teleporting around, since the RAM would need to be pre-filled again; once you were in gameplay, there'd be no difference.
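How long such a pre-fill loading screen would take is easy to estimate. A small C++ sketch, using the 7 GB/s figure quoted above plus an assumed 16 GB of unified RAM and ~100 MB/s for a last-gen HDD (all illustrative, not measured):

```cpp
// Rough arithmetic behind the "bigger RAM + loading screen" idea above.
// The 16 GB figure and the HDD speed are assumptions for illustration only;
// 7 GB/s is the SSD figure quoted in the post.
#include <cstdio>

int main() {
    const double ram_to_prefill_gb = 16.0; // hypothetical enlarged unified RAM
    const double ssd_gb_per_s      = 7.0;  // figure quoted above
    const double hdd_gb_per_s      = 0.1;  // ~100 MB/s last-gen HDD ballpark

    std::printf("Pre-filling %.0f GB from an SSD: ~%.0f s loading screen\n",
                ram_to_prefill_gb, ram_to_prefill_gb / ssd_gb_per_s);
    std::printf("Pre-filling %.0f GB from an HDD: ~%.0f s loading screen\n",
                ram_to_prefill_gb, ram_to_prefill_gb / hdd_gb_per_s);
    return 0;
}
```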

That doesn't mean the SSD is obsolete or anything; it's simply a more "user-friendly" solution to the latency issue, since you avoid loading screens and can teleport around or load saves from very different parts of the game.
But it is not, and never was, the "only" way to achieve the level of detail games like Ratchet have.
 

Drew1440

Member
What happened to HVD discs, the Blu-ray successor that could store 200GB-1.6TB per disc?

Or Archival Disc, which Sony and Panasonic developed and which can store 300GB-1TB? I guess optical media is a dying format. Just imagine the kind of games we would get if 1TB of storage were the standard.
 

Emedan

Member
Personally (and this might be slightly off topic), I was recently blessed with gigabit fibre and find less need to store massive files and games locally.

But regarding the OP, I think they'll just get faster, and what's around now is plenty fast enough, for gaming anyway.
Yeah, this is it. Developments in internet speeds have made bigger storage for ordinary consumers largely unnecessary. I have less storage in my PC now than I did 15 years ago; I simply have no need to store files anymore when redownloading them takes 10 minutes rather than 10 hours. Music and video have moved to streaming as well, so no need for storage there; photos are in the cloud, and so on. So for personal computers I don't really see the point of increased storage.

Regarding the topic at hand, I think it sounds too good to be true and probably isn't viable for mass adoption any time soon.
 