
PS5's SSD is "far ahead" of those found in high-end PCs, according to Epic Games CEO Tim Sweeney

VFXVeteran

Banned
Yup, makes sense that 1440p is a target the hardware was comfortable with.

Do you think that Pixel rate and Texel rate in GPUs in tech like nanite will be a big factor in performance? It would gel with the idea that AMD will be pushing clocks in their next cards, as I think PS5 high clock is really just the beginning and we will see PC GPUs, at least from AMD, hitting higher frequencies than that.

I'm not sure the clock rate would matter a whole lot. Meaning, turning up the clock 15-20% wouldn't necessarily translate to more triangles or more pixels. It could, though. I think the clock rate would affect the lighting/shading side of things more. And while Nanite stole the spotlight, we still have to contend with lighting/shading.
 

ToadMan

Member
Yep, from the CEO of Epic Games.
Tim added “PC will catch up with faster SSDs late this year” ;)

By the PS5's launch, PCs will already have hardware that delivers better performance than the PS5.

PS5 does 5.5GB/s uncompressed and 8-9GB/s compressed transfer.

PCs may be able to get to 5.5GB/s this year, and will still have a more circuitous path to the GPU to trundle through, making them less efficient than the PS5.

PCs won't get 8-9GB/s of compressed data for several years, because that would need a PCIe standard revision or a CPU manufacturer to include hardware decompression, which is unlikely.
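For scale, the quoted numbers work out roughly like this (back-of-the-envelope Python; the ~1.55x average compression ratio is just inferred from the 5.5 -> 8-9GB/s figures, not an official spec):

```python
# Back-of-the-envelope: effective throughput with the claimed compression.
RAW_GBPS = 5.5                 # quoted uncompressed rate
COMPRESSION_RATIO = 1.55       # assumed average; 5.5 * 1.55 ≈ 8.5 GB/s

def seconds_to_stream(gigabytes, gbps):
    """Time to deliver a given amount of uncompressed asset data."""
    return gigabytes / gbps

effective = RAW_GBPS * COMPRESSION_RATIO
print(f"effective ≈ {effective:.1f} GB/s")              # lands inside the 8-9 GB/s claim
print(f"16 GB of assets in ≈ {seconds_to_stream(16, effective):.1f} s")
```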
 
Last edited:

ethomaz

Banned
Everything affects GPU performance. If the GPU were a non-factor, this demo should have run at 8K/60fps. Point me to where they talk about the GPU being a non-factor.
You are confusing the GPU render and the Nanite tech.
The GPU render is there like in any other game... the Nanite tech, in simple terms, adds several million (actually they used billions of) triangles to that render... the GPU was not used to render these triangles; the only work is to insert them into the render sent to the screen.
Of course there are exceptions: when they believe GPU rasterization is faster, they use GPU power, but most triangles are done via software rasterization.

"The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit," explains Brian Karis. "As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders."

Epic says that Nanite will automatically generate "as much geometric detail as the eye can see" without needing the GPU to render it itself... they said it will "allow developers to diversify that geometry endlessly based on any source asset."
 

MHubert

Member
...because nobody marketing a product has ever lied about it...never happened...ever...
The fact that many of you cannot see this for the bought and paid for marketing spin it is tells me that it was genius. I doubt their claims because unlike many in the console space I have actual hands on experience with SSDs. I know what their strengths are, and what their weaknesses are. I can point to scenarios where they can outpace a HDD by 50x...for a short while anyway...and ones where the margins are more like 2x...regardless of theoretical bandwidth limits.

SSDs are not magic bullets. They have their limits. No amount of Sony secret sauce is going to change the fact that unless you're doing continuous sequential reads and writes of large, singular, files, the controller is going to bog down, the DRAM cache is going to fill, and the drive is going to get hot...and when any one of those things happen...POOF...your theoretical limits don't mean jack shit.
I just find it a bit odd that instead of adding a larger GPU, Sony somehow decided to bank on and invest millions in a solution that is only meant to fool you. How are you able to believe that?
... And doesn't the PS5 controller essentially mitigate those exact problems you are describing?
 

Thirty7ven

Banned
I'm not sure the clock rate would matter a whole lot. Meaning, turning up the clock 15-20% wouldn't necessarily translate to more triangles or more pixels. It could, though. I think the clock rate would affect the lighting/shading side of things more. And while Nanite stole the spotlight, we still have to contend with lighting/shading.

But won't pixel rate and texel rate affect Nanite performance in particular?
 

phil_t98

#SonyToo
I don't disagree with that. But the way the PS guys are reacting is completely overblown. I'm sure this UE5 demo will be able to run at equal or even better fidelity on a high-end PC. I also think the Xbox guys have nothing to worry about.
The only thing he's quoted as saying is that HDDs will be a downgrade; the rest is all implied by everybody else.
 

MHubert

Member
I don't disagree with that. But the way the PS guys are reacting is completely overblown. I'm sure this UE5 demo will be able to run at equal or even better fidelity on a high-end PC. I also think the Xbox guys have nothing to worry about.
If you mean the ones that think the SSD is going to help improve resolution and framerate, then I agree.

I think it is safe to say that Xbox will have an edge in resolution and framerate, and ps5 will have an edge in data streaming.
 
Last edited:
I still remember all the discussions back in 2008 or so, talking about how ultra-powerful and revolutionary the Cell processor of the PS3 was, how it was a game changer, how it would crush the X360, how it would make PCs obsolete, yada yada.

Worst part is that console makers are probably fully aware that people in these forums fight for their plastic boxes, and probably feed into it to groom that sweet brand loyalty.

Cell you say? I seem to remember big EDRAM and Power of the Cloud boners circa 2014.

AT LEAST Sony's over compensation has been realistic! I buy the benefits of multi core CPUs and bandwidth shit.

As of today I'm leaning Xbox, like the good old days. I like that they brought Yakuza on stage; hope they get Judgment soon. All that said, fuck the cloud, that shit was as fake as the Kinect Minority Report hands. God, we are all so fucking stupid, but everyone in the Xbox One era that bought the bait were total rubes.
 

DeepEnigma

Gold Member
PS5 does 5.5GB/s uncompressed and 8-9GB/s compressed transfer.

PCs may be able to get to 5.5GB/s this year, and will still have a more circuitous path to the GPU to trundle through, making them less efficient than the PS5.

PCs won't get 8-9GB/s of compressed data for several years, because that would need a PCIe standard revision or a CPU manufacturer to include hardware decompression, which is unlikely.

This is rumored in the Ryzen pipeline, so don't shit on my future build hopes!
 

ethomaz

Banned
PS5 does 5.5GB/s uncompressed and 8-9GB/s compressed transfer.

PCs may be able to get to 5.5GB/s this year, and will still have a more circuitous path to the GPU to trundle through, making them less efficient than the PS5.

PCs won't get 8-9GB/s of compressed data for several years, because that would need a PCIe standard revision or a CPU manufacturer to include hardware decompression, which is unlikely.
SSDs launching late this year are 7-9GB/s without compression.

PCI-E 4.0 can do about 2GB/s per lane... a 16x slot tops out around 32GB/s per direction (64GB/s in total).

The usual PCI-E slot used for SSDs is 4x... on PCI-E 4.0 that can carry SSDs up to about 8GB/s... SSDs faster than that would need an 8x slot.

PCI-E 4.0 is still overkill for that.
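Taking the rough ~2GB/s-per-lane figure at face value, the slot math looks like this (illustrative only; real PCIe 4.0 is closer to 1.97GB/s per lane after encoding overhead):

```python
# PCIe 4.0 ballpark: ~2 GB/s of usable bandwidth per lane, per direction.
GBPS_PER_LANE = 2.0

def slot_bandwidth(lanes, both_directions=False):
    """Rough peak bandwidth of a PCIe 4.0 slot with the given lane count."""
    bw = lanes * GBPS_PER_LANE
    return bw * 2 if both_directions else bw

assert slot_bandwidth(4) == 8.0                       # typical M.2 NVMe slot
assert slot_bandwidth(8) == 16.0                      # what a >8 GB/s SSD would need
assert slot_bandwidth(16, both_directions=True) == 64.0
```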
 
Last edited:
... And doesn't the PS5 controller essentially mitigate those exact problems you are describing?
No. The PS5's SSD controller cannot lift the limits that cause it to bog down when receiving lots of requests for little random files strewn across the logical space, each of which it has to take the time to find... each... individual... one. It can raise those limits... but not eliminate them. It cannot change the fact that the faster the controller runs, the hotter it gets... to say nothing of the fact that modern high-bandwidth NAND flash kicks out a ton of heat of its own... and it can't change the fact that even MLC NAND (which is prohibitively expensive for a console; it'll most likely be TLC or QLC) is nowhere near fast enough to saturate that kind of bandwidth... hell, it can't even saturate a 3.5GB/s PCIe 3.0 x4 connection... so once the cache runs dry and it starts reading/writing directly to/from the NAND flash, it's going to slow down.
 

ethomaz

Banned
I'm saying that what you said has nothing to do with my post, or his quote in my post.

Source for the no need for GPU rasterization, please.
"The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit," explains Brian Karis. "As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders."

They use software rasterization instead of GPU hardware rasterization for the vast majority of the triangles.
 
Last edited:
I think you missed that Nanite is not using the GPU rasterizer to render the triangles coming from storage to the screen.

It uses software rasterization, some of it (my guess) probably on the CPU or async compute.

The critical path is how much data you can send from storage.
That is not what Epic is saying...
Maybe you should ask or talk with them, because they even said it won't affect GPU performance.
The GPU only adds them to the render... the processing of these triangles is done by software running on the CPU or async compute.
Everything affects GPU performance. If the GPU were a non-factor, this demo should have run at 8K/60fps. Point me to where they talk about the GPU being a non-factor.

This is not what I'm understanding either: generating a triangle at the pixel level is a function of the GPU. Generating a pixel is a function of the GPU as well.
 

MHubert

Member
No. The PS5's SSD controller cannot lift the limits that cause it to bog down when receiving lots of requests for little random files strewn across the logical space, each of which it has to take the time to find... each... individual... one. It can raise those limits... but not eliminate them. It cannot change the fact that the faster the controller runs, the hotter it gets... to say nothing of the fact that modern high-bandwidth NAND flash kicks out a ton of heat of its own... and it can't change the fact that even MLC NAND (which is prohibitively expensive for a console; it'll most likely be TLC or QLC) is nowhere near fast enough to saturate that kind of bandwidth... hell, it can't even saturate a 3.5GB/s PCIe 3.0 x4 connection... so once the cache runs dry and it starts reading/writing directly to/from the NAND flash, it's going to slow down.
So what would you say is the real speed of Sony's SSD solution?
 

ethomaz

Banned
This is not what I'm understanding either: generating a triangle at the pixel level is a function of the GPU. Generating a pixel is a function of the GPU as well.
So what do you understand, then, when they say Nanite is not using hardware rasterization but software rasterization instead, for a performance gain?
The GPU receives the triangles already generated... that is the trick of Nanite.
 
Last edited:

kiphalfton

Member
So the SSD is the marketing this gen. Got it!!

Which is interesting as this totally flies in the face of the general consensus that the GPU was the most important thing...

The fact the SSD has been so heavily marketed this upcoming gen makes me think the other components may suck. No way these SSDs are that impressive, when a Samsung 970 Pro 1TB NVMe SSD costs $350 and is the best of the best.
 

Spukc

always chasing the next thrill
Man, I hope when I build my new PC I'll get access to this magical SSD.
It would be bad if the 3080 Ti won't be able to keep up with PS5 games.

Might as well not build a new PC.
 
Man, I hope when I build my new PC I'll get access to this magical SSD.
It would be bad if the 3080 Ti won't be able to keep up with PS5 games.

Might as well not build a new PC.
A 3080 Ti with shit components won't do anything for you. Why is everyone so offended that Sony has made advancements in this particular bottleneck? Question: if you could include Sony's custom I/O in your system with a 3080 Ti for free, would you?
 

Lux R7

Member
It would be bad if the 3080 Ti won't be able to keep up with PS5 games.

 
So what do you understand, then, when they say Nanite is not using hardware rasterization but software rasterization instead, for a performance gain?
The GPU receives the triangles already generated... that is the trick of Nanite.

Software rasterization is instructions given to components of the hardware to turn shapes into pixels and images. So rather than the GPU turning them into pixels and images, other parts of the hardware (such as the CPU) are doing it. But this is unique to Unreal Engine 5, not to PlayStation 5. If the level of detail is determined by billions of triangles, the generation of those triangles is done by the GPU. So why can't the XSX and PC generate the same amount of triangles?
 
Last edited:

pyrocro

Member
"The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit," explains Brian Karis. "As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders."

They use software rasterization instead of GPU rasterization for the vast majority of the triangles.
Oh, I see where you are getting confused.
The word "software" there is referring to the compute shader they wrote -> it's a piece of software.
GPUs have dedicated hardware rasterization blocks, but from what I understand, passing data through those blocks can happen in parallel with other things.
What he is saying (and I would like someone else to verify) is that they don't go through the hardware rasterizer, because for most of their use cases the software compute route is faster, and in the other cases the hardware rasterizer is.

Compute units are on the GPU, so it's the GPU doing the processing, just with their code/software.

Oh, and guess which console has more compute units to do this sort of thing. (Just a little twist of the knife there, you like that?)
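For anyone wondering what "software rasterization" even means here: it's general-purpose code deciding which pixels a triangle covers, instead of the fixed-function raster units. A toy CPU version of the classic edge-function approach (purely illustrative, nothing like Nanite's actual compute shaders):

```python
def edge(ax, ay, bx, by, px, py):
    # Signed area test: which side of edge A->B the point P falls on.
    return (px - ax) * (by - ay) - (py - ay) * (bx - ax)

def rasterize(tri, width, height):
    """Return the set of pixel coordinates whose centres a triangle covers."""
    (ax, ay), (bx, by), (cx, cy) = tri
    covered = set()
    for y in range(height):
        for x in range(width):
            px, py = x + 0.5, y + 0.5           # sample at the pixel centre
            w0 = edge(bx, by, cx, cy, px, py)
            w1 = edge(cx, cy, ax, ay, px, py)
            w2 = edge(ax, ay, bx, by, px, py)
            # Inside if all three edge functions agree in sign.
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or (w0 <= 0 and w1 <= 0 and w2 <= 0):
                covered.add((x, y))
    return covered

# A right triangle filling the lower-left half of an 8x8 grid.
pixels = rasterize(((0, 0), (8, 0), (0, 8)), 8, 8)
print(len(pixels), "pixels covered")   # 36
```

A GPU compute shader doing the same job just runs that inner test massively in parallel, one thread per pixel (or per small tile).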
 

pawel86ck

Banned
Which is interesting as this totally flies in the face of the general consensus that the GPU was the most important thing...

The fact the SSD has been so heavily marketed this upcoming gen makes me think the other components may suck. No way these SSDs are that impressive, when a Samsung 970 Pro 1TB NVMe SSD costs $350 and is the best of the best.
Dealer has compared his 970 PRO to the XSX SSD, and it was over 3x slower (3-5 seconds on the XSX, 15 seconds on the 970 PRO). Keep in mind the PS5's SSD has over 2x the raw speed of the XSX's, so the difference will be even bigger.
 
Last edited:

MHubert

Member
Man, I hope when I build my new PC I'll get access to this magical SSD.
It would be bad if the 3080 Ti won't be able to keep up with PS5 games.

Might as well not build a new PC.
A fun quote from my friend who works as a professional animator, and just bought a 3000 dollar PC:
"Global illumination on so many pixels.. I'm sitting with 2 million polygons and my PC chugs at 10 fps. Guess I have to upgrade to a PS5..."
Meant as a joke, obviously ;)
 

Three

Member
You guys are completely generalistic in your claims and that's 99% because you don't know how the 3D graphics pipeline works.

I'll say it again one last time.

The SSD is a storage medium. Having a gazillion bytes of data on it doesn't mean the computer processes more data, or that it determines fidelity.
But there is way more to this than just the speed of the SSD transferring data into VRAM. The decisions about HOW much data to stream, HOW fast you want your target render frame to be, WHAT resolution you want to render at, etc... ALL depend on the limitations of the GPU/CPU/RAM.

In short, can an SSD play a role in the overall fidelity of a 3D image? Absolutely! Is it the main component that allows this fidelity? Absolutely NOT! It certainly helps, but it's not more important than all of the main systems that drive it (i.e. CPU/GPU).

Nobody is saying it is more important than anything else. What's with the constant strawman arguments and appeals to authority, like you're the only person that knows "how a graphics pipeline works"? You said that it cannot affect fidelity at all, when it can actually be the main factor in texture quality or variety when traversing, or the main factor in level variety. Are you Alex? Answer that question too, because he was making the exact same false claims you are. When I asked you to watch a GDC talk, you didn't answer, then or now. I've asked you to look at the GDC talk for Spiderman. You said you would take a look but disappeared. I'll timestamp it this time for you.



Watch it from there. Look at what isn't loaded when traversing, due to the HDD speed. Make the connection to this Unreal 5 demo and its detailed fast-traversal section, and draw some parallels. Or, failing that, look at the leaked Spiderman PS5 demo. Don't tell me storage doesn't affect texture detail by jumping to some strawman about framerate or resolution, because it very clearly does. Nobody is saying you will get higher framerates or more pixels, so stop with that ridiculous strawman. Also, when highlighting things I've said, make sure to get the full context and realise I was very specific. Don't then pretend I was generalistic when it's you who chose to ignore it.

A lot of games stream data constantly nowadays: God of War, TLOU, Spiderman. They will benefit greatly from a fast SSD. Devs make compromises on texture quality and variety due to storage speed, as is clearly seen in the Spiderman GDC talk.
 
Last edited:
Software rasterization is instructions given to components of the hardware to turn shapes into pixels and images. So rather than the GPU turning them into pixels and images, other parts of the hardware (such as the CPU) are doing it. But this is unique to Unreal Engine 5, not to PlayStation 5. If the level of detail is determined by billions of triangles, the generation of those triangles is done by the GPU. So why can't the XSX and PC generate the same amount of triangles?
I'm no expert, but my understanding is that the data that makes up those triangles needs to be read from somewhere. If it takes too long to read the data, it won't matter how fast you can render it. Let's say in one frame you need 5.5GB/s but can only read 2.4GB/s. You won't have the data fast enough to render it, even though your system can render more triangles.
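The per-frame arithmetic behind that point, using the consoles' quoted raw figures (illustrative only; no real game streams anywhere near the full rate every frame):

```python
# MB of data a drive can deliver inside a single frame at a given frame rate.
def mb_per_frame(gbps, fps):
    return gbps / fps * 1024   # GB/s divided across frames, in MB

for label, gbps in (("PS5 raw", 5.5), ("XSX raw", 2.4)):
    print(f"{label}: ~{mb_per_frame(gbps, 30):.0f} MB available per 30fps frame")
```

So at 30fps the quoted raw rates give roughly 188 MB vs 82 MB of fresh data per frame: the renderer can only draw what arrived in time.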
 
I'm no expert, but my understanding is that the data that makes up those triangles needs to be read from somewhere. If it takes too long to read the data, it won't matter how fast you can render it. Let's say in one frame you need 5.5GB/s but can only read 2.4GB/s. You won't have the data fast enough to render it, even though your system can render more triangles.

This is where the 112GB/s of extra RAM bandwidth from the XSX's 10GB of GDDR6 comes in, no? Isn't that faster, and more data, than the SSD in both the XSX and PS5?
 
This is where the 112GB/s of extra RAM bandwidth from the XSX's 10GB of GDDR6 comes in, no? Isn't that faster, and more data, than the SSD in both the XSX and PS5?
Granted your numbers hold up in practice, and if the data already resides in RAM, then yes, you are correct. However, the data needs to be loaded into and out of RAM. This is where the PS5 has made industry-leading advancements, IMO. The scene where the girl is flying through the map is a good example of needing new data and quickly removing old data.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
I don't disagree with that. But the way the PS guys are reacting is completely overblown. I'm sure this UE5 demo will be able to run at equal or even better fidelity on a high-end PC. I also think the Xbox guys have nothing to worry about.

We KNOW it'll run on PC. Who's saying it won't?
 

banjo5150

Member
I'm no expert, but my understanding is that the data that makes up those triangles needs to be read from somewhere. If it takes too long to read the data, it won't matter how fast you can render it. Let's say in one frame you need 5.5GB/s but can only read 2.4GB/s. You won't have the data fast enough to render it, even though your system can render more triangles.

That's how I look at it also, but I could be totally wrong.

I would love to know if the likes of Cerny or any of these system architects read these forums, and how many people they would say are completely wrong, versus "this guy actually knows what he is talking about". LOL
 
So what would you say is the real speed of Sony's SSD solution?
Unfortunately it's not as easy as just pulling a number out of my ass. The answer depends on a lot. Peak, average, or sustained? What kind of workload are we talking about? Random or sequential? Reads or writes? A mix? TLC or QLC? (Or, god forbid... PLC?) How many chips? Bandwidth of the chips? Size of the cache? DRAM or SLC? Does it even have a cache? (Almost definitely yes... cacheless SSDs exist, but they suck.) There's a reason SSD manufacturers advertise peak bandwidth in ideal conditions and then put an asterisk next to it. To be clear, I'm sure their SSD can reach the peak speeds they're using in their marketing... but there's a difference between what it can reach briefly when the wind is blowing in just the right direction and nobody in the room has farted for at least an hour, and when the room is on fire. Where exactly it will fall when the drive begins to choke, and it will, is anyone's guess; it depends on far too many factors we just don't know yet.
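You can see the sequential-vs-random gap even from user space. A crude Python sketch (not a proper benchmark: the OS page cache, queue depth, and drive state all skew the numbers):

```python
import os
import random
import tempfile
import time

BLOCK = 4096     # read size per request
COUNT = 2048     # number of blocks (8 MB scratch file)

# Write a scratch file of COUNT blocks of random data.
fd, path = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(os.urandom(BLOCK * COUNT))

def timed_read(offsets):
    """Read one BLOCK at each offset, unbuffered, and time the whole pass."""
    with open(path, "rb", buffering=0) as f:
        start = time.perf_counter()
        for off in offsets:
            f.seek(off)
            f.read(BLOCK)
        return time.perf_counter() - start

all_offsets = [i * BLOCK for i in range(COUNT)]
seq = timed_read(all_offsets)                          # in-order pass
rnd = timed_read(random.sample(all_offsets, COUNT))    # shuffled pass
print(f"sequential: {seq:.4f}s  random: {rnd:.4f}s")
os.remove(path)
```

On a cold HDD the random pass is dramatically slower; on an SSD the gap shrinks but doesn't vanish, which is exactly why "peak GB/s" alone doesn't describe a drive.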
 

sircaw

Banned
These forums are fucking crazy sometimes. I remember after the Cerny video, people were calling him a liar and saying he was basically not telling the truth, etc etc.

Now we get a video showing basically what Cerny was talking about, and people are calling that bullshit and a lie too. That Sweeney guy doesn't have a clue either, etc etc etc.

Imagine going to a court of law.

The prosecuting team: "Here are the facts. We've got handwritten testimonials, sworn affidavits from vetted professionals, and video evidence to support the claim, your honour."

The defence team: "We've got feelings and armchair experts, your honour. There is no way we can be wrong."

The judge: "How did this even get to my courtroom?"
 
Last edited:

sendit

Member
Unfortunately it's not as easy as just pulling a number out of my ass. The answer depends on a lot. Peak, average, or sustained? What kind of workload are we talking about? Random or sequential? Reads or writes? A mix? TLC or QLC? (Or, god forbid... PLC?) How many chips? Bandwidth of the chips? Size of the cache? DRAM or SLC? Does it even have a cache? (Almost definitely yes... cacheless SSDs exist, but they suck.) There's a reason SSD manufacturers advertise peak bandwidth in ideal conditions and then put an asterisk next to it. To be clear, I'm sure their SSD can reach the peak speeds they're using in their marketing... but there's a difference between what it can reach briefly when the wind is blowing in just the right direction and nobody in the room has farted for at least an hour, and when the room is on fire. Where exactly it will fall when the drive begins to choke, and it will, is anyone's guess; it depends on far too many factors we just don't know yet.

Agreed. Way too many variables to consider, including how effective the solutions Sony has in place will be. Along with how they plan on cooling the PS5's components (which we still don't know yet), software will be a key determining factor, including a deep-dive presentation on how, or whether, the SSD architecture (not just the SSD, but everything involved in getting data from SSD to VRAM) in the PS5 is able to solve problems that existed prior to this generation.
 
Last edited:
These forums are fucking crazy sometimes. I remember after the Cerny video, people were calling him a liar and saying he was basically not telling the truth, etc etc.

Now we get a video showing basically what Cerny was talking about, and people are calling that bullshit and a lie too. That Sweeney guy doesn't have a clue either, etc etc etc.

Imagine going to a court of law.

The prosecuting team: "Here are the facts. We've got handwritten testimonials, sworn affidavits from vetted professionals, and video evidence to support the claim, your honour."

The defence team: "We've got feelings and armchair experts, your honour. There is no way we can be wrong."

The judge: "How did this even get to my courtroom?"

That's not what people are talking about. What people are pointing out is Sweeney saying that this is only possible due to the PS5's SSD, and not possible on high-end PCs and the XSX without compromising or lowering the level of detail and fidelity. That is just false, because it can be achieved by other means, such as more RAM and higher bandwidth. Are you telling me that 1-2 years from now, NVIDIA Ampere and Hopper and AMD RDNA 3 will still struggle to achieve this, at 1440p and less than 30fps, because SSDs are the bottleneck?
 
Last edited:

sircaw

Banned
That's not what people are talking about. What people are pointing out is Sweeney saying that this is only possible due to the PS5's SSD, and not possible on high-end PCs and the XSX without compromising or lowering the level of detail and fidelity. That is just false, because it can be achieved by other means, such as more RAM and higher bandwidth. Are you telling me that 1-2 years from now, NVIDIA Ampere and Hopper and AMD RDNA 3 will still struggle to achieve this, at 1440p and less than 30fps, because SSDs are the bottleneck?

In one to two years? Is that not moving the goalposts, though? I mean, is it not obvious that just through the evolution and advancement of hardware we will get to that point?

As far as I know, Sweeney's point is about now, the tech that is available now. Why are you not trusting his word?

It is his engine. I'm sure he understands it better than anyone in the world, plus he has access to the PS5.

I don't have those things. Do you?

I mean, what does it take for someone to prove something?

People have already pointed out that the SSD in the PS5 is far better than anything on the PC market atm. Why is it such a hard pill to swallow that, atm, a piece of tech in another device is better? It will be coming to the PC in the not too distant future.
 
Last edited:
In a nutshell, the PS5 and XSX SSDs are acting like the RAM in the original Xbox from 2001. That's the best explanation I can think of. Not everything that you see when you play a game:

-Needs to be fully rendered
-Needs a high level of detail, if you are going to pass through it, it's in the distance, or it's hiding in the dark
-Needs high bandwidth
-Needs to be in RAM.

This frees up the main RAM from crap and prevents waste on stuff you don't need. That makes room for the things you do want to see: realistic grass, trees, car reflections, skin textures, blah blah blah.
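That "SSD as near-RAM" idea is basically a cache: keep only what's in view resident in RAM and evict the rest, re-streaming from the SSD on demand. A toy sketch (asset names and sizes are made up):

```python
from collections import OrderedDict

class StreamingPool:
    """Toy model: a RAM pool that evicts the least-recently-used assets."""
    def __init__(self, capacity_mb):
        self.capacity = capacity_mb
        self.used = 0
        self.resident = OrderedDict()   # asset name -> size in MB

    def request(self, asset, size_mb):
        if asset in self.resident:
            self.resident.move_to_end(asset)   # still in view: keep it hot
            return "hit"
        while self.used + size_mb > self.capacity:
            _, evicted_size = self.resident.popitem(last=False)
            self.used -= evicted_size          # evict what fell out of view
        self.resident[asset] = size_mb         # "stream in" from the SSD
        self.used += size_mb
        return "miss"

pool = StreamingPool(capacity_mb=100)
print(pool.request("rock_a", 60))     # miss: streamed in
print(pool.request("statue_b", 60))   # miss: evicts rock_a to make room
print(pool.request("rock_a", 60))     # miss again: had to re-stream it
```

The faster the drive, the cheaper each "miss" is, which is why a fast SSD lets the pool stay small without visible pop-in.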
 

VFXVeteran

Banned
You are confusing the GPU render and the Nanite tech.
The GPU render is there like in any other game... the Nanite tech, in simple terms, adds several million (actually they used billions of) triangles to that render... the GPU was not used to render these triangles; the only work is to insert them into the render sent to the screen.
Of course there are exceptions: when they believe GPU rasterization is faster, they use GPU power, but most triangles are done via software rasterization.

You don't understand that the compute shaders are still done on the GPU. Which means the PS5/XSX/PC all have limits determined by the GPU.

Read up on DirectCompute:


"The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit," explains Brian Karis. "As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders."

All the graphics boards now have primitive shaders or mesh shaders (PC). This has nothing to do with the PS5's SSD.

Epic says that Nanite will automatically generate "as much geometric detail as the eye can see" without needing the GPU to render it itself... they said it will "allow developers to diversify that geometry endlessly based on any source asset."

The GPU isn't rendering the Nanite geometry, but it is used to process the triangles. It's just not going through the conventional 3D graphics pipeline.
 
Last edited:

ethomaz

Banned
Software rasterization is instructions given to components of the hardware to turn shapes into pixels and images. So rather than the GPU turning them into pixels and images, other parts of the hardware (such as the CPU) are doing it. But this is unique to Unreal Engine 5, not to PlayStation 5. If the level of detail is determined by billions of triangles, the generation of those triangles is done by the GPU. So why can't the XSX and PC generate the same amount of triangles?
Because the SSD can't feed the software rasterization as fast as the PS5's can.
They were clear about that.
 

Guilty_AI

Member
Cell you say? I seem to remember big EDRAM and Power of the Cloud boners circa 2014.

AT LEAST Sony's over compensation has been realistic! I buy the benefits of multi core CPUs and bandwidth shit.

As of today I'm leaning Xbox, like the good old days. I like that they brought Yakuza on stage; hope they get Judgment soon. All that said, fuck the cloud, that shit was as fake as the Kinect Minority Report hands. God, we are all so fucking stupid, but everyone in the Xbox One era that bought the bait were total rubes.
Hey, I'm not taking sides here. TBH I don't remember much from the PS4/Xbone disputes, since I was in the middle of caring less and less about this console war bullshit.
 

ethomaz

Banned
You don't understand that the compute shaders are still done on the GPU.

Read up on DirectCompute:




All the graphics boards now have primitive shaders or mesh shaders (PC). This has nothing to do with the PS5's SSD.



The GPU isn't rendering the nanite, but it is used to process the triangles. It's just not going through the conventional 3D graphics pipeline.
Again, the vast majority of triangles don't use the GPU rasterizer.
In specific cases it will use it, and hardware with Mesh/Primitive Shader support will use those for the exceptions.

Your link has nothing to do with Nanite.

With Nanite even weaker GPUs can use these generated triangles, and what defines the number (millions, billions, etc.) of triangles is how fast you can feed them from storage.

The PS5's SSD clearly has an advantage over both Xbox and PC today for this new tech.
 
Last edited:

Mozza

Member
Come on, children. I'm sure both the PS5 and the Xbox Series X will run the latest game engines and produce lovely graphics. All this fighting is funny, but please keep doing it. ;)
 
Last edited:
In one to two years? Is that not moving the goalposts, though? I mean, is it not obvious that just through the evolution and advancement of hardware we will get to that point?

As far as I know, Sweeney's point is about now, the tech that is available now. Why are you not trusting his word?

It is his engine. I'm sure he understands it better than anyone in the world, plus he has access to the PS5.

I don't have those things. Do you?

I mean, what does it take for someone to prove something?

People have already pointed out that the SSD in the PS5 is far better than anything on the PC market atm. Why is it such a hard pill to swallow that, atm, a piece of tech in another device is better? It will be coming to the PC in the not too distant future.

There is nothing about the PS5's SSD helping out graphics that I doubt from this tech demo. What I do doubt is him making the outrageous claim that this is simply not possible because the SSDs in PCs and the XSX are just not fast enough. It is dangerous to 'believe' such a thing, because even talented developers have hidden agendas, with marketing ploys aimed at gullible and impressionable consumers. The best rebuttal to this tech demo is to DEMONSTRATE it with another Unreal 5 tech demo from MSFT and other developers on XSX and PC. That will eventually come in time.
 

VFXVeteran

Banned
Nobody is saying it is more important than anything else. What's with the constant strawman arguments and appeals to authority, like you're the only person that knows "how a graphics pipeline works"? You said that it cannot affect fidelity at all, when it can actually be the main factor in texture quality or variety when traversing, or the main factor in level variety. Are you Alex? Answer that question too, because he was making the exact same false claims you are. When I asked you to watch a GDC talk, you didn't answer, then or now. I've asked you to look at the GDC talk for Spiderman. You said you would take a look but disappeared. I'll timestamp it this time for you.



Watch it from there. Look at what isn't loaded when traversing, due to the HDD speed. Make the connection to this Unreal 5 demo and its detailed fast-traversal section, and draw some parallels. Or, failing that, look at the leaked Spiderman PS5 demo. Don't tell me storage doesn't affect texture detail by jumping to some strawman about framerate or resolution, because it very clearly does. Nobody is saying you will get higher framerates or more pixels, so stop with that ridiculous strawman. Also, when highlighting things I've said, make sure to get the full context and realise I was very specific. Don't then pretend I was generalistic when it's you who chose to ignore it.

A lot of games stream data constantly nowadays: God of War, TLOU, Spiderman. They will benefit greatly from a fast SSD. Devs make compromises on texture quality and variety due to storage speed, as is clearly seen in the Spiderman GDC talk.


I gave you guys a very thorough explanation of the role of the SSD. I'm not going to keep repeating myself. Believe what you want to believe.
 