
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

jroc74

Phone reception is more important to me than human rights
I don't think people have issue understanding that it's streaming in only the portions it needs, therefore leaving out data it doesn't need and increasing effective memory. That's the largest key benefit local hardware streaming has always had.

The issue I have is that numbers are never given within a well defined context. This is what Jason Ronald's article on Xbox.com states about SFS: "This innovation results in approximately 2.5x the effective I/O throughput and memory usage above and beyond the raw hardware capabilities on average. SFS provides an effective multiplier on available system memory and I/O bandwidth, resulting in significantly more memory and I/O throughput available to make your game richer and more immersive."

To me, the bolded sounds like it's being compared strictly against the peak hardware bandwidths and not against other similar streaming methods such as PRT or PRT+. Hopefully if I'm wrong, someone will correct me and show me a source that is more clear.

In the Xbox vs Playstation debates, it often sounds like people are comparing one machine that can render a scene with 2.5x fewer textures loaded (SFS) vs one that requires all the textures (no SFS), but I don't think that's an accurate representation of these consoles.

Shouldn't we be comparing a machine with SFS vs PRT+? I want to know if it is more akin to a 2.5x multiplier vs a 2.0x multiplier, or whatever the real numbers are.
Ok, so I wasn't the only one confused by this part, seeing recent posts about SFS. The only thing I'm clear about is that the VA is made up of four parts:

The Xbox Velocity Architecture comprises four major components: our custom NVME SSD, hardware accelerated decompression blocks, a brand new DirectStorage API layer and Sampler Feedback Streaming (SFS).
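For what it's worth, the arithmetic behind the multiplier claim is easy to sketch. The figures below are the published Series X numbers; whether the 2.5x applies on top of them or against a PRT-style baseline is exactly the ambiguity being discussed:

```python
# Published Series X figures (assumptions for this sketch):
RAW_SSD_GBPS = 2.4        # raw NVMe throughput
COMPRESSED_GBPS = 4.8     # with hardware decompression (BCPack/zlib)
SFS_MULTIPLIER = 2.5      # Microsoft's claimed average "effective" gain

# Reading 1: the 2.5x applies on top of compressed throughput.
effective_io = COMPRESSED_GBPS * SFS_MULTIPLIER
print(f"Effective I/O under reading 1: {effective_io:.1f} GB/s")  # 12.0 GB/s

# Reading 2: if the baseline already skips unneeded tiles (PRT-style),
# the net advantage of SFS over that baseline would be far smaller than 2.5x.
```

Which reading Microsoft intends is the open question in the posts above.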

 

Boglin

Member
Ok, so I wasn't the only one confused by this part, seeing recent posts about SFS. The only thing I'm clear about is that the VA is made up of four parts:



Yeah. I have no doubt that SFS is an evolution of the technologies it supersedes, but I'm unclear on the extent to which it's better.
 

IntentionalPun

Ask me about my wife's perfect butthole
Boglin : SFS is “PRT+” (or rather, the Velocity Architecture's SFS is a specific implementation of PRT+); but yes, that's what I've been saying as well, it all depends on what they are comparing against.

However I disagree that everyone understands the concept of the multipliers; people certainly pretend not to at least lol
 

SlimySnake

Flashless at the Golden Globes
UE4 and UE5 are great, versatile engines, but they won't replace proprietary engines, and they're always more taxing than they should be. Direct comments from a Returnal dev on random 1 fps drops:


This is a really neat video from IGN. Glad to see them get devs on the record. They even talked about how staying at 1080p 60 fps instead of pushing for 1440p 60 fps allowed them more headroom to bump up the visuals instead of just pixels. Music to my ears.
 

Bo_Hazem

Banned
This is a really neat video from IGN. Glad to see them get devs on the record. They even talked about how staying at 1080p 60 fps instead of pushing for 1440p 60 fps allowed them more headroom to bump up the visuals instead of just pixels. Music to my ears.

And man, that final image is so good that we need to erase every 1080p game from our brains! The distant-object sharpness was also crazy.
 

Boglin

Member
Boglin : SFS is “PRT+” (or rather, the Velocity Architecture's SFS is a specific implementation of PRT+); but yes, that's what I've been saying as well, it all depends on what they are comparing against.

However I disagree that everyone understands the concept of the multipliers; people certainly pretend not to at least lol
Ok, thanks for that. I was under the impression that PRT+ was more granular than PRT by streaming parts of mips but SFS went further still by caching mapping data and blended LODs for a smoother transition. I never actually read any research papers or anything so all my information is second hand.

Also, I think people can generally understand but choose to play dumb. Everyone seems to be experts for their favorite console but their skulls get 3x thicker when it comes time to absorb information about the competition.
 

IntentionalPun

Ask me about my wife's perfect butthole
Ok, thanks for that. I was under the impression that PRT+ was more granular than PRT by streaming parts of mips but SFS went further still by caching mapping data and blended LODs for a smoother transition. I never actually read any research papers or anything so all my information is second hand.

I’m not some expert but it’s what MS has in their SFS doc (that it’s also called PRT+.)


Terminology

Use of sampler feedback with streaming is sometimes abbreviated as SFS. It is also sometimes called sparse feedback textures, or SFT, or PRT+, which stands for “partially resident textures”.
 

kyliethicc

Member
Where did you get your info on the 1440p checkerboarding? I am not surprised that 1440p checkerboarding is less taxing than 1440p native. I AM surprised that 1440p checkerboarding is less taxing than 1080p. What happens when a 1440p checkerboarded image is displayed on a 1080p set? Artifacts might be seen when 4K checkerboard is shown on a 4k set but I'd imagine a 4k checkerboard image on a 1080p set would look pretty good. Perhaps this is what MS was going for when they were talking 1440p gaming.
Checkerboard rendering is rendering half the pixels of native res, per frame. Hence the name.

What else could you have thought it was?
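That per-frame accounting is easy to sketch (illustrative numbers only). It also shows why the "1080p 60 fps has the same pixel budget as 4K checkerboard 30 fps" point made elsewhere in the thread holds:

```python
def pixels_per_second(width, height, fps, checkerboard=False):
    """Pixels shaded per second; checkerboarding shades half the grid per frame."""
    per_frame = width * height
    if checkerboard:
        per_frame //= 2   # half the pixels rendered each frame
    return per_frame * fps

# 4K checkerboard at 30 fps shades exactly the same budget as native 1080p60:
print(pixels_per_second(3840, 2160, 30, checkerboard=True) ==
      pixels_per_second(1920, 1080, 60))                    # True

# Per frame, 1440p CB shades fewer raw pixels than native 1080p...
print(pixels_per_second(2560, 1440, 1, checkerboard=True))  # 1843200
print(pixels_per_second(1920, 1080, 1))                     # 2073600
# ...but the reconstruction/resolve passes add cost on top, which is why
# 1440p CB isn't automatically cheaper than 1080p native in practice.
```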
 

Hashi

Member

Godfavor

Member
The 8-9 GB/s figure was provided before Oodle Texture was announced. When games take advantage of both Kraken and Oodle Texture compression, it can reach 17.38 GB/s. The 8-9 GB/s figure is probably still in specifications all over the place, though.


Those are the original values given by Sony; it's much higher now because of the Oodle compression etc., which wasn't available at that time. I'm pretty sure Series X maxes out at 4.8, and that might also be an average at the same time. I could be wrong there, but as I remember it's 2.4 raw and a max of 4.8 using compression etc.

Do you have official sources stating that it can reach 17gb/sec with oodle?
 
Do you have official sources stating that it can reach 17gb/sec with oodle?
PS5 IO System to Be 'Supercharged' by Oodle Texture, Bandwidth Goes Up to 17.38GB/s (wccftech.com)

Generally a pro-Xbox site, and even they mention around 17 GB/s. It's simple maths. It also makes sense given that Cerny said the theoretical max throughput would be 22 GB/s. As you can see, they come closer to that number now because of Oodle Texture. Of course, 17 GB/s only holds in good conditions, and there will be situations where it's a bit more or less, but in general this number should be about right.
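The compression ratios these figures imply are easy to back out. The 5.5 GB/s raw number is from Sony's published spec; the ratios below are simply what the quoted throughputs imply, not measured values:

```python
RAW_GBPS = 5.5                     # PS5 raw SSD throughput (Sony spec)

kraken_ratio = 9.0 / RAW_GBPS      # implied by the original 8-9 GB/s claim
oodle_ratio = 17.38 / RAW_GBPS     # implied by the 17.38 GB/s figure

print(f"Implied Kraken-only ratio:        {kraken_ratio:.2f}x")      # 1.64x
print(f"Implied Kraken + Oodle Texture:   {oodle_ratio:.2f}x")       # 3.16x
# Cerny's 22 GB/s theoretical max corresponds to the 4x ceiling:
print(f"Ceiling at a 4x compression ratio: {RAW_GBPS * 4:.1f} GB/s")  # 22.0 GB/s
```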
 

ethomaz

Banned
Speaking of PS5’s lack of hardware VRS support:


Foveated rendering tries to mimic human vision: where your eyes focus, it renders at higher resolution/quality, while outside the focal point it renders at lower resolution/quality.

Our eyes work like that: everything at the focal point is sharp, but the unfocused parts are a bit blurry until you shift your focus.

You can notice a person approaching from the side, outside your focal point, but you can't read, for example, the words on his shirt until you focus on him.

An easy test is the famous trick of holding a pen in front of your eyes and focusing on it: only the pen will be sharp.

It's like motion blur, where ideally the renderer would be fast enough for our eyes to create the real motion blur effect themselves (not sure that's even possible), but instead devs use a cheap trick. In the case of this rendering tech it's the reverse: the render should be sharp everywhere and our eyes would create the blur effect themselves, so they created a cheap version of it with the tech.

That is a rendering tech, and I don't think it relates to VRS... plus on a TV, where more than one person can be watching, it makes no sense to use. It works in VR because it saves performance.
 

Sinthor

Gold Member
Hello, everyone. This is not gaming related, but I've been on this thread for a LONG time...way back at least before post #900. Look...just had an extended family member pass from cancer this morning. Won't hit the details, but just for all of the people who get a little too invested in threads like this and go after each other.... Once you've seen a person go from normal and healthy to looking like a WWII death camp survivor in the matter of two months and pass away....puts this shit in perspective. We all have our hobbies that we love and enjoy, but this shit isn't worth getting so fired up about. It could all end for any of us, any day. Try to make sure that what you're putting out into the world is positive as much as possible, because these things matter pretty much not at all in the grand scheme of life on this planet.
 

SlimySnake

Flashless at the Golden Globes
Hello, everyone. This is not gaming related, but I've been on this thread for a LONG time...way back at least before post #900. Look...just had an extended family member pass from cancer this morning. Won't hit the details, but just for all of the people who get a little too invested in threads like this and go after each other.... Once you've seen a person go from normal and healthy to looking like a WWII death camp survivor in the matter of two months and pass away....puts this shit in perspective. We all have our hobbies that we love and enjoy, but this shit isn't worth getting so fired up about. It could all end for any of us, any day. Try to make sure that what you're putting out into the world is positive as much as possible, because these things matter pretty much not at all in the grand scheme of life on this planet.
Sorry for your loss, Sinthor.
 

SlimySnake

Flashless at the Golden Globes
And man, that final image is so good that we need to erase every 1080p game from our brains! The distant-object sharpness was also crazy.
I think part of the reason the 1080p base image looks good here is because it's 1080p 60 fps. I highly doubt a 1080p 30 fps image would've looked that clean.

Also, the pixel budget for 1080p 60 fps is basically the same as the pixel budget for 4K CB 30 fps. I hope more and more devs go for 4K CB 30 fps and 1080p 60 fps performance modes.
 
Checkerboard rendering is rendering half the pixels of native res, per frame. Hence the name.

What else could you have thought it was?
I was told initially that 1440p CB was a less demanding resolution to render than 1080p native. I thought that was odd. Turns out it isn't true. Otherwise why would you ever render things at 1080p?
 

Loope

Member
Hello, everyone. This is not gaming related, but I've been on this thread for a LONG time...way back at least before post #900. Look...just had an extended family member pass from cancer this morning. Won't hit the details, but just for all of the people who get a little too invested in threads like this and go after each other.... Once you've seen a person go from normal and healthy to looking like a WWII death camp survivor in the matter of two months and pass away....puts this shit in perspective. We all have our hobbies that we love and enjoy, but this shit isn't worth getting so fired up about. It could all end for any of us, any day. Try to make sure that what you're putting out into the world is positive as much as possible, because these things matter pretty much not at all in the grand scheme of life on this planet.
My condolences, man. I've seen the same with my dad two years ago. You have a point there, of course. I think people sometimes just get caught up in the heat of the moment and say stupid shit.
 

Fafalada

Fafracer forever
Ok, thanks for that. I was under the impression that PRT+ was more granular than PRT by streaming parts of mips
Terminology around some of these things is in flux (and consequently a bit of a mess).
Best I can recall, PRT came into active use around 10 years ago (in game contexts, anyway), while streaming at mip-map granularity has been around for over 20 (again, in games), so those concepts were never meant to be interchangeable. The whole point of "partially resident" is that any given mip level isn't completely loaded, so it always referred to finer granularity (e.g. VRAM pages, or smaller).

Best way to contextualize SFS is 'hardware features used by PRT' - eg. Rage implemented sampler-feedback through a simple software rasterizer, allowing it to ship across entire stack of DX9 GPUs of the era. But obviously - at the cost of some CPU time. And while this may seem like a problematic concession, it's worth remembering that cutting edge engine-tech of 2007-ish era used software rasterization to also generate occlusion coverage, force propagation and more - so you'd get multiple uses from going down this route.
As for the multipliers - that's a really poor way of explaining the benefit (though it's obvious why they chose it). The key point is that any given frame in a 4K render accesses less (way less, actually) than 512MB of unique data, so if you can fetch optimally from the SSD (i.e. what SFS should allow you to get close to), that's all the RAM you'd ever need for 'asset storage'. Makes those 16GB suddenly look much bigger.
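A toy model of that point, with purely illustrative numbers (the tile counts are made up; only the 64 KiB page size is a standard tiled-resource figure):

```python
TILE_BYTES = 64 * 1024    # typical tiled-resource page size (64 KiB)

def resident_bytes(sampled_tiles):
    """Memory needed if exactly the sampled tiles are resident."""
    return len(set(sampled_tiles)) * TILE_BYTES

# Pretend the sampler-feedback map reported these (texture, tile) pairs
# over one frame: 200 textures, 40 tiles touched in each.
frame_samples = [(tex, tile) for tex in range(200) for tile in range(40)]
needed = resident_bytes(frame_samples)
print(f"Unique data this frame: {needed / 2**20:.0f} MiB")   # 500 MiB
```

Even with generous assumptions, one frame's unique working set lands in the hundreds of megabytes, which is the sense in which optimal streaming makes 16GB of RAM "look much bigger".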

So what component of the GPU shared on RDNA 2 PC cards, nvidia cards, and XSX/S APUs is responsible for hardware VRS tier 2?
It isn't necessarily the same component. VRS doesn't dictate implementation details, and there are multiple viable approaches to the problem. But all of them require certain extensions to existing hardware; think of it less as a 'new component' and more like how RDNA2 extends texture sampling to accelerate intersection math for RT, for example.
 

Lemondish

Member
Sounds more like foveated rendering to me. From a comment I read elsewhere:

For those not hip on VR tech, foveated rendering is an inevitable holy grail that will make it so that VR is an order of magnitude easier to run than games on traditional displays. The human eye can only resolve an area about the size of your thumbnail at arms length in full clarity. You can try just looking around you while paying attention to how much of your vision is actually in focus, or you can use this https://www.shadertoy.com/view/4dsXzM for a pretty instant example.

What this means is that it is unnecessary to render 95% of the image in full resolution because your eyes cannot resolve those details anyway. So now you have an image that is perfectly sharp in the 5% you are rendering at full res, but as soon as you move your pupils off center then it's going to be very noticeable. To combat this in HMDs like the Quest while also getting some perf benefit they essentially match the full res area with the sharpest center part of the lens and then gradually reduce res towards the edges where you're getting some optical blur/distortion anyway. This static style is called "fixed foveated rendering" or "lens matched shading" and it does improve performance but can still be noticeable and leaves a boatload of potential rendering savings on the table.

Now enter eye tracking that is fast and accurate enough to keep up with eye movements and you can render only the part of the display you are looking at in a specific moment in full resolution while going as far as filling in the rest with an ML inferred reconstruction based off of a sparse cloud. Once this gets fully worked out and solved, you would get an absolutely gamechanging rendering cost reduction (20x reduction according to FB's Michael Abrash and that's not out of line with other experts) while being indistinguishable from rendering the whole damn thing in full res at once. Gabe Newell a couple years back talked about how he believes we will hit a point where VR HMDs leapfrog traditional displays and anyone that wants to see the highest graphical fidelity will need to put one on their face, this is what he was talking about and it is going to happen eventually.
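A back-of-envelope version of that saving, with assumed (not measured) parameters: a full-rate fovea covering ~5% of the screen and a heavily reduced peripheral shading rate. The often-quoted 20x figure implies an even more aggressive peripheral reduction than this sketch uses:

```python
def foveated_cost(fovea_fraction=0.05, periphery_rate=1/32):
    """Relative shading cost vs. shading everything at full rate (= 1.0).
    fovea_fraction: share of the screen rendered at full rate.
    periphery_rate: shading rate applied to everything else."""
    return fovea_fraction * 1.0 + (1 - fovea_fraction) * periphery_rate

cost = foveated_cost()
print(f"Relative cost: {cost:.3f} (~{1/cost:.0f}x reduction)")  # 0.080, ~13x
```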
 

Allandor

Member
1440p CB is definitely more expensive than 1080p native in performance terms.
It is cheaper than 1440p by around 20-30%.

The doc I posted shows that pretty well.
Exactly. Because of the extra passes after the image is created, it's not the same as cutting the res in half (e.g. like simple interlacing would do; that would have the same "resolution" but wouldn't be processed any further). Checkerboarding is a bit more (and also more effective) than "just" interlacing + TAA. But for the sake of simplicity, I'm ok with such a description :)
 
Yeah. I have no doubt that SFS is an evolution to the technologies it supercedes but I'm unclear of the extent that it's better.
No need to be unclear; just watch the latest Microsoft Game Stack demonstration of it in action versus a highly optimized (even unrealistically so) gen 9 console texture streaming system with a fast SSD but without Sampler Feedback. Pay attention to when they make clear that the advantages do not change with more visually complex games. The percentage improvement over a texture streaming system without Xbox's Sampler Feedback Streaming remains the same.
 
Hello, everyone. This is not gaming related, but I've been on this thread for a LONG time...way back at least before post #900. Look...just had an extended family member pass from cancer this morning. Won't hit the details, but just for all of the people who get a little too invested in threads like this and go after each other.... Once you've seen a person go from normal and healthy to looking like a WWII death camp survivor in the matter of two months and pass away....puts this shit in perspective. We all have our hobbies that we love and enjoy, but this shit isn't worth getting so fired up about. It could all end for any of us, any day. Try to make sure that what you're putting out into the world is positive as much as possible, because these things matter pretty much not at all in the grand scheme of life on this planet.

Truly very sorry to hear about your loss. And, yes, this stuff really doesn't need to be as toxic as it often becomes. Hell, I've had someone send me a DM regarding my most recent post in this thread saying all kinds of fucked up shit about dying in a fire, plus accusations of working for Microsoft, and they almost certainly post in this thread too. Now obviously I'm built from much stronger stuff than that, so none of it remotely bothers me, but it's sad. Gaming is a hobby that we should have fun with and be able to talk about respectfully without all the bullshit. That said, as admirable a goal as that may be, many can't, or simply refuse to, ever do so. Again, truly sorry to hear about your loss.
 

Godfavor

Member
No need to be unclear; just watch the latest Microsoft Game Stack demonstration of it in action versus a highly optimized (even unrealistically so) gen 9 console texture streaming system with a fast SSD but without Sampler Feedback. Pay attention to when they make clear that the advantages do not change with more visually complex games. The percentage improvement over a texture streaming system without Xbox's Sampler Feedback Streaming remains the same.
I wonder why MS uses the term SFS and not PRT+ if they are the same thing. Even their own explanation page says SFS can also be called PRT+. PRT (partially resident textures) has been known for years. The only differences I found are that SFS streams mips calculated in real time according to the distance of the camera view, and that it has a solution for temporarily substituting mips that arrive late. Not sure if PRT+ can also do both of these.

I guess SFS is a unified solution that will be usable by all devs as part of the API feature set, rather than relying on each specific engine's own streaming (like Unreal Engine 5's).

Edit: link of the terminology: https://microsoft.github.io/DirectX-Specs/d3d/SamplerFeedback.html
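The "substituting mips that arrive late" idea can be sketched in a few lines. This is a minimal illustration with hypothetical function and parameter names, not the actual API: if the requested mip's data isn't resident yet, sample the nearest coarser mip that is, instead of stalling or showing missing texture data.

```python
def sample_with_fallback(resident_mips, requested_mip, num_mips):
    """Return the mip level actually sampled: the requested one if resident,
    otherwise the nearest coarser mip that is (coarsest assumed always resident).
    Mip 0 is the highest resolution; higher indices are coarser."""
    for mip in range(requested_mip, num_mips):
        if mip in resident_mips:
            return mip
    return num_mips - 1

print(sample_with_fallback({3, 4, 5}, 1, 6))        # 3 (mips 1-2 still loading)
print(sample_with_fallback({1, 2, 3, 4, 5}, 1, 6))  # 1 (already resident)
```

The visible result is a briefly blurrier texture rather than a hitch, which then sharpens once the requested mip finishes streaming in.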
 

ethomaz

Banned
Thanks.. wonder why their frame time dropped like 30% in the 2160p native vs 1800p native though? It's like a ~20% rez drop with ~30% frame time drop (head math, forgive me, but it's something like that)
I did not have time to check it before.
In essence, the performance scaling should not be exactly proportional to the resolution scaling... doubling the pixels should cost a bit more than double, not just halve the performance.

1800p = 5,760,000 pixels
2160p = 8,294,400 pixels

2160p to 1800p is ~30% fewer pixels.
29.7ms to 21.7ms is ~27% less time to render the frame.

BTW 1800p CB is 15.99ms... 26% faster to render the frame than native 1800p.

If you take into account that the performance drop is not exactly proportional to the resolution increase, then I think the results are pretty fine.
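Checking that head math with the same numbers:

```python
p1800 = 3200 * 1800    # 1800p pixel count: 5,760,000
p2160 = 3840 * 2160    # 2160p pixel count: 8,294,400

pixel_drop = 1 - p1800 / p2160   # going from 2160p down to 1800p
time_drop = 1 - 21.7 / 29.7      # frame time 29.7 ms -> 21.7 ms
cb_gain = 1 - 15.99 / 21.7       # 1800p CB (15.99 ms) vs native 1800p

print(f"Pixels dropped:     {pixel_drop:.1%}")  # 30.6%
print(f"Frame time dropped: {time_drop:.1%}")   # 26.9%
print(f"CB vs native 1800p: {cb_gain:.1%}")     # 26.3%
```

The frame time shrinks slightly less than the pixel count does, consistent with the point that cost doesn't scale perfectly linearly with resolution.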
 

IntentionalPun

Ask me about my wife's perfect butthole
IntentionalPun If you are bored, I found the video presentation for the slides I posted before.
Maybe he talks about more things that aren't in the slides, but it is 56 minutes lol


Thanks buddy.

And on your other post; I think I was just totally wrong on the pixel differences, which I should know better on lol, was just comparing 1800/2160 vs. 21/29.
 

MrLove

Banned
I wonder why MS uses the term SFS and not PRT+
Marketing, and to appease the fanboys; it obviously works on them. But bandwidth is only one part. The bigger part of high-quality instant asset streaming, like in the Unreal 5 demo, is most likely latency. Sony put in a lot of custom silicon to handle that. MS did nothing




Just as a reminder



 
I wonder why MS uses the term SFS and not PRT+ if they are the same thing. Even their own explanation page says SFS can also be called PRT+. PRT (partially resident textures) has been known for years. The only differences I found are that SFS streams mips calculated in real time according to the distance of the camera view, and that it has a solution for temporarily substituting mips that arrive late. Not sure if PRT+ can also do both of these.

I guess SFS is a unified solution that will be usable by all devs as part of the API feature set, rather than relying on each specific engine's own streaming (like Unreal Engine 5's).

I guess regardless of what they may call it, Xbox Series X was, according to Microsoft, strongly designed around it as one of the premier features of the console. The GPU even has customizations unique to Series X with the feature in mind according to a Microsoft graphics engineer.












 
Never let the truth get in the way of a weird console war post.

Referring to MrLove

I do know that one "in theory" is better than the other. And of the two systems, the PS5 has impressed me the most with its I/O applications.

But you're right that anyone saying the XSX doesn't have any custom I/O hardware is just trolling.
 

reksveks

Member
I do know that one "in theory" is better than the other. And of the two systems, the PS5 has impressed me the most with its I/O applications.

But you're right that anyone saying the XSX doesn't have any custom I/O hardware is just trolling.
Yeah, the PS5 is going to be more impressive for a while, and will become slightly less impressive relative to the general market over time (as PC parts and the I/O stack improve).
 

Bo_Hazem

Banned
Hello, everyone. This is not gaming related, but I've been on this thread for a LONG time...way back at least before post #900. Look...just had an extended family member pass from cancer this morning. Won't hit the details, but just for all of the people who get a little too invested in threads like this and go after each other.... Once you've seen a person go from normal and healthy to looking like a WWII death camp survivor in the matter of two months and pass away....puts this shit in perspective. We all have our hobbies that we love and enjoy, but this shit isn't worth getting so fired up about. It could all end for any of us, any day. Try to make sure that what you're putting out into the world is positive as much as possible, because these things matter pretty much not at all in the grand scheme of life on this planet.

Be strong, brother. But I refuse to lay down my sword in this sacred war.

 
I do know that one "in theory" is better than the other. And of the two systems, the PS5 has impressed me the most with its I/O applications.

But you're right that anyone saying the XSX doesn't have any custom I/O hardware is just trolling.

Honestly, the most impressive thing to me so far out of either console this generation where I/O is concerned has been Quick Resume. Multiple games, instantly resumable even after a major system update or full power down.

And the feature just got even faster and more reliable today.








I've already tested some, and it's actually much faster.
 
Honestly, the most impressive thing to me so far out of either console this generation where I/O is concerned has been Quick Resume. Multiple games, instantly resumable even after a major system update or full power down.

And the feature just got even faster and more reliable today.








I've already tested some, and it's actually much faster.


Sounds more like an OS feature than anything else. I was talking about actual performance in games, like what Ratchet is doing, for example.



To each his own.
 