
[IGNxGamer] Matrix Awakens, Hellblade and the Power of Unreal Engine 5 - Performance Preview


MrFunSocks

Banned
I just can't understand how these warriors are coming in here saying that Epic have the same experience with the Xbox as the PS5 when they've literally told us differently, and the fact that a Microsoft First Party Studio literally had to step in to make/fix the Matrix demo lol. From what it sounds like, the demo wouldn't have existed on the Xbox were it not for the Coalition, just like the first demo didn't exist on Xbox.

Crazy.

Then we've got the usual NXGamer getting things wrong and having his usuals defending him. He is absolutely wrong in this argument/discussion about the term proxy etc. Again - dude is NOT a game developer. He doesn't have access to Dev Kits. He doesn't have any insider knowledge.

The console with lower CPU clocks, less raster TF, and less RT potential is rendering that thing with the same details (some say better), the same resolution, and the same or better performance. No matter what people convince themselves of in their heads, the actual machines don't care.
After one year with both machines on the market, people just need to accept that their belief during the entire year before wasn't true: the two machines are actually equal in power and potential, and there's no advantage for one side. The only thing that matters now is which side has the better developers to deliver the best games.
The problem with this is that the PS5 version was made by the people who make the engine, and they have been working with PS5 hardware for longer than with Xbox Series hardware; that much is confirmed. The Xbox version sounds like it was essentially ported by Microsoft, using an engine they didn't make. There's no comparison worth mentioning here because they're different things.
 

Topher

Gold Member
Then we've got the usual NXGamer getting things wrong and having his usuals defending him. He is absolutely wrong in this argument/discussion about the term proxy etc. Again - dude is NOT a game developer. He doesn't have access to Dev Kits. He doesn't have any insider knowledge.

Why is the bar for providing analysis set at being a developer with dev kit access when it comes to NXGamer? Where are your posts making this same observation in regard to DF?
 

Duchess

Member
Looking back at that thread is hilarious, especially all the talk of "staggering differences", especially in RT between the XSX and PS5, and how severely Sony apparently dropped the ball.
But MS haven't released the RT DirectX stuff for Xbox or PC yet, I don't think?

So, perhaps when they do, we'll see the "staggering differences"? Sometime this year, I should expect. Maybe it'll arrive with Hellblade 2, or that game will get an upgrade to support it later down the line. Then we'll finally see the true performance gap between the Series consoles and the PS5.

Hey, it could be that the Series X ends up being leaps and bounds ahead of PS5, and the Series S will be on par with Sony's gigantic white box?

(spoiler - even if such tools release, the PS5 will continue to surprise everyone by showing it's just as capable as the XSX, if not more so, as we've already seen these past 12 months)
 

hlm666

Member
But MS haven't released the RT DirectX stuff for Xbox or PC yet, I don't think?
Errr, DXR came out in like 2018; we're on revision 1.1 now on PC, and I'm pretty sure the Xbox Series consoles had DXR support out of the gate.
 

Panajev2001a

GAF's Pleasant Genius
Why is the bar for providing analysis set at being a developer with dev kit access when it comes to NXGamer? Where are your posts making this same observation in regard to DF?
Answer:
episode 8 parking lot GIF
 

Panajev2001a

GAF's Pleasant Genius
I just can't understand how these warriors are coming in here saying that Epic have the same experience with the Xbox as the PS5 when they've literally told us differently, and the fact that a Microsoft First Party Studio literally had to step in to make/fix the Matrix demo lol. From what it sounds like, the demo wouldn't have existed on the Xbox were it not for the Coalition, just like the first demo didn't exist on Xbox.

Crazy.

Then we've got the usual NXGamer getting things wrong and having his usuals defending him. He is absolutely wrong in this argument/discussion about the term proxy etc. Again - dude is NOT a game developer. He doesn't have access to Dev Kits. He doesn't have any insider knowledge.


The problem with this is that the PS5 version was made by the people who make the engine, and they have been working with PS5 hardware for longer than with Xbox Series hardware; that much is confirmed. The Xbox version sounds like it was essentially ported by Microsoft, using an engine they didn't make. There's no comparison worth mentioning here because they're different things.
So your point is that, a year after the XSX|S launched in stores, the XSX and XSS tools got lost again, or that taking advantage of them is troublesome, or… what?

This is an odd flex to make… scaling down to XSS is trivial, XSX is much faster than PS5, and GDK lets you treat the XSX as a PC, which for a PC developer like Epic should be a dream come true… where did all of that go?

TC has been working with UE5 code access for over a year and they know the engine well. Still, I think they were called in to optimise the Xbox version, but something tells me it was far more for the XSS than the XSX.
If you think their help was needed to get the XSX up to speed with the PS5… well, it might be true, but it is a point I would expect a Sony fanboy to make… it does not make the state of the XSX tools, or the GDK's ease of extracting performance out of the console (actually tapping all of its on-paper theoretical performance, theoretical vs actual), look that good, and if anything it makes their competitor's plan look even better.

Considering the engine versions used in games tend to lag behind the most up-to-date one (like the version TC optimised for this demo), you are stating that most devs on UE5 were working with tools that favoured PS5.
So, you are stating that the cheaper console (yet one breaking even for a while), the one that had its dev kits sent out earlier, that had better yields and more units sent to market (I guess, if it is true that MS is using a good number of XSX SoCs for xCloud, we know where their priorities lie), the one with supposedly much lower sustained performance that in reality is going toe to toe with (and in some cases pulling ahead of) the faster monster XSX in most games… this is also the console with a non-PC-like custom console graphics API that got earlier support from one of the biggest multi-platform engines without requiring first-party teams to optimise the engine for Epic… and you think this happened on its own, without planning? Interesting praise for Cerny here ;).
 

Heisenberg007

Gold Journalism
I just can't understand how these warriors are coming in here saying that Epic have the same experience with the Xbox as the PS5 when they've literally told us differently, and the fact that a Microsoft First Party Studio literally had to step in to make/fix the Matrix demo lol. From what it sounds like, the demo wouldn't have existed on the Xbox were it not for the Coalition, just like the first demo didn't exist on Xbox.

Crazy.
So you're saying that The Coalition, who are considered the "Masters of Unreal Engine" by some people on this forum, had to jump in and optimize the demo for Xbox, and it still doesn't perform as well as it should, despite the special treatment?

No first-party PS studio had to jump in to optimize the demo on PS5. It already ran well without any special treatment.

Will an MS first-party studio have to jump in every time a UE5 game drops? How long are we going to use the excuse that UE5 devs have more experience working with the PS5?

Or ... or ... perhaps it's a better idea to just accept the possibility that the PS5 environment, with its 2-3x faster I/O and SSD, might be better suited to Nanite tech and UE5; Tim Sweeney did say in May 2020 that the PS5 is the "absolute best hardware" and a "remarkably balanced device."
 

sinnergy

Member
So you're saying that The Coalition, who are considered the "Masters of Unreal Engine" by some people on this forum, had to jump in and optimize the demo for Xbox, and it still doesn't perform as well as it should, despite the special treatment?

No first-party PS studio had to jump in to optimize the demo on PS5. It already ran well without any special treatment.

Will an MS first-party studio have to jump in every time a UE5 game drops? How long are we going to use the excuse that UE5 devs have more experience working with the PS5?

Or ... or ... perhaps it's a better idea to just accept the possibility that the PS5 environment, with its 2-3x faster I/O and SSD, might be better suited to Nanite tech and UE5; Tim Sweeney did say in May 2020 that the PS5 is the "absolute best hardware" and a "remarkably balanced device."
Or none of the above, and they simply didn't have the manpower to do an Xbox conversion.
 

NXGamer

Member
Hate to jump in here, but you've misunderstood what funking giblet is trying to tell you in terms of definitions. He is right to question your use of the word proxy. Nanite doesn't fit the definition of a proxy (replacing data with another, often simpler representation), nor is it tessellation (adding/interpolating data that isn't there). Nanite is just sampling data that is there and is the true representation of the object. Not a proxy and not tessellation.

Think of an analogy of looking at a photo of somebody. Replacing that person with a cardboard cutout might be considered a proxy. Using ML to upscale a low-res image of that person is akin to tessellation. Simply taking a picture of somebody at different resolutions (i.e. sampling) is neither a proxy nor tessellation. Now apply this to Nanite. The visual representation of a virtual mesh is not a proxy; it is simply a mesh (all the data) sampled at whatever resolution is required. It is not a proxy (another replacement representation of the data) nor is it tessellation (interpolation/generation of data that isn't there).


Looking back at that thread is hilarious, especially all the talk of "staggering differences", especially in RT between the XSX and PS5, and how severely Sony apparently dropped the ball. Now, though, some of the same people defend even having no RT at all, and how there are barely noticeable differences at lower settings, as "it's the same experience", because papa MS is mostly manufacturing XSS' to sell.
I welcome conversation so all good, but you are falling into the same trap as Giblet in assuming the naming convention is fixed.
" nanite doesn't fit the definition of a proxy (replacing data with another often simpler representation)"
But it does, at any one point in a frame a 1million +polygon model with 2 million or more verts will be possibly 900 Polys and about 1500 verts, this is a reconfiguration and a simplification of the original data. In this case quite literally.

" nor is it tessellation (adding/interprolating data that isn't there"
Again, tessellation is a mathematics term and it means to create or fill a surface with multiple shapes to fill the same space/silhouette. Then what happens with nanite is it constantly 'adapts' the tessellation to create a close as visual view of the base object with the least amount of verts/tris as possible. Adaptively tessellating the inner surface at all times, which you can quite clearly see in the visualiser and even count them as they change.

"Nanite is just sampling data that is there and is the true representation of the object. Not a proxy and not tessellation."
Nanite is sampling from the source model verts/objects and creating a compressed representation of that at all times but using (most likely) a complexity based compression (As in Data science ranges of data can be compressed easier/better the less complex they are and this is likely why it only supports rigid body models). Ala at any one frame to the next it will be a Simpler/reduced/smaller model of the base model unless big enough to require all Verts/Tris i.e. a Proxy representation of the huge source import.

Think of it like this: if it rendered all the polygons of each imported model at all times (a full data target, no reductions) it simply would not run at playable frame rates. It has to simplify the model on the fly, likely using something like the approach described above to manage the vert count and reduce it with a compression technique to represent it.
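
To put numbers on that idea, here is a toy sketch (Python, purely illustrative; every figure is invented and Nanite's real per-cluster selection is far more sophisticated). The rule it demonstrates: pick the coarsest version of the mesh whose geometric error projects to under about a pixel at the current distance.

Code:
import math

# Hypothetical LOD pyramid for one imported model:
# (triangle_count, max_geometric_error_in_world_units)
LODS = [
    (1_000_000, 0.001),  # the full-detail film-scale import
    (250_000,   0.004),
    (60_000,    0.016),
    (900,       0.500),  # the ~900-poly case mentioned above
]

def screen_space_error(world_error, distance, fov_y_deg=60.0, screen_h_px=2160):
    # Project a world-space error to a size in pixels at a given distance.
    pixels_per_unit = screen_h_px / (2.0 * distance * math.tan(math.radians(fov_y_deg) / 2.0))
    return world_error * pixels_per_unit

def pick_lod(distance, threshold_px=1.0):
    # Coarsest version whose error projects to under ~1 pixel: visually
    # indistinguishable from the full import at this distance.
    for tris, err in reversed(LODS):
        if screen_space_error(err, distance) <= threshold_px:
            return tris
    return LODS[0][0]  # up close, fall back to full detail

for d in (0.5, 5.0, 50.0, 2000.0):
    print(f"{d:>7} m -> {pick_lod(d):>9,} triangles")

Same source data every frame; only the sampling density changes with the view, which is what both sides of this argument are describing.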
 

DJ12

Member
“they worked with Epic to ensure the assets in the demo were set up to fully leverage virtual texture streaming and nanite wherever possible and tuned internal memory systems, ESPECIALLY on Xbox Series S, to ensure it all fit in the memory.”

“With this FOCUS, the Xbox Series S version shipped with all the same UE5 features enabled as Xbox Series X (albeit with different quality) including but not limited to raytraced reflections and raytraced shadows”
Isn't the key takeaway here that some features the PS5 'doesn't have', which are supposed to increase performance tenfold, don't actually have much relevance at all, or are not a patch on the mythical Geometry Engine?
 

Fredrik

Member
I swear we've had a year of the same old posts and same images being shared. Hell, even more than a year, and still, when you think of the actual games that have released, there's no real major difference between the consoles.

It's a shame. I think the only games we have seen large discrepancies in are tiny indie games with small teams that haven't had the resources to support all platforms or have outsourced development.

I bet they all will be more or less identical, with little sways either side, and we can all just play the games on whatever platform we want.

Can't wait to see real Unreal Engine 5 games released on these consoles.
Yeah, console people need to stop fighting and use their time to figure out how they can acquire both consoles instead, because unless they have a proper gaming PC it'll be hell living through this generation with only one console; both will be essential purchases for any serious gamer considering how many first-party studios both MS and Sony now own.
 
" nor is it tessellation (adding/interprolating data that isn't there"
Again, tessellation is a mathematics term and it means to create or fill a surface with multiple shapes to fill the same space/silhouette.
Just to clarify, you can turn 8 vertices in a cube formation into a sphere using tessellation, it doesn't have to be the same space/silhouette.
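
A minimal sketch of that point (Python, illustrative only): start from the 8 cube corners, split each triangle into four per pass, and push every new vertex onto the unit sphere; the silhouette stops being a cube almost immediately.

Code:
import itertools, math

def normalize(v):
    # Push a vertex out to the unit sphere.
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def midpoint(a, b):
    return normalize(tuple((x + y) / 2.0 for x, y in zip(a, b)))

def subdivide(tris):
    # Split each triangle into 4; the new vertices land on the sphere.
    out = []
    for a, b, c in tris:
        ab, bc, ca = midpoint(a, b), midpoint(b, c), midpoint(c, a)
        out += [(a, ab, ca), (ab, b, bc), (ca, bc, c), (ab, bc, ca)]
    return out

# The 8 cube corners, projected onto the unit sphere...
corners = [normalize(v) for v in itertools.product((-1.0, 1.0), repeat=3)]
# ...and the 6 cube faces as index quads, each split into 2 triangles.
faces = [(0, 1, 3, 2), (4, 6, 7, 5), (0, 4, 5, 1),
         (2, 3, 7, 6), (0, 2, 6, 4), (1, 5, 7, 3)]
tris = []
for i, j, k, l in faces:
    tris += [(corners[i], corners[j], corners[k]), (corners[i], corners[k], corners[l])]

for level in range(5):
    print(f"level {level}: {len(tris)} triangles")
    tris = subdivide(tris)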
 

Three

Member
Think of it like this: if it rendered all the polygons of each imported model at all times (a full data target, no reductions) it simply would not run at playable frame rates. It has to simplify the model on the fly, likely using something like the approach described above to manage the vert count and reduce it with a compression technique to represent it.

Yes, of course it only samples/renders and stores in memory a subset (a lower-poly subset) of the whole data set (the complete mesh) that is on the drive, but this is different from tessellation, which aims to make things smoother than the original complete mesh. In Nanite it's one virtualised mesh that you just sample at different, lower resolutions; not a proxy mesh, and not a mesh stored at a low vertex count and then tessellated. It's neither a proxy nor tessellated. Calling it a proxy mesh only confuses things, because Unreal has actual proxy geometry, which is not how Nanite works.


Nanite is one mesh with complete data, and it's sampled at different resolutions on the fly in real time. There isn't a proxy mesh in existence, and there aren't vertices added that are not part of the original mesh (tessellation). The whole point is that you shouldn't really be calling it a proxy. It's not a proxy in any way. A picture of your arm isn't a proxy arm simply because it doesn't include your entire body or isn't atom-scale resolution; it's still your actual arm, even if the printed image of your arm doesn't have all the data of you entirely. So proxy doesn't make sense in any context here. There isn't really something representing something else; it's one thing being sampled.
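
The distinction is easy to show in one dimension (a toy Python sketch; everything below is made up for illustration): subsampling only ever returns values that exist in the source, while tessellation/interpolation manufactures values that were never there.

Code:
# "source" stands in for the full import: every value is real data.
source = [i * i for i in range(10_001)]

def sample(data, step):
    # Lower the sampling rate: every point kept exists verbatim in the
    # source. Same object, fewer samples -- a lower-resolution photo.
    return data[::step]

def tessellate(data, factor):
    # Interpolate new in-between points: data that was never in the source.
    out = []
    for a, b in zip(data, data[1:]):
        out += [a + (b - a) * k // factor for k in range(factor)]
    out.append(data[-1])
    return out

coarse = sample(source, 100)
print(set(coarse) <= set(source))                   # True: pure subsampling
print(set(tessellate(coarse, 10)) <= set(source))   # False: invented values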
 

Panajev2001a

GAF's Pleasant Genius
Or none of the above, and they simply didn't have the manpower to do an Xbox conversion.
Yet PS5 was fine for them to do, and Epic does not have the manpower to optimise this demo for XSX|S a year after the console launch?
How does an argument like this look or sound good? This is why it seems like an argument that PS5 fanboys would start; I mean, it can be a valid explanation for a symptom, but what is the cause?
 

Panajev2001a

GAF's Pleasant Genius
Yes, of course it only samples/renders and stores in memory a subset (a lower-poly subset) of the whole data set (the complete mesh) that is on the drive, but this is different from tessellation, which aims to make things smoother than the original complete mesh. In Nanite it's one virtualised mesh that you just sample at different, lower resolutions; not a proxy mesh, and not a mesh stored at a low vertex count and then tessellated. It's neither a proxy nor tessellated. Calling it a proxy mesh only confuses things, because Unreal has actual proxy geometry, which is not how Nanite works.


Nanite is one mesh with complete data, and it's sampled at different resolutions on the fly in real time. There isn't a proxy mesh in existence, and there aren't vertices added that are not part of the original mesh (tessellation). The whole point is that you shouldn't really be calling it a proxy. It's not a proxy in any way. A picture of your arm isn't a proxy arm simply because it doesn't include your entire body or isn't atom-scale resolution; it's still your actual arm, even if the printed image of your arm doesn't have all the data of you entirely. So proxy doesn't make sense in any context here. There isn't really something representing something else; it's one thing being sampled.
While true, you do see how we are playing semantics too. Detail is added to the scene on the fly… geometry generated at runtime or streamed from a virtual mesh that exists in a compact, abstracted representation somewhere on disk or in RAM, and that gets sampled and streamed over as you would do with a texture. You do not have a traditional triangle mesh on disk either; you generate the LOD dynamically at runtime, taking scene parameters and using them to sample the source mesh and generate the optimal triangle representation on the fly. So what the GPU renders can be described as a proxy for the real abstract geometry (although we shall not call it that, so as not to overload a state-of-the-art term UE-wise, but again, it is still semantics :)).
 

Panajev2001a

GAF's Pleasant Genius
Isn't the key takeaway here that some features the PS5 'doesn't have', which are supposed to increase performance tenfold, don't actually have much relevance at all, or are not a patch on the mythical Geometry Engine?
I am not sure one can take what happened in a way that does not look good for the PS5 as an architecture and as a product… we must see how the planning and execution of the PS5 comes out of all of this vindicated much more than hurt, right?

A cheaper box managing to equal if not beat the performance of a more expensive box while already breaking even on HW (plus or minus shipping costs due to COVID and global shortages), devkits sent out in stable form to developers earlier, Epic not needing Sony's ICE team or other technically proficient game studios to be dropped in to optimise the demo for the biggest cross-platform engine on consoles and PCs (despite using custom graphics and I/O APIs), etc…
 

Sosokrates

Report me if I continue to console war
I am not sure one can take what happened in a way that does not look good for the PS5 as an architecture and as a product… we must see how the planning and execution of the PS5 comes out of all of this vindicated much more than hurt, right?

A cheaper box managing to equal if not beat the performance of a more expensive box while already breaking even on HW (plus or minus shipping costs due to COVID and global shortages), devkits sent out in stable form to developers earlier, Epic not needing Sony's ICE team or other technically proficient game studios to be dropped in to optimise the demo for the biggest cross-platform engine on consoles and PCs (despite using custom graphics and I/O APIs), etc…

PS5 ain't cheaper....
 

Panajev2001a

GAF's Pleasant Genius
PS5 ain't cheaper....
$399 vs $499; it sure looks like you can get the top performance tier for $100 less. But still, nice try focusing on a single adjective and ignoring everything else (remove the "cheaper" bit and the argument is still fundamentally the same) to start a barely tangential discussion lol… it would look just like deflecting.
 

Sosokrates

Report me if I continue to console war
$399 vs $499; it sure looks like you can get the top performance tier for $100 less. But still, nice try focusing on a single adjective and ignoring everything else (remove the "cheaper" bit and the argument is still fundamentally the same) to start a barely tangential discussion lol… it would look just like deflecting.

Yes, but it doesn't have a disc drive.
Sorry for focusing on one point.
They are both very good hardware. I don't think it's really lopsided in either direction. I would not put hardware performance as a point of contention when deciding which system to get. Overall the Series X does seem to be performing better now; the last several Digital Foundry comparisons have shown the majority of games performing better on the Series X, but we do get some games performing better on PS5, so for me it's a game-by-game basis, and other things influence my decision more as a multi-console owner.
 

Panajev2001a

GAF's Pleasant Genius
Yes, but it doesn't have a disc drive.
So? You might be someone who does not like Game Pass and xCloud as directions, nor the digital future, and then I could take this point as semi-relevant… but I do not think you fit the profile.

Still, if you were to remove the word "cheaper" just for the sake of argument, the disc drive would not matter as a point… so you are taking a point already orthogonal to the current discussion (attaching to the lowest-hanging fruit you could find?) and then trying to focus the discussion on it. Trying to get further and further from the actual point? Not getting what the play is here…

I am not sure one can take what happened in a way that does not look good for the PS5 as an architecture and as a product… we must see how the planning and execution of the PS5 comes out of all of this vindicated much more than hurt, right?

A cheaper box managing to equal if not beat the performance of a more expensive box while already breaking even on HW (plus or minus shipping costs due to COVID and global shortages), devkits sent out in stable form to developers earlier, Epic not needing Sony's ICE team or other technically proficient game studios to be dropped in to optimise the demo for the biggest cross-platform engine on consoles and PCs (despite using custom graphics and I/O APIs), etc…

My point there was clear (I think): some people are putting up a defense that does not even look like deflecting at this point, but an argument that Sony fanboys would make, as it makes the other platform look kind of worse.
 

NXGamer

Member
Yes, of course it only samples/renders and stores in memory a subset (a lower-poly subset) of the whole data set (the complete mesh) that is on the drive, but this is different from tessellation, which aims to make things smoother than the original complete mesh. In Nanite it's one virtualised mesh that you just sample at different, lower resolutions; not a proxy mesh, and not a mesh stored at a low vertex count and then tessellated. It's neither a proxy nor tessellated. Calling it a proxy mesh only confuses things, because Unreal has actual proxy geometry, which is not how Nanite works.


Nanite is one mesh with complete data, and it's sampled at different resolutions on the fly in real time. There isn't a proxy mesh in existence, and there aren't vertices added that are not part of the original mesh (tessellation). The whole point is that you shouldn't really be calling it a proxy. It's not a proxy in any way. A picture of your arm isn't a proxy arm simply because it doesn't include your entire body or isn't atom-scale resolution; it's still your actual arm, even if the printed image of your arm doesn't have all the data of you entirely. So proxy doesn't make sense in any context here. There isn't really something representing something else; it's one thing being sampled.
Again, that is what it has become known as within gaming circles; ironically, that is not its definition, which I have shown. Adaptive tessellation, for example, is used to add polygons and increase detail when closer to the camera, sometimes based on height maps. You can see the triangle build-up of the object changing; the tessellation of that object is being changed on the fly.

What would you call that then, if verts, polygons and the overall mesh build are changed and simplified dynamically from the source?

You are fixating on the data stored on the disc; that is always true and never not. A virtualised server from a P2V migration wakes as a folder defined on a disc, but it can be suspended, reverted and even duplicated across multiple sets. As far as itself and any clients know, it is the same server as before, but it is a virtualised proxy of the physical server that can use more or fewer resources of the host dynamically as required. A proxy of the original that dynamically scales data use and resources based on dynamic requirements.

Proxy : An entity or variable used to model or generate data assumed to resemble the data associated with another entity or variable that is typically more difficult to research. That which takes the place of something else; a substitute.

I do hate doing the "appeal to authority", but even in the UE5 dash and Epic's notes they call it a Proxy Mesh, generated from the Nanite solution for ray tracing, collision and Lightmass use.

The fact is they are doing one manually and the other, Nanite, dynamically, but both are presenting the source model at a lower density of data at render time, each time.


Nanite Proxy Mesh and Precision Settings

Static Meshes include additional properties that control the precision of the nanite representation, and the coarse representation generated from the highly detailed mesh called the Proxy Mesh.

These settings can be found in the Static Mesh Editor Details panel under the Nanite Settings.
 
Or ... or ... perhaps it's a better idea to just accept the possibility that the PS5 environment, with its 2-3x faster I/O and SSD, might be better suited to Nanite tech and UE5; Tim Sweeney did say in May 2020 that the PS5 is the "absolute best hardware" and a "remarkably balanced device."
Can't believe people are still falling for Sweeney's nonsense. Actually I can, it's the power of marketing.
 

Three

Member
While true, you do see how we are playing semantics too. Detail is added to the scene on the fly… geometry generated at runtime or streamed from a virtual mesh that exists in a compact, abstracted representation somewhere on disk or in RAM, and that gets sampled and streamed over as you would do with a texture. You do not have a traditional triangle mesh on disk either; you generate the LOD dynamically at runtime, taking scene parameters and using them to sample the source mesh and generate the optimal triangle representation on the fly. So what the GPU renders can be described as a proxy for the real abstract geometry (although we shall not call it that, so as not to overload a state-of-the-art term UE-wise, but again, it is still semantics :)).
It is semantics, and you could say that when you look at a person you are only seeing a proxy of them; the image the photons create on your retina is not the real person, but nobody confuses others like this with semantics. In UE, tessellation is one thing and proxy geometry another. These have definitions already. The use of the word proxy and the general wording here only confuses things:

"your 3d characters your
buildings your cars are all constructed
of polygons or triangles effectively a
3d approximation of the intended
geometry and as you can see from the
demos visualizer these objects have a
vast amount of triangles within them
that make up the object but as i walk
and move you can see all these verts
popping dividing and shifting within
each surface this is a new element of
nanite as each of these 3d models has
been virtualized into a nanite proxy
mesh a scan and calculation of the large
full-sized
objects that creates almost
every vertex array of the object into a
set of proxy subdivided meshes
these are then built and defined by the engine on
input and dynamically shifted at runtime
to define the object as close to the
base import at all times even up to the
huge film scale imported assets if close
enough"

Notice that "virtualised into a nanite proxy mesh" and "proxy subdivided meshes" here don't really make sense. The subdivided mesh is the actual whole data. It isn't a proxy.
 

Three

Member
I do hate doing the "appeal to authority", but even in the UE5 dash and Epic's notes they call it a Proxy Mesh, generated from the Nanite solution for ray tracing, collision and Lightmass use.

The fact is they are doing one manually and the other, Nanite, dynamically, but both are presenting the source model at a lower density of data at render time, each time.

I believe this is exactly what funking giblet is referring to here.
Sorry to be pedantic, but they are not solving the same issue. The UE5 proxy objects are NOT rendered when Nanite is enabled; they are used for things like collision only.

The proxy mesh is not the same as the full detail nanite representation that is being shown in the visualiser.

From the doc:
"In those cases, the Proxy Mesh is the generated representation used, like when complex collision is needed or when a platform doesn't support Nanite rendering"

Or for ray tracing. The proxy mesh in UE5 is not the Nanite mesh you are seeing in the visualiser. "Proxy mesh" is referring to something else here: an actual, not-rendered proxy mesh.
 

Darius87

Member
Of course.

But Panajev spoke like the cheaper price and performance parity the PS5 Digital has is some achievement of merit, when the price advantage goes out the window once you factor in the extra expense of digital games.
No one is arguing about the price of digital games.
The point I'm making is that you contradicted yourself: going by your logic, the PS5 isn't cheaper than the XSX, but the PS5 Digital is.
 

assurdum

Banned
I'll be honest, I wasn't impressed by the Matrix demo. Maybe if I had more control, yeah.
I was definitely impressed by the UE5 cutscene. But when I started to play the demo I immediately noticed the cost of the AMD Fix Resolution. Compression artifacts are very visible around the main character. Jesus, I really hope it's because it's early code, because if Fix Resolution causes such issues, it looks even worse than CBR. God, I hope I'm wrong.
 

assurdum

Banned
Can't believe people are still falling for Sweeney's nonsense. Actually I can, it's the power of marketing.
Can't believe it's so hard to accept that someone could praise a console which is not your favourite. Tim Sweeney is not the only developer who has praised the PS5's hardware design, if you take a look around the net. And what nonsense exactly has Tim Sweeney said? Please enlighten us.
 
Can't believe it's so hard to accept that someone could praise a console which is not your favourite. Tim Sweeney is not the only developer who has praised the PS5's hardware design, if you take a look around the net. And what nonsense exactly has Tim Sweeney said? Please enlighten us.
That's not the point, you can praise the hardware all you want because it is indeed well designed. The point is that Sweeney sold the lie that Nanite somehow needs 22 GB/s of Cerny sauce, when in reality it runs at 300 MB/s.
 

Elog

Member
Proxy : An entity or variable used to model or generate data assumed to resemble the data associated with another entity or variable that is typically more difficult to research. That which takes the place of something else; a substitute.

I do hate doing the "appeal to authority", but even in the UE5 dash and Epic's notes they call it a Proxy Mesh, generated from the Nanite solution for ray tracing, collision and Lightmass use.

The fact is they are doing one manually and the other, Nanite, dynamically, but both are presenting the source model at a lower density of data at render time, each time.
To some extent this is a play on words. I actually see Three's point of view. "Proxy" is more commonly used when you measure something else to represent another variable (i.e. more of a surrogate). In this case the simplified geometrical model is still using the true source data, so is it really a proxy, or simply a low-resolution model of the "truth" that is used to prioritise work downstream of that model?

The point is kind of moot though since Epic uses the word proxy here :)
 

NXGamer

Member
I believe this is exactly what funking giblet is referring to here.


The proxy mesh is not the same as the full detail nanite representation that is being shown in the visualiser.

From the doc:
"In those cases, the Proxy Mesh is the generated representation used, like when complex collision is needed or when a platform doesn't support Nanite rendering"

Or for raytracing. The proxy mesh in UE5 is not the nanite mesh you are seeing in the visualiser. Proxy mesh is referring to something else here. An actual not rendered proxy mesh.
Again, that is what it has become known as within gaming circles; ironically, that is not its definition, which I have shown. Adaptive tessellation, for example, is used to add polygons and increase detail when closer to the camera, sometimes based on height maps. You can see the triangle build-up of the object changing; the tessellation of that object is being changed on the fly.

What would you call that then, if verts, polygons and the overall mesh build are changed and simplified dynamically from the source?

Proxy : An entity or variable used to model or generate data assumed to resemble the data associated with another entity or variable that is typically more difficult to research. That which takes the place of something else; a substitute.
 
"Sweeney sold the lie that Nanite somehow needs 22 GB/s of Cerny sauce"

Those are your words. Where does Sweeney say that Nanite needs 22 GB/s of Cerny sauce?
I never said that he said it. He just sold the lie. The sauce is in the post, it proves that people still believe it.
 

nemiroff

Gold Member
Can't believe people are still falling for Sweeney's nonsense. Actually I can, it's the power of marketing.
Yep. He's a smart guy with authority and knows how to play the game; we can't really fault some people for falling for it. In my personal opinion he's also a sociopath. A good example of that was when he infamously threw his own employees under the bus in a cold, calculated gaslighting move just to save face and his PR relationship with Sony.
 

Elog

Member
I never said that he said it. He just sold the lie. The sauce is in the post, it proves that people still believe it.
Turn this into an intelligent discussion instead: if an application wants to utilise 4K textures in UE5, what technical specifications are required to achieve that in a rapidly changing environment (i.e. fast movement of the camera point of view)? We do not know, except that the requirement is high in terms of bandwidth and low in terms of latency. The real question is whether current systems can easily be saturated this way or not; my bet is that the answer is yes, so that lower latency and higher bandwidth can result in better graphics in UE5.
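
For what it's worth, here is a back-of-the-envelope worst case (Python; every number below is an assumption, not a measurement):

Code:
# Worst case: a fast cut/pan invalidates most of the visible texture set
# and the streamer must refill it within a few frames.
BYTES_PER_TEXEL = 1            # BC7-class compression: 8 bits/texel (assumed)
TEXELS          = 4096 * 4096  # one "4K" texture layer
MIP_OVERHEAD    = 4 / 3        # a full mip chain adds ~33%
MATERIALS_SEEN  = 50           # hypothetical unique textures on screen
REFILL_FRAMES   = 4            # refill budget, at 60 fps
FRAME_TIME      = 1 / 60

working_set = TEXELS * BYTES_PER_TEXEL * MIP_OVERHEAD * MATERIALS_SEEN
needed = working_set / (REFILL_FRAMES * FRAME_TIME)
print(f"working set: {working_set / 1e9:.2f} GB")               # ~1.1 GB
print(f"needed for a 4-frame refill: {needed / 1e9:.1f} GB/s")  # ~17 GB/s

The naive worst case lands in the tens of GB/s, which no current drive sustains; virtual texturing only fetches the visible tiles and mip levels, a small fraction of that, which would explain why the shipped demos stay in the hundreds of MB/s.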
 
Turn this into an intelligent discussion instead: if an application wants to utilise 4K textures in UE5, what technical specifications are required to achieve that in a rapidly changing environment (i.e. fast movement of the camera point of view)? We do not know, except that the requirement is high in terms of bandwidth and low in terms of latency. The real question is whether current systems can easily be saturated this way or not; my bet is that the answer is yes, so that lower latency and higher bandwidth can result in better graphics in UE5.
The two demos we have don't exceed 300 MB/s. We don't know what the future holds, but the post I quoted was based on these two demos.
 

Three

Member
The point is kind of moot though since Epic uses the word proxy here :)
As far as I know, Epic uses the term proxy mesh when talking about representations of a mesh that do not match the Nanite render: actual simplified proxy meshes for things like collision. Though I guess for platforms that don't support Nanite they would technically still be called proxy meshes, even when they are not really a proxy for anything.

In the doc:

Proxy Mesh

Many parts of Unreal Engine need access to the traditional vertex buffer provided by traditionally rendered meshes. When Nanite is enabled for a Static Mesh, it generates a coarse representation that is accessible and used where the highly detailed Nanite data cannot be. In those cases, the Proxy Mesh is the generated representation used, like when complex collision is needed or when a platform doesn't support Nanite rendering.

So Nanite generates a proxy mesh for collision and lighting, which require it, but it isn't the actual mesh you see in the Nanite visualiser, and it certainly can't be used to refer to the "full-size detail mesh with every vertex" as was said in the video.
 

Arioco

Member
That's not the point, you can praise the hardware all you want because it is indeed well designed. The point is that Sweeney sold the lie that Nanite somehow needs 22 GB/s of Cerny sauce, when in reality it runs at 300 MB/s.


It's funny you say that. That's exactly the opposite of what Sweeney actually said when UE5 features were unveiled.

While Epic wouldn’t comment on any potential performance differences between the PS5 and Xbox Series X, Sweeney confirmed that the features shown today, like real-time global illumination and virtualized geometry, are “going to work on all the next-generation consoles.”


https://webcache.googleusercontent....h-beats-high-end-pc+&cd=5&hl=es&ct=clnk&gl=es


The most used multiplatform engine in the world and still somehow some people understood it would only work on PS5. I wonder if they were paying attention at all. 🤷‍♂️
 
Where did you get that number? I thought we did not know as of yet. Maybe I have missed a tech update somewhere?
300 MB/s was confirmed to DF for the Matrix demo. For the 2020 demo, we have the PC version, which rarely exceeds even 200 MB/s. There have also been statements saying that Nanite generally doesn't use that much bandwidth, which one could basically consider a feature of it.
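
For scale, the same figures viewed per frame (my arithmetic, not DF's; the 5.5 GB/s figure is the PS5's advertised raw drive rate):

Code:
# Per-frame view of the reported streaming rates (assuming 30 fps).
for label, mb_per_s in (("Matrix demo (per DF)", 300), ("2020 PC demo", 200)):
    print(f"{label}: {mb_per_s} MB/s = ~{mb_per_s / 30:.0f} MB per frame")

# Against a 5.5 GB/s raw drive, 300 MB/s is ~5% utilisation: fine-grained
# streaming, not raw SSD speed, is doing the heavy lifting here.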
 

Loxus

Member
Yep. He's a smart guy with authority and knows how to play the game; we can't really fault some people for falling for it. In my personal opinion he's also a sociopath. A good example of that was when he infamously threw his own employees under the bus in a cold, calculated gaslighting move just to save face and his PR relationship with Sony.
I never said that he said it. He just sold the lie. The sauce is in the post, it proves that people still believe it.

Tim Sweeney Explains Exactly Why the PS5’s SSD and I/O Architecture Is Way More Efficient Than PC’s
Systems integration and whole-system performance. Bringing in data from high-bandwidth storage into video memory in its native format with hardware decompression is very efficient. The software and hardware stack go to great lengths to minimize latency and maximize the bandwidth that's actually accessible by games.

Those PC numbers are theoretical and are from drive into kernel memory. From there, it's a slow and circuitous journey through software decompression to GPU driver swizzling into video memory where you can eventually use it. The PS5 path for this is several times more efficient. And then there's latency.

On PC, there's a lot of layering and overhead. Then you have the issue of getting compressed textures into video memory requires reading into RAM, software decompressing, then calling into a GPU driver to transfer and swizzle them, with numerous kernel transitions throughout.
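
A toy model of the contrast described in that quote (Python; every stage rate below is invented purely for illustration, only the pipeline shape is the point):

Code:
def effective_rate(stages):
    # A streaming pipeline runs no faster than its slowest stage.
    name, rate = min(stages, key=lambda s: s[1])
    return rate, name

pc_path = [
    ("NVMe raw read",           7.0),  # GB/s, spec-sheet style number
    ("kernel -> user copy",     6.0),  # hypothetical
    ("software decompression",  1.5),  # CPU-bound, hypothetical
    ("driver swizzle + upload", 5.0),  # hypothetical
]
ps5_path = [
    ("NVMe raw read",           5.5),
    ("hardware decompression",  9.0),  # dedicated unit, off the CPU
]

for label, path in (("PC, software path", pc_path), ("PS5, hardware path", ps5_path)):
    rate, bottleneck = effective_rate(path)
    print(f"{label}: ~{rate} GB/s effective (bottleneck: {bottleneck})")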



Even Nvidia and Microsoft with RTX IO and Direct X Storage realized this.


Difference with the PS5 is it has dedicated hardware for decompression.



I don't get why both of you spread misinformation for no reason.
 