
Guerrilla Games may indicate that they have bigger plans for the Decima Engine

Beautiful-looking engine, but janky as fuck, and I'm not a fan of it. Horizon and Death Stranding are examples of how garbage this engine is for traversal and platforming, and the physics aren't great. I don't know why it gets praise; people really only give a damn about visuals and graphics.

I don't know why some outside studios choose to use it instead of Naughty Dog's or Santa Monica's, for example.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
I'm enjoying all these workflow upgrades, and we clearly need them, but constantly relearning new workflows can leave aging artists like myself in a constant state of Future Shock. It is for the best though. Super exciting times.
I think part of it is that Unreal Engine wants to be the go-to place for effectively all things CGI.
To do that, they need to adopt systems that are industry standard... people coming from film and using the Unreal material system probably take a good few hours to make materials they used to make in a few minutes.

So now they are adopting an analog of that standard.

Nanite itself is kinda the same idea... imagine you were a CG artist working offline: you made your model, optimized it, unwrapped it (if even that), textured it, and hit render... that was it. (Being very reductive here.)
With game engines, you would have to make your model, then make a low-poly version of that model, bake a normal map of the high poly onto the low poly, texture the low poly, then put the low poly in the engine and import the normal map before you could hit render.
Nanite is closer to working offline... make the model, texture it, import into the engine... render.
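Very roughly, the difference between the two workflows looks something like this (made-up function names, not any real engine or DCC API, just to illustrate the steps):

```python
# Hypothetical asset-pipeline sketch, not a real engine or DCC API.
# Each helper is a stand-in for an authoring step described above.

def retopologize(high_poly):        return f"lowpoly({high_poly})"
def bake_normal_map(hi, lo):        return f"normalmap({hi}->{lo})"
def texture(mesh):                  return f"textures({mesh})"
def import_into_engine(*assets):    print("engine render:", assets)

def traditional_pipeline(high_poly):
    """High poly -> low poly -> bake -> texture -> import the proxy."""
    low_poly   = retopologize(high_poly)                # extra authoring step
    normal_map = bake_normal_map(high_poly, low_poly)   # extra authoring step
    textures   = texture(low_poly)
    import_into_engine(low_poly, textures, normal_map)  # engine renders the proxy

def nanite_style_pipeline(high_poly):
    """Model -> texture -> import; the engine virtualizes the geometry itself."""
    textures = texture(high_poly)
    import_into_engine(high_poly, textures)             # no bake, no proxy mesh

traditional_pipeline("hero_rock_8M_tris")
nanite_style_pipeline("hero_rock_8M_tris")
```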
 

CGNoire

Member
Yah, I figured that's the case, between Stagecraft or whatever it's called and now UEFN, with the latter also doubling as a Metaverse. But REYES finally... didn't think I'd see the day.

My biggest fear going forward, as far as tech goes, is that devs just aren't willing or financially motivated to take a true next step in physics. I guess we will have to wait for Remedy's Max Payne remake to lead the way.
 

CamHostage

Member
Chaos is far, far more advanced than Havok or PhysX. Check out the release video they did for that VR robot game, where they break down its features.
I've seen Chaos, but I haven't seen it do anything that Havok and PhysX couldn't... I'm pretty sure the main reason Epic chose to abandon PhysX is that they want everything handled in house; they don't want to pay anyone for licensing.

CGNoire, you're not talking about the Robo Recall Chaos demo, are you? AFAIK, they never did a full breakdown of that (and never released it to the public, or had it out on the show floor.)

All of its effects systems are doable with Chaos, so I'm not questioning what was shown. (Also, it's 4 years old now, so it's frustrating that no game designer has gone all out on a physics showcase like that in a released game; "fun with physics" unfortunately tapered off as a type of game playground over the years, but there are always plenty of indie games in the works trying to make their own Matrix hallway sequence, and maybe a good one will come out some day...) However, if you watch the Chaos unveil demo carefully, you can see where it hides the seams. The camera is on a track, either locked to a small corridor with set destructibles or panned to look above the horizon for the complicated outdoor shots, so that the debris doesn't need to be tracked hitting and bouncing and coming to rest. And the big finale is, I would assume, pre-calculated, like the bridge in Matrix Awakens.

Cool stuff, but what Chaos is doing reads as "standard" use of fractured destruction (combined with Niagara particle systems and other effects) and physical sims of weight, velocity/direction, bounce, and sleep/culling. Not that I know what I'm talking about, but I haven't read talk of Unreal's Chaos being special in any specific way, and in fact some bemoaned its forced replacement of PhysX, since Chaos was not as performant in its launch version. (Maybe it has caught up since then?)
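(For what it's worth, the "sleep" part of that is a pretty simple idea in most rigid body engines: a body that has been nearly motionless for long enough stops being simulated until something touches it again. A generic sketch of the heuristic, not Chaos's or PhysX's actual code:)

```python
# Minimal rigid-body sleep heuristic (generic, not any specific engine's code):
# a body that stays below small velocity thresholds for N consecutive steps is
# put to sleep and skipped until something wakes it, e.g. a new contact.

LINEAR_SLEEP_THRESHOLD  = 0.05   # m/s
ANGULAR_SLEEP_THRESHOLD = 0.05   # rad/s
FRAMES_UNTIL_SLEEP      = 60

class RigidBody:
    def __init__(self):
        self.linear_speed = 0.0
        self.angular_speed = 0.0
        self.asleep = False
        self._still_frames = 0

    def update_sleep_state(self):
        nearly_still = (self.linear_speed  < LINEAR_SLEEP_THRESHOLD and
                        self.angular_speed < ANGULAR_SLEEP_THRESHOLD)
        self._still_frames = self._still_frames + 1 if nearly_still else 0
        if self._still_frames >= FRAMES_UNTIL_SLEEP:
            self.asleep = True           # the integrator skips this body now

    def wake(self):                      # called on a new contact or impulse
        self.asleep = False
        self._still_frames = 0
```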

...Either way, it's kind of out of the conversation here since Decima Engine itself may not have a physics engine of its own per se.

Guerrilla used Jolt Physics (which isn't nearly as sexy in demos as the big licensed middleware tools; it's just for devs) for rigid body physics and collision detection in Horizon, and then Houdini or other tools to generate the game's special effects. I would assume a studio using Decima to make a game could use the same Jolt system, or could probably plug in middleware or their own physics system.
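(And on the "plug in middleware" point: engines usually talk to physics through a thin interface, so swapping Jolt for something else is mostly a matter of writing a new backend behind that seam. A hypothetical sketch of what that could look like, not Decima's or Jolt's real API:)

```python
# Hypothetical "pluggable physics backend" seam; not Decima's or Jolt's real
# API. The engine codes against the interface, and a Jolt (or Havok, or
# in-house) wrapper implements it.

from abc import ABC, abstractmethod

class PhysicsBackend(ABC):
    @abstractmethod
    def add_rigid_body(self, shape, mass, position): ...
    @abstractmethod
    def raycast(self, origin, direction, max_distance): ...
    @abstractmethod
    def step(self, dt): ...

class JoltBackend(PhysicsBackend):
    """Would wrap the real Jolt C++ library via bindings; stubbed here."""
    def add_rigid_body(self, shape, mass, position):
        print(f"Jolt: body {shape} mass={mass} at {position}")
    def raycast(self, origin, direction, max_distance):
        return None  # would return the first Jolt hit, if any
    def step(self, dt):
        print(f"Jolt: stepping {dt:.4f}s")

def game_tick(physics: PhysicsBackend, dt: float):
    physics.step(dt)   # the engine never cares which backend is underneath

game_tick(JoltBackend(), 1 / 60)
```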
 

CGNoire

Member
It's been a while since I watched it, so I don't completely remember all the impressive stuff I saw, but one thing I do remember was that if you had a pre-cached sim, like a building collapse or something, and your dynamic objects like characters etc. touched any of the simulated debris or objects, they would immediately switch to a realtime sim, even in mid-fall. Which I thought was impressive when I saw it.
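(Conceptually the handoff is something like this: debris replays a pre-baked transform track as kinematic pieces, and the moment a dynamic object touches one, that piece gets promoted to a live rigid body seeded with the cached velocity. A rough sketch of the idea, not Epic's actual implementation:)

```python
# Rough sketch of a cached-sim-to-realtime handoff (the general idea, not
# Epic's actual Chaos code). Debris pieces replay a pre-baked transform track
# as kinematic bodies; on contact with a dynamic object a piece is promoted to
# a live rigid body, seeded with the cached velocity at that frame.

class DebrisPiece:
    def __init__(self, baked_track):
        self.baked_track = baked_track   # list of (position, velocity) 3-tuples per frame
        self.simulated = False
        self.position, self.velocity = baked_track[0]

    def update(self, frame, dt, touched_by_dynamic_object):
        if not self.simulated:
            if touched_by_dynamic_object:
                # Promote to realtime sim mid-flight, keeping the cached momentum.
                self.simulated = True
            else:
                self.position, self.velocity = self.baked_track[frame]
                return
        # From here on the piece is integrated like any other rigid body.
        self.velocity = (self.velocity[0], self.velocity[1] - 9.81 * dt, self.velocity[2])
        self.position = tuple(p + v * dt for p, v in zip(self.position, self.velocity))
```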

I agree, it's so shitty that it's been a decade on and a number of physics promises have never been met or even made much progress. Like that Nvidia demo where the walls were made out of individual pieces, like wood beams, styrofoam insulation, and drywall, and they were cannoning dummy puppets at them like bullets and everything was shredding into pieces. Rainbow Six Siege also had that physics downgrade.
 