
Unreal Engine 5 revealed! Real-Time Prototype Gameplay Demo Running On PS5

Redlight

Member
Ha Ha, seriously! They could've just shown that and no other games, and everyone would have been super excited with no complaints whatsoever, so why didn't MS do this?

My theory is that they're trying to take the humble sort of Xbox 360 approach. If anyone remembers, MS showed off initial games like Full Auto, GUN, and other REAL games, while Sony showed off Gundam Crossfire, Killzone and Motorstorm "tech demos" (actually those weren't even tech demos, but CG trailers). Still, MS and its Xbox 360 got a similar reception, with gamers left wondering what was so next gen about Full Auto; only their Gears of War demo looked anything close to what we were expecting from that gen. Oh, and Fight Night Round 3, but that was showcased on the PS3 at the time.
I remember that clearly, everyone foaming over the tech demos and being underwhelmed by actual gameplay (I was one of them, btw). As it turned out, the gameplay was a truer indication of what to expect. Of course it was. I suspect history will repeat.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Ha Ha, seriously! They could've just shown that and no other games, and everyone would have been super excited, so why didn't MS do this? ...

So, Xbox does big PR and boasting exercises online... "Sony is staying quiet, there must be problems... concern concern concern about PS5... concern."

MS overhypes an event and has to apologise for it while Sony does a kick-ass demo with Epic... "MS is being humble, and smartly so, like with Xbox 360 and PS3 and those phony demos... concern concern concern about PS5... concern."

This is some nimble gymnastics there...
 
And all that without the ray tracing gimmick; finally some voxel rendering of models!

I feel like this is as big a jump in realism as the PSX -> PS2 transition was!
I'm not 100% sure that voxels are on the leaf nodes of the octree; it could be anything there! That's what I'm waiting to hear about. It could be mesh fragments, but they don't tend to subdivide as easily as voxels.
 

Bartski

Gold Member
I remember watching FF Spirits Within when it came out and thinking one day videogames will look like that. I just wanna say I'm happy I'm still alive to see how much we've actually managed to surpass it.
My hype levels for next gen have now reached heights equal to those at the MGS4 debut trailer at TGS 2005.

Back to the demo - I'm blown away, despite the presentation focusing on the easy-selling pretty-looks factor, which is really not the type of candy I'm after. I'm personally most interested in UE5 developments regarding tech that was just briefly mentioned there: mainly "Chaos" physics, "motion warping" and everything regarding collision and object-to-object interactions, deformation and all the "seamless contextual animation triggering" stuff.
I hope the next presentation of the engine is a deep dive into those areas.
 
Last edited:

Fafalada

Fafracer forever
how far down could you reasonably compress a 33m poly model with textures?
Naively storing a raw mesh, I'd say around 500MB. Textures are an interesting question, but presumably we're talking 1:1 with vertex detail (which comes out to 8K texture(s) in this specific case), so probably another 30-60MB on top of that.
That would all still be compressed down on disc (LZW/Kraken/whatever), so it could be another 50% smaller.
That being said, there's no way this is storing raw mesh data anyway; the progressive detail refinement by default requires something more involved, and potentially more efficient in terms of storage size.
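Roughly where that ~500MB ballpark can come from, as a quick sketch; the vertex layout and the vertex-to-triangle ratio here are my own assumptions, not anything Epic stated:

```python
# Back-of-envelope raw mesh size for a 33M-poly asset (my assumptions):
# - a closed triangle mesh has roughly half as many vertices as triangles
# - 32 bytes per vertex (float3 position + float3 normal + float2 UV)
triangles = 33_000_000
vertices = triangles // 2                 # ~16.5M vertices
vertex_bytes = vertices * 32              # ~528 MB of vertex data
index_bytes = triangles * 3 * 4           # 32-bit indices: ~396 MB extra
print(f"vertices: {vertex_bytes / 1e6:.0f} MB")
print(f"indices:  {index_bytes / 1e6:.0f} MB")
# Quantised attributes and index compression can cut this sharply, which is
# one way to land near the ~500MB figure before disc compression even starts.
```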

i'm just trying to see at what point the game package size becomes a large concern with unique assets
Any system like this has to have a way for the user to specify the target resolution of assets (something like amount of data per meter) when importing them, especially for multiplatform dev, where you'd presumably just play with that lever on export to adjust the quality and size of packages for each platform.
Storage will totally be a problem with tons of unique data (if the demo itself was all unique at the same precision, we'd be blowing through storage on PS5 already), but the user should have control over the tradeoffs based on their target platforms.
Don't over-index on the idea that suddenly you no longer have any limitations to consider; as long as you build content, especially realtime, limitations always exist unless someone invents infinite processing power someday.
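As a toy illustration of that per-platform export lever; the platform names and density factors below are invented for the example, not anything from UE5:

```python
# Hypothetical per-platform export densities: the same source asset,
# baked out at a different data density for each target.
EXPORT_DENSITY = {
    "PS5": 1.00,                # ship the full captured detail
    "XSX": 1.00,
    "PC_MIN_SPEC": 0.25,        # quarter density for HDD-class storage
    "SWITCH": 0.05,
}

def exported_size_mb(source_size_mb: float, platform: str) -> float:
    """Scale a source asset's footprint by the platform's density setting."""
    return source_size_mb * EXPORT_DENSITY[platform]

print(exported_size_mb(500, "PC_MIN_SPEC"))   # 125.0
```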
 
Last edited:

Mozza

Member
These threads are so fun. Basically a tech demo of the new Unreal Engine that will run on both Sony's and Microsoft's new consoles, but because it is running on a PS5 it triggers all sorts of issues with both camps. And as yet, who knows how indicative of actual games this will be; we have been stung many times before by awesome-looking tech demos, only to have to wait until very late in a console's life cycle for those visuals to be realized, if they ever are.

And once again the masses that buy most of the consoles next gen would not even see a difference between this and the current platforms' best. Oh, the fun of these sorts of threads, endless entertainment value. ;)
 

TGO

Hype Train conductor. Works harder than it steams.
This is the most fanboyish comment I've ever seen in my life.
I'm sure you've seen worse, but he's right about something.
It is exciting to think what Sony's first party will bring to the table.
You can't deny that.
 

mckmas8808

Mckmaster uses MasterCard to buy Slave drives
He's not wrong though. Your average PC will not be able to run it. But provided the PC has enough RAM and enough CPU power to handle decompression, an SSD is not strictly required to run this on a PC.

So 2% of PC players will be able to run this properly. Cool. Clearly that isn't good.
 

Ascend

Member
Let's do some basic math here...

We know the demo was running mostly at 1440p
We know the demo was running at 30 fps
We know the demo was using raw uncompressed data
We know the demo had pretty much one triangle per pixel
I'll assume 32-bit colors.
I'll assume RAM doesn't exist.

Based on the above, you have 2560 x 1440, which is 3,686,400 triangles/pixels.
Running at 30 fps means you have 33.3 ms to render a frame.

If you stream everything on the fly, that means you have to process all those 3,686,400 pixels in less than 33.3ms.
32 bits x 3,686,400 / 8 = 14,745,600 bytes ≈ 14.75 MB.

Vertices are a better indication of throughput cost than triangles, but since the two counts are generally almost equal, rounding up should be OK. You would have to output about 15MB of data per frame; multiplied by 30 frames, that is about 450MB/s. That is what you would need to stream if you were literally sending data from the SSD to the GPU without any processing in the middle. That is too fast to be streamed from an HDD, but very doable from any SSD.

Note that this is output data being used as reference, not input. The input is inevitably higher, but it is unclear how much higher it would be. It all depends on how efficient the reading from the SSD is, i.e. if you're loading full textures and culling them later, or if you're loading primarily what you need and ignoring the rest.
Additionally, this completely ignores reused assets, and assumes that every triangle/pixel is completely unique, which generally is not the case.

Bottom line is, even if the streaming from SSD is 5 times larger, you would still be below 2.4GB/s.
Based on this extremely rough estimate of what would need to be streamed, I doubt the full capacity of the PS5 SSD is being used, and I extremely doubt that the XSX or PC would be incapable.

It's the engine that is the 'hero' here. They've completely changed the way of rendering things.
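If you want to check the arithmetic yourself, the whole estimate fits in a few lines of Python; this is just a sketch of the post's own numbers, nothing extra:

```python
# One 32-bit value per pixel/triangle, streamed fresh every frame with
# zero caching (the "no RAM" assumption from the post above).
width, height, fps = 2560, 1440, 30
pixels = width * height                   # 3,686,400 pixels ~= triangles
bytes_per_frame = pixels * 32 // 8        # 14,745,600 bytes ~= 14.75 MB
stream_rate = bytes_per_frame * fps       # ~442 MB/s of pure output data
print(f"{bytes_per_frame / 1e6:.2f} MB/frame -> {stream_rate / 1e6:.0f} MB/s")
print(f"with 5x input overhead: {5 * stream_rate / 1e9:.2f} GB/s")  # ~2.21 GB/s
```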
 
Let's do some basic math here...

Damn, you’re blinding me with science here
 
Let's do some basic math here...
Some of you Xbox folks have really lost it after the UE5 reveal, and I don't understand why. XGS will be using this engine extensively as well. All your damage-control effort should be pointed at MS, demanding a better reveal for July, imo. A bunch of random nonsense calculations with assumption after assumption won't change that. Sorry, but cheers.
 
Last edited:

Jon Neu

Banned
That HB2 "in engine" cinematic isn't something that's playable though it's an in engine cinematic it's nowhere close to what gameplay will look like, in engine cinematics are always deceptive and all companies use them and they shouldn't. The UE5 demo was actually playable and captured directly from the dev kit.

Well, HB2 is a real game.

The UE5 gameplay was 1440p, 30 fps, had zero enemies and empty environments, and even used a slow animation to help the game load, among other scripted-as-fuck moments.
 
Some of you Xbox folks have really lost it after the UE5 reveal, and I don't understand why...

Damage control? Greenberg tweets out "imagine UE5 running on XsX", a pretty benign tweet, and people rail on him for it. That got called damage control too. Is everything XsX-related damage control to you?

Assumption after assumption is not science. Sorry 😞

I was kidding. I had just woken up, and that post was the first post I read. I have no idea what any of that means right now.
 

onQ123

Member
Let's do some basic math here...

I have a math problem for you.

If PS5 is using the engine to stream in 90MB of raw data each frame in a 60fps game with a 5.5GB/s SSD, how will the game run on the Xbox Series X with its 2.4GB/s SSD? Will they use half the detail each frame, or will they run the game at half the frames per second?


 
Last edited:

Ascend

Member
Some of you Xbox folks have really lost it after the UE5 reveal, and I don't understand why...
What makes you think I'm "Xbox folk", whatever that means? I'm primarily a PC gamer and overclocker.
 

vpance

Member
If PS5 is using the engine to stream in 90MB of raw data each frame in a 60fps game with a 5.5GB/s SSD, how will the game run on the Xbox Series X with its 2.4GB/s SSD?...

I don't think it will be half the perceivable detail. It depends on how well the lossy compression holds up.
 

Ascend

Member
If PS5 is using the engine to stream in 90MB of raw data each frame in a 60fps game with a 5.5GB/s SSD, how will the game run on the Xbox Series X with its 2.4GB/s SSD?...
That's the developer's choice. They can do either one. Or they can use compression and lose less detail while maintaining 60 fps. Again, that was assuming there's no RAM; it might be as simple as keeping an additional 50MB in RAM for each frame. As long as there's enough RAM available, the same result could be achieved.
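Here's that trade-off worked out, taking the hypothetical 90MB/frame at face value; the RAM/compression split is one possible answer, not a stated fact:

```python
# Per-frame streaming budget at 60 fps for each console's raw SSD rate.
FRAME_DEMAND_MB = 90                      # onQ123's hypothetical demand
FPS = 60

for name, gb_per_s in [("PS5", 5.5), ("XSX", 2.4)]:
    streamable = gb_per_s * 1000 / FPS    # MB deliverable per ~16.7ms frame
    shortfall = max(0.0, FRAME_DEMAND_MB - streamable)
    print(f"{name}: {streamable:.0f} MB/frame from SSD, "
          f"{shortfall:.0f} MB to cover via RAM caching or compression")
# PS5: ~92 MB/frame (covers 90); XSX: ~40 MB/frame, leaving ~50 MB,
# which matches the "additional 50MB in RAM" figure above.
```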
 

Jon Neu

Banned
The "real game" was an in-engine rendered cutscene at 24fps.

But yeah, the UE5 demo was shit. Let's go with that.

An in-engine cinematic of an actual game you are going to play.

The UE5 demo wasn't as impressive as some want to claim. The face of the character was shit, it was 1440p, 30fps, it had slow animations to help the game load, no enemies, pretty empty environments...

I'm sure we are going to see better things than this with UE5 soon enough.
 
Last edited:
An in-engine cinematic of an actual game you are going to play...
Loving all ur posts 😂
 

onQ123

Member
And then you think that only one game on PS4 was running on Unreal Engine, and it's Days Gone... Will they now make 10-20 AAA games on PS5 with this engine? Not the exclusive devs, at least. So that's not such good news.


So Fortnite isn't running on Unreal Engine? Final Fantasy VII Remake isn't running on Unreal Engine? Mortal Kombat 11 isn't running on Unreal Engine? Is there a group giving out pamphlets on how to downplay the demo?
 

Matsuchezz

Member
Now, my internet isn't great for watching live streams, so I wasn't able to FULLY appreciate the demo until I downloaded the 4K video and played it straight on my TV, and the quality is phenomenal. While it'll take up to two years to see games fully utilize UE5 at this level, I expect first-party titles to at least approach it in the meantime with their own in-house engines. I'm hyped.
Can you share the link so I can download it myself?
 


Alex from DF just put up an Inside Unreal Engine 5 article


Epic's reveal of Unreal Engine 5 running in real-time on PlayStation 5 delivered one of the seismic news events of the year and our first real 'taste' of the future of gaming. A true generational leap in terms of sheer density of detail, alongside the complete elimination of LOD pop-in, UE5 adopts a radical approach to processing geometry in combination with advanced global illumination technology. The end result is quite unlike anything we've seen before, but what is the actual nature of the new renderer? How does it deliver this next-gen leap - and are there any drawbacks?

Watching the online reaction to the tech trailer has thrown up some interesting questions, but some baffling responses too. The fixation on the main character squeezing through a crevice was particularly puzzling, but to make things clear, this is obviously a creative decision, not a means to slow down the character to load in more data; it really is that simple. Meanwhile, the dynamic resolution with a modal 1440p pixel count has also drawn some negative reaction. We have access to 20 uncompressed grabs from the trailer: they defy traditional pixel counting techniques.

Some interesting topics have been raised, however. The 'one triangle per pixel' approach of UE5 was demonstrated with 30fps content, so there are questions about how good 60fps content may look. There have also been some interesting points raised about how the system works with dynamic geometry, as well as transparencies like hair or foliage. Memory management is a hot topic too: a big part of the UE5 story is how original, full-fidelity assets can be used unaltered, unoptimised, in-game, so how is this processed? And to what extent is the Lumen in the Land of Nanite tech demo leveraging that immense 5.5GB/s of uncompressed SSD bandwidth?

Core to the innovation in Unreal Engine 5 is the system dubbed Nanite, the micro-polygon renderer that delivers the unprecedented detail seen in the tech demo.

"With Nanite, we don't have to bake normal maps from a high-resolution model to a low-resolution game asset; we can import the high-resolution model directly in the engine. Unreal Engine supports Virtual Texturing, which means we can texture our models with many 8K textures without overloading the GPU." Jerome Platteaux, Epic's special projects art director, told Digital Foundry. He says that each asset has 8K texture for base colour, another 8K texture for metalness/roughness and a final 8K texture for the normal map. So, we end up with eight sets of 8K textures, for a total of 24 8K textures for one statue alone," he adds.

Since detail is tied to pixel amount in screen size, there is no more hard cut-off: no LOD 'popping' as you see in current rendering systems. Likewise, ideally, it should not have that 'boiling' look you can see with standard displacement, as seen with the ground terrain in a game like 2015's Star Wars Battlefront (which still holds up beautifully today, it has to be said).

In lieu of triangle-based hardware-accelerated ray tracing, the UE5 demo on PlayStation 5 utilises screen-space shadows as seen in current generation games to cover small details, which are then combined with a virtualised shadow map.

"Really, the core method here, and the reason there is such a jump in shadow fidelity, is virtual shadow maps. This is basically virtual textures but for shadow maps. Nanite enables a number of things we simply couldn't do before, such as rendering into virtualised shadow maps very efficiently. We pick the resolution of the virtual shadow map for each pixel such that the texels are pixel-sized, so roughly one texel per pixel, and thus razor sharp shadows. This effectively gives us 16K shadow maps for every light in the demo where previously we'd use maybe 2K at most. High resolution is great, but we want physically plausible soft shadows


We were also really curious about exactly how geometry is processed, whether Nanite uses a fully software-based raw compute approach (which would work well across all systems, including PC GPUs that aren't certified with the full DirectX 12 Ultimate) or whether Epic taps into the power of mesh shaders, or primitive shaders as Sony describes them for PlayStation 5. The answer is intriguing.

"The vast majority of triangles are software rasterised using hyper-optimised compute shaders specifically designed for the advantages we can exploit," explains Brian Karis. "As a result, we've been able to leave hardware rasterisers in the dust at this specific task. Software rasterisation is a core component of Nanite that allows it to achieve what it does. We can't beat hardware rasterisers in all cases though so we'll use hardware when we've determined it's the faster path. On PlayStation 5 we use primitive shaders for that path which is considerably faster than using the old pipeline we had before with vertex shaders."

The other fundamental technology that debuts in the Unreal Engine 5 technology demo is Lumen, Epic's answer to one of the holy grails of rendering: real-time dynamic global illumination. Lumen is essentially a non-triangle, ray-tracing-based take on bounced lighting, which distributes light around the scene after the first hit of direct lighting.

"Lumen uses ray tracing to solve indirect lighting, but not triangle ray tracing," explains Daniel Wright, technical director of graphics at Epic. "Lumen traces rays against a scene representation consisting of signed distance fields, voxels and height fields. As a result, it requires no special ray tracing hardware."

To achieve fully dynamic real-time GI, Lumen has a specific hierarchy. "Lumen uses a combination of different techniques to efficiently trace rays," continues Wright. "Screen-space traces handle tiny details, mesh signed distance field traces handle medium-scale light transfer and voxel traces handle large scale light transfer."

And finally, the smallest details in the scene are traced in screen-space, much like the screen-space global illumination we saw demoed in Gears of War 5 on Xbox Series X. By utilising varying levels of detail for object size and utilising screen-space information for the most complex smaller detail, Lumen saves on GPU time when compared to hardware triangle ray tracing.
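Put together, that cascade can be sketched like so; the three tracers are stand-in stubs (the real ones are GPU passes), and the fallback ordering is a paraphrase of Wright's description:

```python
# Conceptual sketch of Lumen's trace hierarchy as described above.
def screen_space_trace(ray): return None  # tiny, on-screen detail
def mesh_sdf_trace(ray): return None      # medium-scale, per-mesh SDFs
def voxel_trace(ray): return "sky"        # large-scale, coarse voxel scene

def trace_indirect_ray(ray):
    # Try the cheapest, most detailed representation first, then fall back
    # to progressively coarser ones until something reports a hit.
    return screen_space_trace(ray) or mesh_sdf_trace(ray) or voxel_trace(ray)

print(trace_indirect_ray(ray=None))       # -> "sky" via the voxel fallback
```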

Another crucial technique in maintaining performance is temporal accumulation, where the contribution of light bounces is built up over time, from frame to frame.
 
Last edited:

Herr Edgy

Member
So making bugs move using the particle system isn't giving them artificial intelligence?
No, it's not. Is making the player move by pressing a button artificial intelligence? Your character even knows how to climb a wall, jump or fit into tight places and all you do is hold the analog stick!
 
Have these Unreal Engine demo videos ever had a game in the following generation actually get close? And as far as gameplay goes, this doesn't even look really playable, more like on rails.

Yes. The Unreal Engine 3 tech demo turned into a small game called Gears of War.

The video is from 2004, with a prototype of the Gears universe in the second half. The game released in 2006.
 
Last edited:
If a game on PS5 uses 5GB of VRAM and a PC has 20GB VRAM cards, it can store up to 4x the amount of data that the GPU will need.
The PS5 has 16GB of VRAM. The 2080 Ti only has 11GB. Maybe if you mean a Titan. Otherwise, what's needed may not even fit in memory if it uses over 11GB of VRAM.
You can just load more assets than the PS5.
The PS5 has 16GB of VRAM; outside of Titans, most cards have 8 or 11GB of VRAM.

If you try to load 30+GB into RAM, expect long loading times.
An in-engine cinematic of an actual game you are going to play...
That flight sequence was quite fast. And the character movement wasn't out of the ordinary.
 