
Unreal Engine 5.1 Features Scalable Lumen To Achieve 60fps On Consoles, And More

Lunatic_Gamer

Gold Member



Epic Games has published its roadmap for Unreal Engine 5.1, and among the list of new features is a more scalable version of Lumen, with the aim that current-gen consoles will be able to target 60fps in games that take advantage of the tech.

According to the Unreal Engine 5.1 roadmap, PS5 and Xbox Series X/S will have an easier time targeting 60fps in games running on the engine that take advantage of Lumen. In addition, the roadmap outlines a potential fix for PSO-related stutters in Unreal Engine games on PC.




Lumen Improvements In Unreal Engine 5.1

According to the roadmap shared by Epic Games, the following improvements are being made to Unreal Engine 5’s lighting tech:

  • Improved performance optimizations in High scalability mode with the goal of achieving 60 fps on consoles
  • Improved support for foliage
  • Reflections on Single Layer Water
  • Support for high-quality mirror reflections on translucent surfaces
  • Support for nDisplay (SWRT and HWRT)
  • Initial support for split-screen (SWRT only); performance characteristics still TBD
  • Experimental: Hardware Ray Tracing (HWRT) in Vulkan – Surface cache lighting only, no support yet for Hit Lighting
  • Many stability, quality, and bug fixes
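For readers who want to experiment, Lumen quality is already driven by Unreal's scalability groups, so the "High scalability mode" above presumably maps onto the existing tiers. A hedged sketch using UE 5.0-era console variables (names could change in 5.1):

```ini
; Sketch only — cvar names are from UE 5.0 documentation and may shift in 5.1.
; Scalability tiers run 0-3; tier 2 ("High") is the mode Epic is targeting
; at 60 fps on consoles, while tier 3 ("Epic") remains the 30 fps target.
sg.GlobalIlluminationQuality=2   ; Lumen GI follows this scalability group
sg.ReflectionQuality=2           ; Lumen reflections follow this group
r.Lumen.HardwareRayTracing=1     ; opt into HWRT where the platform supports it
```

In a shipping project these would normally live in the per-tier sections of DefaultScalability.ini rather than being set directly.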

Nanite Improvements In Unreal Engine 5.1

Based on the roadmap, the virtualized micropolygon geometry system introduced in Unreal Engine 5 is getting the following improvements:

  • Addition of a programmable rasterization framework
  • Masked materials
  • Two-sided foliage
  • Pixel depth offset
  • World position offset
  • Nanite material switch in the Material Editor
  • Additional diagnostic and debug modes
  • Many quality and performance improvements

Note: the exact feature list and expected stability and performance characteristics have yet to be confirmed.




Path Tracing Improvements In Unreal Engine 5.1

Unreal Engine 5.1’s Path Tracer will continue to receive support for additional features, including:

  • Exponential Height Fog and Sky Atmosphere Fog
  • Decals
  • Single Layer Water
  • Per-instance custom data
  • Light functions
  • Multi-GPU rendering

Automated PSO Gathering

The roadmap for Unreal Engine 5.1 mentions a new feature that is set to reduce PSO-related stutters in PC games running on the engine. Automated PSO Gathering will:

  • Replace the manual work required to collect all possible PSO combinations for a project
  • Keep the number of PSOs as small as possible
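For context, the manual workflow this automation replaces is UE's existing PSO caching pipeline, where developers record PSOs from play sessions and ship the cache with the build. A hedged sketch of the current UE 5.0-era settings (names may change once automated gathering lands):

```ini
; DefaultEngine.ini — manual PSO caching as it works today (UE 5.0 names; sketch only)
[DevOptions.Shaders]
NeedsShaderStableKeys=true          ; cook stable shader keys so recorded PSOs can be matched

[SystemSettings]
r.ShaderPipelineCache.Enabled=1     ; pre-compile the bundled PSO cache at startup
r.ShaderPipelineCache.LogPSO=1      ; record PSOs encountered during QA play sessions
```

The recorded logs then have to be merged and re-cooked into the build for every content change, which is exactly the per-project busywork the roadmap says automated gathering will remove.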


 

Kuranghi

Member
While I do think modern upscaling tech is really good in what it achieves, I don't like this push towards "native pixel counts don't matter anymore" and "this upscaled 1440p to 4K is the same as native"; it's completely not the same in 99% of cases and there is a not-insignificant trade-off in IQ when you do it.

Fair enough if you can't tell the difference, but I can, and I hate this "we don't need native anymore" pish. *grumpy armfold*
 

sinnergy

Member
While I do think modern upscaling tech is really good in what it achieves, I don't like this push towards "native pixel counts don't matter anymore" and "this upscaled 1440p to 4K is the same as native"; it's completely not the same in 99% of cases and there is a not-insignificant trade-off in IQ when you do it.

Fair enough if you can't tell the difference, but I can, and I hate this "we don't need native anymore" pish. *grumpy armfold*
Most of the time, good is good enough.
 

01011001

Banned
Better than using light probes paired with SSAO like we've been seeing for dynamic lighting these past two generations and even now.

It's not, if it looks like it does in the Matrix demo, which still has among the worst image quality I've ever seen; the only culprit that's left, after seeing the demo run with different AA methods and even DLSS, is Lumen in combination with their denoising.

Aspects of the image in that demo look like you're watching a really low-quality YouTube stream of the game.
 

Mister Wolf

Member
It's not, if it looks like it does in the Matrix demo, which still has among the worst image quality I've ever seen; the only culprit that's left, after seeing the demo run with different AA methods and even DLSS, is Lumen in combination with their denoising.

Aspects of the image in that demo look like you're watching a really low-quality YouTube stream of the game.

Raytraced GI will always be heavy no matter what optimizations they make. Faster performance equals more noise. No escaping this if you are not using dedicated hardware acceleration.
 

Sosokrates

Report me if I continue to console war
Well that unreal collaboration did its marketing job but nothing has come of the engine yet.
It takes a while. What was the first UE4 game which really showed its capabilities? I think it was Gears 4 or Paragon.
 
While I do think modern upscaling tech is really good in what it achieves, I don't like this push towards "native pixel counts don't matter anymore" and "this upscaled 1440p to 4K is the same as native"; it's completely not the same in 99% of cases and there is a not-insignificant trade-off in IQ when you do it.

Fair enough if you can't tell the difference, but I can, and I hate this "we don't need native anymore" pish. *grumpy armfold*
Totally agree, although it's going to be necessary to achieve good fps on console as the gen goes on. I recently bought COD Vanguard after watching NX Gamer talk about how amazing the temporal upscaling is and how it's "indistinguishable from native 4K", but I was disappointed in how soft it is compared to actual native res.
 
While I do think modern upscaling tech is really good in what it achieves, I don't like this push towards "native pixel counts don't matter anymore" and "this upscaled 1440p to 4K is the same as native"; it's completely not the same in 99% of cases and there is a not-insignificant trade-off in IQ when you do it.

Fair enough if you can't tell the difference, but I can, and I hate this "we don't need native anymore" pish. *grumpy armfold*
Spider-Man in fidelity mode (native 4K) really opens your eyes to how native still matters, especially on detailed titles.
 

01011001

Banned
Raytraced GI will always be heavy no matter what optimizations they make. Faster performance equals more noise. No escaping this if you are not using dedicated hardware acceleration.

Hence I say it looks like shit, so IMO no dev should actually use it on console, as even their heavy version looks shit in motion. I really can't imagine what an even worse-looking version will look like 🤮
 
Totally agree, although it's going to be necessary to achieve good fps on console as the gen goes on. I recently bought COD Vanguard after watching NX Gamer talk about how amazing the temporal upscaling is and how it's "indistinguishable from native 4K", but I was disappointed in how soft it is compared to actual native res.
It's a trade-off: do you want native 4K at lower settings, or high-to-ultra settings at sub-4K?
 

Kuranghi

Member
Totally agree, although it's going to be necessary to achieve good fps on console as the gen goes on. I recently bought COD Vanguard after watching NX Gamer talk about how amazing the temporal upscaling is and how it's "indistinguishable from native 4K", but I was disappointed in how soft it is compared to actual native res.

I completely understand its usage on console, as long as it doesn't go as low as your example. It's more in the PC space that I'm frustrated with it.

I think a lot of the time people are evaluating upscaling vs native in one or a combination of the following scenarios: on tiny monitors, on average-sized TVs from too far away, or on low-contrast and/or poorly calibrated displays, i.e. with sharpening or contrast too high, or any number of things that make it harder to see the difference because you've already broken the image in various ways.

It's a trade-off: do you want native 4K at lower settings, or high-to-ultra settings at sub-4K?

Usually I'd prefer high-to-ultra settings at 4K, but with a lower locked framerate. In some games 30fps is unwanted for gameplay reasons, or the engine has high input latency, so that plus 30fps pushes the lag past a tipping point where it's better to sacrifice the IQ/graphics to get 60fps instead. Those situations aside, though, I don't want to sacrifice the other stuff to get "kinda 60fps" with low IQ, because it's often not even a locked 60fps for all the sacrifice of IQ; better, in my eyes, to make two corners of the triangle fantastic and the third decent than all three merely good.

I usually favour settings, then res, then fps, in that order, but it also depends how low the res goes; if it's so low you can't make out the distant texture detail or higher-detail effects, then there's not much point in using the highest settings in the first place.
 
I completely understand its usage on console, as long as it doesn't go as low as your example. It's more in the PC space that I'm frustrated with it.

I think a lot of the time people are evaluating upscaling vs native in one or a combination of the following scenarios: on tiny monitors, on average-sized TVs from too far away, or on low-contrast and/or poorly calibrated displays, i.e. with sharpening or contrast too high, or any number of things that make it harder to see the difference because you've already broken the image in various ways.



Usually I'd prefer high-to-ultra settings at 4K, but with a lower locked framerate. In some games 30fps is unwanted for gameplay reasons, or the engine has high input latency, so that plus 30fps pushes the lag past a tipping point where it's better to sacrifice the IQ/graphics to get 60fps instead. Those situations aside, though, I don't want to sacrifice the other stuff to get "kinda 60fps" with low IQ, because it's often not even a locked 60fps for all the sacrifice of IQ; better, in my eyes, to make two corners of the triangle fantastic and the third decent than all three merely good.
This I agree with, now that we have VRR.
 
I completely understand its usage on console, as long as it doesn't go as low as your example. It's more in the PC space that I'm frustrated with it.

I think a lot of the time people are evaluating upscaling vs native in one or a combination of the following scenarios: on tiny monitors, on average-sized TVs from too far away, or on low-contrast and/or poorly calibrated displays, i.e. with sharpening or contrast too high, or any number of things that make it harder to see the difference because you've already broken the image in various ways.



Usually I'd prefer high-to-ultra settings at 4K, but with a lower locked framerate. In some games 30fps is unwanted for gameplay reasons, or the engine has high input latency, so that plus 30fps pushes the lag past a tipping point where it's better to sacrifice the IQ/graphics to get 60fps instead. Those situations aside, though, I don't want to sacrifice the other stuff to get "kinda 60fps" with low IQ, because it's often not even a locked 60fps for all the sacrifice of IQ; better, in my eyes, to make two corners of the triangle fantastic and the third decent than all three merely good.

I usually favour settings, then res, then fps, in that order, but it also depends how low the res goes; if it's so low you can't make out the distant texture detail or higher-detail effects, then there's not much point in using the highest settings in the first place.
Yeah, when DF started up with this whole notion that "native res is no longer important" I got totally triggered.
 
Hence I say it looks like shit, so IMO no dev should actually use it on console, as even their heavy version looks shit in motion. I really can't imagine what an even worse-looking version will look like 🤮
You're talking about RT GI? It looked great in Metro on PS5; obviously they had to make sacrifices to IQ, but it was totally worth it.
 

Mister Wolf

Member
On a big screen, sitting close, DLSS is easily distinguishable from native. I usually pair DLSS with Nvidia's Sharpen+ filter.
 

01011001

Banned
You know damn well what I mean.

Don't expect that much from UE5 is what I'm trying to say.
UE5 is basically UE4.11. Yes, it has new features, but especially on console most of them will most likely not be used all that much, or if they are, they will be used sparingly, I bet.
 

Mister Wolf

Member
You're talking about RT GI? It looked great in Metro on PS5; obviously they had to make sacrifices to IQ, but it was totally worth it.

Would you be content with games only looking slightly better than Metro Exodus graphically this generation? Personally I would. It's a 60fps game.
 

01011001

Banned
You're talking about RT GI? It looked great in Metro on PS5; obviously they had to make sacrifices to IQ, but it was totally worth it.

I'm talking specifically about Lumen as seen in UE5's Matrix demo. That shit looked like ass the moment it wasn't used in a controlled environment (aka the scripted intro sequence).

The amount of disocclusion artifacts in that demo is staggering.
 

FireFly

Member
Don't expect that much from UE5 is what I'm trying to say.
UE5 is basically UE4.11. Yes, it has new features, but especially on console most of them will most likely not be used all that much, or if they are, they will be used sparingly, I bet.
Nanite isn't going to be used?
 

Sosokrates

Report me if I continue to console war
That's true, because in 2015 we were still getting Unreal Engine 3 games like Arkham Knight, which looked phenomenal at the time.
We have had one current-gen UE4 exclusive this gen, Returnal, which looks pretty good with its smoke effects and particles.

UE5 seems to be a bigger improvement than 4 was.

However, while I'm still unsure, this gen seems to be a smaller leap in visuals than previous ones. The UE5 demos have shown some improvements, but it's not going to be until a big AAA game uses next-gen tech that we'll know what visuals these consoles are truly capable of.
 