
Raytracing doesn't have to be expensive - Raytraced GI for little to no FPS loss.

Dampf

Member
2 days ago, a patch was released for an MMO shooter called Ring of Elysium. It is free to play and looks pretty decent for that, in my opinion. But the game itself doesn't matter here.

That patch added what might be a glimpse into the future of Raytracing, specifically Raytraced GI. You might know from experience that turning on Raytracing is very expensive. In Metro, you get an FPS loss of 25% on high and more on ultra (pretty reasonable for this kind of quality, I might say). In Fortnite you pay 60% or more of the framerate at medium settings just for the GI, for little to no visual difference, which is absolutely bonkers and cannot be recommended at all. Then you have all the other UE4 games similarly destroying the framerate.

This little patch, however, added Raytraced GI into the game, but properly. The game already has a pretty well-built baked GI solution, like you're used to in most games. The Raytraced GI enhances that significantly.

The best part? It literally costs nothing if you have a Raytracing-capable GPU and, as I said, it enhances the visual quality massively. Take a look.


zru4VA6.jpg


This is the game running at max settings with its prebaked GI at 1440p on a thin-and-light 2060 laptop. As you can see, the interior of that container looks brightly lit with your usual type of SSAO, even though not much light should reach that area as it's covered in shadow. Or look at that building in the background: it is clearly covered in sunlight, yet it appears to be shaded dark. This is your typical fake GI lighting in video games. Take a look at the framerate: 73 FPS.

Now, let's take a look at the Raytraced shot.


Wrj8eCk.jpg


Now we are talking! The game is lit beautifully. The dark corners are actually dark, and the building in the background finally has its bright sunlight. And what is this? Look at the frame counter. We have 74 FPS now. That is actually 1 FPS faster than without RTX! And yep, we are not using any reconstruction here; it is still native 1440p. The game doesn't support DLSS.
This is RTX set to medium, but it looks pretty similar to the max RT settings. Max RT is obviously more demanding, but the performance hit is still much less than UE4's RT GI.

Let's take a look at another screenshot.


nDz2CH5.jpg

VS

cvqHGTs.jpg

The last screenshot is RTX on. Without RTX, the hangar is, as always, way too bright and not properly lit. With RTX on, you can notice that the areas reached by sunlight are brighter than the others. In the RTX-on screenshot, the blue container's underside, where light cannot reach easily, is darker, like it should be. You might also notice pretty bounce lighting from the red container on the ground and on the player's character, which is also top-tier stuff. Then take a look at the framerate again: only 1 FPS less than without RTX!

So, why is this a glimpse into the future? Well, Nvidia recently updated their RTXGI SDK, which brings UE5 Lumen-like multi-bounce Global Illumination into people's hands. Unlike Lumen, however, RTXGI uses hardware-accelerated Raytracing for light-probe updating, which makes it super fast, as it makes use of the specialized hardware in Nvidia and AMD GPUs, as well as in the PS5 and Xbox Series. That is magnitudes more efficient than Lumen, as it doesn't leave the specialized hardware idle. And it looks just as great. Read more here: https://developer.nvidia.com/rtxgi
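For those curious how probe-based GI works under the hood, here is a toy sketch of the concept behind RTXGI/DDGI (this is not the SDK's actual API, just the idea): irradiance lives in a regular grid of probes, shading a point trilinearly interpolates the 8 surrounding probes, and the hardware RT cores are only needed to update the probe values each frame.

```python
import numpy as np

def sample_probes(grid, pos, spacing=1.0):
    """Trilinearly interpolate irradiance (RGB) from a 3D probe grid at a world position.

    grid has shape (Nx, Ny, Nz, 3); one irradiance value per probe."""
    p = np.asarray(pos, dtype=float) / spacing
    i = np.floor(p).astype(int)   # index of the cell's base probe
    f = p - i                     # fractional position inside the cell
    out = np.zeros(3)
    # Blend the 8 probes at the cell corners with trilinear weights.
    for dz in (0, 1):
        for dy in (0, 1):
            for dx in (0, 1):
                w = ((1 - f[0], f[0])[dx]
                     * (1 - f[1], f[1])[dy]
                     * (1 - f[2], f[2])[dz])
                out += w * grid[i[0] + dx, i[1] + dy, i[2] + dz]
    return out
```

Because shading only touches a handful of probes instead of tracing per-pixel rays, the per-frame shading cost stays nearly constant no matter how complex the scene is; only the probe update pass uses the RT hardware.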

How much does it cost? Only about 1 ms on a Turing GPU and 0.5 ms on an Ampere GPU, which means barely any impact on performance. Nvidia plans to integrate it into UE4 for all DXR-capable GPUs, meaning even AMD cards can make use of RTXGI.
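To put those millisecond figures in perspective, converting a fixed per-frame cost into FPS is simple arithmetic (the 60 FPS baseline here is just an example, not a measurement):

```python
# Back-of-envelope: what a fixed per-frame GI cost in milliseconds means in FPS.
# The ~1 ms (Turing) / ~0.5 ms (Ampere) figures are the ones quoted above.
def fps_after(base_fps, added_ms):
    """Framerate after adding a fixed per-frame cost in milliseconds."""
    return 1000.0 / (1000.0 / base_fps + added_ms)

print(round(fps_after(60, 1.0), 1))   # 60 FPS + 1.0 ms -> ~56.6 FPS
print(round(fps_after(60, 0.5), 1))   # 60 FPS + 0.5 ms -> ~58.3 FPS
```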

Of course, those Raytraced GI solutions run best on a card with DXR acceleration. However, they also run nicely on cards without hardware-accelerated RT, which is a key argument: they are not locked to RTX or RDNA2 cards. So in the future, developers might want to ditch baking rasterized lights completely, as solutions like RTXGI still run fast enough on cards without dedicated RT hardware, meaning they can satisfy a huge playerbase while reducing development cost significantly. So yes, I believe this is the future.

If you have a Pascal GPU, I would love to know how the software-based approach runs for you in this game. Elysium is free to play on Steam for everyone to try out! Be aware, however, that this is likely not RTXGI but a custom-made solution for this game.

Still, this might be the first game to offer a glimpse into the future of highly efficient Raytracing.
 
Last edited:

Mister Wolf

Member
So they made RTGI less expensive. That's good news. It might not be exactly the same as what Metro Exodus was doing. I know Control only uses RT to augment its own lighting system, while Metro completely replaces the light-probe technique in its engine with RTGI.

Edit: See the YouTube video posted further down of someone using it. Big performance hit.
 
Last edited:

Dampf

Member
The inside of the warehouse with the stairs looks mad artificial without RTGI.
Yep. The difference is similar to Metro IMO, but costs much less.

Sadly, this game will fly under the radar for most people, but it is the first game ever released to have what I call a highly efficient Raytracing implementation. We will start to see these more with the launch of the next-gen consoles.
 

Dampf

Member
I recall Sony saying the same thing about GI costs a while back. Will this be common now?

5Zam2ap.png
Good catch. I was wondering why he was putting GI so low in that performance list, given it's usually pretty demanding, but if devs really know how to optimize RT GI now, or, better said, use RT to enhance GI, it does make perfect sense.

And this is awesome news, because good RTGI is IMO easily the most obvious effect of Raytracing.
 

geordiemp

Member
Good catch. I was wondering why he was putting GI so low in that performance list, given it's usually pretty demanding, but if devs really know how to optimize RT GI now, or, better said, use RT to enhance GI, it does make perfect sense.

And this is awesome news, because good RTGI is IMO easily the most obvious effect of Raytracing.

I recall some GAF posters saying that was rubbish and that GI is more costly in RT. They know who they are.

Anyway, it's good that it will be spread around, and it seems most devs will be onto it.
 
Last edited:

Mister Wolf

Member
I recall some GAF posters saying that was rubbish and that GI is more costly in RT. They know who they are.

Anyway, it's good that it will be spread around, and it seems most devs will be onto it.



Look at the framerate drop when turning the GI on. The title of this thread about it not incurring a significant performance hit simply isn't true. Everything posters told you about RTGI being more costly was the truth.
 
Last edited:

Dampf

Member


Look at the framerate drop when turning the GI on. The title of this thread about it not incurring a significant performance hit simply isn't true. Everything posters told you about RTGI being more costly was the truth.


That video is at high RT settings; I was testing medium, at around 74 FPS. Medium already looks good enough and doesn't even hit the framerate much, even at high refresh rates beyond 130 FPS like in that video. High, however, does hit the framerate, as shown in that video. Of course, when running at higher framerates, GI has a tighter frame time budget and will affect performance more. I was a bit off about high though, as it costs more than 2-3 FPS for sure; I didn't test that one much, as I don't care for high RT settings.
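The frame time budget point is worth making concrete: the same fixed per-frame cost eats far more FPS at a high baseline framerate, because each frame has less time to spare. These are illustrative numbers, not measurements of this game:

```python
# The same per-frame cost (in ms) "costs" more FPS the higher the baseline,
# because the per-frame time budget shrinks as framerate rises.
def fps_loss(base_fps, added_ms):
    """FPS lost when a fixed per-frame cost in milliseconds is added."""
    return base_fps - 1000.0 / (1000.0 / base_fps + added_ms)

# A hypothetical 1 ms GI pass:
print(round(fps_loss(74, 1.0), 1))    # ~5 FPS lost at a 74 FPS baseline
print(round(fps_loss(160, 1.0), 1))   # ~22 FPS lost at a 160 FPS baseline
```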

Medium Raytracing is the way to go here.

Here is the RT GI performance at medium with a tighter frametime budget.


Yf9FrVw.png


hYDBDz0.png


The performance impact is still minimal and I'm not lying or anything.
 
Last edited:

Dampf

Member
RT implementation cost scales with its accuracy. There really isn't much more to say to this topic.
It's a good rule of thumb, but it's not always the case.

Take the RTGI in Fortnite, for example. It destroys the framerate, and you have to look very closely for differences. In Elysium, the differences are immediately obvious, especially the correctly lit interiors. Not the case with Fortnite.
 
When we're getting our ‘ray tracing’ options for consoles in the future, I hope we don't just get ‘ray tracing on or off’. I want my RT GI, but I don't care so much about reflections if screen space can do essentially the same thing.
I hope devs can work out algorithms to combine ray tracing and screen space, so that ray tracing just fills in where screen space can't cover. I hope this would be more efficient.
 
To me this game looks like shit in all the pics :messenger_poop:

And you claimed little to no FPS loss? But in the video, turning it on takes the FPS from 160 to 110. Still more than playable, sure, but that IS a significant hit to the framerate: -50 FPS.
 
Last edited:
It's a good rule of thumb, but it's not always the case.

Take the RTGI in Fortnite, for example. It destroys the framerate, and you have to look very closely for differences. In Elysium, the differences are immediately obvious, especially the correctly lit interiors. Not the case with Fortnite.
Think again about what I wrote... Your comment is not even related to what you quoted from me.
 
Last edited:

Lethal01

Member
Accurate GI is a much bigger benefit than raytraced reflections too, IMO. It enhances everything, not just shiny surfaces.

Good stuff.

I mostly agree, but let's not forget that shiny environments and armored creatures exist.
A game focused on fighting robots or set in a sci-fi world could see huge benefits. It's very circumstantial.
 

Lethal01

Member
I recall some GAF posters saying that was rubbish and that GI is more costly in RT. They know who they are.

Anyway, it's good that it will be spread around, and it seems most devs will be onto it.

GI is more costly if you want more accurate or dynamic results than what this game offers.
 

Shmunter

Member
It stands to reason that techniques will improve and efficiency will go up over time. Software is a scientific and engineering pursuit and will always move forward.

De-noising techniques are already allowing fewer rays to be cast, with interpolated results filling in the blanks.

As far as consoles go, things will be interesting. XSX can fire more beams per clock, whereas PS5 fires fewer but faster beams, allowing them to travel further per clock. The real-world outcome will be fascinating.
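The "fewer rays, fill in the blanks" idea can be sketched with the simplest building block of RT denoising, per-pixel temporal accumulation; real denoisers (e.g. SVGF or Nvidia's NRD) add reprojection and variance-guided spatial filtering on top, so treat this as the bare concept only:

```python
import numpy as np

def accumulate(history, noisy_frame, alpha=0.2):
    """Exponential moving average per pixel: reusing past frames means far
    fewer rays are needed per frame to converge to a stable image."""
    return (1.0 - alpha) * history + alpha * noisy_frame

# Demo: a constant "true" signal sampled with heavy per-frame noise.
rng = np.random.default_rng(0)
truth = 0.5
hist = rng.random((64, 64))                       # start from pure noise
for _ in range(50):
    noisy = truth + 0.2 * rng.standard_normal((64, 64))
    hist = accumulate(hist, noisy)                # blend new sample into history
print("mean error after 50 frames:", abs(hist.mean() - truth))
```

The trade-off is lag: a low alpha gives a smoother image but reacts slowly to lighting changes, which is why production denoisers invalidate history when the camera or lights move.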
 

Elias

Member
It stands to reason that techniques will improve and efficiency will go up over time. Software is a scientific and engineering pursuit and will always move forward.

De-noising techniques are already allowing fewer rays to be cast, with interpolated results filling in the blanks.

As far as consoles go, things will be interesting. XSX can fire more beams per clock, whereas PS5 fires fewer but faster beams, allowing them to travel further per clock. The real-world outcome will be fascinating.
Modern rendering engines don't really take into account pixel fill rate, so the Series X is just flat out better.
 
I think when most people hear raytracing, it's reflections they want to see, even if all of those other things add visual improvements.
 

Armorian

Banned
i honestly don't care about raytracing

give me superior graphics with inferior reflections and puddles any day of the week

Ryse and The Order 1886 look better than any next gen footage I've seen so far

You can argue about the importance of RT reflections, but RT GI is what we need; lighting in games is far worse without it. We need GI in general, and I think the RT implementation is the most accurate (compared to SVOGI, for example).

Ryse looked good in 2013, and The Order... I don't know what people see in this game's GFX aside from a few (great-looking) PBR materials. Low-res textures, blur, and piss filters all over it.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
i honestly don't care about raytracing

give me superior graphics with inferior reflections and puddles any day of the week

Ryse and The Order 1886 look better than any next gen footage I've seen so far

Lighting might be the most important part of superior graphics, so you should care about RT GI at the very least, as any game without a static TOD (where lighting could be baked) will look much, much better with GI than without.
 

pyrocro

Member
It stands to reason that techniques will improve and efficiency will go up over time. Software is a scientific and engineering pursuit and will always move forward.

De-noising techniques are already allowing fewer rays to be cast, with interpolated results filling in the blanks.

As far as consoles go, things will be interesting. XSX can fire more beams per clock, whereas PS5 fires fewer but faster beams, allowing them to travel further per clock. The real-world outcome will be fascinating.
You have to clarify that statement.
Everything happens at the same speed per clock on both chips (same architecture, same clock generator, same speed of electrons).
If it's all happening in the same "clock", the one with more units will do more work for a given clock.
XSX's capacity for work per clock will be higher.
 
Last edited:
i honestly don't care about raytracing

give me superior graphics with inferior reflections and puddles any day of the week

Ryse and The Order 1886 look better than any next gen footage I've seen so far

It's not about puddles. It's about lighting. Raytracing literally makes the difference between a game looking realistic or "game-y".
 

Shmunter

Member
You have to clarify that statement.
Everything happens at the same speed per clock on both chips (same architecture, same clock generator, same speed of electrons).
If it's all happening in the same "clock", the one with more units will do more work for a given clock.
XSX's capacity for work per clock will be higher.
PS5 is clocked higher; XSX is clocked lower but wider.

Think of 118 torches on XSX vs 100 on PS5 per clock. On PS5 the torches are stronger and can bounce off 10 objects vs 6 objects on XSX within the same amount of time. Made-up figures to illustrate the point.
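Torches aside, the wider-vs-faster tradeoff is plain multiplication: peak throughput is units × clock rate. A rough sketch using the public RDNA2 figure of up to 4 ray-box tests per CU per clock (a theoretical peak, not real-world performance, and the PS5 clock is variable):

```python
# Peak ray-box intersection throughput = CUs * clock (Hz) * tests per CU per clock.
# RDNA2 ray accelerators can test up to 4 boxes (or 1 triangle) per CU per clock.
xsx = 52 * 1.825e9 * 4   # Xbox Series X: 52 CUs at a fixed 1.825 GHz
ps5 = 36 * 2.23e9 * 4    # PS5: 36 CUs at up to 2.23 GHz (variable clock)
print(f"XSX ~{xsx / 1e9:.0f}G box tests/s, PS5 ~{ps5 / 1e9:.0f}G box tests/s")
```

By this crude peak metric the wider chip comes out ahead; actual results depend on occupancy, memory behavior, and everything else around the ray accelerators.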
 
Last edited:

OZ9000

Banned
The game looks like shit, RT or not.

I don't care about fancy bullet points. Just give me a game which looks good.
 

sinnergy

Member
Screenspace solutions are less expensive but also less precise, and that's about it. Good RT GI is expensive.
 

Self

Member
i honestly don't care about raytracing

give me superior graphics with inferior reflections and puddles any day of the week

Ryse and The Order 1886 look better than any next gen footage I've seen so far

I agree. The best-looking games this gen had superb lighting without RT.
RT is nice and all, but you don't sacrifice overall picture IQ for something like RT and higher resolution.

I'm sure competent devs will find a workaround.
 
Last edited:

Dampf

Member
To me this game looks like shit in all the pics :messenger_poop:

And you claimed little to no FPS loss? But in the video, turning it on takes the FPS from 160 to 110. Still more than playable, sure, but that IS a significant hit to the framerate: -50 FPS.
At High settings.

The visual differences from Medium to High are minor.
 

pyrocro

Member
PS5 is clocked higher; XSX is clocked lower but wider.

Think of 118 torches on XSX vs 100 on PS5 per clock. On PS5 the torches are stronger and can bounce off 10 objects vs 6 objects on XSX within the same amount of time. Made-up figures to illustrate the point.
Geeez, how to say this:
it is you who is confused; I'm correcting you.
It stands to reason that techniques will improve and efficiency will go up over time. Software is a scientific and engineering pursuit and will always move forward.

De-noising techniques are already allowing fewer rays to be cast, with interpolated results filling in the blanks.

As far as consoles go, things will be interesting. XSX can fire more beams per clock, whereas PS5 fires fewer but faster beams, allowing them to travel further per clock. The real-world outcome will be fascinating.

"Per clock" means 1 clock cycle, meaning 1 generated pulse.

There is no traveling further per clock, as the speed of the electron is ~fixed.

This is basic, and you're confusing the units in your first post. Just acknowledge the nonsense you wrote, then we can attempt to unwind what you're attempting to convey.

It's like saying "10 meters" is a speed.

I can't be more clear than that.
 
Last edited:

Shmunter

Member
Geeez, how to say this:
it is you who is confused; I'm correcting you.


"Per clock" means 1 clock cycle, meaning 1 generated pulse.

There is no traveling further per clock, as the speed of the electron is ~fixed.

This is basic, and you're confusing the units in your first post. Just acknowledge the nonsense you wrote, then we can attempt to unwind what you're attempting to convey.

It's like saying "10 meters" is a speed.

I can't be more clear than that.
Did I not say ‘in the same amount of time’?

A wider GPU does more operations per clock, in case it's not clear.

Love semantic wars, like being back in grade school.
 
Last edited:

Inviusx

Member
Does this game have a dynamic time of day?

I've read that RTGI will greatly impact performance if the game has a dynamic time of day.
 

pyrocro

Member
Did I not say ‘in the same amount of time’?

A wider GPU does more operations per clock, in case it's not clear.
Look at what you wrote below.
XSX can fire more beams per clock, whereas PS5 fires fewer but faster beams, allowing them to travel further per clock.
It's as if you're talking to your 30-minutes-ago self and correcting yourself.

Love semantic wars, like being back in grade school.

Semantics.
LOL
I guess what you wrote up there is correct then?

Don't be the guy that can't deal with corrections.

Travel further per clock...??? Brilliant stuff.
And you still don't get it.
PS5 is clocked higher; XSX is clocked lower but wider.

Think of 118 torches on XSX vs 100 on PS5 per clock. On PS5 the torches are stronger and can bounce off 10 objects vs 6 objects on XSX within the same amount of time. Made-up figures to illustrate the point.

Your visualizing of how things work is just wrong (not talking about the made-up numbers; the actual analogy is just wrong).

No need to feel like you're back in high school when you're being corrected, dude.
Just repeat 5 times: "I'm a big boy and it's ok to be corrected."
 

Shmunter

Member
Look at what you wrote below.

It's as if you're talking to your 30-minutes-ago self and correcting yourself.



Semantics.
LOL
I guess what you wrote up there is correct then?

Don't be the guy that can't deal with corrections.

Travel further per clock...??? Brilliant stuff.
And you still don't get it.


Your visualizing of how things work is just wrong (not talking about the made-up numbers; the actual analogy is just wrong).

No need to feel like you're back in high school when you're being corrected, dude.
Just repeat 5 times: "I'm a big boy and it's ok to be corrected."
I’m sorry if you’re autistic. No harm, no foul, no judgement.
 
Last edited: