
Is ray tracing a pointless gimmick or the next milestone in graphical advancement?

Ray tracing good

  • Yeah

    Votes: 215 68.0%
  • Nope

    Votes: 101 32.0%

  • Total voters
    316

Drell

Member
Light simulation has always been what makes games look next-gen to me. I don't care about refresh rates above 60 Hz or resolutions higher than 1080p, so I hope that when I replace my GTX 1070, games will let me trade all of those things for complex geometry and good light simulation, not limited to just reflections.
 

01011001

Banned
I seriously doubt the RTX 50 series will be able to render games that look like this at acceptable framerates.
This is a static scene, too. Throw in object physics and things become way more complicated and taxing, even for the best hardware.



I mean look at the demos Nvidia already runs fully pathtraced, and with Cyberpunk running at 30fps on a 4090 (higher with DLSS3 of course) with RT Overdrive.

that's really just one step away.

who knows maybe Nvidia will even use Cyberpunk again with the 50 series and work with CDPR on a Raytracing EXTREME OVERDRIVE mode or something that's fully pathtraced
 

Rayderism

Member
I don't think it's a gimmick, just kinda impractical at this point. It's just far too taxing on the hardware when striving for high framerates.

I figure that sometime in the future some company will have a eureka moment and come up with a method/technique (whatever) that will simplify its implementation and allow it to be used without it being so demanding on the hardware. I mean, right now, it feels like a brute forced way of doing it when just turning it on slams your framerates so nastily.
 

LiquidMetal14

hide your water-based mammals
I mean look at the demos Nvidia already runs fully pathtraced, and with Cyberpunk running at 30fps on a 4090 (higher with DLSS3 of course) with RT Overdrive.

that's really just one step away.

who knows maybe Nvidia will even use Cyberpunk again with the 50 series and work with CDPR on a Raytracing EXTREME OVERDRIVE mode or something that's fully pathtraced
Running Cyberpunk maxed/psycho settings with DLSS 2.0 gets me finally over constant 60fps and averaging around 70. Will be interesting to see RT Overdrive/DLSS 3.0 numbers.

For comparison, here are my findings with DLSS 3.0 in A Plague Tale: Requiem.
Decided to continue to where you first spot the rats.

Took DLSS 2.0 vs. 3.0 screens. HDR off this time.

2.0
[screenshot]

3.0
[screenshot]
It's as maxed as you can get (4k res)
 

Haggard

Banned
No.
Ray-tracing is the oldest, the most straight-forward and the most "stupid" technology in computer graphics.

There are a lot of much more modern and better looking methods, that don't take years to render:
- Path tracing
- Metropolis light transport
- Photon mapping
- GIBS
Etc. etc.
Aside from GIBS, which I don't know, every algorithm you mentioned is at its core RT. There are easier ways to say "I have no idea what I'm talking about".
 
Last edited:

JCK75

Member
It's a vital feature that is in its infancy, and the benefits for now are moderate.
But like bump/normal maps, in time, when hardware can actually handle it, it will be incredible.
 

DeepEnigma

Gold Member
It's not a gimmick at all. That said, there are some amazing advancements in rasterization lighting techniques that come so close your average person couldn't tell them apart.

It will really shine in 10 years time.
 

01011001

Banned
I don't think it's a gimmick, just kinda impractical at this point. It's just far too taxing on the hardware when striving for high framerates.

I figure that sometime in the future some company will have a eureka moment and come up with a method/technique (whatever) that will simplify its implementation and allow it to be used without it being so demanding on the hardware. I mean, right now, it feels like a brute forced way of doing it when just turning it on slams your framerates so nastily.

you couldn't be farther from the truth here.

we can't brute force it, we do the exact opposite currently.
what you currently see in terms of raytracing in games is only even possible due to advanced denoising algorithms, Nvidia and Intel are pushing the boundaries here year after year.

current hardware can't shoot even remotely enough rays into a scene to get an actual coherent image back, what you get is a mess of visual noise and seemingly random pixels scattered across a scene.

thanks to intelligent denoising the GPU is able to create a coherent image out of a mess of visual noise.

so GPUs right now use the absolute minimal amount of rays possible, just enough so that these really astonishingly good denoising algorithms can build a good looking image.
real-time raytracing wouldn't be possible by brute forcing it, it's only possible thanks to very clever image reconstruction.
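The low-samples-plus-denoising idea above can be sketched in a few lines. This is an illustrative toy, not any shipping denoiser: a 1D "image" is estimated with one noisy sample per pixel, then a simple box filter stands in for the learned, edge-aware denoisers (OptiX, OIDN) that real renderers use.

```python
import random

random.seed(42)

WIDTH = 64
SAMPLES_PER_PIXEL = 1  # deliberately tiny, like real-time RT

# Ground truth: a smooth 1D "scene" radiance ramp.
def true_radiance(x):
    return 0.5 + 0.4 * (x / (WIDTH - 1))

# Monte Carlo estimate: each sample is the true value plus noise,
# mimicking the variance of shooting very few rays per pixel.
noisy = []
for x in range(WIDTH):
    samples = [true_radiance(x) + random.gauss(0, 0.3)
               for _ in range(SAMPLES_PER_PIXEL)]
    noisy.append(sum(samples) / SAMPLES_PER_PIXEL)

# Crude stand-in denoiser: a 5-tap box filter. Real denoisers are
# learned and edge-aware, but the principle is the same: trade
# spatial information for fewer rays.
def denoise(img, radius=2):
    out = []
    for i in range(len(img)):
        lo, hi = max(0, i - radius), min(len(img), i + radius + 1)
        out.append(sum(img[lo:hi]) / (hi - lo))
    return out

denoised = denoise(noisy)

def mse(img):
    return sum((v - true_radiance(i)) ** 2
               for i, v in enumerate(img)) / len(img)

print(f"MSE noisy:    {mse(noisy):.4f}")
print(f"MSE denoised: {mse(denoised):.4f}")  # substantially lower
```

Even this crude filter cuts the error severalfold; production denoisers do far better while preserving edges, which is what makes such low ray counts viable at all.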
 
Last edited:

hlm666

Member
I seriously doubt RTX50 series will be able to render games that look like this at acceptable framerates.
This is a static scene too. Throw in object physics and things become way more complicated and taxing for the best hardware.
It might be a close call; between rtx racer, marbles and this, the rtx50 series might be the start of us seeing new fully path-traced games. I say new because that rtx remix thing is gonna pop out a lot of old games with path tracing.

 

Drizzlehell

Banned
No.
Ray-tracing is the oldest, the most straight-forward and the most "stupid" technology in computer graphics.

There are a lot of much more modern and better looking methods, that don't take years to render:
- Path tracing
- Metropolis light transport
- Photon mapping
- GIBS
Etc. etc.

The problem is that you probably drank the NV Kool-Aid too eagerly. NV called their implementation of a dynamic pipeline "RTX" for marketing reasons.
It should have been called DRX, "dynamic rendering pipeline," or something similar. But NV always thinks their fans are stupid... (see 4080 12GB)

The idea of RTX is to remove "rasterization" as the only "fixed function" step that's left in the pipeline.
Essentially we had "vertex shaders", "pixel shaders" and instead of "rasterization shaders" we got RTX.
Which is a pretty convoluted and simplistic implementation of what it should have been.

It's akin to the launch of the GeForce 2 GTS back in the day, when "Transform & Lighting" was introduced.
A precursor to the real shader implementation that worked as intended (in the GeForce FX 5XXX).
You should expect at least 3 generations here too.



These games have some of the worst art direction I've ever seen.
No wonder they look "good" with RTX. It's only because they look atrocious without it...
Not sure why you're quoting me for this post when I made it pretty clear that my knowledge about this is cursory at best and honestly, I couldn't care less about this nerd stuff.

All I care about is for games to look nice and perform well, and the issue I'm describing is that hardware producers, as per usual, are shoving some dumb gimmick down our throats to make it look like they're making great strides in graphics technology. We've already reached a point where video game graphics look so good that making them look any better is extremely difficult; that's why last-gen games still look great today, and why we haven't seen any major leap in graphics quality on par with what we saw throughout the 2000s for nearly a decade now. So they resort to peddling hardware-intensive gimmicks that can't justify their own existence, and the price is way too high (lower resolutions and framerates).

This feels more like taking one step forward and two steps back. Or more like one step sideways and then a back flip where we shit in our own mouths.
 
Last edited:

K' Dash

Member
Most of its implementations are lost on me when I'm playing because I'm not looking for it. I mean, it's good, but not worth it TO ME; I'd just turn it off.
 

SF Kosmo

Al Jazeera Special Reporter
Being able to do raytracing in real time is absolutely a paradigm shift, but we're still in the early days of that transition. Performance costs are still high, and the implementation of ray tracing is often limited as a result, leading to questions of cost to benefit.

But make no mistake it's where we're going. I mean truly we are going to run out of other things to spend performance on, especially as AI resolution upscaling and frame generation make it pointless to just chase higher and higher resolutions and framerates, and we hit the wall on what we can reasonably "fake" using traditional rasterized methods. I think that dream of fully path-traced games with no compromises is still a couple generations away, but that's obviously where we're headed.
No.
Ray-tracing is the oldest, the most straight-forward and the most "stupid" technology in computer graphics.

There are a lot of much more modern and better looking methods, that don't take years to render:
- Path tracing
- Metropolis light transport
- Photon mapping
- GIBS
Etc. etc.
You're either being needlessly semantic or misunderstanding these concepts. Path tracing is a method of ray tracing. Photon mapping is a method of light mapping that works in concert with traditional ray tracing. The question here is not whether the future of rendering is POV-Ray algorithms from the 90s; it's about the shift away from the sorts of rasterized rendering methods we've been using.
 
Last edited:
Path tracing is done via ray tracing, they are not separate concepts.
is at its core RT
Path-tracing is a method of ray-tracing. Photon mapping is a method of light mapping that works in concert with traditional ray-tracing.

And AI pathfinding is also "ray tracing" in some implementations, not to mention that clicking your mouse pointer on units in a 3D RTS traces a ray to the unit too.
So essentially "everything is ray tracing" then.

When the term "ray tracing" is used in a CG context, it means the Turner Whitted ray-tracing algorithm.
 

SF Kosmo

Al Jazeera Special Reporter
And AI path-finding is also a "ray tracing" in some implementations, not to mention clicking your mouse pointer on units in 3D RTS traces a ray to the unit too.
So essentially "everything is ray tracing" then.

When the term "ray tracing" is used in CG context it's about Turner Whitted ray-tracing algorithm.
This isn't a "CG context," it's the context of real-time video game graphics, a rapidly advancing field involving a mix of techniques for simulating light transport in a more robust way, none of which are the Turner Whitted ray-tracing algorithm.

Instead of trying to engage in the conversation everyone is having you're sitting here pretending we're all talking about something that you know we aren't. What purpose does that serve? Just talk about the topic and use whatever preferred nomenclature you like if you don't feel "ray-tracing" is descriptive enough.
 
Last edited:
Light simulation has always been what makes games look next gen to me. I don't care about refresh rates superior to 60hz and any resolution higher than 1080p so I hope that when I change my GTX 1070, games will allow me to trade all of these things against complex geometry and good light simulation but not limited to just reflections.
100% agree. not sure why the RT focus is always on shitty reflections. it's super pixelated and pointless. why don't they implement RT lighting in the games instead? or am i not understanding things properly? tbh on series x, gotham knights lighting (especially indoors) is FANTASTIC. i don't think it's RT, but it definitely adds to immersion. there have been numerous times i've just walked around in rooms and looked around because it looks so good. and the explosions and lighting that comes off of them are awesome. definitely want more games like that!
 

SF Kosmo

Al Jazeera Special Reporter
100% agree. not sure why the RT focus is always on shitty reflections. it's super pixelated and pointless. why don't they implement RT lighting in the games instead? or am i not understanding things properly? tbh on series x, gotham knights lighting (especially indoors) is FANTASTIC. i don't think it's RT, but it definitely adds to immersion. there have been numerous times i've just walked around in rooms and looked around because it looks so good. and the explosions and lighting that comes off of them are awesome. definitely want more games like that!
RT reflections aren't pixelated by nature, but current consoles can't do it very well so they cut corners.

Full path tracing simulates all these effects at once. I think that's the dream but as of now we can only do that in relatively simple games like Minecraft or Quake II or the Marbles demo. We're getting there, and new hardware and new algorithms are going to bring that dream closer and closer. There's some new Intel denoising tech that looks like it will improve/speed up real time path tracing tremendously. We will get there.

For now, yeah, some of the use of RT hardware is gimmicky or not worth it, and it's up to devs to decide how best to make these methods work for their games. But eventually RT will be ubiquitous in game graphics.
 
I think ray tracing is an interesting feature that could help make games look better while reducing the workload for visual artists, but it's being pushed way too hard by hardware producers as the next big thing while the actual tech simply isn't there yet to support it efficiently. I would much rather see this processing power used for things that can actually look impressive, like more advanced physics or high-resolution, high-framerate performance.

This is pretty much spot on.

RT pays BIG when we hit overwhelmingly obvious diminishing returns on the multiple, increasingly complex, pre-computed lighting hacks used to achieve the same effect as RT. At that point, fully raytraced lighting becomes a no brainer, since the performance cost of RT versus the multitude of precomputed lighting approximations approach equality.

This is not where we are now, however: the baseline hardware target that directs the majority of games development, i.e. consoles, simply isn't powerful enough to support fully RT lighting, and the various precomputed lighting approximations that serve specific VFX features, e.g. AO, are currently good enough, and so much less performance-intensive than a ray-traced alternative, that using RT for those applications isn't worth it, for the most part.

There are still edge cases today, like with reflections, where RT with denoising at a reasonable sample rate will be only marginally more performance intensive and/or give significantly better results than traditional methods like using cube maps.

For many games, however, the trade-offs required to enable limited RT effects in games on current-gen consoles are just too steep to justify the cost.
 

benno

Member
So essentially "everything is ray tracing" then.
yeah. If you plot each light bounce from a light source as it bounces around objects to eventually reach your eye, or plot from your eye's view back to the light sources, it's still basically ray tracing. One is more accurate and takes more resources; the other is less accurate and takes less.
Whatever you want to call it, all the end results are trying to achieve the same thing: a realistically lit scene, simulated by how light reflects and casts shadows. I'm not even sure what you're having an issue with?
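That eye-to-light plotting can be shown concretely. Below is a minimal sketch with hypothetical scene values (one sphere, one point light): a primary ray is traced from the eye, and from the hit point a single ray is plotted back toward the light for simple diffuse shading.

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))

def hit_sphere(origin, direction, center, radius):
    """Distance t along a (normalized) ray to the sphere, or None."""
    oc = sub(origin, center)
    b = 2.0 * dot(oc, direction)
    c = dot(oc, oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2.0
    return t if t > 0 else None

def shade(eye, pixel_dir, sphere_c, sphere_r, light_pos):
    """Eye ray -> surface hit -> ray toward the light (one bounce)."""
    t = hit_sphere(eye, pixel_dir, sphere_c, sphere_r)
    if t is None:
        return 0.0  # ray escaped: background
    hit = tuple(e + t * d for e, d in zip(eye, pixel_dir))
    normal = normalize(sub(hit, sphere_c))
    to_light = normalize(sub(light_pos, hit))
    return max(0.0, dot(normal, to_light))  # Lambertian term

# Hypothetical scene: sphere 5 units ahead, light up and to the right.
eye = (0.0, 0.0, 0.0)
sphere_c, sphere_r = (0.0, 0.0, -5.0), 1.0
light = (5.0, 5.0, 0.0)

center_ray = (0.0, 0.0, -1.0)            # points at the sphere
miss_ray = normalize((0.9, 0.9, -1.0))   # points past it

print(shade(eye, center_ray, sphere_c, sphere_r, light))  # > 0: lit surface
print(shade(eye, miss_ray, sphere_c, sphere_r, light))    # 0.0: background
```

Forward (light-to-eye) tracing is the same geometry run in the other direction; it is more physically faithful but wastes most rays, which is why renderers trace from the eye.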
 
Last edited:
when the differences are the number of light bounces?
The difference being that true global illumination is an unachievable goal (hence the "holy grail").
Either the scene becomes too large or the number of bounces to calculate becomes unbearable.
And being highly non-linear, both can lead to unpredictable performance.
 

Griffon

Member
Ray tracing will be very useful once it's sufficiently powerful and common in every computer/console.

But for now? Don't think about it, games have to be compatible with weaker hardware for the time being.
 

Moochi

Member
Played Cyberpunk with it on a 3090. It's alright. At minimum, for eye strain, I have to have 60 fps. The lighting gives some scenes a little more realism; the water on the road is the most noticeable. I eventually turned it off. 144 fps is more pleasant to look at and control.
 

benno

Member
The difference being that true global illumination is an unachievable goal (hence the "holy grail").
Either the scene becomes too large or the number of bounces to calculate becomes unbearable.
And being highly non-linear, both can lead to unpredictable performance.
What is your point? People not using your favourite terms when they mean the same thing?
 
Who is talking about RTX? Not me, I'm talking about ray tracing in general. That's not an Nvidia marketing term.

Path tracing is done via ray tracing; they are not separate concepts. It's all about simulating the paths of rays of light, just at different levels of complexity.
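The "different levels of complexity" is mostly about integration: Whitted tracing follows one deterministic ray per event, while path tracing averages many random ones. A toy sketch of that Monte Carlo step, estimating the irradiance integral over a hemisphere under uniform unit lighting (whose exact value is pi):

```python
import math
import random

random.seed(0)

def sample_hemisphere():
    """Uniform random direction on the unit hemisphere around +z."""
    z = random.random()                 # cos(theta), uniform in [0, 1)
    phi = 2 * math.pi * random.random()
    r = math.sqrt(max(0.0, 1 - z * z))
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_irradiance(n_rays):
    # Monte Carlo: average integrand / pdf over random directions.
    # Uniform hemisphere pdf = 1/(2*pi); integrand = cos(theta) = z.
    total = sum(sample_hemisphere()[2] for _ in range(n_rays))
    return (total / n_rays) * 2 * math.pi

for n in (8, 128, 8192):
    print(f"{n:5d} rays -> {estimate_irradiance(n):.3f} "
          f"(exact: {math.pi:.3f})")
```

With few rays the estimate is noisy, which is exactly the noise that denoisers clean up in real-time path tracing; with many rays it converges to pi.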
I think the whole RT debate is not really applicable to a lot of games anyway.

The average FPS really doesn't need to have something like ray tracing when most people are racing around a map focused on what's behind the next door or behind the wall nearby.

In slower, more story focused games, it definitely has its place - as a way to aid immersion etc...

Like we've all said, though, the performance hit is just too severe at the moment and it'll probably be at least another 5 - 7 years before we have a new console generation or we see powerful enough GPUs at cheap enough prices to make RT ubiquitous.

Just my personal opinion.
 

Zarkusim

Neo Member
How anyone can look at these videos and still believe that ray tracing is a gimmick is baffling
Probably because, as soon as the first RTX cards were available, casuals expected to just toggle a ray tracing switch and have graphics look like that. Like Jensen saying "It just works." I can't really blame them, though, with all the marketing and hype and proclamations about it being the holy grail of computer graphics.
 
What is your point? people not using your favourite terms when they mean the same thing?
People intentionally or unintentionally mixing what NV introduced as RTX with real GI implementations.
The same video from NV, posted how many times here?
I.e.: RTX == RT == GI in the majority of posts here.
 

RoadHazard

Gold Member
I think the whole RT debate is not really applicable to a lot of games anyway.

The average FPS really doesn't need to have something like ray tracing when most people are racing around a map focused on what's behind the next door or behind the wall nearby.

In slower, more story focused games, it definitely has its place - as a way to aid immersion etc...

Like we've all said, though, the performance hit is just too severe at the moment and it'll probably be at least another 5 - 7 years before we have a new console generation or we see powerful enough GPUs at cheap enough prices to make RT ubiquitous.

Just my personal opinion.

It will be highly useful in any game that wants to achieve realistic lighting, doesn't really matter what kind of game it is. It also means less work hand-tuning and baking lighting, and faster iteration during development, because everything will be real-time. But that's once hardware powerful enough for full RT lighting is ubiquitous enough for fallbacks to no longer be required, and yeah, we're not quite there yet (although I believe the updated version of Metro works that way, in that it requires RT for GI and has no fallback for systems that can't handle it).
 

SlimySnake

Flashless at the Golden Globes
I am conflicted on it. On one hand, I don't like seeing my performance literally get halved when turning on ray-traced lighting. Reflections are a 30-50% hit, and that's OK if used correctly. But a 100% performance hit? Fuck that.

On the other hand, it saves dev time and leads to more accurate results.

I think devs should be using software GI with hardware ray-traced reflections instead of going full RT shadows, GI, and reflections. Software GI will save time and look almost as good. Just see how good the Matrix demo looks even with hardware-accelerated lighting disabled.

Unreal Engine 4 demos like Rebirth and Australia look far more realistic than anything on consoles, and they don't use RT. Cyberpunk and Metro use RT lighting and they don't look as good, do they? So is it necessary? Not in all instances. I think open worlds with large urban cities and a lot of reflections need RT, but open-world games set in the wild? No real need for RT reflections. RT GI won't do anything software GI won't.

At the end of the day, do you need ray tracing when you can push these visuals at native 4K 30 fps in real time on the PS5, while rendering millions of hairs? They are using GI, but I don't believe it's RT GI. You can read the details below.

[gif: lion cub tussle]


 

Gambit2483

Member
I've never seen a single game where I felt it truly changed the game (so to speak) and was needed.

It's nice to have but not at the cost of performance.
 
RT reflections aren't pixelated by nature, but current consoles can't do it very well so they cut corners.

Full path tracing simulates all these effects at once. I think that's the dream but as of now we can only do that in relatively simple games like Minecraft or Quake II or the Marbles demo. We're getting there, and new hardware and new algorithms are going to bring that dream closer and closer. There's some new Intel denoising tech that looks like it will improve/speed up real time path tracing tremendously. We will get there.

For now, yeah, some of the use of RT hardware is gimmicky or not worth it, and it's up to devs to decide how best to make these methods work for their games. But eventually RT will be ubiquitous in game graphics.
for sure. and i get that we have to dip a toe in the water first. nothing is ever 100% polished out the door! and i should have specified that yes, on consoles, due to the power constraints, the reflections are pixelated. i know that's not the intended result. can't wait for the power to catch up though. having lighting that affects the character and world realistically is much greater than textures in my opinion. i mean shit, once we catch up to the compute power needed, you could go back and just implement RT lighting in any game ever made, and it would look amazing. games like Assassin's Creed come to mind. but hell, even Super Mario with RT lighting would be legit!
 

gatti-man

Member
I don't think it's a gimmick, just kinda impractical at this point. It's just far too taxing on the hardware when striving for high framerates.

I figure that sometime in the future some company will have a eureka moment and come up with a method/technique (whatever) that will simplify its implementation and allow it to be used without it being so demanding on the hardware. I mean, right now, it feels like a brute forced way of doing it when just turning it on slams your framerates so nastily.
Right. People confuse gimmick with an expensive feature that is the future. If you want the future right now you’re going to pay for cutting edge hardware but RT definitely makes a huge to subtle difference depending on the game. Well implemented RT makes games have a 3D effect that can’t be duplicated with anything else. It’s incredible.
 

jaysius

Banned
Developers can’t just make rtx lighting standard in games, they need to provide alternatives and always will. This shows that it’s a gimmick, it doesn’t fix or replace anything.

Nvidia had a real problem: GPU power was plateauing, so they had to create a new problem for their cards to "solve." Now they've convinced people this is the carrot you need.

The GPU market is in extreme stagnation right now.
 
It will be highly useful in any game that wants to achieve realistic lighting, doesn't really matter what kind of game it is. It also means less work hand-tuning and baking lighting, and faster iteration during development, because everything will be real-time. But that's once hardware powerful enough for full RT lighting is ubiquitous enough for fallbacks to no longer be required, and yeah, we're not quite there yet (although I believe the updated version of Metro works that way, in that it requires RT for GI and has no fallback for systems that can't handle it).
Oh, I agree, any game that wants the realistic lighting will benefit, absolutely.

I was referring more to how much players will actually care, or even notice, in certain games.

If they really aren't bothered then they will simply leave it disabled - at least until it can be turned on with almost no performance hit (if not natively then via DLSS).
 
Last edited:

Bankai

Member
Of course it is good, what a silly question. The cost is often too high for GPU's and consoles of today, but that doesn't mean it can be "bad". Good RT is always better than the alternative.
 

Gambit2483

Member
Of course it is good, what a silly question. The cost is often too high for GPU's and consoles of today, but that doesn't mean it can be "bad". Good RT is always better than the alternative.
Agreed, the thread title and poll question are completely unrelated. Not sure what the OP was thinking
 

jaysius

Banned
Of course it is good, what a silly question. The cost is often too high for GPU's and consoles of today, but that doesn't mean it can be "bad". Good RT is always better than the alternative.
You've answered the question yourself: its costs are too high to be useful for the average case. It's not a feasible answer; this is blinders thinking, just eating whatever Nvidia feeds you. There are possibly other solutions, but now everyone is focused on trying to "fix" the one Nvidia has presented instead of trying to innovate and come up with a more feasible alternative.

Nvidia LOVES RTX because it’s hard to do, it gives people a reason to keep needlessly consuming their overpriced shit.
 
Last edited:

Excoman

Banned
I think anything that advances graphics and brings them closer to the way real-world physics works is not a gimmick.

Just imagine the possibilities in horror, and stealth games if AI starts reacting to reflections as well.
 
Not a gimmick. When used right, objects in the environment actually have weight and presence. It looks like they're actually there in the space.
 

gatti-man

Member
Developers can’t just make rtx lighting standard in games, they need to provide alternatives and always will. This shows that it’s a gimmick, it doesn’t fix or replace anything.

Nvidia had a real problem: GPU power was plateauing, so they had to create a new problem for their cards to "solve." Now they've convinced people this is the carrot you need.

The GPU market is in extreme stagnation right now.
They can't make it standard because it's so GPU-intensive. That's like saying ultrawide is a gimmick because it's not standard, or AA is a gimmick, or shadow detail, or texture detail. Hell, nothing in PC gaming is standard. Even on consoles now, things aren't standard. Is FOV a gimmick?
 

ACESHIGH

Banned
As with most advancements here, they are made with DEVS in mind. This is made to save dev time down the line by implementing lighting in games instead of baking it.

Devs don't care if you have to spend way more to run games at the same framerate you did previously, thanks to ray tracing. They just want to spend fewer hours developing games. This is also why they want the most powerful consoles they can get, so that they don't spend as much time optimizing (see the Xbox Series S whining).
 

Puscifer

Member
It's just the next evolution in graphics; it's taking a few GPU generations before it becomes a mainstay, like most graphical tech. Things like SSR and AO had performance impacts back in the day when they were new tech.
For me, I just don't understand why something that was made so lighting artists could have an easier time doing their job was sold to gamers the way it has been. I'm not saying it's totally a gimmick, but it's one of those things that's only appreciated in certain moments. When I'm playing Devil May Cry I just don't have time to appreciate what it's doing.
 
You've answered the question yourself: its costs are too high to be useful for the average case. It's not a feasible answer; this is blinders thinking, just eating whatever Nvidia feeds you. There are possibly other solutions, but now everyone is focused on trying to "fix" the one Nvidia has presented instead of trying to innovate and come up with a more feasible alternative.
It will be better in the future, once developers start digging into it more.
Right now essentially all the "real-time RT" implementations are vendor-sourced.
 