
Rendering Engineer at Epic Games: DirectX Raytracing and Vulkan OptiX hold everything back in PC land

rnlval

Member
Isn't it strange, when PC RT is so new and with so few hardware compatibility concerns, why the need for a thicc DXR that prevents low-level access?

Is it that AMD's ray accelerator is teh suxxor?
For PC GCN's low-level access


AMD could create low-level access to RDNA 2's Intrinsic Functions for the PC.
 

llien

Member
AMD could create low-level access to RDNA 2's Intrinsic Functions for the PC.

Having to write shaders for one GPU and then different shaders for another GPU is how Nvidia killed OpenGL.
DirectX just had unified shaders.

RT, if it makes sense to have (which I am not convinced about), just needs a better API.

Remember that "Mantle" thing?

[Image: api-overhead-g3258-290x.jpg]


 
Last edited:

rnlval

Member
Having to write shaders for one GPU and then different shaders for another GPU is how Nvidia killed OpenGL.
DirectX just had unified shaders.

RT, if it makes sense to have (which I am not convinced about), just needs a better API.

Remember that "Mantle" thing?

[Image: api-overhead-g3258-290x.jpg]


Mantle doesn't support GCN's Shader Intrinsic Functions.

Mantle uses MS HLSL.




Doom 2016 was the 1st game that used GCN's Shader Intrinsic Functions.

AMD should design a new RT API extension (RDNA 2) based on PS5's RT access.
 
Last edited:

llien

Member
Mantle doesn't support GCN's Shader Intrinsic Functions.

Mantle uses MS HLSL.

Why is this relevant? Did you miss the point? Here is how Mantle ("a low-level API") was introduced: 9 times (!!!) more draw calls than any other API:

[Image: MantleBenefits_575px.jpg]


Here is how it ended:

[Image: api-overhead-g3258-290x.jpg]



The point: there does not have to be huge API overhead if the abstraction layer is right.
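To make the "overhead" concrete: the big thing Mantle introduced, and Vulkan/D3D12 kept, is that draw calls are recorded into command buffers from any number of threads and handed to the GPU in one submission - which is exactly what the 3DMark API overhead test is hammering. A minimal Vulkan sketch of that model (the pipeline, per-thread command pools and the render pass described by 'inherit' are assumed to be set up elsewhere; this is an illustration, not a complete renderer):

#include <vulkan/vulkan.h>
#include <cstdint>
#include <thread>
#include <vector>

// Record one slice of the scene's draw calls into a secondary command buffer.
// NOTE: each secondary buffer must come from its own per-thread command pool,
// and the render pass / framebuffer are assumed to be described by 'inherit'.
void recordSlice(VkCommandBuffer cmd, VkPipeline pipeline,
                 const VkCommandBufferInheritanceInfo* inherit, uint32_t drawCount)
{
    VkCommandBufferBeginInfo begin{VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO};
    begin.flags = VK_COMMAND_BUFFER_USAGE_RENDER_PASS_CONTINUE_BIT;
    begin.pInheritanceInfo = inherit;
    vkBeginCommandBuffer(cmd, &begin);
    vkCmdBindPipeline(cmd, VK_PIPELINE_BIND_POINT_GRAPHICS, pipeline);
    for (uint32_t i = 0; i < drawCount; ++i)
        vkCmdDraw(cmd, 3, 1, 0, 0);   // per-call CPU cost is tiny: no driver lock, no state mirroring
    vkEndCommandBuffer(cmd);
}

// Fan the recording out over worker threads, then splice the slices into the
// primary command buffer; the queue sees one submission regardless of draw count.
void recordFrame(VkCommandBuffer primary, VkPipeline pipeline,
                 const VkCommandBufferInheritanceInfo* inherit,
                 const std::vector<VkCommandBuffer>& secondaries, uint32_t drawsPerSlice)
{
    std::vector<std::thread> workers;
    for (VkCommandBuffer cmd : secondaries)
        workers.emplace_back(recordSlice, cmd, pipeline, inherit, drawsPerSlice);
    for (std::thread& t : workers)
        t.join();
    vkCmdExecuteCommands(primary, static_cast<uint32_t>(secondaries.size()), secondaries.data());
}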
 
Last edited:

rnlval

Member
Why is this relevant? Did you miss the point? Here is how Mantle ("a low-level API") was introduced: 9 times (!!!) more draw calls than any other API:
[Image: MantleBenefits_575px.jpg]


Here is how it ended:

[Image: api-overhead-g3258-290x.jpg]



The point: there does not have to be huge API overhead if the abstraction layer is right.
Apparently, Mantle is not low-level enough, since it's missing access to GCN's Shader Intrinsic Functions.
 

rnlval

Member
I should stop arguing with people with limited mental capacity... :messenger_face_screaming:


[Image: TS8Iik5.png]


Oxide knows both the Mantle and Vulkan APIs. GCN Shader Intrinsic Functions hit-the-metal access is something missing from Mantle!

The real person who has limited mental capacity is you.

Doom 2016 was the 1st game on PC with GCN Shader Intrinsic Functions hit-the-metal access.

AMD needs to update "Intrinsic Functions" hit-the-metal access for the RDNA 2 era.

GCN Shader Intrinsic Functions are lower level when compared to Mantle, standard Vulkan, and DirectX12 APIs.
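For context on what that access actually looks like from the application side: on Vulkan the intrinsics path surfaces as AMD device extensions that a renderer can probe for before compiling its specialised shader variants. A rough C++ sketch - the extension names are real Vulkan AMD extensions, but treating them as "the Doom 2016 set" is an assumption:

#include <vulkan/vulkan.h>
#include <cstring>
#include <vector>

// Probe a physical device for the AMD vendor extensions that expose
// GCN-style shader intrinsics; only compile the specialised shader
// variants when they are all present.
bool hasAmdIntrinsicExtensions(VkPhysicalDevice gpu)
{
    uint32_t count = 0;
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, nullptr);
    std::vector<VkExtensionProperties> exts(count);
    vkEnumerateDeviceExtensionProperties(gpu, nullptr, &count, exts.data());

    const char* wanted[] = {
        "VK_AMD_gcn_shader",             // GCN-specific ops (cube map helpers, shader clock)
        "VK_AMD_shader_ballot",          // wave/ballot ops beyond core Vulkan
        "VK_AMD_shader_trinary_minmax",  // min3/max3/med3 single-instruction paths
    };
    for (const char* name : wanted) {
        bool found = false;
        for (const VkExtensionProperties& e : exts)
            if (std::strcmp(e.extensionName, name) == 0) { found = true; break; }
        if (!found)
            return false;
    }
    return true;
}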
 
Last edited:

Trimesh

Banned
They designed a system that runs without hardware RT on a wide range of hardware configs.
The only people whining in this thread are DXR proponents, who seem to struggle with the thought that DXR ain't at all needed to have good looking games.

Well, personally I have severe doubts that RT is needed to have good-looking games at all, no matter how it's implemented - I'm far from a "DXR proponent". My point is simply that if you have an abstraction that doesn't map well to an API it needs to work with, and you are the people who designed the abstraction, then it's a bit of a stretch to try and blame it entirely on the API.
 
I don't think a couple of posters speculating over marketing deals equates to PC master race culture. I honestly find it much more childish, with the RGB on their RAM and keyboard, the "racing" chair and the constant need for validation that their rig goes vroom vroom.


PC gamers are possibly the biggest, most try-hard retards on the planet. The kind of people that pay a catastrophic premium for features that often get abandoned by the vendor a few years later. Only to line up and do it all over again.

You can fucking sell any old shite to a PC gamer. As a consumer/whale they get farmed repeatedly by vendors, each one taking a turn to get their dicks out, so regularly that it's practically become clockwork at this point.
Whether it's RAM/NAND/HDD factories mysteriously flooding in the baking hot sun, thus driving up prices 300 percent.
CPU/GPU manufacturers randomly increasing prices by hundreds of dollars, or re-releasing the exact same product and calling it a refresh.
Or motherboard nonsense blocking RAM frequencies, and 'unlocked' CPUs where whether you can overclock the shit you PAID for is determined purely by some bullshit hidden software you can't touch.

I say this as someone who games 99 percent of the time on his computer, and has been doing so for the past 23 years. So I've seen every possible shenanigan.
 
Last edited:

Lethal01

Member
Well, personally I have severe doubts that RT is needed to have good-looking games at all, no matter how it's implemented - I'm far from a "DXR proponent".

Of course it's not; Chrono Trigger is still very pretty, and we've been making pretty games for decades. Ray tracing is just necessary if you want to make dynamic games that actually have lighting that looks real. Sadly that's just not possible without it, but not every game needs to look realistic. Sometimes that limitation is what makes games look better, when devs find ways to deal with it.

I'd rather devs find their style by choice rather than because they can't make what they want, though.
 
Well, personally I have severe doubts that RT is needed to have good-looking games at all, no matter how it's implemented - I'm far from a "DXR proponent".
No specific feature is absolutely necessary to have "good-looking games". I can get "I don't enable RT because on my current hardware it's too demanding" - like other high-end graphics effects it's very demanding, RT is just way more demanding... But give it a bit of time; the hardware will catch up, and devs will learn how and when to use it effectively.
 
PC gamers are possibly the biggest, most try-hard retards on the planet. The kind of people that pay a catastrophic premium for features that often get abandoned by the vendor a few years later. Only to line up and do it all over again.

You can fucking sell any old shite to a PC gamer. As a consumer/whale they get farmed repeatedly by vendors, each one taking a turn to get their dicks out, so regularly that it's practically become clockwork at this point.
Whether it's RAM/NAND/HDD factories mysteriously flooding in the baking hot sun, thus driving up prices 300 percent.
CPU/GPU manufacturers randomly increasing prices by hundreds of dollars, or re-releasing the exact same product and calling it a refresh.
Or motherboard nonsense blocking RAM frequencies, and 'unlocked' CPUs where whether you can overclock the shit you PAID for is determined purely by some bullshit hidden software you can't touch.

I say this as someone who games 99 percent of the time on his computer, and has been doing so for the past 23 years. So I've seen every possible shenanigan.
Would you rather have a stable 144fps or drop below 30fps? I don't get your statement. Yeah, prices soar, that's nothing new. But you gotta pay to play. You either have the money to play games at their fullest, sit somewhere in between, or you are stuck with console sacrifices.
 

assurdum

Banned
I have a video myself of WD:L that I'm going to upload in 4K on YouTube soon.

From the DF video that I just looked at, if you guys want to claim MM is better because it has infinite draw distance for objects in reflections, no matter how bad those reflections look, then I'll concede that I can see you feeling MM is the best use of RT reflections. That's a purely subjective opinion, but one I will accept.

But to simply say the reflections are BETTER in quality than WD:L's is just lying, as they are clearly not of better quality.
Reflections are definitely better in Spiderman than in WD, on console at least. Again, can you post a screenshot of the console version of WD that provides better reflections than Spiderman? I beg you. I keep going back to the DF videos, checking the reflections and every detail here and there, and I have no clue where this superiority shows up; sometimes it's almost embarrassing how much more effective raytracing is in Spiderman by comparison.
 
Last edited:
You can get up to 2x better performance by optimizing for a particular spec; many devs have said so.
So if a game for PS5 is coded to the metal it could match the performance of a 20 TFLOPS PC card, and on top of that there's a lot of custom ASIC hardware in the PS5 which would give even more headroom, not necessarily in graphics.

On PS5 only. You won't get 2x performance on xsex because it's using a PC API.

We will see PS5's true capability in Sony's first-party games as usual.
 

rnlval

Member
That will never happen on the PC platform, so no need to say it. PC is agnostic and has to have a general API in order to support multiple configurations. That doesn't matter, since a PC will always run any console game better than the console can.

You have forgotten



According to AMD, PC's Doom 2016 was the 1st game to use GCN's Shader Intrinsic Functions.




[Image: hUh2Imd.jpg]


id Software is not Epic Games. Microsoft owns id Software via ZeniMax Media.
 
Last edited:

rnlval

Member
On PS5 only. You won't get 2x performance on xsex because it's using a PC API.

We will see PS5's true capability in Sony's first-party games as usual.
FYI, the Xbox Series X also has direct RDNA 2 Intrinsic Functions access besides DXR Tier 1.1.

For the PC's DirectX 12 Ultimate era, Microsoft needs to expand DirectML's Meta Commands, which currently give access to hardware ML Intrinsic Functions.
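Both of those are already queryable on the D3D12 side; a short sketch using standard D3D12 calls (nothing vendor-specific assumed):

#include <windows.h>
#include <d3d12.h>

// Query whether a device reports DXR Tier 1.1 (inline ray tracing, etc.).
bool SupportsDxrTier11(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS5 opts5 = {};
    if (FAILED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS5, &opts5, sizeof(opts5))))
        return false;
    return opts5.RaytracingTier >= D3D12_RAYTRACING_TIER_1_1;
}

// Meta commands are the D3D12 mechanism for vendor-tuned "hit the metal" operations
// (DirectML's accelerated ML ops ride on them). This just asks the driver how many it exposes.
UINT CountMetaCommands(ID3D12Device5* device)
{
    UINT count = 0;
    device->EnumerateMetaCommands(&count, nullptr);   // first call returns only the count
    return count;
}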
 
Last edited:

VFXVeteran

Banned
Honestly, I don't care what you or Dictator think you see in Spiderman's raytracing because "you know how such things work" - you are not a sophisticated AI machine that detects graphics tech with scan vision, and just saying WD Legion uses far superior raytraced reflections compared to Spiderman proves it. I don't know where the hell this absurd conviction is coming from, because there isn't any evidence on screen; quite the contrary.

Yes, there is. Look at the detail of what's being presented. Then look at what they say the resolution of the rendered shot is. A higher resolution render of a scene is ALWAYS going to look better than a lower resolution render. They cull out more things, like transparencies (i.e. leaves in trees). They don't have reflections within reflections (i.e. that puddle of water disappears in MM whereas it's still shiny in WD:L).

And the funny thing is it only takes a simple capture from the same DF videos to destroy that assessment. Heck, raytracing in WD is extremely limited in LOD and abuses a mix of cubemaps/SSR; in what absurd universe can that be superior to something which appears more raytraced?

WD:L took the stance of culling the number of objects that get reflected because of how much more detailed its reflections are. That's a tradeoff. If you like the fact that you can "see" everything reflected, hats off to you. I cannot argue with that. But to say
WD Legion DOES NOT use a far superior raytracing reflection compared Spiderman
is a lie. It OBJECTIVELY does.
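The "reflections within reflections" difference is really just how deep the reflection rays are allowed to recurse. A toy C++ sketch - the scene and shading helpers are hypothetical stand-ins, not anyone's engine code:

struct Vec3 { float x, y, z; };
struct Ray  { Vec3 origin, dir; };
struct Hit  { bool valid; bool mirrorLike; Vec3 color; };

// Hypothetical stand-ins for the real renderer's BVH traversal and shading.
Hit intersectScene(const Ray&)              { return {true, true, {0.2f, 0.2f, 0.2f}}; }
Ray makeBounceRay(const Ray& r, const Hit&) { return r; }

// maxDepth = 1: a surface seen inside a reflection is shaded without its own reflection
// (the puddle's shine disappears inside the mirror image). maxDepth = 2 buys the
// "reflection within a reflection", at the cost of another ray per mirror-like pixel.
Vec3 traceReflections(const Ray& r, int depth, int maxDepth)
{
    Hit h = intersectScene(r);
    if (!h.valid)
        return {0.0f, 0.0f, 0.0f};                              // environment / sky fallback
    Vec3 c = h.color;
    if (h.mirrorLike && depth < maxDepth) {
        Vec3 b = traceReflections(makeBounceRay(r, h), depth + 1, maxDepth);
        c.x += 0.5f * b.x; c.y += 0.5f * b.y; c.z += 0.5f * b.z; // crude blend, no Fresnel
    }
    return c;
}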
 
Last edited:
Textures, Framerates, Resolutions, Physics etc.
These are tradeoffs and are game-specific choices to be made (except for physics... to a degree). On PC, games ship with video configuration screens so you decide; on consoles, games now often have a performance and a quality mode.

What are you talking about?
 

Alphagear

Member
These are tradeoffs and are game-specific choices to be made (except for physics... to a degree). On PC, games ship with video configuration screens so you decide; on consoles, games now often have a performance and a quality mode.

What are you talking about?

Like I said earlier, Demon's Souls looks better than most RT-enabled games.

A game with NO ray tracing. Neither mode has RT.

GPU resources are better spent elsewhere.

RT is simply a waste of GPU resources especially for consoles.
 

VFXVeteran

Banned
Like I said earlier, Demon's Souls looks better than most RT-enabled games.

Just because you think so doesn't make it objectively true.

Valhalla on PC looks way better than Demon's Souls by every metric; it's not RT-enabled either, so it's a fair comparison.

Crysis Remake looks better than DeS. And Cyberpunk looks better than DeS too - although both are FPS games.

GPU resources are better spent elsewhere.

Yea, like getting reasonable texture resolution sizes, which the PS4 was lacking for its entire 7-year generation. The PS5 is finally catching up to the PC in this regard.

RT is simply a waste of GPU resources especially for consoles.

Perhaps for consoles. Not for the PC.
 
Last edited:

Alphagear

Member
Just because you think so doesn't make it objectively true.

Valhalla on PC looks way better than Demon's Souls by every metric; it's not RT-enabled either, so it's a fair comparison.

Crysis Remake looks better than DeS. And Cyberpunk looks better than DeS too - although both are FPS games.



Yea, like getting reasonable texture resolution sizes, which the PS4 was lacking for its entire 7-year generation. The PS5 is finally catching up to the PC in this regard.



Perhaps for consoles. Not for the PC.

Never said my word was gospel. I just said DeS looks better than MOST games with RT enabled.

You can only mention 3 and even that's debatable.

That was my point. RT isn't all it's made out to be considering the GPU resources it uses.

Definitely a waste on consoles this gen. Even on PC it won't be standard just yet.
 

VFXVeteran

Banned
Never said my word was gospel. I just said DeS looks better than MOST games with RT enabled.

You can only mention 3 and even that's debatable.

That was my point. RT isn't all it's made out to be considering the GPU resources it uses.

Definitely a waste on consoles this gen. Even on PC it won't be standard just yet.

But there are only a few AAA RT games from the get-go. RT hasn't even had a generation to be iterated on.

Control
Metro: Exodus
Crysis Remake
Cyberpunk
WD:Legion
COD
Fortnite
Tomb Raider
BFV

We haven't even begun to see the new games coming out in 2021 that will use RTX.
 
Last edited:

S0ULZB0URNE

Member
Just because you think so doesn't make it objectively true.

Valhalla on PC looks way better than Demon's Souls by every metric; it's not RT-enabled either, so it's a fair comparison.

Crysis Remake looks better than DeS. And Cyberpunk looks better than DeS too - although both are FPS games.



Yea, like getting reasonable texture resolution sizes, which the PS4 was lacking for its entire 7-year generation. The PS5 is finally catching up to the PC in this regard.



Perhaps for consoles. Not for the PC.
Demon's Souls looks better than anything on PC.
PC will be behind this gen, as devs won't program for the PC with an SSD as a minimum requirement.
 

vkbest

Member

Guilty_AI

Member
Just because you think so doesn't make it objectively true.

Valhalla on PC looks way better than Demon's Souls by every metric; it's not RT-enabled either, so it's a fair comparison.

Crysis Remake looks better than DeS. And Cyberpunk looks better than DeS too - although both are FPS games.
Demon's Souls looks better than anything on PC.
PC will be behind this gen, as devs won't program for the PC with an SSD as a minimum requirement.

You're both wrong. Little Witch Nobeta is the clear graphical benchmark of this gen.

[Image: Little-Witch-Nobeta-850x478.jpg]


Who needs RT reflections when her smile reflects the light of a thousand suns?
 
Demon's Souls looks better than anything on PC.
PC will be behind this gen, as devs won't program for the PC with an SSD as a minimum requirement.
Lmfaaaaao, please pass that shit over here, whatever you're smoking! PC will be behind consoles?! Where in history has that ever happened? How will the devices that MAKE the games for consoles be behind consoles? That's some deranged/delusional mentality right there.
 

Dampf

Member
You didn't understand the point of my post. You can't optimise a black box; you have no access to what it does, you just throw it inputs and receive an output, and that's it - that's a black box. That's the point: how can something be "poorly optimised" if it literally can't be optimised? His/her statement was nonsensical, as is yours.
Uhm... Instead of rendering reflections at full screen resolution, they could be rendered at half res combined with SSR for the distance - like that? It's pretty simple.
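That hybrid is essentially what shipping implementations already describe. A toy C++ sketch of the per-pixel decision - every helper name here is made up for illustration, and the real thing lives in a shader, not on the CPU:

#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

// Hypothetical stand-ins for the real code paths.
Vec3  traceReflectionRay(uint32_t, uint32_t)    { return {1.0f, 0.0f, 0.0f}; }  // hardware RT, expensive
Vec3  screenSpaceReflection(uint32_t, uint32_t) { return {0.0f, 1.0f, 0.0f}; }  // reuse of the current frame, cheap
float viewDepth(uint32_t x, uint32_t)           { return float(x); }            // linear depth of the shaded pixel

// One reflection sample per 2x2 pixel quad (half resolution in each axis),
// with SSR taking over past a distance cutoff, as the post suggests.
void reflectionsHalfRes(std::vector<Vec3>& halfResOut, uint32_t fullW, uint32_t fullH, float ssrCutoff)
{
    const uint32_t halfW = fullW / 2, halfH = fullH / 2;
    halfResOut.resize(size_t(halfW) * halfH);
    for (uint32_t hy = 0; hy < halfH; ++hy)
        for (uint32_t hx = 0; hx < halfW; ++hx) {
            const uint32_t x = hx * 2, y = hy * 2;        // representative full-res pixel of the quad
            halfResOut[size_t(hy) * halfW + hx] =
                (viewDepth(x, y) > ssrCutoff) ? screenSpaceReflection(x, y)
                                              : traceReflectionRay(x, y);
        }
    // A later pass would bilaterally upsample halfResOut back to full resolution.
}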
 
Last edited:

Dampf

Member
It's called brute forcing. Or in this case: brute-losing...
Brute forcing is not going to cut it even on PC, especially considering RDNA2 is inferior to Nvidia in RT performance and they have no DLSS equivalent. And the same goes with the rise of midrange RT cards like the RTX 3050.

It won't be optimization that you'll be waiting for. RT, even in the offline world, is just as expensive as it was years ago. You are going to be waiting for the hardware to get more powerful. Period.

Raytracing as a process cannot be optimized, that is correct. How raytracing is used in games, however, can absolutely be optimized further.

If you want proof, just run RTXGI in UE4 and compare it to the built-in RT GI in UE4. RTXGI is light-years ahead both in performance and visual quality because it uses raytracing in a much more efficient way.

Nvidia also released a new denoiser which produces great results at just half the rays per pixel, which allows for much higher performance.

That is the type of optimization that will greatly speed up Raytracing in the coming years for consoles and lower end RT capable hardware.
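The reason a probe-based scheme like RTXGI is so much cheaper comes down to the ray budget. A back-of-the-envelope comparison in C++ - all numbers are illustrative assumptions, not figures from RTXGI or any shipped game:

#include <cstdint>
#include <cstdio>

int main()
{
    // Per-pixel RT GI fires rays for every pixel on screen, every frame.
    const uint64_t width = 3840, height = 2160;
    const uint64_t raysPerPixel = 1;                              // already the low end for per-pixel GI
    const uint64_t perPixelRays = width * height * raysPerPixel;  // ~8.3 million rays per frame

    // A probe-based scheme only updates a sparse probe grid and lets pixels
    // interpolate between probes; the update can also be amortised over frames.
    const uint64_t probes = 32 * 16 * 32;                         // hypothetical probe grid around the camera
    const uint64_t raysPerProbe = 256;
    const uint64_t probeRays = probes * raysPerProbe;             // ~4.2 million rays per full update

    std::printf("per-pixel GI: %llu rays/frame\n", (unsigned long long)perPixelRays);
    std::printf("probe GI:     %llu rays/update (spreadable over several frames)\n",
                (unsigned long long)probeRays);
    return 0;
}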
 
Last edited:

Alphagear

Member
Except that's all the dedicated RT cores can do, so you either actually use them for RT or they're sitting there doing nothing, which is a real waste.

Have the RT cores ON and the resolutions, frame rates, textures etc. all remain the same?

No sacrifices are made?

It doesn't impact GPU performance?
 
Last edited:

psorcerer

Banned
Brute forcing is not going to cut it even on PC, especially considering RDNA2 is inferior to Nvidia in RT performance and they have no DLSS equivalent.

Looks like a psalm to me. Are you praying?
Because it's kinda not related to what I've said...
 

assurdum

Banned
Yes, there is. Look at the detail of what's being presented. Then look at what they say the resolution is of the rendered shot. A higher resolution render of a scene is going to ALWAYS look better than a lower resolution render. They cull out more things like transparencies (i.e. leaves in trees). They don't have reflections within reflections (i.e. that puddle of water disappears in MM whereas it's still shiny in WD:L).



WD:L took the stance to cull out the number of objects that get reflected due to how more detailed their reflections are. That's a tradeoff. If you like the fact that you can "see" everything reflected. Hats off to you. I can not argue with that. But to say is a lie. It OBJECTIVELY does.
Again, can you show me a scene with more detailed reflections in WD on console compared to MM on PS5? You keep talking about its superiority, but I haven't seen anything and I can't find it in the DF videos.
Anyway, most raytracing even on PC runs at a quarter of the resolution, and I strongly doubt WD on console uses a higher resolution - very strongly. But let's take what you say as true: so a higher-resolution scene raytraced with a quarter of the detail, padded out with cubemaps and SSR, is superior to a fully raytraced scene at lower resolution, and that's an indisputable fact? Sure.

P.S. The puddle water reflections don't appear in WD on PS5 either. Don't use a bug to make your point ☺
 
Last edited:

Jagz

Member
Ray Tracing will be the consoles' Achilles' heel this generation. Spiderman at 4K30, with non-reflective Ray Tracing, is an early sign the PS5 will struggle with Ray Tracing, especially in open-world games.

The consoles lack a hardware-based DLSS solution like the RTX 30 series has, so ironically it's the consoles which are "brute forcing" 4K, while the 30 series on PC can fall back on DLSS and AI-upscale to 4K while doing full Ray Tracing and post effects.
 

S0ULZB0URNE

Member
Lmfaaaaao, please pass that shit over here, whatever you're smoking! PC will be behind consoles?! Where in history has that ever happened? How will the devices that MAKE the games for consoles be behind consoles? That's some deranged/delusional mentality right there.
Again, Demon's Souls looks better than anything on PC.
It's happened before (learn gaming history).
The minimum requirements on PC still involve HDDs.
Consoles win.
Gaming builds don't make games, LOL. Try again.
 
Last edited:

VFXVeteran

Banned
Brute forcing is not going to cut it even on PC, especially considering RDNA2 is inferior to Nvidia in RT performance and they have no DLSS equivalent. And the same goes with the rise of midrange RT cards like the RTX 3050.



Raytracing as a process cannot be optimized, that is correct. How raytracing is used in games, however, can absolutely be optimized further.

If you want proof, just run RTXGI in UE4 and compare it to the built-in RT GI in UE4. RTXGI is light-years ahead both in performance and visual quality because it uses raytracing in a much more efficient way.

Nvidia also released a new denoiser which produces great results at just half the rays per pixel, which allows for much higher performance.

That is the type of optimization that will greatly speed up Raytracing in the coming years for consoles and lower end RT capable hardware.

I will. In fact, if you steer me towards where in the code I can look at it, I can find the optimizations myself. Is it up on the repo?
 

VFXVeteran

Banned
Again, can you show me a scene with more detailed reflections in WD on console compared to MM on PS5? You keep talking about its superiority, but I haven't seen anything and I can't find it in the DF videos.
Anyway, most raytracing even on PC runs at a quarter of the resolution, and I strongly doubt WD on console uses a higher resolution - very strongly. But let's take what you say as true: so a higher-resolution scene raytraced with a quarter of the detail, padded out with cubemaps and SSR, is superior to a fully raytraced scene at lower resolution, and that's an indisputable fact? Sure.

P.S. The puddle water reflections don't appear in WD on PS5 either. Don't use a bug to make your point ☺

We are comparing the PC version of WD:L to the PS5 version of MM.

I'll say this yet again, if you think seeing all the objects in the world to infinity at a lower quality is your definition of "better" reflections, then I won't argue that. You won't hear me fight that at all.

If, however, you think that the quality of the reflection is better, then you are factually WRONG. Period. This isn't up for debate.

Watch Dogs: Legion perhaps lacks the full RT spectacle of Insomniac's efforts but it does have its own plus points. Unlike Spider-Man, there are reflections within reflections, so the reflection of a puddle on the ground, for example, will show reflective properties. Also, the geometry in reflections looks to be the same level of detail and precision as those in the primary view - Miles Morales has a lower precision 'RT city' from which to draw its reflections. However Ubisoft's implementation also has plenty of similarities with the PS5 exclusive. Xbox Series S and X are using stochastic reflections much like Insomniac's tech, so they will technically produce more realistic surface reflections than other simpler types of ray tracing. Also, the ray traced reflections in Watch Dogs: Legion add to transparencies - so glass materials look very realistic. Put simply, it's a big upgrade from a visual perspective, especially for a cityscape rich in reflective surfaces.


For ray tracing, a checkerboard rendering approach is used for all systems - even PC - so what are effectively half resolution RT reflections on PC become quarter resolution on Series X, calculated at 1080p, reducing to 720p on Series S.
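Checkerboarding here generally means tracing only every other pixel of the (already half-resolution) grid each frame and reprojecting the rest, which is how "half resolution on PC" becomes an effective quarter resolution per frame. A toy C++ sketch of that pattern - a stand-in illustration, not Ubisoft's implementation:

#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };
Vec3 traceReflectionRay(uint32_t, uint32_t) { return {0.0f, 0.0f, 0.0f}; }  // stand-in for the real RT dispatch

// Checkerboarded half-resolution tracing: each frame only half of the half-res grid
// is traced; the other half keeps last frame's (reprojected) result.
void traceCheckerboard(std::vector<Vec3>& halfRes, uint32_t halfW, uint32_t halfH, uint32_t frameIndex)
{
    halfRes.resize(size_t(halfW) * halfH);                  // no-op if already sized from last frame
    for (uint32_t y = 0; y < halfH; ++y)
        for (uint32_t x = 0; x < halfW; ++x)
            if (((x + y + frameIndex) & 1u) == 0)            // trace only the "black" squares this frame
                halfRes[size_t(y) * halfW + x] = traceReflectionRay(x, y);
            // the "white" squares are filled by reprojection elsewhere
}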

 
Last edited: