
Xbox architect on ray tracing: 'developers still want to use traditional rendering techniques without a performance penalty'

Mark Grossman, principal architect at Microsoft, has detailed the graphics silicon inside the Series X and just what role ray tracing has in it.

The short answer seems to be: not much.

While Microsoft has indeed worked with AMD to ensure there is some level of ray tracing support inside the Xbox Series X GPU, Grossman doesn't seem to be that enthused about how readily it will be utilised.

"We do support DirectX Raytracing acceleration, for the ultimate in realism™, but in this generation developers still want to use traditional rendering techniques, developed over decades, without a performance penalty," says Grossman sadly. "They can apply ray tracing selectively, where materials and environments demand, so we wanted a good balance of die resources dedicated to the two techniques."



I mean, it was kind of a given that ray tracing would be limited on these machines.
As long as we get some RT reflections and global illumination we should be good.
 

onQ123

Member


It seems that Microsoft is less than enthused about the prospect of ray tracing in the Xbox Series X, despite it being deemed the 'ultimate in realism.' In a Hot Chips deep dive on the AMD-powered GPU at the heart of the next-gen console, Mark Grossman, principal architect at Microsoft, has detailed the graphics silicon inside the Series X and just what role ray tracing has in it.

The short answer seems to be: not much.

The dual compute units (DCU) of the Big Navi-like GPU inside the Xbox Series X do have specific hardware dedicated to accelerating the real-time ray tracing process. But that seems to be the only change to the RDNA 2.0 dual compute unit compared with the first-gen ones found in the AMD RX 5700-series cards.

There are 26 of them (so 52 actual individual compute units), but from a high-level look at the GPU structure it seems like a pair of DCUs has been disabled in each chip. As a whole, then, the full AMD GPU would have 56 compute units inside it, but maybe with such a large slice of 7nm silicon it makes sense to build some redundancy in there.
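The CU maths in that paragraph, written out (counts from the article; the yield-redundancy framing is the article's speculation):

```python
# CU arithmetic for the Series X GPU as described above: 26 active dual
# compute units, with one pair of DCUs (2 DCUs = 4 CUs) disabled per chip.
active_dcus = 26
active_cus = active_dcus * 2                    # 52 CUs available to games
disabled_dcus = 2
full_cus = (active_dcus + disabled_dcus) * 2    # 56 CUs on the physical die
print(active_cus, full_cus)  # 52 56
```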


Because the next-gen Xbox GPU is built for DirectX 12 Ultimate—and therefore uses the DirectX Raytracing API—that means the dedicated hardware blocks in the AMD GPU will be aimed at accelerating the same bounding volume hierarchy (BVH) algorithms Nvidia's RTX-based graphics cards are pointed at.

"We've added hardware embedded in the compute units," says Grossman, "to perform intersections of rays with acceleration structures that represent the scene geometry hierarchy. That's a sizeable fraction of the specialised ray tracing workload, the rest can be performed with good quality and good real-time performance with the baseline shader and memory design.


"The overall ray tracing speed up varies a lot, but for this task it can be up to 10x the performance of a pure shader-based implementation."

But while Microsoft has indeed worked with AMD to ensure there is some level of ray tracing support inside the Xbox Series X GPU, Grossman doesn't seem to be that enthused about how readily it will be utilised.

"We do support DirectX Raytracing acceleration, for the ultimate in realism™, but in this generation developers still want to use traditional rendering techniques, developed over decades, without a performance penalty," says Grossman sadly. "They can apply ray tracing selectively, where materials and environments demand, so we wanted a good balance of die resources dedicated to the two techniques."

Maybe it's the fact that he's presenting to a screen, with an unknown number of virtual attendees sitting on the other side, or maybe he's really not expecting real-time ray tracing to be of much interest to developers in this coming generation either. Either way, we just want to give him a hug.




 

VFXVeteran

Banned
We knew this already. I'm predicting one feature will be used out of all the features available, and it will depend on the game of course. RTX brings the 2080 Ti to its knees, so it's a pipe dream to think we'll see 4K/30fps with all RT features enabled on consoles. Devs will have to pick and choose which features they want to show.
 
I've been amazed by what SSR does to a game on PC recently. Considering it's a feature omitted from a lot of console games for performance reasons, I think they should start using that instead of RT, personally. RT will eat up so much of the new systems' resources if used for anything major, which is why I'm sure developers want to stick with more traditional techniques.
 
The article tells us absolutely nothing new. There's literally no reason for this article to exist other than to garner the website clicks. If I have to explain to you how this thread is going to turn sour, I don't know what else to say really. I suppose if you hadn't posted it, someone else would have obliged.
 

Bernkastel

Ask me about my fanboy energy!
The article tells us absolutely nothing new. There's literally no reason for this article to exist other than to garner the website clicks. If I have to explain to you how this thread is going to turn sour, I don't know what else to say really. I suppose if you hadn't posted it, someone else would have obliged.
Someone else already did
 
Someone else already did
Lol, like I said.....
 
I have no idea why people are saying this is a troll thread by posting the Cerny talk. They do know PS5 will be even more limited when it comes to RT, don't they, considering the Series X has more compute units to do more RT calculations...

No one is saying there won't be RT in console games, just that it won't be in every game, and when it is used it will be one technique out of reflections, shadows or GI, not all three, and it will probably be quarter native resolution when it comes to reflections.

SSR is good enough imo when it comes to reflections, especially considering how many console games were missing SSR this generation. SSR adds so much life to a scene for a fraction of the rendering cost of real-time RT reflections.
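A rough illustration of why SSR is so much cheaper (a toy 1D version, not any engine's actual implementation): SSR just marches the reflected ray through the depth buffer that rasterisation already produced, so the cost is a handful of buffer reads per pixel, and it fails whenever the reflected geometry is off-screen:

```python
# Toy screen-space reflection march over a single depth-buffer row.
# Step the reflected ray across the screen, comparing its depth to the
# stored depth; stop where it passes behind a surface (a hit), or give
# up at the screen edge (SSR's classic failure case: off-screen data).

def ssr_march(depth_row, start_x, start_depth, dx, dz, max_steps=64):
    x, z = float(start_x), start_depth
    for _ in range(max_steps):
        x += dx
        z += dz
        ix = int(x)
        if ix < 0 or ix >= len(depth_row):
            return None          # ray left the screen: no reflection data
        if z >= depth_row[ix]:
            return ix            # ray went behind the surface: reflect this pixel
    return None
```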
 

Dr Bass

Member
Developers "want" to use traditional techniques? I'm pretty sure developers would LOVE to render fully ray-traced scenes in real time at 30-60fps. We aren't even close to being able to do that in hardware. For an architect, he sure is making strange language choices about how it works.
 

JLB

Banned
This is the same reason Mark Cerny only briefly touched on ray tracing for the PS5.

Developers will do a hybrid approach this generation

Truth is devs are used to the old stuff, they have the tools and the know-how, and the cost to adopt it so early is comparatively high. I guess later on next generation it will grow.
 

onQ123

Member
The article tells us absolutely nothing new. There's literally no reason for this article to exist other than to garner the website clicks. If I have to explain to you how this thread is going to turn sour, I don't know what else to say really. I suppose if you hadn't posted it, someone else would have obliged.

SMH it's an interview with one of the Xbox architects
 

Bitmap Frogs

Mr. Community
Mark Grossman, principal architect at Microsoft, has detailed the graphics silicon inside the Series X and just what role ray tracing has in it.

The short answer seems to be: not much.

For all the noise that was made about the supposed ray tracing capabilities of the next-gen hardware and all the associated FUD (remember the whole hardware-accelerated vs software-accelerated meme), it turns out in the end it's going to be a nothingburger.
 

trikster40

Member
Curious what kind of hit ray tracing will have on a game like Halo Infinite. I'm assuming 4K will not change, but will they drop the FPS? If so, will it affect SP and MP? Hopefully not multiplayer. Can they keep the 120 FPS?
 

pawel86ck

Banned
We know that already, but even limited RT is always something. UE5 is using limited RT for GI and the results still look stunning.
 

Elog

Member
I have no idea why people are saying this is a troll thread by posting the Cerny talk. They do know PS5 will be even more limited when it comes to RT, don't they, considering the Series X has more compute units to do more RT calculations...

Duplicating my answer in another thread (please note that I do not believe that the PS5 has full RT or something crazy like that):

"Please note that you assume that the RT solution is the same on the two machines. We do not know that. In the past Sony has done several customisations even on the CU level. We have to wait to see what their RT solution is until we have a hot-chip like breakdown of the PS5.

I am not sure that most people even know that half the CUs in the PS4 pro was different than all other CUs that AMD has made. Sony has never revealed what change they implemented but you can see on the silicon that each CU in one block has a larger mm2 silicon foot print.

On the PS4 basic model Sony added Shader capacity to the CU block compared to the basic AMD layout that the Xbox had.

And so on."
 

Entroyp

Member
I thought RT performance was like four 2080 Tis duct-taped together? Why not make developers use RT for everything?

Kidding aside, it sounds like both MS and Sony are very lukewarm towards RT for this gen.
 
Curious what kind of hit raytracing will have on a game like Halo Infinite. I’m assuming 4K will not change, but will they drop the FPS? If so, will it affect SP and MP? Hopefully not for multiplayer? Can they keep the 120 FPS?
Halo is already not native 4K on XSX; it's dynamic 4K, so the resolution can easily go up and down.
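Dynamic resolution of this kind is usually a simple feedback loop on GPU frame time. A toy controller sketch (the budget, gain, and clamp values are made up for illustration; this is not how 343i's scaler actually works):

```python
# Toy dynamic-resolution controller: nudge the render-resolution scale
# each frame so GPU time stays under a frame budget (8.33 ms = 120 fps).
# All constants here are illustrative, not from any real engine.

def update_render_scale(scale, gpu_ms, budget_ms=8.33, lo=0.6, hi=1.0):
    """Return the new resolution scale after one frame's GPU timing."""
    error = (budget_ms - gpu_ms) / budget_ms   # >0: headroom, <0: over budget
    scale += 0.1 * error                       # small proportional step
    return min(hi, max(lo, scale))             # clamp to the allowed range
```

Run over budget (say 12 ms), the scale drifts down; with headroom it drifts back toward native, which is exactly the "resolution can easily go up and down" behaviour.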
 

Dr Bass

Member
I agree, you can get good ray tracing effects using software techniques without compromising performance. For the rest, you can use hardware acceleration. Developers have to tinker and get a balance of both.

This is completely backwards. Operations with hardware support run dramatically faster than "software" techniques; software-based solutions are the definition of compromising on performance.
 

Mod of War, we have two of these threads already. Can we get them merged please?
 

JLB

Banned
This RT thing drops the in-game frame rates even on high end PC GPUs when used liberally

Well, the "RT thing" is the biggest "thing", the holy grail of graphics.
But yeah, RT without other techniques is still expensive. It's a shame that next-gen consoles don't have a proper DLSS 2.0-like implementation in place.
 

Hendrick's

If only my penis was as big as my GamerScore!
Why not post the other part of what he said?

"We've added hardware embedded in the compute units," says Grossman, "to perform intersections of rays with acceleration structures that represent the scene geometry hierarchy. That's a sizeable fraction of the specialised ray tracing workload, the rest can be performed with good quality and good real-time performance with the baseline shader and memory design. The overall ray tracing speed up varies a lot, but for this task it can be up to 10x the performance of a pure shader-based implementation."
 

Hobbygaming

has been asked to post in 'Grounded' mode.
Truth is devs are used to the old stuff, they have the tools and the know-how, and the cost to adopt it so early is comparatively high. I guess later on next generation it will grow.
It will grow and I think most developers will use a mix of rasterization and RT to save on performance
 

Mister Wolf

Member
Doesn't help that many of these developers already have their own game engines and are comfortable doing things the way they're used to. I can see Remedy and 4A Games being all-in on ray tracing, though. Kinda disappointed I haven't heard Remedy mention anything about ray tracing or DLSS for the CrossfireX campaign.
 
Between the repeated downplaying of ray tracing and the low specs of the XSS, I should be able to save a bundle on the new laptop I'm planning on getting next year and still be able to play any Xbox games without issue.
 

Spukc

always chasing the next thrill
How about developers stop jerking off to native 4K fuckery and save half of the GPU resources by settling for 1440p or 4K checkerboard; then they should be able to go all out with ray tracing.
All out with ray tracing in 2020, with modern games... that's cute :pie_roffles:
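For what it's worth, the pixel arithmetic behind the "save half the GPU" idea: native 4K shades 2.25x the pixels of 1440p (assuming cost scales with pixel count, which it only roughly does):

```python
# Pixel-count ratio between native 4K and 1440p. Illustrative arithmetic
# only: real GPU cost doesn't scale perfectly with resolution.
uhd_pixels = 3840 * 2160          # 8,294,400
qhd_pixels = 2560 * 1440          # 3,686,400
ratio = uhd_pixels / qhd_pixels
print(ratio)  # 2.25
```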
 
Duplicating my answer in another thread (please note that I do not believe that the PS5 has full RT or something crazy like that):

"Please note that you assume that the RT solution is the same on the two machines. We do not know that. In the past Sony has done several customisations even on the CU level. We have to wait to see what their RT solution is until we have a hot-chip like breakdown of the PS5.

I am not sure that most people even know that half the CUs in the PS4 pro was different than all other CUs that AMD has made. Sony has never revealed what change they implemented but you can see on the silicon that each CU in one block has a larger mm2 silicon foot print.

On the PS4 basic model Sony added Shader capacity to the CU block compared to the basic AMD layout that the Xbox had.

And so on."

Of course they're different hardware-wise, but it will still be AMD's RT tech they're both using, which uses the compute units to calculate rays for RT. Developers have to choose and balance between RT and traditional rasterisation, and that's why we're already hearing they're choosing more traditional methods like SSR: using RT even in a limited fashion, even at 1440p native, is immensely computationally expensive.

The console with the most CUs will have the best RT performance in like-for-like scenarios, but I don't see it being used a lot anyway outside of exclusives, so it doesn't really matter tbh.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Curious what kind of hit raytracing will have on a game like Halo Infinite. I’m assuming 4K will not change, but will they drop the FPS? If so, will it affect SP and MP? Hopefully not for multiplayer? Can they keep the 120 FPS?

MP won't use RT, I can all but guarantee that.

The campaign will likely have RT in a "visuals mode", with a dynamic 4K that rarely actually sits at 4K.
My guess is 343i is going to use RT for GI, since I don't think an open-world game needs RT reflections, and reflections cost so much yet don't add anywhere near as much to the experience as GI would.
 