
Digital Foundry: Xbox Series X ray tracing performance <= 2060 Super

phil_t98

#SonyToo
I think saying a first-gen, cross-gen game is where ray tracing will be in 5 or 6 years on both consoles is a bit premature. Both consoles will get better as the gen progresses, APIs will get better, and the coding of games will far exceed what we have now.

The best-looking game so far, IMO, is Spiderman MM, and that will look pretty poor by the time we get to the end of this gen. When you look back at the first games on PS4/XB1 compared to what we have now, there is a huge difference.

The same will apply to ray tracing techniques as developers find their feet with both consoles
 
Easy? Every benchmark I saw had the 2070 running from 45 to 55 fps at 4K ultra, depending on the weather...
Sorry, I meant the 2070S, which I have.
And notice that there was a driver update which boosted Nvidia performance.
[Image: GeForce gamescom 2019 Game Ready driver performance chart]
 

Krappadizzle

Gold Member
A £450 console doesn't have 2080 performance, a card that costs almost twice as much? Well, I'm surprised.

2060S.

Big difference between the 2060S and the 2080, not only in price tag but also in performance.

XsX should ideally be working at 2080/2080ti levels according to a bunch of tech outlets. It's simply underperforming pretty heavily in this game.

----------------

It's fun to poke fun at, but I think it probably has more to do with Ubisoft than the hardware available in the XsX.
 
OP really does not know how console launches work. Acting like the XSX is not more powerful than the 2060 Super, lol.

This is a lazy, low-effort port; it's not really optimized for XSX, it's a launch game. Every console generation launch we get the same lazy ports; this is nothing new.

Is this people's first console launch? Or what is going on here? I thought people on this board actually knew stuff, but looking at OP, it seems to be his first console launch ever. Cute.
 

Krappadizzle

Gold Member
OP really does not know how console launches work. Acting like the XSX is not more powerful than the 2060 Super, lol.

This is a lazy, low-effort port; it's not really optimized for XSX, it's a launch game. Every console generation launch we get the same lazy ports; this is nothing new.

Is this people's first console launch? Or what is going on here? I thought people on this board actually knew stuff, but looking at OP, it seems to be his first console launch ever. Cute.
People are just super high on console launches at the moment. Sony fans are loving the underperformance from Ubi on Xbox and trying to say it's a measurement and proof of things to come, while they argue over milliseconds of torn frames... It's bizarro land for sure, but some people will come back down to reality as the new-car smell wears off. As it stands, I just try to let them have their fun; it's not gonna last forever.
 
Last edited by a moderator:

cryptoadam

Banned
I think both machines are going to struggle with ray tracing. RT will always have to make cuts, like lowering the res, LOD, shadows etc. within the reflection... Even Spiderman is pulling all kinds of tricks with its RT. The machines could not be built cost-effectively while going whole hog on RT. And of course they need something to sell the Pro versions. Unlike last gen, where "4K" was the selling point, this time around it's going to be 4K + "60" + RT to get you to shell out $600 in 18-24 months.

These machines will never give a full RT experience with the power they have. Expect cuts either in the main game (lower res/FPS) or within the RT itself. It's going to test the mettle of developers to see what cuts and tricks they can do to optimize. One thing we can say is that Ubi is shit at both of those things, so expecting good results from any Ubi game is like expecting to eat shit and have it taste like gourmet steak.
 

Self

Member
Consoles have basic RT performance, but even Miles Morales has far better ray tracing than I'd ever imagined the PS5 to have.

I agree. I never thought it would look *that* good - on a freaking launch game.

It's fun to poke fun at, but I think it probably has more to do with Ubisoft than the hardware available in the XsX.

I honestly can't remember a Ubisoft game with a stable framerate or without major failures and glitches. I'm actually surprised that AC runs at 60fps, apart from looking like a last-gen game.
 

Ascend

Member
So this should mean both Xbox and PS5 are equivalent to a 2060 Super when it comes to ray tracing.
This is questionable already. The PS5 has 36 RT cores, the XSX has 52.
Take clock speeds into account and the XSX should be around 15% faster than the PS5 at RT. How big should that difference be? It's about the difference between the 2070S and the 2060S (40 RT cores vs 34 RT cores).
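Back-of-envelope, the core-count and clock math goes like this (clock figures are the commonly quoted GPU clocks, and linear scaling is an assumption, so treat this as a rough sketch, not a benchmark):

```python
# Rough RT throughput sketch: (RT cores) x (clock), assuming linear
# scaling with both -- a big simplification of real hardware.
XSX_CORES, XSX_CLOCK_GHZ = 52, 1.825   # fixed clock
PS5_CORES, PS5_CLOCK_GHZ = 36, 2.23    # peak (variable) clock

raw_core_ratio = XSX_CORES / PS5_CORES                                # clocks ignored
clock_adjusted = (XSX_CORES * XSX_CLOCK_GHZ) / (PS5_CORES * PS5_CLOCK_GHZ)

print(f"core count alone: XSX {raw_core_ratio - 1:.0%} ahead")   # ~44%
print(f"clock adjusted:   XSX {clock_adjusted - 1:.0%} ahead")   # ~18%
```

The clock-adjusted figure lands near the ~15% quoted above; the exact number depends on how close the PS5 sits to its peak clock.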

So either the XSX is like a 2060S and the PS5 is like a 2060 non-S but can put out stuff like Miles Morales RT, or, you are jumping to conclusions.
Remember that nVidia and AMD cards both have their strengths and weaknesses. Whatever is done with RT is not done through rasterization. This has to be taken into account. Say nVidia is better at ambient occlusion with rasterization than AMD, but AMD is better with reflections. If you have a game that uses RT for its reflections, AMD's relative performance hit will be higher, even if the RT performance is the same for both. Likewise, if you have a game that uses RT for ambient occlusion, nVidia's relative performance hit will be higher, because they replaced what is done faster on their GPU with a taxing RT implementation.
Comparing one game is just that; a sample of one. It is not enough to be representative, and there are too many variables.
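The relative-hit argument is easy to make concrete with a toy frame-time model (every number here is invented purely for illustration):

```python
# Toy frame-time model: a frame = base work + one effect (say, AO).
# Enabling RT swaps the rasterized effect for an RT version. The
# *relative* hit depends on how cheap the rasterized effect was,
# even when the RT cost itself is identical on both GPUs.

def relative_hit(base_ms, raster_effect_ms, rt_effect_ms):
    before = base_ms + raster_effect_ms
    after = base_ms + rt_effect_ms
    return after / before - 1.0

RT_COST_MS = 6.0  # same RT cost on both hypothetical GPUs

# GPU A renders the rasterized effect fast; GPU B is slower at it.
hit_a = relative_hit(base_ms=10.0, raster_effect_ms=1.0, rt_effect_ms=RT_COST_MS)
hit_b = relative_hit(base_ms=10.0, raster_effect_ms=2.5, rt_effect_ms=RT_COST_MS)

print(f"GPU A (fast raster effect): {hit_a:.0%} slower with RT")  # ~45%
print(f"GPU B (slow raster effect): {hit_b:.0%} slower with RT")  # ~28%
```

Same RT cost, different baselines, different relative hits, which is why a single game can't settle the vendor comparison.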

Not a good first effort by AMD in my opinion. If their 12 TFLOPS GPU is offering worse performance in ray-traced games than a 7 TFLOPS Nvidia GPU, then we are going to have a pretty lame gen.
TFLOPS is useless here. And considering the architectures are different, counting RT cores is only so useful. The RTX 2060S has 34 RT cores, while the XSX has 52. But they are implemented differently and can behave differently depending on the engine and the game, as mentioned above. And even within the same architecture, it doesn't scale linearly. Is the 2080 Ti twice as fast at RT compared to the RTX 2060S? It's 68 vs 34 RT cores... No, it's not (Source).
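One way to see why twice the RT cores doesn't mean twice the frame rate: only part of the frame is RT work, so an Amdahl's-law-style cap applies (the fraction below is invented for illustration):

```python
# Amdahl-style sketch: only the RT share of frame time scales with
# RT core count (itself an idealized assumption).

def frame_speedup(rt_fraction, rt_cores_ratio):
    return 1.0 / ((1.0 - rt_fraction) + rt_fraction / rt_cores_ratio)

# If RT were 40% of frame time and RT cores doubled (68 vs 34):
print(round(frame_speedup(rt_fraction=0.4, rt_cores_ratio=2.0), 2))  # ~1.25x, not 2x
```

The non-RT portion of the frame doesn't shrink, so the overall gain is always smaller than the core-count ratio suggests.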

I am not saying AMD's RT implementation will be awesome. At this point, RT acceleration in GPUs is still mediocre for gaming, performance-wise. The consoles can only do so much. They already have to try and run 4K, which is actually beyond the optimal resolution for their class of GPUs, and on top of that they are expected to do RT... It's kind of too much to ask of these machines, and well, they look good with what they can offer.

AMD's implementation will be a failure if the 6800 XT with 72 RT cores performs the same as a 2060S with 34 RT cores. I doubt that's going to happen. I also doubt that it will match nVidia's Ampere GPUs (it's likely going to be slower), but in practice, I doubt it really matters, considering how poorly RT performs. Unless you want to pay $700 to game at 1080p with RT, it's not going to matter much.

In before "BUT DLSS"
 

VFXVeteran

Banned
This is questionable already. The PS5 has 36 RT cores, the XSX has 52.
Take clock speeds into account and the XSX should be around 15% faster than the PS5 at RT. How big should that difference be? It's about the difference between the 2070S and the 2060S (40 RT cores vs 34 RT cores).

So either the XSX is like a 2060S and the PS5 is like a 2060 non-S but can put out stuff like Miles Morales RT, or, you are jumping to conclusions.
Remember that nVidia and AMD cards both have their strengths and weaknesses. Whatever is done with RT is not done through rasterization. This has to be taken into account. Say nVidia is better at ambient occlusion with rasterization than AMD, but AMD is better with reflections. If you have a game that uses RT for its reflections, AMD's relative performance hit will be higher, even if the RT performance is the same for both. Likewise, if you have a game that uses RT for ambient occlusion, nVidia's relative performance hit will be higher, because they replaced what is done faster on their GPU with a taxing RT implementation.
Comparing one game is just that; a sample of one. It is not enough to be representative, and there are too many variables.


TFLOPS is useless here. And considering the architectures are different, counting RT cores is only so useful. The RTX 2060S has 34 RT cores, while the XSX has 52. But they are implemented differently and can behave differently depending on the engine and the game, as mentioned above. And even within the same architecture, it doesn't scale linearly. Is the 2080 Ti twice as fast at RT compared to the RTX 2060S? It's 68 vs 34 RT cores... No, it's not (Source).

I am not saying AMD's RT implementation will be awesome. At this point, RT acceleration in GPUs is still mediocre for gaming, performance-wise. The consoles can only do so much. They already have to try and run 4K, which is actually beyond the optimal resolution for their class of GPUs, and on top of that they are expected to do RT... It's kind of too much to ask of these machines, and well, they look good with what they can offer.

AMD's implementation will be a failure if the 6800 XT with 72 RT cores performs the same as a 2060S with 34 RT cores. I doubt that's going to happen. I also doubt that it will match nVidia's Ampere GPUs (it's likely going to be slower), but in practice, I doubt it really matters, considering how poorly RT performs. Unless you want to pay $700 to game at 1080p with RT, it's not going to matter much.

In before "BUT DLSS"

Everything you said is NOT how things work when the game engine is rendering.

There are no special performance chips that know what a particular algorithm is for a given hardware set. You won't have "this card" doing RT reflections better than the other card. The algorithm doesn't care what the hardware does with the technique. There is no special sauce for different algorithms in the chipsets.
 

VFXVeteran

Banned
I just hope this low-end ray tracing doesn't hold PC gaming back over the next few years, because the tech is going to grow by leaps and bounds and the GPUs are going to get much more capable.

RT is mainly controlled by fixed settings. You set them and hit render and it does everything it should.

You have nothing to fear as far as that's concerned. No developer is going to intentionally turn down dials just because the consoles can't hang.
 

VFXVeteran

Banned
Judging by the difference between Spiderman and Watch Dogs, I would say how a developer implements ray tracing is equally important; Watch Dogs seems like it's just a bit shit.

How, when it looks better and factually does everything closer to how CG reflections are supposed to work? It runs the original shaders when it reflects, it reflects other reflections in water puddles, it reflects smoke, leaves, etc. It's clearly a better-quality implementation than MM. Not sure how some of you can't see that. MM is leaving so much out of the reflections it's not even funny.
 

VFXVeteran

Banned
I mean, it's kinda hard to make any legit claims when your only sample is a Ubisoft cross-gen game that's not fully optimized specifically for next-gen consoles

Where are your receipts for this claim? I'm curious. WD:L is using the GPU pretty heavily, and their RT implementation is pretty thorough, especially with what their reflections are doing. You might want to watch the DF video on it to get a better understanding of how the reflections implementation is of super high quality and costs GPU cycles.
 

VFXVeteran

Banned
Why do PC gamers *always* forget the benefits of developing for a closed system?

Show me a PC game on a 1.8 TF GPU that looks as good as TLOU2...

The same will happen here: PC gamers will scoff, then the exclusives will roll out (especially on PS5), and then they will go quiet again. This includes RT, which is in its infancy on these machines

No PC game will ever trump the popularity game of you PS warriors. There is nothing magically stunning about any of the exclusives beyond subjectivity and popularity. If we were to examine these games at the actual pixel level, with their many shortcuts, you'll find that many, many PC games at Ultra settings have way better rendering quality than the consoles... as it should be... the PC has the GPU power to add in higher-fidelity rendering.
 

VFXVeteran

Banned
Would it not make sense if ray-traced reflections activated only when a character is moving at a walking or slow running pace?
Then, when in a car or swinging or traversing quickly, the ray tracing could be limited, apart from on the car...

Not sure if that would work; it's just an idea.

No. You avoid any "if" statements in shaders. That's not how to properly implement an optimization. At some point, you've cut corners too much and the look is destroyed.
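For context on the "no ifs in shaders" point: GPUs run many pixels in lockstep, so shader code typically replaces branches with arithmetic selects (step/mix in GLSL terms). A Python sketch of that idiom, with made-up names and thresholds — this illustrates the branchless pattern, not an endorsement of speed-based RT toggling:

```python
# Branchless select, the way shader code phrases it. Instead of
#   if speed < threshold: q = hi else: q = lo
# compute a blend weight and mix, so every GPU lane runs the
# same instructions.

def step(edge, x):
    return 1.0 if x >= edge else 0.0   # GLSL-style step()

def mix(a, b, t):
    return a * (1.0 - t) + b * t       # GLSL-style mix()/lerp()

def reflection_quality(speed, walk_threshold=2.0, hi=1.0, lo=0.25):
    t = step(walk_threshold, speed)    # 0 when walking, 1 when fast
    return mix(hi, lo, t)

print(reflection_quality(0.5))  # 1.0  (full quality while walking)
print(reflection_quality(8.0))  # 0.25 (reduced quality at speed)
```

Even written branchlessly, the post's point stands: abruptly switching quality by movement speed is the kind of corner-cutting that gets noticed.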
 

mitchman

Gold Member
This is questionable already. The PS5 has 36 RT cores, the XSX has 52.
Take clock speeds into account and the XSX should be around 15% faster than the PS5 at RT. How big should that difference be? It's about the difference between the 2070S and the 2060S (40 RT cores vs 34 RT cores).
You forgot to take into account that BVH traversal will be significantly faster on the PS5 than on the XSX due to higher clock speeds, but the XSX should be able to launch more rays. Maybe that was implied, though. What the end result will be is hard to say.
 
I just hope this low-end ray tracing doesn't hold PC gaming back over the next few years, because the tech is going to grow by leaps and bounds and the GPUs are going to get much more capable.
Oh baby give me that 3090ti and I’ll never ask for anything else thanks
 

VFXVeteran

Banned
Yeah, that 2060 does look a hell of a lot better than the Xbox version.

Then again, Spiderman looks incredible, so surely the Xbox can improve it.

Another reason is maybe Insomniac Games are just a tier above Ubisoft in that department.

Not wanting to throw shade on Ubisoft, but I don't rate them anywhere near the Naughty Dogs or Sucker Punches of this world; this is probably just that. A good studio, not a masterful one. Or maybe I am being too harsh.

And you would be wrong, and your comment comes off as arrogant and ignorant at the same time. Ubisoft has multiple platforms to think about when making their graphics engine; a 1st party doesn't have to worry about that. Ubi's graphics engine is more robust and definitely contains more tech in it.

Spiderman MM reflections are complete garbage compared to WD:L reflections. They don't look better by any stretch of the imagination. Just the fact that MM avoids the original shader calls should make you see the difference right there.

Ubi is huge, dude. Just like all the other large studios that have way more money and resources because they have to support multiple platforms. It does NOT mean their graphics engine is automatically unoptimized just because it's a 3rd-party company.
 

Ascend

Member
Everything you said is NOT how things work when the game engine is rendering.

There are no special performance chips that know what a particular algorithm is for a given hardware set. You won't have "this card" doing RT reflections better than the other card. The algorithm doesn't care what the hardware does with the technique. There is no special sauce for different algorithms in the chipsets.
I don't think you actually understood what I wrote.

You forgot to take into account that BVH traversal will be significantly faster on the PS5 than on the XSX due to higher clock speeds, but the XSX should be able to launch more rays. Maybe that was implied, though. What the end result will be is hard to say.
No, I did not forget to take that into account. If I had, the figure would be closer to 45% faster for the XSX, rather than the mentioned 15%.
 

Armorian

Banned
Where are your receipts for this claim? I'm curious. WD:L is using the GPU pretty heavily, and their RT implementation is pretty thorough, especially with what their reflections are doing. You might want to watch the DF video on it to get a better understanding of how the reflections implementation is of super high quality and costs GPU cycles.

Yes, RT in WD is quite great:

Wood and metal reflect (probably not on consoles)

[screenshots]

Lamp reflected on picture glass

[screenshots]
 
Yes, RT in WD is quite great:

Wood and metal reflect (probably not on consoles)

[screenshots]

Lamp reflected on picture glass

[screenshots]
I'm not saying that the ray tracing in Watch Dogs isn't great. I'm just saying (admittedly this is an assumption) that they probably were developing with Nvidia RTX as the target rather than AMD, so I don't know if we can fully state what AMD's RT is capable of based on this game alone.
 

VFXVeteran

Banned
Yes, RT in WD is quite great:

Wood and metal reflect (probably not on consoles)

[screenshots]

Lamp reflected on picture glass

[screenshots]

Even the NPCs are of better quality. This game overall looks way better than MM. The PC version maxed out just has many 3D features that are expensive. Look at the SSS of the skin on your character. You can't find that quality of SSS in any console game during gameplay (not cinematics).
 

Tajaz2426

Psychology PhD from Wikipedia University
Both consoles have some good RT already, and both will get better with time.
 

Tajaz2426

Psychology PhD from Wikipedia University
Even the NPCs are of better quality. This game overall looks way better than MM. The PC version maxed out just has many 3d features that are expensive. Look at the SSS of the skin on your character. You can't find that quality SSS on any console during gameplay (not cinematics).
The NPCs look like crap on both games. Stop your little hate boner that you have. It’s embarrassing having to come into a thread and see you acting like a child yet again.
 
These consoles have been out for only a few days, and we're judging them with a game made by a company that doesn't make the most optimized games? We're better than that, and so should Digital Foundry be.
 

VFXVeteran

Banned
The NPCs look like crap on both games. Stop your little hate boner that you have. It’s embarrassing having to come into a thread and see you acting like a child yet again.

They do not look like crap. Show me the best-looking NPC from any game during gameplay and I'll match it with an NPC in WD:L. You go first. I already have my screenshots ready to roll.
 

Tajaz2426

Psychology PhD from Wikipedia University
They do not look like crap. Show me an NPC that looks the best of all games while in gameplay and I'll match it with an NPC in WD:L. You go first. I already have my screenshots ready to roll.
They don't look good, period. I don't have any game in mind because NPCs are basic; all games look that way. Throw up your pictures. You have crazies in this thread saying the Xbox is like a 3080. Quit your crap and act like an adult.

You fanboys all drive me up the wall. That’s on all sides.
 

sircaw

Banned
And you would be wrong, and your comment comes off as arrogant and ignorant at the same time. Ubisoft has multiple platforms to think about when making their graphics engine; a 1st party doesn't have to worry about that. Ubi's graphics engine is more robust and definitely contains more tech in it.

Spiderman MM reflections are complete garbage compared to WD:L reflections. They don't look better by any stretch of the imagination. Just the fact that MM avoids the original shader calls should make you see the difference right there.

Ubi is huge, dude. Just like all the other large studios that have way more money and resources because they have to support multiple platforms. It does NOT mean their graphics engine is automatically unoptimized just because it's a 3rd-party company.


I guess you work for Ubisoft or something, hence your shitty tone about what you deem arrogant and ignorant.

Spiderman reflections look fucking amazing; proper game developers have been saying so. Not sure why you think you're a better authority than them, well you're not.

We get it, you think you're the bee's knees around here, but you're not; you're just an average Joe at the end of a PC.

We all have eyes. Whatever PlayStation seems to do, you're always fucking butt-hurt about it.

Game developer, my arse.

Should have done what people said a long time ago: welcome to my ignore list.
 

VFXVeteran

Banned
I guess you work for Ubisoft or something, hence your shitty tone about what you deem arrogant and ignorant.

No, but I do work at a company that does realtime graphics, and I would NEVER assume Ubisoft has unoptimized code. YOU, on the other hand, have zero experience in the industry. You know no one of importance. You have not talked personally to any developer who makes this stuff, and yet you want to push the idea that they have an unoptimized graphics engine for the console? Yes, that's fucking arrogant AND ignorant, dude.

Spiderman reflections look fucking amazing; proper game developers have been saying so. Not sure why you think you're a better authority than them, well you're not.

They look amazing to YOU! They do not compare well to Watch Dogs Legion. DF showed this explicitly for your eyes to see.

We get it, you think you're the bee's knees around here, but you're not; you're just an average Joe at the end of a PC.

Yeah, an average Joe that gets paid to do this stuff on a daily basis. Sorry, but I know exactly how to write the algorithm for Spiderman's reflection shader code. It's very easy to write. I also know that WD:L has way more computations and fires more rays to get better reflections. I have written those shaders for years.
 

nowhat

Member
Yeah, an average Joe that gets paid to do this stuff on a daily basis. Sorry, but I know exactly how to write the algorithm for Spiderman's reflection shader code. It's very easy to write. I also know that WD:L has way more computations and fires more rays to get better reflections. I have written those shaders for years.
So, when do we get your YouTube channel explaining all this? Will you still refer to yourself in the third person then?
 

synce

Member
Goes to show what a waste of resources RT is. Looking at DF's DMC5 video, the game runs at 90+ fps in 4K without RT, which is just above 3070 performance, or over twice that of a 2060 Super.
 
Wait, does it mean that they are both completely equal to 2060 performance for ray tracing, or does it mean that the developers of Watch Dogs have a profile for the 2060 that they just decided to use for the consoles to make their lives easier while making their mediocre game?

They really need to update the visuals in the PS5 version. Just horrible at times.
 

CrustyBritches

Gold Member
VFXVeteran

Turds like Moore's Law Is Dead had fanboys thinking they were getting 2080 Ti performance out of these consoles, and in the end, with RT enabled, reality hits and denial is everywhere.

This level of RT performance makes sense insofar as the RX 6800 XT, a card with twice the CUs and raw power of the PS5, has RT performance right around a 2080 Ti/3070. Something like an XSX, said to have raster performance around a 2080, or a PS5, would have RT performance a tier or so below. A 2070S/2080 dropped a tier for weak RT performance lands right at stock 2060S/2070 level.

This isn't even with an overclock or DLSS. I love the XSX, but it still gives me a good chuckle as a 2060S owner.
 

Ev1L AuRoN

Member
Easy? Every benchmark I saw had the 2070 running from 45 to 55 fps at 4K ultra, depending on the weather...
You are delusional if you think that consoles are running those games at native 4K60 ultra. None of the titles tested so far are native; it's all reconstruction and dynamic resolution, or 4K30. And don't get me wrong, I'm OK with it. I think the games look great, but it's naive to believe they are attempting the same IQ as the PC. If you lower some settings like the consoles do, you would be surprised at how well the RTX 2060 runs games at 4K. And when the card needs to use a lower resolution, most of the time it's still higher than the consoles'. I still think they are like a 2080 in rasterization and like the 2060S in RT, but we still need to see it in games. So far, it's 2060S performance.
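The dynamic resolution mentioned above is essentially a feedback loop: measure GPU frame time, then nudge the render scale toward a budget. A toy sketch (the controller constants and thresholds are invented; real engines predict the next frame's cost rather than react):

```python
# Toy dynamic-resolution controller: keep GPU frame time near a
# budget (16.6 ms for 60 fps) by adjusting the render scale.

def update_render_scale(scale, frame_ms, budget_ms=16.6,
                        lo=0.6, hi=1.0, gain=0.05):
    if frame_ms > budget_ms:            # over budget -> lower resolution
        scale -= gain
    elif frame_ms < budget_ms * 0.9:    # comfortably under -> raise it
        scale += gain
    return min(hi, max(lo, scale))      # clamp to allowed range

scale = 1.0
for frame_ms in [20.0, 19.0, 18.0, 15.0, 14.0]:
    scale = update_render_scale(scale, frame_ms)
print(round(scale, 2))  # settles around 0.9 for this trace
```

This is why console output resolution moves around under load while a fixed-settings PC benchmark simply drops frames instead.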
 

longdi

Banned
MM is about the expertise and big resources Sony first parties can throw at the problems.
AFAIK MM RT is but a mix of cube maps and half-baked RT.

Technically, it is not even close to WDL.
Technique-lly, it is a smart way to work around the consoles' weaker RT hardware.
 

Ev1L AuRoN

Member
MM is about the expertise and big resources Sony first parties can throw at the problems.
AFAIK MM RT is but a mix of cube maps and half-baked RT.

Technically, it is not even close to WDL.
Technique-lly, it is a smart way to work around the consoles' weaker RT hardware.
I think what they do on consoles is quite impressive. It's obvious to PC gamers that these games don't run at the same level of fidelity, but because of that, they can pull very impressive gains in performance compared to the PC space, where most games don't even feature dynamic resolution. And of course most enthusiasts want the game running on ultra. That's why a lot of people dismiss the RTX 2060S as a weak card; there are so many better options in the PC space, and nobody optimizes settings for the RTX when comparing performance. It's obvious it will tank.
 
That video made me not even want to get a Series X. I can't stand aliasing. I thought for sure that crap would be gone this gen.
 

longdi

Banned
I think what they do on consoles is quite impressive. It's obvious to PC gamers that these games don't run at the same level of fidelity, but because of that, they can pull very impressive gains in performance compared to the PC space, where most games don't even feature dynamic resolution. And of course most enthusiasts want the game running on ultra. That's why a lot of people dismiss the RTX 2060S as a weak card; there are so many better options in the PC space, and nobody optimizes settings for the RTX when comparing performance. It's obvious it will tank.

Yes, and you add in the cinematographers, artists, and animators in the Sony camp. And IG had more time to optimize their game for only 3 hardware configurations.
That's how SCE games do so well from a visual standpoint, even though their hardware is not the strongest, even in the console space.
 

jigglet

Banned
I think the ray tracing capabilities of these consoles are probably going to be subpar, but come on OP, this is a shit way to prove your point. A multi-platform game from a developer that's known to fart stuff out. Yeah...
 

Ev1L AuRoN

Member
I think the ray tracing capabilities of these consoles are probably going to be subpar, but come on OP, this is a shit way to prove your point. A multi-platform game from a developer that's known to fart stuff out. Yeah...
The engine that Ubi uses on AC is amazing and heavy. I don't think it is unoptimized at all; try any modern Assassin's Creed game on PC with ultra settings. It is pretty much next-gen.
 

regawdless

Banned
I think saying a first-gen, cross-gen game is where ray tracing will be in 5 or 6 years on both consoles is a bit premature. Both consoles will get better as the gen progresses, APIs will get better, and the coding of games will far exceed what we have now.

The best-looking game so far, IMO, is Spiderman MM, and that will look pretty poor by the time we get to the end of this gen. When you look back at the first games on PS4/XB1 compared to what we have now, there is a huge difference.

The same will apply to ray tracing techniques as developers find their feet with both consoles

Mhm, this may be true for old generations. But since they moved to an easy-to-program architecture last gen, the gains aren't that huge. Infamous Second Son is still one of the better-looking PS4 games, if you ask me.
 

Ev1L AuRoN

Member
Mhm, this may be true for old generations. But since they moved to an easy-to-program architecture last gen, the gains aren't that huge. Infamous Second Son is still one of the better-looking PS4 games, if you ask me.
What developers are going to get better at this gen, and what's new in both the console and PC space, is direct access to fast SSDs for next-gen streaming capabilities. But I agree with your post.
 

regawdless

Banned
The NPCs look like crap on both games. Stop your little hate boner that you have. It’s embarrassing having to come into a thread and see you acting like a child yet again.

NPCs in Watch Dogs Legion can actually look pretty good (maxed out on PC, for whatever that's worth). But they move like shit; the animations are pretty bad. The models themselves are very good for an open-world game.
 