
Next-Gen PS5 & XSX |OT| Console tEch threaD

Status
Not open for further replies.

LordOfChaos

Member
Not true. RT reflections are much more expensive than RT AO or shadows.

In Control, enabling a few RT features doesn't hurt performance all that much, but once I enable reflections, or if I only enable reflections, the performance takes a punch to the face.

Not all RT techniques have the same cost. AO and shadows are cheaper, just like the Digital Foundry guys said, which is why they will be the most used on PS5 and XBX2. I highly doubt RT reflections show up in more than a couple of PS5 games during its lifetime. If a 2080 Ti cannot handle them, then the weaker PS5/XBX GPUs won't either. Unless of course AMD knocks Nvidia out of the ring with their RT implementation, but when was the last time AMD beat Nvidia at anything GPU-wise other than being the lowest bidder?


The expense of these effects is not in dispute. A hardware implementation that only does one thing is. It's like saying you have a shader that only does water: once you solve for fast shading, a developer can apply it however they see fit to budget it out, right? It's the same thing with ray casting. Some effects are more expensive than others, but Mr xmedia is saying the hardware is limited to shadows, which makes no sense on the face of it.
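The point that the ray-cast hardware is effect-agnostic can be sketched in code: one generic trace() primitive, with shadows, AO, and reflections differing only in how rays are generated and what happens after a hit. A toy spheres-only sketch (every name here is mine, and trace() merely stands in for the BVH traversal that real RT hardware accelerates):

```python
import math
import random

def trace(origin, direction, scene):
    """Stand-in for the hardware-accelerated ray cast (BVH traversal plus
    intersection tests). `direction` must be normalized. Returns the nearest
    hit distance, or None. Toy spheres-only scene; all names are mine."""
    best = None
    for center, radius in scene:
        oc = [origin[i] - center[i] for i in range(3)]
        b = sum(oc[i] * direction[i] for i in range(3))
        c = sum(x * x for x in oc) - radius * radius
        disc = b * b - c
        if disc >= 0:
            t = -b - math.sqrt(disc)
            if t > 1e-4 and (best is None or t < best):
                best = t
    return best

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

scene = [((0.0, 0.0, 5.0), 1.0)]  # a single sphere occluder
p = (0.0, 0.0, 0.0)               # the point being shaded

# Shadows: one ray per light, aimed at the light.
in_shadow = trace(p, normalize([0.0, 0.0, 1.0]), scene) is not None

# AO: a handful of short random rays scattered around the normal (+z here).
random.seed(0)
ao_occlusion = sum(
    trace(p, normalize([random.uniform(-1, 1), random.uniform(-1, 1), 1.0]),
          scene) is not None
    for _ in range(16)
) / 16.0

# Reflections: one ray in the mirror direction r = d - 2(d.n)n, after which
# the *hit point must itself be shaded* -- that follow-up shading work is a
# big part of why reflections cost more than shadows or AO.
d, n = [0.0, 0.0, 1.0], [0.0, 0.0, 1.0]
dn = sum(d[i] * n[i] for i in range(3))
r = normalize([d[i] - 2.0 * dn * n[i] for i in range(3)])
reflection_hit = trace(p, r, scene)

print(in_shadow)        # True: the sphere blocks the light direction
print(reflection_hit)   # None: the mirror ray points away from the sphere
```

Same primitive throughout; nothing about the ray cast itself is "shadows only."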
 

SlimySnake

Flashless at the Golden Globes
Here's a hoot. First people were insisting the PS5 only had software RT when it was unclear in the first Wired article; then, when Cerny addressed that directly and said it was hardware in the second one, it's this. How would you even make RT only applicable to shadows? It makes no sense.

And that's before I noticed the username :messenger_tears_of_joy:


lol this loser is 6 years older than he was back during the PS4 reveal days and he still hasn't grown up.

I swear at this rate, Cerny will be forced to schedule another interview with Wired showcasing reflections and ray traced GI.

And that wouldn't work either.
 

Croatoan

They/Them A-10 Warthog
What about Global Illumination? That's what I look forward to the most, as it has the largest impact on lighting, making things look like they belong in the scene.
Ray traced global illumination is my holy grail as well. I'm not sure we will see that for another 3-5 years, and probably only on PC.

I could be wrong, but I believe Control has some form of ray traced lighting (indirect lighting?) that looks okay.

Ray traced reflections are the real generational jump though. They are so damn cool.
 
Last edited:

Croatoan

They/Them A-10 Warthog
The expense of these effects is not in dispute. A hardware implementation that only does one thing is. It's like saying you have a shader that only does water: once you solve for fast shading, a developer can apply it however they see fit to budget it out, right? It's the same thing with ray casting. Some effects are more expensive than others, but Mr xmedia is saying the hardware is limited to shadows, which makes no sense on the face of it.
You are 100% correct, sorry, I misunderstood you.

The hardware won't limit what ray tracing stuff you can do, only the performance requirements will. This is what the DF guys were talking about.

Will the PS5 be able to do ray traced reflections? Yes, but likely not with realistic graphics, good effects, and 4K at anything over 10-15 fps.

You will see it in some less resource intensive games though.

For example, if I remember correctly I run Control at 4K maxed settings on a 2080 Ti with no RTX at around 70-80 fps.
Without DLSS, and with all of the RTX features on, my performance drops to about 30-40 fps.
That's a 40-50 fps drop!

With DLSS on (which limits it to 1440p) I get about 50-55 fps.

If I drop to 1080p I am back to ~70 fps.

All of that is with a $1000 card.

Now let's look at Shadow of the Tomb Raider.

At 4K maxed I get around 70 fps.
With its ray traced shadowing I get about 60-65 fps.

So that particular effect isn't so demanding.
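For what it's worth, raw fps deltas overstate things, since frame time is what the GPU actually budgets. Converting the rough numbers above into milliseconds (midpoints of the quoted ranges assumed by me) makes the gap clearer:

```python
def frame_time_ms(fps):
    """Convert frames per second to per-frame time in milliseconds."""
    return 1000.0 / fps

def rt_cost_ms(fps_off, fps_on):
    """Per-frame cost in ms of enabling an effect, given fps before/after."""
    return frame_time_ms(fps_on) - frame_time_ms(fps_off)

# Control @ 4K on a 2080 Ti, rough midpoints of the ranges quoted above
control_cost = rt_cost_ms(75, 35)   # ~15.2 ms/frame for the full RTX suite
# Shadow of the Tomb Raider @ 4K
sottr_cost = rt_cost_ms(70, 62)     # ~1.8 ms/frame for RT shadows alone
print(f"Control full RTX:  {control_cost:.1f} ms/frame")
print(f"SotTR RT shadows:  {sottr_cost:.1f} ms/frame")
```

On this framing, Control's full RTX suite costs roughly eight times the frame time of Tomb Raider's RT shadows.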


You have to remember that RTX blows chunks, and AMD might have a better setup for their ray tracing, so the drop for RT reflections might not be so severe. But even if it is only a 10 fps drop, is it going to be worth it if you are struggling to fit the rest of your game into 30 fps? Next-gen games are going to be more demanding than current games, with or without ray tracing.
 
Last edited:
Ray traced global illumination is my holy grail as well. I'm not sure we will see that for another 3-5 years, and probably only on PC.

I could be wrong, but I believe Control has some form of ray traced lighting (indirect lighting?) that looks okay.

Ray traced reflections are the real generational jump though. They are so damn cool.
Reflections are great, but I can live with Screen Space Reflections so long as GI is there. It makes a world of difference.
 

Croatoan

They/Them A-10 Warthog
Reflections are great, but I can live with Screen Space Reflections so long as GI is there. It makes a world of difference.
Has a game even implemented Ray traced Global Illumination?

UE4 cut it before the engine launched and I think just recently added it back in with version 4.22 (I haven't played with it in the engine yet to check performance).

I could be wrong, but I think we are still a console generation away from Real Time RT Global Illumination.

Agreed.

GI and AO are my favored usage.

The reflections are incredible though. The way glass and surfaces react in Control is freaking beautiful and makes everything look that much more real. They do just as much as GI to improve the overall image, IMO. Also, FPS/TPS multiplayer gameplay gets interesting with RT reflections. Imagine using glass, or a puddle, to see around a corner!

GI is much better for development though as you no longer need to bake lighting or shadows.

Ideally we would have both and on PC within 5 years I think we will.

Agree with a lot of it, but Turing GPUs use very few RT cores (even the 2080 Ti). If AMD found a way to accelerate RT operations on shader units, then RT performance on the next Xbox and PlayStation should easily surpass even the 2080 Ti's RT performance.

BTW guys, is SonGoku still with us?

If that happens I would be super happy. I think RTX sucks and I hate when Nvidia tries to monopolize tech that should be available to all.
 
Last edited:

DeepEnigma

Gold Member
Has a game even implemented Ray traced Global Illumination?

UE4 cut it before the engine launched and I think just recently added it back in with version 4.22 (I haven't played with it in the engine yet to check performance).

I could be wrong, but I think we are still a console generation away from Real Time RT Global Illumination.

Good point. I think it's only engine tech demos at the moment. Hell, even Tomb Raider looked fantastic with just shadows and AO. It somehow enhanced the look of the lighting without GI.

On a side note, you can ray trace a single light source without full-blown GI, correct?
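Roughly, yes, as far as I understand it: direct ray traced lighting is one shadow ray per light with zero bounces, while GI multiplies rays at every bounce. A toy ray-count sketch (sample counts and depths are made-up numbers, and the function names are mine):

```python
def direct_light_rays(num_lights):
    """Direct ray traced lighting: one shadow ray per light per pixel.
    No bounces -- this works fine without any GI."""
    return num_lights

def gi_rays(samples_per_bounce, depth):
    """Bounced GI: rays multiply at each bounce (a geometric series),
    which is why full GI costs so much more than direct lighting."""
    total = 0
    rays = 1
    for _ in range(depth):
        rays *= samples_per_bounce
        total += rays
    return total

print(direct_light_rays(3))    # 3 rays per pixel for 3 lights
print(gi_rays(8, 2))           # 8 + 64 = 72 rays per pixel
print(gi_rays(8, 3))           # 8 + 64 + 512 = 584 rays per pixel
```

Real renderers cut this down hard with importance sampling and denoising, but the asymmetry is the point.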

The reflections are incredible though. The way glass and surfaces react in Control is freaking beautiful and makes everything look that much more real. They do just as much as GI to improve the overall image, IMO. Also, FPS/TPS multiplayer gameplay gets interesting with RT reflections. Imagine using glass, or a puddle, to see around a corner!

GI is much better for development though as you no longer need to bake lighting or shadows.

Ideally we would have both and on PC within 5 years I think we will.

Don't get me wrong, it looks great, but if it is more taxing than just shadows and AO, I would take SSR/rasterization (which has come a long and convincing way) for now.

Save it for the cars in less taxing racing games, maybe?
 
Last edited:

Croatoan

They/Them A-10 Warthog
Good point. I think it's only engine tech demos at the moment. Hell, even Tomb Raider looked fantastic with just shadows and AO. It somehow enhanced the look of the lighting without GI.

On a side note, you can ray trace a single light source without full-blown GI, correct?



Don't get me wrong, it looks great, but if it is more taxing than just shadows and AO, I would take SSR/rasterization (which has come a long and convincing way) for now.

Save it for the cars in less taxing racing games, maybe?

Oh, I was assuming y'all wanted to leave out reflections and add in GI. I am cool with just shadows and AO as well, but would love it if these machines could do Control-like reflections for all games. That stuff is amazing to me.

Even cartoonish games would benefit from RT Reflections.

Also, AFAIK GI is WAY more resource intensive than reflections, so it wouldn't be a one-to-one performance swap either.
 
Last edited:
Has a game even implemented Ray traced Global Illumination?

UE4 cut it before the engine launched and I think just recently added it back in with version 4.22 (I haven't played with it in the engine yet to check performance).

I could be wrong, but I think we are still a console generation away from Real Time RT Global Illumination.

Minecraft is all I know of at the moment.
 

Foamy

Unconfirmed Member
Oh noes! The Japanese won't be able to use it as a coffee table!
That's what doomed the OG Xbox there!
 
Last edited by a moderator:

Imtjnotu

Member
Nah, you just hear the xbox doesn't have exclusives instead, because it's the only defense. Prior to the X and Pro, it was Xbox is less powerful.
But the difference in power doesn't seem to be a topic anymore. Guess we will have to wait and see, but it will always come down to exclusives that sell consoles, not hardware.
 

DeepEnigma

Gold Member
Uhh, where is the ethernet port?

It's just a render of the dev unit taken from the drawings. One or more of the squares in the back will be it.

I swear to god if they go wireless only I will do something very very naughty.
 
Here's a hoot. First people were insisting the PS5 only had software RT when it was unclear in the first Wired article; then, when Cerny addressed that directly and said it was hardware in the second one, it's this. How would you even make RT only applicable to shadows? It makes no sense.

And that's before I noticed the username :messenger_tears_of_joy:


Digital Foundry think RT is going to be applicable only to shadows and AO because some EA exec gave two examples of RT when talking about the PS5 (AO and shadows) and nothing else, as if she was going to cite the whole list of applications. In reality they are totally clueless about those matters.

But their audience seems to like their unique interpretation of the Cerny interviews, like Misterxmedia above. LOL.
 
Last edited:

CrustyBritches

Gold Member
Hope everything is going ok with you, my brotha, SonGoku. We miss having you in the thread, man!
---
On topic: I like that DF doesn't try to overhype next-gen hardware. I hope RT doesn't become some overblown marketing gimmick; 60 fps is preferable to me. There are some situations where RT makes sense. I think real-time reflections are a case with immediate visual impact.

I don't have an RTX card yet, so my only experience is on a 1060 6GB with the DXR fallback in the new UE4 update, so maybe guys with the real cards could chime in. Regarding DF's EA example, I remember reading a TechSpot article and benchmark of DICE's update to their BFV RT implementation for better performance:

"If you look closely at the water surface, at times an object will move across the reflected area like an AI character or a falling leaf, and for a brief moment you’ll spot the classic screen space reflection-like streak caused by that object obstructing the reflection’s path. My guess here is that DICE have chosen to more aggressively cull rays from objects not in view, which has improved performance significantly, but it’s at the cost of the occasional artifact where something that should still be in view is getting culled erroneously. It makes the reflections slightly more ugly but it’s still a significant upgrade on basic screen space reflections where this issue is much more widespread."


"First up we’re going to look at RTX 2080 Ti performance in the intensive Tirailleur map at 1080p. We’re looking at a 57% improvement to average frame rates for the Ultra DXR mode, and a 21% improvement for the Low mode. Despite a small reduction to frame rate for DXR off, this mode is still 75% faster than Ultra DXR, and 53% faster than Low DXR. Previously there was more than a 2x difference, but this performance penalty is still pretty brutal."
The original penalty for 'Low' reflections on Frostbite was a halving of the framerate. Post-update, that became the penalty for 'High/Ultra'. At least in my view, DF's assessment of EA's comment was justified given the real-world penalty for reflections on Frostbite.
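Turning the quoted percentages into relative frame rates makes that point concrete (normalizing DXR-off to 100; this is back-of-the-envelope arithmetic on the article's quoted figures, not its raw data):

```python
def relative_fps(off_is_x_faster):
    """If DXR-off is x% faster than a DXR mode, that mode runs at
    100 / (1 + x/100) percent of the DXR-off frame rate."""
    return 100.0 / (1.0 + off_is_x_faster / 100.0)

ultra = relative_fps(75)    # ~57% of DXR-off, i.e. a ~43% penalty
low = relative_fps(53)      # ~65% of DXR-off, i.e. a ~35% penalty
halved = relative_fps(100)  # the pre-patch "more than 2x" gap = under 50%
print(f"Ultra DXR: {ultra:.0f}%  Low DXR: {low:.0f}%  2x gap: {halved:.0f}%")
```

So even after the patch, Ultra DXR reflections still eat roughly 40% of the frame rate on a 2080 Ti.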
 

Long time lurker here. Just want to point out that, as far as I am aware, Arcturus is not an architecture per se; it is the codename of a chip based on our good friend Vega with all graphics silicon removed, i.e. there are no ROPs, TMUs, or a display engine.
It's been linked to a yet-to-be-released Radeon Instinct MI100, supposedly a 128 CU (8192 SP) compute monster.

Why Vega as a basis for Arcturus? Vega is a compute beast. The Instinct MI60, based on the 4096 SP Vega 20 (a cut-down version of which is used for the MI50 and Radeon VII), apparently goes toe-to-toe in compute workloads with the 815 mm², 5120 CUDA core Nvidia Tesla V100.

So I'm fairly certain Arcturus won't be used in Scarlett. It's a single chip, not an architecture, designed purely for machine intelligence workloads.

GameSpot, or whoever put "Arcturus" down as Scarlett's architecture, was talking through their backside and didn't do any due diligence.
 
Long time lurker here. Just want to point out that, as far as I am aware, Arcturus is not an architecture per se; it is the codename of a chip based on our good friend Vega with all graphics silicon removed, i.e. there are no ROPs, TMUs, or a display engine.
It's been linked to a yet-to-be-released Radeon Instinct MI100, supposedly a 128 CU (8192 SP) compute monster.

Why Vega as a basis for Arcturus? Vega is a compute beast. The Instinct MI60, based on the 4096 SP Vega 20 (a cut-down version of which is used for the MI50 and Radeon VII), apparently goes toe-to-toe in compute workloads with the 815 mm², 5120 CUDA core Nvidia Tesla V100.

So I'm fairly certain Arcturus won't be used in Scarlett. It's a single chip, not an architecture, designed purely for machine intelligence workloads.

GameSpot, or whoever put "Arcturus" down as Scarlett's architecture, was talking through their backside and didn't do any due diligence.
Figured as much.
 

CrustyBritches

Gold Member
GameSpot is a joke. iirc the whole Arcturus thing was based on the rumor that it was the next-gen, post-GCN arch, while Navi was still a GCN hybrid, so Sony was on Navi while Xbox was on Arcturus/next-gen. An AMD engineer, John Bridgman, had to clarify many months ago that Arcturus was a GPU, not an arch...




This was how it came to be associated with Xbox...


"scarletbig_dev"
From the old benchies and the reference to Xbox One, Anubis seems like a nothing-burger as well. I think Arden and Argalus were Anaconda and Lockhart, and now only Arden remains. I'd guess the reason we see benchmarking for Gonzalo/Flute/Oberon, but not Arden/Argalus, is that MS uses the PIX profiling tool (at least they did for Xbox Scorpio), while Sony/AMD is benching on 3DMark and UserBenchmark. :pie_thinking:
 
Last edited: