
DF: Chernobylite Ray Tracing Analysis: Gorgeous on PC, but what about PS5?

winjer

Gold Member
Shader compilation deserves a painful death.

Hopefully new engines are not gonna suffer from this shit.

This week I started playing Detroit: Become Human on PC. The first thing the game does on startup is compile all of its shaders.
It took a couple of minutes, but after that the game ran smooth as butter.
There are already solutions to this problem; devs just have to implement them.
And as far as I know, UE has these tools. For example, Borderlands 3 in DX12 mode also compiled all shaders at first startup.
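Conceptually it's nothing fancy. Here is a rough sketch of what that first-boot pass looks like (the helper functions are made-up stand-ins, not any real engine API):

```cpp
// Toy sketch only: the three helpers below are fake stand-ins for whatever the
// engine and shader compiler really expose; they just make the example self-contained.
#include <cstdint>
#include <filesystem>
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

namespace fs = std::filesystem;

// Every shader variant the game can ever ask for (hypothetical).
std::vector<std::string> ListShaderPermutations() { return {"opaque_pbr", "skinned_pbr", "water"}; }
// The slow, driver-dependent step (hypothetical; here it just returns a dummy blob).
std::vector<std::uint8_t> CompileShader(const std::string&) { return std::vector<std::uint8_t>(1024, 0); }
// The "compiling shaders..." progress bar.
void ShowProgress(std::size_t done, std::size_t total) { std::cout << "Compiling shaders " << done << "/" << total << "\n"; }

void PrecompileOnFirstBoot(const fs::path& cacheDir)
{
    fs::create_directories(cacheDir);
    const auto permutations = ListShaderPermutations();

    std::size_t done = 0;
    for (const auto& name : permutations)
    {
        const fs::path cacheFile = cacheDir / (name + ".bin");
        if (!fs::exists(cacheFile))   // only pay the compile cost once
        {
            const auto blob = CompileShader(name);
            std::ofstream(cacheFile, std::ios::binary)
                .write(reinterpret_cast<const char*>(blob.data()), static_cast<std::streamsize>(blob.size()));
        }
        ShowProgress(++done, permutations.size());
    }
}

int main() { PrecompileOnFirstBoot("shader_cache"); }
```

After that, every later launch just finds the cached blobs already on disk and skips straight past the compile step.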
 

GymWolf

Member
This week I started playing Detroit: Become Human on PC. The first thing the game does on startup is compile all of its shaders.
It took a couple of minutes, but after that the game ran smooth as butter.
There are already solutions to this problem; devs just have to implement them.
And as far as I know, UE has these tools. For example, Borderlands 3 in DX12 mode also compiled all shaders at first startup.
Yeah, I vastly prefer doing shader compilation before the game starts.

Tiny Tina's Wonderlands does that as well.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Shader compilation deserves a painful death.

Hopefully new engines are not gonna suffer from this shit.
Unless a new API shows up, shader compilation will still be a thing.
It's just about games/developers being clever about how and when they actually do the shader compilation.

There are too many hardware configurations out there for current engines/APIs to ship precompiled shaders.

We've had examples of games that do a compile run before the game even actually starts.
I'm curious how/when Gears 5 does its compilation, because that game has almost no stuttering, and it's a UE4 game, an engine basically every game since has suffered stuttering with.
 

yamaci17

Member
That's because the 750 Ti was not better. Edit: well, it was better than the Xbox One, but not as good as the PS4. The 2 GB of VRAM hurt it as well.

The 3060 will always be ahead. Even when driver support winds down, it will still have RT cores and DLSS.
Indeed, the 750 Ti was never better. It was a 1.3 TFLOPS Maxwell GPU, a far cry from the PS4's 1.8 TFLOPS.

It performed better than the PS4 in some titles because of the CPU difference. Let's take The Witcher 3 for example: on PS4, you have to set the game to a 30 FPS limit because you cannot realistically get a locked 60 with Jaguar cores. I'm "pretty" sure that the PS4's GPU is more than capable of 30 frames in The Witcher 3 at 1080p. The 750 Ti had the privilege of running those early-gen games at unlocked framerates alongside much better CPUs, so naturally it "seemed" to perform better.

This is like saying a GTX 1070 is better than a PS5. Is it? Absolutely not. Can you run RDR2 at 1080p/60 FPS with a GTX 1060 on PC? Yes. Why can't you do that on PS5? Because Rockstar is to blame for not providing a simple next-gen patch for two years now.

In that era, all 2 GB GPUs suffered heavily, which makes the comparison misleading. Their raw performance can still match or trail just behind the PS4, but their VRAM buffer is not up to the task: a 2 GB buffer versus the PS4's 3.5-4 GB buffer (it had 8 GB in total, but only 4 GB was usable for GPU operations and 2 GB for CPU operations).

Take a look at this picture; it is very informative. You have the 2 GB 960, which has a limited 2 GB buffer, and a 4 GB variant with the exact same GPU core.

One renders 14 frames and the other renders 32 frames. That's well over a 2x difference. This is what a LACK of buffer looks like.

nOlHRs3.png


Now you might ask, what does this have to do with our discussion? People like these keep yapping about how "those GPUs cannot run games these days!!" Well, that's the problem: almost all of the PS4-equivalent GPUs from that time have 2 GB, and all of them suffer big performance losses due to the limited VRAM buffer. If some of them had a 4 GB buffer, you would see them perform very close to the actual PS4 hardware.

HD 7850? 2 GB.
GTX 760? 2 GB.
HD 7870? 2 GB.

The PS4 sits somewhere between an HD 7850 (16 CUs) and an HD 7870 (20 CUs). At its core, the PS4 has a 7870-class chip (20 CUs) with two CUs disabled, leaving 18 CUs clocked at 800 MHz (18 CUs × 64 ALUs × 2 ops × 0.8 GHz ≈ 1.84 TFLOPS). So it is not a complete replica of the HD 7850, and it's not a 7870 either; it just sits in between. And what do we have in between? Lots of 2 GB cards.

R7 265? 2 GB.
R7 270? 2 GB.
GTX 660? 2 GB.

Take into account that the PS4 has higher bandwidth than all of these GPUs. Most of them have bandwidth between 40-112 GB/s, whereas the PS4 has 140-150 GB/s available for GPU operations. So I don't know exactly where it sits in this comparison; it may even outpace an HD 7870 in actual scenarios thanks to being fed better by the higher memory bandwidth.

SJWiGgQ.png



Let's see how the RX 460 performs, then.

In RDR2, in one of the heaviest locations in the game at 1080p, with ultra textures and optimized settings, it gets a nice 35 FPS average. Can this GPU really match the PS4? Absolutely. Can it perform a bit better? Yeah.



Can consoles get better performance? Absolutely. With close-to-the-metal APIs, they are bound to get better performance. We can see that the 460 barely manages 900p / PS4-matched settings / 30 FPS in Horizon Zero Dawn.



Does it depend on the dev? Yeah. Is it still playable? Also yeah. Is it "smoking" the PS4? No. Since the settings are matched, it seems the PS4 performs 20-25% above a 460 in this title, which is pretty respectable and admirable work by the devs. Then again, the RX 460 only has 112 GB/s of bandwidth compared to the 224 GB/s total the PS4 has (140-150 GB/s usable for GPU operations). So there's still some room for more performance on the 460, but at this point I've proven my point.

In the case of the 3060 and 3080, this will never happen. Consoles have about a 10 GB buffer for GPU operations; the 3060 has more than plenty and the 3080 will get by. 8 GB GPUs will have to make cutbacks, and 4-6 GB GPUs will pretty much get whacked, just like the 2 GB GPUs of yore.

As for bandwidth, the 2070, 2080 and co. all have the full 448 GB/s; they're always nicely fed.
 

winjer

Gold Member
Unless a new API shows up, shader compilation will still be a thing.
It's just about games/developers being clever about how and when they actually do the shader compilation.

There are too many hardware configurations out there for current engines/APIs to ship precompiled shaders.

We've had examples of games that do a compile run before the game even actually starts.
I'm curious how/when Gears 5 does its compilation, because that game has almost no stuttering, and it's a UE4 game, an engine basically every game since has suffered stuttering with.

On Linux, DXVK's state cache already offers a great solution to shader compilation.
Users have built a shared library of compiled shader caches, so when someone starts a game it can check whether someone else has already compiled those shaders and download them.
The results are very impressive. Maybe AMD and Nvidia could do something similar with their drivers.
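The whole scheme really boils down to keying the cache on game + GPU + driver and checking a shared store before compiling anything yourself. Very rough sketch; the helper functions and the repository URL are invented for illustration, this is not DXVK's actual code:

```cpp
// Rough illustration of a shared shader-cache lookup, in the spirit of the community
// DXVK cache repos. FetchFromCommunityRepo and CompileLocally are fake stand-ins and
// the URL is invented.
#include <filesystem>
#include <fstream>
#include <iostream>
#include <iterator>
#include <optional>
#include <string>
#include <vector>

namespace fs = std::filesystem;
using Blob = std::vector<char>;

// Would be an HTTP GET against a shared repository in reality.
std::optional<Blob> FetchFromCommunityRepo(const std::string& key)
{
    std::cout << "GET https://shader-cache.example.org/" << key << "\n";
    return std::nullopt;   // pretend nobody has uploaded this combination yet
}
// The slow fallback path: compile everything on this machine.
Blob CompileLocally() { return Blob(4096, 0); }

Blob GetStateCache(const std::string& game, const std::string& gpu, const std::string& driver)
{
    const std::string key = game + "-" + gpu + "-" + driver + ".dxvk-cache";
    const fs::path local = fs::path("cache") / key;

    if (fs::exists(local))   // 1. this machine already has it
    {
        std::ifstream in(local, std::ios::binary);
        return Blob(std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>());
    }
    if (auto blob = FetchFromCommunityRepo(key))   // 2. someone else already compiled it
    {
        fs::create_directories(local.parent_path());
        std::ofstream(local, std::ios::binary).write(blob->data(), static_cast<std::streamsize>(blob->size()));
        return *blob;
    }
    Blob blob = CompileLocally();   // 3. last resort: compile here and keep it for next time
    fs::create_directories(local.parent_path());
    std::ofstream(local, std::ios::binary).write(blob.data(), static_cast<std::streamsize>(blob.size()));
    return blob;
}

int main() { GetStateCache("chernobylite", "rtx3080", "nv-512.95"); }
```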

 
I mean, you can make anything sound bad if you're exaggerating everything like that. You're neither spending 2000 euros nor playing under 60 FPS with a 3080. Dying Light 2 is a game with one of the most extreme differences between RT on and off. There's no reality where you don't notice:




You're again trying to diminish this aspect by saying you don't notice when you're killing and slashing enemies. Of course you notice. You are looking at the game's visuals the entire time you are playing. The entire time you spend in the game, no matter what you do, is spent looking at it. The idea that one just doesn't have "time" to notice RT in games is just coping when you can't run it well. A 2080 is not adequate for RT, so it's fair that you prefer higher framerates instead. No reason to convince yourself that it's not worth it, though, because it totally is.

Once you play enough games with ray tracing and you learn how the different effects look, you will ALWAYS notice it in every other game. Once your eyes adjust to RT shadows, lighting, reflections, etc., regular games' deficiencies will stand out more and more, and you will start seeing how wrong certain scenes look and how incorrectly they're lit. I kid you not, you go into a game like The Last of Us Part II with its baked lighting and you will scratch your head at how people think it looks good. You notice poor and incorrect lighting in almost every scene. You don't do it on purpose; it's just your brain slowly adjusting to RT and how scenes SHOULD look, so it starts identifying all the issues in normal games.


Here's a simple fact: Horizon Forbidden West, hell, even Zero Dawn, looks miles better than Dying Light 2 ever will, and those games have no RT. The magic of actual artistic talent. Dying Light 2 is a shit example because of how poorly the devs designed the game; the first game looks miles better. Imagine saying that preferring a smooth experience over better shadows is coping. Lmfao, where do you posters come from? I have nothing against RT, but as of now it's not viable at all, even with DLSS. Maybe in the next 4 years. Enjoy your better shadows.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
On Linux, DXVK's state cache already offers a great solution to shader compilation.
Users have built a shared library of compiled shader caches, so when someone starts a game it can check whether someone else has already compiled those shaders and download them.
The results are very impressive. Maybe AMD and Nvidia could do something similar with their drivers.


Yeah, it's a "common" solution for emulators too: downloading precompiled shaders.
But that works "well" because it's basically a virtual machine that's using those shaders.
If Nvidia were to release shader packs or something, their drivers would probably be hundreds of gigs each... and if they relied on, say, download packs through GeForce Experience, it would likely be faster to just do a compile run when the game boots up. I don't have the best internet; I can all but guarantee my PC could compile the shaders faster than I could download them.
Which, realistically, is what devs should implement: compile on first boot. When the game is showing us those splash screens, start compiling.
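Something along these lines, toy example only (the two helpers are fake and the sleeps just stand in for the real work):

```cpp
// Toy sketch: start compiling while the splash screens / intro logos are on screen,
// and only block on the result if the player reaches the menu before it's done.
// CompileAllShaders and PlaySplashScreens are made-up stand-ins.
#include <chrono>
#include <future>
#include <iostream>
#include <thread>

void CompileAllShaders()
{
    std::this_thread::sleep_for(std::chrono::seconds(3));   // pretend this is the expensive part
    std::cout << "shaders ready\n";
}

void PlaySplashScreens()
{
    std::this_thread::sleep_for(std::chrono::seconds(2));   // publisher logos, engine logo, etc.
    std::cout << "splash screens done\n";
}

int main()
{
    // Fire the compile off on another thread the moment the game boots...
    std::future<void> shaders = std::async(std::launch::async, CompileAllShaders);

    // ...and hide the cost behind the stuff the player has to sit through anyway.
    PlaySplashScreens();

    shaders.wait();   // worst case: a short extra wait before the main menu
    std::cout << "main menu\n";
}
```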

DXVK doesn't support DX12 titles yet, does it?
 

winjer

Gold Member
Yeah, it's a "common" solution for emulators too: downloading precompiled shaders.
But that works "well" because it's basically a virtual machine that's using those shaders.
If Nvidia were to release shader packs or something, their drivers would probably be hundreds of gigs each... and if they relied on, say, download packs through GeForce Experience, it would likely be faster to just do a compile run when the game boots up. I don't have the best internet; I can all but guarantee my PC could compile the shaders faster than I could download them.
Which, realistically, is what devs should implement: compile on first boot. When the game is showing us those splash screens, start compiling.

DXVK doesn't support DX12 titles yet, does it?

Nvidia and AMD drivers don't need to ship all the shaders for all the games.
They just need an online library of compiled shaders. When we start a game, the driver would download the shaders corresponding to that game, GPU and driver version.
And mind you, most of these shader caches are only megabytes in size, so it's not that heavy on our internet connection.

I think DXVK still only supports DX9, 10 and 11.
But there is VKD3D-Proton on Linux, which supports DX12.
 

Md Ray

Member
Indeed, the 750 Ti was never better. It was a 1.3 TFLOPS Maxwell GPU, a far cry from the PS4's 1.8 TFLOPS.

It performed better than the PS4 in some titles because of the CPU difference. Let's take The Witcher 3 for example: on PS4, you have to set the game to a 30 FPS limit because you cannot realistically get a locked 60 with Jaguar cores. I'm "pretty" sure that the PS4's GPU is more than capable of 30 frames in The Witcher 3 at 1080p. The 750 Ti had the privilege of running those early-gen games at unlocked framerates alongside much better CPUs, so naturally it "seemed" to perform better.

This is like saying a GTX 1070 is better than a PS5. Is it? Absolutely not. Can you run RDR2 at 1080p/60 FPS with a GTX 1060 on PC? Yes. Why can't you do that on PS5? Because Rockstar is to blame for not providing a simple next-gen patch for two years now.

In that era, all 2 GB GPUs suffered heavily, which makes the comparison misleading. Their raw performance can still match or trail just behind the PS4, but their VRAM buffer is not up to the task: a 2 GB buffer versus the PS4's 3.5-4 GB buffer (it had 8 GB in total, but only 4 GB was usable for GPU operations and 2 GB for CPU operations).

Take a look at this picture; it is very informative. You have the 2 GB 960, which has a limited 2 GB buffer, and a 4 GB variant with the exact same GPU core.

One renders 14 frames and the other renders 32 frames. That's well over a 2x difference. This is what a LACK of buffer looks like.

nOlHRs3.png


Now you might ask, what does this have to do with our discussion? People like these keep yapping about how "those GPUs cannot run games these days!!" Well, that's the problem: almost all of the PS4-equivalent GPUs from that time have 2 GB, and all of them suffer big performance losses due to the limited VRAM buffer. If some of them had a 4 GB buffer, you would see them perform very close to the actual PS4 hardware.

HD 7850? 2 GB.
GTX 760? 2 GB.
HD 7870? 2 GB.

The PS4 sits somewhere between an HD 7850 (16 CUs) and an HD 7870 (20 CUs). At its core, the PS4 has a 7870-class chip (20 CUs) with two CUs disabled, leaving 18 CUs clocked at 800 MHz (18 CUs × 64 ALUs × 2 ops × 0.8 GHz ≈ 1.84 TFLOPS). So it is not a complete replica of the HD 7850, and it's not a 7870 either; it just sits in between. And what do we have in between? Lots of 2 GB cards.

R7 265? 2 GB.
R7 270? 2 GB.
GTX 660? 2 GB.

Take into account that the PS4 has higher bandwidth than all of these GPUs. Most of them have bandwidth between 40-112 GB/s, whereas the PS4 has 140-150 GB/s available for GPU operations. So I don't know exactly where it sits in this comparison; it may even outpace an HD 7870 in actual scenarios thanks to being fed better by the higher memory bandwidth.

SJWiGgQ.png



Let's see how the RX 460 performs, then.

In RDR2, in one of the heaviest locations in the game at 1080p, with ultra textures and optimized settings, it gets a nice 35 FPS average. Can this GPU really match the PS4? Absolutely. Can it perform a bit better? Yeah.



Can consoles get better performance? Absolutely. With close-to-the-metal APIs, they are bound to get better performance. We can see that the 460 barely manages 900p / PS4-matched settings / 30 FPS in Horizon Zero Dawn.



Does it depend on the dev? Yeah. Is it still playable? Also yeah. Is it "smoking" the PS4? No. Since the settings are matched, it seems the PS4 performs 20-25% above a 460 in this title, which is pretty respectable and admirable work by the devs. Then again, the RX 460 only has 112 GB/s of bandwidth compared to the 224 GB/s total the PS4 has (140-150 GB/s usable for GPU operations). So there's still some room for more performance on the 460, but at this point I've proven my point.

In the case of the 3060 and 3080, this will never happen. Consoles have about a 10 GB buffer for GPU operations; the 3060 has more than plenty and the 3080 will get by. 8 GB GPUs will have to make cutbacks, and 4-6 GB GPUs will pretty much get whacked, just like the 2 GB GPUs of yore.

As for bandwidth, the 2070, 2080 and co. all have the full 448 GB/s; they're always nicely fed.

This is incorrect. Not even the PS4 Professional has that much total memory bandwidth; the PS4 Amateur's total memory BW is 176 GB/s. Also, it's not just a 7870 chipset with 2 CUs disabled. There are further customizations specific to the PS4 GPU, like the async compute engines: a total of 8 ACEs on the PS4 compared to 2 ACEs in the 7000 series lineup, which might contribute to better performance in titles like Horizon Zero Dawn when compared to a similarly specced PC GPU.
 

yamaci17

Member
This is incorrect. Not even the PS4 Professional has that much total memory bandwidth; the PS4 Amateur's total memory BW is 176 GB/s. Also, it's not just a 7870 chipset with 2 CUs disabled. There are further customizations specific to the PS4 GPU, like the async compute engines: a total of 8 ACEs on the PS4 compared to 2 ACEs in the 7000 series lineup.

Thanks for further supporting my point, bro XD. I didn't know that. Good on PS4.

I forgot the total memory BW. It can happen, sorry dude.


In the end, it has nothing to do with the 750 Ti. The PS4 can even supersede HD 7870 performance with the things you mentioned. It's just that we don't have a 4 GB 7870 or 7850 that we can realistically compare against; a 2 GB buffer simply limits too much performance. I had an R7 265: when it had enough buffer, it could mostly match the PS4. Whenever that limit was breached, it was impossible to claw back performance.
 

winjer

Gold Member
This is incorrect. Not even the PS4 Professional has that much total memory bandwidth; the PS4 Amateur's total memory BW is 176 GB/s. Also, it's not just a 7870 chipset with 2 CUs disabled. There are further customizations specific to the PS4 GPU, like the async compute engines: a total of 8 ACEs on the PS4 compared to 2 ACEs in the 7000 series lineup, which might contribute to better performance in titles like Horizon Zero Dawn when compared to a similarly specced PC GPU.

Also, the GPUs on the PS4 and Xbox One were GCN 1.1, while the 7850/70 were GCN 1.0.
GCN 1.0 is only compatible with DX12 FL11, but GCN 1.1 is DX12 FL12 compliant.
This is why earlier GCN cards have problems running some modern DX12 games.

A closer comparison to the GPU on the PS4 and Xbox One would be the 7770/90, as they are GCN 1.1.
 
Indeed, the 750 Ti was never better. It was a 1.3 TFLOPS Maxwell GPU, a far cry from the PS4's 1.8 TFLOPS.

It performed better than the PS4 in some titles because of the CPU difference. Let's take The Witcher 3 for example: on PS4, you have to set the game to a 30 FPS limit because you cannot realistically get a locked 60 with Jaguar cores. I'm "pretty" sure that the PS4's GPU is more than capable of 30 frames in The Witcher 3 at 1080p. The 750 Ti had the privilege of running those early-gen games at unlocked framerates alongside much better CPUs, so naturally it "seemed" to perform better.

This is like saying a GTX 1070 is better than a PS5. Is it? Absolutely not. Can you run RDR2 at 1080p/60 FPS with a GTX 1060 on PC? Yes. Why can't you do that on PS5? Because Rockstar is to blame for not providing a simple next-gen patch for two years now.

In that era, all 2 GB GPUs suffered heavily, which makes the comparison misleading. Their raw performance can still match or trail just behind the PS4, but their VRAM buffer is not up to the task: a 2 GB buffer versus the PS4's 3.5-4 GB buffer (it had 8 GB in total, but only 4 GB was usable for GPU operations and 2 GB for CPU operations).

Take a look at this picture; it is very informative. You have the 2 GB 960, which has a limited 2 GB buffer, and a 4 GB variant with the exact same GPU core.

One renders 14 frames and the other renders 32 frames. That's well over a 2x difference. This is what a LACK of buffer looks like.

nOlHRs3.png


Now you might ask, what does this have to do with our discussion? People like these keep yapping about how "those GPUs cannot run games these days!!" Well, that's the problem: almost all of the PS4-equivalent GPUs from that time have 2 GB, and all of them suffer big performance losses due to the limited VRAM buffer. If some of them had a 4 GB buffer, you would see them perform very close to the actual PS4 hardware.

HD 7850? 2 GB.
GTX 760? 2 GB.
HD 7870? 2 GB.

The PS4 sits somewhere between an HD 7850 (16 CUs) and an HD 7870 (20 CUs). At its core, the PS4 has a 7870-class chip (20 CUs) with two CUs disabled, leaving 18 CUs clocked at 800 MHz (18 CUs × 64 ALUs × 2 ops × 0.8 GHz ≈ 1.84 TFLOPS). So it is not a complete replica of the HD 7850, and it's not a 7870 either; it just sits in between. And what do we have in between? Lots of 2 GB cards.

R7 265? 2 GB.
R7 270? 2 GB.
GTX 660? 2 GB.

Take into account that the PS4 has higher bandwidth than all of these GPUs. Most of them have bandwidth between 40-112 GB/s, whereas the PS4 has 140-150 GB/s available for GPU operations. So I don't know exactly where it sits in this comparison; it may even outpace an HD 7870 in actual scenarios thanks to being fed better by the higher memory bandwidth.

SJWiGgQ.png



Let's see how the RX 460 performs, then.

In RDR2, in one of the heaviest locations in the game at 1080p, with ultra textures and optimized settings, it gets a nice 35 FPS average. Can this GPU really match the PS4? Absolutely. Can it perform a bit better? Yeah.



Can consoles get better performance? Absolutely. With close-to-the-metal APIs, they are bound to get better performance. We can see that the 460 barely manages 900p / PS4-matched settings / 30 FPS in Horizon Zero Dawn.



Does it depend on the dev? Yeah. Is it still playable? Also yeah. Is it "smoking" the PS4? No. Since the settings are matched, it seems the PS4 performs 20-25% above a 460 in this title, which is pretty respectable and admirable work by the devs. Then again, the RX 460 only has 112 GB/s of bandwidth compared to the 224 GB/s total the PS4 has (140-150 GB/s usable for GPU operations). So there's still some room for more performance on the 460, but at this point I've proven my point.

In the case of the 3060 and 3080, this will never happen. Consoles have about a 10 GB buffer for GPU operations; the 3060 has more than plenty and the 3080 will get by. 8 GB GPUs will have to make cutbacks, and 4-6 GB GPUs will pretty much get whacked, just like the 2 GB GPUs of yore.

As for bandwidth, the 2070, 2080 and co. all have the full 448 GB/s; they're always nicely fed.

Yes, and this is why I was shitting on the 3060 Ti/3070 for only having 8 GB.

The 3080... doesn't have enough VRAM for its own capabilities, but it's probably OK for matching PS5 settings the whole generation, and it should always be faster.
 
This week I started playing Detroit: Become Human on PC. The first thing the game does on startup is compile all of its shaders.
It took a couple of minutes, but after that the game ran smooth as butter.
There are already solutions to this problem; devs just have to implement them.
And as far as I know, UE has these tools. For example, Borderlands 3 in DX12 mode also compiled all shaders at first startup.

Shader compilation on PC is, unfortunately, the one thing that makes the consoles superior, as the games just run without requiring it.

I remember playing Horizon Zero Dawn on PC and getting really annoyed at the lengthy shader compilation delays (at one point it was 15 minutes, from memory) during loading, especially as these are usually retriggered when you update your graphics drivers (which with NVIDIA is typically every 3 to 4 weeks). And if games don't pre-compile their shaders during loading, then you end up with bad stuttering, as happens in a lot of Unreal Engine 4 games these days. In fact, the shader compilation stutter issue is one of the reasons I no longer use my PC as my main gaming platform and instead prefer to play games on my PS5 and Xbox Series X.
 

DaGwaphics

Member
That's because the 750 Ti was not better. Edit: well, it was better than the Xbox One, but not as good as the PS4. The 2 GB of VRAM hurt it as well.

The 3060 will always be ahead. Even when driver support winds down, it will still have RT cores and DLSS.

You have to figure that every facet of the current-gen consoles will be maximized eventually. The 360GB/s bandwidth might be a new bottleneck for the 3060 down the line.
 

rodrigolfp

Haptic Gamepads 4 Life
Shader compilation on PC is, unfortunately, the one thing that makes the consoles superior, as the games just run without requiring it.

I remember playing Horizon Zero Dawn on PC and getting really annoyed at the lengthy shader compilation delays (at one point it was 15 minutes, from memory) during loading, especially as these are usually retriggered when you update your graphics drivers (which with NVIDIA is typically every 3 to 4 weeks). And if games don't pre-compile their shaders during loading, then you end up with bad stuttering, as happens in a lot of Unreal Engine 4 games these days. In fact, the shader compilation stutter issue is one of the reasons I no longer use my PC as my main gaming platform and instead prefer to play games on my PS5 and Xbox Series X.
They patched the compilation time. If the devs don't fuck it up, compiling shaders beforehand is not really a big problem, as R6 Siege gets right.
 
You have to figure that every facet of the current-gen consoles will be maximized eventually. The 360GB/s bandwidth might be a new bottleneck for the 3060 down the line.
It'll be fine. Again, the PS5 is much more limited in terms of RT, and DLSS is there to pick up the slack wherever the 3060 hits driver issues or anything else.

I can run Metro Exodus Enhanced Edition at 120 FPS with better RT than the PS5... says it all, really.

The PS5 does not have the full 448 GB/s for its GPU, so the raw numbers vs. the 3060 are close, and color compression on Ampere is also a factor.

The only real issue, and something to look out for, is a lack of driver optimizations down the line; see Nvidia Kepler's Doom Eternal performance. But Kepler was also really, really limited in compute performance, so even that does not apply to the 3060, which has MORE compute than the PS5 and Series X.
 
No, they didn't. Vulkan also has shader compilation. Devs just need to do the compilation before the game starts.
Why aren't they doing it, and how hasn't it been figured out already? I have a much better experience with Vulkan games.

Man this is the .. what .. 8th thread you've gone in and blamed MS for something unrelated :messenger_tears_of_joy:

You've got a serious grudge against them ..
You see that ignore user button? Be my guest.

Do you even play on PC? DX12 so far has brought nothing but headaches.
 

adamsapple

Or is it just one of Phil's balls in my throat?
You see that ignore user button? Be my guest.

Do you even play on PC? DX12 so far has brought nothing but headaches.


Yes, I play on PC but my PC isn't anywhere near DF quality so to me just getting 1080p/60 on modern games is victory enough :D

But that's beside the point; this isn't somehow MS's fault, no matter how much you want to try and make it so.
 
Yes, I play on PC but my PC isn't anywhere near DF quality so to me just getting 1080p/60 on modern games is victory enough :D

But that's beside the point; this isn't somehow MS's fault, no matter how much you want to try and make it so.
How is it not their fault if DX12 doesn't solve these issues? This has been happening for years now.

Some devs already have. Some don't. Ask them why they didn't figure it out.
Clearly devs can't be trusted with these things; too many fuck-ups already.
 

winjer

Gold Member
Yes, I play on PC but my PC isn't anywhere near DF quality so to me just getting 1080p/60 on modern games is victory enough :D

But that's beside the point; this isn't somehow MS's fault, no matter how much you want to try and make it so.

The Khronos Group is taking measures to improve shader compilation under Vulkan.
Microsoft is doing little to nothing.

Example:

 

adamsapple

Or is it just one of Phil's balls in my throat?
The Khronos Group is taking measures to improve shader compilation under Vulkan.
Microsoft is doing little to nothing.

Example:



Isn't it just a matter of precompiling shaders? Yes, it'll lead to a longer load time at start, but it should fix all the stutters.

If I'm not mistaken.
 

winjer

Gold Member
Isn't it just a matter of precompiling shaders? Yes, it'll lead to a longer load time at start, but it should fix all the stutters.

If I'm not mistaken.

Instead of depending on game engines and dev optimization, it does it at the API level.
All the devs have to do is call on these extensions.
This is a great step for Vulkan games. If MS doesn't stop being so lazy, soon enough Vulkan games will run much better and smoother than DX12 games.
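For what it's worth, base Vulkan has had a driver-managed pipeline cache you can persist to disk for years; the extensions in question go further than this, but even the plain VkPipelineCache round-trip already looks roughly like this (device setup, error checking and the actual pipeline creation calls omitted):

```cpp
// Minimal sketch of Vulkan's driver-level pipeline cache (the long-standing part of
// the API; the newer extensions go beyond this). Device creation, error checking
// and the vkCreateGraphicsPipelines calls themselves are omitted for brevity.
#include <vulkan/vulkan.h>
#include <fstream>
#include <iterator>
#include <vector>

// Load last run's cache blob (if any) and hand it back to the driver.
VkPipelineCache LoadPipelineCache(VkDevice device, const char* path)
{
    std::vector<char> blob;
    if (std::ifstream in{path, std::ios::binary}) {
        blob.assign(std::istreambuf_iterator<char>(in), std::istreambuf_iterator<char>());
    }

    VkPipelineCacheCreateInfo info{};
    info.sType           = VK_STRUCTURE_TYPE_PIPELINE_CACHE_CREATE_INFO;
    info.initialDataSize = blob.size();                     // zero on first run -> empty cache
    info.pInitialData    = blob.empty() ? nullptr : blob.data();

    VkPipelineCache cache = VK_NULL_HANDLE;
    vkCreatePipelineCache(device, &info, nullptr, &cache);
    return cache;   // pass this handle to every vkCreateGraphicsPipelines call
}

// After building pipelines, serialise whatever the driver accumulated back to disk.
void SavePipelineCache(VkDevice device, VkPipelineCache cache, const char* path)
{
    size_t size = 0;
    vkGetPipelineCacheData(device, cache, &size, nullptr);

    std::vector<char> blob(size);
    vkGetPipelineCacheData(device, cache, &size, blob.data());

    std::ofstream(path, std::ios::binary).write(blob.data(), static_cast<std::streamsize>(blob.size()));
    vkDestroyPipelineCache(device, cache, nullptr);
}
```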
 

01011001

Banned
Do you even play on PC? DX12 so far has brought nothing but headaches.

That's because most developers suck... it's as easy as that.

The reliance on generic engines like UE4 and Unity has made many developers simply not care about optimisation. They basically expect the engine to do everything for them and use shaders all willy-nilly, not even thinking for a moment about what that means for performance.

There are DX11 games that have the same issues; it's just not as prevalent with DX11 because DX11 was more restrictive.

The new DX12 possibilities also mean that devs need to actually know what they are doing.
 
That's because most developers suck... it's as easy as that.

The reliance on generic engines like UE4 and Unity has made many developers simply not care about optimisation. They basically expect the engine to do everything for them and use shaders all willy-nilly, not even thinking for a moment about what that means for performance.

There are DX11 games that have the same issues; it's just not as prevalent with DX11 because DX11 was more restrictive.

The new DX12 possibilities also mean that devs need to actually know what they are doing.
I get it, but MS and whoever is in charge of Vulkan have to work with the devs they have and develop solutions so that we don't keep having these problems. How is it possible that people came up with better solutions for some of these issues on Linux first?
 

01011001

Banned
I get it, but MS or whoever is in charge of Vulkan has to work with the devs they have and develop solutions so that we don't keep having these problems.

There are no solutions other than developers actually being conscious about the shader permutations they use and compiling them at times when it doesn't affect gameplay.

And it's not looking good when even Epic Games has worse shader stutter in Fortnite's DX11 mode than Gears 5 has in DX12.

That just shows you that even Epic Games, a gigantic company and the creators of Unreal Engine, doesn't give a shit.
 
There are no solutions other than developers actually being conscious about the shader permutations they use and compiling them at times when it doesn't affect gameplay.

And it's not looking good when even Epic Games has worse shader stutter in Fortnite's DX11 mode than Gears 5 has in DX12.

That just shows you that even Epic Games, a gigantic company and the creators of Unreal Engine, doesn't give a shit.
How about sharing compiled shaders? That was the fix for when Cemu had issues compiling shaders in real time.
 

VFXVeteran

Banned
- Whole suite of RT effects (on PC).
- PC version has an exclusive "virtual tourism" mode to just explore.
- High texture quality, and all vegetation is captured in RT reflections.
- RT translucency, caustics and GI are praised.

- PC version with RT on Low incurs a 34% performance loss.
- High and Ultra refine the effects, with similarly big performance losses at each tier.
- Alex mentions his recommended PC settings during the course of the video.

- PS5 has a Performance mode (1080p/60, no RT) and a Quality mode (1512p/30 FPS with RT).
- PS5 makes cuts to RT GI, caustics and translucency.
- RT reflections are included, but quality is below the Low preset on PC, with more aggressive culling and a shorter draw distance.

- The video ends with Alex talking about the dreaded shader compilation stutter on PC that happens every time something happens for the first time.
As expected..
 

VFXVeteran

Banned

Yep, it's fucking ridiculous, some of these comparison videos. Let's start with a graphics card that costs 4 times more than the console, then go into detail about how many sacrifices the console has made to reach its ray tracing target.

Like, no shit, Sherlock.


If they are going to make comparisons, stick to console vs. console and treat PC as its own platform.

God damn Digital Foundry, you have one job, do it correctly.
PC gets compared with consoles (mainly PlayStation) all the time when it concerns exclusives, so why not games that are on all platforms? I don't understand why you guys would complain about comparing the PC vs. the PS. What Alex is doing is showing how the game can be seen in its best light compared to the consoles. Nothing wrong with that.
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
As expected..
Almost makes you think the whole "RTX on consoles" was just a marketing stunt.

Not that I like the thing on PC too much either. Personally, the gains in visual quality are not worth the performance cost, but that's just my opinion.

Things like DLSS are much cooler and more interesting to me. Now if that could come to consoles, that would be the shit.
 

DeepEnigma

Gold Member
Almost makes you think the whole "RTX on consoles" was just a marketing stunt.
It's not a marketing stunt. It works, and some developers are better than others at it (see: Insomniac, which has improved it even more in recent patches).

It's just the first time we are seeing it, and it should improve over the generations. Still not bad at all for $400/$500 boxes, in comparison.
 

VFXVeteran

Banned
I mean, you can make anything sound bad if you're exaggerating everything like that. You're neither spending 2000 euros nor playing under 60 FPS with a 3080. Dying Light 2 is a game with one of the most extreme differences between RT on and off. There's no reality where you don't notice:




You're again trying to diminish this aspect by saying you don't notice when you're killing and slashing enemies. Of course you notice. You are looking at the game's visuals the entire time you are playing. The entire time you spend in the game, no matter what you do, is spent looking at it. The idea that one just doesn't have "time" to notice RT in games is just coping when you can't run it well. A 2080 is not adequate for RT, so it's fair that you prefer higher framerates instead. No reason to convince yourself that it's not worth it, though, because it totally is.

Once you play enough games with ray tracing and you learn how the different effects look, you will ALWAYS notice it in every other game. Once your eyes adjust to RT shadows, lighting, reflections, etc., regular games' deficiencies will stand out more and more, and you will start seeing how wrong certain scenes look and how incorrectly they're lit. I kid you not, you go into a game like The Last of Us Part II with its baked lighting and you will scratch your head at how people think it looks good. You notice poor and incorrect lighting in almost every scene. You don't do it on purpose; it's just your brain slowly adjusting to RT and how scenes SHOULD look, so it starts identifying all the issues in normal games.

Finally someone gets it! I'm relieved!

Once people understand how rendering is SUPPOSED to look they will change their color-coded glasses to match reality instead of just claiming "it just looks better".
 

TintoConCasera

I bought a sex doll, but I keep it inflated 100% of the time and use it like a regular wife
It's not a marketing stunt. It works, and some developers are better than others at it (see: Insomniac, which has improved it even more in recent patches).

It's just the first time we are seeing it, and it should improve over the generations. Still not bad at all for $400/$500 boxes, in comparison.
Yeah hopefully! I guess devs will learn how to properly implement and optimize it down the line.
 

VFXVeteran

Banned
PC version with RT on Low incurs a 34% performance loss.
It's supposed to. RT will never be a "free" graphics feature. It is, by far, the most expensive rendering technique. We all knew this back in the 80s.
High and Ultra refine the effects, with similarly big performance losses at each tier.
Yes, that's supposed to be the case too. You are taking more samples to approximate the rendering equation better. Again, that's NOT going to be free.
Dreaded shader compilation stutter on PC that happens every time something happens for the first time.
Has always been the case since like forever with Unreal Engine.
 

01011001

Banned
How about sharing compiled shaders? That was the fix for when Cemu had issues compiling shaders in real time.

Not sure that works; you'd need to download the shaders again after each new driver update you install and after each game update, and I'm pretty sure it would also need to be tailored to your GPU to some degree.

It's doable, but it would need to be actively supported by developers and GPU manufacturers.
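Basically the cache key has to cover everything that can change the compiled output, which is why it goes stale so quickly. Tiny sketch (all the strings are just example values, not a real scheme):

```cpp
// Tiny sketch of why a shared cache has to be re-fetched so often: the key must cover
// every input that can change the compiled shaders. All values here are examples.
#include <functional>
#include <iostream>
#include <string>

std::string CacheKey(const std::string& gpu, const std::string& driver, const std::string& gameBuild)
{
    // Any change to GPU model, driver version or game build produces a different key,
    // which means a fresh download or a fresh local compile.
    const std::size_t h = std::hash<std::string>{}(gpu + "|" + driver + "|" + gameBuild);
    return std::to_string(h) + ".shadercache";
}

int main()
{
    std::cout << CacheKey("rtx3080", "512.95", "game-1.0.3") << "\n";
    std::cout << CacheKey("rtx3080", "516.40", "game-1.0.3") << "\n";   // new driver -> new key
}
```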
 

hlm666

Member
Finally someone gets it! I'm relieved!

Once people understand how rendering is SUPPOSED to look they will change their color-coded glasses to match reality instead of just claiming "it just looks better".
It's pointless I tell ya, RT is a waste /s



Unfortunately it's going to be another 60 FPS scenario: it won't matter or be noticeable to most people until consoles bring the full RT works in a mid-gen refresh or next gen.
 