
[DF]Cyberpunk 2077 Ray Tracing: Overdrive Technology Preview on RTX 4090

Abriael_GN

RSI Employee of the Year
Imagine losing >80% of your performance to achieve this:

[comparison screenshot]

Imagine being so disingenuous that you cherry-pick the one scene that shows a limited difference, among tons that show big differences.
 

Fredrik

Member
This looks incredible. Maybe it’s time for another playthrough.

Now that CDPR is showing what they can do in their own engine, it'll be interesting to see how it compares to UE5, which is what they'll use next.
 

SlimySnake

Flashless at the Golden Globes
Absolutely stunning stuff, at least in some areas. The performance hit isn't as bad as I thought it would be. 48 fps raster vs 18 fps? That's, what, ~2.7x? I'm currently getting 60 fps using Psycho RT settings on my 3080 at 4K DLSS Performance. If I go down to 1440p DLSS Performance, I might actually be able to run this.

I don't see how this is the future, though. The visual gains just aren't enough to justify a ~2.7x hit to performance. That kind of GPU overhead is better spent elsewhere, like adding more detail and better character models. I think the Matrix demo still looks better because of this. Will next-gen or even PS7 consoles use path tracing? I just don't think so, unless they can figure out a way to cut the cost in half.
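Quick back-of-the-envelope on those figures, just plugging in DF's 48 and 18 fps numbers:

```python
# Sanity check on the numbers above: 48 fps raster vs 18 fps path traced.
raster_fps, pt_fps = 48, 18
print(f"slowdown: {raster_fps / pt_fps:.2f}x")            # ~2.67x
print(f"frame rate lost: {1 - pt_fps / raster_fps:.0%}")  # ~62%
```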
 

Mahavastu

Member
Wait, Alex is doing a Cyberpunk video?

Wasn't he part of the mob hate over this game? Did he change his mind after the game got popular or something?

So, there is hope for HL RT.
I think you're mixing something up.
That year DF had a top-X list of the 2020 games with the best graphics, best tech and so on, and the PC version of Cyberpunk won.
Alex and John spent a while acknowledging that their decision was probably controversial, but they still thought this game deserved the title.

timestamped: [embedded video]
 
Last edited:

Mahavastu

Member
But I think at this point ARM is a must. Having all the memory (RAM/VRAM), CPU, GPU, caches and RT cores in one large chip would definitely mean less latency between those parts and better results overall.
While you are right that those things would be good for performance, you do not need ARM for that; it would work on x86 as well.
You also do not need one huge chip, which would be insanely expensive because of bad yields. It would be "good enough" to put everything in the same package, like Apple did with the M1. See the yield sketch below.
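To put rough numbers on the yield point, here is the classic Poisson defect model; the defect density is purely an illustrative assumption, not any real foundry's figure:

```python
from math import exp

# Poisson defect-yield model: yield ~= exp(-die_area * defect_density).
DEFECT_DENSITY = 0.1  # defects per cm^2 -- assumed for illustration

for die_area_cm2 in (1.0, 4.0, 8.0):
    y = exp(-die_area_cm2 * DEFECT_DENSITY)
    print(f"{die_area_cm2:4.1f} cm^2 die -> ~{y:.0%} yield")
# 1 cm^2 -> ~90%, 4 cm^2 -> ~67%, 8 cm^2 -> ~45%
```

One monolithic mega-die throws away far more wafer than several small dies packaged together, which is exactly why chiplets and on-package memory are attractive.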
 
Last edited:

Turk1993

GAFs #1 source for car graphic comparisons
I was playing CP2077 yesterday maxed out and was like, holy shit, this looks so good. But now seeing this blows my mind, how far we have come. It looks so good that I can literally take screenshots of every corner or place I go in the game. And why are people crying about the performance, lol? It's the most advanced game ever made. Some people have been criticizing devs for not pushing the tech, and now that they do, they're criticizing the performance, lol. This is the new Crysis 1, and I can't wait to test this when the RTX 5000 series launches. I want to play this with 4K DLSS Quality mode. While 4K Performance mode looks really good, Quality mode is on a different level when it comes to sharpness.

Absolutely stunning stuff, at least in some areas. The performance hit isn't as bad as I thought it would be. 48 fps raster vs 18 fps? That's, what, ~2.7x? I'm currently getting 60 fps using Psycho RT settings on my 3080 at 4K DLSS Performance. If I go down to 1440p DLSS Performance, I might actually be able to run this.

I don't see how this is the future, though. The visual gains just aren't enough to justify a ~2.7x hit to performance. That kind of GPU overhead is better spent elsewhere, like adding more detail and better character models. I think the Matrix demo still looks better because of this. Will next-gen or even PS7 consoles use path tracing? I just don't think so, unless they can figure out a way to cut the cost in half.
The character models already look incredible outside of some random NPCs, better than in almost any game. Just look at the level of detail on them, and at the detail of the player's hands:
[Cyberpunk 2077 character close-up screenshots]
 
I understand how impressive the tech is.
But I don't want every game to look like that.
Sometimes I want the "video game" look.

Agreed. I kinda wonder what techniques people are going to have to use in the future to get around that perfect path-traced look, haha. Like, we have all these rasterization techniques to make game lighting look more realistic; now it's gonna be the other way around.
 
Last edited:

Buggy Loop

Member
Absolutely stunning stuff, at least in some areas. The performance hit isn't as bad as I thought it would be. 48 fps raster vs 18 fps? That's, what, ~2.7x? I'm currently getting 60 fps using Psycho RT settings on my 3080 at 4K DLSS Performance. If I go down to 1440p DLSS Performance, I might actually be able to run this.

I don't see how this is the future, though. The visual gains just aren't enough to justify a ~2.7x hit to performance. That kind of GPU overhead is better spent elsewhere, like adding more detail and better character models. I think the Matrix demo still looks better because of this. Will next-gen or even PS7 consoles use path tracing? I just don't think so, unless they can figure out a way to cut the cost in half.

It'll be the future

It basically solved all the limits of hybrid rasterization + RT in one fell swoop, for not that drastic a penalty. If you had told me 5 years ago that path tracing a game like Cyberpunk would run at 18 fps at native 4K, I would have told you you'd be lucky to get even 1 frame every 30 seconds.

I'm not even sure why we bother looking at native performance when you need some form of temporal AA anyway, and DLSS at the very least should almost always be enabled.

The 5000 series is for sure tailored to accelerate this pipeline even further.

Nvidia also has an improvement beyond ReSTIR already in research, called Neural Radiance Cache (NRC). If I were to bet, the 5000 series will likely have dedicated tech for it, but it should still be generation-agnostic since it runs on Tensor cores.

We're talking about, what, 5000-series high end to make this playable natively, then 6000-series low end to have this playable for everyone?

That's a snap of the fingers in the tech world. It took way longer than that for tessellation to become standard in the rendering pipeline, compared to its first appearance in hardware in 2001.

Lumen has RT as the backbone of its tech, with simplified geometry, and it will have a tsunami of games supporting it soon. Next-gen consoles will, at the very least, run Lumen's hardware RT at the performance cost of today's software Lumen (i.e. no penalty).

Nvidia bypassed the path tracing wall with ReSTIR. It wouldn't surprise me if all vendors, APIs and game engines fast-track a similar solution. Before we know it, artists will stop manually placing light sources or pre-baking lights. A toy sketch of the core idea is below.
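For the curious: the heart of ReSTIR is weighted reservoir sampling, i.e. stream over lots of light candidates but keep only one, chosen proportionally to its weight. A minimal sketch of just that building block, in Python rather than shader code, with made-up weights:

```python
import random

class Reservoir:
    """Keeps one sample from a stream of weighted candidates."""
    def __init__(self):
        self.sample = None  # currently selected candidate
        self.w_sum = 0.0    # running sum of weights seen so far

    def update(self, candidate, weight):
        self.w_sum += weight
        # Replace with probability weight / w_sum; after the whole stream,
        # each candidate ends up selected proportionally to its weight.
        if self.w_sum > 0 and random.random() < weight / self.w_sum:
            self.sample = candidate

# Stream thousands of lights, end up shading only one survivor.
res = Reservoir()
for light_id in range(10_000):
    res.update(light_id, weight=random.random())  # stand-in contribution
print("shade light", res.sample)
```

ReSTIR's real trick is then reusing these reservoirs across neighboring pixels and across frames, which is where the huge effective sample counts per pixel come from.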

We went from Quake 2 RTX in 2019, with tens of lights in the ray tracing and simple corridor geometry, to Cyberpunk 2077's path tracing in 2023, with thousands of (effectively unlimited) lights in probably the most complex open-world geometry of modern games.

In a mere 4 years

[Quake 2 RTX and Cyberpunk 2077 path tracing screenshots]


 

Umbasaborne

Banned
I see the "rAy TrAcInG iS a MeMe" crowd are still determined to die on that hill. It won't be long now before we look at them like we look at this:


Downplaying the technology makes it easier for people who can't currently access it, because they're on consoles, to cope. I can't afford to buy a house yet, even though I really want one. It makes it easier for me to cope when I think of all the cons of owning a house.
 
Last edited:

Bo_Hazem

Banned
While you are right that those things would be good for performance, you do not need ARM for that; it would work on x86 as well.
You also do not need one huge chip, which would be insanely expensive because of bad yields. It would be "good enough" to put everything in the same package, like Apple did with the M1.

Actually, Apple's chip is pretty huge:

[M1-family die-size comparison images]


You can always do what Apple does: sell the perfect-yield chips at a higher price, as usual.
 

iorek21

Member
Future tech is for sure looking great. It's very far in the future in terms of accessibility, but it's something worth the wait.
 

LiquidMetal14

hide your water-based mammals
I tell you what, as a fan of games I've always enjoyed this. As a fan of technology, of advancements in graphics in this industry, and of anything that has to do with how games are made, I've always been excited about the best and newest technology.

Part of all of us, I would imagine, wants to see bleeding-edge performance and tech being pushed to the limit; a sort of measurement of what kind of effects can be done right now in real time. Frame generation or not, it can be done.

I can see the jokes about the price and how high the entry level is with these things, but has anybody ever been part of the higher end of technology without paying a premium to experience certain things before anybody else? OLED TVs, or even high-end electric vehicles that are probably beyond most of our budgets: that's the cool stuff you like to see, even if most of us can't necessarily afford it brand new at this moment.

So this lets us see that right now a 4090 in particular can run this game very well with DLSS, and that's fine. These technologies are there to be taken advantage of, and I don't know about you, but I have always wanted to get more out of the hardware that I have. I don't care whether you call it cheating or free performance or whatever; as the consumer, I just care about getting the best experience and the most out of the hardware that I have.

Now, the other side, making jokes about the effect of these things, is pretty much missing the point. The subtleties in reality are small, and you don't appreciate them until you see games starting to render, in real time, the minute details that you take for granted. I don't take any of these advancements lightly or laugh at them, because regardless of how subtle or simple they may be, they are getting closer to something that looks more realistic. At the very least, you don't want to look at rasterized graphics and think they look like plastic, or fake. I'm not saying games need to look ultra-real, but look at rasterized graphics with simple lighting and you damn well can observe that it looks unnatural when light isn't bouncing a certain way or shadows aren't cast under certain things. It breaks the illusion to an extent, and I know I'm not the only one who notices this. In real life, that bottle on the ground or the toilet paper sitting on the counter casts shadows, and that's what I want to see in a video game.

I welcome all these advancements, and heaven forbid you do have to have high-end hardware; I've been on the other side. I've been in the mid-range, just outside the high end, and I've been there forever. But if I happen to have something good, it's not because I didn't earn it; I do deserve it, and so does anybody with great hardware.
 

Buggy Loop

Member
I am getting a feeling that the 20 series is out, and you'll need at least a 3070 to play this at 1080p/30 fps... hope to be proven wrong.

Got a feeling that SER (Shader Execution Reordering) will take a huge dump on non-4000-series cards. That was made specifically for path tracing's chaotic calculations. We'll see tomorrow, but Ampere will likely choke.

Even with a 3080 Ti, I doubt tomorrow will go well for me, even at 3440x1440.

I'd wager that a ton of whiny bitches will review-bomb a free patch and a technology preview, expecting their 2060 to offer playable framerates. I might get to see a glimpse of it on my card; maybe it'll be unplayable, but who cares, it's interesting tech and we HAVE to move forward. It's also future-proofing this game for years to come. I'll happily revisit it when I get a 5000-series card.
 

Turk1993

GAFs #1 source for car graphic comparisons
DLSS ultra performance 1080p, 30 fps (maybe)

If an RTX 4090 can get 40 fps at 4K DLSS Performance and 80 with DLSS 3, then we could expect at least 1440p/30 with DLSS Performance. The RTX 4090 roughly doubles the fps of my RTX 3080 Ti in most games without DLSS 3. I'm sure we could tweak some settings to get it locked at 1440p/30 with DLSS; Alex will do an optimized-settings video. Quick estimate below.
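A rough back-of-the-envelope for that, assuming fps scales about linearly with internally rendered pixels (a simplification, but path tracing is close to that) and using DF's 40 fps figure:

```python
# 4090 does 40 fps at 4K DLSS Performance; "roughly double" my 3080 Ti.
fps_3080ti_4k_perf = 40 / 2

# DLSS Performance renders at half resolution per axis:
px_4k_perf = 1920 * 1080     # internal res for 4K output
px_1440p_perf = 1280 * 720   # internal res for 1440p output

est = fps_3080ti_4k_perf * px_4k_perf / px_1440p_perf
print(f"~{est:.0f} fps at 1440p DLSS Performance")  # ~45 fps
```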
 

Mahavastu

Member
Actually, Apple's chip is pretty huge:

[M1-family die-size comparison images]


You can always do what Apple does: sell the perfect-yield chips at a higher price, as usual.

https://www.apple.com/de/newsroom/2...xt-generation-chips-for-next-level-workflows/
It's only the German version of the text, but you can find a gallery there with a picture of the internals of that package.

In this picture you can see the M2 die itself, with the memory chips mounted in the same package.

This is what I meant: you do not need to do everything on the same die; you can add stuff in the same package. While this is slightly slower, it is still very fast, because the memory sits so close to the die, and with that many memory chips you get wide bandwidth.
AMD uses a similar technology, for example, to add more cache to a chip without ruining the yield.

And you do not need ARM for this. The ISA you use does not matter here, so x86 would be fine.
 
Last edited:

Lethal01

Member
But I don't want every game to look like that.
Sometimes I want the "video game" look.

This should be the baseline for any game that's even slightly realistic; from there they can tweak it to achieve the look they actually want, rather than the look they're forced to accept due to limitations.
If they want totally inaccurate lighting at times, they can make it like that, but it's rare for a developer of a realistic game to think, "I really need these shadows to disappear when they are too far away."

Totally. All I want from a game like Breath of the Wild is a higher resolution/draw distance, and for the screen-space effects they use to be replaced by ray-traced ones so they are more stable.
 
Last edited:
This should be the baseline for any game that's even slightly realistic; from there they can tweak it to achieve the look they actually want, rather than the look they're forced to accept due to limitations.
If they want totally inaccurate lighting at times, they can make it like that, but it's rare for a developer of a realistic game to think, "I really need these shadows to disappear when they are too far away."

Totally. All I want from a game like Breath of the Wild is a higher resolution/draw distance, and for the screen-space effects they use to be replaced by ray-traced ones so they are more stable.
Inaccurate probably isn't the right word, but devs absolutely would want unnatural lighting. I mean, there isn't a movie or TV show where the lighting isn't meticulously designed and staged in a way that doesn't use "real" light sources. Hell, even nonfiction, two people in a room doing an interview, requires a ton of extra lighting.

I think there are going to be some artistic growing pains where it becomes trendy to have literally only diegetic lighting, but eventually devs will work out the best ways to use non-diegetic lighting for artistic effect; still ray-traced or whatever, but coming from sources that don't actually exist.
 

Mattdaddy

Gold Member
Goddamn, if this becomes the standard down the road, more games are going to have to start implementing flashlights!

Shadows are so realistic I actually can't see shit.
 

Tchu-Espresso

likes mayo on everthing and can't dance
With the 4090 now able to pump out path tracing at high resolution, is it safe to say that the next generation of consoles (say, in another 5 years) will be able to achieve this relatively easily?
 

Spyxos

Member
With the 4090 now able to pump out path tracing at high resolution, is it safe to say that the next generation of consoles (say, in another 5 years) will be able to achieve this relatively easily?
5 years is a lot of time, but AMD is clearly behind Nvidia in ray tracing. Even so, I can hardly imagine such a strong console.
 

Buggy Loop

Member
This should be the baseline for any game that's even slightly realistic; from there they can tweak it to achieve the look they actually want, rather than the look they're forced to accept due to limitations.
If they want totally inaccurate lighting at times, they can make it like that, but it's rare for a developer of a realistic game to think, "I really need these shadows to disappear when they are too far away."

I don't understand why anyone would deliberately want to keep rasterized lighting. All these years the devs have been trying HARD to mimic ray tracing by pre-baking it, and it falls apart as soon as anything is dynamic. And RT isn't bound to any art style: you could do a cel-shaded game with ray-traced lighting. Fortnite / Mario 64 RT, Minecraft RTX, Lego, Teardown come to mind.

[Teardown and Minecraft RTX ray-tracing GIFs]


 