
Cyberpunk 2077: Phantom Liberty Benchmark Test & Performance Analysis Review (Raster / Ray Tracing / Path Tracing)

Dice

Pokémon Parentage Conspiracy Theorist
Mmmmmm overdrive is definitely the kind of stuff I am looking for with my next PC build. The next phase of development for me is like 80% lighting and 20% animation. I am going to wait until something can take even better advantage of this new RT tech. I really like how AMD handles normal rasterization and their driver software, but if they can't get their act together in lighting then they are just straight up missing the generation.
 

Thaedolus

Gold Member
Upgraded PC with a 4080 aaaand
[elmo awww GIF]
 

Leonidas

Member
Is Frame Generation forced on Path Tracing? Is that why there's no tests specifying the setting?
No, frame generation is not on for these, Nvidia really is that much faster than AMD at Path Tracing.

Section of the conclusion listed settings used as well as some performance numbers (DLSS / FG) that were not graphed.

We ran several rounds of benchmarks for this article, on a wide selection of graphics cards at various settings. Our first test run is rasterization only, at ultra settings, without any ray tracing. Here we're seeing very decent framerates across the board. To reach 60 FPS at 1080p you need a Radeon RX 7600 or RTX 4060. 1440p at 60 FPS is possible for many cards, too; you'll need an RX 7700 XT or RTX 4070 or better. 4K60 is pretty challenging though: only the RTX 4090 can handle it (without any upscaling tech).

Next up is rasterization at ultra, plus ray-traced reflections, sun shadows, local shadows and ray-traced lighting at "Medium." Here the hardware requirements go up quite a bit. At Full HD, the RTX 3090 reached 60.7 FPS and the RTX 4060 55.7 FPS. AMD's fastest, the RX 7900 XTX, is far behind with just 49.7 FPS. At higher resolutions, AMD falls further and further behind. While the RTX 4090 can reach 36 FPS at 4K, the RX 7900 XTX only gets 17 FPS. No wonder NVIDIA is promoting Cyberpunk to show its dominance in ray tracing.

Last but not least, we activated path tracing, which brings even the best GPUs down. The mighty RTX 4090 got 61 FPS at 1080p; 4K was almost unplayable at 20 FPS. Things look even worse for AMD, with the RX 7900 XTX reaching only 14.5 FPS at 1080p, 8.8 FPS at 1440p and 4.3 FPS at 4K. The good thing is that Phantom Liberty supports all three rival upscaling technologies from NVIDIA, AMD and Intel. With DLSS enabled in "Quality" mode, the RTX 4090 gets 47 FPS at 4K, which is much more playable. If you enable DLSS 3 Frame Generation on top of that, the framerate reaches a solid 73 FPS. With Frame Generation alone, without DLSS upscaling, the rate is 38 FPS at 4K, but the latency is too high for a good experience; you always need upscaling. Since the upscalers have various quality modes, you can easily trade FPS against image resolution, which makes the higher ray tracing quality modes an option even with weaker hardware. At some point, though, the upscaling pixelation becomes more distracting than the benefit from the improved rendering technologies.
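The FPS-versus-resolution trade in that last point comes down to the upscaler's internal render resolution. A quick sketch using the commonly cited per-axis scale factors (the factors are an assumption on my part, not something the article states):

```python
# Approximate internal render resolutions for the upscaler quality modes.
# Per-axis scale factors are the commonly cited values (an assumption here,
# not from the article).
MODES = {
    "Quality": 2 / 3,            # ~66.7% per axis
    "Balanced": 0.58,            # ~58% per axis
    "Performance": 0.5,          # 50% per axis
    "Ultra Performance": 1 / 3,  # ~33.3% per axis
}

def internal_resolution(width, height, mode):
    """Return the internal (pre-upscale) render resolution for an output size."""
    scale = MODES[mode]
    return round(width * scale), round(height * scale)

for mode in MODES:
    print(mode, internal_resolution(3840, 2160, mode))
```

So 4K "Quality" renders internally at roughly 1440p and "Performance" at roughly 1080p, which is where the big FPS swings between modes come from.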
 

Thaedolus

Gold Member
This is looking dire for AMD. They are vastly behind in what seems to be the next paradigm shift in rendering technology. Hopefully, RDNA 4 makes a major leap in RTRT. Otherwise, they'll be left behind completely.
Seems like they’re still pretty competitive CPU-wise but if you want any kind of ray tracing or decent upscaling/frame generation, goodnight
 

Kenpachii

Member
what settings do you use to avoid screen tearing and reduce input lag?

High framerates and Reflex; the higher your framerate, the less of an issue tearing becomes. Best is a VRR solution like G-Sync/FreeSync to smooth it out; if not, you could try vsync or whatever sync option Nvidia came up with in their control panel, I didn't follow it anymore.

Personally, on my laptop, which doesn't have VRR, I sit at around ~100 fps in Cyberpunk with Overdrive at 1080p max settings, so tearing becomes a whole lot less of a problem. But even without the tearing, framegen is needed to get smooth framerates; without FG I'd probably sit in the janky 50s.
 

GymWolf

Member
Every time you're in an indirect lighting situation like this

[screenshots]


Or have characters around

[screenshots]


You'll remember that you're playing in "old gen" mode. You'll end up enabling path tracing all the time.
Dude, I literally tried the game maxed out for like 30 min; I already know that FOR ME some lights and shadows are not worth the hit to resolution and framerate.
Unless DLSS 3.5 gets me to a stable 60 fps without going below DLSS Quality, Overdrive is already off the table and I'm never gonna look back like you think (I just don't care about RTX in general).

Like someone else said, the game has strange input lag, at least with a controller; I tried locking it at 30 and it's unplayable. 60 is the bare minimum to feel decent.
 

Kenpachii

Member
It’s a bad RT situation for AMD but really, this is one single game. Hardly any devs will make the effort to implement lighting this advanced until next gen at the earliest when the consoles will have the ability to handle it somewhat well.

Nvidia will make them. Big chance their big selling point on the 5000 series is dedicated hardware that speeds up path tracing massively, and sponsorships for games will make devs implement it.

Not going to lie, path tracing is mighty fine and fixes the oh-so-ugly NPCs in Cyberpunk, but it's dark, like really dark. It doesn't feel natural; it feels like the shadow setting is overtuned to hell. But disabling it makes the entire game feel last gen.

Anyway, AMD needs to invest in AI solutions far more; they've been behind the curve for a while, and the gap only grows if they don't start to follow Nvidia or at least try to compete.
 

Bojji

Member
had no issues with it on a 60Hz TV with games.

That's interesting. Nvidia introduced a way to limit the framerate produced by DLSS 3 by introducing (working) driver vsync, but it limits the framerate below the refresh rate. So, for example, in Starfield I'm maxing out at 115 fps; without it the game would be tearing.

I didn't try fixed refresh with it but maybe it works ok.

It’s a bad RT situation for AMD but really, this is one single game. Hardly any devs will make the effort to implement lighting this advanced until next gen at the earliest when the consoles will have the ability to handle it somewhat well.

Alan Wake 2 is next. Every game with DLSS 3.5 implemented will (at least) look much better on Nvidia. AMD is in a worse and worse situation with every new feature Nvidia introduces.

Because of DLSS I can't go back to AMD GPUs; the difference between it and FSR is too great, and now 3.5 will also fix RT denoising problems.
 
This looks insane. Definitely don’t want to upgrade my 3080 at the moment because those new cards will be rolling around in a year or so.

May just play at 1080p 30fps with path tracing on.
 

Buggy Loop

Member
Nvidia will make them. Big chance their big selling point on the 5000 series is dedicated hardware that speeds up path tracing massively, and sponsorships for games will make devs implement it.

Not going to lie, path tracing is mighty fine and fixes the oh-so-ugly NPCs in Cyberpunk, but it's dark, like really dark. It doesn't feel natural; it feels like the shadow setting is overtuned to hell. But disabling it makes the entire game feel last gen.

Anyway, AMD needs to invest in AI solutions far more; they've been behind the curve for a while, and the gap only grows if they don't start to follow Nvidia or at least try to compete.

AMD has to throw their hybrid pipeline in the garbage

It was made to save silicon area and complexity in favor of rasterization, but even there they barely got an advantage somehow. That hybrid RT is made so that queuing systems juggle between rasterization, RT and ML (the ML part has never been demonstrated to work with the other two so far) in AMD's preferred method, inline ray tracing. You control and know what you're getting, and the scheduler can do the rest.

Path tracing is not like that, it's chaotic. The hybrid pipeline is choking big time.

Throwing an AI denoiser into the mix, because I don't think you can possibly outdo or match Ray Reconstruction without ML, would choke that pipeline even further, juggling graphics workload, RT and compute... It would not be pretty. It's probably why you don't see their FSR solutions use ML so far; it's too hard to juggle that with the current architecture.

They have to rebuild that pipeline from scratch.

Buckle up for the 5000 series to leverage path tracing full speed ahead. Those next-gen RT cores + ML might be a drastic departure; if their neural radiance cache papers are to be believed, path tracing is going ML big time and leaving ReSTIR behind for an even less noisy image, very close to reference path tracing before the denoiser is applied. They're not sitting on their laurels. Intel might be the only contender that can somehow catch up, though always a tad late. At least not two whole fucking gens late.
 

SF Kosmo

Al Jazeera Special Reporter
is it worth using Frame generation on 60Hz display? Would it lead to lot of input lag?
It can be if you're getting like 45fps and trying to get to 60.

It does add lag, especially at lower framerates, but generally not as much as you would think. It's definitely viable in sub-60 situations, even if it's more designed for high framerates and VRR.
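A back-of-the-envelope way to see why lower base framerates hurt: interpolation-style frame generation has to hold the newest real frame back by roughly one real frame time before it can present the generated in-between frame, so the added latency scales with the base frame time (a deliberately simplified model; Reflex and pipeline details shift the real numbers):

```python
def fg_added_latency_ms(base_fps):
    """Rough extra latency from holding one real frame for interpolation.

    Simplified model: penalty ~ one base frame time (ignores Reflex,
    render queue depth, and display timing).
    """
    return 1000.0 / base_fps

for fps in (30, 45, 60, 120):
    print(f"{fps} fps base -> ~{fg_added_latency_ms(fps):.1f} ms extra")
```

At a 45 fps base that is roughly 22 ms on top of the normal input chain, which is why frame generation feels much worse below 60 than it does at high framerates.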
 

Thaedolus

Gold Member
Do we know what time today the 2.0 update is going live? I just got a tiny update in Steam this morning but apparently it wasn't the real update yet.
 

Sethbacca

Member
Seeing these benchmarks makes me feel like my 1070 has finally reached the point where I should replace it. Thankfully my laptop is a 3060, so no rush.
 

winjer

Gold Member
AMD has to throw their hybrid pipeline in the garbage

Absolutely.
And add dedicated hardware to traverse and manage the BVH structure.

It was made to save silicon area and complexity in favor of rasterization, but even there they barely got an advantage somehow. That hybrid RT is made so that queuing systems juggle between rasterization, RT and ML (the ML part has never been demonstrated to work with the other two so far) in AMD's preferred method, inline ray tracing. You control and know what you're getting, and the scheduler can do the rest.

Inline ray tracing benefits both AMD and NVidia, although it's more important for AMD, as it tries to reduce contention from the TMUs being used for both texture processing and ray testing.
But the thing with RT is that its performance bottleneck is not the number of rays a GPU can test, but rather the occupancy of the render pipeline.
Be it AMD or NVidia hardware, turning on RT will reduce shader occupancy to about 1/3. The front end of current GPUs just can't cope with the randomness of RT dependencies.
So the GPU ends up issuing far fewer work waves or warps than it normally does in rasterization.
My guess is that this game performs much better on NVidia hardware because it was optimized to issue as many warps as possible.
On AMD hardware these optimizations probably don't work, and the result is abysmal wave-issue numbers.
NVidia does have one new advantage in this matter, at least with Ada, and that is SER. This can probably increase occupancy by 5-10%.

Path tracing is not like that, it's chaotic. The hybrid pipeline is choking big time.

Not really. Path tracing is very similar to ray tracing.
The main difference is that it uses Monte Carlo simulation to decide the pattern in which to cast its rays.
The result is a more accurate lighting system without having to cast as many rays as it would need without the Monte Carlo algorithm.
But it still casts many more than regular RT.
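The Monte Carlo idea is just numerical integration by random sampling: average many random samples of a function and the mean converges to the true integral, with error shrinking roughly as 1/sqrt(N). That's why path-traced images are noisy at low ray counts and lean so heavily on a denoiser. A toy 1D version (illustrative only, nothing renderer-specific):

```python
import random

def monte_carlo_estimate(f, n, seed=0):
    """Estimate the integral of f over [0, 1] as the mean of n uniform samples."""
    rng = random.Random(seed)
    return sum(f(rng.random()) for _ in range(n)) / n

# The integral of x^2 over [0, 1] is exactly 1/3; the noisy estimate
# approaches it as the sample count grows.
for n in (100, 10_000, 1_000_000):
    print(n, monte_carlo_estimate(lambda x: x * x, n))
```

A path tracer does the same averaging per pixel over randomly sampled light paths, which is where the "cast fewer rays, accept some noise" trade-off comes from.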
 

SF Kosmo

Al Jazeera Special Reporter
Not going to lie, pathtracing is mighty fine and fixes the oh so ugly NPC's in cyberpunk, but its dark like really dark. It doesn't feel natural. It feels like shadow setting is overtuned to hell. But disabling it makes the entire game feel like last gen.
This is a funny way to put it. Path tracing isn't unnaturally dark, and in many cases it can make dark areas brighter due to bounce lighting. But it is a fundamentally different way of lighting a scene and if the level designers and artists weren't using/considering path tracing when they created a scene, then the scene isn't going to be lit according to the artist's intention.

This isn't a flaw of path tracing, it's a problem of adding path tracing to games after the fact. Metro Exodus handled this well by releasing their RT version as an entirely separate download and doing a new art/lighting pass specifically with RT in mind. This means removing non-source lights designed to simulate bounce lighting etc, but also making sure there are light sources in dark areas, etc.

Unlike Metro, Cyberpunk doesn't really use non-source lights so the devs didn't worry about that end of it, but they probably could have done a new pass to add light sources to dark areas as needed.

These are growing pains. As we see games developed with path tracing in mind from the jump, we won't have these issues.
 

bbeach123

Member
I wonder if we could have some kind of tech that only uses ray tracing on the characters. I don't really mind the non-RT version of Cyberpunk, actually; it looks really good already. But bright NPCs in a dark environment sometimes look kinda weird.
And so does the lack of local shadows on characters.
 
This is looking dire for AMD. They are vastly behind in what seems to be the next paradigm shift in rendering technology. Hopefully, RDNA 4 makes a major leap in RTRT. Otherwise, they'll be left behind completely.
AMD has consoles. But take a moment to think what this means for the PS6 and ray tracing if they can't come up with something good.
 

The Cockatrice

Gold Member
I have the game installed and a save in a demanding spot; curious to see the FPS difference in the same location with 2.0. Seems like the average is over 80 on a 4070 Ti with full path tracing/FG/RR/DLSS, which is perfectly fine for me.

Laughs at Starfield's performance.
 

Thaedolus

Gold Member
AMD has consoles. But take a moment to think what this means for the PS6 and ray tracing if they can't come up with something good.
It's gonna be reeeeeeaaal interesting to see where the next-gen Switch lands in comparison to last-gen and current-gen consoles.
 

Buggy Loop

Member
AMD has consoles. But take a moment to think what this means for the PS6 and ray tracing if they can't come up with something good.

Intel could maybe slide into consoles with their tile-based APUs.

Microsoft could look into gaining an advantage over Sony, and a different supplier is pretty much the only way to go, since AMD seems to be in too much architectural turmoil for the coming years.
 

Bojji

Member
This is the same game that everyone hated and now the expansion is smoking sales.

Some people hated it (still do) and some people (including me) loved it. Many people wanted this game to be like GTA, and the majority of criticism was about driving and police AI. I didn't care, but CDPR fixed these things (plus changed a lot more) in the 2.0 update.

CP2077 (for me) was already a 9/10 game a few months after launch (when I completed it).
 

Thaedolus

Gold Member
Intel could maybe slide into consoles with their tile-based APUs.

Microsoft could look into gaining an advantage over Sony, and a different supplier is pretty much the only way to go, since AMD seems to be in too much architectural turmoil for the coming years.
With the current expectations around BC, and console generational lines getting blurred by pro consoles and cross-gen titles, I'm inclined to believe it's an uphill battle to get someone to switch architectures. I bet Nintendo sticks with Nvidia and Sony/MS stick with AMD.
 

Kenpachii

Member
Update dropped

4080 laptop, 64gb ddr5 4800, 12900HX

Psycho lighting, ultra settings, 1080p, DLSS Quality, framegen enabled; I had a stream running in the background and some other stuff. The lowest I saw it go was 85, at the bar.

[screenshot]
 

Buggy Loop

Member
With the current expectations around BC, and console generational lines getting blurred by pro consoles and cross-gen titles, I'm inclined to believe it's an uphill battle to get someone to switch architectures. I bet Nintendo sticks with Nvidia and Sony/MS stick with AMD.

You're trading a PC for a PC in Sony & Microsoft's case.
 

shamoomoo

Member
With the current expectations around BC, and console generational lines getting blurred by pro consoles and cross-gen titles, I'm inclined to believe it's an uphill battle to get someone to switch architectures. I bet Nintendo sticks with Nvidia and Sony/MS stick with AMD.
Intel makes x86 CPUs as well; I'm not sure if switching to a different x86 CPU brand would mess up BC on the PS5.
 

Kataploom

Gold Member
I don't think this is a matter of Nvidia being better than AMD at RT alone; basically any other game with RT shows the real difference between the two brands... This seems like CDPR heavily optimized the RT for Nvidia cards.
 

Thaedolus

Gold Member
Seems like with everything maxed at 4K and DLSS set to Balanced I'm holding 75-85 FPS. And holy shit at the corpo ride through NC at the beginning. Looks incredible and runs great.
 