
DF: Cyberpunk 2077 Next-Gen Patch: The Digital Foundry Verdict

Cyberpunkd

Gold Member
I agree with this. It looks good on PCs with full-bore ray tracing, and the lighting looks amazing, but the models are pretty iffy at times: everyone looks plasticky (yeah, okay, cyberpunk, maybe it's their plastic surgery/mods, but... ehh), the animation and lip syncing is often very poor, and lots of things are just, like, very square/boxy. Playing it on a high-end PC or with the next-gen patch really feels to me like something like Perfect Dark Zero, where they're on a creaking engine and old design ideas and they just slather on a load of new effects. The clubs, etc. look like Vampire: The Masquerade - Bloodlines - poorly animated dancers, kind of bad models. I love that game, btw, so Cyberpunk scratches a certain itch.
That is my impression as well: the city, the neons, the buildings look incredible, especially at night. The character models and animations are hot rubbish.
 
Nah, the HDR mode is still fucked.
In fact a lot of stuff from the last-gen versions is still present.
I think CDPR literally ported the last-gen versions to the new consoles and added some stuff.
That's why we got this upgrade now instead of later.
I have to disagree here. Whereas HDR was fucked before, it is very much viable now (PS5). I switched between modes in areas where it previously looked really bad with HDR on, and there isn't much difference anymore besides the expected HDR enhancements. I played with HDR off before but have it on this time.
 

VFXVeteran

Banned
This is such a weird question. I don't mean this in a derogatory way, just categorically speaking. I'm not sure if you're saying the console can't achieve native 1440p in any game or if you're specifically referring to CP2077.

If the former, then you are objectively incorrect.

If the latter, then you are highlighting a key reason why developers should redesign key aspects of their engines to take advantage of the bandwidth efficiencies of the new consoles.
I'm saying it as a whole with RT on, and several other non-RT games (e.g. HFW) @ 60fps. No one, and I mean NO company, rewrites their graphics engines for new hardware. It's never going to happen. Lighting code is always going to come down to solving the same mathematical rendering equation. We can surely add to or modify the graphics engine, but there will be code reuse all over. And that's my point. Graphics engines don't need to be rewritten to take advantage of consoles when they're bandwidth-limited in the first place.

Developers don't have low-level assembly language routines anymore. That is long gone. Optimizations now mean lowering the number of rendered pixels and/or the FPS target. Consoles would not have been able to run CP2077 with full RT at any good FPS if the PC (which the game was designed around) can't do it at a reasonable FPS without DLSS.

I don't understand why the Sony crowd thinks their hardware is greater than what it actually is. It's a 2080 at best and we already know how that GPU performs under most scenarios. They are simply bottlenecked GPUs for things that the developers would like to implement this generation. Period.
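For anyone who hasn't run into it, the "rendering equation" referenced above is Kajiya's; every lighting system, rasterized or ray traced, is some approximation of it:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (\omega_i \cdot n)\, d\omega_i
```

RT approximates the integral by sampling incoming directions per pixel, and that per-pixel cost is exactly what forces the resolution/FPS trade-offs described above.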
 

Venom Snake

Gold Member
I love those Honda vs. Kia type comparisons while I drive my Ferrari 😂
Both games run well, so good for the players

When I think of a Ferrari, I am reminded of a holiday in San Marino: glamorous women wearing designer glasses at 30,000 euros apiece, diamond signet rings, and beautiful yachts nearby. How does your gaming PC compare? Did it come with a free breakfast? :messenger_tears_of_joy:
 

assurdum

Banned
Yes they are very close, but the XSX has the edge. More Compute and bandwidth on the same architecture will prove to be more useful in the long run.
You won't see anything drastically different from what you've seen until now. The PS5 has a noticeably bigger chunk of GE (geometry engine) circuitry compared to the XSX; a lot of people really underestimate how helpful the GE in a GPU is in the long run.
 

assurdum

Banned
I don't understand why the Sony crowd thinks their hardware is greater than what it actually is. It's a 2080 at best and we already know how that GPU performs under most scenarios. They are simply bottlenecked GPUs for things that the developers would like to implement this generation. Period.
Sony crowd? Lol. Have you completely missed what the Xbox crowd thinks of their hardware? Genuine question.
 

Tchu-Espresso

likes mayo on everthing and can't dance
I have to disagree here. Whereas HDR was fucked before, it is very much viable now (PS5). I switched between modes in areas where it previously looked really bad with HDR on, and there isn't much difference anymore besides the expected HDR enhancements. I played with HDR off before but have it on this time.
Yeah, HDR is left on now since they fixed it.
 

assurdum

Banned
No. Can you give me links? I don't see several threads based solely on one game like I do from the Sony gamers, therefore I don't see this happening in practice.
Jesus, what a one-sided person you are. Go read some post history, for fuck's sake; I can't believe you just blame Sony fans for stuff like this. The forum was full of people who laughed at Ubisoft games being revealed as not native 4K on PS5, and when others (like me) tried to argue it would have been the same on XSX, they laughed again because of the 12 TF beast.
 

VFXVeteran

Banned
Jesus, what a one-sided person you are. Go read some post history, for fuck's sake; I can't believe you just blame Sony fans for stuff like this. The forum was full of people who laughed at Ubisoft games being revealed as not native 4K on PS5, and when others (like me) tried to argue it would have been the same on XSX, they laughed again because of the 12 TF beast.
That's not what I'm talking about. The console warring between power this and power that is immature and I don't pay attention to that. I'm talking about what the hardware *should* do in relation to graphics features. I always hear the Sony crowd talking about how a game underperforms because it's not written specifically for a next-gen console, assuming that a developer would get enormous gains had that been the case. I don't see Xbox gamers declaring "unoptimized, this game would run like CG if the graphics engine was rewritten for the next-gen console." Xbox gamers wouldn't say that because all of their exclusives are developed on PC first. If the PC can't do it, then surely the XSX won't.
 

assurdum

Banned
That's not what I'm talking about. The console warring between power this and power that is immature and I don't pay attention to that. I'm talking about what the hardware *should* do in relation to graphics features. I always hear the Sony crowd talking about how a game underperforms because it's not written specifically for a next-gen console, assuming that a developer would get enormous gains had that been the case. I don't see Xbox gamers declaring "unoptimized, this game would run like CG if the graphics engine was rewritten for the next-gen console." Xbox gamers wouldn't say that because all of their exclusives are developed on PC first. If the PC can't do it, then surely the console won't.
Go back and read The Medium thread then. Or the more recent thread about the game which runs 8K on PS5 and 6K on XSX. What does it change if XSX fans have absurd expectations versus the PS5 hardware rather than the PC? It doesn't make them better or any less fanboyish.
 

VFXVeteran

Banned
Go back and read The Medium thread then. Or the more recent thread about the game which runs 8K on PS5 and 6K on XSX. What does it change if XSX fans have absurd expectations versus the PS5 hardware rather than the PC? It doesn't make them better or any less fanboyish.
Everything you are talking about is in the vein of a console war. I've never seen an Xbox owner think their hardware is better than anything out, including a PC, based on how well a game looks.
 

assurdum

Banned
Everything you are talking about is in the vein of a console war. I've never seen an Xbox owner think their hardware is better than anything out, including a PC, based on how well a game looks.
Implicitly they did, dude. Many times they expected XSX games to run native 4K when it wasn't native on PS5. What kind of specs do they think it has?
 
That's not what I'm talking about. The console warring between power this and power that is immature and I don't pay attention to that. I'm talking about what the hardware *should* do in relation to graphics features. I always hear the Sony crowd talking about how a game underperforms because it's not written specifically for a next-gen console, assuming that a developer would get enormous gains had that been the case. I don't see Xbox gamers declaring "unoptimized, this game would run like CG if the graphics engine was rewritten for the next-gen console." Xbox gamers wouldn't say that because all of their exclusives are developed on PC first. If the PC can't do it, then surely the XSX won't.
It's all about dem tools, right? 😉
 

PaintTinJr

Member
Do you really think that game companies completely rewrite their graphics engines just because a console comes out that barely has enough bandwidth to push 1440p pixels? Why would a full RT lighting pipeline NOT be considered next-gen if every console out can't run it? Or do you think that if the graphics engine was made around the next-gen consoles you'd get 4K/60FPS with full RT lighting???
Most companies use a middleware-provided engine and customise it in a way that lets them re-integrate those customisations into the next version of the engine, getting the best of both worlds without rewriting an entire engine. This is very obvious from how few companies are vocal about the benefits of writing their own engine compared to licensing. And even then, the small number of AA studios like Rebellion and Hello Games that do write their own still tend to make their engines compatible with middleware libraries. So your rewriting question would be moot.

As for the RT pipeline in Cyberpunk 2077 on PC, is it still lighting highly faceted geometry, like in some of the screenshots I've seen? Lighting and modelling are inseparable in the "next-gen" engine argument IMHO. Otherwise a next-gen engine rendering a single triangle with RT could claim next-gen visuals. Modelling stresses lighting models in performance, so without everything in Cyberpunk being "beyond polygons" like UE5's Nanite+Lumen, IMHO it is still last-gen.

You did a great job of covering the rendering in Demon's Souls in your old thread - I'm only just properly playing the game now, at three demons vanquished - and it too is still a last-gen engine IMHO. It looks amazing because of how well the original modelling, normal mapping, texturing and lighting of the PS3 game - probably Sony's PhyreEngine - hold up at enhanced settings, and it looks pretty flawless at close and mid range, but compared to UE5's Valley of the Ancient in drone mode or the PS5 UE5 demo it's a night-and-day difference IMO. Seeing faceted items in Cyberpunk screenshots with RT isn't a next-gen lighting stress test for me.
 

ChiefDada

Gold Member
I'm saying it as a whole with RT on, and several other non-RT games (e.g. HFW) @ 60fps. No one, and I mean NO company, rewrites their graphics engines for new hardware. It's never going to happen. Lighting code is always going to come down to solving the same mathematical rendering equation. We can surely add to or modify the graphics engine, but there will be code reuse all over. And that's my point. Graphics engines don't need to be rewritten to take advantage of consoles when they're bandwidth-limited in the first place.

Developers don't have low-level assembly language routines anymore. That is long gone. Optimizations now mean lowering the number of rendered pixels and/or the FPS target. Consoles would not have been able to run CP2077 with full RT at any good FPS if the PC (which the game was designed around) can't do it at a reasonable FPS without DLSS.

I don't understand why the Sony crowd thinks their hardware is greater than what it actually is. It's a 2080 at best and we already know how that GPU performs under most scenarios. They are simply bottlenecked GPUs for things that the developers would like to implement this generation. Period.

Description of PS job vacancy:

Senior Software Engineer – PlayStation®5 Rendering API

at PlayStation Global
United Kingdom, London

About the Advanced Technology Group
The Advanced Technology Group (ATG) is one of the central technology teams in PlayStation Studios, the game development division of SIE, which is responsible for developing some of the most recognisable and ambitious console games and franchises.
In addition to its original role supporting the PlayStation Studios game teams, the group collaborates with engineering teams worldwide to deliver key platform components. Among these projects, ATG has responsibility for a large part of the GPU software stack used by all PlayStation games.
What you’ll be doing

  • You will be working on the rendering API that is used for all GPU programming on PlayStation 5.
  • Your tasks will include designing and developing new API features that allow us to get the most out of the PlayStation 5 hardware, communicating and advocating these to developers, as well as providing expertise to other system software engineers and support teams that have GPU needs.
  • You will be learning a great deal about the low-level operation of the PlayStation 5 hardware and software and then using this knowledge to provide high-quality and high-performance solutions.
  • As a key member of the graphics team, you will be regularly interacting with game developers who work on PlayStation 5, allowing you to understand how to provide them with the greatest value.
This is a unique and senior position at the core of PlayStation GPU technology. Your colleagues in ATG will include the engineers developing the rendering and ray tracing libraries, GPU tools, shader compiler, and contributors to the architecture of multiple generations of PlayStation consoles. The role also involves close collaboration with other key hardware and software stakeholders so we can provide game developers with the means to push the boundaries of our platforms.
What we are looking for

  • Experience writing rendering code on top of multiple existing rendering APIs.
  • Experience writing rendering code on console.
  • An interest in GPU technology and low-level programming.

Edited to include full context
 
It's been 2 years this year since launch... still no 120fps first-party games and no VRR
Why does the bolded part of that statement bother me more than anything else in this thread? Lol, I bet this guy thinks a light year is a measure of time... I kid, I kid, of course. Back to arguing about dumb stuff. I'm looking to grab the new Bravia X95K when it hits $1,000; wonder if I'll already have a PS VR2 by then.
 

VFXVeteran

Banned
As for the RT pipeline in Cyberpunk 2077 on PC, is it still lighting highly faceted geometry, like in some of the screenshots I've seen? Lighting and modelling are inseparable in the "next-gen" engine argument IMHO. Otherwise a next-gen engine rendering a single triangle with RT could claim next-gen visuals. Modelling stresses lighting models in performance, so without everything in Cyberpunk being "beyond polygons" like UE5's Nanite+Lumen, IMHO it is still last-gen.
I agree to disagree here. Increasing modeling throughput will only make everything else slower. With the consoles' limited bandwidth, it would really be unplayable.

While modeling is a great enhancement for getting close to CG visuals, it's the lighting of those triangles that becomes saturated with work. We could easily increase polygon count like Nanite, but we'd have to reduce complexity on the backend. I'm sure these consoles could run any game with 2x the polygons if the lighting were a constant color... however, this isn't going to further the cause.
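As a rough cost model (symbols are mine, purely illustrative, not from any engine): with $N_{tri}$ triangles at per-vertex cost $C_v$, and $N_{pix}$ shaded pixels taking $S$ light samples at cost $C_s$ each,

```latex
T_{frame} \approx N_{tri} \cdot C_v + N_{pix} \cdot S \cdot C_s
```

Doubling the polygon count only grows the first term; it's the $S \cdot C_s$ factor that full RT inflates, which is why flat lighting would leave headroom for 2x geometry.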

You did a great job of covering the rendering in Demon's Souls in your old thread - I'm only just properly playing the game now, at three demons vanquished - and it too is still a last-gen engine IMHO. It looks amazing because of how well the original modelling, normal mapping, texturing and lighting of the PS3 game - probably Sony's PhyreEngine - hold up at enhanced settings, and it looks pretty flawless at close and mid range, but compared to UE5's Valley of the Ancient in drone mode or the PS5 UE5 demo it's a night-and-day difference IMO. Seeing faceted items in Cyberpunk screenshots with RT isn't a next-gen lighting stress test for me.
We just disagree on what constitutes 'next-gen'. By your definition, if no company has a Nanite-like system, then it's considered last-gen. I'm on the other end of the spectrum: if a game doesn't implement lighting properly, then it's considered last-gen. While I agree modeling plays a big role in geometry silhouette edges, I do think that a game can be considered next-gen with lower poly counts but a robust RT lighting pipeline.
 

VFXVeteran

Banned
Description of PS job vacancy:



Edited to include full context
This says nothing to align with your argument. Again, you guys are thinking that developers right now don't know how to code to an API that's been available for years (PS4). To me that's an insult, really. If you look at the PS exclusives that are currently out, they still have to reduce pixel count and RT features. The proof is what you are playing today (e.g. DS, R&C, etc.).

Low-level in this context is not writing assembly language function calls. Vulkan's API would be considered 'low-level', but it's not moving bits - it's just allowing more control over the memory management of the GPU driver. And btw, that job description describes part of what I've been doing for years.
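To make "more control over memory management" concrete, here's a minimal hedged sketch in Vulkan (assuming a `device` and `physicalDevice` created elsewhere); the application, not the driver, picks the memory type and binds it:

```cpp
#include <vulkan/vulkan.h>

// Minimal sketch: in Vulkan the application explicitly allocates and binds
// GPU memory for a buffer instead of the driver doing it behind the scenes.
VkBuffer createDeviceLocalBuffer(VkDevice device, VkPhysicalDevice physicalDevice,
                                 VkDeviceSize size, VkDeviceMemory* outMemory) {
    VkBufferCreateInfo bufferInfo{};
    bufferInfo.sType = VK_STRUCTURE_TYPE_BUFFER_CREATE_INFO;
    bufferInfo.size = size;
    bufferInfo.usage = VK_BUFFER_USAGE_STORAGE_BUFFER_BIT;
    bufferInfo.sharingMode = VK_SHARING_MODE_EXCLUSIVE;

    VkBuffer buffer;
    vkCreateBuffer(device, &bufferInfo, nullptr, &buffer);

    // Ask what the buffer needs, then choose a device-local memory type ourselves.
    VkMemoryRequirements memReq;
    vkGetBufferMemoryRequirements(device, buffer, &memReq);

    VkPhysicalDeviceMemoryProperties memProps;
    vkGetPhysicalDeviceMemoryProperties(physicalDevice, &memProps);

    uint32_t typeIndex = 0;
    for (uint32_t i = 0; i < memProps.memoryTypeCount; ++i) {
        if ((memReq.memoryTypeBits & (1u << i)) &&
            (memProps.memoryTypes[i].propertyFlags & VK_MEMORY_PROPERTY_DEVICE_LOCAL_BIT)) {
            typeIndex = i;
            break;
        }
    }

    VkMemoryAllocateInfo allocInfo{};
    allocInfo.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
    allocInfo.allocationSize = memReq.size;
    allocInfo.memoryTypeIndex = typeIndex;
    vkAllocateMemory(device, &allocInfo, nullptr, outMemory);
    vkBindBufferMemory(device, buffer, *outMemory, 0);
    return buffer;
}
```

None of that touches assembly; it's just the driver's bookkeeping moved into application code.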
 

TGO

Hype Train conductor. Works harder than it steams.
I have to disagree here. Whereas HDR was fucked before, it is very much viable now (PS5). I switched between modes in areas where it previously looked really bad with HDR on, and there isn't much difference anymore besides the expected HDR enhancements. I played with HDR off before but have it on this time.
Interesting 🤔
Looks washed out for me and incapable of deep blacks.
Pretty much like it was before.
 

vpance

Member
Looks like they improved/fixed HDR? Found this vid talking about it



Now to figure out the best settings.
 

Lysandros

Member
Why does the bolded part of that statement bother me more than anything else in this thread? Lol, I bet this guy thinks a light year is a measure of time... I kid, I kid, of course. Back to arguing about dumb stuff. I'm looking to grab the new Bravia X95K when it hits $1,000; wonder if I'll already have a PS VR2 by then.
50% of the time thinking in light years works all the time. Can't blame the guy.
 

ChiefDada

Gold Member
This says nothing to align with your argument. Again, you guys are thinking that developers right now don't know how to code to an API that's been available for years (PS4). To me that's an insult, really. If you look at the PS exclusives that are currently out, they still have to reduce pixel count and RT features. The proof is what you are playing today (e.g. DS, R&C, etc.).

Low-level in this context is not writing assembly language function calls. Vulkan's API would be considered 'low-level', but it's not moving bits - it's just allowing more control over the memory management of the GPU driver. And btw, that job description describes part of what I've been doing for years.

You said no company is redesigning engines specifically for new consoles, and I showed you proof of PlayStation hiring exclusively for this task. Now you want to engage in a non sequitur by falsely claiming I believe game developers can't code for the PS4 API. It's evident you're being purposefully difficult; I'm just confused as to why you choose to behave this way. What do you gain from this silliness?
 

DenchDeckard

Moderated wildly


Does this apply to the Series X version? Is he basically saying there's VRS and you can't notice it? That's crazy if true. What a great feature.
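For context, on the API side VRS is basically just a hint to shade some pixels at coarser granularity. A hedged D3D12 sketch (assumes `cmdList` is an ID3D12GraphicsCommandList5* on Tier 1 VRS hardware):

```cpp
#include <d3d12.h>

// Sketch of Tier 1 variable rate shading in D3D12: draws issued after the
// first call run the pixel shader once per 2x2 block instead of once per pixel.
void drawWithCoarseShading(ID3D12GraphicsCommandList5* cmdList) {
    // nullptr combiners = just use the base rate everywhere.
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_2X2, nullptr);
    // ... draw low-frequency content (sky, motion-blurred areas) ...
    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, nullptr); // back to full rate
}
```

Applied to low-detail or fast-moving content, the quality loss is hard to spot in motion, which would explain the "can't notice it" claim.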
 

adamsapple

Or is it just one of Phil's balls in my throat?
Massive performance gains?

[image: frame rate graph]



The SX version before this patch was erratic as fuck; now it's mostly 60 FPS with only a few minor dips in stress areas.

Before, it would wildly swing between 40 and 60 just driving down a road.
 
[image: frame rate graph]



The SX version before this patch was erratic as fuck; now it's mostly 60 FPS with only a few minor dips in stress areas.

Before, it would wildly swing between 40 and 60 just driving down a road.

Ahhh ok. So it brought it in line with the PS5 version where frame rate is concerned.
 

VFXVeteran

Banned
You said no company is redesigning engines specifically for new consoles, and I showed you proof of PlayStation hiring exclusively for this task. Now you want to engage in a non sequitur by falsely claiming I believe game developers can't code for the PS4 API. It's evident you're being purposefully difficult; I'm just confused as to why you choose to behave this way. What do you gain from this silliness?
I said companies are not rewriting their entire graphics engine for a console. Let's get it straight here. Even the job description doesn't say redesign. I know what goes into designing functions for graphics engines and adding features. That's not the same as rewriting from scratch. Code is reused and new features are added. The description even says multiple generations, which proves cross-generation code will be used.
 

Lysandros

Member
Damn, historical revisionism is a matter of minutes these days; you turn your back and the XSX is suddenly running as well as the PS5 in performance mode, just like that.
 

adamsapple

Or is it just one of Phil's balls in my throat?
A little example of the extreme variance in frame rate analysis of this game.
[images: frame rate graphs of the same scene, one from DF and one from VGTech]
Practically identical run-throughs, and a huge difference in measured performance.



It can also swing the other way in the same locations, during combat etc., moment to moment. It's a big open-world game with a lot of processes running in the background; small variances like this are to be expected.



[image: frame rate graph]
 

ChiefDada

Gold Member
I said companies are not rewriting their entire graphics engine for a console. Let's get it straight here. Even the job description doesn't say redesign. I know what goes into designing functions for graphics engines and adding features. That's not the same as rewriting from scratch. Code is reused and new features are added. The description even says multiple generations, which proves cross-generation code will be used.

Who ever claimed the entire graphics engine would be rewritten from scratch? Not me.

If the latter, then you are highlighting a key reason why developers should redesign key aspects of their engines to take advantage of the bandwidth efficiencies of the new consoles.

You are again attempting to split hairs with specific terminology, assuming that you can get one over on people in this forum since they do not have a tech background. But that doesn't work when you make statements so ridiculous that even laymen can see through the BS. Case in point:

Optimizations now mean lowering the number of rendered pixels and/or the FPS target.

Sony wants to hire a Senior Software Engineer for the PlayStation 5 Rendering API so that he/she can reduce pixel rendering and framerates. Got it.
 
It can also swing the other way in the same locations, during combat etc., moment to moment. It's a big open-world game with a lot of processes running in the background; small variances like this are to be expected.



[image: frame rate graph]
Go back to my pictures and look at the graphs, Xbox to Xbox and PS5 to PS5. Same platform, same scene, totally different results from DF to VGTech.
 

PaintTinJr

Member
I agree to disagree here. Increasing modeling throughput will only make everything else slower. With the consoles' limited bandwidth, it would really be unplayable.

While modeling is a great enhancement for getting close to CG visuals, it's the lighting of those triangles that becomes saturated with work. We could easily increase polygon count like Nanite, but we'd have to reduce complexity on the backend. I'm sure these consoles could run any game with 2x the polygons if the lighting were a constant color... however, this isn't going to further the cause.


We just disagree on what constitutes 'next-gen'. By your definition, if no company has a Nanite-like system, then it's considered last-gen. I'm on the other end of the spectrum: if a game doesn't implement lighting properly, then it's considered last-gen. While I agree modeling plays a big role in geometry silhouette edges, I do think that a game can be considered next-gen with lower poly counts but a robust RT lighting pipeline.
I only used Nanite+Lumen to highlight that lighting being a better approximate simulation of real lighting, as you would have it, doesn't provide a better simulation of real life when the modelling - as I've seen in Cyberpunk screenshots - uses as little as 200-1000 polygons for curved surfaces that need at least 2.5K minimum to reduce the facets in the curves to noise, by making each facet no more than a few pixels in length. I wasn't trying to say those faceted models in Cyberpunk needed to be at Nanite level for the whole game to be considered next-gen, but if you measure a game by its weakest graphics part, then UE5 is far more next-gen than Cyberpunk with RT on PC at the moment, going by the modelling issues I've seen.

And it is a catch-22 situation: the more polygons, the more the lighting taxes the engine's performance, so the low-polygon meshes might be a compromise for performance.
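A quick back-of-envelope check of that polygon figure (my numbers, purely illustrative): a circular silhouette of on-screen radius r pixels split into n segments has facets of chord length 2r·sin(π/n), so solving for n:

```cpp
#include <cmath>
#include <cstdio>

// Back-of-envelope: how many segments does a circular silhouette need so
// that each straight facet spans at most maxFacetPx pixels on screen?
// chord = 2*r*sin(pi/n)  =>  n = pi / asin(maxFacetPx / (2*r))
int segmentsNeeded(double radiusPx, double maxFacetPx) {
    return (int)std::ceil(M_PI / std::asin(maxFacetPx / (2.0 * radiusPx)));
}

int main() {
    // Hypothetical: a curved part filling ~400 px of a 4K frame.
    std::printf("facets <= 3 px: %d segments\n", segmentsNeeded(400.0, 3.0)); // ~838
    std::printf("facets <= 1 px: %d segments\n", segmentsNeeded(400.0, 1.0)); // ~2514
    return 0;
}
```

At a ~400 px on-screen radius, pushing facets down toward a pixel lands right around the 2.5K figure above, while a 200-1000 polygon mesh leaves facets several pixels long.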
 

legacy24

Member
As someone with both consoles, I'm leaning towards PS5 because of the DualSense implementation; still thinking of waiting to buy it, though.
 

adamsapple

Or is it just one of Phil's balls in my throat?
Go back to my pictures and look at the graphs, Xbox to Xbox and PS5 to PS5. Same platform, same scene, totally different results from DF to VGTech.

Exactly my point. DF, NXGamer and VGTech all found completely different dynamic resolution thresholds.

A game like this, which came out in a flaming hot state and needed a year's worth of patches to run acceptably in the first place, isn't really indicative of either console's peak performance, and flip-flops in frame rate and DRS should be expected.
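A hedged sketch of the kind of DRS feedback loop in play (an assumed heuristic, not CDPR's actual code); small frame-time noise between otherwise identical runs shifts when the scale steps up or down, which is one reason different captures report different thresholds:

```cpp
#include <algorithm>

// Assumed DRS heuristic (illustrative only, not CDPR's implementation):
// nudge the render scale each frame so GPU time tracks a 16.6 ms budget.
struct DynamicResolution {
    double scale = 1.0;  // fraction of native resolution per axis
    static constexpr double kBudgetMs = 16.6;  // 60 fps target
    static constexpr double kMin = 0.6, kMax = 1.0, kStep = 0.05;

    void update(double gpuFrameMs) {
        if (gpuFrameMs > kBudgetMs * 1.05)      scale -= kStep;  // over budget: drop res
        else if (gpuFrameMs < kBudgetMs * 0.85) scale += kStep;  // headroom: raise res
        scale = std::clamp(scale, kMin, kMax);
    }
};
```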
 