
DF - Cyberpunk 2077 Patch 1.23 PS5 vs Xbox Series X/S Updated! How Does Next-Gen Stack Up Right Now?

SCB3

Member
I've just thought of something: Xbox did the marketing for CP2077, right? Gamescom is in a few weeks and Xbox is there. What if CP2077 gets a next-gen update and a coming-to-Game-Pass announcement at the show?
 

Kuranghi

Member
So a last-gen game that doesn't fully tax your system? That's still a trade-off; you just aren't being given the high-end options.

I'm talking about the best way to maximize what your system can do in terms of overall rendering quality. If you want to push your system to the limit, you're going to want DLSS on, period.


Consider the kind of performance uplift we're talking about, like a 30-50% uplift on the GPU side of things (if not necessarily overall framerate). There's no situation in which the best use of that extra overhead is going to be slight improvements to barely perceptible temporal artifacts rather than, say, ray-traced lighting. There will always be better uses for that performance budget (whether or not the game actually has the features to take advantage of it).

Okay. Can we just agree that we have different priorities? As long as I'm locked to 60fps, at that point I prefer image quality above all and I want it to look the best. Even disregarding the temporal issues, having DLSS on will not look as good as native; that's physically impossible as far as I'm aware.

I care about image quality above all else as long as it doesn't drop the fps below 60fps. I want the graphics at the highest settings (within reason; if there's no difference, of course don't use a setting that drops you below 60), native 4K and 60fps, and I wouldn't put on DLSS to get more frames after that. That's why I said I'd wait to play it, because you can't do that right now with any reasonable hardware that exists.

I understand that you want to make a trade-off in IQ for more performance. I think it's a totally reasonable position, and having 120fps instead of 60fps will definitely make the game feel much, much better to play, but I personally don't care about that, so I won't make that sacrifice in IQ, no matter how tiny it might be.

The bolded part is a bit out of place for Cyberpunk: as far as I'm aware, you need to edit files to fully remove whatever temporal reconstruction they are using, and if you don't, it creates awful smearing/temporal artifacts when DLSS is used with it. Unless they changed that and you can turn it off in-game now?

Sorry if I'm misreading you, but it seems like you just want to say my opinion of what looks best is wrong because there's "basically no difference". Well, that means there IS a difference by definition. I'm not telling you your priorities are wrong when you lose that "tiny" bit of IQ to get more performance 🤷‍♂️

My order of importance is resolution/"no" aliasing/no detail loss > RT+settings > fps above 60fps, and if I don't have the hardware to do that I'm not going to reduce resolution or use DLSS to allow me to max out the RT; I just wouldn't play it until I can do that. I don't do this with many games, but I don't particularly care for the gameplay of CP2077, so I want it to look perfect before I play it.

I'm getting irate because I heard this all the time with TVs: the "LG is nearly as good as the Sony in X area" (or vice versa) "so it's the best deal", and I'm saying "well, I've studied these sets in detail side-by-side and I want the best IQ, so I pick the Sony even if I lose out on other things or it costs way more, because that's my preference".

TL;DR - Don't read this, I'm just grumpy.
 
BOOM! Series S has a higher NPC count than PS5! That 0.1GHz faster CPU putting in that work!!

SuKBpAO.jpg


Jokes aside, Tom said the XSS was one of the best places to play the game: 1440p resolution and a near-locked 30fps framerate. Clearly a superior experience compared to the last-generation consoles the XSS is regularly compared to. It held 30fps better than the XSX in its 30fps mode. Since the XSS is held to the highest standards, we might as well call it out when it performs well. Not too shabby for the system that gets compared to the Switch and PS4.
 

WitchHunter

Member
Jokes aside, Tom said the XSS was one of the best places to play the game: 1440p resolution and a near-locked 30fps framerate. Clearly a superior experience compared to the last-generation consoles the XSS is regularly compared to. It held 30fps better than the XSX in its 30fps mode. Since the XSS is held to the highest standards, we might as well call it out when it performs well. Not too shabby for the system that gets compared to the Switch and PS4.
People still up in arms about console CPU speed? In 2021? CPU? The CPU is used for nothing in a game.
 

MrFunSocks

Banned
Even when you compare it to native 4K with temporal supersampling and sharpening? I know the performance is (much) better with DLSS, but this is just a silly statement. You literally have less than half of the input pixels if you are DLSSing from 1440p, so it just doesn't make sense.
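For what it's worth, a quick back-of-the-envelope check of that pixel claim:

```python
# Back-of-the-envelope check of the "less than half the input pixels" claim
# when DLSS reconstructs a 4K image from a 1440p internal render.
native_4k = 3840 * 2160       # 8,294,400 pixels
internal_1440p = 2560 * 1440  # 3,686,400 pixels

print(f"native 4K:      {native_4k:,} px")
print(f"1440p internal: {internal_1440p:,} px")
print(f"ratio:          {internal_1440p / native_4k:.1%}")  # ~44.4%, under half
```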

On a smaller screen (less than 50") I can certainly see how there wouldn't be a big difference, but I'm on a giant screen and I've got massive pedant eyes so there really is a difference for me.
 
People still up in arms about console CPU speed? In 2021? CPU? The CPU is used for nothing in a game.
Uh, the CPU is the main reason games run at 60fps. It was a well-known fact that the CPU last generation was horrible. The CPU is the reason the XSS is a current-generation device despite the attacks it receives from people who don't own the system or like the Xbox brand. You should do some research.
 

Kuranghi

Member

I was hoping for your own opinion on it, not DF's, tbh, but cheers anyway.

This is a quote from the article

"All of which brings us to the inclusion of Nvidia DLSS, which once again has the ability to match and even exceed native resolution rendering in many respects"

I also understand that they are talking about normal TV sizes and not a giant telly like I'm using, so it's more noticeable to me.
 

WitchHunter

Member
Uh, the CPU is the main reason games run at 60fps. It was a well-known fact that the CPU last generation was horrible. The CPU is the reason the XSS is a current-generation device despite the attacks it receives from people who don't own the system or like the Xbox brand. You should do some research.
No research needed, the CPUs are so strong in this generation they are literally wanking in their free time.
 
Still haven't played CP2077, waiting to see if the next-gen version can earn my business. Regarding this latest patch, I'd be hoping that VRR would smooth the experience on Series X. But, as I said, I'm not jumping in now.

Nice frame rate on PS5 but those empty streets are just bizarre.
 

Mr Moose

Member
Uh, the CPU is the main reason games run at 60fps. It was a well-known fact that the CPU last generation was horrible. The CPU is the reason the XSS is a current-generation device despite the attacks it receives from people who don't own the system or like the Xbox brand. You should do some research.
The CPUs (and SSDs) in the new gen machines shit all over last gen. And they are still Cool'n'Quiet. Though the difference between all 3 current gen CPUs is minimal.
 

ss_lemonade

Member
I'm sure it looks ace but I want 4K60, max RT settings and no reconstruction. I prefer native res because I have a giant telly right in my face so I notice a difference.
I haven't tested it much, but this game seems to look better with DLSS Quality vs native 4K. It is also really demanding at 4K max unless you use lower-quality DLSS options. A 3090, for example, while close, still can't get a locked 4K60 even with DLSS Quality. Without DLSS, the framerate is awful (at 4K max).
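For context on why even a 3090 struggles: DLSS Quality at 4K still renders internally at 1440p. A sketch using the commonly cited per-axis scale factors (exact values can vary by game and DLSS version):

```python
# Internal render resolutions behind the DLSS 2.x presets at 4K output,
# using the commonly cited per-axis scale factors (these can vary slightly
# by title and DLSS version).
presets = {
    "Quality":           2 / 3,
    "Balanced":          0.58,
    "Performance":       1 / 2,
    "Ultra Performance": 1 / 3,
}
out_w, out_h = 3840, 2160
for name, s in presets.items():
    print(f"{name:>17}: {round(out_w * s)} x {round(out_h * s)}")
# Quality lands at 2560 x 1440 - the "less than half the pixels" case above.
```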
 
Last edited:

Patrick S.

Banned
On PC this game is quite great NOW. I played the majority of the main story and probably most of the side missions and encountered ZERO major bugs, only some gfx glitches here and there.



It's great, I play with everything maxed out on a 3070 at 2560x1080 with DLSS, framerate 60FPS+

785Cyberpunk2077Screens.png
430Cyberpunk2077Screens.png
Cyberpunk-2077-Screenshot-2021.07.20---21.54.09.29.png
Cyberpunk-2077-Screenshot-2021.07.21---19.12.26.36.png
When you say "everything", does that include RT? I have a 3080, and while at 1080p I do get 80-90 fps in places with RT on, in many, many places I'm dropping into the 30s and 40s... I have a much shittier CPU though: i7 6700K @ 4.6 GHz...
 

GloveSlap

Member
Still the old versions... They really should have taken the time to finish the game and had the next-gen versions ready day one.
 

Aenima

Member
Your expectations are too high.
Not really. The engine is pure crap on any machine, you can't fix that. I just expect a slightly better experience, I'm not expecting a different game.

Right now there's only the PS4 version of the game available to play on PS5; been there, finished the main quest. I will only give it another go once a native PS5 version comes out.
 

Armorian

Banned
When you say "everything", does that include RT? I have a 3080, and while at 1080p I do get 80-90 fps in places with RT on, in many, many places I'm dropping into the 30s and 40s... I have a much shittier CPU though: i7 6700K @ 4.6 GHz...

Yes, RT on ultra (with RT shadows off). There are places where it drops to the low 50s and the game is entirely CPU-bound there. I would suggest dropping crowd density to medium while keeping RT.
 

SF Kosmo

Al Jazeera Special Reporter
Okay. Can we just agree that we have different priorities? As long as I'm locked to 60fps, at that point I prefer image quality above all and I want it to look the best. Even disregarding the temporal issues, having DLSS on will not look as good as native; that's physically impossible as far as I'm aware.
Unlike other image upscaling methods, DLSS actually CAN add detail and enhance edge clarity, and in some games there's a real argument about whether it's as good as (or even better than) native.

It's pretty subjective and depends on the game, to be honest. In Control, for example, DLSS is pretty obvious because they crank up the sharpening, giving it a slightly filtered look. In other games it can be better than the available AA solutions, especially TAA which adds a bit of blur even at native res.

I care about image quality above all else

Well, I think the kinds of robust ray-traced lighting implementations that aren't viable at native 4K ARE an improvement to image quality, more so than the difference between native res and DLSS, which is often very hard to spot or only apparent in edge cases.

Here are two cropped 1080p shots, one with DLSS and one without. Can you tell which is which? This is not even 4K so it should be much, much more obvious. And yet I think the DLSS shot looks better.

xxaeGpP.jpg



as long as it doesn't drop the fps below 60fps. I want the graphics at the highest settings (within reason; if there's no difference, of course don't use a setting that drops you below 60), native 4K and 60fps, and I wouldn't put on DLSS to get more frames after that.

We're talking in circles. Regardless of your target, more overhead on the GPU means you can push the GPU harder in other ways. It doesn't have to be framerate; it can be turning up detail or LOD or ray tracing or whatever. The amount saved by implementing DLSS can be better spent elsewhere. The difference from turning on more effects is much more noticeable than the difference from turning off DLSS.
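To put rough, purely illustrative numbers on that budget argument (the 16ms native cost and 40% uplift are assumptions, in line with the 30-50% range mentioned earlier):

```python
# Illustrative frame-budget sketch: a GPU-side uplift from DLSS frees
# milliseconds that can buy RT or higher settings instead of extra fps.
# All numbers here are assumptions for the sake of the arithmetic.
budget_ms = 1000 / 60        # 16.7 ms per frame at a locked 60 fps
native_gpu_ms = 16.0         # hypothetical GPU frame cost at native 4K
uplift = 0.40                # assumed 40% GPU uplift from DLSS

dlss_gpu_ms = native_gpu_ms / (1 + uplift)  # same frame, rendered faster
headroom_ms = budget_ms - dlss_gpu_ms
print(f"GPU time with DLSS: {dlss_gpu_ms:.1f} ms")       # ~11.4 ms
print(f"headroom for RT/settings: {headroom_ms:.1f} ms"  # ~5.2 ms
      f" of the {budget_ms:.1f} ms budget")
```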

I understand that you want to make a trade-off in IQ for more performance

No, it's making a trade-off in one aspect of the image's quality and using it to improve other aspects that are far, far more noticeable.
 

Mr Moose

Member
Here are two 1080p shots, one with DLSS and one without. Can you tell which is which? This is not even 4K so it should be much, much more obvious. And yet I think the DLSS shot looks better.

xxaeGpP.jpg
The one on the left has more artefacts on the hair and is less "smooth" overall, like a sharpening filter but less obvious; the right is softer.
 

Kuranghi

Member
Well, I think the kinds of robust ray-traced lighting implementations that aren't viable at native 4K ARE an improvement to image quality, more so than the difference between native res and DLSS, which is often very hard to spot or only apparent in edge cases.

Here are two 1080p shots, one with DLSS and one without. Can you tell which is which? This is not even 4K so it should be much, much more obvious. And yet I think the DLSS shot looks better.

xxaeGpP.jpg


We're talking in circles. Regardless of your target, more overhead on the GPU means you can push the GPU harder in other ways. The amount saved by implementing DLSS can be better spent elsewhere.

No, it's making a trade-off in one aspect of the image's quality and using it to improve other aspects that are far, far more noticeable.

The DLSS shot is the one that looks oversharpened, has false detail, and has oversaturated colour + incorrect gamma compared to the native image. I have an extremely high-quality display calibrated to a degree of accuracy I couldn't achieve myself, so it's effectively "the best it can be". I want to see the creator's intent, not something else, just like how I wouldn't turn on an image enhancement on my TV no matter how minuscule the downsides were.

I've pointed this stuff out on every DLSS comparison shot I've seen since version 2 (maybe 1.9?) came out (as you know, 1.0/the way it was in Control at launch wasn't really that great), and in a recent video about the next version they talked about how it would hopefully fix the colour and gamma shifts that nobody really noticed until they pointed them out.

Most people aren't on my side, I get that. I understand why you would do it, and you'll definitely have a smoother experience than me; for some people that's more important, but not for me. Sorry if it's not clear, but by IQ I mean the clarity of the image, not graphical effects, even though I know that's not cut and dried since some effects increase clarity.

I'm not debating whether increased resolution/detail is as important to the overall IQ as RT or other settings being higher. I was talking about when the settings are the same: I choose native over DLSS regardless of how much more performance it gives me, because I don't like how DLSS alters the image in the ways I noted above.

No offense meant, but I really don't care enough anymore to continue this discussion; enjoy your games in the way you would like. I wish I didn't notice these things, but I do, so here we are. Every time I get into these debates I feel like a raving lunatic:

5itjlc.jpg
 

JackMcGunns

Member



Actually it just means that it can run at 3.6GHz when not in SMT mode. As I thought, the Series S CPU runs at 3.4GHz in SMT mode but can also run at 3.6GHz with SMT off, so it can theoretically run 100MHz faster than the PS5 in a mode the PS5 cannot support... interesting.
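For reference, a quick sketch of the publicly listed Zen 2 clocks behind that 100MHz figure (the PS5's clock is variable and shown here at its cap):

```python
# Publicly listed Zen 2 CPU clocks (GHz); PS5's is a variable clock shown
# at its 3.5 GHz cap, with no separate non-SMT mode exposed to developers.
clocks = {
    "Series X": {"smt_on": 3.6, "smt_off": 3.8},
    "Series S": {"smt_on": 3.4, "smt_off": 3.6},
    "PS5":      {"smt_on": 3.5, "smt_off": None},
}
delta_ghz = clocks["Series S"]["smt_off"] - clocks["PS5"]["smt_on"]
print(f"Series S (SMT off) vs PS5: +{delta_ghz * 1000:.0f} MHz")  # +100 MHz
```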
 

Mr Moose

Member
Actually it just means that it can run at 3.6GHz when not in SMT mode. As I thought, the Series S CPU runs at 3.4GHz in SMT mode but can also run at 3.6GHz with SMT off, so it can theoretically run 100MHz faster than the PS5 in a mode the PS5 cannot support... interesting.
Hard to say if BC titles are using the CPU to its fullest; I know some on the PS5 don't, and it could be the same on the Series consoles. They might have a limit like with PS5's BC.
They said it does run at full speed, but... who really knows?
 
The DLSS shot is the one that looks oversharpened, has false detail, and has oversaturated colour + incorrect gamma compared to the native image. I have an extremely high-quality display calibrated to a degree of accuracy I couldn't achieve myself, so it's effectively "the best it can be". I want to see the creator's intent, not something else, just like how I wouldn't turn on an image enhancement on my TV no matter how minuscule the downsides were.

I've pointed this stuff out on every DLSS comparison shot I've seen since version 2 (maybe 1.9?) came out (as you know, 1.0/the way it was in Control at launch wasn't really that great), and in a recent video about the next version they talked about how it would hopefully fix the colour and gamma shifts that nobody really noticed until they pointed them out.

Most people aren't on my side, I get that. I understand why you would do it, and you'll definitely have a smoother experience than me; for some people that's more important, but not for me. Sorry if it's not clear, but by IQ I mean the clarity of the image, not graphical effects, even though I know that's not cut and dried since some effects increase clarity.

I'm not debating whether increased resolution/detail is as important to the overall IQ as RT or other settings being higher. I was talking about when the settings are the same: I choose native over DLSS regardless of how much more performance it gives me, because I don't like how DLSS alters the image in the ways I noted above.

No offense meant, but I really don't care enough anymore to continue this discussion; enjoy your games in the way you would like. I wish I didn't notice these things, but I do, so here we are. Every time I get into these debates I feel like a raving lunatic:

5itjlc.jpg

Oh great, another PC elitist who can't play most games on the new PS5 and XSX consoles because they use reconstruction techniques that are worse than DLSS.
 

Mr Moose

Member
Oh great, another PC elitist who can't play most games on the new PS5 and XSX consoles because they use reconstruction techniques that are worse than DLSS.
I think it's because they are using a large screen rather than a monitor (my monitor is 27" @ 60Hz, I need a new one). The larger the screen, the more obvious it might be.
My TV is only 43" 4K @ 60fps though, and I sit right on top of it lol.
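One way to put numbers on the screen-size point is pixels per degree: the same resolution spread across more of your field of view leaves fewer pixels per degree, so upscaling artifacts are easier to spot. A rough sketch, with the sizes and the 30-inch viewing distance as illustrative assumptions:

```python
import math

# Rough pixels-per-degree estimate: a bigger screen viewed up close spreads
# the same pixels over more of your field of view, making artifacts easier
# to spot. Sizes and viewing distance below are illustrative assumptions.
def ppd(diag_in, h_pixels, dist_in, aspect=16 / 9):
    width_in = diag_in * aspect / math.hypot(aspect, 1)  # horizontal size
    fov_deg = 2 * math.degrees(math.atan(width_in / (2 * dist_in)))
    return h_pixels / fov_deg

print(f'43" 4K TV at 30 in:      {ppd(43, 3840, 30):.0f} ppd')  # ~60
print(f'27" 4K monitor at 30 in: {ppd(27, 3840, 30):.0f} ppd')  # ~90
```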
 
I think it's because they are using a large screen rather than a monitor (my monitor is 27" @ 60Hz, I need a new one). The larger the screen, the more obvious it might be.
My TV is only 43" 4K @ 60fps though, and I sit right on top of it lol.
I'm all for devs using all sorts of reconstruction tech to save ms. Heck, I even thought Horizon looked good on PS4 Pro. Native 4K is a huge waste of resources.
 
What reduction? It is the PS4 Pro version with minimal changes, brute-forcing the framerate. Do you read what I post at all, or are you just trying to have a bit of a row :LOL:?
This is a thread about Cyberpunk running on various platforms, is it not? You complained about NPC density, and I mentioned that it should be worth it to get the best framerate. That is true for PS4 and PS5. The NPC density is higher on other platforms and the framerate is lower on average. I will admit the XSS looks to have a pretty great balance, which is nice to see.
 