
Quality and performance modes are annoying

Do you research graphics modes before playing?

  • Yes - DF, NXG, etc.

    Votes: 111 30.6%
  • No - I always put it in Fidelity mode

    Votes: 56 15.4%
  • No - I always put it in Performance mode

    Votes: 196 54.0%

  • Total voters
    363

ethomaz

Banned
What? What games? And how do they substantiate your claim?
Just look at any game analysis since Fidelity and Performance modes became a thing on consoles.
Very few exceptions succeed with acceptable performance in both modes.
 

Hunnybun

Member
Just look at any game analysis since Fidelity and Performance modes became a thing on consoles.
Very few exceptions succeed with acceptable performance in both modes.

As far as I'm aware, all next-gen games have seemed to run perfectly at 30fps in fidelity mode and 60fps in performance mode.

The situation seems to be more or less the exact opposite of what you're describing.
 

Clear

CliffyB's Cock Holster
Just look at any game analysis since Fidelity and Performance modes became a thing on consoles.
Very few exceptions succeed with acceptable performance in both modes.

Define "acceptable"? Maximum possible resolution at maximum possible refresh rate with zero deviation is not a realistic target unless your design is incredibly underspecced versus the device's capabilities. Which is itself a massive compromise!

Also, the degree and magnitude of load fluctuation are inherent to design and creative direction. User agency can cause massive spikes in CPU, memory, and/or GPU utilization, creating corner cases that are often insurmountable by optimization alone. Sometimes you have to take the occasional hit so as not to downgrade the overall experience.

Not having a perfect frame-rate is not a flaw; it's actually optimal in many cases.
 

ethomaz

Banned
As far as I'm aware, all next-gen games have seemed to run perfectly at 30fps in fidelity mode and 60fps in performance mode.

The situation seems to be more or less the exact opposite of what you're describing.
The vast majority of games didn't hit the target in either mode.
I'm not even sure where you are playing these "perfect" 30fps and 60fps modes lol
 

ethomaz

Banned
Define "acceptable"? Maximum possible resolution at maximum possible refresh rate with zero deviation is not a realistic target unless your design is incredibly underspecced versus the device's capabilities. Which is itself a massive compromise!

Also, the degree and magnitude of load fluctuation are inherent to design and creative direction. User agency can cause massive spikes in CPU, memory, and/or GPU utilization, creating corner cases that are often insurmountable by optimization alone. Sometimes you have to take the occasional hit so as not to downgrade the overall experience.

Not having a perfect frame-rate is not a flaw; it's actually optimal in many cases.
A solid framerate.
And yes, not having a perfect framerate is the biggest flaw in any game.

Why was Destiny considered to have the best gunplay of the last generation even at 30fps? Because its framerate was flawless.
 

lh032

I cry about Xbox and hate PlayStation.
I really hope there is a performance mode for next-gen games (not cross-gen).

I don't mind playing games without RT as long as it's 60fps.
 

RJMacready73

Simps for Amouranth
I'm not ignorant. I played on PC for 2 years with a 144Hz monitor. Some games I ran at 140fps (to stay inside the freesync range) and some games I willingly dropped down to 30 (like Tomb Raider). It's a preference thing.

Maybe it's all in my head, but 30fps looks "higher quality" to me than 60fps. Probably stems from arcade games. If a game is going for that "movie look", then I want it to play like a movie as well. I know y'all are probably throwing up in your mouths right now, but I really don't care.

Anyways, my point is, I know what high frame rates look and feel like, so I'm not blind or ignorant to it. I just prefer 30fps for its look and feel. It feels more premium to me... like the system is being pushed as hard as it can go.


What about people like me that use a 4K monitor and can easily tell the difference between 4K and 1440p? Ratchet and Clank looked like ass in 1440p. It was too soft.
Sit further back from your monitor? ;-) Seriously though, what we need then is 4 tiers:

1. Maximum Quality 4k/30/40 locked
2. Maximum Quality 1440p/60 locked - This would be my preferred choice, 60fps +/- a few frames and maximum graphics
3. Performance Mode Variable Res/60 locked
4. Performance Mode+ 1080/120fps locked

As mentioned, I want the best graphics. I want the incredible reflections in pools or glass and a draw distance out to fucking Jupiter, but I also like playing at 60fps. What I don't like is having to compromise one for the other, and IMO 4K is just too much for this generation's hardware to fully resolve with all the bells and whistles AND a locked 60fps. So devs need to stop chasing 4K, which makes for pretty screenshots, and give us the option of 1440p+, which looks grand on a 4K telly at normal seating distances.
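Purely as an illustration, here is that four-tier wish list written down as a preset table; the names and numbers are just the proposal above, not any real console API:

```python
# Hypothetical preset table for the four tiers proposed above.
# Every name, resolution, and target here is the poster's wish list, nothing official.
PRESETS = {
    "Maximum Quality":  {"resolution": "2160p",   "target_fps": 30,  "ray_tracing": True},
    "Quality 60":       {"resolution": "1440p",   "target_fps": 60,  "ray_tracing": True},
    "Performance":      {"resolution": "dynamic", "target_fps": 60,  "ray_tracing": False},
    "Performance+":     {"resolution": "1080p",   "target_fps": 120, "ray_tracing": False},
}

for name, p in PRESETS.items():
    rt = "RT on" if p["ray_tracing"] else "RT off"
    print(f"{name}: {p['resolution']} @ {p['target_fps']}fps, {rt}")
```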
 

Hunnybun

Member
The vast majority of games didn't hit the target in either mode.
I'm not even sure where you are playing these "perfect" 30fps and 60fps modes lol

I think literally all the games I've been interested in enough to watch analyses of have had excellent performance, even better than last generation.

I don't really know which ones you're referring to.

In any case, it's not as if games had perfect performance last gen anyway, so your point would hardly hold even then.
 
So why are you arguing for 1440p30 then?

Why not 1440p15, or 4k15?

After all, the content is the content, even at 15fps.
I see this comment a lot.

30fps is the minimum. You don't watch TV shows and movies at 15fps.

Since 30 is closer to 24, 30 becomes the cinematic look.

I'm not saying I won't play a game at 60fps, I just usually prefer maximum quality.
 

Thaedolus

Member
It's motion blur. Naughty Dog has excellent motion blur, which helps to blend frames.

I am disappointed in the results of the poll. So many next-gen owners don't even check out fidelity mode or the analysis?
Like… you people blindly accept any drawbacks the game has in order to reach 60fps?
It might be 1080p, or lower effects, or no ray tracing... like, you are not even interested in checking out what you are missing? On your next-gen console? You should go with PC if the only thing that matters is performance.
Maybe it turns out fidelity mode is also 60fps, or that it's really worth it?

It's probably the same group who often outright disables motion blur and film grain without any reason, because it was bad in GTA3 20 years ago. Motion blur can look great and make the game look smoother. Film grain can help to remove banding and posterisation.

Well, to each his own. I don't think choice is always a good thing. Maybe console optimisation will return in next-gen games.

Honestly, you can scale down numerous options, resolution, and other nonsense with diminishing returns and I likely won't even notice. You start dropping frames and it instantly looks worse to me. I can get used to it (I've played through BotW and TLOU2 on stock hardware), but it's not even close to an optimal experience.

60FPS should be the minimum. If you can’t hit that while tracing all your rays and such, just turn something else down
 

trikster40

Member
I always try the quality mode out to see if it's worth it.

Spider-Man, I played in quality mode. Looked amazing and the frame rate was stable.

Guardians of the Galaxy - quality was shit, switched to performance for the FPS.

Control - same shit.

If you can’t give me at least a rock solid 30FPS in quality mode, why bother?
 

Hunnybun

Member
I see this comment a lot.

30fps is the minimum. You don't watch TV shows and movies at 15fps.

Since 30 is closer to 24, 30 becomes the cinematic look.

I'm not saying I won't play a game at 60fps, I just usually prefer maximum quality.

Ok, but 20 is closer to 24 than 30, and it also divides evenly into 60, so it's a viable target.

So you obviously prefer 20fps to 30fps then? Given the huge increase in fidelity it implies, after all.
 
Ok, but 20 is closer to 24 than 30, and it also divides evenly into 60, so it's a viable target.

So you obviously prefer 20fps to 30fps then? Given the huge increase in fidelity it implies, after all.
Okay yeah you're right. 20 is closer. I guess 30 is a good mix between visuals and playability.

30 is probably the lowest you really want to go in interactive media such as a video game. Besides, we've been playing 30fps games for decades now and to say it's all of a sudden unplayable is kinda hilarious.

Developers are going to do what they always do and try to make the most visually impressive games they can possibly make and that usually means targeting 30fps unless it's a racing/first person shooter/competitive multiplayer game.

I would be very surprised if this generation stays in this cross-gen phase where games look mostly like last gen games at 4K/60.
 

ethomaz

Banned
Ok, but 20 is closer to 24 than 30, and it also divides evenly into 60, so it's a viable target.

So you obviously prefer 20fps to 30fps then? Given the huge increase in fidelity it implies, after all.
60 is not divisible by 24.

The standard 30fps was chosen because 60Hz (the most common refresh rate for NTSC TVs) is divisible by 30 (the TV just has to duplicate every frame to reach the refresh rate).
That's in the NTSC format, of course.

In PAL formats (50Hz) it was 25fps (which is why film runs at 25fps there)... in fact, for NTSC movies would ideally have been ~30fps rather than 24fps, except cinema uses its own format, so 24fps stayed the standard.

Technically speaking, to avoid any issues games should have had different framerates for:
NTSC (and other 60Hz-based formats) TVs: 30, 60, 120, etc.
PAL TVs: 25, 50, 100, etc.

That is why PAL users suffered a lot in the past: they had to accept games at 30/60fps on 50Hz TVs when those games should obviously have run at 25/50fps on those sets.

PS. Today's TVs don't have different refresh-rate standards, so that NTSC/PAL issue is a thing of the past.
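To make the divisibility rule concrete, here is a minimal sketch in plain Python (purely illustrative) of which framerates map cleanly onto a given refresh rate:

```python
# The rule described above: a framerate is judder-free on a fixed-refresh
# display only if the refresh rate is an integer multiple of it, so every
# frame is held on screen for the same number of refresh cycles.

def judder_free_framerates(refresh_hz, candidates=(20, 24, 25, 30, 40, 50, 60, 120)):
    """Return the candidate framerates that divide evenly into refresh_hz."""
    return [fps for fps in candidates if fps <= refresh_hz and refresh_hz % fps == 0]

for hz in (50, 60, 120):
    print(hz, "Hz ->", judder_free_framerates(hz))
# 50 Hz -> [25, 50]                     (the PAL multiples)
# 60 Hz -> [20, 30, 60]                 (24fps film doesn't fit, hence pulldown)
# 120 Hz -> [20, 24, 30, 40, 60, 120]   (why 40fps modes need a 120Hz TV)
```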
 

Hunnybun

Member
Okay yeah you're right. 20 is closer. I guess 30 is a good mix between visuals and playability.

30 is probably the lowest you really want to go in interactive media such as a video game. Besides, we've been playing 30fps games for decades now and to say it's all of a sudden unplayable is kinda hilarious.

Developers are going to do what they always do and try to make the most visually impressive games they can possibly make and that usually means targeting 30fps unless it's a racing/first person shooter/competitive multiplayer game.

I would be very surprised if this generation stays in this cross-gen phase where games look mostly like last gen games at 4K/60.

I think there's almost no data to draw on. Covid has meant that virtually no next gen games have even been revealed, let alone released.

The very few next-gen-only games to release seem to target 4K at 30fps. I really don't know why that should be; these things are always somewhat arbitrary. Last-gen games could have targeted 720p and gone for richer environments etc, but it seems like the preference was the native resolution of the dominant TV standard at the time, namely 1080p. The same seemed true of 720p in the gen before, and now of 4K in this one.

So that does seem to be the dominant factor in developers' minds - targeting screens' native resolutions.

Now, personally I don't think this makes much sense nowadays, with such high-res displays. For the best fidelity, the right compromise would almost certainly be to render at about 0.5 of native 4K.

But as long as developers continue to prioritise 4k30, then that leaves something like 1440p60 as an easy alternative and my personal preference.
 
But as long as developers continue to prioritise 4k30, then that leaves something like 1440p60 as an easy alternative and my personal preference.
It's not always as simple as just dropping down to 1440p to get 60fps. Maybe it works in some cases, but not all.

Also, ray tracing takes a big chunk out of the rendering budget. Oftentimes just disabling ray tracing yields a lot more frames, so maybe that's how they will deal with graphics/performance modes.

However, what happens when developers base their entire game around ray tracing and don't want to be bothered baking GI or authoring cube maps for reflections, extra work just to be able to turn ray tracing off? It might become something that's simply built into the engine, with no way to turn it off without a ton of work to make the game still look good. I'm only speculating, though.
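To spell out that speculation, here is a toy sketch (all class and method names hypothetical) of why shipping a non-RT performance mode means maintaining a second lighting path:

```python
# Toy sketch of the trade-off speculated about above: ray tracing can only be
# toggled off if a fallback path exists, and each fallback (baked lightmaps,
# authored cubemaps) is extra authoring and engineering work.
class Renderer:
    def __init__(self, ray_tracing):
        self.ray_tracing = ray_tracing

    def global_illumination(self):
        if self.ray_tracing:
            return "trace GI rays per frame"      # fidelity mode
        return "sample baked lightmaps/probes"    # requires an offline baking pass

    def reflections(self):
        if self.ray_tracing:
            return "trace reflection rays"
        return "sample prefiltered cubemaps"      # requires authored capture points

print(Renderer(ray_tracing=False).global_illumination())
```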
 

Hunnybun

Member
It's not always as simple as just dropping down to 1440p to get 60fps. Maybe it works in some cases, but not all.

Also, ray tracing takes a big chunk out of the rendering budget. Oftentimes just disabling ray tracing yields a lot more frames, so maybe that's how they will deal with graphics/performance modes.

However, what happens when developers base their entire game around ray tracing and don't want to be bothered baking GI or authoring cube maps for reflections, extra work just to be able to turn ray tracing off? It might become something that's simply built into the engine, with no way to turn it off without a ton of work to make the game still look good. I'm only speculating, though.

I'm not saying that it's simple in an absolute sense, or that the work required is always the same regardless of the game, just that in general it's a relatively achievable trade off. I think that's been shown pretty clearly so far this gen tbh.

It may be that in time some games can't be scaled down, but I think trading resolution for frames will generally continue to be a viable solution. That's always seemed to be the case on PC, after all.
 
I'm not saying that it's simple in an absolute sense, or that the work required is always the same regardless of the game, just that in general it's a relatively achievable trade off. I think that's been shown pretty clearly so far this gen tbh.

It may be that in time some games can't be scaled down, but I think trading resolution for frames will generally continue to be a viable solution. That's always seemed to be the case on PC, after all.
You know what else seems to be the case on PC, but seemingly difficult on consoles? Anisotropic filtering. 16x AF has never caused issues for me on PC; maybe it cost one frame. But on consoles developers often use 2x or 4x AF... it makes the game look like ASS in my opinion. Gosh dangit, developers! Use 16x AF already! All these fancy effects and the ground looks like PS2.
 
60 is not divisible by 24.

The standard 30fps was chosen because 60Hz (the most common refresh rate for NTSC TVs) is divisible by 30 (the TV just has to duplicate every frame to reach the refresh rate).
That's in the NTSC format, of course.

In PAL formats (50Hz) it was 25fps (which is why film runs at 25fps there)... in fact, for NTSC movies would ideally have been ~30fps rather than 24fps, except cinema uses its own format, so 24fps stayed the standard.

Technically speaking, to avoid any issues games should have had different framerates for:
NTSC (and other 60Hz-based formats) TVs: 30, 60, 120, etc.
PAL TVs: 25, 50, 100, etc.

That is why PAL users suffered a lot in the past: they had to accept games at 30/60fps on 50Hz TVs when those games should obviously have run at 25/50fps on those sets.

PS. Today's TVs don't have different refresh-rate standards, so that NTSC/PAL issue is a thing of the past.
20 goes into 60 evenly. Each frame would be displayed 3 times.
 
Anyways.... 30fps is a good mix of graphics and playability. It's the absolute MINIMUM frame rate. Whether you like it or not is up to you, but that's the minimum.

I won't refuse to play a game if it's 60fps, but if there's a higher fidelity mode, I'm all for it even if it means dropping to 30fps.
 

Alexios

Cores, shaders and BIOS oh my!
More options are better. There could be a line of text explaining each one to noobs, which would solve all your issues.

Halo Infinite should have more modes to choose from, going by the DF video, like an actual locked 60fps mode with as good graphics/res as can be had, rather than a choice between a dynamic-res fidelity mode "aiming for" 60 and a dynamic-res performance mode "aiming for" 120.
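For context, a dynamic-res mode "aiming for" a target behaves roughly like this toy controller; the thresholds and step size are made-up illustrative numbers, and real engines use GPU timing queries and smarter heuristics:

```python
# Toy dynamic-resolution controller: lower the render scale when frame
# times blow the budget, raise it again when there is clear headroom.
def adjust_render_scale(scale, frame_ms, target_ms, lo=0.5, hi=1.0, step=0.05):
    if frame_ms > target_ms * 1.02:      # over budget -> drop resolution
        scale -= step
    elif frame_ms < target_ms * 0.90:    # headroom -> raise resolution
        scale += step
    return max(lo, min(hi, scale))

scale = 1.0
for frame_ms in (18.0, 17.5, 16.0, 14.0, 13.9):  # simulated GPU frame times
    scale = adjust_render_scale(scale, frame_ms, target_ms=1000 / 60)
    print(f"{frame_ms:.1f} ms -> render scale {scale:.2f}")
```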
 

tygertrip

Member
I'd prefer it if devs just picked a res/framerate target and didn't give any options. At least then we'd get properly optimized games.

This current trend of giving options makes sense for cross-gen games built for last-gen consoles, which by definition leave performance headroom to spare. Next-gen games shouldn't be looking to provide these options, because if they are, by definition they are not maxing out the hardware or properly optimizing for any of the optional profiles offered in-game.

I insist that 60fps isn't necessary for every game, and devs are the ones best placed to decide what fits their vision for the game. So devs should decide on a render target, build the game, optimize, properly test and QA their game, and call it a day. I don't want options if those options mean I get a largely inferior, less optimized experience.
They are consoles, of course they are inferior!
 

RJMacready73

Simps for Amouranth
I played through Horizon & TLOU2 at 30fps and they played smooth af, no complaints. Same with Ratchet & Clank at 40fps: even with the insane graphics and the sheer amount of effects and debris littering the screen, the game never felt anything other than smooth. Maybe it's just me, but between R&C at 40fps and the Uncharted remasters & GOW at 60fps, I honestly couldn't tell the difference.
 

tygertrip

Member
I agree 100% with this.

I haven't had any issues picking between different modes and it's been on a game by game basis. I think the options are fantastic. I preferred Ratchet in Fidelity mode. I preferred Spider-Man in performance mode. They were both rock solid in their respective modes. And that goes for other games that I tried that had those setups.

Seems like a non-issue to me.
It's because some people just can't pick a mode and chill... Because THEY are uptight, they'd rather not have the options and deprive others of them too.
 

tygertrip

Member
Meh, 60 is overrated. It literally doesn't make a game any more enjoyable. The content is the content no matter how fast it's running.

Similarly, a good song listened to on cheap headphones is still a good song. Using expensive headphones isn't going to make the song suddenly completely different. It's still the same song. A crappy song isn't going to suddenly turn into a good song with expensive equipment either.
For certain genres, this is a ridiculous analogy. 60 or 120fps LITERALLY adds function to certain games. Not all, no, but many.
 

tygertrip

Member
I think the options suck ass. I prefer my console games to have only one mode, be it targeting high frame rate or fidelity depending on the game, and that decision shouldn't be mine but the dev's, who I would expect took performance into consideration when designing the gameplay.

Never mind that I mostly play on PC lately, so I can just crank most stuff to max and get 60+ framerates since all games are kinda cross-gen for now. But it's not that I prefer the PC platform precisely because of performance concerns; it's partly that not having to think much about performance is a perk I value that got lost on consoles.

That is demonstrably proven to be wrong; there is a whole field dedicated to it, and it is used against consumers. Search "the paradox of choice" if you care enough.
You can't prove a subjective opinion.
 

ethomaz

Banned
It's because some people just can't pick a mode and chill... Because THEY are uptight, they'd rather not have the options and deprive others of them too.
You are not actually getting two options.

The time and effort the dev would give to a single mode is split across two modes, which makes neither ideal.

So the people complaining would prefer devs to focus on one mode instead of giving options.

That probably gives the best result for both consumers and developers… otherwise you have consumers not getting what they want, or devs not having the time to give consumers what they want.
 

rofif

Can’t Git Gud
I voted that I always choose performance mode, because I do. But the poll didn't allow me to say that I also always research the games online. But not to see which mode to use, just to determine whether to buy the game. E.g. the coverage of Guardians of the Galaxy just dissuaded me from buying the game, cos I'm not playing at 30fps regardless of how much better it looks than the 60fps mode. I'd just rather not play it at all.
Hey, I hoped the first option would be that inclusive.
You research / do your own testing to make your choice :)
 

tygertrip

Member
I think the thread makes it pretty evident from the impressions. You have more options, yet more people are dissatisfied with how the games run.
The options changed expectations: people expect fidelity perfection from one option and performance perfection from the other, and get disappointed by both, obviously.
Some people are too uptight, is all.
 

tygertrip

Member
You are not actually getting two options.

The time and effort the dev would give to a single mode is split across two modes, which makes neither ideal.

So the people complaining would prefer devs to focus on one mode instead of giving options.

That probably gives the best result for both consumers and developers… otherwise you have consumers not getting what they want, or devs not having the time to give consumers what they want.
Naaa, options are better. It's just a few OCD types complaining. Most people are fine with the default and won't touch the options, but it's good to have them. The types that get their bussy blasted because they can't decide which mode to run in... they'll never be happy. Even if the game only has one mode, they'll just get their panties in a twist over the TV/monitor options. "NeoGAF, who here thinks TVs should remove dimming controls?? I just can't decide!!!" LMAO
 

tygertrip

Member
Fear of missing out

If I choose 4K/30, then I'm missing out on fluidness

If I choose 1440p/60, then I'm missing out on effects, resolution, extra grass, ray tracing etc.
Good god, how unaware must one be of themselves to be bothered by such trivial psychological phenomena. "Duhhh muh FOMO!!!"
 

rofif

Can’t Git Gud
I really hope there is a performance mode for next-gen games (not cross-gen).

I don't mind playing games without RT as long as it's 60fps.
Do you remember how a game looked, or how it ran, years later? You don't think about N64 games running at 20fps. At least I don't.
I don't remember Turok running at 25fps on PC, Doom running at 25fps, or many 360 games. I remember playing these games and how good the graphics were at the time.
I have an easier time getting used to fps... meaning: when I had a 240Hz monitor, after a while it started to feel like a "standard" framerate. The "wow, 240fps smoothness" was quickly gone and it became the new norm.
I can tell 30 vs 60 vs 144 vs 240 perfectly fine. Even 144 felt like crap after gaming at 240Hz... It's all about getting used to it.
But I've noticed that I personally appreciate nice graphics, with good anti-aliasing and modern effects, more.
I was more satisfied with a 27" 4K 60Hz monitor for Dark Souls 3 in amazing 4K than with a 240Hz monitor for other games.

Now a 4K120 OLED is fantastic. The biggest upgrade over the 27" 4K 60Hz monitor is how smooth the desktop is. But for gaming, I don't mind a good 30fps as long as I am stunned by the graphics and the framerate is stable and responsive. 30fps can be terrible too.
 

tygertrip

Member
If you can't pick between two options, I can't imagine how you manage to play games where you have to level up and select skills or choose between different guns. This is an amazing complaint.
Tell me about it! Reading this thread, I almost fell over backwards from my eyes rolling up so fast!! Good god almighty, such whiny, childish nonsense. Of course, I'm no better, having not been able to keep myself from responding to this OCD whining. At least I'm aware I'm being ridiculous though, lol.
 

rofif

Can’t Git Gud
Good god, how unaware must one be of themselves to be bothered by such trivial psychological phenomena. "Duhhh muh FOMO!!!"
Christ... c'mon man, take it easy.
You really must be a fun person...
You don't choose psychological effects.
 

tygertrip

Member
Do you remember how a game looked, or how it ran, years later? You don't think about N64 games running at 20fps. At least I don't.
I don't remember Turok running at 25fps on PC, Doom running at 25fps, or many 360 games. I remember playing these games and how good the graphics were at the time.
I have an easier time getting used to fps... meaning: when I had a 240Hz monitor, after a while it started to feel like a "standard" framerate. The "wow, 240fps smoothness" was quickly gone and it became the new norm.
I can tell 30 vs 60 vs 144 vs 240 perfectly fine. Even 144 felt like crap after gaming at 240Hz... It's all about getting used to it.
But I've noticed that I personally appreciate nice graphics, with good anti-aliasing and modern effects, more.
I was more satisfied with a 27" 4K 60Hz monitor for Dark Souls 3 in amazing 4K than with a 240Hz monitor for other games.

Now a 4K120 OLED is fantastic. The biggest upgrade over the 27" 4K 60Hz monitor is how smooth the desktop is. But for gaming, I don't mind a good 30fps as long as I am stunned by the graphics and the framerate is stable and responsive. 30fps can be terrible too.
Buddy, I started gaming with a home Pong system in the '70s, and while I agree that stuff quickly becomes "the new normal", that doesn't mean I have to like 30fps. God, imagine playing Virtua Fighter 2 at 30fps, lol. Jeez, some prefer resolution, some prefer framerate. There is no objective "perfect optimization".
 

tygertrip

Member
Christ... c'mon man, take it easy.
You really must be a fun person...
You don't choose psychological effects.
Says the one that gets bothered by the little details. You're the one that can't have fun with a game because you can't decide which mode to run it in. Hilarious!😂😂
 

rofif

Can’t Git Gud
Buddy, I started gaming with a home Pong system in the '70s, and while I agree that stuff quickly becomes "the new normal", that doesn't mean I have to like 30fps. God, imagine playing Virtua Fighter 2 at 30fps, lol. Jeez, some prefer resolution, some prefer framerate. There is no objective "perfect optimization".
"buddy", you have some problems...
I am gaming since 90s. This doesn't mean anything.
You clearly get way too worked up over this and have a huge issue accepting someone else's opinion...
Just leave with this attitude. This brings nothing to the discussion if you are just laughing at peoples opinion. I am trying hard to not get worked up by you... so I you can be happy for that.

Says the one that gets bothered by the little details. You're the one that can't have fun with a game because you can't decide which mode to run it in. Hilarious!😂😂
That's not the topic of our discussion here. It's not about that at all...
 

Melubas

Member
Considering the alternative to having graphics modes would most likely be pretty much all games running at 30fps, I'm all for it. Having been a PC gamer for years, I only recently switched to playing some of my games on the PS5, mainly due to 60fps finally being an option. 30fps is unplayable to me. I tried it in Miles Morales, Dark Souls, Ratchet & Clank, etc., and it's sooooo choppy in quality mode.

I don't find it annoying that there are options, however. I'm used to spending hours going through DF videos and optimizing on my PC, and I find it fun (in reality I usually just lower shadows to medium and drop anti-aliasing to get a stable 60, haha).
 

Neo_Geo

Banned
They should really allow toggles, with an indicator to the right of each toggle telling you what it will do to either enhance performance or enhance visual fidelity at the cost of performance. Only two options sucks, because people like different aspects of the overall IQ, and the limited option set may turn too many features off, or vice versa.

E.g., in any game I play on PC, things like AF/AO/textures take a backseat to something like particle density/quality, since the latter makes a larger impact on overall IQ to my eyes and preferences.
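A quick sketch of that toggle idea (hypothetical feature names and costs, not any real console UI), with the indicator shown beside each entry:

```python
# Hypothetical per-feature toggle list, each entry carrying the
# cost/benefit indicator described above.
from dataclasses import dataclass

@dataclass
class Toggle:
    name: str
    enabled: bool
    fidelity_gain: str   # what turning it on buys you
    perf_cost: str       # what it costs, shown beside the toggle

SETTINGS = [
    Toggle("Ray-traced reflections", True, "accurate reflections", "high GPU cost"),
    Toggle("16x anisotropic filtering", True, "sharp oblique textures", "negligible"),
    Toggle("High particle density", False, "denser effects", "moderate GPU cost"),
]

for t in SETTINGS:
    state = "ON " if t.enabled else "OFF"
    print(f"[{state}] {t.name:26} +{t.fidelity_gain:22} ({t.perf_cost})")
```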
 

8BiTw0LF

Banned
Performance modes can go as laughably low as 1080p (IMO a joke in 2021) and sometimes still not hold 60fps, or go for 120.

Funny how wrong you and the fidelity guys were about resolution. Effects are the new scale; just look at Returnal, a masterpiece of effects. 1080p FTW!
 

rofif

Can’t Git Gud



Funny how wrong you and the fidelity guys were about resolution. Effects are the new scale; just look at Returnal, a masterpiece of effects. 1080p FTW!

The Matrix demo is 1440p with TSR. It looks nothing like 1080p on PS5.
Returnal also looks way sharper than 1080p; it's not just 1080p.
And these games have graphics to show, not a blurry mess like bare 1080p with TAA in some performance modes.
 

8BiTw0LF

Banned
The Matrix demo is 1440p with TSR. It looks nothing like 1080p on PS5.
Returnal also looks way sharper than 1080p; it's not just 1080p.
And these games have graphics to show
Do you think NXG is lying? It's 1080p upsampled to 1440p :messenger_tears_of_joy:

Returnal is 1080p upsampled to 1440p and upscaled to 4K.
 

rofif

Can’t Git Gud
Do you think NXG is lying? It's 1080p upsampled to 1440p :messenger_tears_of_joy:
You listen to him? He says it looks almost 4K on PS5.
The Series S is 1080p.
Anyway, run this on PS5 and toggle the pixel setting. You can see Epic's temporal solution doing wonders.
 

8BiTw0LF

Banned
You listen to him? He says it looks almost 4K on PS5.
The Series S is 1080p.
Anyway, run this on PS5 and toggle the pixel setting. You can see Epic's temporal solution doing wonders.
The Series S is ~720p! Yes, I believe NXG, because he can actually measure it with his tools. What your PS5 or TV tells you is just the output, and that means jack shit.
 