
Why Higher Refresh Rates Matter

Topher

Gold Member
My first sentence was "Sure, higher refresh rates are better. This should be a consensus."

But the reality is that game devs have to develop for consoles. So it's all about them.

If you have a PC, surely you'll go for higher framerates when possible, that's a given

And this video shows why many go for higher frame rates whether it is PC or console, given the choice. But the fact of the matter is, for this topic, consoles leave the conversation entirely past 120hz, so no, it isn't all about consoles in this thread.
 

Bojji

Member
With my LG OLED, the UFO test shows quite a big difference and 120hz looks amazing. But for games, most of the difference between 60 and 120 comes from input lag, and with tech like Reflex, 60fps is very good.
 

Gaiff

SBI’s Resident Gaslighter
Oh, but it does

What was the last relevant PC exclusive?
Nice non-sequitur. The statement and question it posits have nothing to do with one another. PC having no "relevant" exclusives means the world revolves around consoles? By relevant I suppose you mean AAA derivative trash? MMOs, RTS, simulators, and every other genre probably don't matter in your world where gaming revolves around consoles.
 

Hugare

Member
Consoles are the lowest common denominator. For some games they are the base, but not for others.
Curious to think you picked 2 games that are first and foremost PC tech games.
Cyberpunk, from launch, was always better on PC, by a huge margin. Not only did it have fewer bugs and performance issues, but it also looked better.
And it has the most advanced PC tech, such as DLSS 3.5, RTX, XeSS and soon FSR3.
Alan Wake is using PC-specific tech that the PS5 doesn't have, Mesh Shaders. And then there is the DLSS 3.5 and RTX tech.
These are 2 games that were made with the high specs of PC in mind, and then cut back to fit into consoles.
You are talking about 2 games that have the biggest sponsorship from Nvidia.
Cyberpunk was 100% PC first. You should read the release reviews and user reviews from that time ...

I picked Cyberpunk 'cause despite running better on PC (obviously) and having unique features, it was designed to be possible to run on last gen machines still

You really think that the game wouldnt be wildly different if it was PC only?

I'm talking about something like Crysis was back in 2007. You could downscale the game all you want, it wouldnt be possible to run on the consoles of that time.

Same for Alan Wake II. Xbox supports and uses Mesh Shaders in AW II. PS5 doesnt have it, but uses Primitive Shaders instead.

DLSS and RTX are nice, but they are hardly important for the game overall design. Just bells and whistles.

Again: you dont see games like Crysis that you could cutback all you want and it would still only be possible on PC

This "the game was designed for PC first and foremost" talk is only present 'cause devs want to sell PC copies and are in bed with GPU manufacturers

Nice non-sequitur. The statement and question it posits have nothing to do with one another. PC having no "relevant" exclusives means the world revolves around consoles? By relevant I suppose you mean AAA derivative trash? MMOs, RTS, simulators, and every other genre probably don't matter in your world where gaming revolves around consoles.

By relevant I mean games that sell

Funny that you mentioned genres but not games to build your argument
 
Last edited:

winjer

Gold Member
I picked Cyberpunk 'cause despite running better on PC (obviously) and having unique features, it was designed to be possible to run on last gen machines still

You really think that the game wouldnt be wildly different if it was PC only?

I'm talking about something like Crysis was back in 2007. You could downscale the game all you want, it wouldnt be possible to run on the consoles of that time.

Same for Alan Wake II. Xbox supports and uses Mesh Shaders in AW II. PS5 doesnt have it, but uses Primitive Shaders instead.

DLSS and RTX are nice, but they are hardly important for the game overall design. Just bells and whistles.

Again: you dont see games like Crysis that you could cutback all you want and it would still only be possible on PC

This "the game was designed for PC first and foremost" talk is only present 'cause devs want to sell PC copies and are in bed with GPU manufacturers

Have you noticed how many games run poorly on consoles? Some drop frame rate and others go to very low resolution.
It's like these games and engines were designed for much stronger hardware than consoles.
 

Hugare

Member
Have you noticed how many games run poorly on consoles? Some drop frame rate and others go to very low resolution.
It's like these games and engines were designed for much stronger hardware than consoles.
And yet again I'll mention Crysis, the last meaningful PC game that wouldnt run on consoles of that time even with cutbacks

And have you noticed that despite running poorly on consoles, they still use tech that fits the scope of what consoles can do? Even running poorly?

It's almost like devs are making games while having in mind that they are going to make the game work on consoles or something

And just so you know, I would love seeing devs make something like Crysis again for PC.

But they wouldnt be crazy to drop a huge user base with consoles. That's leaving too much money on the table.
 
Last edited:

winjer

Gold Member
And yet again I'll mention Crysis, the last meaningful PC game that wouldnt run on consoles of that time even with cutbacks

And have you noticed that despite running poorly on consoles, they still use tech that fits the scope of what consoles can do? Even running poorly?

It's almost like devs are making games while having in mind that they are going to make the game work on consoles or something

That's because you only consider Crysis as the last great PC title, while ignoring that games like CP2077 and Alan Wake are amazing showcases for PC tech.
 
Honest question: is there a way for 60+ fps content to be displayed so that the viewer doesn’t perceive the infamous soap-opera effect?

I gave away my beloved plasma TV 3 years ago for a series of reasons, and since then my vision never fully adapted to the poor movement resolution of LED-based TVs. 30fps CG scenes and in-engine cutscenes in games are invariably a pain to watch due to sample-and-hold, and TV and movies absolutely require some motion interpolation if I want to watch more than a few minutes of footage.
On the other hand though, on modern screens the soap-opera effect is very obvious and, to use a very abused expression, “not cinematic”. Things feel too smooth, and it seems everything is moving too fast. That is perfect for actual gaming, but in cutscenes it’s a bit jarring. Would there be some way to make it “feel right” while still maintaining the smoothness?
You have such a strange take on this because you're complaining about both framerates.

The Ghost of Tsushima PS5 re-release has 30fps cutscenes which look like shit coming right from 60fps gameplay. The majority of them are 60fps in the PS4 version running on a PS5 though. They capped them at 30fps in the re-release so that they could add in the real time weather as well as the Japanese lip sync.

But seeing a game in 60fps doesn't give me the soap opera effect that watching a live action movie does in 60fps such as "Billy Lynn's Long Halftime Walk" or "Gemini Man".

Personally I am all for 4K, 60fps films as they actually present a far superior clarity in picture quality than a 24fps film.

Gemini Man is probably the clearest movie I've ever seen, image wise next to Avatar 2 which literally looks like a fish tank on an OLED screen.
 
Last edited:

64bitmodels

Reverse groomer.
Cyberpunk and Alan Wake II are currently the best looking PC games on the market, and they were made with consoles in mind first and foremost.
....Cyberpunk.... could barely run properly on PS4 and Xbox One. The baseline, back in 2020. That game was not made with consoles first and foremost.
 
Last edited:

Hugare

Member
That's because you only consider Crysis as the last great PC title, while ignoring that games like CP2077 and Alan Wake are amazing showcases for PC tech.
Oh man ...

I've already mentioned those 2 cases.

Cyberpunk was built to fit consoles. Last gen consoles can "run" it on 2013 laptop CPUs. It runs poorly, but it runs.

Same for Alan Wake. It uses Mesh Shaders, but Xbox can run Mesh Shaders. PS5 uses Primitive Shaders, which have the same purpose.

They look and run better on PC, but they are console games beefed up on PC

....Cyberpunk.... literally couldn't run on PS4 and Xbox One. The baseline, back in 2020. That game was not made with consoles first and foremost.

It literally could. It played like ass, but I finished the game on my PS4. I wasnt dreaming.

Make Alan Wake II run on the Switch. Now that is an example of what literally can't be achieved.
 

winjer

Gold Member
Oh man ...

I've already mentioned those 2 cases.

Cyberpunk was built to fit consoles. Last gen consoles can "run" it on 2013 laptop CPUs. It runs poorly, but it runs.

Same for Alan Wake. It uses Mesh Shaders, but Xbox can run Mesh Shaders. PS5 uses Primitive Shaders, which have the same purpose.

They look and run better on PC, but they are console games beefed up on PC



It literally could. It played like ass, but I finished the game on my PS4. I wasnt dreaming.

Make Alan Wake II run on the Switch. Now that is an example of what literally can't be achieved.

What you played on the PS4 was a very cut-down version of a much better PC game.
 

64bitmodels

Reverse groomer.
Honestly, the real question is whether any of you guys think 60hz only screens will survive in modern display tech, especially at the rapid pace of improvement we've got with OLEDs and MiniLEDs these days. Many new TVs, LCD screens, phones.... they're all coming out with 90hz or better screens. Steam Deck OLED was 90hz too.

I feel like by 2030, 60hz only screens will be relegated to ultra budget. It's already the case for monitors, you can pick up a 1080p 120hz for super cheap.
 
Last edited:

hinch7

Member
One of the reasons why PCs have been my go-to for gaming, alongside the choice of controls and peripherals. Just so many options to pick from and play around with.

Can't wait until we get the next generation of 240Hz Samsung-panel QD-OLEDs in 21:9 monitors.
 
Last edited:

HL3.exe

Member
Agree, although input latency can be highly mitigated at any frame rate if your PC is properly configured (synced with your display).

Things like: lowering frame pre-render buffer, Nvidia Reflex, using RTSS properly, etc.
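
To give a rough feel for why those settings matter: each frame sitting in the pre-render queue adds roughly one frame time of latency, so trimming the queue buys you far more at 30 or 60fps than at 120+. A back-of-the-envelope sketch (purely illustrative numbers, not anything official from Reflex or RTSS):

```python
# Rough back-of-the-envelope estimate of how the render queue adds latency.
# Illustrative numbers only, not measurements from Reflex or RTSS.

def queue_latency_ms(fps: float, queued_frames: int) -> float:
    """Extra latency contributed by frames waiting in the pre-render queue."""
    frame_time_ms = 1000.0 / fps
    return queued_frames * frame_time_ms

for fps in (30, 60, 120):
    deep = queue_latency_ms(fps, queued_frames=3)     # a typical default-ish queue depth
    trimmed = queue_latency_ms(fps, queued_frames=1)  # queue kept short (Reflex / driver setting)
    print(f"{fps:>3} fps: ~{deep:.0f} ms of queue latency vs ~{trimmed:.0f} ms trimmed")
```

Same settings either way, but the payoff shrinks as the frame rate climbs.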
 
Last edited:
Its the LG 45" ultrawide and I overpaid to get it at launch at $2500 and here at Black Friday they were down to around $900 at some sites
I find it wise never to look at the sales when your previous purchase is down by a considerable amount.

Sad Matthew Mcconaughey GIF by Legendary Entertainment
 

Deerock71

Member
"How noticeable these jumps are depends on your sensitivity to smoothness."

A direct quote from the video. Some of us clearly need Visine, mucous reducers, and Kleenex more than others around here. I get it, your eyes need the safe spaces of 540hz. To each their own.
 

Gaiff

SBI’s Resident Gaslighter
By relevant I mean games that sell
Oh, boy. So since Xbox no longer has exclusives, I suppose it's no longer relevant? And since the way you framed your original argument clearly excluded Nintendo and that most current AAA games skip the platform, I guess they're not relevant either. So by consoles, you really meant Playstation, which had a whopping 2 exclusives this year, including a single 1st party. Is that what you meant?
Funny that you mentioned genres but not games to build your argument
Why would I name games when I had no idea what you meant by "relevant"? Relevant in this industry means the games that make the most money and, hint, they're not exclusives.

And yet again I'll mention Crysis, the last meaningful PC game that wouldnt run on consoles of that time even with cutbacks

And have you noticed that despite running poorly on consoles, they still use tech that fits the scope of what consoles can do? Even running poorly?

It's almost like devs are making games while having in mind that they are going to make the game work on consoles or something

And just so you know, I would love seeing devs make something like Crysis again for PC.

But they wouldnt be crazy to drop a huge user base with consoles. That's leaving too much money on the table.
This stupid shit again. Consoles now have hardware comparable to an RTX 2070S/Ryzen 3600 which is about where the average gaming PC sits. If you want something that can't run on consoles, not only would you erase 100% of the console market but you'd also erase like 85% of the PC market. This isn't 2005. Who the fuck wants a game that runs at 1080p/30fps on a 4090? Consoles are no longer machines with incredibly unique and exotic configurations like the PS3. They're much closer to PCs. Damn near everything that can run on a 4090 can also run on a PS5. They both have modern hardware. None of that pixel shader 3.0 stuff that can't run on your GPU only supporting pixel shader 2.0.

It's not only that developers take consoles into consideration but they also take PCs into consideration and the vast majority of them don't sport a 4090 so those statements are nonsensical no matter how you look at it. Hell, most of the time, the minimum specs of modern AAA games consider the GTX 1060, an old mid-ranger from 6 years ago. Does that mean the developers don't care about consoles with twice the GPU horsepower? Or are you smart enough to figure out that they almost always take the route that'll lead to the most profitable outcomes? Alan Wake II as far as I'm aware is the first or second game that literally is designed 100% with current-gen consoles and modern PCs only and those consoles have been out for almost 3 years.
 
Last edited:

Hugare

Member
Honestly, the real question is whether any of you guys think 60hz only screens will survive in modern display tech, especially at the rapid pace of improvement we've got with OLEDs and MiniLEDs these days. Many new TVs, LCD screens, phones.... they're all coming out with 90hz or better screens. Steam Deck OLED was 90hz too.

I feel like by 2030, 60hz only screens will be relegated to ultra budget.
90hz is cool, but 120hz lets you get 40 fps

With 90 you can get 45fps, and it's fine, but 5 FPS can be a lot to ask from some hardware

120hz would be the perfect spot. 60, 30 and 40 FPS options.
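
For anyone wondering where those numbers come from: on a fixed-refresh display you only get even frame pacing when the cap divides the refresh rate, so each frame is shown for a whole number of refreshes. A quick sketch of the arithmetic (illustrative only; VRR sidesteps all of this):

```python
# Which frame-rate caps give even frame pacing (a whole number of refreshes per frame)
# on a fixed-refresh display. Simple divisor arithmetic, illustrative only.

def even_paced_caps(refresh_hz: int, min_fps: int = 30) -> list[int]:
    """Caps that divide the refresh rate evenly, e.g. 120 Hz -> [120, 60, 40, 30]."""
    return [refresh_hz // n for n in range(1, refresh_hz // min_fps + 1)
            if refresh_hz % n == 0]

for hz in (60, 90, 120, 144, 240):
    print(f"{hz} Hz -> {even_paced_caps(hz)}")
```

So a 90hz panel only gives you 90/45/30 with even pacing, while 120hz adds the 40fps option on top of 60 and 30.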
 
Last edited:

Gaiff

SBI’s Resident Gaslighter
It literally could. It played like ass
Yeah, so just like Crysis on the PS360? Cyberpunk dropped to the single digits upon release and sometimes completely locked up on consoles. It was so bad that CDPR went out of their way to hide it from the press and you're telling us with a straight face that it was "designed" to run on PS4 when all CDPR did was desperately cram something into machines that clearly couldn't handle it.

The Witcher 3 also runs on the Switch.
 

Topher

Gold Member
"How noticeable these jumps are depends on your sensitivity to smoothness."

A direct quote from the video. Some of us clearly need more visene, mucous reducers, and kleenex more than others around here. I get it, your eyes need the safe spaces of 540hz. To each their own.

I caught that quote as well. I've been saying for a while that I think this varies from person to person, but frankly that is something I've been pointing out to these people who say there should not be any other mode than 30fps, which I think is ridiculous.
 

64bitmodels

Reverse groomer.
90hz is cool, but 120hz lets you get 40 fps

With 90 you can get 45fps, and it's fine, but 5 FPS can be a lot to ask from some hardware

120hz would be the perfect spot. 60, 30 and 40 FPS options.
Looking at those motion clarity screenshots I think we should aim a lot higher than just 120hz if we want truly amazing visual clarity. For the time being it's fine but 360hz is the endgame here
 

mrcroket

Member
Oh, boy. So since Xbox no longer has exclusives, I suppose it's no longer relevant? And since the way you framed your original argument clearly excluded Nintendo and that most current AAA games skip the platform, I guess they're not relevant either. So by consoles, you really meant Playstation, which had a whopping 2 exclusives this year, including a single 1st party. Is that what you meant?

Why would I name games when I had no idea what you meant by "relevant"? Relevant in this industry means the games that make the most money and, hint, they're not exclusives.


This stupid shit again. Consoles now have hardware comparable to an RTX 2070S/Ryzen 3600 which is about where the average gaming PC sits. If you want something that can't run on consoles, not only would you erase 100% of the console market but you'd also erase like 85% of the PC market. This isn't 2005. Who the fuck wants a game that runs at 1080p/30fps on a 3090? Consoles are no longer machines with incredibly unique and exotic configurations like the PS3. They're much closer to PCs. Damn near everything that can run on a 4090 can also run on a PS5. They both have modern hardware. None of that pixel shader 3.0 stuff that can't run on your GPU only supporting pixel shader 2.0.

It's not only that developers take consoles into consideration but they also take PCs into consideration and the vast majority of them don't sport a 4090 so those statements are nonsensical no matter how you look at it. Hell, most of the time, the minimum specs of modern AAA games consider the GTX 1060, an old mid-ranger from 6 years ago. Does that mean the developers don't care about consoles with twice the GPU horsepower? Or are you smart enough to figure out that they almost always take the route that'll lead to the most profitable outcomes? Alan Wake II as far as I'm aware is the first or second game that literally is designed 100% with current-gen consoles and modern PCs only and those consoles have been out for almost 3 years.
Thanks, I was about to answer something similar. The elitism of "PC gamers" over consoles, and the complaining about past times when PCs had real games only possible on PC, is tiresome and very outdated.

In the PS2 and PS3 era (when most of those games like Crysis, Half-Life 2, Doom 3, Far Cry... were launched) technology was advancing much faster than now. A PC after 2 or 3 years was lucky if it was even able to run the latest games, the hardware was much cheaper, and the consoles were much further behind technically a few years after release.

Nowadays there is no such difference between PC and consoles, even compared to the high end (not even close), and PCs have much longer life cycles. In fact, just look at how many people complained about Alan Wake 2 not performing well on 7-year-old graphics cards.
 

64bitmodels

Reverse groomer.
Oh, boy. So since Xbox no longer has exclusives, I suppose it's no longer relevant?
Not that I agree with the other guy, but Xbox is quite literally fading into irrelevance thanks to its insane lack of exclusives. That is the worst example you could've picked to try and disprove him, lol
 

FoxMcChief

Gold Member
Yeah, but when playing Callisto Protocol on the PS5, quality mode's lighting is a much bigger difference than the extra frames. So as long as devs keep doing the wrong thing and giving console gamers options, I will go with the mode that offers more visual effects.

I wish more devs went back and made games how they wanted, instead of catering to the vocal minority that “need” 60fps.

Leave options to the PC Elite.
 

Hugare

Member
Yeah, so just like Crysis on the PS360? Cyberpunk dropped to the single digits upon release and sometimes completely locked up on consoles. It was so bad that CDPR went out of their way to hide it from the press and you're telling us with a straight face that it was "designed" to run on PS4 when all CDPR did was desperately cram something into machines that clearly couldn't handle it.

The Witcher 3 also runs on the Switch.
There was a level cut from Crysis on 360/PS3 due to hardware limitations. It also had downgraded physics.

Cyberpunk on last gen consoles is feature complete. You could play it from start to finish on those consoles. Period. No matter how bad it ran, it ran. It ran like shit and looked like shit, but I played the same missions on the same city as PC players on my PS4.

The Witcher 3 also runs on the Switch, yes. It's also completely different from Cyberpunk and Alan Wake II, so I don't know why you bother mentioning it.
 

Ulysses 31

Member
Looking at those motion clarity screenshots I think we should aim a lot higher than just 120hz if we want truly amazing visual clarity. For the time being it's fine but 360hz is the endgame here
I think 240Hz with BFI, i.e. 120Hz with very high motion clarity, will look amazing too. :messenger_winking_tongue:

Once Windows/consoles get BFI options like the RetroTink 4K to mitigate the brightness loss, that is.
 

HeisenbergFX4

Gold Member
I caught that quote as well. I've been saying for a while that I think this varies from person to person, but frankly that is something I've been pointing out to these people who say there should not be any other mode than 30fps, which I think is ridiculous.
My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters

Even showing him 60 to 240 and he can't tell a difference and has maintained that stance for as long as he has seen faster displays

I even bought him a 32" 144hz monitor for his desk to play games at 120 on his Series X and he is like, yeah ok but was fine at 60

I don't get it. The difference is night and day to me.
 

FoxMcChief

Gold Member
My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters

Even showing him 60 to 240 and he can't tell a difference and has maintained that stance for as long as he has seen faster displays

I even bought him a 32" 144hz monitor for his desk to play games at 120 on his Series X and he is like, yeah ok but was fine at 60

I don't get it. The difference is night and day to me.
Your kid ain’t L33T
 

64bitmodels

Reverse groomer.
My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters

Even showing him 60 to 240 and he can't tell a difference and has maintained that stance for as long as he has seen faster displays

I even bought him a 32" 144hz monitor for his desk to play games at 120 on his Series X and he is like, yeah ok but was fine at 60

I don't get it. The difference is night and day to me.
since he doesn't care it would be really nice if i got that 32" 144hz monitor....
 

Ulysses 31

Member
My son is a big time gamer albeit just on Series X but he swears he can't tell a difference between 60 and 120 and he plays shooters

Even showing him 60 to 240 and he can't tell a difference and has maintained that stance for as long as he has seen faster displays

I even bought him a 32" 144hz monitor for his desk to play games at 120 on his Series X and he is like, yeah ok but was fine at 60

I don't get it. The difference is night and day to me.
Point out the motion clarity; something with text will show more clearly what's running at 60 and 120 fps.
 

64bitmodels

Reverse groomer.
I think 240Hz with BFI, i.e. 120Hz with very high motion clarity, will look amazing too. :messenger_winking_tongue:
I think BFI is a nice workaround, but it's most effective in 60hz games since most refresh rates capable of displaying it can't really use it with HFR.

I can see it being very useful for older retro games that ran at 60hz on a CRT, since you get the motion clarity back. Plus with the 1ms response time of many of these monitors, you'd basically have as close to a modern CRT as you can get.
 

MikeM

Member
Not really. We are having so many games with low internal resolution that sometimes goes back to PS3/360 era

The catch is that they put "4K resolution" on the box, but it's being upscaled from a sometimes sub-1080p res

Alan Wake II in Performance Mode runs at 872p. It's ridiculous. And it still runs at sub 60 FPS. Same for the Quality mode that runs at 1272p and sub 30.

So what do we do? Do we sacrifice game design in order to have a stable game at a higher frame rate and resolution? Maybe many features in the game wouldn't be possible if they had to make it work decently on that hardware at 60 FPS (higher than 1080p)

How many years would we have to wait until a Nintendo console could run TOTK at 60 FPS/4K?

It's a tough topic
Who cares what the internal resolution is? The question is, does the gaming community notice? If the image quality is good, then does it matter if it runs internally at 720p or 1440p?

What hurts more is that DLSS looks far better at lower internal resolutions than FSR. FSR will improve over time tho.
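
Just to put those internal resolution numbers in perspective, here's the share of a native 4K frame that actually gets rendered before the upscaler takes over (simple pixel-count arithmetic, 16:9 assumed; the AW2 figures are the ones quoted above):

```python
# Share of a native 4K (3840x2160) frame actually rendered at common internal
# resolutions before upscaling. Simple pixel-count arithmetic, 16:9 assumed.

NATIVE_4K = 3840 * 2160

def pixel_share(height: int) -> float:
    width = round(height * 16 / 9)
    return (width * height) / NATIVE_4K

for label, h in [("872p (AW2 performance)", 872), ("1080p", 1080),
                 ("1272p (AW2 quality)", 1272), ("1440p", 1440)]:
    print(f"{label:>24}: {pixel_share(h):.0%} of native 4K")
```

Roughly 16% and 35% of the pixels for those two AW2 modes, which is exactly why the upscaler ends up mattering more than the headline number on the box.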
 

MikeM

Member
Point out the motion clarity; something with text will show more clearly what's running at 60 and 120 fps.
Moving text is a huge thing on my OLED. Diablo 2 at 60fps looks fine in movement vertically but terrible horizontally. This is all cleaned up at 120fps on my PC version on the same TV.
 

nkarafo

Member
120hz would be the perfect spot. 60, 30 and 40 FPS options.
240hz gives you all that, plus 80 fps as well.

Though these don't matter as much when most (if not all) high refresh monitors are also VRR compatible. So you don't need those hard limits; they can sync at odd frame rates as well.
 
Ok..... and?

Who is this supposed to be a response to?

Nobody ever argues that lower framerates are better. Many, however, argue that on fixed hardware platforms like consoles, where performance is finite, 30fps can be "good enough" for many, many different types of games.

If that's not you... go play on PC.
 

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Yes, we traded size and convenience, for image quality.
I don't think that we knew at the time, exactly what we were losing.

Let's not pretend that CRTs had perfect image quality. When I replaced my CRT PC monitor for an LCD screen, I was ecstatic that the image was razor sharp compared to the old CRT. The difference was most striking in the corners of the screen. That's of lesser importance for gaming, but it was a big deal for productivity.

Another big plus was that you could get much higher resolution screens that weren't absurdly large and heavy like CRT screens. And we were rid of overscan.
 

64bitmodels

Reverse groomer.
Let's not pretend that CRTs had perfect image quality. When I replaced my CRT PC monitor for an LCD screen, I was ecstatic that the image was razor sharp compared to the old CRT. The difference was most striking in the corners of the screen. That's of lesser importance for gaming, but it was a big deal for productivity.

Another big plus was that you could get much higher resolution screens that weren't absurdly large and heavy like CRT screens. And we were rid of overscan.
CRTs being naturally fuzzy image-wise does help give the image a sort of automatic anti-aliasing, whereas LCD and OLED need that built in through software since they just display everything as it is.

Though, I'm less inclined to believe that CRTs had the image they did because of the TV as much as it was probably the component/composite signal blending all the colors and pixels together
 

Gaiff

SBI’s Resident Gaslighter
Not that I agree with the other guy, but Xbox is quite literally fading into irrelevance thanks to its insane lack of exclusives. That is the worst example you could've picked to try and disprove him, lol
This quite literally doesn't matter. It still gets almost every game and devs are even bitching about having to develop for the Series S. They can't even ship their games to PlayStation or PC if the game cannot run on the Series S. Only Larian delayed BG3 on Xbox as a result. The argument that the world revolves around consoles because the PS5 gets 1-3 exclusives a year is moronic.
There was a level cut from Crysis on 360/PS3 due to hardware limitations. It also had downgraded physics.
A whole ass DLC is missing from the PS4 and X1 version of Cyberpunk because of hardware limitations. Crysis 2 and Crysis 3 are also on consoles and came out day 1 on both platforms. While some might argue that Crysis 2 is a step back from Crysis, Crysis 3 most definitely isn't.
Cyberpunk on last gen consoles is feature complete. You could play it from start to finish on those consoles. Period. No matter how bad it ran, it ran. It ran like shit and looked like shit, but I played the same missions on the same city as PC players on my PS4.

The Witcher 3 also runs on the Switch, yes. It's also completely different from Cyberpunk and Alan Wake II, so I don't know why you bother mentioning it.
I bother mentioning it because your stance is utter nonsense. This isn't the early 2000s where literal hardware limitations prevented new games from running on older hardware. Almost every game can be downscaled enough to be run on old hardware. Rift Apart, Returnal, and TLOU Part I, all PS5-only games, can run on a GTX 960 that dates back to 2015. Same for Immortals of Aveum, Remnant II, or Dead Space Remake. Those are current-gen only games on consoles but run on comparatively last-gen PC specs.

The Witcher 3 is a perfect example because we know for a fact it was entirely designed around hardware far more powerful than the Switch, yet it runs on the Switch with all quests and DLCs. What kind of PC exclusives do you think would only run on PC when we managed to cram a PS4 game into a Switch? You'd need a game that runs at 1080p/30fps on a 4090 to completely exclude consoles from consideration, and even then, nothing would stop you from running it at 720p with FSR, getting 20-25fps, and still being able to play the game from start to finish, but again, even 4090 owners would tell you to fuck off with a game like that.
 
Last edited:

NeoIkaruGAF

Gold Member
Let's not pretend that CRTs had perfect image quality. When I replaced my CRT PC monitor for an LCD screen, I was ecstatic that the image was razor sharp compared to the old CRT.
Just for kicks, you should try connecting a modern console to a CRT display.
Even on a standard-res CRT and with a cheap HDMI->SCART converter, the image out of a Switch is sharper than what you’d get out of a PS2 or GameCube 20 years ago. Sure, the small text of many modern games is impossible to read and that’s something everyone immediately noticed at the start of Gen 7 (that alone must have done wonders to boost HDTV sales). But the graphics look good.

I’ve done it, for the record. Mario Kart looks great and indie games like Celeste look much better with some CRT smoothing to the raw pixels.
 

Clear

CliffyB's Cock Holster
Back in the day, all games looked like shit and ran like shit, and yet... here we are!

Whether it's enjoyable or not is the only metric that actually matters.
 
Honest question: is there a way for 60+ fps content to be displayed so that the viewer doesn’t perceive the infamous soap-opera effect?
"soap opera effect" only refers to interpolation, and should never be used for gaming (since it adds a buttload of lag).
interpolation is the TV adding in fake frames based on reading/analyzing previous frames.
interpolation is a setting you can turn off 100%, but it goes by many names depending on the tv (e.g. trumotion, motionflow).

the only way to enjoy >60fps content is on a display that supports >60hz natively.
for example, the GDM-FW900 could do 80hz at its max res (if I remember correctly).
most (all?) OLEDs can do 120hz, some go higher.
plenty of LCDs go above 60hz.
playing games with interpolation turned off will have no soap opera effect.

"sample and hold" is what reduces motion clarity. LCDs and OLEDs use this, CRTs and plasmas do not.
if you want better motion clarity, either use a very high Hz LCD/OLED (with a framerate to match)... or just use a CRT/plasma.
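
rough rule of thumb for how bad sample-and-hold blur gets (ballpark arithmetic only, nothing official): the smear is roughly how far the object moves during the time each frame stays lit, which is why higher refresh rates and BFI/strobing both help:

```python
# Approximate sample-and-hold smear: distance the eye tracks while a frame stays lit.
# Ballpark arithmetic only; BFI/strobing shortens the lit time instead of the frame time.

def smear_px(speed_px_per_s: float, refresh_hz: float, lit_fraction: float = 1.0) -> float:
    """Blur width in pixels for one frame of persistence."""
    frame_time_s = 1.0 / refresh_hz
    return speed_px_per_s * frame_time_s * lit_fraction

speed = 1920  # a fast pan: one full 1080p screen width per second
for hz in (60, 120, 240):
    plain = smear_px(speed, hz)                  # plain sample-and-hold
    bfi = smear_px(speed, hz, lit_fraction=0.5)  # e.g. black frame half the time
    print(f"{hz:>3} hz: ~{plain:.0f} px smear, ~{bfi:.0f} px with 50% BFI")
```

that's why 60hz sample-and-hold looks so smeary next to a CRT, and why BFI claws some of it back at the cost of brightness.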
 

rodrigolfp

Haptic Gamepads 4 Life
BOTW/TOTK, The Last of Us, Ocarina of Time, Mario 64, all of the GTA games, RDR, RDR 2, Bioshock, Mass Effect and etc.
Some GTA games, Bioshock and Mass Effect were 60+fps day 1. And the others could be 60fps (TOTK was lol) on PC day 1 if not for potato consoles.
 
Last edited:

Hugare

Member
Some GTA games, Bioshock and Mass Effect were 60+fps day 1. And the others could be 60fps (TOTK was lol) on PC day 1 if not for potato consoles.
If you consider PC, then sure, every game is 60 FPS on day one.

But they were all designed to run at 30 FPS on consoles, and that's my point.

You can play TLOU Part I on PC at 120 fps if you want to. Doesnt change the fact that the game is linear af because they had to make the game run at 30-ish fps on the PS3 originally.
 