
Xbox's Impressive New FPS Boost - Kinda Funny Xcast Interview with Jason Ronald

AnotherOne

Member
 

rofif

Can’t Git Gud
You're wrong. Game studios shouldn't be forcing 30fps on consoles. On PC I can change settings like frame rate. It should be the same on consoles.
I like that a developer can decide they want to make Uncharted 4 and push and optimize around 30fps and max graphics.

30fps can be good. But in the new Demon's Souls it's terrible. It's terrible compared to Bloodborne's 30fps... So where is my choice? I am forced to play at 1440p/60fps. The choice is fake in this case.
 

DaGwaphics

Member
They need to make an FPS de-boost for the people who kept claiming 30fps is more cinematic (until they were certain their box of choice could actually run something at 60fps this gen).

In many ways 24fps is more cinematic on an emotional level to the viewer. This is why 60/120fps films often don't hit with a lot of people.

If you took a very powerful system and used that power to create very accurate frames (in comparison to motion-picture video), addressing the motion blur, lighting, and pacing differences that come with that, most would probably view it in a more favorable light. I'm speaking of cutscenes and not gameplay. It's just how the human mind is conditioned to work, given how much low-fps video we're all exposed to.

Here's a quick (non-scientific) article I found kind of referencing this phenomenon in regards to film: https://www.vulture.com/2019/10/how-high-frame-rate-technology-killed-ang-lees-gemini-man.html
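To make the "accurate frames" idea a bit more concrete, here's a rough sketch of sub-frame accumulation, the brute-force way to get film-camera-style motion blur. This is purely illustrative (my own toy example, with a stand-in render_subframe function), not how any shipping engine actually does it:

```python
# Toy sketch: emulate a 24fps film camera with a 180-degree shutter by
# rendering many sub-frames during the "exposure" window and averaging them.
# render_subframe() is a stand-in for a real renderer (assumption, not a real API).

import numpy as np

FILM_FPS = 24      # output "cinematic" frame rate
SHUTTER = 0.5      # 180-degree shutter: exposure covers half the frame interval
SUBFRAMES = 16     # sub-samples per output frame (more = smoother blur)

def render_subframe(t, shape=(1080, 1920, 3)):
    # Placeholder: a real engine would render the scene at time t here.
    return np.zeros(shape, dtype=np.float32)

def render_film_frame(frame_index):
    frame_start = frame_index / FILM_FPS
    exposure = SHUTTER / FILM_FPS          # how long the virtual shutter stays open
    accum = None
    for i in range(SUBFRAMES):
        t = frame_start + exposure * (i / SUBFRAMES)
        sub = render_subframe(t)
        accum = sub if accum is None else accum + sub
    return accum / SUBFRAMES               # the average is the blurred frame

blurred = render_film_frame(0)
```

The point is just that the blur has to be paid for: every output frame costs many renders, which is exactly the kind of spend you'd only consider for cutscenes.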
 
Haha so funny. Can I laugh now?
Nothing wrong with pushing graphical power at the cost of 30fps. I prefer better graphics.

Nothing wrong for you and the games you gravitate towards perhaps.

I think 30fps shouldn't even exist in a medium where you are directly controlling what is happening on the screen. I realize it will likely be a staple for a long time, as there will always be low-end console hardware and aging platforms; I have a Switch, so obviously I play a lot of games at 30fps, but I don't have to pretend to like it. Some games are "okay" to play at 30fps, but if I revisit those same experiences later with better hardware at 60fps, the experience is always much better from a gameplay perspective, which to me matters much more than graphical fidelity that often isn't really noticeable in motion during gameplay.

I guess it depends on what games you're playing though. To me, limiting a game to 30fps just so that you can have better cutscenes and a better photo mode is a failure.


That's just film and cutscenes though. I don't control the character's actions in a film and I tend to avoid games with a lot of cutscenes interrupting the game. Personal preferences I suppose.

In movies, however, directors work with a known output frame rate and make the movie with that in mind. When the shitty Hobbit movies came out I watched the first one at 48fps. The sweeping outdoor camera shots were incredible at 48fps, but the rest of the movie felt off, almost like someone was filming it with a phone or something. I watched the sequels in regular theatres. The wide sweeping outdoor shots looked awful at 24fps, but the rest of the films felt better to watch even though technically they weren't. It's all in my head of course; absolutely everything about the 48fps version looked better: less motion blur, smoother transitions, less detail lost in the steps between frames that my eyes have to track. I'm used to 24fps in film, and unwittingly my brain expects 24fps when I watch a movie. 48fps or 60fps in film is something I will have to get used to before my brain stops signalling that something is weird.
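To put a rough number on the "steps between frames" thing (my own made-up example, nothing scientific): imagine something panning across a 1920-pixel-wide screen in two seconds.

```python
# Back-of-the-envelope: how far does an object jump between frames at
# different frame rates? Assumes a pan across a 1920px-wide screen in
# 2 seconds (made-up numbers, just to show the scale of the "steps").

SCREEN_WIDTH_PX = 1920
PAN_TIME_S = 2.0
speed_px_per_s = SCREEN_WIDTH_PX / PAN_TIME_S   # 960 px/s

for fps in (24, 30, 48, 60, 120):
    step_px = speed_px_per_s / fps              # displacement between frames
    print(f"{fps:>3} fps: {step_px:5.1f} px jump per frame")

# 24 fps -> 40 px jumps, 48 fps -> 20 px, 60 fps -> 16 px, 120 fps -> 8 px.
# Those jumps are what the eye has to bridge; film hides them with motion
# blur, games mostly just leave them there.
```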

Games are different though. I played most games at 60Hz in the 8-bit and 16-bit days. When polygons hit the scene I had to get used to 30fps (or lower), with the odd 60fps game like F-Zero or Einhander looking incredibly fluid, and in the GameCube days 60fps often didn't come at the expense of the visuals. Games like the Star Wars titles and Metroid Prime were amazing and still ran at a locked 60fps. Racing games, which are my mainstay, absolutely must run at 60fps+. Through the next few gens I gravitated to games with higher framerates, and since getting into PC gaming I've revisited games that I'd originally played at 30fps and found them much more enjoyable at higher refresh rates.
 

DaGwaphics

Member

It's an interesting thing. I wonder if anyone will ever experiment with having a normal gaming fps for the game itself but using 24fps for cutscenes. It couldn't just look like choppy animation, obviously; the final frames would need to mimic the properties of traditional film. Would players feel more connected to the characters? Is it even possible in real time? Etc.
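One practical snag with the 24fps-cutscene idea, assuming a fixed 60Hz display with no VRR: 24 doesn't divide evenly into 60, so frames can't all be held on screen for the same amount of time. A tiny sketch of the resulting cadence:

```python
# Sketch: presenting 24fps content on a 60Hz display without VRR.
# Each source frame has to be held for a whole number of 60Hz refreshes,
# so the only way to average out to 24fps is an uneven 3:2-style cadence.

DISPLAY_HZ = 60
CONTENT_FPS = 24

cadence = []
error = 0.0
for _ in range(CONTENT_FPS):                   # one second of content
    ideal = DISPLAY_HZ / CONTENT_FPS + error   # 2.5 refreshes per frame, ideally
    held = round(ideal)                        # but only whole refreshes are possible
    error = ideal - held                       # carry the remainder forward
    cadence.append(held)

print(cadence)       # [2, 3, 2, 3, ...] -> frames alternate between ~33ms and 50ms on screen
print(sum(cadence))  # 60 refreshes = exactly one second
```

That uneven hold time is the judder you see on slow pans when watching films on a 60Hz TV. A game doing 24fps cutscenes would need a VRR display, a 120Hz output mode (24 divides evenly into 120), or motion blur heavy enough to hide it.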
 

TLZ

Banned
I really like it. I'm playing Sniper Elite now and it feels great. Does it cost the GPU or CPU anything, or is it free? I wonder how it works. I also wish they'd do something similar where the resolution automatically gets a boost.

Very cool tech stuff.
 
I agree with the KF criticism, but at least the X-Cast isn't the Greg Miller crew. SnowbikeMike and Gary are pretty cool, and Parris seems like a nice guy too.
 

jhjfss

Member
I wonder if it has any performance impact? If not, they could push visuals in next-gen games until they're only capable of running at 30fps, then use this "cheat" to make it 60fps?
 

DaGwaphics

Member

I doubt it's free. The Series consoles just have more grunt available than their predecessors, and they are using that. There is no cheat for displaying more frames, short of just sending the same frames twice, but there's not much point in that.
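The rough frame-time math behind that (nothing official, just budget arithmetic):

```python
# Frame-time budgets: doubling the frame rate halves the time available to
# produce each frame, so whatever was the bottleneck (CPU or GPU) needs
# roughly twice the throughput. That headroom has to come from somewhere.

for fps in (30, 60, 120):
    budget_ms = 1000 / fps
    print(f"{fps:>3} fps -> {budget_ms:5.2f} ms per frame")

#  30 fps -> 33.33 ms per frame
#  60 fps -> 16.67 ms per frame
# 120 fps ->  8.33 ms per frame
# FPS Boost can afford this on last-gen games because the Series hardware has
# the spare grunt; a game already saturating the new consoles wouldn't.
```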
 