
Starfield is 4K/30 on Series X, 1440/30 on Series S

oji-san

Banned
Disappointing no doubt. It was expected, but I still had hope for some kind of performance mode. Oh well, I guess I should get used to 30fps again... I've barely played any 30fps games in the last 5 years (2 on PC, then 3 years on PS5/XSX), and it was amazing. I used to have no issues with 30fps, but now, maybe because of how my LG TV handles motion, 30fps looks awful to me... like every time I pan the camera everything has ghosting or something like that. Damn shame.
 

DeepEnigma

Gold Member
Yes, they are. Some of the best games ever made ran at 30fps. That's not a limiting factor. Your eyes and fingers adjust and adapt to it. 2 hours in and you don't even notice if the gameplay feels great.

The best shooter of all time ran at 30 fps for many, many years. Destiny.
They are not wrong.

And "the best shooter of all time" got 100% better when they made it 60fps on the new consoles and PC.
 

Otre

Banned
Yes, they are. Some of the best games ever made ran at 30fps. That's not a limiting factor. Your eyes and fingers adjust and adapt to it. 2 hours in and you don't even notice if the gameplay feels great.

The best shooter of all time ran at 30 fps for many, many years. Destiny.
And suddenly, when it was 60+ in D2, it became greater. Every game gets an upgrade when played at a higher framerate. It's a fact. Response time and fluidity matter in a videogame. You are playing a gimped version at 30, preferring a slightly sharper object on screen.
 

mrqs

Member
And suddenly, when it was 60+ in D2, it became greater. Every game gets an upgrade when played at a higher framerate. It's a fact. Response time and fluidity matter in a videogame. You are playing a gimped version at 30, preferring a slightly sharper object on screen.
Sure, it "gets better". But that doesn't mean that 30fps is bad. It isn't.
 
Not playing a game because it's 30FPS is mind-boggling to me. If a game looks fun, I play it; it doesn't have to meet some resolution or framerate benchmark for me to play it.
This is quite the shallow take. For over a decade, PC users have been touting the benefits of 60fps since the PS3 era. The quick response and more fluid gameplay experience dramatically change how you experience the game.

Last gen introduced performance modes because of high demand. Then, for the first time since those RAM cartridge expansions of the Sega/PSX era, there's the Pro editions of those consoles, to get even closer to the 60fps "standard".

To imply 30fps is just a preference shows how out of touch you are with the changing demand. It's fine if you like playing that way, but to be entirely dismissive of what is objectively a better in-game experience is what's really mind-boggling.
 
Sure, it "gets better". But that doesn't mean that 30fps is bad. It isn't.

Where have you been for the past 5 years? Anytime any game was 30fps it got shat on. Let's not try to walk back all of those years of sentiment because Redfall and Starfield are 30fps.
 

sankt-Antonio

:^)--?-<
What do people mean when they say Starfield looks great for its scale? Isn't "scale" just data on the HDD that gets loaded when needed, with the game filling RAM the same as any other game, the bottleneck being what's rendered on the spot?
So in theory Horizon could look as good as it does on ten times the map size across multiple different planets; all that would be needed is more stored data (textures, geometry, etc.).
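The reasoning here (world size scales storage, while per-frame cost depends only on what's streamed in and rendered at once) can be put as a toy model. Every number below is a made-up illustration, not actual Starfield data:

```python
# Toy model (all numbers hypothetical): total world size grows the disk
# footprint, but only the streamed-in working set competes for RAM and
# per-frame render time.

DISK_GB_PER_PLANET = 15   # stored assets per planet (made-up figure)
RAM_BUDGET_GB = 10        # memory available for streamed assets (made-up)
WORKING_SET_GB = 6        # assets needed around the player right now (made-up)

def disk_footprint_gb(num_planets: int) -> int:
    """Stored data scales linearly with world size."""
    return num_planets * DISK_GB_PER_PLANET

def fits_in_ram() -> bool:
    """Only the current working set must fit in RAM, no matter how
    many planets sit on disk."""
    return WORKING_SET_GB <= RAM_BUDGET_GB

print(disk_footprint_gb(1), fits_in_ram())    # 15 True
print(disk_footprint_gb(10), fits_in_ram())   # 150 True -- 10x the world,
                                              # same per-frame working set
```

Under this (simplified) model, the poster's point holds: more planets mostly cost storage, not per-frame rendering.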

Imo Starfield looks rather meh.


Edit:
Regarding 30fps, I think in the last couple of years a lot of gamers got OLED TVs, and 30fps on them (without a massive amount of in-game motion blur) looks and feels terrible. 30fps on a CRT is a whole other ballgame and much better than 30fps on an OLED.
 

Ywap

Member
Ps5 owners are experiencing their God of Wars and Horizons at 60fps while the Series X/S crowd are enjoying the Redfalls and Starfields at 30. I wonder if the sense of quality is comparable?
 

Stuart360

Member
I get why some of you are hammering this 30fps point as hard as you can, but honestly I think this is just going to happen more and more as the gen goes on, and I still feel that once UE5 games start dropping in droves, a large amount of them will be 30fps.
 

Fredrik

Member
No surprise. The hints were there in an IGN interview about a year ago; Todd didn't specifically talk about Starfield there, but said he thought 30fps was okay for their games and that he preferred the fidelity and being able to interact with every object you can pick up.

Console gamers: as with Redfall, it's great that 30fps is finally being criticized. But for those getting depressed here, remember that it won't be any more unplayable than the launch version of any other Bethesda RPG, or Halo 1-4, or Gears of War 1-4, or Uncharted 1-4, or Mass Effect, or all the 3D Zeldas, or Spidey, GOW, TLOUp2, FF7R, etc.

And I’ve been enjoying TOTK since launch at unstable 30fps. If I wanted I could handle Starfield just fine at 30fps.

But this is 100% a PC game for me, nice visuals and performance is cool and all but the real bonus is going to be proper mods support, I just can’t see myself skipping that. So Steam version it is! 👍
 

SkylineRKR

Member
The S being 1440p at least actually sounds good (I wonder if it's native or dynamic?). I prefer 60, but I did play the FFXVI demo at 30fps, mostly because of the fidelity. The combat still felt good.
 

LooseLips

Member
Lots of replies wondering why they can’t just lower to 1080p or drop graphical settings to achieve 60fps. It is probably not that simple due to the scope of the game.

There appear to be a lot of complex systems and processes running in this game at once. This isn’t just a graphics thing at this point, all of these separate systems cost resources to implement and something has to give to allow for that.

Phil saying that it’s a creative decision is probably not far from the truth. Creatively, they want to make something big and ambitious and so they had to decide between narrowing the scope or making the game they want but lower the framerate to get them there.

I’d like to know what the system requirements are for PC at this point. I think even my modest build (12700K, 3080) is going to be pushed quite hard for 1440p60.
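The "something has to give" point is essentially a frame-time budget: a frame is only done when both the CPU simulation and the GPU render have finished, and lowering resolution only shrinks the GPU share. A toy sketch with hypothetical numbers (not measured Starfield figures):

```python
# Toy frame-time model (all numbers hypothetical): a frame is ready only
# when BOTH the CPU simulation and the GPU render have finished.

CPU_MS = 28.0          # simulation/AI/physics cost per frame (assumption)
GPU_MS_AT_4K = 30.0    # render cost at 4K (assumption)

def frame_ms(resolution_scale: float) -> float:
    """GPU cost roughly scales with pixel count; CPU cost does not."""
    gpu_ms = GPU_MS_AT_4K * resolution_scale
    return max(CPU_MS, gpu_ms)

def fps(resolution_scale: float) -> float:
    return 1000.0 / frame_ms(resolution_scale)

print(f"4K:    {fps(1.0):.1f} fps")   # ~33 fps, GPU-bound
print(f"1080p: {fps(0.25):.1f} fps")  # ~36 fps, now CPU-bound -- dropping
                                      # resolution alone can't reach 60
```

Once the CPU term dominates, no resolution cut gets the frame under 16.7ms, which is the usual reason "just run it at 1080p" doesn't produce a 60fps mode.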
A rare sighting of sensible discourse...
 

LiquidMetal14

hide your water-based mammals
It's probably a bit more concerning on the PC side, even though the game looks good. Knowing Bethesda's eroded reputation and history of bugs doesn't bode well for me, but I really hope they can at least release a polished game, even if it disappoints.

One of those cases where maybe they bit off more than they can chew, but I'm also not going to criticize if the game is optimized and what's happening on screen is worth the sacrifices. It's still really disappointing in the end, and it doesn't change my purchase decision, but let's not pretend you don't want a 60 FPS mode on the most powerful console out there.
 

MarkMe2525

Member
This is the whole tweet though


Exactly, and maybe I wasn't clear enough. He is clearly referring to the console's output signal. He even refers to it directly when he mentions "standard output" and supporting up to 120Hz. No one talks about device outputs when referring to a game's internal fps.

When I said it was not wise to be talking about the output signal, I meant it was not wise in relation to the question asked. Frankly it doesn't answer the question and he is just deflecting.

This is similar to when a question was asked of an MS exec (I believe) about the resolution of games on XB1; the exec responded that all games would eventually be output at 1080p to the television. In this case, just like Greenberg's response, he didn't actually answer the question but pointed to the output signal of the Xbox.

Edit: Greenberg being an idiot does not equate to him making some sort of promise that Series X games would all be at least 60fps. He is speaking to something completely different.
 

Nydius

Member
All the arguing about which is better feels like a distraction. Some people are fine with 30, others aren't -- it's all a matter of personal preference.

But where I think most of us are (rightfully) pissed -- and I know this is at least the case for me -- is in how we were sold both of these consoles promising major generational leaps. Higher frame rates, VRR, native 4K, up to 120FPS, possible 8K output. Now, yes, some of that we knew was marketing bullshit (especially the 8K stuff), but it wasn't unrealistic to expect that we'd get more options than we had in the past. Options that allowed us to prioritize frame rate over graphics if we chose. This was also part of the marketing campaign for both the XSX and PS5 -- pointing to the options we got in last generation's mid-gen refresh consoles saying "this is the future".

Many of us bought new equipment to take advantage of these features. But now we're sliding back to locked 30fps games where most of these features are meaningless. VRR, for example, is functionally useless at a locked 30fps. If we're just sliding back to the same standards as last gen with zero user preferences available to us, then yeah... I think being pissed off is justified. Especially when we all know they're probably going to try to make new mid-gen consoles with the same "improved performance" promises they gave us for these consoles.
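The VRR point can be made concrete: a locked 30fps sits below a typical variable-refresh window. The 48-120Hz range and the frame-doubling (low framerate compensation) behavior sketched here are common but assumed, and vary by display:

```python
# Sketch of why locked 30fps gains little from VRR. Many panels' VRR
# window is roughly 48-120 Hz (assumption; real ranges vary by display).

VRR_MIN_HZ = 48
VRR_MAX_HZ = 120

def effective_refresh(fps: float) -> float:
    """Below the VRR floor, displays with low framerate compensation
    repeat each frame until the rate re-enters the window."""
    hz = fps
    while hz < VRR_MIN_HZ:
        hz *= 2            # frame doubling
    return min(hz, VRR_MAX_HZ)

print(effective_refresh(30))   # 60.0 -- each frame shown twice; still only
                               # 30 new frames/sec, so no smoothness gain
print(effective_refresh(55))   # 55.0 -- VRR actually varying, tear-free
```

In other words, at a locked 30 the display just doubles frames at a fixed cadence; VRR only earns its keep when the framerate fluctuates inside the window.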
 
This is quite the shallow take. For over a decade, PC users have been touting the benefits of 60fps since the PS3 era. The quick response and more fluid gameplay experience dramatically change how you experience the game.

Last gen introduced performance modes because of high demand. Then, for the first time since those RAM cartridge expansions of the Sega/PSX era, there's the Pro editions of those consoles, to get even closer to the 60fps "standard".

To imply 30fps is just a preference shows how out of touch you are with the changing demand. It's fine if you like playing that way, but to be entirely dismissive of what is objectively a better in-game experience is what's really mind-boggling.


I'd rather play a good game at 30FPS than a mediocre one at 60FPS. Nobody's disputing that 60FPS is better than 30FPS, but avoiding a game because it's 30FPS is completely idiotic. I play video games, not framerates or resolutions. Being mad that a game won't meet your framerate expectations is fine; making it your primary purchasing decision is bizarre. Also, if I'm out of touch with the "changing demand", why are so many devs staying at 30 and pushing graphics, world size, and things like AI/physics instead of focusing on framerate? Whenever I have the choice I pick performance mode over quality mode, so this isn't some spiel about framerate not mattering; it just doesn't matter enough to avoid playing a game that looks fun.
 

mrqs

Member
Where have you been for the past 5 years? Anytime any game was 30fps it got shat on. Let's not try to walk back all of those years of sentiment because Redfall and Starfield are 30fps.
I just think people are a bit insane for expecting next-gen-only games to have a 60fps mode. It was easy when the PS4/Xbox One were the base hardware and those games were running on next-gen consoles.

It won't work like that anymore. Most next-gen-only games will run at 30fps.
 

Ribi

Member
Yes, they are. Some of the best games ever made ran at 30fps. That's not a limiting factor. Your eyes and fingers adjust and adapt to it. 2 hours in and you don't even notice if the gameplay feels great.

The best shooter of all time ran at 30 fps for many, many years. Destiny.
But what if those GREATEST GAMES EVER ran at 60fps?
 
Can you imagine if they had given us 120fps? Even though I know it's not, I'd still be touting it as the perfect sci-fi game.

A man can dream
 

Kataploom

Gold Member
.... is anyone surprised? I mean, look at that game.

Honestly, they better add a 1440p/900p performance mode for Series X/S respectively.
It must be pushing the CPU; in that case there's no way to reduce resolution to gain FPS.

That's what people actually wanted when they asked for a "next gen game", isn't it? Games that couldn't be done on the previous gen just by downgrading graphics features or resolution.
 

SmokSmog

Member
But, but Starfield runs at 30FPS on the Xbox!

FF16 runs at 30-40FPS on PS5
The Avatar game will be 30FPS
Star Wars Outlaws will be 30FPS
Every "next gen" game will be 30FPS on both consoles.

But, but I need to make fun of Bethesda!
