Elysion
For example, let's say Sony and MS required that all new games on PS5/SeriesX have to include an optional (internal) 8k mode that's not accessible to players, and can only be unlocked by Sony/MS at the system level. Obviously, if you tried to run most games at 8k on current consoles they'd probably run at 6fps or something. But if, a couple years in the future, you put the same game in a next-gen console like a PS6, the system would recognize the game's internal 8k mode and would automatically display the game at that resolution. That's just an example of course, since we don't know if 8k will be viable for most consumers by the time next-gen starts. But even if it isn't, it surely will be for the generation after next.
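The idea above is basically a list of render modes where some entries are hidden behind a platform-level switch. Here's a minimal sketch of that, assuming a hypothetical `available_modes` helper and a made-up mode list (none of this is a real console API):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RenderMode:
    width: int
    height: int
    fps_cap: int
    system_locked: bool = False  # hidden until the platform unlocks it

# Hypothetical mode list a game might ship with: the 8k entry is inert
# on current hardware and only ever exposed by the system software.
MODES = [
    RenderMode(3840, 2160, 30),
    RenderMode(7680, 4320, 30, system_locked=True),
]

def available_modes(modes, platform_unlocks_locked_modes: bool):
    """Return the modes the player can actually select.

    `platform_unlocks_locked_modes` stands in for the system-level
    switch the post imagines Sony/MS flipping on next-gen hardware.
    """
    return [m for m in modes
            if not m.system_locked or platform_unlocks_locked_modes]
```

On current hardware only the 4k mode would show up; a future console that flips the switch would see both, with no patch from the developer.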
And the same could theoretically be done for framerates – 60fps or 120fps (or even 240fps) modes @4k won't be possible for all games on current gen of course, but they might be on next gen. All developers would have to do is include a mode where the framerate is capped at 60 or 120 instead of 30 or 60 (even if the game never actually reaches those higher numbers on current consoles). That way, when the next generation of consoles comes around, we wouldn't have to wait for developers to release next-gen patches, but could immediately play our older games at higher resolutions and framerates. It wouldn't be perfect of course, since only resolution and/or framerate would be raised, while things like lighting or textures remain unchanged, but it would certainly be better than not having any improvements at all. And developers who want improvements beyond just resolution and fps can still release next-gen patches if they want.
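The key point here is that a cap is only a ceiling: a game that caps itself at 120 instead of 30 loses nothing on current hardware but automatically benefits on faster hardware. A toy sketch of that logic (function name and numbers are illustrative, not any real engine API):

```python
def effective_fps(game_fps_cap: int, hardware_fps: int) -> int:
    # The game only enforces a ceiling; the actual framerate is
    # whatever the hardware manages underneath that ceiling.
    return min(game_fps_cap, hardware_fps)

# Hypothetical numbers: a current console that can push ~33fps in a
# heavy scene, and a next-gen console that can push ~90fps.
# A 30fps-capped game stays at 30 forever, even on the new box.
# The same game shipped with a 120fps cap runs the same today,
# but immediately runs at 90 on the next-gen console.
```

So shipping the higher cap costs nothing today and removes the need to wait for a next-gen patch later.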
Would something like this be possible for consoles? I know that, unlike PCs, console games have always been developed 'closer to the metal' so to speak, so there's probably never going to be the same kind of flexibility and forward-compatibility as on PC. But, starting with PS4/XBox One, consoles have not only adopted very PC-like architectures, but have also carried this architecture over into the current gen, and will likely continue to do the same with future generations. In that case, why not force developers to future-proof their games by including 8k and 60/120fps modes from day one, even if they're not possible on current hardware?