
What if last gen hardware does not hold back games?

Demon's Souls, Ratchet and Returnal all run at 60 and are next-gen games. I don't see why it should be an issue to offer a toggle between 30 and 60 for upcoming games exclusive to PS5 and Series X.

All of them launch games.

Developers have not yet tapped the hardware potential of these next gen consoles. Once that starts to happen the sacrifices start creeping in to maintain acceptable framerates.
 
Demon's Souls, Ratchet and Returnal all run at 60 and are next-gen games. I don't see why it should be an issue to offer a toggle between 30 and 60 for upcoming games exclusive to PS5 and Series X.
They might be next gen only, but they're still early titles, and therefore based on cross-gen tech. Returnal uses Unreal Engine 4, a "last gen" engine (although, Nanite aside, UE5 is basically rebranded UE4, which couldn't be said of UE2 to UE3, nor of UE3 to UE4). Ratchet seems to be an exclusive above all because it needs fast asset streaming here and there, but it's built on the same engine as Spider-Man: Miles Morales. Demon's Souls is technically an enhanced port; the core game is lean, so they seem to have a big overhead to spend on graphics, but it's obviously not pushing the envelope on anything the PS4 couldn't do on the CPU front.

60 fps usually becomes harder to hit the older the hardware is; games get more complex not only because older hardware gets dropped but also because developers have more time to go deep into the features the hardware offers.

Some generation we'll be able to keep 60 fps, as 30 fps will be diminishing returns, but I'm not sure this is the one (40 fps might be doable as an option for 120 Hz/VRR displays). There is hope though. Good CPU power this time around will help, and frame reconstruction techniques might improve throughout the generation to allow it. We'd be running games internally at resolutions as low as 1080p though, much to the chagrin of some people (most likely 1440p, actually, because Series S is in the picture and has to go lower than the big boys).
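To put rough numbers on those targets, here's a toy Python sketch; the figures are just the standard frame-time and pixel-count arithmetic, not measurements from any game:

```python
# Toy frame-budget arithmetic: the time a renderer has per frame at each
# target framerate, and how pixel counts compare between common internal
# resolutions.

def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render one frame at the given framerate."""
    return 1000.0 / fps

def pixel_count(width: int, height: int) -> int:
    return width * height

for fps in (30, 40, 60, 120):
    print(f"{fps:>3} fps -> {frame_budget_ms(fps):5.2f} ms per frame")

# Rendering internally at 1080p instead of native 4K cuts pixel work 4x,
# which is why reconstruction techniques make 60 fps far more attainable.
ratio = pixel_count(3840, 2160) / pixel_count(1920, 1080)
print(f"4K has {ratio:.0f}x the pixels of 1080p")
```

The 40 fps option only exists because 25 ms divides evenly into a 120 Hz refresh, which is why it needs a 120 Hz display.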
Even if old hardware didn't hold a game back conceptually, the dev still has to spend time optimizing/producing the game for the old hardware.
I think for the most part that's what they're doing.

Scaling back usually means taking assets like textures and batch converting them to some other resolution, lowering level of detail and such.
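As an illustration of what that batch conversion boils down to in the simplest case, here's a toy pure-Python 2x box-filter downscale. Real asset pipelines use dedicated tools and full mip chains; this is only a sketch of the idea:

```python
# Naive 2x box-filter downscale of a grayscale "texture" (a 2D list of
# 0-255 values) -- the simplest form of converting an asset to a lower
# resolution for weaker hardware.

def downscale_2x(texture):
    h, w = len(texture), len(texture[0])
    assert h % 2 == 0 and w % 2 == 0, "toy version needs even dimensions"
    out = []
    for y in range(0, h, 2):
        row = []
        for x in range(0, w, 2):
            # Average each 2x2 block into one output texel.
            block = (texture[y][x] + texture[y][x + 1] +
                     texture[y + 1][x] + texture[y + 1][x + 1])
            row.append(block // 4)
        out.append(row)
    return out

tex = [[0, 0, 255, 255],
       [0, 0, 255, 255],
       [64, 64, 128, 128],
       [64, 64, 128, 128]]
small = downscale_2x(tex)
print(small)  # [[0, 255], [64, 128]]
```

The point is that this kind of scaling is mechanical and batchable, which is why asset downscaling is the cheap part of a cross-gen port.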

You're right that it takes resources, but so does supporting more platforms. We've never supported as many platforms as we do now, but we wouldn't be doing that if it weren't more doable than in the past. Take Saturn and PSone, or Xbox vs PS2: those were completely different systems and you had to adapt big parts of the code for things to move along. Now every console is x86 (ignoring Switch for a moment) and running an AMD GPU. They're all similar enough. Even a PC with an Nvidia GPU is still similar enough.
 

Crayon

Member
The only reason you're getting 60 and 120fps is because of cross gen games.

You're not going to get anywhere near that on next gen exclusive games unless developers make massive sacrifices.

We might still be getting 60 if ...(who exactly?)... continues to demand wasteful 4k fidelity modes.

Actually wait... we'll continue to get 60fps for the low low price of $600 for pro consoles.
 

BlackTron

Member
Potato is Switch in my book. I don't think PS4 is a potato, yet.

Without getting into too sincere a conversation about the definition of potato, the difference between the PS4/XBOne and PS5/Series CPUs is utterly massive. To the extent that, in comparison, it may as well be a potato, but I'm not really worried about what we label it. The important point is that we have a new chip that you can't really fully embrace, because anything you do must also be possible on the old chip.

Most of the time things simply work, with caveats. Just like PC games on entry-level GPUs these days: below spec, still runs.

For as long as there's a sizeable userbase there and stuff still more or less runs, it's bound to happen. Perhaps they're making those decisions because it's still doable, rather than making the game with PS4 as the lead platform. I don't think that's happening; notice how they announce PS4 versions towards the end of development. I think it's because they develop the games for next gen and then, closer to launch, look at what they have and see whether it's possible to scale down or not.

Otherwise, why wait? They could always cancel the PS4 SKU too.

Apart from UE5's Nanite, which doesn't have a single commercial game out yet, it's normal for them to scale things up from what they currently have. The paradigm didn't change, after all; the architectures and what the hardware can do are still similar enough.

I think of this generation's GPU as a mixing desk with a lot more channels, but last generation was already a respectable mixing desk. These days we have infinite channels when producing music, but the biggest jump was when we hit 24 tracks.

Having more is nice, but there isn't a sound you can only achieve because you have 48 tracks instead of 24.

While it's easy to blame that on the Jaguar cores, I'm not sure it's due to the lack of power alone.

Jaguar cores were more powerful in general purpose than what came before; all the processing areas where we're expecting to see a jump now could have had a jump last gen as well, but that didn't happen, because developers didn't focus on it for a second. It was frustrating for me, personally.

Jaguar was nice because it made everything x86, but in raw power it was not that much better than Cell; in fact, Cell could even edge it out in some benchmark tests. Of course, actually coding for Cell for real-world performance is different from a benchmark test, but the takeaway is that Jaguar was never that hot even in 2013! Now tell me it doesn't matter when we are still making games for a CPU that even PS3's processor can beat in some benchmarks... sad.

It's not just about having the processing power; I feel we saw developers trying advanced semi-complex AI, destructible environments, larger numbers of NPCs and so on a lot more in the PS3/X360 generation, when it was borderline insane to attempt so much, than they did on PS4/Xbox One.

And I think that's because it was a novel possibility back then, and the truth is that despite having the power to do it, they also realized these things are resource intensive and can't be automated, so they need people. You need to increase the number of people doing AI so they can look at the task and start fresh instead of recycling whatever they used for the previous game; if they don't have the time and resources, they won't. Destructible environments all the same: we're seeing less and less of them because the broken assets have to be created.

If anything, though, it's Sony and Microsoft that should be pushing that envelope, no doubt. But they need to put resources into that specifically, otherwise we'll mostly just get better graphics and no loading screens.

You're right! Back then, studios were more likely to push the envelope. Today they've largely settled on what's good enough for everyone, because we've accepted it. However, there's always been a certain baseline in gaming of what's normal, plus the few that push the envelope to elevate things. We're giving up on that before the conversation even starts by including Jaguar in the picture.

You could take an N64 game, add better textures, crank the res, maybe even boost the geometry, and run it on GC. It would work! And it really would look a LOT better. Or you could ditch the N64 game entirely and make a new one on GCN from scratch that runs circles around the other one but could never be ported to the 64. We don't even need to drag destructible environments and AI into this; it impacts development in such a fundamental way that its effect on what you get can't be overstated.
 

BlackTron

Member
Some games are not held back running on a PS4, but the moment you stream a lot of assets and design with an SSD in mind, they will probably be limited by the amount of data they can pull from disk every second.

Still, it isn't an issue if they treat PS5 and Xbox Series S/X as lead platforms and then backport to PS4/Xbox One, similar to how they did Doom Eternal, working alongside a separate team to get it on Switch.

With some games you don't even need that; you'll simply have loading screens where you either had fast loading or no loading at all before.

The Doom Switch version was made for a different architecture, on a weak system, and the decision to put it on Switch came quite a bit after the game's release. So subcontracting a team made sense there.

These days, no one with the INTENTION of putting a game on multiple platforms that share the same architecture would do it that way. Why make a PS5 game ignoring PS4 if you know you will put it on PS4?

Any sane assessment would be to make one game and tweak the graphics settings for each, not tell your people to ignore PS4 and then hope and pray they don't design something for PS5 that is too hard or impossible to backport. Even if they can, the job will be much harder AND more costly than if you had just done it right to begin with. With the logic you're applying, they could simply backport Rift Apart to PS4. Such a feat would require major work on the game, resulting in a different version, like when PSX/N64/DC versions of the same game differed by much more than framerate and loading times.

It doesn't make sense from a resource management standpoint at all to target a next-gen system with the intention of backporting it later; that's giving yourself additional headaches, costs and challenges that never needed to exist. Make a game that runs on all your target hardware in the first place, and then you don't have to redo all your work later...
 

Sosokrates

Report me if I continue to console war
The only reason you're getting 60 and 120fps is because of cross gen games.

You're not going to get anywhere near that on next gen exclusive games unless developers make massive sacrifices.

I think we will get more 60fps games this gen than last gen. Returnal and Rift Apart are fully current gen and offer excellent 60fps performance.
 

BlackTron

Member
I think we will get more 60fps games this gen than last gen. Returnal and Rift Apart are fully current gen and offer excellent 60fps performance.

Agreed, we will have a lot more 60fps than before, which was sorely needed. 30 will still be a thing, especially in graphical showcase games, but even then they've started giving us options (even if half the time the option presets don't make any sense lol).
 

Sosokrates

Report me if I continue to console war
Agreed, we will have a lot more 60fps than before, which was sorely needed. 30 will still be a thing, especially in graphical showcase games, but even then they've started giving us options (even if half the time the option presets don't make any sense lol).
The question then will be whether the extra graphics of 30fps-only games are worth it. I have my doubts; after all, gameplay is king.
 

Sosokrates

Report me if I continue to console war
Worth it for their promo screenshots, no doubt.

And again, with luck, we will have an option to turn off raytraced rain water droplets for 60 fps lol.

It's going to be interesting to see what 60fps UE5 games look like with Nanite and Lumen turned down.
 

Panajev2001a

GAF's Pleasant Genius
You can't just scale everything down like you think. You'd double your workload, essentially making two games. And if you think crunch isn't an issue now, wait till they apply your "just scale it down" tactic.
If you are lucky you double the work; if you have a shared codebase and a single deliverable then it might be more… or you may just say “screw it”, design it for the base HW, and just scale resolution, framerate and texture quality, and if you have some time add some raytraced reflections and/or shadows.
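A minimal sketch of that "one game, per-platform knobs" approach; every name and value here is invented for illustration, not taken from any real engine:

```python
# Hypothetical per-platform settings table: same content, different knobs.
# Platform names are real consoles; resolutions, tiers and flags are
# invented example values.
from dataclasses import dataclass

@dataclass
class PlatformProfile:
    name: str
    resolution: tuple       # internal render resolution (w, h)
    target_fps: int
    texture_quality: str    # which asset tier / mip level ships
    rt_reflections: bool    # optional extra on stronger hardware

PROFILES = [
    PlatformProfile("PS4",     (1920, 1080), 30, "medium", False),
    PlatformProfile("PS4 Pro", (2560, 1440), 30, "high",   False),
    PlatformProfile("PS5",     (3840, 2160), 60, "ultra",  True),
]

for p in PROFILES:
    extras = " + RT reflections" if p.rt_reflections else ""
    print(f"{p.name}: {p.resolution[0]}x{p.resolution[1]} "
          f"@ {p.target_fps}fps, {p.texture_quality} textures{extras}")
```

The design is built once for the weakest target; everything listed here is a dial, not a feature, which is exactly why it doesn't double the workload.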
 

M1chl

Currently Gif and Meme Champion
CPU and I/O operations can rarely be scaled back as easily as GPU operations.

Granted, not that I believe we will see many games pushing CPUs and I/O to their limits outside of some big AAA setpieces. But take CONTROL as an example, which ran like absolute shit on last-gen HW: it's games like those, which rely heavily on the CPU, where if you cut too much you basically get a different game.
 

Rickyiez

Member
Tbh it’s easy for Rockstar. Just hire modders and let them work on GTA 6's story and assets with the GTA 5 engine.

Everything new, like gameplay, will be purely next gen :messenger_winking:
 
Remember Crysis? You needed a high-end gaming PC to play it. Despite the memes it sold poorly, and with Crysis 2 they switched to consoles.
 
Some elements can be scaled or not used at all. RTX, PhysX or TressFX are/were entirely optional on PC. But some processes are essential. Cyberpunk looks a bit like a wasteland on last gen, while Series S looks alright, just lower res. I am not sure how the PS5 I/O could even be used reasonably, since moving through tons of data requires tons of data, which would need a several-TB SSD and would also have to be hand-crafted; so in the end it's similar to the portal trick Portal or A Crack in Time already did on PS3, just refined, plus quicker loading times, which imho is actually more important than whatever would be possible in theory. But having a more capable CPU hopefully improves AI, pathfinding, animations and physics to a level no sloppy approximation on last gen could substitute for.
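For a sense of the I/O gap being discussed, a quick back-of-the-envelope calculation (assumed round figures: ~50 MB/s for a console HDD once seeks are factored in; 5.5 GB/s is the PS5 SSD's quoted raw rate):

```python
# Rough per-frame streaming budgets under assumed throughput figures.
# This is the gap a cross-gen game has to design its streaming around.

def mb_per_frame(throughput_mb_s: float, fps: int) -> float:
    """How many MB of fresh assets can arrive during one frame."""
    return throughput_mb_s / fps

hdd = mb_per_frame(50, 30)     # assumed last-gen HDD, 30 fps target
ssd = mb_per_frame(5500, 60)   # PS5 SSD raw rate, 60 fps target
print(f"HDD: ~{hdd:.2f} MB of fresh assets per frame")
print(f"SSD: ~{ssd:.2f} MB of fresh assets per frame")
print(f"ratio: ~{ssd / hdd:.0f}x")
```

Even at double the framerate, the SSD can feed the renderer over fifty times more new data per frame, which is why SSD-first level design is the one thing that genuinely can't be scaled down and is usually replaced by loading screens instead.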
 

Redneckerz

Those long posts don't cover that red neck boy
Now if all these things can be implemented scaled down on jaguar CPUs + HDD, keeping the game cross gen may not hold it back.
This is a what if scenario.

What if GTA 6 is actually a fully path traced game? How would that run on consoles?
What if GTA 6 is actually GTA IV but fixed to actually run properly?

The "if" in this argument is meaningless when you later say "if game x does this, then cross gen may not hold the consoles back".

Yes, yes it does.

We are almost 2 years into the new hardware, and stock issues keep plaguing supply. Whilst it is naturally expected that devs continue to target PS4/XBO by virtue of the customer base, that isn't to say the hardware isn't holding ambitions back.

It is remarkable what games run on PS4/XBO considering the already-anemic-in-2013 Jaguar cores. The GCN architecture has shown great longevity (something even seen on PC; a 7870/7970 still runs quite alright), but you are going to hit limits.
  • With Xbox One it's the memory setup. Slow DDR3 (compared to everything else in 2022) with a separate ultra-fast but small RAM, versus the then huge/fast-for-the-time GDDR5, and 768 GCN cores versus 1152, means you hit the limits earlier, and it shows.
  • With PS4, 8 GB GDDR5 was a fantastic deal, but you can tell by now that the PS4 hardware is hitting its limits too, particularly with the lower CPU clocks. 1152 GCN cores and 8 GB GDDR5 are a bit long in the tooth, but Jaguar performance is the real shortfall.
  • PS4 Pro is in a somewhat better ballpark thanks to 2304 GCN cores, but 8 GB GDDR5 might be limiting for future games.
  • Xbox One X is actually in a weird position. Its hardware ends up hitting higher resolutions than Series S, but at 30 FPS. This is due, again, to the Jaguar cores. With 12 GB GDDR5 and 2560 GCN cores it's still decent for the day, and both it and PS4 Pro have shown they can do raytracing in software, as seen in Crysis. But they are very much last-gen systems now. And those Jaguar cores are just outdated.
Not everything is infinitely scalable, although it is easier to achieve now that new and old consoles share the same underpinnings (x86-64). What does help is that PS4/XBO promoted many-core rendering, something pioneered by PS3 and its 8 processors. Those tools are invaluable for current-gen games.
Yes, scaled down to Jaguar, their GPUs, RAM and HDDs...
What's the issue?
Title of the topic: What if Last-gen hardware DOES NOT HOLD BACK games.

The issue is that what you are describing is actually holding back games. Scaling down = Holding back, because PS5/XSX are not fully utilized.
Please give specific examples....

What can't be scaled down?
Certain rendering and especially physics/streaming. This is all due to lack of CPU grunt or memory limitations.

While it's easy to blame that on the Jaguar cores, I'm not sure it's due to the lack of power alone.

Jaguar cores were more powerful in general purpose than what came before; all the processing areas where we're expecting to see a jump now could have had a jump last gen as well, but that didn't happen, because developers didn't focus on it for a second. It was frustrating for me, personally.
Is the bolded really the case? I mean, PS3's SPUs yes, but Xenon was pretty comparable if you ask me. Remember (well, you imply to be a programmer so I reckon you know, haha) that Jaguar was a netbook core. Power/IPC deficiencies aside, PPC could hold its own.

I agree with you on the latter parts though. The PS360 generation pioneered a lot of exciting tech and even achieved PBR. The only thing remotely comparable to that push-the-envelope spirit is the software-based raytracing solution on PS4 Pro/Xbox One X employed by Crytek for Crysis Remastered.

I also consider Alien Isolation (the only FP game with PBR and PBS; yes, Black Ops did physically based shading, and games like Remember Me/Beyond: Two Souls also attempted PBR, but Alien was the only first-person title) a graphical tour de force on PS360 for having these features, and I was forever disappointed that Digital Foundry didn't pick up on it back then and haven't done so in their revisits of the game. It is lower resolution, lower everything, but it showed that a physically based rendering model could work on these machines. The Cathode Engine was/is amazing.
 

Sosokrates

Report me if I continue to console war
The issue is that what you are describing is actually holding back games. Scaling down = Holding back, because PS5/XSX are not fully utilized.

There is a difference between lowering resolution, FPS, textures and effects, and holding a game back in terms of design. Look at the Series S, for example.
 

Redneckerz

Those long posts don't cover that red neck boy
There is a difference between lowering resolution, FPS, textures and effects, and holding a game back in terms of design. Look at the Series S, for example.
Look at Frontiers of Pandora or Metro Exodus Enhanced Edition for my examples. Both include technologies simply impossible on PS4 Pro/Xbox One X.

Yes, Crysis Remastered with software raytracing is impressive, but due to the lack of dedicated ray accelerators it's also limited. Metro Exodus, already a technically fine game, highlights in its Enhanced Edition exactly the difference between last-gen and current-gen. The kind of bounced lighting done in the Enhanced Edition is not possible on older hardware. Luckily 4A implemented a great substitute in the original version, but the visual difference is night and day.

Same would go for Ratchet & Clank Rift Apart. The visual density on display with RT on top is simply too much for the hardware.

This isn't a tale of infinite scale (that rhymes). Scaling down in this context really means holding back for most developers, unless a bespoke version is developed for PS4/XBO, similar to how Nixxes did a bespoke port of Rise of the Tomb Raider for the Xbox 360. Even there, that port played to the console's strengths.

All the cross-gen ports happening today do not play to the PS4/XBO hardware's strengths. They are downscaled from the main SKU, and this scaling down is factored into the publishers' (not developers': developers want the best hardware) demands to make their games cross-gen (due to install base).

Thus, developer ambitions get limited. Games are still being made with PS4/XBO in mind, and that "in mind" part has to go.*

*Personally I find it endlessly fascinating that PS4/XBO still have such strong support even as we cross into 2022. Imagine buying a launch PS4 or XBO (and that's with slower GPU clocks, even!) in 2013 and still getting relatively high-grade to even AAA games (AC: Mirage is still targeted at PS4/XBO, after all!) 9 years later. Incredible value for your money, and a second-hand PS4/XBO can be found rather cheap.

Not even PS360 had such strong backing in their heyday.
 
It's impossible for most games to not be held back by last gen. Even if it is something as simple as object density (how many non-static objects are available on screen), last-gen CPUs were turds at release and nothing is going to change that.
 
Is the bolded really the case? I mean, PS3's SPUs yes, but Xenon was pretty comparable if you ask me. Remember (well, you imply to be a programmer so I reckon you know, haha) that Jaguar was a netbook core. Power/IPC deficiencies aside, PPC could hold its own.
I'm not a programmer, I read a lot about it in my time, but I have no first hand inside knowledge.

This said, Xenon and the PPE weren't comparable to Jaguar in general purpose. Benchmarking is of course difficult because you can't simply run AnTuTu or Geekbench on them, but you can attempt to.

For that you have to separate the two CPU strengths/weaknesses: Dhrystone and Whetstone. Dhrystone measures general purpose (CPU duties), Whetstone measures floating-point operations. So we need a metric that measures Dhrystone performance.

The best method is DMIPS, because it was widely used by developers and even console manufacturers themselves, so there's data that isn't subjective. I remember with the Wii U we even managed, here at GAF, to have a developer run those benchmarks for us.

Xenon and the PPE+SPEs were no slouches in floating point, but for that the GPU is more effective. General purpose, though, was a mess. They didn't even manage to double the per-core general-purpose performance of Gamecube/Xbox, despite being over 4 times the clock. This happened because they handled cache misses and branches poorly and had a pipeline with a lot of stages. It's everything we don't do now.

PS3 Cell PPE: 1879.630 DMIPS @ 3.2 GHz (SPEs not taken into account because they can't run general-purpose code)
X360 Xenon: 1879.630 DMIPS × 3 = 5638.89 DMIPS @ 3.2 GHz (each 3.2 GHz core performing the same as the PS3's)
PowerPC G4: 2202.600 DMIPS @ 1.25 GHz
8-core Bobcat: 4260 × 8 = 34080 DMIPS @ 1.6 GHz (Bobcat is the CPU generation that preceded Jaguar, same foundation but worse IPC; Jaguar is at best 22% better, which is the official figure)

The 8-core Jaguar was shitty against anything else on the PC market when it launched in 2013 (one performance core on an i7 these days is better than 8 Jaguar cores combined...), but at "CPU things" it was still leaps above PS3 and Xbox 360. We're talking easily double the performance per core (possibly near 3 times for Jaguar), plus more cores: 8 versus 1 to 3.
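Running the post's own figures through a quick script shows where the "double to near 3 times per core" claim comes from:

```python
# Reproducing the DMIPS comparison from the figures quoted above.

PPE_DMIPS = 1879.63               # PS3 Cell PPE @ 3.2 GHz
XENON_DMIPS = PPE_DMIPS * 3       # three PPE-class cores in Xenon
BOBCAT_CORE = 4260.0              # Bobcat, per core @ 1.6 GHz
JAGUAR_CORE = BOBCAT_CORE * 1.22  # official ~22% IPC uplift over Bobcat

jaguar_total = JAGUAR_CORE * 8
print(f"Xenon total:  {XENON_DMIPS:8.1f} DMIPS")
print(f"Jaguar total: {jaguar_total:8.1f} DMIPS")
print(f"Jaguar core vs PPE core: {JAGUAR_CORE / PPE_DMIPS:.2f}x")
```

With these numbers a single Jaguar core lands around 2.8x the PPE's general-purpose throughput, and the full 8-core cluster is roughly 7x Xenon's total.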

Consider that Xbox One was able to emulate the Xbox 360 cpu despite it using a different architecture. You need quite a bit of performance overhead to do that.

The 7th generation, in terms of the CPUs' general-compute prowess, was akin to a wet fart. It had volume, MHz, generated heat... but was conventionally crap.

Jaguar's floating-point power was also comparable to X360's Xenon, albeit in an 8-core vs 3-core scenario this time: Jaguar on PS4, 102 GFLOPS; Xenon's 3-core floating-point peak, 115 GFLOPS. Cell with SPEs was 205 GFLOPS. But that was never an issue with the GPUs we got on PS4 and Xbox One; everything you were doing on the CPU you could move elsewhere this time.
I agree with you on the latter parts though. The PS360 generation pioneered a lot of exciting tech and even achieved PBR. The only thing remotely comparable to this semi-defined pushing the limit envelope is the software-based raytracing solution on PS4 Pro/Xbox One X employed by Crytek for Crysis Remastered.
Yes, developers were motivated and couldn't be deterred from doing things that generation. They were playing with advanced physics, destroyable objects, AI, deformable liquids, dynamic light sources running on the CPU's floating-point abilities, anti-aliasing on the CPU, you name it... all while running on potatoes. They were excited by the possibilities, I guess. With PS2/GC/Xbox we also saw some of that, with developers being really crazy with things that seemed next gen: crazy amounts of enemies on-screen, shaders on fixed-function hardware, deferred rendering, you name it.

With PS4/Xbox One we got very little of devs trying to do what the hardware couldn't on any front, bar special frame reconstruction/accumulation methods; it felt like an encore of the previous gen without the need to spend thousands of hours to do something you're not supposed to. PS5/Xbox Series S/X is looking better already, but I still don't feel that eagerness to push the envelope in every direction. Developers seem comfortable with the process of how to make a game, doing it over and over in a linear fashion. Nothing on the bucket list to implement (or no time).
I also consider Alien Isolation (the only FP game with PBR and PBS; yes, Black Ops did physically based shading, and games like Remember Me/Beyond: Two Souls also attempted PBR, but Alien was the only first-person title) a graphical tour de force on PS360 for having these features, and I was forever disappointed that Digital Foundry didn't pick up on it back then and haven't done so in their revisits of the game. It is lower resolution, lower everything, but it showed that a physically based rendering model could work on these machines. The Cathode Engine was/is amazing.
Interesting. I avoided the X360 version because of the performance.

No PBS, but didn't Metal Gear Solid V pull off very good PBR on X360/PS3? I felt it was the most balanced/performant implementation I'd seen back then.
It's impossible for most games to not be held back by last gen. Even if it is something as simple as object density (how many non-static objects are available on screen), last-gen CPUs were turds at release and nothing is going to change that.
Those are easy to cut, though, and nobody would complain.

We should be seeing PS5 games with lots of objects and PS4 versions with way fewer.

Cross platform PS3 vs PS4 games often had a huge difference in foliage for instance, for the same reasons.
 

Redneckerz

Those long posts don't cover that red neck boy
I'm not a programmer, I read a lot about it in my time, but I have no first hand inside knowledge.
Yeah, I am clearly talking about the Dhrystone marks. Of course a vis-à-vis comparison won't do; I'm kind of impressed by how crap PS360's IPC was. Yet it was still able to power a lot of games even late in the generation.
Yes, developers were motivated and couldn't be deterred from doing things that generation. They were playing with advanced physics, destroyable objects, AI, deformable liquids, dynamic light sources running on the CPU's floating-point abilities, anti-aliasing on the CPU, you name it... all while running on potatoes. They were excited by the possibilities, I guess. With PS2/GC/Xbox we also saw some of that, with developers being really crazy with things that seemed next gen: crazy amounts of enemies on-screen, shaders on fixed-function hardware, deferred rendering, you name it.
Ah yes, there have been so many good examples:
  • Shrek on Xbox: Deferred rendering
  • Transformers: Prelude to Energon on PS2: Just the density of what was shown
  • Riddick/Stolen on Xbox: Shadows baby!
  • SWAT: Global Strike Team: Prebaked lighting similar to Mirror's Edge
  • Wreckless
  • OG Xbox being able to output native 720p with hexedits on several dozen games
  • Gamecube's TEV abuse shown in Rogue Squadron, Star Fox and on Wii with Overlord: Dark Legend
With PS4/Xbox One we got very little of devs trying to do what the hardware couldn't on any front, bar special frame reconstruction/accumulation methods; it felt like an encore of the previous gen without the need to spend thousands of hours to do something you're not supposed to. PS5/Xbox Series S/X is looking better already, but I still don't feel that eagerness to push the envelope in every direction. Developers seem comfortable with the process of how to make a game, doing it over and over in a linear fashion. Nothing on the bucket list to implement (or no time).
Perhaps they were aware of the massive difference in PC performance, considering the CPU was known to exist in a quad-core config as the Athlon 5150/5350?
Interesting. I avoided the X360 version because of the performance.
It is a 25 fps affair indeed, but the core look of the game seemed pretty well matched, disregarding the fact that it uses PBR on PS360-grade hardware. It is fascinating to see a game so clearly built for PS4/XBO-level spec run and look the way it does on hardware that's now over a decade old in the case of the 360. And given the play speed of Alien Isolation, 25-ish FPS is actually less of an issue than elsewhere.
NO PBS, but didn't Metal Gear Solid V pull a very good PBR on X360/PS3? I felt it was the most balanced/performing implementation I've seen back then.
I thought that was PBS? It has some supreme lighting though, and I am amazed it runs as well as it does on PS360.
 
Yeah, I am clearly talking about the Dhrystone marks. Of course a vis-à-vis comparison won't do; I'm kind of impressed by how crap PS360's IPC was. Yet it was still able to power a lot of games even late in the generation.

Ah yes, there have been so many good examples:
  • Shrek on Xbox: Deferred rendering
  • Transformers: Prelude to Energon on PS2: Just the density of what was shown
  • Riddick/Stolen on Xbox: Shadows baby!
  • SWAT: Global Strike Team: Prebaked lighting similar to Mirror's Edge
  • Wreckless
  • OG Xbox being able to output native 720p with hexedits on several dozen games
  • Gamecube's TEV abuse shown in Rogue Squadron, Star Fox and on Wii with Overlord: Dark Legend
Yeah, I remember most of those.

Such a crazy generation. Very limited power but very different results on the same machines.
Perhaps they were aware of the massive difference in PC performance, considering the CPU was known to exist in a quad-core config as the Athlon 5150/5350?
Not sure. The CPU was of course weak, but I feel that most of all they wanted cross-platform to be easy. They weren't attempting to get close to the metal in any way, shape or form with the code they were writing. Not even trying to pull as much GPGPU code as one coming out of the X360/PS3 era would assume they'd instantly break into.

The consoles' low CPU performance meant that anything quad-core could run any game if it had the right GPU; in a multiplatform world, that was probably a very good thing for publishers to hear.
It is a 25 fps affair indeed, but the core look of the game seemed pretty well matched, disregarding the fact that it uses PBR on PS360-grade hardware. It is fascinating to see a game so clearly built for PS4/XBO-level spec run and look the way it does on hardware that's now over a decade old in the case of the 360. And given the play speed of Alien Isolation, 25-ish FPS is actually less of an issue than elsewhere.
Yes, I remember that Digital Foundry feature.

I thought that was PBS? It has some supreme lighting though, and I am amazed it runs as well as it does on PS360.
I think just PBR. But it was years ago.
 
I think we will get more 60fps games this gen than last gen. Returnal and Rift Apart are fully current gen and offer excellent 60fps performance.

But those are launch games. Launch games don't make full use of new gen hardware.

If I'm not mistaken Insomniac were saying they're not anywhere close to tapping PS5 hardware potential.

When we get to year 4 or 5 and games are next gen exclusives, you'll start to see the frames go down in these games. There will be performance mode options but they'll make visual sacrifices (as they always do) to retain the 60FPS.
Good thing is that these options will become standard for console gamers. For me personally, I can never go back to 30FPS after being spoilt by PC gaming north of 100FPS in most games. 30FPS gives me a headache.

What you're not going to see, though, is all next gen games capped to 60FPS as an industry standard.
 
I think your thread title will inflame all console warriors for miles around. Other than prettier visuals, what's the point of next gen if every title can be cross-gen? We buy into them for the gen exclusivity. Your title says "screw that".
And this is exactly what the publishers have all been saying to people that bought a next gen system!
 

Sosokrates

Report me if I continue to console war
But those are launch games. Launch games don't make full use of new gen hardware.

If I'm not mistaken Insomniac were saying they're not anywhere close to tapping PS5 hardware potential.

When we get to year 4 or 5 and games are next gen exclusives, you'll start to see the frames go down in these games. There will be performance mode options but they'll make visual sacrifices (as they always do) to retain the 60FPS.
Good thing is that these options will become standard for console gamers. For me personally, I can never go back to 30FPS after being spoilt by PC gaming north of 100FPS in most games. 30FPS gives me a headache.

What you're not going to see, though, is all next gen games capped to 60FPS as an industry standard.
Your guess is as good as mine. But devs always say they have not fully tapped the hardware at launch, which can mean a lot of things.
Recent PS4 games don't look that much better than Killzone Shadow Fall or Ryse: Son of Rome.

I really don't know how the game engines and tools will mature this gen. Housemarque used UE4, but they built a lot of custom things on top of it.
I do think there will still be more 60fps games this gen, simply because I think fewer visual features will need to be sacrificed.
We still have not seen what UE5 games look like at 60fps on console. I suspect a hybrid approach for the engine: if there is a Returnal 2 on UE5, I think they will use Nanite for the environment and the new improved UE5 fluid and particle systems, but they might not use Lumen and instead use a baked lighting solution in order to achieve 60fps.
 

Hobbygaming

has been asked to post in 'Grounded' mode.
No platforms announced and it's really just a concept, but this is what current gen could do

 

Topher

Gold Member
The topic is perfectly fine dude, I was just fucking with you.

What's up with your avatar? I'm still showing off this thing Ass of Can Whooping made me show. Can't honor a bet?

 

TheGecko

Banned
It's a myth developers and publishers have peddled and the dopes have sucked up.

Has minimum spec on PC ever held back one of its exclusives? No, but keep making excuses for these so-called game devs.
 

Caio

Member
I'm not saying these things in the new consoles don't make a difference; they clearly allow better frame rates and faster loading times.

However, maybe they are not needed in some games to implement new features, improve systems and gameplay.

I've been thinking about GTA6.
I imagine that when designing that game the developers will have a list of things they want to improve and implement, things such as:

  • Improved wanted system and police behaviour
  • More interiors with higher detail
  • More realistic NPC behaviours and routines
  • Larger and more detailed world.
  • Etc...

Now if all these things can be implemented, scaled down, on Jaguar CPUs + HDD, keeping the game cross-gen may not hold it back.


I'm not saying this will happen with all games, but this gen and last gen may be more scalable than in the past.

So cross-gen might not mean games are necessarily "held back".

What do you think, constant reader?
It is technically possible, even though it would be much more expensive to develop a game which takes advantage of several APUs/specs. I would love to have scalable AI, physics, collision detection, animations, etc., but I'm not sure it would be feasible for most developers/publishers. We would need very advanced development kits to help; who knows if, one day, this will be applicable. By the way, I fully understand your point.
 

Redneckerz

Those long posts don't cover that red neck boy
Yeah, I remember most of those.
One of the reasons I still game on X360. Mirror's Edge (yes, I know I can play that on One X) still looks very next-gen to me, despite the jaggies.
Not sure. The CPU was of course weak, but I feel most of all they wanted crossplatform to be easy. They weren't attempting to get close to the metal in any way, shape, or form. They weren't even pulling as much GPGPU code as anyone coming out of the X360/PS3 era would have assumed they'd instantly dive into.
Except for Killzone Shadow Fall. It's close to 10 (!) years old but it still looks amazing.
It's still impressive what games run relatively fine on last-gen PS4/XBO. This gen spoiled us so thoroughly on solid 30 fps that a PS360-ish "cinematic" 30 fps gets booed now, when that was common ground (and accepted!) the generation before.
The weak console CPUs meant that anything quad-core could run any game if paired with the right GPU; in a multiplatform world, that was probably a very good thing for publishers to hear.
A Core 2 Duo/Core 2 Quad and an 8800 GT with 2/4 GB DDR2 and you were golden for the entire generation.

I think just PBR. But it was years ago.
Is it really? Because PBS, as far as I can tell, is a shading workflow, whereas PBR refers to a more generalized workflow.
This is the kind of leap I expect from this gen.
I too would like to be impressed by cinematic gameplay.
 