
Developer Speaks About Lockhart "Holding Back Next Gen" and PS5 vs. XSX Dev Kits

Bogroll

Likes moldy games
Key is geometry (number of polygons on screen), number of textures (e.g. in CGI movies there are more than 100 textures for a single person model on average) and texture resolution. It requires a good GPU with plenty of VRAM and good I/O to pull off, and it makes more visual difference than the whole 4K focus (personally I see very little difference beyond 1600-1800p on a 65 inch TV). More advanced geometry, more texture diversity and higher-resolution textures, however, you see straight away.
I'm with you on resolution; I'm happy with 1440p and up on a 49". While there's always room for improvement, I think games can look pretty good now. I'm wishing for a lot more physics and destruction of objects in the environments, e.g. gunfire destroying things in a room like in Control but better. I don't want to be walking around amazing-looking environments but then be drawn out of the game because it feels like I'm walking around a film set at Universal Studios.
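To put rough numbers on the quoted point about texture resolution and VRAM, here is a small sketch. The figures assume uncompressed RGBA8 textures with a full mip chain, which is a deliberate simplification: real games use block compression, so actual costs are several times lower.

```python
def texture_bytes(width, height, bytes_per_pixel=4, mips=True):
    """Uncompressed texture size in bytes; a full mip chain adds roughly 1/3."""
    base = width * height * bytes_per_pixel
    return int(base * 4 / 3) if mips else base

# A character model with 100 textures, at 2K vs. 4K per texture:
per_2k = texture_bytes(2048, 2048)
per_4k = texture_bytes(4096, 4096)
print(f"2K: {per_2k / 2**20:.0f} MiB each, {100 * per_2k / 2**30:.1f} GiB for 100")
print(f"4K: {per_4k / 2**20:.0f} MiB each, {100 * per_4k / 2**30:.1f} GiB for 100")
```

Doubling texture resolution quadruples memory cost, which is why VRAM capacity and I/O, rather than shader TFLOPs, tend to gate texture quality.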
 

Three

Member
And yet in the same ballpark, no games have to be "held back" or "made from scratch" for XSS, and even if resolution has to fall a notch to keep performance on XSS, it wouldn't remotely be PS5 or XSX players' business; they're another market.
It will be their business, even for those on PC. Look at how SVOGI was removed from UE4. If a game has something that changes the way it looks or the workflow dramatically enough but isn't supported on one platform, they will can it to achieve a similar look/workflow across all platforms.
 
Last edited:

Humdinger

Member
I'm just asking the arm-chair developers of Sony GAF how they are going to identify when a game has been held back by a 4TF machine and when it hasn't.

I wouldn't even claim to be technically savvy, but I've been following the discussion, so I'll attempt an answer. Those who know more, please feel free to correct me or add to what I'm saying.

There will be no way to identify, post hoc (after release), when a game has been held back by the XSS. That is part of the problem.

What people are saying is that the "holding back" comes at the initial stages -- the stage when ambitions and development targets are set. They are saying that these ambitions and targets will be limited by the lowest-spec console (XSS). Therefore, any limitations to the higher-end console versions will not be visible, because they have been baked into the game from the beginning. Certain ambitions, design elements, or features have been scaled back, but this has happened at the early stages of development. The developer has accommodated the lower-spec unit from the start.

Once the development process is up and running, then each version (XSS, XSX, and PS5) can be optimized quite well, within its own parameters. Any comparison of different versions, post-release, will not necessarily demonstrate a consistent pattern. That is, you won't be able to see consistent differences after the fact, except for the obvious ones such as lower resolution on XSS. Otherwise, it could be that XSS versions run better in some ways than XSX versions or PS5 versions. It entirely depends on how that developer optimizes that particular version.

But the problem is that the limitations on high-end versions have been installed early in the process.

It's sort of like if you had to cook a meal, the same meal, that would satisfy people with normal diets and vegetarians. You'd have to leave certain ingredients off the menu from the start (e.g., meat), because you knew the meal had to please the vegetarians. That's a crude analogy that doesn't map really well onto what we're talking about, but I'm trying to make the point that the initial limitations on the "meal" are set from the start, so trying to determine differences after the development process is complete (i.e., after a game is released) is not going to reveal them. The limitations were set much earlier.

That was longwinded, lol.
 

Elog

Member
I'm with you on resolution; I'm happy with 1440p and up on a 49". While there's always room for improvement, I think games can look pretty good now. I'm wishing for a lot more physics and destruction of objects in the environments, e.g. gunfire destroying things in a room like in Control but better. I don't want to be walking around amazing-looking environments but then be drawn out of the game because it feels like I'm walking around a film set at Universal Studios.

Agree. The more you dig, the more you realise that Cerny was on point in his speech earlier this year.

To comfortably push pixels at 1800p or so, you end up with high single-digit TFLOPs (8+ or so) as a requirement. After that, frequency (i.e. time) will determine how much post-processing you can get into a single frame, I/O and the VRAM pool will determine the richness and resolution of textures, and geometry handling will determine the number of polygons you can handle. Assuming that Cerny was truthful about the PS5, it seems to have hit those sweet spots well (the right amount of TFLOPs, high frequency, a highly customised geometry engine and best-in-class I/O).

That leaves RT and advanced physics as the next two big items. And I agree that physics is a big one; it is intrinsically linked to geometry though (together with processing power). We will see what can be achieved. As for the PS5, I am very curious what the RT hardware solution actually is; we know basically nothing about it right now except that it is quite a capable solution.
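Elog's resolution arithmetic can be sketched directly. The anchor figure of 8 TFLOPs at roughly 1800p is his estimate, not a measured number, and the linear pixel-count scaling below is a naive assumption (real workloads don't scale perfectly with resolution):

```python
# Illustrative only: assumes GPU cost scales linearly with pixel count.
RESOLUTIONS = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "1800p": (3200, 1800),
    "2160p": (3840, 2160),
}

def pixels(res):
    w, h = RESOLUTIONS[res]
    return w * h

def naive_tflops(res, anchor_tflops=8.0, anchor="1800p"):
    """Scale the hypothetical '8 TFLOPs at 1800p' anchor by pixel count."""
    return anchor_tflops * pixels(res) / pixels(anchor)

for name in RESOLUTIONS:
    print(f"{name}: {pixels(name):>9,} px -> ~{naive_tflops(name):.1f} TFLOPs")
```

On this naive model, native 2160p costs about 1.44x the GPU of 1800p while being hard to distinguish on a living-room TV, which is the diminishing-returns point being made.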
 
Everyone seems concerned with what devs can do and not at all with what they will do. Everything people want for next gen seems like a lot more work than currently, and games are already delayed a bunch. Anyone here knowledgeable in this area? Are we at a point where what is possible just isn't financially viable? Could dev time/cost be the real low-spec target?
These are actual questions, asked sincerely.
 

ZehDon

Member
... But the problem is that the limitations on high-end versions have been installed early in the process...
I wouldn't be too concerned. Your post describes what's known as a feasibility study. Given that we've had Doom 3 running on an original Xbox, and The Witcher 3 running on a Switch, there's really not much to worry about; devs are pretty crafty. The XSS has, effectively, the same CPU as the PS5, the same SSD as the XSX, and RAM and GPU components scaled to its resolution target. The types of limitations you're inferring require significantly more differences than this to manifest as anything gameplay- or feature-impacting. The XSS is, basically, a low-graphics 1080p XSX, which was the goal. Its games will have closer draw distances, lower internal resolutions, fewer simultaneous effects, lower levels of texture filtering, fewer LOD transitions, and lower frame rates. In terms of gameplay impact, I'm not really seeing why there would be any. This type of graphics scaling is ever-present in the PC realm, to even more extremes than what we're seeing between the XSX and the XSS. Will developers need to do more work to accommodate a second performance profile for the Xbox consoles? Sure. But Sony asked them to do the same with the PS4 Pro, and that seemed to go fine. Did Red Dead Redemption 2 look kinda crappy on the OG Xbone? Yeah, it did. Did that stop it from looking beautiful on my PS4 Pro? No, not at all.
 
Last edited:
Everyone seems concerned with what devs can do and not at all with what they will do. Everything people want for next gen seems like a lot more work than currently, and games are already delayed a bunch. Anyone here knowledgeable in this area? Are we at a point where what is possible just isn't financially viable? Could dev time/cost be the real low-spec target?
These are actual questions, asked sincerely.
You don't need to be CoD to make money. It's just that the more competitive the genre, the harder it is to make money. So, for example, were you to try and make another GaaS Fortnite killer, or an MMO, then it simply wouldn't be financially viable to make only a little money; you'd have to take over the genre as a smash hit to make your money back.

On the other hand, if you were smart enough to notice that single player story based games don't seem to directly compete with others of the same type, then you might realize you can make a game with a modest budget and make a profit.

Good luck to any studio trying to make AAAA GaaS games. They are going to need it.
 

Humdinger

Member
I wouldn't be too concerned. [...] The XSS is, basically, a low-graphics 1080p XSX, which was the goal. Its games will have closer draw distances, lower internal resolutions, fewer simultaneous effects, lower levels of texture filtering, fewer LOD transitions, and lower frame rates. In terms of gameplay impact, I'm not really seeing why there would be any. This type of graphics scaling is ever-present in the PC realm, to even more extremes than what we're seeing between the XSX and the XSS.

As I understand it, the concern is not about graphics scaling. I think everyone agrees that the sorts of things you mention can be readily adjusted. The concern is about other aspects of design that aren't just visual -- e.g., AI, physics, enemy crowd size, level complexity, environmental interaction, or design options like rapid transition between worlds (R&C).
 
Last edited:

Jon Neu

Banned
There will be no way to identify, post hoc (after release), when a game has been held back by the XSS. That is part of the problem.

What people are saying is that the "holding back" comes at the initial stages -- the stage when ambitions and development targets are set. They are saying that these ambitions and targets will be limited by the lowest-spec console (XSS). Therefore, any limitations to the higher-end console versions will not be visible, because they have been baked into the game from the beginning. Certain ambitions, design elements, or features have been scaled back, but this has happened at the early stages of development. The developer has accommodated the lower-spec unit from the start.

Once the development process is up and running, then each version (XSS, XSX, and PS5) can be optimized quite well, within its own parameters. Any comparison of different versions, post-release, will not necessarily demonstrate a consistent pattern. That is, you won't be able to see consistent differences after the fact, except for the obvious ones such as lower resolution on XSS. Otherwise, it could be that XSS versions run better in some ways than XSX versions or PS5 versions. It entirely depends on how that developer optimizes that particular version.

But the problem is that the limitations on high-end versions have been installed early in the process.

It's sort of like if you had to cook a meal, the same meal, that would satisfy people with normal diets and vegetarians. You'd have to leave certain ingredients off the menu from the start (e.g., meat), because you knew the meal had to please the vegetarians. That's a crude analogy that doesn't map really well onto what we're talking about, but I'm trying to make the point that the initial limitations on the "meal" are set from the start, so trying to determine differences after the development process is complete (i.e., after a game is released) is not going to reveal them. The limitations were set much earlier.

That was longwinded, lol.

Thank you for actually following a well-reasoned conversation without resorting to ad hominems.

Now, what you say is obviously true, but there is a fundamental problem with the premise: it assumes that all games are going to have Series S limitations in mind. Or, put in other words, it assumes that all games are going to be programmed for the Series S and then ported to the Series X, and I don't think that's what's going to happen all the time. And also, I think people who are so concerned with the Series S holding back this gen will not be capable of discerning the difference between a multiplatform game made with the Series S limitations in mind and a game with the Series S out of the picture in the development stages.

Also, let's be clear: most developers don't squeeze the consoles at all in the first place, no matter the generation. I'm pretty sure the next Rockstar game is going to look amazing across all the consoles. I couldn't care less if other third-party devs had to do extra work or limit their "vision" when most of the time that vision was never going to put the hardware through its paces anyway.
 

Three

Member
I wouldn't be too concerned. Your post describes what's known as a feasibility study. Given that we've had Doom 3 running on an original Xbox, and The Witcher 3 running on a Switch, there's really not much to worry about; devs are pretty crafty. The XSS has, effectively, the same CPU as the PS5, the same SSD as the XSX, and RAM and GPU components scaled to its resolution target. The types of limitations you're inferring require significantly more differences than this to manifest as anything gameplay- or feature-impacting. The XSS is, basically, a low-graphics 1080p XSX, which was the goal. Its games will have closer draw distances, lower internal resolutions, fewer simultaneous effects, lower levels of texture filtering, fewer LOD transitions, and lower frame rates. In terms of gameplay impact, I'm not really seeing why there would be any. This type of graphics scaling is ever-present in the PC realm, to even more extremes than what we're seeing between the XSX and the XSS. Will developers need to do more work to accommodate a second performance profile for the Xbox consoles? Sure. But Sony asked them to do the same with the PS4 Pro, and that seemed to go fine. Did Red Dead Redemption 2 look kinda crappy on the OG Xbone? Yeah, it did. Did that stop it from looking beautiful on my PS4 Pro? No, not at all.
Sometimes a limitation can be gameplay-enhancing, and you make a choice on that based on what platforms you choose to support. Let me give you an example. SVOGI was meant to be an Unreal Engine 4 feature. It was deemed inefficient because it required a lot of resources, but the main reason it was removed from the engine was that the PS4 and Xbox One would have had to give up a lot to have it. This didn't just affect console games; PC games no longer had it in UE4 either. Having no SVOGI means you have to do baked lighting.

This changes your workflow in creating your game. It can even limit gameplay. For example, GT Sport wanted global illumination but couldn't do SVOGI. It settled for an in-house baked lighting method known as 'Iris' instead. This effectively meant no day-night transitions in the game.

If GT Sport now came to PC, or if it were a multiplatform game and you had a console that is more than 2 times as powerful, they wouldn't change their workflow or engine just for the one capable platform. They would take them all into account and likely not offer day-to-night transitions on any platform. These are the types of choices you make when taking limited hardware into account. You aim for your engine to be compatible with all platforms. Could the PS4 Pro do SVOGI, since it is more than 2 times more powerful? Maybe (it would probably require more RAM too), but you wouldn't have two versions of GT Sport, one with SVOGI and time-of-day transitions and one without.

These are the ways platforms that are more than 2 times as powerful are held back by popular weak hardware. Do you change your workflow/engine/game for that other platform, or do you use the extra power to simply bump res/fps? Almost everyone will choose the second.
 
Last edited:

Humdinger

Member
Thank you for actually following a well-reasoned conversation without resorting to ad hominems.

No problem. You're asking a legitimate question, one that I was wondering about myself.

Now, what you say is obviously true, but there is a fundamental problem with the premise: it assumes that all games are going to have Series S limitations in mind. Or, put in other words, it assumes that all games are going to be programmed for the Series S and then ported to the Series X, and I don't think that's what's going to happen all the time.

I agree. I left out some qualifications/nuances in my post, because I wanted to stick to the main point. But I'll add them now.

I don't think the scenario I sketched out will happen in all cases. It will happen in some, but it's hard to say how many. Maybe it will happen in the majority, maybe in the minority. It may depend on the project size or budget, or on the ambitions of the developer.

I also don't think that the things left on the cutting room floor will necessarily be significant. Maybe they will be -- maybe they'll be important features or design decisions. But maybe they will be trivial things not worth worrying about. It's hard to say; impossible, really.

And also, I think people who are so concerned with the Series S holding back this gen will not be capable of discerning the difference between a multiplatform game made with the Series S limitations in mind and a game with the Series S out of the picture in the development stages.

Well, since the differences won't be apparent post-release, for the reasons mentioned, the only one who would really know would be the developer.

And this would have to be a developer who's working on a AAA game, built from the ground up for next-gen, built for all three consoles, without any ties to the previous generation. Problem is, that's not going to happen for a year or two. We know most third-party AAA games are going to be cross-gen for a while, and we know MS is tethered to last-gen for the next two years. So we won't even get a chance to observe this for at least a year or two, post-launch.

Another problem (as if we don't have enough) is that developers will be reluctant to admit that they held back their projects because of the XSS: "Yeah, we scaled back the ambition of our project." That could get them in trouble. It brings negative attention to the project. So even when it happens, getting a developer to talk about it may be difficult. It certainly won't come from an MS developer.

For now, it's all speculation and conjecture. I think there is some reason for concern, because statements by half a dozen experienced developers suggest that the XSS could impose some limitations. But how common those limitations will be, and how significant they will be, is anyone's guess -- and it's going to be a couple years before we even have a chance to find out.
 
Last edited:
So, exactly as I've been saying since day dot. Same experience, same game, same feature set; lower resolution and possibly effects turned down in rare cases.

I mean anybody who actually understands how game engines work in relation to system hardware could tell you that.

For most games it shouldn't be an issue, given the strong CPU and vast amounts of RAM. But you could imagine that a few select games using the GPU extensively for physics or some AI features might have trouble scaling down.

I'm wishing for a lot more physics and destruction of objects in the environments, e.g. gunfire destroying things in a room like in Control but better. I don't want to be walking around amazing-looking environments but then be drawn out of the game because it feels like I'm walking around a film set at Universal Studios.
Some physics solutions actually use the GPU, IIRC. Would more advanced physics require more TFLOPs from the GPU? How would that be scaled down?
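One common answer to the question above is to scale the simulation budget rather than the simulation rules: cosmetic debris counts drop on the weaker GPU while gameplay-relevant physics stays identical. A hypothetical sketch; the platform budget numbers are invented for illustration, not real console figures:

```python
# Hypothetical per-platform budgets for cosmetic debris particles.
DEBRIS_BUDGET = {"XSS": 2_000, "XSX": 8_000, "PS5": 8_000}

def spawn_debris(platform, requested):
    """Clamp a destruction event's debris count to the platform's budget."""
    return min(requested, DEBRIS_BUDGET[platform])

def simulate_step(positions, velocities, dt=1 / 60, gravity=-9.81):
    """Naive CPU stand-in for a GPU particle update: one Euler step."""
    for p, v in zip(positions, velocities):
        v[1] += gravity * dt  # gravity acts on the vertical axis
        p[0] += v[0] * dt
        p[1] += v[1] * dt
    return positions

# The same explosion asks for 10,000 chunks everywhere; only the count differs.
for platform in DEBRIS_BUDGET:
    print(platform, spawn_debris(platform, 10_000))
```

Because the per-particle update is unchanged, the cheaper version is the same game with less visual noise; trouble only starts if the debris itself matters to gameplay (e.g. rubble that blocks a path), which is exactly the scenario the thread is debating.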
 

ZehDon

Member
Sometimes a limitation can be gameplay-enhancing, and you make a choice on that based on what platforms you choose to support. Let me give you an example. SVOGI was meant to be an Unreal Engine 4 feature. It was deemed inefficient because it required a lot of resources, but the main reason it was removed from the engine was that the PS4 and Xbox One would have had to give up a lot to have it. This didn't just affect console games; PC games no longer had it in UE4 either. Having no SVOGI means you have to do baked lighting.

This changes your workflow in creating your game. It can even limit gameplay. For example, GT Sport wanted global illumination but couldn't do SVOGI. It settled for an in-house baked lighting method known as 'Iris' instead. This effectively meant no day-night transitions in the game.

If GT Sport now came to PC, or if it were a multiplatform game and you had a console that is more than 2 times as powerful, they wouldn't change their workflow or engine just for the one capable platform. They would take them all into account and likely not offer day-to-night transitions on any platform. These are the types of choices you make when taking limited hardware into account. You aim for your engine to be compatible with all platforms. Could the PS4 Pro do SVOGI, since it is more than 2 times more powerful? Maybe (it would probably require more RAM too), but you wouldn't have two versions of GT Sport, one with SVOGI and time-of-day transitions and one without.

These are the ways platforms that are more than 2 times as powerful are held back by popular weak hardware. Do you change your workflow/engine/game for that other platform, or do you use the extra power to simply bump res/fps? Almost everyone will choose the second.
What you're talking about has existed for as long as consoles have: the lowest common denominator. In terms of impact, consoles as a whole have potentially created any number of such hypothetical limitations for the past four decades. The XSS as an exclusive limitation on multi-platform games, above and beyond the limitations that the XSX itself might impose, is basically non-existent - which is the topic of the thread. You're talking about "consoles". I'm talking about a specific console, as the title of the thread discusses. For example, the Xbox One X and the PS4 Pro now use SVOGI courtesy of CryEngine and Crysis Remastered. The base consoles don't get access to that particular feature. Fun fact: it's also enabled on the Switch version of Crysis. How, and why? The key difference in the equation is that the power differential between the PS4's and Xbone's Jaguar CPUs and desktop PCs of that era was enormous; both machines were under-powered and obsolete the day they released, thanks in no small part to their laptop CPU hearts. The Zen 2 CPUs in the next-gen consoles about to release are still behind the bleeding edge of desktops, but they're far more competitive, while both consoles are also packing a wealth of custom hardware designed to lessen the load on the CPU. This will let developers really put the full CPU to use. The XSS has, effectively, the same CPU as the PS5, which is just shy of the CPU in the XSX.
I've already highlighted the potential limitations of the XSS hardware: with the CPU and SSD being of equivalent spec to the XSX, the RAM and GPU components can be, and apparently have been, efficiently scaled down for lower asset quality and resolution targets, without sacrificing anything related to gameplay (though dropping from 60 to 30 in some titles will happen). I don't feel you've really added, changed, or revealed anything with your post. Any concerns, such as the one described, that you have for the next generation of consoles are concerns you had for the Xbox One and PS4, the Xbox 360 and the PS3, and the Xbox and PS2. The XSS offers basically no additional cause for concern that I can see.
 
Last edited:

Thirty7ven

Banned

Big games that are graphical showpieces are going to look worse and worse on XSS as the gen goes by.

To be fair, the only problem here is MS's deceitful messaging of the XSS as a 1440p machine. Not surprising coming from MS, but still disappointing.
 

Greeno

Member
Everyone seems concerned with what devs can do and not at all with what they will do. Everything people want for next gen seems like a lot more work than currently, and games are already delayed a bunch. Anyone here knowledgeable in this area? Are we at a point where what is possible just isn't financially viable? Could dev time/cost be the real low-spec target?
These are actual questions, asked sincerely.
I think this is exactly it: when the Remedy developer was asked later why he "smells trouble", he confirmed that it was because such a device requires more optimization. Which is logical.

I honestly don't think game design will be affected by this. If it is affected, we will start hearing about it in the first year through Digital Foundry and other sources. I just don't see where the Series S will hold back game design or level design (it is similar in CPU and storage). The GPU will finally be put to the task that a GPU does best: the visuals.
 
Last edited:
There seems to be a push to dispel this narrative of weaker consoles holding back next-gen experiences. And despite what we may think individually, it's weird to think that Microsoft is still struggling to get its messaging right.

In terms of resources they're second only to Apple, but they still struggle to stop these types of narratives before they become ingrained.
MS will lose all the time in that department because some media and some fanboys/girls will never give MS the credit they deserve.
 
Last edited:
MS will lose all the time in that department because some media and some fanboys/girls will never give MS the credit they deserve.
MS does not deserve ANY credit. PC gaming has been held back by the Xbox One for 7 years, and now Xbox has the nerve to claim that PC has weaker specs and thus is to blame?
No, you don't get to pull that crap on us. MS never cared about PC gaming and they still don't. They just want PC gaming as a scapegoat and an excuse. Them TALKING about PC gaming didn't change the fact that they only did so to their advantage and didn't even tell the truth.

And now we have the Series S, which is basically a slight side-grade of the 1X, that wants to hold PC back for ANOTHER 7 years. Thankfully, I doubt their plan will work this time around, because for them to pull it off the Series S would actually have to be popular.

Good luck with that.
 

MrFunSocks

Banned
As I understand it, the concern is not about graphics scaling. I think everyone agrees that the sorts of things you mention can be readily adjusted. The concern is about other aspects of design that aren't just visual -- e.g., AI, physics, enemy crowd size, level complexity, environmental interaction, or design options like rapid transition between worlds (R&C).
You mean CPU stuff? Not an issue at all since it's the same CPU.
 
Last edited:
MS does not deserve ANY credit. PC gaming has been held back by the Xbox One for 7 years, and now Xbox has the nerve to claim that PC has weaker specs and thus is to blame?
No, you don't get to pull that crap on us. MS never cared about PC gaming and they still don't. They just want PC gaming as a scapegoat and an excuse. Them TALKING about PC gaming didn't change the fact that they only did so to their advantage and didn't even tell the truth.

And now we have the Series S, which is basically a slight side-grade of the 1X, that wants to hold PC back for ANOTHER 7 years. Thankfully, I doubt their plan will work this time around, because for them to pull it off the Series S would actually have to be popular.

Good luck with that.
I will not argue back and forth; that is your opinion.
 
I wouldn't even claim to be technically savvy, but I've been following the discussion, so I'll attempt an answer. Those who know more, please feel free to correct me or add to what I'm saying.

There will be no way to identify, post hoc (after release), when a game has been held back by the XSS. That is part of the problem.

What people are saying is that the "holding back" comes at the initial stages -- the stage when ambitions and development targets are set. They are saying that these ambitions and targets will be limited by the lowest-spec console (XSS). Therefore, any limitations to the higher-end console versions will not be visible, because they have been baked into the game from the beginning. Certain ambitions, design elements, or features have been scaled back, but this has happened at the early stages of development. The developer has accommodated the lower-spec unit from the start.

Once the development process is up and running, then each version (XSS, XSX, and PS5) can be optimized quite well, within its own parameters. Any comparison of different versions, post-release, will not necessarily demonstrate a consistent pattern. That is, you won't be able to see consistent differences after the fact, except for the obvious ones such as lower resolution on XSS. Otherwise, it could be that XSS versions run better in some ways than XSX versions or PS5 versions. It entirely depends on how that developer optimizes that particular version.

But the problem is that the limitations on high-end versions have been installed early in the process.

It's sort of like if you had to cook a meal, the same meal, that would satisfy people with normal diets and vegetarians. You'd have to leave certain ingredients off the menu from the start (e.g., meat), because you knew the meal had to please the vegetarians. That's a crude analogy that doesn't map really well onto what we're talking about, but I'm trying to make the point that the initial limitations on the "meal" are set from the start, so trying to determine differences after the development process is complete (i.e., after a game is released) is not going to reveal them. The limitations were set much earlier.

That was longwinded, lol.
Here is the fault in the idea that the XSS will hold games back without it being noticeable: the PS5 will be the dominant platform worldwide. It would make tons more sense for third-party developers to use it as the lead platform. If it is the lead platform, then deficiencies between the other consoles should be easily identified.

The other issue I have is that people are giving credence to the developer from id Tech... Understandable, but he admits he doesn't have a Lockhart dev kit or the development tools provided. The OP's vetted source claims that the Lockhart dev kit and tools made a world of difference. I would be inclined to believe those who have access to the tools over those who don't. Unfortunately these threads get bombarded by the same Sony die-hards who cry doom on every Xbox thread. We all get it: the Xbox is doomed and it's going to look like a 16-bit console, Cerny is infallible, and the PS5 will cure the world's illnesses.
 
Last edited:

freefornow

Member
Big games that are graphical showpieces are going to look worse and worse on XSS as the gen goes by.

To be fair the only problem here is MS deceitful message of XSS as a 1440p machine. Not surprising coming from MS but still disappointing.
If MS are being "deceitful" as you say, then they will pay a hefty price for that.
And as far as The Terror that Flaps in the Night's comments re Xbox go, well, let's just say he has runs on the board.
 
Last edited:
The GeForce 3000 series is coming this week, but according to the OP it doesn't mean anything, because with enough time and resources we can make any game work on a Riva TNT, so, "I understand what he's trying to say".
 

quest

Not Banned from OT
The GeForce 3000 series is coming this week, but according to the OP it doesn't mean anything, because with enough time and resources we can make any game work on a Riva TNT, so, "I understand what he's trying to say".
You guys are still at it with the horrible examples. The TNT is not the same generation as the 3000 series. It would be like making a game work on both the 3070 and a 3050.
 

Three

Member
What you're talking about has existed for as long as consoles have: the lowest common denominator. In terms of impact, consoles as a whole have potentially created any number of such hypothetical limitations for the past four decades.
So if a console has this effect on higher spec PCs why do you believe this effect would not exist on a 2x+ more powerful console?

The XSS as an exclusive limitation, above and beyond the limitations that the XSX itself might impose, on multi-platform games is basically non-existent - which is the topic of the thread. You're talking about "consoles". I'm talking about a specific console, as the title of the thread discusses. For example, the Xbox One X and the PS4 Pro now use SVOGI courtesy of CryEngine and Crysis Remastered. The base consoles don't get access to that particular feature. Fun fact: it's also enabled on the Switch version of Crysis. How, and why? The key difference in the equation is that the power differential between the PS4 and Xbone's Jaguar CPUs and the desktop PCs of that era was enormous - both machines were under-powered and obsolete the day they released, thanks in no small part to their laptop CPU hearts. The Zen 2 CPUs in the next-gen consoles about to release are still behind the bleeding edge of desktops, but they're far more competitive, and both consoles are also packing a wealth of custom hardware designed to lessen the load on the CPU. This will let developers really put the full CPU to use. The XSS has, effectively, the same CPU as the PS5, which is just shy of the CPU in the XSX.
I've already highlighted the potential limitations of the XSS hardware - with the CPU and SSD being of equivalent spec to the XSX, the RAM and GPU components can be, and apparently have been, efficiently scaled down for lower asset quality and resolution targets, without sacrificing anything related to gameplay (though dropping from 60 to 30 in some titles will happen). I don't feel you've really added, changed, or revealed anything with your post. Any concern, such as the one described, that you have for the next generation of consoles is a concern you had for the Xbox One and PS4, the Xbox 360 and the PS3, and the Xbox and PS2. The XSS offers basically no additional cause for concern that I can see.
I'm talking about a specific console too, through an example. Crysis Remastered is again a good example of exactly what I'm saying, and shows that SVOGI was held back for most of this gen. SVOGI was ready tech right at the beginning of the current gen in UE4, but it was dropped from the engine right before the PS4 and XB1 released, so few if any games used it even on PC. It didn't make sense to have it because the popular devices at the time didn't support it well.

The reason you have seen that effort put in for the Pro and X now in Crysis Remastered is because the game hasn't released yet, and that investment made sense for the upcoming next-gen consoles too, which I'm sure it will release on.
I believe you would NOT have seen that effort otherwise, even on PC, had this game released at the beginning or middle of this gen. All you would get is increased res or fps. They would have baked it across all of them or used an alternative.

This will be happening in the upcoming gen too. An id Software engine developer has already come out and said that the amount of RAM in the Series S is low, and things like the ray-tracing BVH take up a lot of RAM. What do you think will happen if, say, the Series S makes up 90% of sales in the upcoming gen? Engine developers strive for easy development and parity (in development) across all supported platforms. If one very popular platform doesn't support the RT BVH well, for example, they will come up with an alternative that gets similar results (with limitations and compromises) and make the workflow the same across all platforms, just as Unreal Engine 4 did at the beginning of last gen by removing SVOGI. You don't want a completely different way of working just for one platform.
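To illustrate the parity argument, here's a toy sketch of how an engine team might pick one technique for every target rather than maintain two workflows. This is purely illustrative Python; the platform names and capability flags are invented placeholders, not real engine code:

```python
# Toy model of "lowest common denominator" feature selection: the engine
# ships the one GI technique that *every* supported platform can run.
def pick_gi_technique(platforms: dict) -> str:
    """platforms maps platform name -> whether it handles an RT BVH well."""
    if all(platforms.values()):
        return "rt_gi"      # every target can afford the BVH, so use it
    return "baked_gi"       # one weak-but-popular platform drags everyone down

# Hypothetical capability table for the argument above.
targets = {"PC": True, "XSX": True, "PS5": True, "XSS": False}
print(pick_gi_technique(targets))  # baked_gi: one workflow for all platforms
```

The point of the sketch is just that a single `False` in the table changes the output for everyone, which is the SVOGI story in miniature.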
 
Last edited:
The Geforce 3000 series are coming this week, but according to the OP it doesn't mean anything, because with enough time and resources we can make any game work on a Riva TNT, so, "I understand what he's trying to say".
The RTX 3070 will also bottleneck the next generation because it only has 8GB of VRAM.
 

ZehDon

Member
So if a console has this effect on higher spec PCs why do you believe this effect would not exist on a 2x+ more powerful console?
Actually, I mentioned hypothetical limitations - I actually can't point to a single one. My actual point was that if you believe consoles have been holding back PCs to such an extreme degree, then you must also believe they've been doing it for forty years. What's changed this generation? Why does the XSS deserve its own thread, or special concern? At the end of the day, developers will target the hardware that sells - which is why Witcher 3 was ported to the Switch, even though it was deeply compromised. You cover this concept below, so we'll touch on it more there.

I'm talking about a specific console too, through an example. Crysis Remastered is again a good example of exactly what I'm saying, and shows that SVOGI was held back for most of this gen. SVOGI was ready tech right at the beginning of the current gen in UE4, but it was dropped from the engine right before the PS4 and XB1 released, so few if any games used it even on PC. It didn't make sense to have it because the popular devices at the time didn't support it well.
If the feature was complete and ready in a multi-platform third-party game engine, why was it pulled after it was finished? It costs nothing to not implement the feature. Something doesn't add up, does it? Tim Sweeney addressed the myth that SVOGI was removed to placate consoles - it was removed because it was too computationally expensive within the engine itself. They were able to match the effect with pre-baked lighting, so they prioritised the latter, and returned to real-time non-hardware-accelerated GI with their PS5 demo using a different accumulation method. You seem to think that if consoles don't support a feature, it's thrown into the trash, even after it's developed and implemented. Why does Battlefield V on PC support ray tracing while its console port doesn't?

The reason you have seen that effort put in for the Pro and X now in Crysis Remastered is because the game hasn't released yet, and that investment made sense for the upcoming next-gen consoles too, which I'm sure it will release on. I believe you would NOT have seen that effort otherwise, even on PC, had this game released at the beginning or middle of this gen. All you would get is increased res or fps. They would have baked it across all of them or used an alternative.
Not at all. Crytek solved the computationally expensive SVOGI method and implemented it into their engine way back in 2015. It's been featured in several PC games - and the feature was disabled in the console ports of those titles. It was not implemented on the base consoles because they cannot run the feature in real time, whereas the more advanced consoles can, so they enabled it - though with a resolution hit (One X at 1080p, for example). Scaling isn't a myth, it just takes effort.

This will be happening in the upcoming gen too. An id Software engine developer has already come out and said that the amount of RAM in the Series S is low, and things like the ray-tracing BVH take up a lot of RAM. What do you think will happen if, say, the Series S makes up 90% of sales in the upcoming gen? Engine developers strive for easy development and parity (in development) across all supported platforms. If one very popular platform doesn't support the RT BVH well, for example, they will come up with an alternative that gets similar results and make the workflow the same across all platforms, just as Unreal Engine 4 did at the beginning of last gen by removing SVOGI. You don't want a completely different way of working just for one platform.
You make so many assumptions, sneak in so many premises, and derive so much from your faulty premises above that it's actually difficult to address this section. The id developer commenting on the RAM limitations is a good example of an educated person commenting on the hardware. It's a good place to start. But you're off to the races without understanding what he's saying. Notice how he highlighted an impact on ray tracing, not AI? He didn't mention gameplay. He mentioned an extremely specific visual element.
XSS having 90% of the market share? Terrific - XSX is a bigger version that supports an identical feature set, so it'll be easy for developers to scale down to hit the XSS platform from the XSX version.
RT BVH structure takes up too much memory? Terrific - create a lower fidelity sub-structure and set it for the XSS profile.
RT tanks the framerate on the XSS? Terrific - disable the feature entirely, and have it fall back to the traditional reflection and lighting models that will be used on the PC platform for graphics cards that don't support hardware-accelerated ray tracing.
XSS can't handle the NPC texture variety? Terrific - implement an NPC selection sub-set and give the XSS version fewer options.
Texture quality is blurry on the XSS? Who cares, it's a budget machine for people who don't care what mip-map level their textures max out at.
Load times are really long on the XSS? It's a budget machine, what did you expect?
The Witcher 3 looks like soup on the Switch? Who cares - it's the whole game, it functions, and it's portable.

Sorry friend, but you're not demonstrating anything worthy of concern. What specific feature do you think developers will throw out entirely because the XSS can't support it? What specific gameplay elements will be abandoned because the XSS can't accommodate it?
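The fallbacks listed above basically amount to a per-platform quality profile. A toy sketch of the idea follows; every name and value in it is invented for illustration (real engines express this through their own config/cvar systems), so treat it as a shape, not a spec:

```python
# Illustrative per-platform quality profile: the same game, scaled down
# along independent axes (resolution, RT, BVH detail, NPC pool, mips).
from dataclasses import dataclass

@dataclass(frozen=True)
class QualityProfile:
    render_height: int
    ray_tracing: bool    # False = fall back to raster reflections/lighting
    bvh_detail: float    # 1.0 = full BVH, lower = coarser sub-structure
    npc_variety: int     # size of the NPC appearance pool
    mip_bias: int        # positive bias = blurrier top mip levels

# Hypothetical numbers, chosen only to mirror the argument above.
PROFILES = {
    "XSX": QualityProfile(2160, True, 1.0, 64, 0),
    "XSS": QualityProfile(1440, False, 0.5, 32, 1),
}

def settings_for(platform: str) -> QualityProfile:
    # Unknown platforms get the most conservative profile.
    return PROFILES.get(platform, PROFILES["XSS"])

print(settings_for("XSS").ray_tracing)  # False: raster fallback path
```

Nothing in the profile touches gameplay systems, which is the whole point: each knob degrades presentation, not function.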
 
Last edited:

Three

Member
Actually, I mentioned hypothetical limitations - I actually can't point to a single one. My actual point was that if you believe consoles have been holding back PCs to such an extreme degree, then you must also believe they've been doing it for forty years. What's changed this generation? Why does the XSS deserve its own thread, or special concern? At the end of the day, developers will target the hardware that sells - which is why Witcher 3 was ported to the Switch, even though it was deeply compromised. You cover this concept below, so we'll touch on it more there.

Yes, I do believe that it has happened for 40 years. I even gave an example of it from last gen. What's changed this gen, though, is that we have one of the major players coming in with a cheap console, and we will see one of the smallest jumps in power of any generation. It's a completely new strategy for that company, and if everybody decides that power isn't important but price is, we will get an even smaller jump.

If the feature was complete and ready in a multi-platform third-party game engine, why was it pulled after it was finished? It costs nothing to not implement the feature. Something doesn't add up, does it? Tim Sweeney addressed the myth that SVOGI was removed to placate consoles - it was removed because it was too computationally expensive within the engine itself. They were able to match the effect with pre-baked lighting, so they prioritised the latter, and returned to real-time non-hardware-accelerated GI with their PS5 demo using a different accumulation method. You seem to think that if consoles don't support a feature, it's thrown into the trash, even after it's developed and implemented. Why does Battlefield V on PC support ray tracing while its console port doesn't?
Read between the lines. Isn't it essentially saying the same thing? Yes, it was ready; they had a demo of it on PC.
From the article:
that technique was extremely expensive.

Epic Games decided to create a series of graphical effects that achieve the same image fidelity as SVOGI with far better performance.
This means that UE4 supports realistic real-time reflections, realistic specular lighting, and a combination of direct lighting with pre-computed Global Illumination.


It is expensive, but it was doable on high-end PCs, otherwise the demo wouldn't exist. Yes? They decided to go with baked GI because it was too expensive for mass-market devices. Why else would you pull a computationally expensive feature that you have a running demo of, other than how feasible it is for your multi-platform engine? Do you think that if it was computationally expensive but most devices could pull it off, it would be removed? The tech was working, with a demo, and was pulled because it was computationally expensive - for whom? The multiplatform mass market that UE4 aims for.

I just gave you an example of SVOGI and how, for example, baking lighting can affect gameplay in GT Sport. I'm not saying there are no available compromises to achieve the same gameplay or anything like that (for example, GT Sport could have dropped GI completely and done dynamic time of day, but its lighting would have looked worse for it), so I'm not sure where you are going with the latter part of your comment.

I'm saying that often effort is what prevents higher-spec machines from getting things that the popular low-spec machines cannot do. In life, not just in development, things take the path of least resistance.
If I had some engine feature like SVOGI that changed my workflow but was only supported on some less popular devices, I wouldn't bother, and I'd just use that extra power to bump the res or fps higher. Isn't that even what MS is saying you should do? Isn't that what it is being marketed as?
 
Last edited:

Jon Neu

Banned
Big games that are graphical showpieces are going to look worse and worse on XSS as the gen goes by.

Even if that ends becoming true, nobody will care.

Well, except Sony fans, apparently.


To be fair, the only problem here is MS's deceitful messaging of the XSS as a 1440p machine. Not surprising coming from MS, but still disappointing.

How is that deceitful?

I'm pretty sure there are going to be plenty of games at 1440p on the Series S.

If that's deceitful, what should we call Sony and its "4K" PS4 Pro console?
 

Panajev2001a

GAF's Pleasant Genius
Even if that ends becoming true, nobody will care.

Well, except Sony fans, apparently.

That would be interesting if true, kind of paints Xbox fans in an interesting light “oh yeah, so they sell us a console that works more and more like shit as time goes by... thank God for that!” :LOL:. Except it will not happen, and unless the XSS sells like shit, games will target the XSS and then get scaled up.
 

Jon Neu

Banned
That would be interesting if true, kind of paints Xbox fans in an interesting light “oh yeah, so they sell us a console that works more and more like shit as time goes by... thank God for that!” :LOL:.

I don't think it's going to "work like shit"; for its intended target, it's going to run well, just not as well as the Series X.

Also, it's a console aimed at casuals and people who don't care that much about graphics, but let's not let that stop you or any other Sony fanboy from throwing shade like you always do.
 

supernova8

Banned
Sorry friend, but you're not demonstrating anything worthy of concern. What specific feature do you think developers will throw out entirely because the XSS can't support it? What specific gameplay elements will be abandoned because the XSS can't accommodate it?

You're right, there's no evidence either way (because we've seen nothing running on XSS and XSX side by side) and it's all pure speculation (95% of this website is speculation). Still, it sounds awfully fishy that they can have an XSX-target game running pretty much identically, but at 1440p, on a machine with much less graphics horsepower and just over half the VRAM.

When it sounds too good to be true, it often is. If they can pull it off, great!
 

Jon Neu

Banned
The PS5 will be the lead platform. Everything will be ported up/down from there.

Haven't you heard? There's no such thing as porting down anymore.

Everything is going to be ported up from the Series S. So the PS5 is going to receive Series S ports!
 

Jon Neu

Banned
You're right, there's no evidence either way (because we've seen nothing running on XSS and XSX side by side) and it's all pure speculation (95% of this website is speculation). Still, it sounds awfully fishy that they can have an XSX-target game running pretty much identically, but at 1440p, on a machine with much less graphics horsepower and just over half the VRAM.

When it sounds too good to be true, it often is. If they can pull it off, great!

I don't know how that is fishy. Less resolution means you need less graphics horsepower and RAM. The Series S is going to run games at a third or a quarter of the resolution.

And if other compromises have to be made, such as reduced or no RT at all, lower-quality effects and so on, then that also saves a lot of resources.

Some games are going to be exactly the same at a lower resolution, and some other games are going to have more compromises. It depends on the nature of the game and the devs making it.
 

Three

Member
Not at all. Crytek solved the computationally expensive SVOGI method and implemented it into their engine way back in 2015. It's been featured in several PC games - and the feature was disabled in the console ports of those titles.
Would love to see this list, btw; the ones I know about can be counted on one hand or are PC exclusives:

Star Citizen - PC exclusive
Miscreated - PC exclusive
Kingdom Come: Deliverance - 2018 (PC and X1X got SVOGI)
Crysis remastered - 2020

Where are the several multiplatform games, with the effort put in over half a decade, that you're talking about?
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
Also, it's a console aimed at casuals and people who don't care that much about graphics

This is a console that is everything: several times faster than the MONSTER Scorpio, yet at the same time not able to run its games in the XOX profile, and just a cheap console aimed at casuals who do not know any better ;). I can see how saying every possible thing leaves a scenario where you will always be right.

Seriously though, you can keep focusing on what MS means the XSS to be, thanks for being sure to amplify MS's marketing message, but what matters is also what it is and the effect on its ecosystem.

Not that people should be surprised: a PC-focused company with Apple-style iterative hardware and a walled-garden ecosystem, one that sells hardware at stratospheric profit margins yearly and that failed to dominate the console market, is trying to convince people that the rules of the market are bad and that it needs to become a closed version of the PC market they control (no hacks, no mods, no open hardware, etc... and yet a cross-generation/lowest-common-denominator approach forever... the worst of both worlds for consumers)... 😯.
 
Last edited:

Jon Neu

Banned
This is a console that is everything: several times faster than the MONSTER Scorpio, yet at the same time not able to run its games in the XOX profile, and just a cheap console aimed at casuals who do not know any better ;). I can see how saying every possible thing leaves a scenario where you will always be right.

Yes, its CPU is several times faster than the one in the Xbox One X, and it's overall a much better console. And yes, it's a console that targets the casual market.

Both stances can be true if you take your Team Sony trolling-mode glasses off.

But for some people, that's maybe asking too much.

but what matters is also what it is and the effect on its ecosystem.

Yes please, repeat again how concerned you are.
 

supernova8

Banned
I don't know how that is fishy. Less resolution means you need less graphics horsepower and RAM. The Series S is going to run games at a third or a quarter of the resolution.

And if other compromises have to be made, such as reduced or no RT at all, lower-quality effects and so on, then that also saves a lot of resources.

Some games are going to be exactly the same at a lower resolution, and some other games are going to have more compromises. It depends on the nature of the game and the devs making it.

Yeah, that's a fair point, but:
The OS is supposedly 40% more memory-efficient than the original Xbox One OS (which reserved 3GB). That means the new OS will need about 1.8GB of RAM reserved for itself.

Doesn't sound like a lot, but it equates to 11% of the XSX's total RAM versus 18% of the XSS's.
Put another way, the XSS should have 8.2GB of RAM available for the game itself, and the XSX 14.2GB.
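To sanity-check that arithmetic in code (the 40% figure and the 3GB Xbox One reservation are the rumoured numbers from above, not confirmed specs):

```python
# Re-running the RAM arithmetic from the post above. All inputs are the
# rumoured figures being discussed, not official specifications.
XBOX_ONE_OS_GB = 3.0
new_os_gb = XBOX_ONE_OS_GB * (1 - 0.40)       # 40% more efficient -> 1.8 GB

total_ram_gb = {"XSS": 10.0, "XSX": 16.0}     # total installed RAM
free_for_games = {c: t - new_os_gb for c, t in total_ram_gb.items()}
os_share_pct = {c: 100 * new_os_gb / t for c, t in total_ram_gb.items()}

for console in total_ram_gb:
    print(f"{console}: {free_for_games[console]:.1f} GB for games, "
          f"{os_share_pct[console]:.0f}% lost to the OS")
# XSS: 8.2 GB for games, 18% lost to the OS
# XSX: 14.2 GB for games, 11% lost to the OS
```

Same 1.8GB reservation on both boxes, so the smaller pool pays proportionally more.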

I suppose the question is, given whatever the scope is for a given game, is there an 'absolute minimum' RAM it needs before some of the fundamental parts of the game stop functioning properly?

Hopefully not but nobody can say "Nah that definitely won't happen" until we have real-world evidence of it not being an issue.
 
Last edited:

Jon Neu

Banned
I suppose the question is, given whatever the scope is for a given game, is there an 'absolute minimum' RAM it needs before some of the fundamental parts of the game stop functioning properly?

The Series S has more or less 8GB of RAM for games, and the Series X around 12-13GB.

Again, accounting for differences in resolution, effects, and so on, I don't think there's anything you could make on one and not on the other that would make the game stop functioning properly.

I think you can easily compromise to such a level that it's actually the Series S that has more resources left over. So again, it will always be a matter of what the devs want to do.
 

FireFly

Member
This is a console that is everything. Several times faster than the MONSTER Scorpio and at the same time not able to run its games in the XOX profile and just a cheap console aimed at casuals that do not know any better ;).
The GPU in the S may well be slower than Scorpio's.
 

pawel86ck

Banned
People say 7.5GB for next-gen games is not enough, but if the SFS gains (2.5x) are real, then it's like 18GB worth of data if developers wanted to achieve similar results without SFS. 18GB just for 1080p is more than enough. A GPU 3x slower will also be enough to run XSX 12TF games at 4x lower resolution. There's no way the XSS will hold back the XSX.
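Checking that arithmetic (the 2.5x Sampler Feedback Streaming multiplier and the TF figures are the rumoured numbers being debated here, not confirmed specs):

```python
# The memory side of the claim: 7.5 GB with a claimed 2.5x SFS gain would
# behave like ~18 GB of naively loaded texture data.
usable_ram_gb = 7.5
sfs_multiplier = 2.5
effective_texture_gb = usable_ram_gb * sfs_multiplier   # 18.75 GB equivalent

# The GPU side: 4K -> 1080p is a 4x pixel reduction, which would bring a
# 12.1 TF workload down to ~3 TF, under the XSS's ~4 TF GPU.
pixel_ratio = (3840 * 2160) / (1920 * 1080)             # 4.0
scaled_tf = 12.1 / pixel_ratio                          # ~3.0 TF

print(effective_texture_gb, pixel_ratio, scaled_tf)
```

Whether the 2.5x gain holds in practice is exactly what's in dispute; the arithmetic itself is consistent with the claim.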
 

Marlenus

Member
When people say the Series S will hold back games what are you actually thinking?

World size, asset variety, graphical features or effects, other?
 

Humdinger

Member
You mean CPU stuff? Not an issue at all since it's the same CPU.

Could be memory-related.

When people say the Series S will hold back games what are you actually thinking?

World size, asset variety, graphical features or effects, other?

From my read, it's not the visuals or framerate. It's stuff like physics, environmental interaction, level complexity, enemy AI, crowd size, or other design options (e.g., rapid transition between worlds like in R&C).
 

Marlenus

Member
From my read, it's not the visuals or framerate. It's stuff like physics, environmental interaction, level complexity, enemy AI, crowd size, or other design options (e.g., rapid transition between worlds like in R&C).

That is all CPU and SSD stuff so the Series S won't hold anything like that back.
 

Humdinger

Member
That is all CPU and SSD stuff so the Series S won't hold anything like that back.

I'm not a developer, but don't all of those things (physics, environmental interaction, etc.) involve memory? And haven't developers specifically mentioned the S's relative lack of memory and slowness of memory as a primary concern?
 

Marlenus

Member
I'm not a developer, but don't all of those things (physics, environmental interaction, etc.) involve memory? And haven't developers specifically mentioned the S's relative lack of memory and slowness of memory as a primary concern?

10GB is enough for 1080p quality assets and the main game code + OS overhead. The extra ram and speed on the X will mainly be used for 4k quality textures.

8GB system ram + 4GB Vram is enough for 1080p on the PC and PCs have huge OS memory overheads.
 

Humdinger

Member
10GB is enough for 1080p quality assets and the main game code + OS overhead. The extra ram and speed on the X will mainly be used for 4k quality textures.

8GB system ram + 4GB Vram is enough for 1080p on the PC and PCs have huge OS memory overheads.

Are you a developer yourself?
 