
Developer Speaks About Lockhart "Holding Back Next Gen" and PS5 vs. XSX Dev Kits

kruis

Exposing the sinister cartel of retailers who allow companies to pay for advertising space.
Another problem (as if we don't have enough) is that developers will be reluctant to admit that they held back their projects because of the XSS: "Yeah, we scaled back the ambition of our project." That could get them in trouble. It brings negative attention to the project. So even when it happens, getting a developer to talk about it may be difficult. It certainly won't come from an MS developer.

We may someday find out in a "post mortem" discussion, where devs speak frankly about the successes and failures of their game, whether the Series S held them back. But actually, I think PCs are going to be blamed just as often, maybe even more. Both Xbox Series S/X and PS5 launch with high-performance storage solutions, whereas DirectStorage on PC won't be ready for prime time anytime soon.
 

Neo_game

Member
We will know in a few years, when games are made for next-gen consoles only. I think it is going to be like how things are for the current gen. They will use the SS as the base and scale the resolution. That is the easiest thing for devs to do. Only a few devs will bother to add extra gfx settings for the X and PS5, sadly.
 

Marlenus

Member
Are you a developer yourself?

Not games, generally ETL but dabble with whatever is needed by the company.

To say 10GB IS enough was probably too strong.

10GB should be enough especially with direct storage.

I don't think 10GB will hold back any more at 1080p than 16GB holds back at 4k. If 8GB of ram is required for the OS + core game logic that leaves 2GB for assets on the series S and 8GB on the X. 4k has 4x the pixel count compared to 1080p and the X has 4x the ram to store those higher quality assets in. That seems workable to me.
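The arithmetic in that last paragraph can be checked in a few lines. Note that the 8 GB figure for OS plus core game logic is the poster's assumption, not an official spec, and real memory budgets are messier than this:

```python
# Back-of-envelope check of the memory argument above.
# The 8 GB "OS + core game logic" figure is the poster's assumption,
# not an official spec; real console memory budgets are more complicated.
total_ram = {"Series S": 10, "Series X": 16}  # GB, approximate
reserved = 8                                  # GB assumed for OS + core logic
pixels = {"Series S": 1920 * 1080, "Series X": 3840 * 2160}

for console, ram in total_ram.items():
    asset_budget = ram - reserved             # GB left for assets
    pixel_factor = pixels[console] // pixels["Series S"]
    print(f"{console}: {asset_budget} GB for assets, {pixel_factor}x the 1080p pixel count")
# Series S: 2 GB for assets, 1x the 1080p pixel count
# Series X: 4 GB... no - 8 GB for assets, 4x the 1080p pixel count
```

Under these assumptions, both the asset budget and the pixel count scale by 4x between S and X, which is the symmetry the argument rests on.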
 

Humdinger

Member
Not games, generally ETL but dabble with whatever is needed by the company.

To say 10GB IS enough was probably too strong.

10GB should be enough especially with direct storage.

I don't think 10GB will hold back any more at 1080p than 16GB holds back at 4k. If 8GB of ram is required for the OS + core game logic that leaves 2GB for assets on the series S and 8GB on the X. 4k has 4x the pixel count compared to 1080p and the X has 4x the ram to store those higher quality assets in. That seems workable to me.

Ok. I hear what you're saying, and I respect your opinion, but I'm skeptical, especially since you're not a developer yourself. I'm not claiming to be any different. I'm not a developer, either, so you (or anyone else) are free to be skeptical about what I'm saying. However, I'm not really putting forth my opinion so much as trying to summarize concerns I've heard from others (mostly developers) in response to questions. You're asserting an opinion that the memory constraints won't be an issue, and since you're not an experienced developer, I'm going to be skeptical.

Another reason I'm skeptical is that I've spent the last 2 or 3 years listening to people assure us that something won't be an issue, it's easily scalable, etc., only to find that it is in fact a significant problem. I'm not saying we're dealing with a Craig-level issue here -- not at all, I acknowledge all the ways the architecture is similar -- but I'm skeptical of the reassurances that there will be no problems, no limitations, all is well.
 
Last edited:

Schmick

Member
Hmmm... Remedy's concern was optimisation. Does this mean the S can't handle next gen, or does it simply mean they need to spend a bit more time than usual on optimisation?
 

Marlenus

Member
Ok. I hear what you're saying, and I respect your opinion, but I'm skeptical, especially since you're not a developer yourself. I'm not claiming to be any different. I'm not a developer, either, so you (or anyone else) are free to be skeptical about what I'm saying. However, I'm not really putting forth my opinion so much as trying to summarize concerns I've heard from others (mostly developers) in response to questions. You're asserting an opinion that the memory constraints won't be an issue, and since you're not an experienced developer, I'm going to be skeptical.

Another reason I'm skeptical is that I've spent the last 2 or 3 years listening to people assure us that something won't be an issue, it's easily scalable, etc., only to find that it is in fact a significant problem. I'm not saying we're dealing with a Craig-level issue here -- not at all, I acknowledge all the ways the architecture is similar -- but I'm skeptical of the reassurances that there will be no problems, no limitations, all is well.

To be more concise I think that if 10GB is an issue at 1080p then 16GB is probably just as much of an issue at 4k. If devs end up dropping from 4k to 1800p or something on the X / PS5 then I see no reason why they can't drop to 900p on the S.

The main constraint on Devs with PS4/Pro and Xbox One/One X was the CPU performance. With these consoles they get desktop tier CPU performance and SSD IO with some clever software/hardware to make direct GPU -> SSD access possible.

What I don't see happening is the Series S impacting a dev's artistic vision for a game, because simply going 1080p with assets to match on the S and 4k with assets to match on the X should be pretty much job done. Maybe in a few cases it won't be, but I expect that to be the exception rather than the rule.
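The proportional-drop claim above is just squared-height arithmetic at a fixed 16:9 aspect ratio. A quick sketch (the resolutions are the poster's hypotheticals, not announced targets):

```python
# Sketch of the proportional-scaling claim: at a fixed aspect ratio,
# pixel count scales with the square of the vertical resolution, so a
# 2160p -> 1800p drop on the X/PS5 is the same relative cut as
# 1080p -> 900p on the S. Resolutions here are hypothetical examples.
def pixels_kept(h_from: int, h_to: int) -> float:
    """Fraction of pixels kept after dropping vertical resolution."""
    return (h_to / h_from) ** 2

print(f"X/PS5: 2160p -> 1800p keeps {pixels_kept(2160, 1800):.1%}")
print(f"S:     1080p ->  900p keeps {pixels_kept(1080, 900):.1%}")
# Both print 69.4% - the same relative cut on each machine.
```

Both drops are a factor of 5/6 in height, so both keep (5/6)² ≈ 69.4% of the pixels, which is exactly the "drop both proportionally" argument.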
 

mckmas8808

Banned
10GB is enough for 1080p quality assets and the main game code + OS overhead. The extra ram and speed on the X will mainly be used for 4k quality textures.

8GB system ram + 4GB Vram is enough for 1080p on the PC and PCs have huge OS memory overheads.

I don't believe this for one second!
 

Three

Member
There are actually people in here that think that PC has been holding back console gaming?

hahahahahaha
Who is saying this? People are saying the opposite and with the Series S it's on a whole new level.
I mentioned that the consoles last gen killed dynamic global illumination (SVOGI) in Unreal Engine, which had been demonstrated on a GTX 680. People can say that I am making assumptions while making assumptions themselves about Crytek solving some inherent problem with SVOGI, but it's simply not true. Epic had SVOGI working and in the engine, but the maintenance burden was too much for something none of the mainstream consoles would be able to support, so they specifically removed it. Epic at the time even said so clearly (even throwing in the mobile renderer as a reason for static GI). They had SVOGI working in the engine but decided on a different path because it made sense. It really was not something smart for Epic to pursue at the time for the limited market that would benefit, unless they maintained a branch specifically for SVOGI (too much effort and cost). You can even read the Epic dev comment about it here if you like:


Not that any xbox fan likes to actually accept any developer input.

Ok. I hear what you're saying, and I respect your opinion, but I'm skeptical, especially since you're not a developer yourself. I'm not claiming to be any different. I'm not a developer, either, so you (or anyone else) are free to be skeptical about what I'm saying. However, I'm not really putting forth my opinion so much as trying to summarize concerns I've heard from others (mostly developers) in response to questions. You're asserting an opinion that the memory constraints won't be an issue, and since you're not an experienced developer, I'm going to be skeptical.

Another reason I'm skeptical is that I've spent the last 2 or 3 years listening to people assure us that something won't be an issue, it's easily scalable, etc., only to find that it is in fact a significant problem. I'm not saying we're dealing with a Craig-level issue here -- not at all, I acknowledge all the ways the architecture is similar -- but I'm skeptical of the reassurances that there will be no problems, no limitations, all is well.
Please remain sceptical, and please do not support the low-end devices if you can. Some of the regular xbox fans will defend the thing to the death, but it will have an effect. The id Software dev even mentioned people who say "you can just scale". Minimum specs matter. The market matters. Developer effort matters.
 

Journey

Banned
Looking at them side by side the image is very similar.


Who'd have thunk
Color me unsurprised.
As usual it will come down to exclusives.


Similar =/= the same


Xbox One vs PS4 image was very similar, so it's going to depend on your standards.


DF, NX Gamers, Warriors will focus on what

THIS PART:

But Anaconda has the upper hand in terms of us being able to really push effects.


Will translate to the end product. Pretending they will look the same based on the word "Similar" is misleading.


The real takeaway here is:

1) It was originally thought that the XSS would have a slower CPU or slower I/O, causing an issue when trying to get creative with breakthrough physics, AI, or getting fancy with an SSD. That was corrected, so any old comments or concerns should be dismissed.

2) This proves XSS will NOT be holding back XSX because games are built for XSX first, then ported down.
 

Humdinger

Member
You guys are still at it with horrible examples. The TNT is not the same generation as the 3000 series. It'd be like making a game work on both the 3070 and the 3050.

I don't know how I ended up getting quoted as the source of that statement you're responding to, but I didn't say it, someone else did.
 

ZehDon

Member
...What's changed this gen though is that we have one of the major players coming in with a cheap console and will see one of the smallest jumps in power in a generation. It's a completely new strategy for that company and if everybody decides that power isn't important but price is we will get an even smaller jump.
Not really. Following your logic, the weakest console holds back everything else. So, the accurate comparison in your argument is the OG Xbox One, the weakest console of its generation, to the Xbox Series S, the weakest console of its generation. When you factor in that the Xbox Series S is targeting 1440p (1080p in reality, let's be honest), the jump in computational power is enormous.
As for "power not mattering", the Nintendo DS outsold the Xbox 360 and the PS3 by well over 50%, and we still got a PlayStation 4 and an Xbox One. The Nintendo Switch is selling so quickly Nintendo aren't able to manufacture enough of them, and we still got a PlayStation 5 and an Xbox Series X. This is starting to feel like concern trolling, if I'm being honest.

...Read between the lines --
No thank you, this lets people make baseless assumptions and present them as fact. I'll go on the evidence and logic.
You're asking me to believe that Epic co-founder and CEO Tim Sweeney paid millions upon millions in R&D and funded his development teams' efforts to create, polish, and implement a production-ready SVOGI implementation into Unreal Engine 4, and then after it was feature complete he deleted the code from the engine - possibly while screaming "If my friend Phil Spencer can't use this, no one can!" - because the Xbone couldn't run it. That simply doesn't make sense, given the sheer number of optional features within the engine. Doubly so when you consider Crytek implemented this feature into their engine two years later, Rockstar implemented a version of the technique in their multi-platform engine - and got it running on the OG Xbone - and SIE Japan Studio implemented it into their engine and have it running on PS4.

... The tech was working, with a demo, pulled because it was computationally expensive for who? The multiplatform mass market that UE4 aims for....
What? A multi-platform engine that doesn't support features that aren't shared by all platforms isn't a multi-platform engine. That's... not how Unreal Engine works. That's not how multi-platform development as a whole works. Do you understand what you're saying? Unreal Engine targets mobile devices - why didn't they cut the pixel shader implementation when phones couldn't make use of it? Unreal Engine supports VR, but Xbone doesn't - why didn't they cut that feature? Unreal Engine 5 supports advanced IO features written with a focus on the PS5. PCs can't use that feature, so why wasn't that cut?
As with a lot of high-profile downgrades, it's much more likely Epic's SVOGI implementation never worked as well as they wanted it to in real scenarios. Reality bites, as they say. Epic said it was too computationally expensive, so they cut it. Remember, it took other developers several more years of development before we'd see this running in real games. Self-evidently, their implementations must be less computationally expensive than Epic's attempt. It's much more likely that Epic's attempt at SVOGI was dropped when they couldn't get it working as required. Crytek got it working two years after Epic dropped the feature, and implemented it into their multi-platform engine. There isn't a vast conspiracy here, friend.

I just gave you an example of SVOGI and how for example baking lighting can affect gameplay in GTSport...
No, you picked a bad example of a feature that scales wonderfully - it's currently running on the Switch - and you highlighted a first party, single platform game that employed customised pre-baked offline GI lighting in order to work within the performance confines of the first party virtual reality headset that the company employs. I'm not sure what you think you've highlighted? That day-to-night transitions were considered such a nothing "gameplay feature" that it was willfully cut in favour of hitting a 60FPS frame target in VR? If that's the best example you have of "weak consoles limiting gameplay" then you're arguing against yourself, and I can stop posting.

I'm saying often effort is what prevents higher spec machines from getting things that the popular low spec machines cannot do...
You're arguing in circles while providing nothing. "Often"? Such as?

If I had some engine feature like SVOGI that changed my workflow but only supported on some less popular devices I wouldn't bother...
Then you're a terrible developer? I'm not sure what you want me to say - you clearly have a very poor understanding of large scale multi-platform game development. Nothing you've described lines up with the realities. I'm sorry, but unless you have something more meaningful to add, I'm going to stop replying. I provided a list of your voiced concerns and explained in a single sentence how each can be addressed, and you're just replying "SVOGI! SVOGI! SVOGI!" even after I've described numerous multi-platform games that use versions of the technique.

Would love to see this list btw, the ones I know about can be counted on one hand or are PC exclusive

Star Citizen - PC exclusive
Miscreated - PC exclusive
Kingdom Come: Deliverance - 2018 (PC and X1X got SVOGI)
Crysis remastered - 2020

Where are the several multiplatform games where they put in the effort for half a decade you're talking about?
You missed Hunt: Showdown.
Any game made using CryEngine V has access to SVOGI, and the developer could choose to implement the feature or not. For example, in Crytek's VR games, such as Robinson: The Journey, they themselves chose not to use it because it was too expensive for the required 60FPS frame target for PSVR. However, they implemented it on the Switch version of Crysis because 30FPS was achievable at the low resolution of the Switch... which is what Microsoft are planning for the XSS. CryEngine isn't as widespread as Epic's Unreal Engine, so it's not an exhaustive list - but a smaller engine by a smaller team implementing optional SVOGI for multi-platform development only bolsters my statements even more.
I was demonstrating your lack of understanding of multi-platform engine development. Every game that doesn't use SVOGI made on an engine that supports it is a nail in the coffin of your argument that game development only moves as fast as the weakest console. If this were true, we'd be stuck on the Game Boy after it sold more than everything else while only supporting four colours.

If the XSS had a weaker CPU, or lacked the CPU-saving dedicated processing hardware of the XSX, I'd agree with you.
If the XSS lacked a comparable SSD to the XSX, I'd agree with you.
If the XSS used a modified Xbox One X GPU, I'd agree with you.
But, it doesn't have any of those problems. It uses, effectively, the same CPU as the PS5, the same SSD as the XSX, all the same hardware feature sets, only with a pared back GPU and RAM configuration, reduced in line with a drop in resolution target. In this scenario, with that hardware and that criteria, I'm happy to give Microsoft the benefit of the doubt. Microsoft and Sony made a case for this with the PlayStation 4 Pro and the Xbox One X. The approach worked then, and I see no reason I should pay special concern to the Xbox Series S now.
 

Three

Member
Not really. Following your logic, the weakest console holds back everything else. So, the accurate comparison in your argument is the OG Xbox One, the weakest console of its generation, to the Xbox Series S, the weakest console of its generation. When you factor in that the Xbox Series S is targeting 1440p (1080p in reality, let's be honest), the jump in computational power is enormous.
As for "power not mattering", the Nintendo DS outsold the Xbox 360 and the PS3 by well over 50%, and we still got a PlayStation 4 and an Xbox One. The Nintendo Switch is selling so quickly Nintendo aren't able to manufacture enough of them, and we still got a PlayStation 5 and an Xbox Series X. This is starting to feel like concern trolling, if I'm being honest.


No thank you, this lets people make baseless assumptions and present them as fact. I'll go on the evidence and logic.
You're asking me to believe that Epic co-founder and CEO Tim Sweeney paid millions upon millions in R&D and funded his development teams' efforts to create, polish, and implement a production-ready SVOGI implementation into Unreal Engine 4, and then after it was feature complete he deleted the code from the engine - possibly while screaming "If my friend Phil Spencer can't use this, no one can!" - because the Xbone couldn't run it. That simply doesn't make sense, given the sheer number of optional features within the engine. Doubly so when you consider Crytek implemented this feature into their engine two years later, Rockstar implemented a version of the technique in their multi-platform engine - and got it running on the OG Xbone - and SIE Japan Studio implemented it into their engine and have it running on PS4.


What? A multi-platform engine that doesn't support features that aren't shared by all platforms isn't a multi-platform engine. That's... not how Unreal Engine works. That's not how multi-platform development as a whole works. Do you understand what you're saying? Unreal Engine targets mobile devices - why didn't they cut the pixel shader implementation when phones couldn't make use of it? Unreal Engine supports VR, but Xbone doesn't - why didn't they cut that feature? Unreal Engine 5 supports advanced IO features written with a focus on the PS5. PCs can't use that feature, so why wasn't that cut?
As with a lot of high-profile downgrades, it's much more likely Epic's SVOGI implementation never worked as well as they wanted it to in real scenarios. Reality bites, as they say. Epic said it was too computationally expensive, so they cut it. Remember, it took other developers several more years of development before we'd see this running in real games. Self-evidently, their implementations must be less computationally expensive than Epic's attempt. It's much more likely that Epic's attempt at SVOGI was dropped when they couldn't get it working as required. Crytek got it working two years after Epic dropped the feature, and implemented it into their multi-platform engine. There isn't a vast conspiracy here, friend.


No, you picked a bad example of a feature that scales wonderfully - it's currently running on the Switch - and you highlighted a first party, single platform game that employed customised pre-baked offline GI lighting in order to work within the performance confines of the first party virtual reality headset that the company employs. I'm not sure what you think you've highlighted? That day-to-night transitions were considered such a nothing "gameplay feature" that it was willfully cut in favour of hitting a 60FPS frame target in VR? If that's the best example you have of "weak consoles limiting gameplay" then you're arguing against yourself, and I can stop posting.


You're arguing in circles while providing nothing. "Often"? Such as?


Then you're a terrible developer? I'm not sure what you want me to say - you clearly have a very poor understanding of large scale multi-platform game development. Nothing you've described lines up with the realities. I'm sorry, but unless you have something more meaningful to add, I'm going to stop replying. I provided a list of your voiced concerns and explained in a single sentence how each can be addressed, and you're just replying "SVOGI! SVOGI! SVOGI!" even after I've described numerous multi-platform games that use versions of the technique.


You missed Hunt: Showdown.
Any game made using CryEngine V has access to SVOGI, and the developer could choose to implement the feature or not. For example, in Crytek's VR games, such as Robinson: The Journey, they themselves chose not to use it because it was too expensive for the required 60FPS frame target for PSVR. However, they implemented it on the Switch version of Crysis because 30FPS was achievable at the low resolution of the Switch... which is what Microsoft are planning for the XSS. CryEngine isn't as widespread as Epic's Unreal Engine, so it's not an exhaustive list - but a smaller engine by a smaller team implementing optional SVOGI for multi-platform development only bolsters my statements even more.
I was demonstrating your lack of understanding of multi-platform engine development. Every game that doesn't use SVOGI made on an engine that supports it is a nail in the coffin of your argument that game development only moves as fast as the weakest console. If this were true, we'd be stuck on the Game Boy after it sold more than everything else while only supporting four colours.

If the XSS had a weaker CPU, or lacked the CPU-saving dedicated processing hardware of the XSX, I'd agree with you.
If the XSS lacked a comparable SSD to the XSX, I'd agree with you.
If the XSS used a modified Xbox One X GPU, I'd agree with you.
But, it doesn't have any of those problems. It uses, effectively, the same CPU as the PS5, the same SSD as the XSX, all the same hardware feature sets, only with a pared back GPU and RAM configuration, reduced in line with a drop in resolution target. In this scenario, with that hardware and that criteria, I'm happy to give Microsoft the benefit of the doubt. Microsoft and Sony made a case for this with the PlayStation 4 Pro and the Xbox One X. The approach worked then, and I see no reason I should pay special concern to the Xbox Series S now.
I'm sorry, but how is a handheld like the 3DS the same market as a home console? Are you arguing that the Series S is a different market?

Yes, I'm saying SVOGI was implemented in the engine to whatever degree, enough for a demo even, then removed, then not worked on again, because it was wise to concentrate on static lighting due to the market at the time, and maintaining or working on it meant that devs would be wasting considerable effort on something the then next-gen consoles (and mobile) couldn't really support. They weren't going to work on and maintain something that considerably less than 10% of the market would benefit from. Not sure why this still doesn't make sense to you.

Quote from Epic
We chose to remove SVOGI in order to reduce our maintenance burden and allow us to iterate more quickly on other new and exciting features. It takes a considerable effort to maintain a complex system like SVOGI as the engine grows and evolves: every time we add a new feature or modify an existing feature we have to ensure that it works with all existing features.

For that same reason we do not plan to re-integrate SVOGI in to the main branch of UE4.

We are considering other options but SVOGI has tendrils throughout the rendering code so it is not a good candidate for a plugin right now. It would also be a considerable effort to get it working again with the latest code.
A lot of our efforts recently have focused on improving static lighting in UE4. That was motivated by bringing up the mobile renderer...
I guess Epic engine devs are also terrible devs? It even tells you there that mobile was their driving factor to concentrate on static lighting too. Do you still believe the market doesn't sway engine development choices and that I don't know how engine development works?
Want another direct quote from an epic engine dev?
"[SVOGI] was our prototype GI system that we used for Elemental last year. And our targets, given that we've had announced hardware from Sony, that's where we're going to be using Lightmass as our global illumination solution instead of SVOGI,"
They directly mention that the announced hardware resulted in them targeting/concentrating on Lightmass instead. SVOGI was obviously being considered. They didn't integrate it into the engine, say "fuck it, this is too hard" and remove it, only for Crytek to come in with some stupid eureka moment that still isn't feasible on a PS4 or Xbox One. Epic made a wise decision based on the market at the time, knowing that SVOGI was not feasible for the PS4, Xbox One, mobile, and Wii U, and that maintaining this for less than 10% of the PC market was not worth the effort.

Now, if weak consoles take 90% of the market and you honestly think you will get the same effort for the remaining 10%, then you are flat out wrong.

Not sure why you are bringing up the Xbox One either; yes, I know exactly what effect that has too. Look at Craig. No effort, just faster fps. The path of least resistance. The difference is that the Xbox One market would have eventually died, whereas the Series S is just starting and would be cannibalising the more powerful next-gen consoles' sales.
Look, if you don't want to take it from me, why don't you take it from the id Software engine dev? The "educated person" your backhanded comment referred to. He too says minimum specs matter, and he addressed simpletons who say "but just scale like PC". The market matters here. The market dictates support, because effort needs a market.

The Hunt release date is 2020 again, correct? People are making an effort for SVOGI and other demanding methods because we got the PS4 Pro and the Xbox One X, and the mean minimum PC spec has moved far beyond the base PS4 and Xbox One. Even your assumed Crytek breakthrough still doesn't allow it to run on base consoles.
 
"PS5 dev kit is a bit easier to work with. Its well thought out and designed in ways that make it a bit easier to tweak and change things vs Anaconda. To say I prefer one over the other isn't' really fair because both are very good, but its just a bit easier to work with PS5. But Anaconda has the upper hand in terms of us being able to really push effects. The difference will come down to effects over resolution for us. We have both dev kits pushing 4K/60 on Borderlands 3 and we have almost zero loading times on both kits. Looking at them side by side the image is very similar.

About that...
 