
"MANY developers have been sitting in meetings for the past year desperately trying to get Series S launch requirements dropped"

Klosshufvud

Member
Still at it with the specific values? Where did I mention death threats at all, are you mixing me up with someone else? I have no idea what mental issue you have or why you misconstrue things but it appears I'm not the one who's psychotic here.
You replied to someone saying worst case scenario is a 720p game. Then you went on a completely baseless rant about how MS would never approve a 720p game in certification. And then finally you backtracked saying actually you didn't mean a specific resolution. Which makes the entire initial point of debate moot since suddenly 720p actually isn't an issue. This is why you're not even worth bothering with. Dishonesty from start to finish.
 

Bogroll

Likes moldy games
Ok so basically it is harder to code and takes more time to get things running for the 8gb less powerful machine. That seems logical, nothing surprising there.
Why then do I have to pay more for God of War Ragnarok on PS5 over PS4?

Imagine coming out with that bollocks at work: you don't want to work on something that takes a bit more skill, time and effort, you just want the cream.
 

Three

Member
While ps4 and xbox one are not mandatory, there have been tons of games developed for them with no issue.
I'm sure Cyberpunk blamed last gen consoles for the state it was in at some point. Developing for PS4/XB1 is not without issues either.
I'm certain that most of the developers working those were told by the bosses to make it work, so they did, even if they didn't want to. Same with minimum pc spec. It's not really "optional" if you need it for the game to sell enough, so basically mandatory in a lot of cases.
Very true but this is referring to games developed "firmly for new consoles" which I imagine means those who have decided to drop last gen.
Cross gen hardware may be a "burden" but it's blown way out of proportion. PS5 and Series X aren't the quantum leap some would suggest; GT7 and God of War wouldn't look twice as good without PS4 versions, we're talking 10% here. Forza 5 wouldn't look twice as good with no PC and Series S versions, it would only be minimally better.

Bottom line, a mountain out of a molehill with the hardware we have.
Fair enough, though I would say it's a little more than just looking good. There's game features, and development cost too.
 
Cross gen hardware may be a "burden" but it's blown way out of proportion. PS5 and Series X aren't the quantum leap some would suggest; GT7 and God of War wouldn't look twice as good without PS4 versions, we're talking 10% here. Forza 5 wouldn't look twice as good with no PC and Series S versions, it would only be minimally better.

Bottom line, a mountain out of a molehill with the hardware we have.
The Greatest Showman Lol GIF by Sky


Dude look how good Rift Apart looks and it’s basically a year one PS5 game. You're crazy if you think games will only look 10% better.

Idk why people keep defending this when actual game devs have said the same thing multiple times. But at the end of the day the console is in the wild, it’s too late so I don’t see the point of going back and forth about it.

Only time will tell if Xbox made the right decision with the Series S. They only care about game pass so pushing the envelope with their consoles clearly isn’t the main priority, it just sucks that they made it seem like it was when they announced the Series X first with the tag line “most powerful console ever made”…
 
Last edited:

supernova8

Banned
XSS has the same CPU to run the logic of XSX games. Graphics settings can be lowered relatively easily.
CPU is basically identical apart from being clocked 200MHz lower. Plus the SSD is the exact same speed/architecture except for capacity, so yeah, XSS itself shouldn't be the limiting factor. Developers should focus on not making stuff for the bloody last gen consoles first before we worry about XSS even remotely being a bottleneck.
 
Last edited:

Three

Member
You replied to someone saying worst case scenario is a 720p game. Then you went on a completely baseless rant about how MS would never approve a 720p game in certification. And then finally you backtracked saying actually you didn't mean a specific resolution. Which makes the entire initial point of debate moot since suddenly 720p actually isn't an issue. This is why you're not even worth bothering with. Dishonesty from start to finish.
So you're not tired of having the same conversation over and over again?

The 720p wasn't even important. Metro hits 512p on Series S. I just gave an example of a badly optimised game running at 20fps at 720p, and I've clarified my position over and over again that I don't think 30fps is a minimum or that 720p is relevant, but you seem to have trouble letting it go.
 
CPU is basically identical apart from being clocked 200MHz lower. Plus the SSD is the exact same speed/architecture except for capacity, so yeah, XSS itself shouldn't be the limiting factor. Developers should focus on not making stuff for the bloody last gen consoles first before we worry about XSS even remotely being a bottleneck.
I think you missed the original tweeter's point.

The reason this is coming up is because they're starting to focus strictly on next-gen now that we're a full game cycle into this gen and they don't want to be held up by yet another console.

What people aren't factoring in is that ANYTHING that bogs down developers is resources taken away from the ultimate product. Even when it comes to just making a single first-party Sony title there is usually considerable crunch; now imagine adding more crunch to optimize perfectly on a low-end console, which has been compared numerous times to low spec recommendations, but that isn't the same at all.

With low spec recommendations, there really isn't any guarantee of a perfect experience. When you buy any console, you're looking for a fully optimized game.

I'd love a deep dive into AAA game performance on minimum recommended specs and if these games run consistent framerates and resolutions.
 

THE DUCK

voted poster of the decade by bots
The Greatest Showman Lol GIF by Sky


Dude look how good Rift Apart looks and it’s basically a year one PS5 game. You're crazy if you think games will only look 10% better.

Idk why people keep defending this when actual game devs have said the same thing multiple times. But at the end of the day the console is in the wild, it’s too late so I don’t see the point of going back and forth about it.

Only time will tell if Xbox made the right decision with the Series S. They only care about game pass so pushing the envelope with their consoles clearly isn’t the main priority, it just sucks that they made it seem like it was when they announced the Series X first with the tag line “most powerful console ever made”…

Rift Apart is great, and I love the game, but sorry, it wasn't a quantum leap in graphics. The whole instant load rift wasn't much more than a gimmick that could have been dealt with in other ways that wouldn't have changed the game much. Is it a great looking game? Sure, but so is GT7, Horizon, GoW, Forza and other cross gen games. Does Rift Apart look twice as good? Demon's Souls? Absolutely not. Heck, even some truly last gen stuff like TLOU 2 and GoW hold up well even now.

I think they made a decent choice, they have a very powerful console, and they have a budget console. It doesn't ruin next gen.

Now it would be different if there were some sort of massive leap in PS5 and Series X that suddenly made games look 5 times better, and the S failed to have that feature, but it's simply not the case.
 

SomeGit

Member
So you're not tired of having the same conversation over and over again?

The 720p wasn't even important. Metro hits 512p on Series S. I just gave an example of a badly optimised game running at 20fps at 720p and clarified my position that I don't think 30fps is a min or 720p is relevant over and over again but you seem to have trouble letting it go.
“Clarified” after being told over several posts that 720p20 hasn’t been a reason to be refused certification in the past.

And after going on a completely baseless theory that because a game wasn’t updated for Series S it shows it isn’t a priority to devs, even though the game was updated at the same time as Series X and PS5…

Like the other poster said, you backtracked after actively arguing specific numbers, only to say you didn’t argue specific numbers even though there is a page of you doing it.

But, to move on, even your original point is idiotic: there is NO PC game where the minimum “unplayable” specs and “playable” specs only differ by a GPU 2.5 times more powerful, on the same architecture.

When you have minimum specs that are downright unplayable, you have to move up several tiers in both CPU and GPU performance.

If you have a 240p15 game on Series S you are going to have something just as bad on Series X and PS5, and both A Plague Tale and Gotham Knights show early signs of it, where it’s not just the Series S struggling to hold 30.

The biggest issue with the Series S will obviously be memory, but OOM errors aren’t performance issues they are stability issues.
 

FunkMiller

Member
Hilarious that for two years pre-launch, everybody argued over whether the Series X or the Ps5 would be more powerful, and now we’re two years into the generation and virtually no games push either console 😂
 

CeeJay

Member
I'd say that's even more reason to listen when somebody as big and talented as Epic needs a helping hand for memory/performance optimisation. Everyone profits from any known optimisation, sure. Not sure how that's related.
But who's at fault, the little console that could or the big engine developer that couldn't?

If the memory constraints could be worked around and overcome then that means that the Series S is capable and isn't going to be an albatross for the entire generation. Developers only need to learn these ways of optimising once and then they can use the techniques for the rest of the generation; we see this all the time as games get better and better as the generation progresses and devs learn how to get the most out of the hardware.

As has been said, Epic only needed to make the changes to the engine once and then all future iterations of the engine will reap the rewards. It's a positive that The Coalition got involved, not a negative. The sooner that devs optimise their workflow for the new consoles and take advantage of the newer tools and features in the DX API the better.

How many times do we see ham-fisted devs trying to brute force a game using old techniques? Surely it's better all round if they move their understanding forward and create games that are efficient, using modern techniques to fully take advantage of the hardware.

A rising tide raises all boats
 

S0ULZB0URNE

Member
Rift Apart is great, and I love the game, but sorry, it wasn't a quantum leap in graphics. The whole instant load rift wasn't much more than a gimmick that could have been dealt with in other ways that wouldn't have changed the game much. Is it a great looking game? Sure, but so is GT7, Horizon, GoW, Forza and other cross gen games. Does Rift Apart look twice as good? Demon's Souls? Absolutely not. Heck, even some truly last gen stuff like TLOU 2 and GoW hold up well even now.

I think they made a decent choice, they have a very powerful console, and they have a budget console. It doesn't ruin next gen.

Now it would be different if there were some sort of massive leap in PS5 and Series X that suddenly made games look 5 times better, and the S failed to have that feature, but it's simply not the case.
[image]
 

DenchDeckard

Moderated wildly
The guy didn't get exposed lying, just like the countless other devs didn't either; they deleted their tweets because they didn't want angry Xbox owners hurling abuse as usual, and to avoid getting into trouble at work for angering possible customers.


I've used a Series S but I own a PC. Have you developed for it though? Do you know how much work or trouble Series S optimisation is? Do you know that memory dictates a lot of things this gen? The devs making comments do.


Gotham Knights. Sure, I get that some of these devs aren't going to be The Coalition, Santa Monica, or ND, but can you at least empathise and see that Series S is a development burden? That it requires far more work to get a lesser result than the XSX/PS5? For games that might not even sell on it. That its memory limits what can be achieved this gen for things like world size, enemies on screen etc.? That doesn't mean awesome games can't exist on it.

There was new evidence that some devs are trying to drop the requirement and develop for Series X only. That was what's new even if it is just sentiment. Why not discuss it?

People have explained already that Series S requires optimisation, a lot more work, and the memory dictates what devs do the entire gen. Not in a podcast but comments by Remedy, id devs, riftbreaker devs, etc. The problem is that a bunch of angry fans tend to throw tomatoes in their rage at hearing things like this.

Can you please provide me evidence of developers having to do a lot more work for lesser results on Series S?

If you can't I think you just need to let this go.

Any console takes work, and every console has challenges... it's not just easy for any console environment.

There will be challenges for series s and series x. Imagine trying to get raytracing to work or dealing with split ram in series x, slower ram in ps5 vs series x. Smaller gpu in ps5 etc. Smaller memory pool in series s.

It's all possible challenges. Are you a developer? Do you have experience with series S?

What makes the people speaking negatively on the Series S hold any more weight than all the devs praising it? Are you saying those are liars?

I need to know where you are coming from with this? What is your motivation?

Were you against the PS3 in that generation and angry at Sony that every dev had to work out how to get games working on its two split pools of memory, and that it hindered the 360 versions, or does that not matter now as it was in the past? Angry that they went with the Cell architecture that was a nightmare for developers for years and took years and millions of dollars and man-hours to program?

I just want to know what is driving you on this. Are you a developer and you feel for them?
 
Last edited:

supernova8

Banned
What people aren't factoring in is that ANYTHING that bogs down developers is resources taken away from the ultimate product. Even when it comes to just making a single first-party Sony title there is usually considerable crunch; now imagine adding more crunch to optimize perfectly on a low-end console, which has been compared numerous times to low spec recommendations, but that isn't the same at all.
Well then basically every single game ever made that isn't exclusive to a specific platform has the same problems but companies still make it work. Plus, look at all the companies that started putting games on Switch when it became a hit (after initially being not interested at all).

I understand your point and I don't necessarily dispute it... I suppose (circling back to my original comment) I just don't care to hear developers whine about the Series S when we're getting cross-gen shit after cross-gen shit anyway.

Only way the requirement for Series S is dropped is if:

1) Series S sales end up sucking a few years down the road
or
2) Loads of developers choose to go PS5 exclusive

On the second point, it might happen in one of two ways:
a) PS5 ends up selling significantly more than Xbox Series S/X to the point that developers choose to go blue team-only
b) average spend per customer (on individual games) on Xbox drops significantly due to Game Pass, so developers/publishers not interested in being on Game Pass choose to just go PS5-only.
 

Danjin44

The nicest person on this forum
What people aren't factoring in is that ANYTHING that bogs down developers is resources taken away from the ultimate product. Even when it comes to just making a single first-party Sony title there is usually considerable crunch; now imagine adding more crunch to optimize perfectly on a low-end console, which has been compared numerous times to low spec recommendations, but that isn't the same at all.
I could be very wrong about this, but isn't it the same case with PC? I'm no expert, but don't PC developers usually have to make their games able to run on low-end gaming PCs?
 
Last edited:

Ozriel

M$FT
An Epic demo that required help from a first party studio to run on the XSS? Great. Why did they even get the Coalition's help? They should have released any old mess however it was.

The Coalition stepped in to get the demo optimized on both the Series X and Series S. Not just the Series S.

See how you’re pushing misinformation?

[image]
 

Ozriel

M$FT
What people aren't factoring in is that ANYTHING that bogs down developers is resources taken away from the ultimate product. Even when it comes to just making a single first-party Sony title there is usually considerable crunch; now imagine adding more crunch to optimize perfectly on a low-end console, which has been compared numerous times to low spec recommendations, but that isn't the same at all.

Crunch happens when you’re struggling to meet milestones you’ve already set for yourself for the project.

Any developer making a game would have set aside milestones and time for optimizing on all shipping platforms and across multiple PC GPUs. There is no inherent crunch from optimizing for Series S.

This is as ridiculous a take as claiming that making a Switch version of a multiplatform game leads to crunch and is bad for developers.
 

Three

Member
“Clarified” after being told over several posts that 720p20 hasn’t been a reason to be refused certification in the past.
I clarified straight away in the initial post that the main point was that optimisation is required to pass certification

I asked for a game that ran at 20fps 720p and you gave the Matrix Awakens demo. A demo that is known to have required MS input for memory and performance optimisation. It even runs at 30fps most of the time in gameplay. Not only was it not an unoptimised game, but a first party optimised demo.

And after going on a completely baseless theory that because a game wasn’t updated for Series S it shows it isn’t a priority to devs, even though the game was updated at the same time as Series X and PS5…
Like the other poster said, you backtracked after actively arguing specific numbers, only to say you didn’t argue specific numbers even though there is a page of you doing it.
You brought up PUBG, not me, as an example of such a game. I said PUBG just had the Xbox One S build, playable at a stable 30fps for 2 years on Series S, before they released a native Xbox Series S version this June. I even admitted that yeah, XSX and PS5 got an update at that time too. Why so long? They could have just hit compile and away they went.

But, to move on, even your original point is idiotic: there is NO PC game where the minimum “unplayable” specs and “playable” specs only differ by a GPU 2.5 times more powerful, on the same architecture.

When you have minimum specs that are downright unplayable, you have to move up several tiers in both CPU and GPU performance.
No idea what you're saying here but min spec can be set by the dev at whatever they want. End of story.
If you have a 240p15 game on Series S you are going to have something just as bad on Series X and PS5, and both A Plague Tale and Gotham Knights show early signs of it, where it’s not just the Series S struggling to hold 30.
A Plague Tale may be showing signs of what? What is it struggling with? Performance optimisation isn't stellar, correct? But it hits its target 30fps most of the time. It still had to be optimised and pass certification.

Is it CPU bound? Surely the devs can just lower res and graphical effects and everything would be dandy. No work at all.
The biggest issue with the Series S will obviously be memory, but OOM errors aren’t performance issues they are stability issues.

OOM is stability but still requires optimisation and the memory performance is also an issue. Something that runs poorly at 240p15fps on XSS may not necessarily run poorly on XSX/PS5 due to memory performance. It depends what you are doing.

Can you please provide me evidence of developers having to do a lot more work for lesser results on Series S?
In the OP, that's the evidence.

I can point you to the Riftbreaker dev who said Series X just requires you to hit compile and it works whereas the Series S requires optimisation.

The "a lot" more work would be the devs complaining enough to want to drop support for it entirely, in the OP. If it were just a 20 minute job there would be no reason to.

Any console takes work, and every console has challenges... it's not just easy for any console environment. There will be challenges for Series S and Series X.
This would depend on your team, game scope, engine. Series X challenges are not the same as Series S. It's obvious the effort for Series S is much much higher.

Imagine trying to get raytracing to work or dealing with split ram in series x, slower ram in ps5 vs series x. Smaller gpu in ps5 etc. Smaller memory pool in series s.

It's all possible challenges. Are you a developer? Do you have experience with series S?
Yes, those challenges aren't comparable in scope though.

What makes the people speaking negatively on the Series S hold any more weight than all the devs praising it? Are you saying those are liars?
Why would I say that? They praise it for different reasons, install base for a cheap console being one. Those reasons are not ease of development though, especially compared to XSX/PS5.

I need to know where you are coming from with this? What is your motivation?

Were you against the PS3 in that generation and angry at Sony that every dev had to work out how to get games working on its two split pools of memory, and that it hindered the 360 versions, or does that not matter now as it was in the past? Angry that they went with the Cell architecture that was a nightmare for developers for years and took years and millions of dollars and man-hours to program?
Unified memory was far better, agreed. Do you agree that a split memory pool was an issue for some devs on the PS3? Now imagine a tiny split pool today. With Cell I can see mixed results because, while novel, it provided great performance on some tasks. My motivation isn't what you think it is if this is what you're bringing up.
I just want to know what is driving you on this. Are you a developer and you feel for them?
Yes I feel for them and I'm just commenting on a forum like anybody else. I'm a tech lead.
 
Last edited:

Three

Member
The Coalition stepped in to get the demo optimized on both the Series X and Series S. Not just the Series S.

See how you’re pushing misinformation?

[image]
We've already discussed this. They specifically mention "memory optimisation especially on the Series S". Any optimisation however would apply to all systems even the PS5. The point is that they had to step in for optimisation and Epic couldn't just do it on their own. They didn't just hit compile and release a mess on Series S, they shifted the memory and performance optimisation onto The Coalition.
 
Last edited:

Three

Member
But who's at fault, the little console that could or the big engine developer that couldn't?
Nobody. It's not a blame game. It's the reality of a situation.
If the memory constraints could be worked around and overcome then that means that the Series S is capable and isn't going to be an albatross for the entire generation. Developers only need to learn these ways of optimising once and then they can use the techniques for the rest of the generation, we see this all the time as games get better and better as the generation progresses and they learn how to get the most out of the hardware.
As has been said, Epic only needed to make the changes to the engine once and then all future iterations of the engine will reap the rewards. It's a positive that Coalition got involved not a negative.
Absolutely it's a positive they did.
Not every project is going to get the Coalition's help like a tech demo to showcase hardware though.

Plus, what happens when you want to add features that require more memory on a console you've already squeezed? The optimisations may raise all boats, but their difficulty will limit how high. Especially if supporting it remains a requirement.
 
Last edited:

Ozriel

M$FT
We've already discussed this. They specifically mention "memory optimisation especially on the Series S". Any optimisation however would apply to all systems even the PS5. The point is that they had to step in for optimisation and Epic couldn't just do it on their own. They didn't just hit compile and release a mess on Series S, they shifted the memory and performance optimisation onto The Coalition.

…And you’re doubling down.

Coalition weren’t brought in to make it work on Series S. They were brought in to optimize the demo on Xbox hardware. They were responsible for getting it running on both Series X and Series S.

You pitching it as ‘Epic couldn’t do it on their own’ makes no sense. The Xbox wire article also says Coalition worked with Epic on the March 2020 UE4 demo for Xbox Series X. Are you also gonna claim that Epic couldn’t get UE4 running on 12TF, 16GB RAM Series X? 😂
 

Three

Member
…And you’re doubling down.

Coalition weren’t brought in to make it work on Series S. They were brought in to optimize the demo on Xbox hardware. They were responsible for getting it running on both Series X and Series S.

You pitching it as ‘Epic couldn’t do it on their own’ makes no sense. The Xbox wire article also says Coalition worked with Epic on the March 2020 UE4 demo for Xbox Series X. Are you also gonna claim that Epic couldn’t get UE4 running on 12TF, 16GB RAM Series X? 😂
I'm pretty sure you're misreading "We worked with Epic on the initial Unreal Engine 4 support for Xbox Series" as them polishing some kind of demo. Maybe also true, but that's not what it says. They worked with them as usual for hardware support. That collaboration continued with UE5 hardware support.

They jumped in and provided polish and optimisation for this demo:

"With their track record of creating games that are technical showcases for Unreal and Xbox, along with their collaboration and early experience with UE5, The Coalition was positioned well to jump in and help optimize and polish The Matrix Awakens: An Unreal Engine 5 Experience with Epic."

Nobody is saying "Epic couldn’t get UE4 running on 12TF, 16GB RAM Series X"

Whatever the hell "getting UE4 running" means anyway. If they also polished and optimised some 'UE4 demo' for their hardware then great, what's your point? they had games like gears for that already which they polished and optimised.
 
Last edited:

CeeJay

Member
Not every project is going to get the Coalition's help like a tech demo to showcase hardware though.
The project to help the number one off-the-shelf engine work better on Series S is something that was worth The Coalition getting involved in, because it means that those tweaks are now in for everyone to make use of (the source code for the Matrix demo is open and anyone can load it in and see how it's working under the hood). Looking at code examples and learning from them is something all developers have to do anyway, and it is the primary reason for these demos, not for consumers to download and play with. If a developer doesn't want to learn from these examples and wants to carry on using old methods then they are not really doing their jobs properly; would you trust a surgeon to carry out an operation who wasn't up to date with their knowledge and certifications?

I can understand why cross-gen games can be difficult to port, due to the game being primarily coded for old APIs using old techniques. However, when these devs say things like "when we are developing full next gen only games the Series S is going to be a problem" it really doesn't hold water. It means that they can use all these new techniques right from the ground up that the machines were designed to take advantage of, which will allow the developers to write efficient code and get the most out of them. If these devs see these new powerful machines (XSX, PS5) as a way to brute force more out of the old techniques then it's their own mentality that is at fault, not the Series S.

The Series S is full current gen architecture and any game that is written specifically for XSX and PS5 will down-port easily, and as many devs have come out and said, when you down-port to Series S you actually get more headroom to play with because a lot of the architecture is exactly the same as the full fat consoles. I would bet that those saying that they are having issues down-porting are also taking shortcuts or reusing lots of old code and using the power in the full fat consoles to make up for it. Follow good practice with your coding and the down-port will take care of itself to a large degree is the message I am getting from the devs who are saying Series S isn't an issue.
 

Ozriel

M$FT
That’s a last gen game targeting specs of 1.3 TFLOPs. The dev here is talking about next gen games. Why is this so hard?

Matrix UE5 Demo?
I'm pretty sure you're misreading "We worked with Epic on the initial Unreal Engine 4 support for Xbox Series" as them polishing some kind of demo. Maybe also true, but that's not what it says. They worked with them as usual for hardware support. That collaboration continued with UE5 hardware support.

They jumped in and provided polish and optimisation for this demo:

"With their track record of creating games that are technical showcases for Unreal and Xbox, along with their collaboration and early experience with UE5, The Coalition was positioned well to jump in and help optimize and polish The Matrix Awakens: An Unreal Engine 5 Experience with Epic."

Nobody is saying "Epic couldn’t get UE4 running on 12TF, 16GB RAM Series X"

Whatever the hell "getting UE4 running" means anyway. If they also polished and optimised some 'UE4 demo' for their hardware then great, what's your point? they had games like gears for that already which they polished and optimised.

And you’re the one insisting Epic couldn’t get the game running well on Series S and so passed the ‘mess’ over to Coalition.

Meanwhile the clear story is that Coalition had a specific, defined role to handle the final optimization on Xbox Series consoles.

That’s a last gen game targeting specs of 1.3 TFLOPs. The dev here is talking about next gen games. Why is this so hard?

The dev in question isn’t making a next gen game and has never made a game that would tax even a PS4.

Also…
It's also got a massive budget and was made by a large amount of different studios (9 I think)

Flight SIM is a taxing, next gen only game made by Asobo with a relatively small team. Runs at native 1080p/30fps with medium PC settings according to DF.
 

damiank

Member
1) Series S sales end up sucking a few years down the road
or
2) Loads of developers choose to go PS5 exclusive

On the second point, it might happen in one of two ways:
a) PS5 ends up selling significantly more than Xbox Series S/X to the point that developers choose to go blue team-only
b) average spend per customer (on individual games) on Xbox drops significantly due to Game Pass, so developers/publishers not interested in being on Game Pass choose to just go PS5-only.
That's just my speculation, but if Jimbo is smart he would give a dev "special treatment", like a 20/80 split or something, to get them to go exclusive.
 

SomeGit

Member
I clarified straight away in the initial post that the main point was that optimisation is required to pass certification

You did not, you even asked for examples of 720p20fps games. If you didn't want to push the argument, you'd just say the numbers don't matter, but you only did that one page later.

I asked for a game that ran at 20fps 720p and you gave matrix awakens demo. A demo that is known to have required MS input for memory and performance optimisation. It even runs at 30fps most of the time in gameplay. Not only was it not an unoptimised game but a first party optimised demo.

Irrelevant, you asked for something that was certified and approved by MS on the Series S. I gave you an example.

Who cares if it had help from a first party studio? Do you think first party studios and third party games that get help from support teams like Sony XDev skip certification?

You brought up PUBG, not me, as an example of such a game. I said PUBG just had the Xbox One S build, playable at a stable 30fps for 2 years on Series S, before they released a native Xbox Series S version this June. I even admitted that yeah, XSX and PS5 got an update at that time too. Why so long? They could have just hit compile and away they went.

I brought up PUBG as an example of a game with unacceptable performance that was approved by MS to release on Xbox One.
Just like I used Cyberpunk as an example that had been approved for release on Xbox One and PS4.
Just like Lichdom Battlemage, just like ARK.

You took PUBG and went on a tirade that it proves that devs don't care about Series S... but the "next gen" update came out to all 3 at the same time.
Native Series S version? It's still under BC, on all 3 consoles. God, you can't get one thing right here, over a point no one is making, ffs; stop talking about PUBG, you have no clue what you are talking about.

The point was this:

and this:


Were given the A-OK by both MS and Sony. It's irrelevant how they function on Series S.

No idea what you're saying here but min spec can be set by the dev at whatever they want. End of story.

You aren't going to get a game that is not playable at all on an RX 560, but is playable just fine at high resolutions on an RX 580, when accounting for the same CPU.
The idea you are trying to push, that you can have a min spec on PC that is unplayable, is irrelevant in this case. When that happens, the "playable" specs aren't just the same machine with a GPU a tier higher on the same architecture.

A Plague Tale may be showing signs of what? What is it struggling with? Performance optimisation isn't stellar, correct? But it hits its target 30fps most of the time. It still had to be optimised and pass certification.

Is it CPU bound? Surely the devs can just lower res and graphical effects and everything would be dandy. No work at all.

Plague Tale shows that you have problems hitting the 30fps target on S, and that also scales up to PS5 and X. Plague Tale isn't CPU bound on them, you can check the PC version for that. A Ryzen 5 3600 can easily go over 60fps, let alone 30.

OOM is stability but still requires optimisation and the memory performance is also an issue. Something that runs poorly at 240p15fps on XSS may not necessarily run poorly on XSX/PS5 due to memory performance. It depends what you are doing.

It is very unlikely that memory performance will be an issue; the slow pool is far too small and mostly used up by the OS. Demand on the fast pool scales in relation to resolution; you are not going to get 720p20 on S and 1080p30 on X, forget about that, you are arguing a moot point here.

The real big issue with the Series S is always going to be memory size, regardless of fast/slow RAM; 6GB can be a lot to cut down even if you are just using quarter-res textures and half-res buffers.
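As a rough illustration of that point (back-of-envelope arithmetic only; the ~13.5GB/~8GB game budgets are the commonly cited public figures, everything else below is an assumption made up for the sake of the sums):

```python
# Back-of-envelope memory arithmetic. All figures are assumptions for
# illustration: ~13.5 GB usable on Series X and ~8 GB on Series S are the
# commonly cited budgets; render-target and texture sizes are invented.

GB = 1024 ** 3

def render_target_bytes(width, height, bytes_per_pixel=8, num_targets=6):
    # Rough G-buffer + frame-buffer footprint for a deferred renderer.
    return width * height * bytes_per_pixel * num_targets

big_budget, small_budget = 13.5 * GB, 8.0 * GB

# Half-res buffers: 4K targets on the big console, 1080p on the small one.
rt_saved = render_target_bytes(3840, 2160) - render_target_bytes(1920, 1080)

# Quarter-res textures: assume 6 GB of streamed textures, dropped to a quarter.
tex_big = 6.0 * GB
tex_saved = tex_big - tex_big / 4

gap = big_budget - small_budget
print(f"saved by res/texture cuts: {(rt_saved + tex_saved) / GB:.2f} GB")  # ~4.8 GB
print(f"budget gap to close:       {gap / GB:.2f} GB")                     # ~5.5 GB
# The remainder has to come from things that don't scale with resolution:
# streaming pools, world data, animation, audio.
```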
 
Last edited:

Three

Member
And you’re the one insisting Epic couldn’t get the game running well on Series S and so passed the ‘mess’ over to Coalition.

Meanwhile the clear story is that Coalition had a specific, defined role to handle the final optimization on Xbox Series consoles.
This is how you misconstrue things. I never said they passed over a mess. In fact I said the opposite: that they didn't just hit compile and release a mess, but handed the memory optimisation and performance debt/work to The Coalition.
 

Klosshufvud

Member
I'm still not sure whether this dude simply has extremely poor reading comprehension or is genuinely, purposely twisting facts for God knows what reason. Either way there's no point entertaining him anymore.
 

DaGwaphics

Member
I'm not sure how that would be achieved. The demo was 24fps during cutscenes and 30fps during gameplay on Series X/PS5. Whatever optimisation they can do to achieve 60fps on XSX would apply to XSS too. That demo ran fine on XSS anyway with the extra help that MS provided to make it run decently on it.

We were talking about the effort needed by smaller and/or less skilled developers to get things going on the XSS and the fact that the coalition helped optimize the demo on XSS. Seems like improvements to the efficiency of nanite/lumen would make it easier for everyone at any fps target, but certainly for those smaller teams that don't optimize as well.
 

Ozriel

M$FT
This is how you misconstrue things. I never said they passed over a mess. In fact I said the opposite: that they didn't just hit compile and release a mess, but handed the memory optimisation and performance debt/work to The Coalition.

You’re the one pitching it as ‘Epic couldn’t do it so they passed it to Coalition’ and ‘not every dev will have help from Coalition’. Near verbatim quotes from you.

We were talking about the effort needed by smaller and/or less skilled developers to get things going on the XSS and the fact that the coalition helped optimize the demo on XSS. Seems like improvements to the efficiency of nanite/lumen would make it easier for everyone at any fps target, but certainly for those smaller teams that don't optimize as well.

Certainly. Any studio can easily copy Coalition’s approach if required. And some of that gets rolled into the engine itself.

Also worth mentioning that all of these (the Matrix demo and the impressive Flight Sim release) came before MS released an SDK update that makes hundreds of MB of additional RAM available for games. So there are even more resources to work with for games going forward.
 

Three

Member
You did not, you even asked for examples of 720p20fps games. If you didn't want to push the argument, you'd just say the numbers don't matter, but you only did that one page later.
I did say it. I said the intent and context matters straight away. I asked for examples only because I used numbers for what I would consider poorly optimised. I just wanted some examples that ran at 20fps 720p on a Series S. You didn't provide that either. Matrix Awakens is mostly 30fps in gameplay and has been optimised by MS themselves. You brought up PUBG on Xbox One S for some reason, a 900p game even on the One. I would consider 720p 20fps a poorly optimised game on a Series S.
Irrelevant, you asked for something that was certified and approved by MS on the Series S. I gave you an example.
You gave an example on a One S silly. I mentioned dips below 30fps on One S was akin to not hitting a 60fps target today with dips to 40s or whatever. I still maintain that a game running at 720p 20fps is a badly optimised game on a Series S but the values and how stringent they are are not relevant and I made that clear.
Who cares if it had help from a first party studio? Do you think first party studios and third party games that get help from support team like Sony XDev skip certification?
You think when we are talking about the difficulty of optimisation and certification the fact that it was a first party optimised demo is irrelevant? OK
You took PUBG and went on a tirade that it proves that devs don't care about Series S... but the "next gen" update came out to all 3 at the same time.
Native Series S version? It's still under BC, on all 3 consoles. God you can't get one right here, over a point no one is making ffs, stop talking about PUBG you have no clue what you are talking about.
I said support for Series S wasn't high on the priority list for PUBG because it just got the Xbox One S build. Are you saying that's not true while simultaneously arguing its recent July update was through BC?

Through BC or not the XSS/XSX/PS5 got higher framerate modes only recently and Series S was just running the Xbox One build at locked 30fps before.

Were given the A-OK by both MS and Sony. It's irrelevant how they function on Series S.
Disagree but Ok.
You aren't going to get a game that is not playable at all on an RX 560, but is playable just fine at high resolutions on an RX 580, when accounting for the same CPU.
The idea you are trying to push, that you can have a min spec on PC that is unplayable, is irrelevant in this case. When that happens, the "playable" specs aren't just the same machine with a GPU a tier higher on the same architecture.
Ok? The point was that min spec is different. It doesn't require certification and it can even be barely passable subjectively.
Plague Tale shows that you have problems hitting the 30fps target on S, and that also scales up to PS5 and X. Plague Tale isn't CPU bound on them, you can check the PC version for that.
Hell if it was CPU bound the S would have worse framerate performance than the X, no? Instead of the opposite?
So it's not CPU bound, right. What stopped them going even lower res on the higher spec machines? It was just optimisation work in specific scenarios, but they aimed for 30fps and hit it fairly well. It isn't some poor performing game. What it struggled with was dev time to optimise for specific hardware.

It is very unlikely that memory performance will be an issue; the slow pool is far too small and mostly used up by the OS. Demand on the fast pool scales in relation to resolution; you are not going to get 720p20 on S and 1080p30 on X, forget about that, you are arguing a moot point here.
Based on what? The small pool is the bigger issue I agree but memory performance is just as important to overall performance.

Not everything scales easily with res. Look at the id devs comments for an example. This isn't even talking about things like enemy variety on screen, map size etc.
 
Last edited:
First off, who the fuck is Ian and who the fuck is Bossa?

Developers need to try harder; the CPU is the same, just push it at 1080p 30fps.

What cross system game is currently better looking than Forza Horizon on xbox? I’m struggling to think of one. Will Rockstar get GTA6 running on the XS? They will.
There's a reason racing games come out at the launch of every new console: they're FAR LESS demanding than, say, a GTA or Assassin's Creed, etc. That's why they can put all the bells and whistles on them and tout them as "look at the power of these machines!" trophies. So saying that Forza looks great on Xbox doesn't mean shit. Not trying to be a dick. I just wish something could be sorted, because it will suck to have an entire generation of games hamstrung by a system bought by people too cheap to pony up the extra $150/$200 to just get a Series X. I mean, for god's sake, isn't the "average gamer" age like 37? Pretty sure 37 year olds can afford $500 for a gaming system that will be used for 6+ years.
 

Three

Member
We were talking about the effort needed by smaller and/or less skilled developers to get things going on the XSS and the fact that the coalition helped optimize the demo on XSS. Seems like improvements to the efficiency of nanite/lumen would make it easier for everyone at any fps target, but certainly for those smaller teams that don't optimize as well.
Absolutely, you're correct, but consider the idea that you have a limited pool of memory that the coalition have optimised the Matrix demo for. You need even further optimisation to add anything on top. Big developer or not the Series S would determine what you can do.
 

SomeGit

Member
I did say it. I said the intent and context matters straight away. I asked for examples only because I used numbers for what I would consider poorly optimised. I just wanted some examples that ran at 20fps 720p on a Series S. You didn't provide that either. Matrix Awakens is mostly 30fps in gameplay and has been optimised by MS themselves. You brought up PUBG on Xbox One S for some reason, a 900p game even on the One. I would consider 720p 20fps a poorly optimised game on a Series S.

PUBG has stretches of it hitting near single-digit framerates a lot of the time; you don't consider that an unacceptable framerate? Give me 720p20 stable over that, ffs. And again, I didn't just bring up PUBG, also Cyberpunk, ARK, etc.
Plenty of games with very subpar performance, given the A-OK by MS and Sony.

You gave an example on a One S silly.

Matrix is a Series S game, that's the example I gave you when you specifically asked.

I mentioned dips below 30fps on One S was akin to not hitting a 60fps target today with dips to 40s or whatever.

And I mentioned that that is bs goalposting, nothing says that MS changed their performance targets from One S to Series S. You are the one implying that.

I still maintain that a game running at 720p 20fps is a badly optimised game on a Series S but the values and how stringent they are are not relevant and I made that clear.

Badly optimized, yes. Would fail certification? Nobody knows, quit acting like it's fact.

You think when we are talking about the difficulty of optimisation and certification the fact that it was a first party optimised demo is irrelevant? OK

Certification and difficulty of optimization are 2 completely different concepts, why are you putting them together? Yes, the fact that it's a first party optimised demo is irrelevant. First party games also go through certification, why would this be difficult?

I said support for Series S wasn't high on the priority list for PUBG because it just got the Xbox One S build. Are you saying that's not true while simultaneously arguing its recent July update was through BC?

Through BC or not the XSS/XSX/PS5 got higher framerate modes only recently and Series S was just running the Xbox One build at locked 30fps.

So because all 3 got the update at the same time it means that the Series S was low on their priority list? Beautiful logic, congratulations man.

Disagree but Ok.

You don't think MS and Sony approved that performance? The games were just hacked onto XBL and PSN, I assume.
The same way master copies of Cyberpunk were printed behind Sony's back.

Ok? The point was that min spec is different. It doesn't require certification and it can even be barely passable subjectively.

It wasn't, you are just backtracking again.

So it's not CPU bound, right. what stopped them going even lower res on the higher spec machines?

Ask them, maybe they wanted higher resolutions. It is running on weaker processors with plenty of CPU headroom, unless there is a secret sauce for the PC version, yeah it could be higher.

Based on what? The small pool is the bigger issue I agree but memory performance is just as important to overall performance.

It's 2 GB, most of it used by the OS, why would the 56GB/s be a problem when most games won't use much of it? And the ones who do won't use stuff that requires more bandwidth than the average stuff you are storing in DDR4 on PC.

Not everything scales easily with res. Look at the id devs comments for an example. This isn't even talking about things like enemy variety on screen, map size etc.

I didn't say everything scales easily, I've been saying that the amount is a problem even when scaling down with res.
But for stuff that affects performance, yeah most of it will scale with resolution.

DDR5 is around 30GB/s to 50GB/s; do you think PC games will have major memory bandwidth problems? Massive memory bandwidth is important when handling stuff the GPU will use, and that all scales with resolution mostly, which is why VRAM bandwidth needs to be much, much bigger.

The biggest problem with the Series S is memory size, a game with OOM errors doesn't run at 720p20, it doesn't run period.
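To put toy numbers on the "bandwidth demand mostly scales with resolution" point above (the per-pixel cost below is an invented figure; only the ratio between the two targets matters):

```python
# Toy bandwidth arithmetic: per-frame GPU render-target traffic scales with
# pixel count. The 60 bytes/pixel figure is an assumption lumping together
# G-buffer writes, shading reads, post passes and overdraw.

def frame_traffic_gb_s(width, height, fps, bytes_per_pixel_touched=60):
    return width * height * fps * bytes_per_pixel_touched / 1e9

print(frame_traffic_gb_s(3840, 2160, 30))  # ~14.9 GB/s at a 4K30 target
print(frame_traffic_gb_s(2560, 1440, 30))  # ~6.6 GB/s at a 1440p30 target

# The absolute numbers are made up, but the ratio (2.25x) tracks pixel count,
# which is why a lower-resolution target needs proportionally less of the fast
# GDDR6 pool's bandwidth, while CPU-side data sitting in the slow pool doesn't scale.
```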
 
Last edited:
Slap in one of these bad boys and we're all good. Nothing stops this train.

[image]


The entitlement is hilarious though. Ask for multiple difficulty levels and you're "entitled." But ask to have multiple millions of people's brand new game systems dropped so you can have slightly shinier graphics, and you're not at all a selfish prick apparently.

I'm all about the Series S price point. And frankly, if games have to target that system then maybe we'll see some Switch 2 ports in play also. It's better for Xbox and better for Nintendo, and better for consumers as well.

Anyone crying about Series S is trying to push out poorly optimized games that still don't look on par with Naughty Dog stuff on PS4. They want an easy out of working on optimization, and want you to foot the bill.

The best post (quoted above) that summarized all of this manure was clear back on Page 2.

Get better at optimization, use the tools available, and quit complaining.
 

Three

Member
PUBG has stretches of it hitting near single-digit framerates a lot of the time; you don't consider that an unacceptable framerate? Give me 720p20 stable over that, ffs. And again, I didn't just bring up PUBG, also Cyberpunk, ARK, etc.
Plenty of games with very subpar performance, given the A-OK by MS and Sony.



Matrix is a Series S game, that's the example I gave you when you specifically asked.



And I mentioned that that is bs goalposting, nothing says that MS changed their performance targets from One S to Series S. You are the one implying that.



Badly optimized, yes. Would fail certification? Nobody knows, quit acting like it's fact.



Certification and difficulty of optimization are 2 completely different concepts, why are you putting them together? Yes, the fact that it's a first party optimised demo is irrelevant. First party games also go through certification, why would this be difficult?



So because all 3 got the update at the same time it means that the Series S was low on their priority list? Beautiful logic, congratulations man.



You don't think MS and Sony, approved that performance? They were just hacked into XBL and PSN, I assume.
The same way master copies of Cyberpunk were printed behind Sony's back.



It wasn't, you are just backtracking again.



Ask them, maybe they wanted higher resolutions. It is running on weaker processors with plenty of CPU headroom, unless there is a secret sauce for the PC version, yeah it could be higher.



It's 2 GB, most of it used by the OS, why would the 56GB/s be a problem when most games won't use much of it? And the ones who do won't use stuff that requires more bandwidth than the average stuff you are storing in DDR4 on PC.



I didn't say everything scales easily, I've been saying that the amount is a problem even when scaling down with res.
But for stuff that affects performance, yeah most of it will scale with resolution.

DDR5 is around 30GB/s to 50GB/s; do you think PC games will have major memory bandwidth problems? Massive memory bandwidth is important when handling stuff the GPU will use, and that all scales with resolution mostly, which is why VRAM bandwidth needs to be much, much bigger.
Your examples of games were on Xbox One S where 30fps1080p is considered a good performance target. The fact that PUBG was a stuttering mess of a 30fps1080p game in early access on an xbox one or whatever is a little different to a game running at 20fps720p on a Series S today I would say.


Even Lichdom Battlemage hit a perfect 900p30fps on an Xbox One when they lowered it from 1080p. Cyberpunk, a game where Sony exercised its right to stop sales for poor performance, is not a good example either. The point was that the platform holder can refuse its sale based on performance; unfortunately they just didn't catch it prior to release there. Ark is jank but performs fine until long play times.

You've completely oversimplified the importance of memory bandwidth to performance.
 
Last edited:

Three

Member
I'm still not sure whether this dude simply has extremely poor reading comprehension or is genuinely, purposely twisting facts for God knows what reason. Either way there's no point entertaining him anymore.
Says the guy who said I was somehow saying Series X versions would be cancelled or that death threats were sent to the devs. You can't get reading comprehension worse than that.
 
Last edited:

SomeGit

Member
Your examples of games were on Xbox One S where 30fps 1080p is considered a good performance target. The fact that PUBG was a stuttering mess of a 30fps 1080p game in early access on an Xbox One or whatever is a little different to a game running at 20fps 720p on a Series S today, I would say.

That’s your opinion; nothing states that any performance metric changed from last gen to this in relation to certification.


Even Lichdom Battlemage hit a perfect 900p30fps on an Xbox One when they lowered it from 1080p. Cyberpunk, a game where Sony exercised its right to stop sales for poor performance, is not a good example either. The point was that the platform holder can refuse its sale based on performance; unfortunately they just didn't catch it prior to release there. Ark is jank but performs fine until long play times.

Lichdom Battlemage wasn’t certified at 30, it was certified running at 15 to 17 FPS; later patches are irrelevant.

Being taken off PSN doesn’t mean anything; certification happens before release. Sony allowed it to be released and put it on sale on PSN, while also printing copies. The aftermath is irrelevant, it passed certification on both platforms.

Ark didn’t just run badly after some time. If it does now, it’s irrelevant; it was certified like that, as shown above.


You've completely oversimplified the importance of memory bandwidth to performance.

I didn't, even the id dev you were quoting wasn't talking about memory bandwidth on the main pool, he was complaining about the split memory pools and their sizes.
He didn't say anything about the bandwidth limit of the big pool of memory.
The resolution of the buffers and assets impacts the required bandwidth, and it scales accordingly.
 
Last edited:

Ozriel

M$FT
All the handwringing about smaller studios possibly struggling with next gen only games, and yet DF has an article up about A Plague Tale: Requiem - an impressive next gen only game - that was scaled down nicely and runs well on the Series S.

These are hard data points.
 
DDR5 is around 30GB/s to 50GB/s; do you think PC games will have major memory bandwidth problems? Massive memory bandwidth is important when handling stuff the GPU will use, and that all scales with resolution mostly, which is why VRAM bandwidth needs to be much, much bigger.
Just set texture size and resolution too high for the amount of VRAM you have and you will see that it can be a pretty serious problem (you can see how some GPUs that could otherwise handle higher resolutions are bottlenecking because of a lack of VRAM).
 

SomeGit

Member
Just set texture size and resolution too high for the amount of VRAM you have and you will see that it can be a pretty serious problem (you can see how some GPUs that could otherwise handle higher resolutions are bottlenecking because of a lack of VRAM).

Agree, but that's a memory size issue, not a memory bandwidth issue. Unlike a PC, you don't have a second big pool of memory to fall back on (well, you do, but it's so small that hardly anything GPU related is going to be put there unless your game has nothing else going on). If you exhaust the big pool of memory on Series S, you'll likely be kicked out by an OOM error. On PC you'll start using the regular RAM pool and be hampered by PCIe bandwidth and DDR RAM bandwidth; at that point the VRAM bandwidth is irrelevant because it's not the bottleneck anymore.
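A crude sketch of that difference, with assumed round numbers (448GB/s for a mid-range GDDR6 card, 32GB/s for PCIe 4.0 x16, ~8GB usable game budget on Series S); this is a toy model, not real driver behaviour:

```python
# Toy model of VRAM overspill on PC vs a fixed console budget.
# All numbers are assumptions chosen for illustration.

VRAM_BW, PCIE_BW, CONSOLE_BUDGET = 448, 32, 8  # GB/s, GB/s, GB

def pc_effective_bw(working_set_gb, vram_gb=10):
    # The slice of the working set that doesn't fit in VRAM is serviced
    # at PCIe speed instead of VRAM speed.
    if working_set_gb <= vram_gb:
        return VRAM_BW
    resident = vram_gb / working_set_gb
    return resident * VRAM_BW + (1 - resident) * PCIE_BW

def console_alloc(working_set_gb):
    # No spill path: exceeding the fixed budget is a hard failure, not a slowdown.
    if working_set_gb > CONSOLE_BUDGET:
        raise MemoryError("OOM: over the fixed console budget")
    return "fits, runs at full bandwidth"

print(pc_effective_bw(12, vram_gb=10))  # PC: spilled slice bottlenecked by PCIe (~379 GB/s blended here)
print(console_alloc(7.5))               # console: fine
try:
    console_alloc(12)
except MemoryError as e:
    print("console:", e)                # console: hard OOM instead of a slow frame
```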
 
Last edited:

Three

Member
That’s your opinion, nothing states that any performance metric changed from last gen to this.



Litchdom Battlemage wasn’t certified at 30 it was certified running at 15 to 17 FPS, later patches are irrelevant.
No specific performance metric is being argued, but a 30fps 1080p target with 17fps 1080p on an Xbox One is different to what I would consider a really bad 20fps 720p on Series S, which was just given as an example. 17fps 1080p on an Xbox One S is crap, but it's not 20fps-720p-on-a-Series-S crap.
Being taken off PSN doesn’t mean anything certification happens before release, Sony allowed it to be released and put it on sale on PSN, while also printing copies. The aftermath is irrelevant, it passed certification on both platforms.
It did happen before, but they had next gen consoles and the PS4 and Xbox One were struggling and causing the delays. You brought up a new game running on old hardware at 720p, sub 30fps.

Would you at least agree that Sony removed it post release due to the fact that it was a poor performing game on a PS4?

Ark didn’t just run badly after some time,
If it does now it irrelevant, it was certified like that above.

Again, an early access Xbox One game with Ark. It certainly was a mess, but an early access game on the One isn't the same as a 20fps 720p game released on Series S today. That would be considered even worse performance than on an Xbox One S and PS4.
 
Last edited:
Agree, but that's a memory size issue, not a memory bandwidth issue. Unlike a PC, you don't have a second big pool of memory to fall back on (well, you do, but it's so small that hardly anything GPU related is going to be put there unless your game has nothing else going on). If you exhaust the big pool of memory on Series S, you'll likely be kicked out by an OOM error. On PC you'll start using the regular RAM pool and be hampered by PCIe bandwidth and DDR RAM bandwidth; at that point the VRAM bandwidth is irrelevant because it's not the bottleneck anymore.
Yup, so the slow memory bank can't feed the fast one.
 