
Can I be honest here...I don’t think the Xbox Series S is going to last very long into the generation.

Kappa

Member
The Series S will last the generation for the same reason people still game on PCs with 1060s. Not everyone wants, or can afford, the latest and most expensive tech, and not everyone wants to game at the absolute maximum resolution or frame rate. There are still plenty of people out there who game on a PS3 or a 360.

This forum tends to get a warped view of gaming across the general audience because we're all enthusiasts who probably want the best and most advanced gaming experience.

Series S isn't going anywhere. It's the budget gaming console for this generation and a lot of people will be happy to buy it.
Yep, especially considering they all have the same CPU. Console warriors are so delusional.
 

Derktron

Banned
I tend to agree. It's not just a resolution downgrade as I first hoped; it's the framerate, the resolution and the ray tracing together, more often than not, and even on cross-generation games. I don't even have a 4K display, but I'd rather go for a Series X, as I feel that most games won't even have a performance mode on Series S once the generation abandons cross-gen games.
That is my biggest concern. The issue isn't the resolution, because it's somewhat capable of running some games at high resolution; the main problem is the performance issues. A lot of games don't even have a true Series S mode, and I think that is going to be a major problem, and believe it or not, people will see right through it. I mean, the only game with a Series S mode that runs at a smooth 60fps, whatever resolution you run at, is the new Assassin's Creed game. That is what I noticed, and Xbox doesn't seem to realize it; it's going to bite them on their ass in the future when games become fully next-gen only. I guess those reports that some devs were complaining about the Series S are maybe right after all. Only time will tell, the Series S is still fresh. I only made this topic to talk about what could maybe happen. I'm not saying it's going to happen right now. I'm just giving out what I see.
 

yamaci17

Member
i hate series s. openly. and no, it won't hold back nextgen gaming, it won't hold back series x.

only reason i hate series s is how useless it is, and i feel bad for people who fall for its promises. i want to dissuade them from buying a series s because it's such a bad value product

it's literally a 720p 60 fps machine. that's not holding anything back. if ubisoft had downgraded valhalla to run 1080p 60 fps on series s, then you could say it was holding back series x. but it's not. they pushed higher graphical fidelity, and series s ended up at 720p (no, it does not just dip to 720p. most of the time it runs at 720p, and occasionally 810p. neither you nor me have proof. we can argue all we want. for me, it runs at 720-810p 60 fps. i don't even think it would average 900p in a 3 hour gaming session. wish we had more detailed statistics on this, but it seems to be hard to get.)

unless a "reviewer" plays valhalla for 4 hours and calculates an average resolution, i won't believe that it runs at 900p most of the time. from the footage i've seen, it looks to be somewhere between 720p and 810p (and even then, 900p is pathetic). series s will never be able to run nextgen games at 1080p 60 fps. it's a hoax machine that is marketed as 1440p 60 fps and will end up at 720p 60 fps. all these people are delusionally happy when pre-2017 games run at 1080p-1440p 60 fps. of course they will run at 60 fps at high resolutions. the xbox one s ran rdr 2 at 900p 30 fps. it's only logical that series s can push that config to 1080p 60 fps. and once that happens, i'm pretty sure people will praise the console for how it runs rdr 2 at 1080p 60 fps (lol). i want new gen games to run at 1080p 60 fps, and that's not happening. see cyberpunk, it already runs at 1080p 30 fps. why? because they would need to go down to 540p to hit 60 fps. only with extra optimizations and the series s featureset might it do 720p 60 fps.


SERIES S will never, ever run cyberpunk at 1080p 60 fps. it's simply not capable enough. it will not run nextgen games at 1080p 60 fps. not even HELLBLADE 2. it will run at 720p 60 fps, and microsoft will say they used directml to achieve fake 1080p and call it a day. it will look hideous, probably worse than dlss, and dlss is not that great at 1080p either

i really can't comprehend how you can justify 720p-810p at this point in 2021. people rioted when games dropped to 900p on a ps4. the 900p treatment was only justified for xbox one/one s since it had 1.3 tflops compared to the 1.8 tflops of ps4. and even then, people did not like the fact that xbox one dropped to 900p against ps4's 1080p

yet microsoft is selling a brand new console, and this console readily and wholeheartedly and brutally drops to 720p-810p. this is just absurd. it's not holding anything back. it's holding back itself with its crazily gimped graphical processing power and memory.

not to mention, in future games it will probably run the lowest of low textures to accommodate its lower memory. developers won't care about 8 gb vram gpus and they won't care about the 5.5-6 gb vram of series s (assuming game code takes 2-2.5 gb of that). series x will be able to provide 10-11 gb of vram. when they fill that 10 gb of vram with texture sets, how do you think they will scale it back to 5.5-6 gb? this simply does not happen when you go from 4k to 1080p. you get a 1.5-2 gb vram reduction, in the BEST case scenarios. you need to turn down textures AGGRESSIVELY. take a look at red dead redemption 2: high textures look like crap compared to ultra textures, and low-med textures look like xbox 360 textures.

if developers choose to accommodate series s memory, then yes, it may be a bottleneck for higher vram systems. but i don't see that happening. they will push vram usage on higher systems and turn down the textures to lowest. you will have crap-looking textures by the end of the generation with series s. resolution does not save that much vram. you may not like the fact, but you can always open up game reviews and look up vram usage per resolution and per texture setting. you can also compare the texture quality differences yourself. as i said, rdr 2 is a prime example of this. 4k + ultra textures take up 7.5 gb of vram while 1080p + lowest possible textures take up 5 gb. all that sacrifice and you only get a 2.5 gb vram reduction. now apply this treatment to a game that will use 11 gb of vram: how do you scale it back to 5.5-6 gb? i don't even want to think about the possibilities.
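The VRAM numbers quoted above can be sanity-checked with a quick sketch. The 7.5 GB / 5 GB figures are the post's own claims about RDR2, and the 11 GB budget with proportional scaling is an assumption for illustration, not a measurement:

```python
# RDR2 figures as quoted in the post above (not independently measured).
ultra_4k = 7.5     # GB VRAM at 4K + ultra textures
lowest_1080 = 5.0  # GB VRAM at 1080p + lowest textures
saved = ultra_4k - lowest_1080  # 2.5 GB saved by the whole downgrade

# Hypothetical: scale the same ~33% saving to an 11 GB next-gen budget.
projected = 11 * (1 - saved / ultra_4k)  # ~7.3 GB, still above a 5.5-6 GB pool
```

Even granting proportional scaling, the projected footprint overshoots the Series S budget the post assumes, which is the crux of the argument.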
 
Last edited:
It's still a Zen CPU (same one that's in the Series X) with a SSD and RDNA2 GPU. Saying it's not an upgrade over the Jaguar-powered PS4 is insane.
But what does that give you in the end? I mean beyond the buzzwords?

You run the games at the same settings, sometimes at 60fps instead of 30, but then it drops and / or the resolution ends up below 1080p, which means it's not giving many benefits over the PS4 Pro, even fewer over the One X, and barely anything over the base PS4 in actual gaming experience. The only aspect I can't deny is the loading times, but as a main selling point for upgrading, I find it weak.

Beyond the buzzwords, I don't care about snazzy names for new tech.
 
Biggest problem, like others have said, is the RAM capacity; there is no such thing as 4K assets you can just quickly lower the resolution of without the game looking like complete ass afterwards. With SSDs the texture budget in RAM can be pretty low, meaning you can have bigger animation and geometry budgets, and you can't really downsize those without big trade-offs in terms of gameplay as well.

Another problem, like mentioned, is that games won't target native 4K resolution, unlike what Microsoft's PR would have you believe, so a 1440p game on PS5 would mean sub-900p on XSS if you assume a best-case scenario of linear scaling.
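A quick sketch of that linear-scaling estimate. This assumes the source is a PS5-class ~10.3 TF GPU against the Series S's ~4 TF, and treats pixels-per-TFLOP as linear, which is the stated best case rather than a measured law:

```python
def scaled_height(src_w, src_h, src_tflops, dst_tflops):
    """Scale a 16:9 render target linearly by raw TFLOPS ratio (best case)."""
    pixels = src_w * src_h * (dst_tflops / src_tflops)
    # at 16:9, pixels = (16/9) * h^2, so h = sqrt(pixels * 9/16)
    return (pixels * 9 / 16) ** 0.5

# 1440p on a ~10.3 TF GPU, scaled down to the Series S's ~4 TF
h = scaled_height(2560, 1440, 10.28, 4.0)  # ~898, i.e. just under 900p
```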
The people that buy a Series S don’t care about this lmao.
 

yamaci17

Member
Let me put it this way to assuage your fears. Multiplatform games are intrinsically highly variable. They always have to work at the bottom end. The majority of PC gamers use a GTX 1060, which is possibly weaker than the Series S. Game consoles, especially Xbox systems, are built like static computers, so developers know all they really need to do is lower the base resolution and texture quality to accommodate the weaker GPU and lower RAM. Sure, in later-gen titles the Series S may struggle the most with the more demanding games, but all that really happens is that Series S owners get the worst performance of the three major consoles. It's not the end of the world. We've dealt with worse in this gen alone. Look at how badly the VCR Xbox One is chugging with the most demanding games.
the 1060 performs nearly on par with the series s, and it's a 5-year-old gpu at this point, lmao

my friend plays ac valhalla at NATIVE 1080p with med-high settings and gets 45-50 fps.

we tried going down to 900p, and got a locked 60 fps. surprise, huh? but he said the image quality was HORRENDOUS, horrible compared to crisp native 1080p.

in the end he happily played at native 1080p.

series s will also get you 50-60 fps but at 810p to get better stability.

series s is literally an rx 5500xt with the rdna2 feature set. the 5500xt is near an rx 590, and the rx 590 is not that far from a 1060.

series s will surpass all these gpus with its feature set, i've argued a lot about this. series s will easily catch up to the gtx 1070/5600xt, this i admit, due to its rdna2 features (variable rate shading, mesh shaders, directml and many more). but it won't change the fact that it will target 720-810p for 60 fps modes.

but directml can't save this console, because deep learning upscaling still needs a high number of pixels to work with. this is why DLSS works best at 4k: you render at 1440p and use tons of pixels to upscale to 4k.

dlss at 1080p, on the other hand, works very badly: blurry, because it renders at 720p and the result is not desirable. directml will probably be worse than DLSS, so it won't save series s at all. but it will be a grace for series x, since it can render native 1440p and use a lot of pixel information to upscale to 4k with higher quality!
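The pixel budgets behind that argument are easy to make concrete. The pairings below are the common DLSS "Quality"-style internal/output resolutions the post describes; note that both paths use the same input-to-output ratio, so the difference is purely in absolute pixel count:

```python
def upscale_budget(in_w, in_h, out_w, out_h):
    """Return input pixels, output pixels, and the input share of the output."""
    src, dst = in_w * in_h, out_w * out_h
    return src, dst, src / dst

# 4K path: 1440p internal -> 2160p output
src_4k, dst_4k, share_4k = upscale_budget(2560, 1440, 3840, 2160)
# 1080p path: 720p internal -> 1080p output
src_1080, dst_1080, share_1080 = upscale_budget(1280, 720, 1920, 1080)
# same ~44% input share in both cases, but the 4K path feeds the
# upscaler four times as many source pixels to reconstruct detail from
```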
 

yamaci17

Member
The people that buy a Series S don’t care about this lmao.
this is low vs ultra textures in rdr 2. it saves 600 mb of vram at 1080p (from 6.1 gb to 5.5 gb, lmao).

series x will run 11-12 gb texture sets that will look like the right

series s will have to make do with 5-6 gb for textures. how will this transition happen? even if it looks like the left, it still won't fit the buffer. but it will look like the left, and then hell will be raised, you can be sure about that. the 10 gb vs 16 gb memory gap is a serious concern and problem for series s.

4 tflops can be accommodated by going down to 720p, 648p and even 540p, but textures will be another beast to handle. this is why some developers sounded concerned. they don't want to make separate, normal-looking textures for this console


Ec8MjlC.png
 
Last edited:

Hezekiah

Banned
I think it exists largely for xCloud to have a lower powered way of serving streaming to devices where it's pointless to use a 4k beast like XSX (and expensive.)

If it goes away, then that means devs not producing a version of their games w/ a lower powered target, and xCloud has to be all XSX all the time as far as rendering to people, whereas they plan on either having XSS in the cloud or having multiple people rendered from XSX hardware. Having 2 gamers on the same XSX hardware.. is.. well.. half as expensive.

So it might be here to stay, whether it catches on much with consumers or not.

They could kill it as a consumer device, and still have devs target it. But what's the point if their big end game is this belief xCloud will be streaming games by the "billions" to 5 inch cell phone screens and using an XSX to do it is not cost efficient?
Latency on streamed games is terrible and will be for years.
 

yamaci17

Member
I mean, didn't they have the Xbox One S for a long time? Also, I will be surprised if the system doesn't get games (it would be a huge fuck up for MS), and that's all I care about. The console for fidelity is already being sold...
the one x aimed at native 1620p/1800p/2160p and was mostly successful, due to targeting 30 fps
the one s rendered the same games at 810p/900p/1080p as a result

series x aims at 1296p/1440p/1620p rendering and mainly targets 60 fps. this changes the whole picture for series s.
series s targets 648p/720p/810p as a result.

if series x could target native 4k 60 fps, then series s would have no trouble. but with valhalla, and thanks to digital foundry's sermons about how native 4k is unnecessary, developers will happily tone down resolution for quick performance headroom. this affects series s in a huge way and will never let it render at 1080p 60 fps as a result.

as long as modern aaa games run at 1440p 60 fps on series x, series s will render them at 720p 60 fps. it's only logical. you can only do so much with 4 tflops against 12 tflops.
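For what it's worth, the resolution pairs listed above are an exact halving of height, which is a steeper cut than the raw compute gap alone implies. A rough check, using the public 12.15 / 4.0 TF figures for the two consoles:

```python
XSX_TF, XSS_TF = 12.15, 4.0
pairs = [(1296, 648), (1440, 720), (1620, 810)]  # heights from the post

# halving the height at a fixed 16:9 aspect quarters the pixel count
pixel_cut = [(big / small) ** 2 for big, small in pairs]  # 4.0x each
tflops_gap = XSX_TF / XSS_TF                              # ~3.04x
```

So if these targets hold, Series S is being scaled down somewhat harder than a pure TFLOPS ratio would predict, consistent with factors beyond shader throughput (such as memory bandwidth) also biting.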
 
Last edited:

bilderberg

Member
It's much better to be gpu limited than cpu limited. The Series S sharing the same cpu as the X and being gpu limited basically means they can lower resolution and graphics as much as they need to and get mostly whatever level of performance they want out of it. If you're cpu limited there's basically nothing you can do about it. There will probably be a Series S revision just like there would be an X revision down the line.
 

AGRacing

Gold Member
I don't think it was a smart long term buy for anybody... but I also believe many people didn't intend it to be when they bought it.

$220 CDN difference in price here. For that money you get increased SSD capacity, a Disc Drive, and a FAR more capable GPU.

I think it is pretty safe to say -- sales for the thing will fall off a cliff as soon as both of them are readily available on the shelf.

EDIT : but then again - that's probably when Microsoft starts bundling software with it or cutting the price of it.
 
Last edited:

Hezekiah

Banned
What were Microsoft thinking when they decided to go ahead with this console? It's already barely hitting 1080p with cross-gen games, so how's it going to fare later on? I can see games eventually being completely butchered on it. Were Microsoft just hoping it would help their market share against Sony with its cheap price? Maybe they were expecting it to perform much better than it is?
I can see Cyberpunk running like a dog on it after the patch. That's a cross-gen game; imagine how the next GTA, Elder Scrolls, Battlefield 6 etc. will run on it.

Called it from day one: it's only worth buying if you don't have the money for anything else, and even then you're better off sticking with a current-gen console and saving up. 4GB of RAM in 2021 is a joke.
 

yamaci17

Member
I can see Cyberpunk running like a dog on it after the patch. That's a cross-gen game; imagine how the next GTA, Elder Scrolls, Battlefield 6 etc. will run on it.

Called it from day one: it's only worth buying if you don't have the money for anything else, and even then you're better off sticking with a current-gen console and saving up. 4GB of RAM in 2021 is a joke.
cyberpunk runs at native 1080p but with locked 30 fps

they will do some variable rate shading, dx12_2 stuff to hit 720p 60 fps and call it a day XD
 

Polygonal_Sprite

Gold Member
I don’t think there’s much to worry about. Being GPU limited is much better than being CPU or disk speed limited if it didn’t have an SSD.

Games will be far lower resolution on Series S, but with reconstruction, dynamic resolution, VRS and other tricks, the people this box is marketed to won't care, because they probably still have a 1080p or 720p HDTV.
 
Those people you talk about are pretty much the definition of "dudebro" casuals though. Sports games and shooters.
Disagree. Those 'dudebro casuals' tend to know a few things about the 3 consoles and the services they offer. They may play generic yearly titles, but I'd bet they could beat me at many titles and have a decent knowledge of consoles... even if they aren't into games that aren't your yearly CoD.

I've personally always classed 'casual gamers' as those that don't really play games that often and may dip their toes in with things like Wii Sports or Animal Crossing
 

RoadHazard

Gold Member
Disagree. Those 'dudebro casuals' tend to know a few things about the 3 consoles and the services they offer. They may play generic yearly titles, but I'd bet they could beat me at many titles and have a decent knowledge of consoles... even if they aren't into games that aren't your yearly CoD.

I've personally always classed 'casual gamers' as those that don't really play games that often and may dip their toes in with things like Wii Sports or Animal Crossing

There are different levels of casuals. The people who have only ever played Wii Sports or AC are ultra casuals (not really "gamers" at all), but the people who only play the yearly releases of FIFA and COD are still casuals as well.
 

Elios83

Member
Series S is a bad product with gimped hardware that only exists to be a cheap access point for subscribing to Game Pass.
It's intended for a really mainstream audience, but those people won't buy it at $299, and that is why the system isn't selling well even with really limited supply.
I think MS might try the $199 card with a 3-month GP subscription included as soon as this holiday season.
If that doesn't work, the model can be considered pretty much dead.
 

Dream-Knife

Banned



I take every opportunity I can to force my flatmate who doesn't game to look at the difference between 30 and 60 fps (terrible flatmate haha, I am the LORD of the LAND though, so if he says no I could pap him), even showing him videos of side-by-sides of 30 vs 60, and I think he thinks I'm trolling him and that there is no difference.

It blows my mind, but if you don't know what to look for you can't say why one is better than the other, I guess. It's been my experience with literally 100s of casual gamers. I wish I had their eyes, but for god's sake I've got my father's eyes:

1496573392172118501.png

I actually can't tell with a controller. With a mouse I can, and I never want to play an fps below 100 fps ever again.
 

Riky

$MSFT
I really don't think we'll see mid-gen refreshes this cycle. We saw them last time because the Jaguar CPUs were underpowered from Day 1 and the real purpose of the PS4 Pro was to boost PSVR performance. This time the Zen CPUs and RDNA2 GPUs should provide enough headroom for the entire generation.

P.S., hope you stick around dude. Good seeing you back.

I don't think we will either. MS have been very clear that they don't see die reductions being anywhere near as easy this time, and that they will not give the cost savings we got from previous generations. That's why Series S makes a lot of sense from launch.
 

Riky

$MSFT
i hate series s. openly. and no, it won't hold back nextgen gaming, it won't hold back series x.

only reason i hate series s is how useless it is, and i feel bad for people who fall for its promises. i want to dissuade them from buying a series s because it's such a bad value product

it's literally a 720p 60 fps machine. that's not holding anything back. if ubisoft had downgraded valhalla to run 1080p 60 fps on series s, then you could say it was holding back series x. but it's not. they pushed higher graphical fidelity, and series s ended up at 720p (no, it does not just dip to 720p. most of the time it runs at 720p, and occasionally 810p. neither you nor me have proof. we can argue all we want. for me, it runs at 720-810p 60 fps. i don't even think it would average 900p in a 3 hour gaming session. wish we had more detailed statistics on this, but it seems to be hard to get.)

unless a "reviewer" plays valhalla for 4 hours and calculates an average resolution, i won't believe that it runs at 900p most of the time. from the footage i've seen, it looks to be somewhere between 720p and 810p (and even then, 900p is pathetic). series s will never be able to run nextgen games at 1080p 60 fps. it's a hoax machine that is marketed as 1440p 60 fps and will end up at 720p 60 fps. all these people are delusionally happy when pre-2017 games run at 1080p-1440p 60 fps. of course they will run at 60 fps at high resolutions. the xbox one s ran rdr 2 at 900p 30 fps. it's only logical that series s can push that config to 1080p 60 fps. and once that happens, i'm pretty sure people will praise the console for how it runs rdr 2 at 1080p 60 fps (lol). i want new gen games to run at 1080p 60 fps, and that's not happening. see cyberpunk, it already runs at 1080p 30 fps. why? because they would need to go down to 540p to hit 60 fps. only with extra optimizations and the series s featureset might it do 720p 60 fps.


SERIES S will never, ever run cyberpunk at 1080p 60 fps. it's simply not capable enough. it will not run nextgen games at 1080p 60 fps. not even HELLBLADE 2. it will run at 720p 60 fps, and microsoft will say they used directml to achieve fake 1080p and call it a day. it will look hideous, probably worse than dlss, and dlss is not that great at 1080p either

i really can't comprehend how you can justify 720p-810p at this point in 2021. people rioted when games dropped to 900p on a ps4. the 900p treatment was only justified for xbox one/one s since it had 1.3 tflops compared to the 1.8 tflops of ps4. and even then, people did not like the fact that xbox one dropped to 900p against ps4's 1080p

yet microsoft is selling a brand new console, and this console readily and wholeheartedly and brutally drops to 720p-810p. this is just absurd. it's not holding anything back. it's holding back itself with its crazily gimped graphical processing power and memory.

not to mention, in future games it will probably run the lowest of low textures to accommodate its lower memory. developers won't care about 8 gb vram gpus and they won't care about the 5.5-6 gb vram of series s (assuming game code takes 2-2.5 gb of that). series x will be able to provide 10-11 gb of vram. when they fill that 10 gb of vram with texture sets, how do you think they will scale it back to 5.5-6 gb? this simply does not happen when you go from 4k to 1080p. you get a 1.5-2 gb vram reduction, in the BEST case scenarios. you need to turn down textures AGGRESSIVELY. take a look at red dead redemption 2: high textures look like crap compared to ultra textures, and low-med textures look like xbox 360 textures.

if developers choose to accommodate series s memory, then yes, it may be a bottleneck for higher vram systems. but i don't see that happening. they will push vram usage on higher systems and turn down the textures to lowest. you will have crap-looking textures by the end of the generation with series s. resolution does not save that much vram. you may not like the fact, but you can always open up game reviews and look up vram usage per resolution and per texture setting. you can also compare the texture quality differences yourself. as i said, rdr 2 is a prime example of this. 4k + ultra textures take up 7.5 gb of vram while 1080p + lowest possible textures take up 5 gb. all that sacrifice and you only get a 2.5 gb vram reduction. now apply this treatment to a game that will use 11 gb of vram: how do you scale it back to 5.5-6 gb? i don't even want to think about the possibilities.

You're pitching resolution metrics when you have no way of measuring them, no proof, and you don't even own the console that you "hate". I think I'll take the word of the people with the equipment to actually give real results. Valhalla in quality mode actually gets as high as 1600p+ on Series S; nobody wants to mention that, I wonder why. These are also all last-gen games that have been quickly ported. What will actually happen is that now that the GDK contains easy access to SFS, VRS and Mesh Shaders, performance will increase on Series S, as the whole point of those features is to save on exactly the specs, like RAM bandwidth and GPU power, where the cutbacks on Series S were made.

People are also forgetting the move to a unified GDK: the minimum spec for PC will be lower than Series S for the entire generation, and Series S will just slot into that spectrum.
 
That is my biggest concern. The issue isn't the resolution, because it's somewhat capable of running some games at high resolution; the main problem is the performance issues. A lot of games don't even have a true Series S mode, and I think that is going to be a major problem, and believe it or not, people will see right through it. I mean, the only game with a Series S mode that runs at a smooth 60fps, whatever resolution you run at, is the new Assassin's Creed game. That is what I noticed, and Xbox doesn't seem to realize it; it's going to bite them on their ass in the future when games become fully next-gen only. I guess those reports that some devs were complaining about the Series S are maybe right after all. Only time will tell, the Series S is still fresh. I only made this topic to talk about what could maybe happen. I'm not saying it's going to happen right now. I'm just giving out what I see.
Devs will slowly abandon Series S; it'll simply get broken ports. Devs will only focus on PS5 and Series X and stop focusing on Series S. That's Microsoft's problem in the end, not the devs'. That low RAM configuration is the biggest problem on Series S, plus the low GPU performance. It means not just lower resolution: there have to be fewer objects on screen, graphical features have to be removed, and basically it's a bottleneck that stops devs from designing certain big levels, which is why they will abandon it.
 

VulcanRaven

Member
It will last, sadly. It will hold back new games for the entire console generation because they need to work on it.
 
Last edited:

Daymos

Member
digital only, 364gb of usable space.

It's like buying a car with a 1 quart gas tank and then selling you a 1 gallon gas tank for $10,000.
 

MilkLizard

Member
this is low vs ultra textures in rdr 2. it saves 600 mb of vram at 1080p (from 6.1 gb to 5.5 gb, lmao).

series x will run 11-12 gb texture sets that will look like the right

series s will have to make do with 5-6 gb for textures. how will this transition happen? even if it looks like the left, it still won't fit the buffer. but it will look like the left, and then hell will be raised, you can be sure about that. the 10 gb vs 16 gb memory gap is a serious concern and problem for series s.

4 tflops can be accommodated by going down to 720p, 648p and even 540p, but textures will be another beast to handle. this is why some developers sounded concerned. they don't want to make separate, normal-looking textures for this console


Ec8MjlC.png
I admit I didn‘t read all your long ass posts hating on a piece of plastic. Nobody has time for that but as someone who actually plays RDR2 on the S, it doesn‘t look like the left pic at all. Coming from a base PS4, the S is a great console and I don‘t regret buying it.
 

Banjo64

cumsessed
I admit I didn‘t read all your long ass posts hating on a piece of plastic. Nobody has time for that but as someone who actually plays RDR2 on the S, it doesn‘t look like the left pic at all. Coming from a base PS4, the S is a great console and I don‘t regret buying it.
Init.

Also, the power delta between the 3090 and the PS5 is far greater than the delta between the PS5 and Series S.

36tf vs a flexible 10tf (well over 3x difference)
A flexible 10tf vs 4.2tf (well less than 3x difference)

So by this guy's logic, I'm assuming he thinks the textures on the PS5 will look like N64 textures compared to the 3090?
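The ratios in this comparison check out on paper TFLOPS, taking the post's own 36 / 10 / 4.2 figures at face value (a later reply disputes whether Ampere's paper number is comparable at all):

```python
rtx3090_tf, ps5_tf, xss_tf = 36.0, 10.0, 4.2  # figures as quoted in the post
top_gap = rtx3090_tf / ps5_tf  # 3.6x: "well over 3x"
low_gap = ps5_tf / xss_tf      # ~2.4x: "well less than 3x"
```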
 

PhaseJump

Banned
It will last until Microsoft tapers off support and makes the Series X the base entry point to the console side of the Xbox platform.

The generation will continue and likely run longer.

Nothing will be held back. Developers will find some new angle to complain about while needing to put in the work necessary that releases a product. Losers with nothing to do in life but complain about how Microsoft is dooming or threatening their cheetos stained existence will still be fickle, angry, and retarded enough to believe they know how or why the gaming industry does what it does, just because they learned how to count pixels on the plastic box brand they want to hate, instead of something useful like learning how to survive without electricity, or how to fix a flat tire to get where you need to go.

 

yamaci17

Member
I admit I didn‘t read all your long ass posts hating on a piece of plastic. Nobody has time for that but as someone who actually plays RDR2 on the S, it doesn‘t look like the left pic at all. Coming from a base PS4, the S is a great console and I don‘t regret buying it.
if you had read it, you wouldn't say this

i'm not saying it looks like that in rdr 2. i'm saying that when systems run out of vram, they have to resort to textures that look like that. once the xbox sx/ps5 are fully utilized with their 13.5 gb of allocated vram, series s's 8 gb of allocated vram will be a huge problem, and THEN some games will start to look like the left. i'm not saying they look like that now; that can't happen, because every game out there caters to 8 gb vram as of now. but that will change: rtx 3070/2080s/series s, all of them will play with GARBAGE textures in nextgen games.
 
Last edited:

BadBurger

Is 'That Pure Potato'
I think if people are happy with it now on their 4K TVs, they'll be fine with it five or six years from now just the same. I doubt these are going to be the same customers snapping up an 8K TV before this generation concludes.

Developers can work with it for the entire gen, as they did with each of the three Xbox SKUs last gen.
 
Last edited:

yamaci17

Member
Init.

Also, the power delta between the 3090 and the PS5 is far greater than the delta between the PS5 and Series S.

36tf vs a flexible 10tf (well over 3x difference)
A flexible 10tf vs 4.2tf (well less than 3x difference)

So by this guy's logic, I'm assuming he thinks the textures on the PS5 will look like N64 textures compared to the 3090?
no, vram requirements this generation will simply be dictated by ps5/sx. ps5 and sx will never have bad looking textures. besides, the 3090's 36 tflops are fake; it's nearer 21-22.5 tflops in reality (so nearly a 2x difference)

rtx 3090 and 6900xt = equal rasterization gaming performance
rtx 3090 36 tf
6900xt 23 tf

misleading, isn't it?

by your logic, the rtx 3070 is 20 tflops and the rx 6700xt is 12.5-13 tflops, yet they perform the same or nearly the same in almost all games.

ampere tflops cannot be directly compared to rdna2 or turing. in short, the effective tflops of a 3090 is ~22.5, against the ps5's 10 rdna2 tflops.
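The arithmetic behind that effective-TFLOPS claim, sketched out. The 10,496 shaders and ~1.7 GHz boost are the 3090's public specs; the ~62.5% effective-throughput factor is the post's assumption about Ampere's dual-issue FP32 in rasterization, not a measured constant:

```python
def paper_tflops(shaders, boost_ghz, fma_per_clock=2):
    """Peak FP32 TFLOPS: shader count x 2 FLOPs per FMA x clock in GHz."""
    return shaders * fma_per_clock * boost_ghz / 1000

rtx3090 = paper_tflops(10496, 1.695)  # ~35.6 "paper" TF
effective = rtx3090 * 0.625           # ~22 TF under the post's assumed scaling
# versus the PS5's ~10.3 RDNA2 TF: roughly a 2x gap, not 3.6x
```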

3090 having more vram does not mean anything. no developer ever will specifically target 3090 for textures.

i think you're missing the point here. besides, textures have nothing to do with GPU power, so i don't even know why you've introduced the tflops argument.

textures only need vram. the 3090 having absurdly high vram does not mean anything because it's an outlier. ps5 and sx having 16 gb of vram means a lot because they will dictate the generation, similar to how the ps4 and xbox one dictated theirs. by 2018, 5.5-6 gb of vram had become a requirement at 1080p, since the ps4/xbox one could allocate that much total memory to games

series s simply won't have enough vram for games that are designed for 16 gb consoles and 12-16 gb gpus (16 gb vram gpus will be widespread by 2023, just like 6-8 gb cards dominated the last 4-5 years)

Riky Riky you're really pathetic tbh. explain what you found funny in this post, i'm really intrigued. provide your counter-arguments, if you can (i bet you can't)

i wish there was a way to disable comment emotes for ourselves; it's pretty funny to watch cowards spam awkward emotes and run away.
 
Last edited:

Banjo64

cumsessed
no, vram requirements this generation will simply be dictated by the ps5/sx. ps5 and sx will never have bad-looking textures. besides, the 3090's 36 tflops are misleading; in reality it's near 21-22.5 tflops (so closer to a 2x difference)

rtx 3090 and 6900xt = equal rasterization gaming performance
rtx 3090: 36 tf
6900xt: 23 tf

misleading, isn't it?

by your logic, the rtx 3070 is 20 tflops and the rx 6700xt is 12.5-13 tflops, yet they perform the same, or nearly so, in almost all games.

ampere tflops cannot be directly compared to rdna2 or turing. in short, the effective tflops of a 3090 are about 22.5 against the ps5's 10 rdna2 tflops.

the 3090 having more vram does not mean anything. no developer will ever specifically target the 3090 for textures.

i think you're missing the point here. besides, textures have nothing to do with gpu power, so i don't even know why you've introduced the tflops argument.

textures only need vram. the 3090 having absurdly high vram does not mean anything because it's an outlier. the ps5 and sx having 16 gb of vram means a lot because they will dictate the generation, similar to how the ps4 and xbox one dictated theirs. by 2018, 5.5-6 gb of vram had become a requirement at 1080p, since the ps4/xbox one could allocate that much total vram to games

the series s simply won't have enough vram for games designed around 16 gb vram consoles and 12-16 gb vram gpus (16 gb vram gpus will be widespread by 2023, just like 6-8 gb vram cards dominated the last 4-5 years)

Riky Riky you're really pathetic tbh. explain what you found funny in this post, i'm really intrigued. provide your counter-arguments, if you can (i bet you can't)

i wish there was a way to disable comment emotes for ourselves; it's pretty funny to watch cowards spam awkward emotes and run away.
:messenger_tears_of_joy:

 

Riky

$MSFT
I've already answered you above; your "hatred" for this machine, which is truly pathetic, is clouding your thinking. Next-gen game engines will incorporate the full suite of RDNA2 performance-saving features, such as SFS, VRS and Mesh Shaders. These will be big for Series X but even bigger for Series S, and we've already seen it with Gears 5.
Example:

The only card I could get when my RX480 died recently was a 5500XT, and it will still be above the minimum spec for this generation of games at 1080p.
It can run Gears 5 at 60-95fps on Ultra with Freesync, but it can't hold a candle to my Series S running the same game with VRS. Therefore, if my 5500XT will be ok for 1080p this gen, even at medium, then the Series S will be fine on a unified GDK; it's obvious when you think it through.
 
Last edited:

yamaci17

Member
I've already answered you above; your "hatred" for this machine, which is truly pathetic, is clouding your thinking. Next-gen game engines will incorporate the full suite of RDNA2 performance-saving features, such as SFS, VRS and Mesh Shaders. These will be big for Series X but even bigger for Series S, and we've already seen it with Gears 5.
Example:

The only card I could get when my RX480 died recently was a 5500XT, and it will still be above the minimum spec for this generation of games at 1080p.
It can run Gears 5 at 60-95fps on Ultra with Freesync, but it can't hold a candle to my Series S running the same game with VRS. Therefore, if my 5500XT will be ok for 1080p this gen, even at medium, then the Series S will be fine on a unified GDK; it's obvious when you think it through.

i guess you people really aren't reading anything i write, actually.

i said that myself. i said it was near a 5500xt, but that it will get near a 5600xt/gtx 1070 in future games when those technologies are used.

but games will get tougher to run. a 1070 may run rdr 2 at 1080p 60 fps today, but it won't run tomorrow's games like that, and the same goes for the series s. sure, it will enjoy a lead over the rx 580/5500xt, but not by enough to push it to 1080p 60 fps

also, sampler feedback is not a miracle. it will reduce vram usage, but only by a certain amount, and as i said, even the id tech developers voiced their concern about the memory on the series s, so i don't really see the point of arguing this.
 

phil_t98

#SonyToo
this is low vs ultra textures in rdr 2. it saves 600 mb of vram at 1080p (from 6.1 gb to 5.5 gb lmao).

series x will run 11-12 gb of textures that will look like the right

series s will have to make do with 5-6 gb for textures. how will this transition happen? even if it looks like the left, it still won't fit the buffer. but it will look like the left, and then hell will be raised, you can be sure of that. that 10 gb vs 16 gb memory gap is a serious concern and problem for the series s.

4 tflops can be accommodated by going down to 720p, 648p or even 540p, but textures will be another beast to handle. this is why some developers sounded concerned; they don't want to make separate, normal-looking textures for this console


[image: rdr 2 low vs ultra texture comparison]
A poor example of a screenshot, as neither the PS4 nor the Xbox One ran with textures that low
 

yamaci17

Member
A poor example of a screenshot, as neither the PS4 nor the Xbox One ran with textures that low
im not saying they look like that, jesus. they look like the right, because they have enough vram to store ultra textures (i will not explain this a third time, and i will ignore anyone who keeps saying the same after this specific post)

the left is what happens when a system runs out of vram. this is what will happen to the series s in the future, because it does not have enough vram for next-gen high-quality textures, regardless of the resolution

let me give you a simple example,

let's say a theoretical ps4 lite had been released in 2013 with 5 gb of vram. it would run rdr 2 like the left, since it would not have ENOUGH vram to fit high-quality textures.

clear enough?

why do you think they went overboard with 8 gb of vram in 2013 when all the high-end gpus had just 2 gb? because console manufacturers have always valued texture quality, and that alone can save a game's grace and quality. textures are an integral part of modern games; you either have ENOUGH vram or you don't. the series s won't have enough vram in future games, and some games will probably look like the left. i DON'T say they look like that NOW. stop. seriously, learn to read and comprehend.
 
Last edited:

phil_t98

#SonyToo
im not saying they look like that, jesus. they look like the right, because they have enough vram to store ultra textures (i will not explain this a third time, and i will ignore anyone who keeps saying the same after this specific post)

the left is what happens when a system runs out of vram. this is what will happen to the series s in the future, because it does not have enough vram for next-gen high-quality textures, regardless of the resolution
Again, as people have said before, it's not aimed at people who want top-end hardware; it's aimed at casuals who pick up and play for an hour or so regardless of graphics. Nothing we've seen so far pushing the PS5 or Series X has been undoable on the Series S. It's just a lower-end console, like when somebody buys a cheaper graphics card.
 
im not saying they look like that, jesus. they look like the right, because they have enough vram to store ultra textures (i will not explain this a third time, and i will ignore anyone who keeps saying the same after this specific post)

the left is what happens when a system runs out of vram. this is what will happen to the series s in the future, because it does not have enough vram for next-gen high-quality textures, regardless of the resolution

let me give you a simple example,

let's say a theoretical ps4 lite had been released in 2013 with 5 gb of vram. it would run rdr 2 like the left, since it would not have ENOUGH vram to fit high-quality textures.

clear enough?

why do you think they went overboard with 8 gb of vram in 2013 when all the high-end gpus had just 2 gb? because console manufacturers have always valued texture quality, and that alone can save a game's grace and quality. textures are an integral part of modern games; you either have ENOUGH vram or you don't. the series s won't have enough vram in future games, and some games will probably look like the left. i DON'T say they look like that NOW. stop. seriously, learn to read and comprehend.
You don't need "next-gen textures" when you have a low resolution. It would be a complete waste of resources.
 

yamaci17

Member
You don't need "next-gen textures" when you have a low resolution. It would be a complete waste of resources.
games these days have one set of textures. that's the problem. if you really think they will build special, lower-resolution textures specifically for the series s, you're deeply mistaken

as i've said a million times, and as you people have misunderstood a million times, there's not a huge vram consumption difference between 1080p and 4k.
[image: shadow of the tomb raider vram usage comparison]

you either have "enough" vram, or you don't. that's about it.

and if you "lower" textures, they look like that pic. and even then, as i've said a million times, and as you people fail to comprehend for the millionth time, LOWERING TEXTURE QUALITY does not reduce the vram requirement HUGELY. it simply DOES. NOT. because there aren't any OTHER texture sets. games these days use inefficient texture-downscaling algorithms that free up a couple HUNDRED megabytes of memory.

GOING from ULTRA textures to the LOWEST POSSIBLE in MOST games will net a total of 500-700 mb of vram. THAT's not enough to make up for the series s's vram deficit against the series x/ps5.
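
the arithmetic behind those small savings, as a sketch (assumes block compression at ~1 byte/texel; games' "low" presets often only drop a subset of textures rather than the whole set, which is one reading of the 500-700 mb figure):

```python
# one texture's footprint: base level, plus a full mip chain adds ~1/3 on top
# (geometric series). 1 byte/texel approximates bc7-style compression.
def texture_mb(size_px, bytes_per_texel=1.0):
    return size_px * size_px * bytes_per_texel * (4 / 3) / 2**20

ultra = texture_mb(4096)  # ~21.3 mb per 4k texture
low   = texture_mb(2048)  # ~5.3 mb after dropping the top mip level
print(f"ultra {ultra:.1f} mb vs low {low:.1f} mb per texture")
```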

i once played games on a 2 gb vram gpu. it simply sucked. even at the lowest possible texture settings, games still stuttered while 4 gb cards ran those games with ease. a simple 2 gb of vram made all the difference in the world.

this is the situation with the series s and x. the s shouldn't have had gimped vram. gimped power can be accommodated, since you people enjoy 648p quality a lot, but gimped vram will be tough to accommodate for
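
the asymmetry, spelled out (pure arithmetic, no game data): shading cost scales roughly with pixel count, so resolution drops can cover a gpu deficit, but the texture pool a game needs is the same at any resolution.

```python
# gpu shading load scales ~linearly with pixels rendered; texture vram does not.
def pixel_ratio(hi, lo):
    return (hi[0] * hi[1]) / (lo[0] * lo[1])

print(pixel_ratio((3840, 2160), (1920, 1080)))  # 4.0x fewer pixels at 1080p
print(pixel_ratio((1920, 1080), (960, 540)))    # another 4.0x down to 540p
# ...but a 10 gb texture set still needs ~10 gb of vram even at 540p.
```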
 
Last edited:

Derktron

Banned
Devs will slowly abandon the Series S; it'll simply get broken ports. Devs will only focus on the PS5 and Series X and stop focusing on the Series S, and that's Microsoft's problem in the end, not the devs'. That low RAM configuration is the biggest problem on the Series S, plus the low GPU performance. It means not just lower resolution: objects on screen have to be reduced and graphical features have to be removed. Basically it's a bottleneck that stops devs from designing certain big levels because of the Series S, and that's why they will abandon it.
Yeah, that's what I see happening. The Series S seems weaker than a sub-$1000 Walmart gaming laptop, at least when it comes to the GPU. I think Microsoft made a stupid mistake in making the Series S this weak on GPU performance. I don't care that their intention was to make a cost-efficient system; in the long run it will bite them in the ass. Because if you think about it, what was the reason to make the CPU the same as the Series X but not make the GPU the same, or almost the same, to ensure future-proof performance? What they should have done is still make the Series S, but disc-less, and sell it at a low price of $400, and people would still buy it no matter the price; people bought the Series X and the PS5, disc and disc-less. So I don't understand why they would even go this route. The true test will come when devs stop making last-gen ports and only make next-gen games.
 

Razvedka

Banned
Why? It has the same CPU as the XSX (which is faster than the PS5's) and the same SSD. The only limiting factor is resolution, and the console was never meant to be a 4K machine. Again, the console isn't targeted at anyone here who has a fancy 4K TV.
The XSS has considerably less memory and bandwidth than the XSX, which is what the guys behind the new Doom engine tech were highly critical of:


As for the CPU, the XSS version of the XSX CPU is clocked lower than both the XSX and the PS5: 3.4GHz w/ SMT (XSS) vs 3.5GHz (PS5), while the XSX runs at 3.6GHz w/ SMT.

Even then, I don't know if we have a clear picture of real-world performance, since all these boxes have dedicated silicon to offload CPU-heavy tasks. But however fast the XSX is, the XSS is technically 200MHz slower.

The XSS is going to age like milk, streaming be damned. Either that, or the XSX will never be tapped to the full degree it should be, in order to hold the XSS's hand.
 

Derktron

Banned
Is today national concern trolling day or something??
Is today national idiot day or something??

Imagine being an idiot and thinking this is a troll, lol. Try hard, bro. It's a concern worth even thinking about. If you don't like it, there's the door.
Also, if anyone here, like yourself, thinks the Series S is going to last performance-wise, then I'll remember your name so I can call you out. Have a nice day. Onto my ignore list.
 
Yeah, that's what I see happening. The Series S seems weaker than a sub-$1000 Walmart gaming laptop, at least when it comes to the GPU. I think Microsoft made a stupid mistake in making the Series S this weak on GPU performance. I don't care that their intention was to make a cost-efficient system; in the long run it will bite them in the ass. Because if you think about it, what was the reason to make the CPU the same as the Series X but not make the GPU the same, or almost the same, to ensure future-proof performance? What they should have done is still make the Series S, but disc-less, and sell it at a low price of $400, and people would still buy it no matter the price; people bought the Series X and the PS5, disc and disc-less. So I don't understand why they would even go this route. The true test will come when devs stop making last-gen ports and only make next-gen games.
It's a big liability. Basically, what happened to the Nintendo Wii is awaiting the Series S.
 