
Xbox Series X server blades are now enabled.

M1chl

Currently Gif and Meme Champion
You can get GP with some fuckery and hassle, but PS Now is so far IMPOSSIBLE, as you need a US credit card or a card from a supported country.

They should keep that streaming bullshit to those countries and offer download-only to other countries if they seriously want more exposure. Streaming is an expensive, useless feature to most gamers, so cutting that expense and moving to a fully downloadable rental model would give both of them more money to spend on bringing better quality games to their services, instead of dedicating $400-500+ of hardware per streamer.
Absolutely agree with the second part, it sucks what kind of hassle it is to get these services (well, at least PS Now for me). Thankfully we are in the same IP pool as Germany, so with some trickery in the router it can be done; I've had a month of PS Now a few times. Thankfully PS+ is kicking ass and I am still paying for it, so I can snag those games, even though my PS5 is probably years away. But I am sharing the acc with my brother, so it's just pennies for me : )

Hopefully everything is going to be available soon™
 

CrustyBritches

Gold Member
Thanks to thicc_girls_are_teh_best for dragging me here. Now my stomach is empty so I can move on.
GIGCX7C.gif
 

Kimahri

Banned
The resolution is kinda shitty, but if they can sort that out, cloud gaming is pretty awesome. I really dig just starting up a game I don't even have installed. It's not as smooth as Stadia though. There you just press play and you're in; with xCloud you need to go through the entire boot sequence. Hopefully that'll get streamlined in the future.
 

CrustyBritches

Gold Member
Stream quality seems a little better for me, but it's still hard to tell what hardware Gears 5 is running on. Image quality is OK for browser streaming, but the input latency on xCloud is still high compared to Stadia or GFN. Having new hardware should help, since running a higher framerate on the server side should improve things. I ended up having better success with the Chrome browser than Edge, and I think a native app would work even better.
 

elliot5

Member
Stream quality seems a little better for me, but it's still hard to tell what hardware Gears 5 is running on. Image quality is OK for browser streaming, but the input latency on xCloud is still high compared to Stadia or GFN. Having new hardware should help, since running a higher framerate on the server side should improve things. I ended up having better success with the Chrome browser than Edge, and I think a native app would work even better.

Definitely looks like XSX to me, at least viewing in a smaller YouTube window.
 

CrustyBritches

Gold Member
Definitely looks like XSX to me, at least viewing in a smaller YouTube window.
Agreed. I'll take a look at the One S version later tonight. The loading was screaming fast, so I'd assume XSX.

This is an XSX 4K/60fps video of the same area...


XSX:
1HiveXSX.jpg

xCloud:
1-Hivex-Cloud.jpg
 
Because it's still 720p, highly compressed streaming. Enjoy it.
Well, it's better than nothing and better than some of the competition. Also, from reports I've heard, it's actually running at a higher resolution than that for those with beta access so far.

Regardless, it'll always be compressed; that's how streaming works. Blame video encoding and data packet standards and protocols. Most people in the audience for game streaming either won't care about the compression or won't let it deter them from its benefits.
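To put rough numbers on why raw video is a non-starter, here's a quick back-of-the-envelope calculation (the 24 bits per pixel and ~10 Mbps stream bitrate are illustrative assumptions, not xCloud's actual settings):

```python
# Rough math: raw 720p60 video vs. a typical compressed stream.
# The 24 bpp and 10 Mbps figures are illustrative assumptions.

WIDTH, HEIGHT, FPS = 1280, 720, 60
BITS_PER_PIXEL = 24   # uncompressed 8-bit RGB
STREAM_MBPS = 10      # ballpark bitrate for a 720p60 game stream

raw_mbps = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS / 1_000_000
print(f"Raw:        {raw_mbps:,.0f} Mbps")              # ~1,327 Mbps
print(f"Compressed: {STREAM_MBPS} Mbps")
print(f"Ratio:      ~{raw_mbps / STREAM_MBPS:.0f}:1")   # ~133:1
```

Even a conservative setup needs over 100:1 compression, so some artifacting is baked into the whole concept.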

When both PS Now and GP offer a fully downloadable model here at least, with no BS, no VPN, and no US accounts, then I might give one of them a month when I find a game that's not worth the full price/more than $10.

Fair enough. You have to understand though there are lots of people who have different circumstances WRT what's downloadable or not, or if certain services are available or not. Can't always look at it from a purely anecdotal POV.
 

e&e

Banned
On mobile, there should be a Series X logo under the "Cloud" section of games if it's upgraded. Some people are saying games like MLB are upgraded but they don't show the logo, so it's unclear what's what. It's possible Two Point isn't upgraded yet.

@T-Cake see above^^
Oh, that’s what that means? Like this?

n1V1luf.jpg
 

elliot5

Member
Oh, that’s what that means? Like this?

n1V1luf.jpg
That was my impression, but it may be the case that it just means it's XS enhanced. Though some games like MLB didn't have it, IIRC.

Either way, it's a solid bet that it's gonna run very well on xCloud compared to others.
 

FeldMonster

Member
Really, do you have a link to any articles?

Cooling in data centres is extremely important, expensive, and carbon-intensive, so I wouldn't have thought they would choose to raise the temps with faulty hot chips. I'd be interested to read about this. Maybe you misunderstood?
I need to clarify, as my memory was a bit fuzzy at the time. It took me a while to find the source. Unfortunately it is within an IEEE document behind a paywall. I think that it is one of these two.


Fortunately, the important slide was taken and shared. Here is a post with the slide in question for example.

And the slide specifically:
ISSCC2021-3_1-page-033.jpg

The Series X GPU has 56 total compute units (28 sets of 2, to be more accurate). 4 are disabled by default for yield purposes, meaning any chip with between 52 and 56 good compute units can be used in a consumer Series X.

In contrast, the xCloud version of the Series X can have anywhere from 48 to 56 "good" compute units and still be usable. Therefore, a Series X GPU with only 48 or 50 working compute units can only be used in the xCloud version. However, with fewer compute units the TF would be lower, so the frequency must be increased to achieve the 12.1 TF (1.9 GHz for 50 and 1.98 GHz for 48, vs. 1.825 GHz for 52). As a result, however, those chips will run hotter.
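For anyone who wants to sanity-check those clocks, here's a quick back-of-the-envelope script (assuming the standard RDNA 2 figures of 64 shaders per CU and 2 FLOPs per shader per clock, with the TF target derived from the retail 52 CU / 1.825 GHz configuration):

```python
# Back-of-the-envelope check of the clocks quoted above.
# Assumes RDNA 2 CUs: 64 shader ALUs each, 2 FLOPs per ALU per clock (FMA).

SHADERS_PER_CU = 64
FLOPS_PER_SHADER_PER_CLOCK = 2

def required_clock_ghz(active_cus: int, target_tflops: float) -> float:
    """Clock (GHz) needed for `active_cus` to hit `target_tflops`."""
    flops_per_clock = active_cus * SHADERS_PER_CU * FLOPS_PER_SHADER_PER_CLOCK
    return target_tflops * 1000 / flops_per_clock  # GFLOPS / (FLOPs per clock) = GHz

# Target: the retail Series X config (52 CUs @ 1.825 GHz, ~12.15 TF)
TARGET_TF = 52 * SHADERS_PER_CU * FLOPS_PER_SHADER_PER_CLOCK * 1.825 / 1000

for cus in (52, 50, 48):
    print(f"{cus} CUs -> {required_clock_ghz(cus, TARGET_TF):.3f} GHz")
# 52 CUs -> 1.825 GHz
# 50 CUs -> 1.898 GHz
# 48 CUs -> 1.977 GHz
```

The quoted 1.9 and 1.98 GHz figures fall straight out of holding the TFLOPS constant while dropping CUs.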

Anandtech article (though it barely touches on the topic): https://www.anandtech.com/show/16489/xbox-series-x-soc-power-thermal-and-yield-tradeoffs

I couldn't find the video of this presentation, but perhaps someone better at sleuthing than I can find it.
 

elliot5

Member
In other words, they are using Xbox Series X|S consoles as actual servers?
Yes, but not in the console form factor. Specifically XSX, though one game showed Series S (I think it was No Man's Sky), so idk if they also have XSS blades or if it's just the XSX running the XSS profile.
 
Does anyone have a pic of the Series X blade server or boards or SoC? I'm curious what sort of size and hardware revision they went with.

EDIT: Never mind, think I found one.

Xbox-Series-X-xCloud-9.jpg
 

01011001

Banned
In other words, they are using Xbox Series X|S consoles as actual servers?

Yes. These can either run 1 Xbox Series X instance or 4 Xbox One S instances at once, meaning if a game doesn't have any One X or Series X improvements, they will launch a virtualized Xbox One S, and a single blade can handle 4 of those.
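As a toy illustration of what that 1-or-4 split means for capacity (the session counts are made-up numbers, not anything MS has published):

```python
# Toy capacity model: a blade hosts either 1 Series X instance or
# 4 virtualized One S instances. Session counts are hypothetical.
import math

def blades_needed(xsx_sessions: int, one_s_sessions: int) -> int:
    """Minimum blades, assuming each blade runs only one mode at a time."""
    return xsx_sessions + math.ceil(one_s_sessions / 4)

print(blades_needed(100, 100))  # 100 XSX blades + 25 One S blades = 125
```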
 

elliot5

Member
Does anyone have a pic of the Series X blade server or boards or SoC? I'm curious what sort of size and hardware revision they went with.

EDIT: Never mind, think I found one.

Xbox-Series-X-xCloud-9.jpg
Yeah, that pic is the old One S blade, IIRC. I don't think any XSX blade pics have been shown, but I would imagine it looks similar.
 

Redlight

Member
I don't understand why anyone would want this. I don't want to see any compression, and I don't want to experience even a hint of latency. I also don't want my basic game performance to be dependent on network latency/bandwidth. Especially after some of my recent game experiences ...

I just don't get this from a quality standpoint. I DO get it from a standpoint of wanting to try and increase their subscriber count without selling hardware but ... to what end? It's like selling frozen TV dinners and saying "No no, this is REAL steak we promise!"

And yeah I've tried recent streaming services. IMO you can still definitely tell. And I have low latency fiber gigabit at home.

I don't personally want to play console games on my phone. Or a laptop. Or any other kind of device with hampered inputs, bad sound, and small screens. I like playing on my large tv, dolby atmos capable surround, and on a comfortable couch, with no lag and no compromises. I can't imagine why anyone would want to play something like Forza, Starfield, or Flight Sim on a phone through a browser with a compressed video stream and noticeable latency. Saying it's "no worse than a 1080p Twitch stream" isn't exactly selling it either.

Also this whole thing is kind of ironic when MS actually used the phrase "uncompressed pixels" to market the One X.
This is all 'I don't like', 'I don't want', 'I will not'. If you don't want it, don't use it, why would anyone else GAF?

I'm sure there'll be a lot of people in the world with great internet and an Xbox One or old PC who will be very keen to play Starfield on their old-ass hardware.
 

CeeJay

Member
I need to clarify, as my memory was a bit fuzzy at the time. It took me a while to find the source. Unfortunately it is within an IEEE document behind a paywall. I think that it is one of these two.


Fortunately, the important slide was taken and shared. Here is a post with the slide in question for example.

And the slide specifically:
ISSCC2021-3_1-page-033.jpg

The Series X GPU has 56 total compute units (28 sets of 2, to be more accurate). 4 are disabled by default for yield purposes, meaning any chip with between 52 and 56 good compute units can be used in a consumer Series X.

In contrast, the xCloud version of the Series X can have anywhere from 48 to 56 "good" compute units and still be usable. Therefore, a Series X GPU with only 48 or 50 working compute units can only be used in the xCloud version. However, with fewer compute units the TF would be lower, so the frequency must be increased to achieve the 12.1 TF (1.9 GHz for 50 and 1.98 GHz for 48, vs. 1.825 GHz for 52). As a result, however, those chips will run hotter.

Anandtech article (though it barely touches on the topic): https://www.anandtech.com/show/16489/xbox-series-x-soc-power-thermal-and-yield-tradeoffs

I couldn't find the video of this presentation, but perhaps someone better at sleuthing than I can find it.
Thanks for taking the time to find these articles, it's an interesting read.

I suppose it makes sense to some degree that these blades can run on a couple fewer WGPs, as they don't have to worry about running some of the background stuff like the full dashboard and can give closer to 100% of their resources to running the game.

Is the bolded an assumption from the slide or is there some evidence of this frequency increase?

Heat is the enemy of the datacentre far more than of consumer units (way more processors packed into a smaller space), and the Series X had to have a design entirely aimed at getting rid of that heat. It just seems at odds with my understanding of datacentres that they would choose to increase the frequency of these C-tier chips just so they could use them. Microsoft, however, do seem to be at the bleeding edge of datacentre design, so maybe they have factored a higher thermal envelope into the server provisioning than the consumer units, specifically so they can use the lower-yield chips and throw a lot less silicon in the bin.
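For a rough sense of how much extra heat that clock bump could mean: dynamic power scales roughly as P ∝ C·V²·f, so a lot depends on whether voltage has to rise with the clock (a back-of-the-envelope sketch; the voltage behaviour here is my assumption, not something from the slide):

```python
# Rule-of-thumb dynamic power scaling: P ~ C * V^2 * f.
# Best case: voltage stays flat, so power scales ~linearly with f.
# Rough worst case: voltage scales ~linearly with f, so power ~ f^3.

f_consumer = 1.825  # GHz, retail Series X (52 CUs)
f_server = 1.980    # GHz, the quoted 48-CU xCloud bin

ratio = f_server / f_consumer
print(f"Same voltage (P ~ f):    +{(ratio - 1) * 100:.0f}% dynamic power")        # ~+8%
print(f"Voltage rises (P ~ f^3): +{(ratio ** 3 - 1) * 100:.0f}% dynamic power")   # ~+28%
```

Either way it's not free, which is presumably why the higher-clocked bins only make sense where Microsoft controls the cooling.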
 