
Will PS6 and Xbox Series Next use ARM based custom-SoCs?

winjer

Member
Seems like they do, but x86 still has some other advantages. In addition to easing ports between console and PC, x86 instructions are reportedly less memory-intensive than ARM ones, from what I've read. And RAM prices aren't necessarily going down, while module capacities seem to have been stuck at 2 GB for the longest time.

So for running certain complex code that games may eventually end up doing, do they trade off CPU performance for a bit more memory and maybe lower power draw? Or do they stick with more complex CPUs if it means needing less RAM for equivalent tasks? It's not like the CPUs in the new consoles eat that much power anyway; they're based on mobile Zen 2 CPUs, IIRC.

x86's main advantage is its legacy. It has been used for decades for making and playing games.

But it's not the best CPU arch out there. In fact, Jim Keller said in an interview last year that x86 is the worst ISA on the planet.
Considering his experience leading design teams in x86 and ARM, at Apple, AMD, Tesla and Intel, he is one of the best people to ask.
But then again, he also said that the ISA now plays a small role in designing modern CPU architectures. So x86 would not prevent AMD or Intel from making faster and more efficient cores.

This ARM vs x86 efficiency thing has been overblown, especially since Apple released the M1.
But that efficiency had a lot to do with Apple's access to the most advanced process node, TSMC's 5nm.
AMD could only use a 7nm process node. And Intel couldn't produce anything similar in its fabs.
 

Blendernaut

Neo Member
Nope, as they will try to keep backwards compatibility, and it's also much easier to build a console based on the previous one than to create completely new stuff from scratch. I don't think the ARM thing makes much sense in this case, to be honest.
 
No. They would need to either design a CPU based on ARM's then-latest ISA, or use a stock core. Then they'd need to either design or license a GPU architecture.

Assuming they're still willing, getting AMD to design a new SoC based on their latest iterations of Zen and RDNA would be far easier.
 

Drew1440

Member
AMD has been trying to develop their own ARM core for a few years now, the AMD K12. And Intel has looked at the prospect too (and even had their own, the XScale). At some point they will have to switch, but who knows what will happen since RISC-V is becoming more prevalent.
Nvidia had an interesting ARM core using technology from Transmeta's earlier Crusoe processors. The early Denver cores could decode ARM and x86 instructions on the fly, but had to abandon x86 due to licensing issues with Intel. Should this be used in a console, it could pave the way to x86 backwards compatibility, to an extent.

Either way, next gen could be interesting: with Intel's investment into their discrete graphics, they may be able to compete with AMD APUs.
Doubtful. The PS360 generation was RISC-based and developers complained about it. They'll stay x64 to make it easier to work across PC and console.
The issue was that the Xenon and the Cell PPU were in-order designs at a time when x86 and PowerPC G3/4/5 processors were out-of-order. Aside from that, PowerPC was quite an efficient arch before ARM outclassed it. It still has a presence in the high-end server market with the IBM POWER series, but that's more to do with backwards compatibility with existing software.
 

KungFucius

Member
I'm going with no. They already have relationships with AMD and have gone through 3 (2.5?) development cycles with them for the hardware. To do custom ARM designs they would have to transition to in-house design teams, which means hiring, training, infrastructure, etc. MS could do this more readily, but they have an additional advantage in using PC-like hardware: Windows gaming. The main advantage of ARM is power consumption. That is not a driving consideration for consoles that plug into the wall.
 

ReBurn

Gold Member
AMD has been trying to develop their own ARM core for a few years now, the AMD K12. And Intel has looked at the prospect too (and even had their own, the XScale). At some point they will have to switch, but who knows what will happen since RISC-V is becoming more prevalent.
Nvidia had an interesting ARM core using technology from Transmeta's earlier Crusoe processors. The early Denver cores could decode ARM and x86 instructions on the fly, but had to abandon x86 due to licensing issues with Intel. Should this be used in a console, it could pave the way to x86 backwards compatibility, to an extent.

Either way, next gen could be interesting: with Intel's investment into their discrete graphics, they may be able to compete with AMD APUs.

The issue was that the Xenon and the Cell PPU were in-order designs at a time when x86 and PowerPC G3/4/5 processors were out-of-order. Aside from that, PowerPC was quite an efficient arch before ARM outclassed it. It still has a presence in the high-end server market with the IBM POWER series, but that's more to do with backwards compatibility with existing software.
Of course PowerPC is still used in the server market. But I wasn't talking about PowerPC or servers. I was talking about RISC, which includes ARM, so I don't know how that's relevant to this conversation. We know why Xbox and PlayStation migrated from RISC architectures to x86-64. It's very well documented.
 

Amaranty

Member
Microsoft might not release any more console hardware.
Perhaps a small box (or HDMI stick) that runs a customized Edge browser for accessing XCloud games.
Maybe something that could include ML hardware to improve the quality of the streamed frames, thus lowering bandwidth requirements.

Even if they release a full-bore Xbox Series XX, you have to know that the above plan is simmering somewhere deep in the MS labs for future deployment.
I seriously doubt that. Poor internet connections and bandwidth caps play a big part. Another reason is that cloud gaming is simply not available in every country (even first-world countries).
 

truth411

Member
No.
The next Xbox and probably PS6 are already being developed, so too soon for that.
The gen after that though? All bets are off.
More like the PS5 Pro is being developed. I doubt the PS6 would use ARM; more likely Zen 8 or whatever the equivalent is at the time, which helps backwards compatibility. Backwards compatibility will be crucial for them; expect cross-gen games to last even longer next gen.
 

Bo_Hazem

Gold Dealer
I think so, ARM seems like the natural migration. It could also make them more hybrid, like the Switch, by that time, if we get down to 1nm or smaller.
 
I don't think they will go for ARM unless there is some mobile factor involved. They are trying to push 8K UHDTVs, so I'm thinking most people will want to game on their 8K UHDTVs by the time the PS6 (or PS5 Pro) and Xbox Next (or Xbox Series X Pro) come out. x86 is going to be here for a while.
 
Maybe. More likely a future Nintendo console will get an ARM chip, assuming the Nvidia takeover completes. If ARM is ever to go into PlayStation/Xbox, then Nvidia needs to offer more money than AMD, which won't be a problem, and offer a better product. Yeah, Apple is doing an amazing job with their M1 lineup, but that's Apple's product. If Nvidia wants to offer something at that level they need to make it themselves. While I do believe Nvidia can make something special, I don't know if it will match Apple.
 
What? Is this "console gaming is dead" again? PS5 and XSX sales seem to indicate otherwise.
A paradigm shift is unpredictable, but unless the hardware is eventually built into TVs, or TVs come without inputs and the companies need to develop streaming apps for all of them, I don't see a world where there are no consoles (a PC plugged into the TV is not a proper replacement; the experience is just not there).
 

SomeGit

Member
Doubtful. The PS360 generation was RISC-based and developers complained about it. They'll stay x64 to make it easier to work across PC and console.

They didn't complain about it being RISC; they complained about multi-threading and the Cell SPEs being difficult to work with, because they weren't traditional cores.
Nobody develops in assembly anymore, so the actual differences between developing a game for an x86-64 CPU and for an ARM CPU are minimal at best.
 

RoadHazard

Member
A paradigm shift is unpredictable, but unless the hardware is eventually built into TVs, or TVs come without inputs and the companies need to develop streaming apps for all of them, I don't see a world where there are no consoles (a PC plugged into the TV is not a proper replacement; the experience is just not there).

Yeah, I don't really see TVs coming with console-level computing hardware built in anytime soon. Or ever. I mean, modern TVs can already run games more advanced than a few console generations back, just like phones and tablets, but they will always lag far behind actual consoles. I think this really only makes sense with streaming, not locally rendered games. And I hope local rendering is here to stay for at least a few more generations.
 
Future EU power regulations for home consoles might force them to it?
Even if they go the Stadia route, it's more energy-efficient, and therefore cheaper, to use ARM.
 

GymWolf

Gold Member
Hopefully yeah, so they are gonna be forced to leave cross-gen shit behind, if what people say about BC is true.

The PS5 being so similar to the PS4 was what fucked us this gen.
 

Dream-Knife

Member
No, they won't. They went to x86 for commonality. ARM only makes sense for a portable device that is worried about total power draw. Apple went to ARM for efficiency and architecture commonality, as the iPhone is their main market.

If x86 emulation gets amazing in the next 2 years then maybe, but I doubt it.

Maybe. More likely a future Nintendo console will get an ARM chip, assuming the Nvidia takeover completes. If ARM is ever to go into PlayStation/Xbox, then Nvidia needs to offer more money than AMD, which won't be a problem, and offer a better product. Yeah, Apple is doing an amazing job with their M1 lineup, but that's Apple's product. If Nvidia wants to offer something at that level they need to make it themselves. While I do believe Nvidia can make something special, I don't know if it will match Apple.
Tegra is an ARM chip.
 

Larogue

Member
No, they are weak-ass SoCs made for low-power mobile devices.

Those numbers you see in Geekbench are boosted by accelerators, aka co-processors (custom JavaScript accelerator, video render accelerator, machine learning accelerator, etc.), to cheat the test; not everything is run on the actual CPU.

In pure unassisted CPU performance they suck, hence the trash scores in Cinebench.

Plus most of them are on 5nm nodes, while x86 processors are still on 7/10nm, which puts x86 at a disadvantage in performance per watt and thermals.

By the end of this year both AMD & Intel will switch to 5/4nm nodes, and you will be reminded how far ahead x86 processors are compared to ARM when they both share the same process node.
 

Larogue

Member
ARM cores are a bit more efficient, because they don't have to maintain a large set of old instructions. This means more die space and power spent on microcode and on the decode stage.
For companies that use old software and need those instructions, Intel and AMD can't remove them. So they must be kept on the PC side.
But in a console, most of these old instructions could be removed, since there's no need for compatibility with very old software.
Wrong.
A reduced instruction set isn't an advantage. The CPU needs more cycles to emulate the instructions that are not available.
 

winjer

Member
Wrong.
A reduced instruction set isn't an advantage. The CPU needs more cycles to emulate the instructions that are not available.

I suppose that when you say "reduced instruction sets", you are not referring to RISC-type ISAs, but rather to having fewer specific instructions.
You do realize that x86 has lots of old, deprecated instruction sets that are obsolete and no longer used by modern programs,
but they have to be maintained in microcode so that companies can still use software from several decades ago.
This is what I'm referring to. And x86 has a ton of dead ends with instruction sets.
 

PaintTinJr

Member
Despite synthetic benchmarks, I don't believe the latency matches up when talking about game performance.

I was playing Super Mario 3D World (from Bowser's Fury) in co-op tonight with my youngest. I made a critical jump while her character (Rosalina) was putting on a costume, and the game framed out/stalled for a few frames; my character didn't execute the animation correctly and then died. The ARM in the Switch can't match the PPC chip in the Wii U at gameplay, and I've experienced similar multiplayer issues with MK8 Deluxe on Switch, while both these games are flawless on the Wii U in multiplayer. So I would say no.

Until such time as CPUs in consoles don't need to improve and ARM can match that level of performance at a competitive price, I don't see a reason for either to move away from x64.
 

Larogue

Member
I suppose that when you say "reduced instruction sets", you are not referring to RISC-type ISAs, but rather to having fewer specific instructions.
You do realize that x86 has lots of old, deprecated instruction sets that are obsolete and no longer used by modern programs,
but they have to be maintained in microcode so that companies can still use software from several decades ago.
This is what I'm referring to. And x86 has a ton of dead ends with instruction sets.
This is a flawed argument; I've heard it a lot from ARM fanboys on the internet.

Instruction set cost has not been particularly relevant to system cost in 20 years, because the silicon cost of instruction microcode translation has been rendered negligible by Moore's law.
Unless you're building a microprocessor that sells for <$1, instruction set cost is basically irrelevant.
However, lacking an instruction when the program/game needs it leads the CPU to run more cycles to emulate it, negatively impacting performance.
 

winjer

Member
This is a flawed argument; I've heard it a lot from ARM fanboys on the internet.

Instruction set cost has not been particularly relevant to system cost in 20 years, because the silicon cost of instruction microcode translation has been rendered negligible by Moore's law.
Unless you're building a microprocessor that sells for <$1, instruction set cost is basically irrelevant.
However, lacking an instruction when the program/game needs it leads the CPU to run more cycles to emulate it, negatively impacting performance.

True, when microcode was implemented in the P6, it occupied almost a third of the die space. Today, with much smaller process nodes, it's very tiny.
But mind you that x86 still has to have a beefier decode stage, due to translating CISC instructions into RISC-like micro-ops.
This results in very small differences versus ARM; like I said above, "a bit more efficient".

You might have also read one of my previous posts, on this very page, where I go into more detail and basically say something very similar to what you are saying.

But it's not the best CPU arch out there. In fact, Jim Keller said in an interview last year that x86 is the worst ISA on the planet.
Considering his experience leading design teams in x86 and ARM, at Apple, AMD, Tesla and Intel, he is one of the best people to ask.
But then again, he also said that the ISA now plays a small role in designing modern CPU architectures. So x86 would not prevent AMD or Intel from making faster and more efficient cores.

This ARM vs x86 efficiency thing has been overblown, especially since Apple released the M1.
But that efficiency had a lot to do with Apple's access to the most advanced process node, TSMC's 5nm.
AMD could only use a 7nm process node. And Intel couldn't produce anything similar in its fabs.
 

Larogue

Member
True, when microcode was implemented in the P6, it occupied almost a third of the die space. Today, with much smaller process nodes, it's very tiny.
But mind you that x86 still has to have a beefier decode stage, due to translating CISC instructions into RISC-like micro-ops.
This results in very small differences versus ARM; like I said above, "a bit more efficient".

You might have also read one of my previous posts, on this very page, where I go into more detail and basically say something very similar to what you are saying.
x86's main advantage is its legacy. It has been used for decades for making and playing games.

But it's not the best CPU arch out there. In fact, Jim Keller said in an interview last year that x86 is the worst ISA on the planet.
Considering his experience leading design teams in x86 and ARM, at Apple, AMD, Tesla and Intel, he is one of the best people to ask.
But then again, he also said that the ISA now plays a small role in designing modern CPU architectures. So x86 would not prevent AMD or Intel from making faster and more efficient cores.

This ARM vs x86 efficiency thing has been overblown, especially since Apple released the M1.
But that efficiency had a lot to do with Apple's access to the most advanced process node, TSMC's 5nm.
AMD could only use a 7nm process node. And Intel couldn't produce anything similar in its fabs.

Totally agree with that, and I have seen that Jim Keller interview too. Great insight.
 

Tams

Member
AMD has been trying to develop their own ARM core for a few years now, the AMD K12. And Intel has looked at the prospect too (and even had their own, the XScale). At some point they will have to switch, but who knows what will happen since RISC-V is becoming more prevalent.
Nvidia had an interesting ARM core using technology from Transmeta's earlier Crusoe processors. The early Denver cores could decode ARM and x86 instructions on the fly, but had to abandon x86 due to licensing issues with Intel. Should this be used in a console, it could pave the way to x86 backwards compatibility, to an extent.

Either way, next gen could be interesting: with Intel's investment into their discrete graphics, they may be able to compete with AMD APUs.

The issue was that the Xenon and the Cell PPU were in-order designs at a time when x86 and PowerPC G3/4/5 processors were out-of-order. Aside from that, PowerPC was quite an efficient arch before ARM outclassed it. It still has a presence in the high-end server market with the IBM POWER series, but that's more to do with backwards compatibility with existing software.
AMD's Arm work has only ever really been a backup. And since Zen came out really well, it's been increasingly sidelined.

Perhaps some of it is the sunk cost fallacy, but as long as Zen performs competitively, there's no reason to dump it.
 

Tams

Member
Maybe. More likely a future Nintendo console will get an ARM chip, assuming the Nvidia takeover completes. If ARM is ever to go into PlayStation/Xbox, then Nvidia needs to offer more money than AMD, which won't be a problem, and offer a better product. Yeah, Apple is doing an amazing job with their M1 lineup, but that's Apple's product. If Nvidia wants to offer something at that level they need to make it themselves. While I do believe Nvidia can make something special, I don't know if it will match Apple.
Everybody fucking hates Nvidia. Sony hate them. Microsoft hate them. Apple hate them (this one runs deep). Nvidia have tried to rip them all off. And if the Arm deal goes through (it won't), even more companies will hate them.

Hell, even Nintendo don't have a great relationship with them. The Switch using Tegra happened only because it was expedient and beneficial to both companies: Nvidia had Tegras they couldn't get rid of or whose R&D expenditure they couldn't otherwise utilise, and Nintendo needed a cheap but decent-enough SoC (and tablet reference design). Nintendo already had experience working with Arm, going back to the Game Boy Advance, so they could very easily jump to any of the other SoC providers.
 

ParaSeoul

Member
Do you live in a cave?

Nvidia are not going to be buying Arm. The fact that just a couple of days ago they played the "woe is me" card shows how desperate they have gotten over it.
Not talking about that; I'm referring to the fact that someone already asked if next-gen consoles would switch to ARM last year. Funnily enough, in January too.
 

Amiga

Member
ARM vs x86 is just about the CPU end; the bulk of gaming will always be about the GPU.

And can ARM run gaming threads faster than x86? If it can, then things will change, but not before.
 

Arioco

Member
Everybody fucking hates Nvidia. Sony hate them. Microsoft hate them. Apple hate them (this one runs deep). Nvidia have tried to rip them all off. And if the Arm deal goes through (it won't), even more companies will hate them.

Hell, even Nintendo don't have a great relationship with them. The Switch using Tegra happened only because it was expedient and beneficial to both companies: Nvidia had Tegras they couldn't get rid of or whose R&D expenditure they couldn't otherwise utilise, and Nintendo needed a cheap but decent-enough SoC (and tablet reference design). Nintendo already had experience working with Arm, going back to the Game Boy Advance, so they could very easily jump to any of the other SoC providers.


Yup, apparently every time a console manufacturer has chosen Nvidia, things ended up... badly. For several reasons. We have examples like the original Xbox or the PS3.

Unlike AMD, it seems like Nvidia is not very cooperative or willing to accommodate their clients. A very strange policy, if you ask me. It's almost like Nvidia is not interested in the console market at all. They must have their reasons, I guess, but they're beyond me. I mean, it's like 150-200 million processors every new gen; who would say no to that? AMD must be very happy with Nvidia's attitude and policies.
 

Tams

Member
Yup, apparently every time a console manufacturer has chosen Nvidia, things ended up... badly. For several reasons. We have examples like the original Xbox or the PS3.

Unlike AMD, it seems like Nvidia is not very cooperative or willing to accommodate their clients. A very strange policy, if you ask me. It's almost like Nvidia is not interested in the console market at all. They must have their reasons, I guess, but they're beyond me. I mean, it's like 150-200 million processors every new gen; who would say no to that? AMD must be very happy with Nvidia's attitude and policies.
I think it's a mix of hubris, greed, and a bit of laziness.

Why would you even question dear Jensen's work? It doesn't need improvement or alteration (from the likes of you mere console makers)! They'd still like the money, though; just do it their way.

To them, though, consoles would be a tiny percentage of their revenue, so they were likely never going to be very willing to compromise on price or the amount of custom work. AMD really needed the money, so they were much more accommodating. They don't need the console money anywhere near as much these days, but it is still a healthy revenue stream (I'm sure some there wish they could use the wafer allocation consoles take up for other products, but they made agreements that helped save them). And if they can keep good relations, it could be a tidy amount over several generations.

I have no inside knowledge of what actually went down between Nintendo and Nvidia on the Switch, but I suspect that Nintendo not really needing any alterations to Tegra, and the Nvidia Tegra team quite desperately looking to justify their existence and cost, may be why that worked out.
 