
The PS3's Cell made Sony's first party what they are today.

DonkeyPunchJr

World’s Biggest Weeb
Don't know why Sony opted for Nvidia. Rumours said an external GPU wasn't even expected on PS3, just a derivative of the old PS2 GPU. Seems first parties stopped this craziness almost at the last minute.
Absolutely it was their “plan B” when their multi-Cell idea didn’t pan out. Guessing Sony wasn’t in a position to be choosy. They ended up with a gimped 7900GT at around the same time Nvidia was launching their console-slaughtering 8000 series
 

assurdum

Banned
The bold part is the opposite philosophy behind the ps5 design.
You have completely misunderstood the philosophy of the PS5 design. Cerny never claimed a faster GPU is better than having more core units. Never. He simply observed that having multiple cores that can't squeeze the most out of the hardware is a waste, leaving too much performance on the table and causing a lot of inefficiencies. The semantics are almost the opposite: hardware that balances frequency and core count is still better than whichever is simply faster.
 
Last edited:

Clear

CliffyB's Cock Holster
The rise of UE3 was a body-blow to the PS3. Stock UE3 runs at around half the speed on PS3 compared to 360 purely because as an engine it was just unsuited to the hardware. Especially the memory configuration.

It really was square peg vs round hole, and Epic didn't really have much impetus to change things on their end and as a result it was left to the development community to create workarounds and modifications specific to PS3.
 
I would argue that they became what they are today in spite of the Cell. Development was more challenging on the tech side, and they made the style of games they would have made regardless of the tech behind it.
 

Doczu

Member
The Saturn also had parallel coding way before the norm too, that's one of the reasons it was difficult to code for. You're making negatives into positives on a similar environment, just because of the platform. Both machines had developers that overcame those challenges and made great software, but not because the Cell is unique.
The difference is Saturn did it in an age where no one knew how to do it, and the PS3 did it in a time when this was slowly becoming the norm 🤷🏻‍♂️
More and more devs were getting accustomed to parallel coding in the mid-to-late 00's, not just for the PS3. In the Saturn era, how many other consoles/PCs used parallel coding?
 

PaintTinJr

Member
Regardless of being power efficient or not, the goal of the Cell was not the same as Alder Lake's.
The Cell is in no way an ancestor to Alder Lake. It was just a complex and expensive FPU unit.
If Alder Lake is a hUMA processor - the Cell BE is officially the first, and it kickstarted that whole CUDA initiative with IBM and about 20 other companies IIRC - then it certainly is, actually or semantically, a descendant of the CELL BE design principles.

How are the SPUs - in any reasonable way, after reading the documentation/SDK - just complex and expensive FPU units?
I assumed you were technical enough to know that they don't run enslaved to the PPU - unless that is the software intention - they can run as one group of 8, or eight groups of 1, or any combination, and can work autonomously of the PPU - even if it crashed - once kicked off by the PPU or another SPU, and they can run general purpose POWER code - as shown in the SDK examples/documentation. If that is just an FPU/ASIC to you, then I don't know what else to say... the CELL BE wasn't some advertising buzzword for PlayStation but a peer-reviewed, commercially available product in the wider world of computing, with documentation that doesn't match any of the assertions you are making.
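(For anyone who wants to see what "kicked off by the PPU, then running on their own" looked like in practice, below is a minimal, hedged PPE-side sketch using the Cell SDK's libspe2 API - spe_context_create, spe_program_load, spe_context_run. The spu_kernel program handle, the six-SPE count and the NULL argp are illustrative assumptions, not anything taken from this thread or from Sony's samples.)

```c
#include <libspe2.h>
#include <pthread.h>
#include <stdio.h>
#include <stdlib.h>

/* spu_kernel is a hypothetical embedded SPU program handle; in a real build
   it would come from an spu-gcc-compiled binary embedded with ppu-embedspu. */
extern spe_program_handle_t spu_kernel;

#define NUM_SPES 6  /* assumption: the six SPEs exposed to PS3 applications */

typedef struct {
    spe_context_ptr_t ctx;
    void *argp;              /* value handed to the SPU program at launch */
} spe_thread_arg;

static void *run_spe(void *p)
{
    spe_thread_arg *a = p;
    unsigned int entry = SPE_DEFAULT_ENTRY;
    spe_stop_info_t stop;

    /* Blocks in this pthread while the SPU runs on its own out of local store;
       the PPE thread is only a launcher here, it does not babysit execution. */
    if (spe_context_run(a->ctx, &entry, 0, a->argp, NULL, &stop) < 0)
        perror("spe_context_run");
    return NULL;
}

int main(void)
{
    pthread_t tid[NUM_SPES];
    spe_thread_arg args[NUM_SPES];

    for (int i = 0; i < NUM_SPES; i++) {
        args[i].ctx  = spe_context_create(0, NULL);
        args[i].argp = NULL;  /* illustrative: a real program passes work here */
        if (!args[i].ctx || spe_program_load(args[i].ctx, &spu_kernel) < 0) {
            fprintf(stderr, "failed to set up SPE %d\n", i);
            exit(1);
        }
        pthread_create(&tid[i], NULL, run_spe, &args[i]);
    }
    for (int i = 0; i < NUM_SPES; i++) {
        pthread_join(tid[i], NULL);
        spe_context_destroy(args[i].ctx);
    }
    return 0;
}
```

Once spe_context_run is entered, the SPU executes out of its own local store and the PPE thread just waits for it to stop, which is the "autonomous once kicked off" behaviour being described above.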
 

Shane89

Member
To all the guys saying PS3 production cost was too high because of the Cell: well, no, it wasn't. It was due to the Blu-ray. And it's thanks to that that you don't have a crappy HD DVD-equipped PS5 and XSX now.
 
Last edited:

winjer

Gold Member
If Alder Lake is a hUMA processor - the Cell BE is officially the first, and it kickstarted that whole CUDA initiative with IBM and about 20 other companies IIRC - then it certainly is, actually or semantically, a descendant of the CELL BE design principles.

How are the SPUs - in any reasonable way, after reading the documentation/SDK - just complex and expensive FPU units?
I assumed you were technical enough to know that they don't run enslaved to the PPU - unless that is the software intention - they can run as one group of 8, or eight groups of 1, or any combination, and can work autonomously of the PPU - even if it crashed - once kicked off by the PPU or another SPU, and they can run general purpose POWER code - as shown in the SDK examples/documentation. If that is just an FPU/ASIC to you, then I don't know what else to say... the CELL BE wasn't some advertising buzzword for PlayStation but a peer-reviewed, commercially available product in the wider world of computing, with documentation that doesn't match any of the assertions you are making.

You do realize that hUMA stands for Heterogeneous Uniform Memory Access? And neither Cell nor Alder Lake uses it.
You are probably mixing things up with heterogeneous CPUs or heterogeneous computing. These are concepts that precede Cell.
Even the PS2's Emotion Engine is a heterogeneous computing system, as it has vector and media coprocessors.

The SPEs can't even access memory directly, they have to fetch through the PPE.
And the SPE doesn't even have a branch predictor, something essential in CPUs. But since the SPE's function is just to compute as much FP in parallel as possible, it doesn't matter that much. Just like on a GPU.
The SPE is more akin to a compute unit than to a CPU core.

Alder Lake is copying ARM's big.LITTLE heterogeneous arch. Not the Cell.
 
Last edited:

DonkeyPunchJr

World’s Biggest Weeb
To all the guys saying PS3 production cost was too high because of the Cell: well, no, it wasn't. It was due to the Blu-ray. And it's thanks to that that you don't have a crappy HD DVD-equipped PS5 and XSX now.
It was both. According to this the BD-ROM drive was estimated at $200-$300 at launch, while the Cell was $150-$230. And keep in mind the Cell cost $400 million to develop.

 

PaintTinJr

Member
You do realize that hUMA stands for Heterogeneous Uniform Memory Access? And neither Cell nor Alder Lake uses it.
You are probably mixing things up with heterogeneous CPUs or heterogeneous computing. These are concepts that precede Cell.
Even the PS2's Emotion Engine is a heterogeneous computing system, as it has vector and media coprocessors.

The SPEs can't even access memory directly, they have to fetch through the PPE.
And the SPE doesn't even have a branch predictor, something essential in CPUs. But since the SPE's function is just to compute as much FP in parallel as possible, it doesn't matter that much. Just like on a GPU.
The SPE is more akin to a compute unit than to a CPU core.

Alder Lake is copying ARM's big.LITTLE heterogeneous arch. Not the Cell.
No, that's not correct either. The SPUs or groups of SPUs can access the unified memory (the XDR) via the EIB (the Element Interconnect Bus) with the same priority mechanism as the PPU (because the EIB is a ring bus), and in the Sony Zego systems (2x Cell BEs, 1 or 2GB XDR and a GPU) each CELL BE heterogeneous compute unit could access the unified memory via the EIB in a deterministic way.

Whether they do branch prediction well or not is purely a design choice. The E cores are weak at branch prediction compared to the P cores too, not 30x weaker (IIRC) like the SPUs were compared to the PPU, but the same heterogeneous design ideas are in use nonetheless. The SPUs could also run integer computation just fine, so long as their code was designed for pre-determined if/else path execution. But rather than argue over the 2005 Cell BE documentation with me paraphrasing it, you'd be better off reading it yourself.
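(To make the "pre-determined if/else path execution" point a bit more concrete, here is a small branch-free sketch in plain C. It's only an illustration of the coding style that suits an in-order core with no branch predictor; real SPU code would normally use the SDK's vector intrinsics such as spu_sel rather than scalar masking, so treat the details as assumptions, not thread material.)

```c
#include <stdint.h>
#include <stdio.h>

/* Branchless select: result = cond ? a : b, with no data-dependent branch.
   On an in-order core without branch prediction, this pattern keeps the
   pipeline full; the hardware analogue is a compare + select instruction
   rather than an if/else jump. */
static uint32_t select_u32(uint32_t cond, uint32_t a, uint32_t b)
{
    uint32_t mask = (uint32_t)0 - (cond != 0); /* all-ones if cond, else zero */
    return (a & mask) | (b & ~mask);
}

/* Clamp an array to a threshold without branching per element. */
static void clamp_branchless(uint32_t *data, size_t n, uint32_t limit)
{
    for (size_t i = 0; i < n; i++)
        data[i] = select_u32(data[i] > limit, limit, data[i]);
}

int main(void)
{
    uint32_t v[] = { 3, 250, 17, 99, 1024 };
    clamp_branchless(v, sizeof v / sizeof v[0], 100);
    for (size_t i = 0; i < 5; i++)
        printf("%u ", v[i]);
    printf("\n");  /* prints: 3 100 17 99 100 */
    return 0;
}
```

The point is simply that the "if" is resolved with arithmetic instead of a data-dependent jump, so the pipeline never has to guess.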
 

StreetsofBeige

Gold Member
Cell didn't make those solid PS3 first party games. Their studios did. I'd say BR discs contributed more to the games than Cell, as it gave them the storage to do all their cinematics, which I guess would have been maxed out on DVD.

360 multiplats usually fared better.

Just imagine how good PS3 first party games would be if they had BR + the 360's specs, which included an extra 10MB of eDRAM that was a big factor in better performance.

You’d have a best of both worlds.
 
Last edited:

Shane89

Member
It was both. According to this the BD-ROM drive was estimated at $200-$300 at launch, while the Cell was $150-$230. And keep in mind the Cell cost $400 million to develop.

Exactly, remove the BD drive and you could have had a $200 cheaper console.

Anyway, they took the risk, and they succeeded.
 
Last edited:

spons

Gold Member
To all the guys saying PS3 production cost was too high because of the Cell: well, no, it wasn't. It was due to the Blu-ray. And it's thanks to that that you don't have a crappy HD DVD-equipped PS5 and XSX now.
What's crap about HD-DVD though? It was region free, and at the time I hoped it would prevail for film.
 

winjer

Gold Member
No, that's not correct either. The SPUs or groups of SPUs can access the unified memory (the XDR) via the EIB (the Element Interconnect Bus) with the same priority mechanism as the PPU (because the EIB is a ring bus), and in the Sony Zego systems (2x Cell BEs, 1 or 2GB XDR and a GPU) each CELL BE heterogeneous compute unit could access the unified memory via the EIB in a deterministic way.

Whether they do branch prediction well or not is purely a design choice. The E cores are weak at branch prediction compared to the P cores too, not 30x weaker (IIRC) like the SPUs were compared to the PPU, but the same heterogeneous design ideas are in use nonetheless. The SPUs could also run integer computation just fine, so long as their code was designed for pre-determined if/else path execution. But rather than argue over the 2005 Cell BE documentation with me paraphrasing it, you'd be better off reading it yourself.

The SPU had to pass through the MFC in the PPE.

The SPEs don't do any branch prediction. They are totally dependent on the software compiler.
And mind you, branch prediction is basically the only reason we still have CPUs today. It's the only thing they are great at.
Otherwise, we would have just a bunch of parallel units crunching data.
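(A quick illustration of what "totally dependent on the software compiler" meant in practice: with no predictor, the programmer and compiler supply static hints and lay the code out so the common path falls through. The sketch below uses GCC's __builtin_expect; whether a given Cell toolchain turned such hints into the SPU's hint-for-branch instruction is an assumption on my part, not something stated in this thread.)

```c
#include <stdio.h>

/* Static branch hints: the programmer tells the compiler which way a branch
   almost always goes, so it can lay out the code favourably instead of
   relying on a runtime branch predictor. */
#define likely(x)   __builtin_expect(!!(x), 1)
#define unlikely(x) __builtin_expect(!!(x), 0)

static long sum_valid(const int *v, long n)
{
    long sum = 0;
    for (long i = 0; i < n; i++) {
        if (unlikely(v[i] < 0))   /* rare error path, kept off the hot path */
            continue;
        sum += v[i];
    }
    return sum;
}

int main(void)
{
    int data[] = { 1, 2, -1, 4, 5 };
    printf("%ld\n", sum_valid(data, 5));  /* prints: 12 */
    return 0;
}
```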

And mind you, a GPU compute unit computes FP and int.
Once again, the SPE is more closely related to a CU on a GPU than to a CPU core.

The differences between the Cell CPU and Alder Lake are too numerous to count.
And the only thing somewhat similar is that they are both heterogeneous systems.
And once again, Alder Lake took inspiration from ARM's big.LITTLE heterogeneous system, not from the Cell heterogeneous system.
 

Panajev2001a

GAF's Pleasant Genius
Early on, EVERYTHING they said about Cell was about multiple Cells working together like cells in a body. And so all the speculation about PS3 was that it’d use multiple Cell chips with no dedicated CPU.

Then a rumor started going around that Sony was contracting Nvidia for something. I remember there was an interview with some Sony higher-up where someone asked if PS3 would use an Nvidia GPU and he laughed and said that was ridiculous, that Sony didn’t need their help (wish I could find the interview).

From that point all the speculation was that they licensed some kind of rasterizer back-end or something from Nvidia… then it was unveiled, and lo and behold, it's a single Cell chip + an Nvidia GPU, each with its own pool of memory.

Bottom line is Sony (along with Toshiba and IBM) bet big and bet wrong. Turns out the kind of stuff Cell was good at, GPUs are better at. And that what matters most in a gaming CPU is its performance in branch-y single-threaded code.
Again, Toshiba had a GPU design (pixel shader only, eDRAM heavy). It was a PS2 on steroids plus some new innovation.

The SPU had to pass through the MFC in the PPE.
I think each SPE was more independent than that, each had their own MFC/DMAC:
http://didawiki.cli.di.unipi.it/lib...aticanetworking/spd/spd-10-ieeemicro-cell.pdf

W9bSNbo.jpg


4DMA64R.jpg

ZxiDxzD.jpg
 

Pedro Motta

Member
No, those games are running on x86.

Make a better argument that revolves around the cell that makes me change my mind.
Sorry I had no idea you were mentioning specifically Cell in your original post.

But the PS3 had its fair share of titles that pushed things beyond what was possible at the time, mainly due to the CELL processor.

Warhawk for instance was one of the first games that had real volumetric cloud rendering. All done on the SPUs.
 

winjer

Gold Member
Again, Toshiba had a GPU design (pixel shader only, eDRAM heavy). It was a PS2 on steroids plus some new innovation.


I think each SPE was more independent than that, each had their own MFC/DMAC:
http://didawiki.cli.di.unipi.it/lib...aticanetworking/spd/spd-10-ieeemicro-cell.pdf

W9bSNbo.jpg


4DMA64R.jpg

ZxiDxzD.jpg

Note that the SPU cannot directly access system memory; the 64-bit virtual memory addresses formed by the SPU must be passed from the SPU to the SPE memory flow controller (MFC) to set up a DMA operation within the system address space.

 

winjer

Gold Member
The MFC is inside each SPE as per the diagram. Not on the PPE (or its PPU) like you said, but each SPU had access to its own private MFC.

Each SPE had an SPU, an MFC/DMAC, LS, and an interface to the ring bus.
wEgUOAt.jpg

(IBM diagram, much more in the paper I linked above)

Your link isn't working. Can you fix it?
 
Last edited:

Punished Miku

Gold Member
After a whole gen of programming on Series S and Series X, Xbox first party are going to hit super saiyan 3.

Nintendo devs programming on the Switch are gaining strength as we speak.
 
Last edited:
I would like to jump in this conversation, but there is so much wrong here. If your name isn't Panajev or Fafalada, you're likely wrong.

The MFC is inside each SPE as per the diagram. Not on the PPE (or its PPU) like you said, but each SPU had access to its own private MFC.

Each SPE had an SPU, an MFC/DMAC, LS, and an interface to the ring bus.
wEgUOAt.jpg

(IBM diagram, much more in the paper I linked above)

I'm not suggesting he knows any of this, but the original patent had an SPE (it was called an APU then) design similar to what this guy is describing. All memory mapping would be done by the PPE from what I remember. During the design process at STI they doubled the size of the LS from 128KB to 256KB and added its own MFC.

Also, some interesting trivia about the clock speed you mentioned: even before STI was formally established, Kutaragi is said to have graphed a subset of processors up to that point, traced a logarithmic curve through them, and selected 4GHz as the Cell clock. This is why the patents can all be reverse-engineered back to a 4GHz clock.
 
Last edited:
For me, the game that sealed the deal on the power of the PS3 was actually a third-party game: Metal Gear Solid 4. There was something about that game that felt really special; it was probably my first "next gen" experience: graphics, sound, acting, story and gameplay. Just the idea that it was using the full 50GB of a Blu-ray made me feel happy at the time.
 

winjer

Gold Member
Link works for me, but sure, it is HTTP only as it is an Italian uni rehosting it. Trust it or not, uploaded on WeTransfer: https://we.tl/t-k6mgynGKYj

Official IBM architecture overview of CELL BE (university rehosted): https://arcb.csc.ncsu.edu/~mueller/cluster/ps3/SDK3.0/docs/arch/CBEA_v1.01_3Oct2006.pdf
This should work fine :).

I found another link. Check point 5 of that PDF.
You will find that the SPEs only have virtual memory addresses. It's the PPE that has control of the effective memory addresses.
The MFC basically translates between these virtual memory addresses and the effective memory addresses.
 
Last edited:

Panajev2001a

GAF's Pleasant Genius
I found another link. Check point 5 of that PDF.
You will find that the SPEs only have virtual memory addresses. It's the PPE that has control of the effective memory addresses.
The MFC basically translates between these virtual memory addresses and the effective memory addresses.
It also does its own mapping to physical memory addresses; that is why you had an MMU inside the PPE and inside each SPE. You will need to quote where it says all accesses by the SPUs are serialised through the PPE…
 

winjer

Gold Member
It also does its own mapping to physical memory addresses; that is why you had an MMU inside the PPE and inside each SPE. You will need to quote where it says all accesses by the SPUs are serialised through the PPE…


The SPEs use the Synergistic Memory Flow Controller (MFC) to implement memory access. The Memory Flow Controller provides the SPE with the full Power Architecture virtual memory architecture, using a two-level translation hierarchy with segment and page tables. A first translation step based on the segment tables maps effective addresses used by application threads to virtual addresses, which are then translated to real addresses using the page tables. Compared with traditional memory hierarchies, the Synergistic Memory Flow Controller helps reduce the cost of coherence and allows applications to manage memory access more efficiently.

Support for the Power Architecture virtual memory translation gives application threads full access to system memory, ensuring efficient data sharing between threads executing on PPEs and SPEs by providing a common view of the address space across different core types. Thus, a Power Architecture effective address serves as the common reference for an application to reference system memory, and can be passed freely between PPEs and SPEs. In the PPE, effective addresses are used to specify memory addresses for load and store instructions of the Power Architecture ISA. On the SPE, these same effective addresses are used by the SPE to initiate the transfer of data between system memory and the local store by programming the Synergistic Memory Flow Controller. The Synergistic Memory Flow Controller translates the effective address, using segment tables and page tables, to an absolute address when initiating a DMA transfer between an SPE's local store and system memory.

In addition to providing efficient data sharing between PPE and SPE threads, the Memory Flow Controller also provides support for data protection and demand paging. Since each thread can reference memory only in its own process's memory space, memory address translation of DMA request addresses provides protection between multiple concurrent processes. In addition, indirection through the page translation hierarchy allows pages to be paged out. Like all exceptions generated within an SPE, page translation-related exceptions are forwarded to a PPE while the memory access is suspended. This allows the operating system executing on the PPE to page in data as necessary and restart the MFC data transfer when the data has been paged in.

MFC data transfers provide coherent data operations to ensure seamless data sharing between PPEs and SPEs. Thus, while performing a system memory to local store transfer, if the most recent data is contained in a PPE's cache hierarchy, the MFC data transfer will snoop the data from the cache. Likewise, during local store to system memory transfers, cache lines corresponding to the transferred data region are invalidated to ensure the next data access by the PPE will retrieve the correct data.
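(As a concrete picture of what that passage describes - the SPU handing an effective address to its MFC, which translates it and DMAs the data into local store - here is a minimal SPU-side sketch against the Cell SDK's spu_mfcio.h intrinsics. It assumes an spu-gcc toolchain and that the PPE passed a valid, aligned effective address in argp; it's an illustrative sketch, not code taken from the IBM document or this thread.)

```c
/* Minimal SPU-side DMA sketch (assumes the Cell SDK's spu-gcc toolchain). */
#include <spu_mfcio.h>
#include <stdint.h>

#define CHUNK 4096                       /* DMA size: multiple of 16 bytes, max 16KB */
static volatile uint8_t ls_buf[CHUNK] __attribute__((aligned(128)));

int main(uint64_t spe_id, uint64_t argp, uint64_t envp)
{
    (void)spe_id; (void)envp;
    uint32_t tag = 0;                    /* MFC tag group, 0..31 */

    /* argp carries an effective address handed over by the PPE at launch.
       The MFC translates it (segment + page tables) and performs the DMA;
       the SPU itself never dereferences system memory directly. */
    mfc_get(ls_buf, argp, CHUNK, tag, 0, 0);

    /* Wait for all transfers in this tag group to complete before using the data. */
    mfc_write_tag_mask(1u << tag);
    mfc_read_tag_status_all();

    /* ... operate on ls_buf in local store, then DMA the results back ... */
    mfc_put(ls_buf, argp, CHUNK, tag, 0, 0);
    mfc_write_tag_mask(1u << tag);
    mfc_read_tag_status_all();

    return 0;
}
```

The SPU only ever touches ls_buf in its 256KB local store; the effective address is just a token handed to the MFC, which is the translation step the quoted text is describing.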
 

Three

Member
Wasn't the PS3 the worst console ever in terms of performance per cost? $900 to make at launch while the Xbox 360 cost half as much at the time.
This was mainly because they included arguably unneeded things.

HDD, Wi-Fi, Bluetooth, card readers, Blu-ray, 7-8 player controllers, PS2 CPU and GPU.

The 360 had none of these as standard, so it was cheaper to produce, and it had no BC. It didn't even have an HDMI port at launch.

Sony were too frivolous and took huge losses for no reason expecting people to buy their machine. Little did they know people would be more concerned with comparing 720p vs 640p on DF. The rest is history.
 
Last edited:
This is my first thread on GAF, so go easy on me.

Not a bad first thread at all, OP.

I agree to a large extent that Cell did help shape Sony 1P into what they are today. Not so much on the tech difficulty side: PS2 had some VERY esoteric hardware itself and was quite difficult to program for in its own right. However, PS3 was the first gen where a Sony console was difficult to program for AND a competitor had a "perfect storm" console in terms of ease of development, proper timing, strong technical performance, proper funding, great features innovation and a growing library of strong exclusives as well as great marketing, in the form of the Xbox 360.

With the PS2, systems like GC and Xbox beat it in a lot of technical areas and had a good stable of great exclusives (in time), but they released way too late to be much of a factor. They didn't really have the timing or the stable of exclusives to stand up to the onslaught PS2 started to see from GT3 onward, either. As for Dreamcast, it came too early, and lacked proper funding after the initial period as Sega were bleeding money. It was much easier to develop for than PS2, but came up short tech-wise in some key areas, namely geometry and lighting (you can see where some of this comes into play with even early releases like GT3 vs. Le Mans on DC; check out that Digital Foundry GT retro series).

Because 360 satisfied a lot of those aforementioned conditions, combined with the complications of early PS3 development (and manufacturing, which affected the pricing), a LOT of 3P devs either finally decided to seriously support Xbox for the first time or, in some major cases, released massive AAA games on Xbox exclusively. If not that, 360 got a lot of DLC content early or exclusively (COD comes to mind), and was often the lead platform for 3P games. Even a lot of Japanese support that was once exclusive to Sony started to do a lot of work on Xbox with the 360.

That forced Sony to more or less revitalize the PS3's brand image on their own, with maybe a small handful of 3P support along the way (like Konami with MGS4). Their 1P teams had to innovate with IP and leverage the PS3's hardware advantages, and in time, they did. So teams that were already quite good, like Naughty Dog, elevated a lot higher that gen due to those circumstances. Some other 1P teams didn't quite elevate as high as they could, like Polyphony, but they were still able to see some gains and help with restoring the PS3's image as well as show off what the system could really do.

PS3's issues are also what forced Sony to take the path they did with PS4 and, building off the momentum PS3 had from 2010 onward (especially since after that point 360 more or less gave up on the core gaming audience with their 1P), set PS4 up incredibly well right off the bat in the 8th gen. So while I think PS3's hardware was a definite factor, it wasn't the only one, or the main one, IMO.

Toshiba was making a pixel-shader-only GPU with gobs of eDRAM (CELL taking over the vertex shading duties… innovating, but also taking the PS2 design and turning it up to 11, you can see where they wanted to go), but yes, it was ditched for RSX at the last minute, and the chip they got was bugged too.

Their approach could have worked had MS not launched early and with such good HW performance (and architecture) and SW support.
PS3, due to some issues, ended up feeling like less than the sum of its parts for a while, but devs who tamed CELL ended up liking it a lot.

A huge benefit of the 360's GPU was its unified shaders. Even if Sony had gone with the earlier approach (which I haven't had a chance to hear a lot about, but it sounds very interesting and could've been really cool given the great technical performance in games PS2 provided for its time), they still would have lacked unified shaders, so it'd have always been a "limitation" in comparison to 360.

Putting that in quotes because, again, it would come down to how devs used the hardware. OG Xbox had a few graphics pipeline advantages (again regarding shaders) over PS2, but PS2's hardware was robust enough to work around that, plus at the time devs hadn't completely become acclimated to the newer stuff to the point where it seemed difficult to get similar results on PS2.

Maybe it could've been a bit different with PS3 vs. 360, as unified shaders did become a big part of PC shortly after 360's launch, and the 360 gen is when a lot of PC-centric devs and franchises started pushing hard into the console gaming space (they were already making waves here and there the prior gen, but Japanese studios still ruled the roost for the vast majority), but you never know.
 
Last edited:

Azelover

Titanic was called the Ship of Dreams, and it was. It really was.
They've been going further in the same direction a little too long for my liking. But yes, you're totally right.
 

ThaGuy

Member
Wasn't the PS3/360 gen the gen where developers/publishers were going bankrupt left and right? I swear I remember seeing a new developer go down every month that gen smh.
 

Fafalada

Fafracer forever
wasn't the long term plan by Sony to have many products outside of gaming to use the Cell?
They had the same plan for PS2 chipset (well, primarily the GS) - that was the driving reason it was HD capable out of the box.
They managed some use outside of gaming for both - but it was pretty limited overall.

But for games, the PS3 paradigm was the wrong choice. A strong CPU with a weak GPU is the wrong way of doing things.
Graphics world was moving towards compute-heavy workloads, so either approach serves the same purpose just fine. Ease of use is a different story - but that's where we can rightfully point to arrogance in PS3 design leadership.
For what it's worth - original targets had Cell 4x more powerful, and the GPU substantially faster at non-compute workloads (but not much else), so what we got was not even all that exotic in the end.

Considering that Toshiba promised to develop a GS on steroids and pixel shaders, why was CELL worse than the EE?
The intent was obviously to take the PS2 paradigm to the next level, but it's hard to argue they made the programming model more painful. IBM ISA was really not human programmer friendly (neither is AMD/Intel - but people stopped writing x86 ASM 20 years ago).
 

Fafalada

Fafracer forever
if you use a proprietary engine and API developed to take advantage of the peculiarities of the console... you will spend more money and more time, and you need more people to port the game over to PC.
While that's technically true - the science of driving port costs down is something big publishers (EA, Ubi, etc.) 'solved' decades ago.
The business model of porting houses (internal or external) is very much a thing, and still commonly used by the AAA industry today, regardless of the engine they use.

And the SPE doesn't even have a branch predictor, something essential in CPUs.
Better than having a branch predictor that's actively degrading performance half of the time like the PPE.
Seriously though - SPE had a short pipeline and latency-free memory access (code and data alike) so it really didn't need branch prediction. Much like on the VU, branches were basically free, minus the usual in-order execution caveats.
 
Last edited:

PaintTinJr

Member
The SPU had to pass through the MFC in the PPE.
I've seen what you quoted further down, but that is for full access to all memory. Going by my memory of reading the documentation 15 years ago, the SPUs working by themselves - particularly in hypervisor black-box mode for banking - certainly don't require them to pass data through an insecure PPU process; they just directly use a 2MB (4MB/8MB?) area IIRC that gets mapped when the SPU is first kicked off by the PPU or another SPU. Which is sort of the point of the CELL being heterogeneous - with processing units for different intentions, just like P and E cores - and so such a small memory as a source or sink - through the EIB - to a much smaller local store of 256KB makes perfect sense, and is the more obvious software paradigm than needing full virtual addressing to work in conjunction with the PPE.

The SPEs don't do any branch prediction. They are totally dependent on the software compiler.
And mind you, branch prediction is basically the only reason we still have CPUs today. It's the only thing they are great at.
Otherwise, we would have just a bunch of parallel units crunching data.

And mind you, a GPU compute unit computes FP and int.
Once again, the SPE is more closely related to a CU on a GPU than to a CPU core.
CUs didn't exist before the CELL BE; the geometry shader had only just become an OpenGL extension at the time, IIRC, joining vertex and fragment shaders, and OpenGL 2.1 was just being specified. So making that comparison with what some described as "satellite" processors (because they aren't enslaved to the main CPU thread like normal multi-core) is very much out of context for a graphics industry that had no CUDA/OpenCL until a few years later. I already mentioned that the GTX 200 series was the first point where GPUs were able to replace most of the SPU's performance versatility, but even then there were some problem domains where the SPU had its niche AFAIK.

The differences between the Cell CPU and Alder Lake are too numerous to count.
And the only thing somewhat similar is that they are both heterogeneous systems.
And once again, Alder Lake took inspiration from ARM's big.LITTLE heterogeneous system, not from the Cell heterogeneous system.
Okay, if that is your distinction, I don't agree, but I appreciate you noting the CELL BE was the first heterogeneous (hUMA) system.
 

cowgod

Neo Member
PS3 was a misunderstood, anachronistic intellectual heavyweight. 599 U.S. dollars was a bargain for this hardware in its day for the fact that it, and its titles, served as a safe harbor for serious gamers in an era where gaming’s intellectual foundations were otherwise shifting beneath our feet. Although many of us didn’t fully appreciate it at the time, PS3’s catalog was stellar for its deep library of innovative and intellectually stimulating software. They just don’t make them like they used to imo. Its library shines in particular in retrospect, for without Demon’s Souls, we’d nary have Elden Ring, the current greatest game of all time. That entire franchise was born on the PS3, and exclusively no less. A triumph. MGS4 gave us the pinnacle of cutscene-driven storytelling. Heavy Rain gave us all that and more. In terms of intellectual merit, the PS3’s peers are only the PS1 and Saturn. No other console before or since holds the same pedigree in terms of artistic and intellectual quality. A triumph.
 
This is some grade A bullshit.
PS3 was a misunderstood, anachronistic intellectual heavyweight. 599 U.S. dollars was a bargain for this hardware in its day for the fact that it, and its titles, served as a safe harbor for serious gamers in an era where gaming’s intellectual foundations were otherwise shifting beneath our feet. Although many of us didn’t fully appreciate it at the time, PS3’s catalog was stellar for its deep library of innovative and intellectually stimulating software. They just don’t make them like they used to imo. Its library shines in particular in retrospect, for without Demon’s Souls, we’d nary have Elden Ring, the current greatest game of all time. That entire franchise was born on the PS3, and exclusively no less. A triumph. MGS4 gave us the pinnacle of cutscene-driven storytelling. Heavy Rain gave us all that and more. In terms of intellectual merit, the PS3’s peers are only the PS1 and Saturn. No other console before or since holds the same pedigree in terms of artistic and intellectual quality. A triumph.
Hello gamespot system wars troll post lol
 

ACESHIGH

Banned
Cell processor was peak Arrogant Sony. Believing that no matter how obtuse their HW was, devs would have to bow and code for it. Their great-selling first party games were just course correction, and they arrived pretty late to the party, around 2009.
 
Cell processor was peak Arrogant Sony. Believing that no matter how obtuse their HW was, devs would have to bow and code for it. Their great-selling first party games were just course correction, and they arrived pretty late to the party, around 2009.

Exactly, I think this is mostly an oversimplification.

I believe the late success of Naughty Dog more than anything is what changed Sony's first-party trajectory.

Before Naughty Dog hit their mark on the PS3, most of Sony's success came from Gran Turismo and platformers: Crash Bandicoot/Spyro The Dragon, Ratchet and Clank, Jak and Daxter.

God of War was really the outlier here but even that didn't have nearly the success you'd be looking for today.

Uncharted led to The Last of Us and The Last of Us Remastered, but in all of this, if you look at the performance of Resistance, Killzone, and Infamous, their games still weren't hitting the top marks. All of those franchises have been displaced.

And if you look more closely, they've kind of stumbled the last few years. I think Last of Us 2 and Horizon have been serious disappointments to them saleswise. GT7's controversy was the last thing they needed.

Miles Morales did very well, but Rift Apart not so much.

I think they'll need to continue to re-evaluate their focus and double down on quality. I'm sure COVID hasn't helped, but their games don't have the polish they should.
 

Panajev2001a

GAF's Pleasant Genius
The translation is done in two steps inside the SPE (effective addresses exist so that they can also be shared with the PPE and other SPEs).
DMA ops are fully contained inside the SPE.

The PPE does not do any translation, but data can be snooped from its cache if it is available there, saving a trip to memory on reads. On writes to memory, the PPE cache lines are invalidated, helping to keep a consistent view of memory.
So, it is still not what you were saying (which would have been crippling for its memory performance).

Also, although the HW is capable (so was the EE, but only on PS2 Linux did you have to do virtual-to-physical translation yourself, and even then you could still request an unpageable memory region in 4 KB physically contiguous chunks; the DMAC tags also let you specify hop/transfer, hop/transfer instructions to build a chain of multiple small chunks), I would be surprised if the PS3 GameOS had virtual memory enabled (GameOS ended up with 32-bit-only addresses IIRC to optimise cache use, so they were not going crazy) and was letting the OS page things out to disk (if you want to highlight the blurb about what happens if data is paged out… CELL was intended for more than just PS3).
 
Last edited:

winjer

Gold Member
CUs didn't exist before the CELL BE; the geometry shader had only just become an OpenGL extension at the time, IIRC, joining vertex and fragment shaders, and OpenGL 2.1 was just being specified. So making that comparison with what some described as "satellite" processors (because they aren't enslaved to the main CPU thread like normal multi-core) is very much out of context for a graphics industry that had no CUDA/OpenCL until a few years later. I already mentioned that the GTX 200 series was the first point where GPUs were able to replace most of the SPU's performance versatility, but even then there were some problem domains where the SPU had its niche AFAIK.

The Cell SPEs are like a precursor to the CU of a GPU, not to a CPU core.
The SPE is a "dumb" unit. It can't do branching and it can't change the order of instructions that are fed to it. Unlike Alder Lake, or any Intel CPU of the past 3 decades.

Okay, if that is your distinction, I don't agree, but I appreciate you noting the CELL BE was the first heterogeneous (hUMA) system.

hUMA is a term for Heterogeneous Uniform Memory Access. It was a term used by AMD for its APUs.
What you are talking about is heterogeneous computing, and that concept has been in use since the 1980s.
 
Last edited:

MonarchJT

Banned
Exactly, I think this is mostly an oversimplification.

I believe the late success of Naughty Dog more than anything is what changed Sony's first-party trajectory.

Before Naughty Dog hit their mark on the PS3, most of Sony's success came from Gran Turismo and platformers: Crash Bandicoot/Spyro The Dragon, Ratchet and Clank, Jak and Daxter.

God of War was really the outlier here but even that didn't have nearly the success you'd be looking for today.

Uncharted led to The Last of Us and The Last of Us Remastered, but in all of this, if you look at the performance of Resistance, Killzone, and Infamous, their games still weren't hitting the top marks. All of those franchises have been displaced.

And if you look more closely, they've kind of stumbled the last few years. I think Last of Us 2 and Horizon have been serious disappointments to them saleswise. GT7's controversy was the last thing they needed.

Miles Morales did very well, but Rift Apart not so much.

I think they'll need to continue to re-evaluate their focus and double down on quality. I'm sure COVID hasn't helped, but their games don't have the polish they should.
And it should be noted that God of War was a tremendously fun isometric hack'n'slash... not the Naughty Dog-style version of it that it has become today.
 

PaintTinJr

Member
The Cell SPEs are like a precursor to the CU of a GPU, not to a CPU core.
The SPE is a "dumb" unit. It can't do branching and it can't change the order of instructions that are fed to it. Unlike Alder Lake, or any Intel CPU of the past 3 decades.
Why would it have wanted to at the time? It was able to hit 80-90% throughput with complex, CPU-level algorithms when CPUs running well were at 40-50% at best with HT/SMT enabled. Roadrunner was untouched for years because of the SPU design. I don't get why you seemingly want to frame the processor as no big deal that didn't lead to the GPUs, APUs and CPUs like the M1 and the P/E-core CPUs we have today.
hUMA is a term for Heterogeneous Uniform Memory Access. It was a term used by AMD for its APUs.
What you are talking about is heterogeneous computing, and that concept has been in use since the 1980s.

Overview

Originally introduced by embedded systems such as the Cell Broadband Engine, sharing system memory directly between multiple system actors makes heterogeneous computing more mainstream. Heterogeneous computing itself refers to systems that contain multiple processing units – central processing units (CPUs), graphics processing units (GPUs), digital signal processors (DSPs), or any type of application-specific integrated circuits (ASICs). The system architecture allows any accelerator, for instance a graphics processor, to operate at the same processing level as the system's CPU...
When the term hUMA was getting renewed use back in 2005 with the launch of the CELL BE and the major success of IBM's Roadrunner, AMD were a prominent company on the consortium logo list, obviously because Roadrunner used AMD Opteron workstations to feed the tens of thousands of CELL BE machines with data. I've tried using the Wayback Machine to find the images that framed hUMA in regard to the CELL BE as they were, but sadly no such luck.
 
Last edited:

winjer

Gold Member
Why would it have wanted to at the time? It was able to hit 80-90% throughput with complex, CPU-level algorithms when CPUs running well were at 40-50% at best with HT/SMT enabled. Roadrunner was untouched for years because of the SPU design. I don't get why you seemingly want to frame the processor as no big deal that didn't lead to the GPUs, APUs and CPUs like the M1 and the P/E-core CPUs we have today.

Stop misrepresenting what I say.
My point was that the SPE is a precursor to the modern CU. Sony even had the intention of having just the Cell to render graphics, before realizing the issues with yields.
Had Sony managed to get decent yields on the Cell, those SPEs would have been very similar in concept to the CUs we have today in a console SoC.
Your point that Cell led to Alder Lake is complete nonsense. Cell was a dead end that no one has copied since.
And just because Alder Lake has one similarity in a very broad technical sense, it does not mean it's related to Cell.



When the term hUMA was getting renewed use back in 2005 with the launch of the CELL BE and the major success of IBM RoadRunner, AMD were a prominent company on the consortium logo list, obviously because the RoadRunner used AMD Opteron workstations to feed the tens of thousands of CELL BE machines with data. I've tried using the wayback machine to find the images that framed hUMA in regards of the CELL BE as they were, but sadly no such luck.

WTF is wrong with you? The page you posted has only one mention of hUMA and it refers to AMD's Heterogeneous Unified Memory Access. You continue to use the wrong term for Heterogeneous Computing.
 
Last edited:

solidus12

Member
I believe that some multiplats performed better on PS3; TR Underworld comes to mind (which was handled by Nixxes)
 

Lysandros

Member
When a developer knew how to use it, it could outdo the PS1. Look at how good Sega Rally, VF2 and the revised Daytona were.
In the 3D department? No. No way. PlayStation was simply the better-architected, more capable machine at it, and this was/still is the take of the vast majority of game developers. Forget later examples like Ridge Racer Type 4; even earlier games like Porsche Challenge and Tobal No. 2 (1997) wouldn't be possible on the machine without drastic downgrades. Saturn isn't remotely akin to PS3/CELL; being difficult to program for because of poor design doesn't magically make it 'more powerful' in a hidden way. That's pure historical revisionism/fantasy.
 
Last edited:
Exactly, I think this is mostly an oversimplification.

I believe the late success of Naughty Dog more than anything is what changed Sony's first-party trajectory.

Before Naughty Dog hit their mark on the PS3, most of Sony's success came from Gran Turismo and platformers: Crash Bandicoot/Spyro The Dragon, Ratchet and Clank, Jak and Daxter.

God of War was really the outlier here but even that didn't have nearly the success you'd be looking for today.

Uncharted led to The Last of Us and The Last of Us Remastered, but in all of this, if you look at the performance of Resistance, Killzone, and Infamous, their games still weren't hitting the top marks. All of those franchises have been displaced.

And if you look more closely, they've kind of stumbled the last few years. I think Last of Us 2 and Horizon have been serious disappointments to them saleswise. GT7's controversy was the last thing they needed.

Miles Morales did very well, but Rift Apart not so much.

I think they'll need to continue to re-evaluate their focus and double down on quality. I'm sure COVID hasn't helped, but their games don't have the polish they should.
I can't understand where this narrative that they stumbled in the last few years is coming from. It's been hit after hit pretty much, not only since the console released but even in the months that preceded the release. Critically and commercially successful games, a sold-out console, huge service numbers, etc.

The last R&C was likely more successful than ever; R&C games didn't use to get this much attention or promotion before. GT7 did exceptionally well too. Returnal did great for a game like that.

You seem to be underselling a lot of things. Horizon was a massively successful new IP, and I think it's way too soon to assume the sequel didn't do as well as they hoped. Releasing close to Elden Ring didn't help, but it's nothing that can't be overcome over time.

TLoU2 is probably the only recent game that might've sold below expectations (given that Sony doesn't update the numbers), but even that was still massively successful anyway, and the multiplayer component hasn't even been released yet.
 
Last edited: