It's interesting because in real-world use cases (and not numbers for PowerPoint slides) most third-party games ran better on Xbox 360 than they did on PS3. Cell may have been a more powerful CPU, but it was hard to develop for and it was constrained by the rest of the PS3 architecture. Raw power doesn't do any good if you can't access it. So theoretically? Yeah, this is probably accurate. Practically? Only Sony's first party ever got a real boost out of it.
In reality, Cell's power proved to be largely irrelevant and it was pretty much a disaster for Sony. It cost them nearly all of their gaming profit from the prior two generations, got Crazy Ken booted in favor of Kaz to run the gaming business, and ushered in Cerny's more practical approach to hardware architecture. But Sony's first party was true wizardry at times during that generation.
Faf or Panajev could speak to the programming far better than I, but from an architectural design point of view I stick by my assessment.
Again, many are falling into confusing [output], which is a variable dependent on many things (developer time investment, managerial prerogatives, corporate strategy, market economics), with what the hardware is capable of.
Let's look at this on several levels:
As a system: PS3 as sold was not optimal, as I've agreed throughout this thread. Ideally they either would have diverged with RS and really spiced up the development situation, or, if they went the moderate route, there was the G80, which was aligned in launch window but would have needed heavier, earlier investment -- so they got nVidia-fucked and went with what they could get and afford in time and cost. It sucks for the tech-forward crowd.
As a 'CPU': Cell was a neat and novel solution to the problems faced at that time. It widely outperformed the competition --
it was literally a generation ahead of its time, see the numbers -- and if it wasn't covering up for systemic failures with the RSX, it would have been able to apply ~200 GFLOPS to accelerate additional lighting and deferred shading, or advanced physics interactions, or AI, or post-processing, or whatever. Again, this is just a fact in objective reality; its computational density is a nice mix between CPUs and GPUs on the curve I described earlier and, for its time, was really neat. Now, things have shifted with transistor budgets and architecture, and GPUs have assumed this role wonderfully, but again, you need to stay in a 2001-2005 mindset.
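For reference, that ~200 GFLOPS headline falls out of simple per-SPU arithmetic. A back-of-envelope check (theoretical single-precision peak, not sustained throughput; these are the commonly cited PS3 configuration numbers):

```python
# Theoretical single-precision peak for Cell's SPUs (back-of-envelope).
clock_ghz = 3.2      # Cell's clock in the PS3
simd_width = 4       # each SPU vector register holds four 32-bit floats
flops_per_fma = 2    # a fused multiply-add counts as two flops per lane

per_spu = clock_ghz * simd_width * flops_per_fma
print(per_spu)                     # 25.6 GFLOPS per SPU

# The PS3 ships with 7 of 8 SPUs enabled (one fused off for yield);
# a full 8-SPU Cell is where the oft-quoted ~204.8 GFLOPS comes from.
print(round(7 * per_spu, 1))       # 179.2
print(round(8 * per_spu, 1))       # 204.8
```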
This is just a fact: Cell made it into supercomputing clusters on the Top500 (and dominated the Green500!); nobody in their right mind was using the Xbox CPU.
EDIT: And my understanding is that the parts of Cell which fundamentally sucked basically sucked times 3 on the Xbox CPU. The SPUs are pretty cut-and-dry and a superset of the VUs which developers had on the EE, free of the hardwired idiosyncrasies of VU0, for example. And while their ISA wasn't overly verbose, I seem to remember Gschwind and those guys basically going for the most optimal bang-for-buck instructions and chopping everything else out; they do their job. It's IBM's PPE -- which, correct me if I'm wrong, was built on IBM Austin's work with the guTS architecture -- that was basically lacking for game logic and the branchy code and such.
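To make the "branchy code" point concrete: in-order cores like the PPE (and especially the SPUs, which lack branch prediction hardware) pay heavily for data-dependent branches, so hot loops got rewritten in a compute-both-then-select style. An illustrative Python sketch of that transformation (not actual SPU intrinsics):

```python
def clamp_branchy(x, lo, hi):
    # Straightforward version: two data-dependent branches.
    # Fine on an out-of-order x86; a stall-fest on a deep in-order pipeline.
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

def clamp_branchless(x, lo, hi):
    # Compute both outcomes, then pick one with an all-ones/all-zeros mask,
    # mirroring the SPU's select-bits ("selb") idiom.
    mask_lo = -(x < lo)                 # -1 (all ones) if x < lo, else 0
    r = (x & ~mask_lo) | (lo & mask_lo)
    mask_hi = -(r > hi)                 # -1 (all ones) if r > hi, else 0
    return (r & ~mask_hi) | (hi & mask_hi)

# Both versions agree; only the control flow differs.
for x in (-5, 5, 15):
    assert clamp_branchy(x, 0, 10) == clamp_branchless(x, 0, 10)
print([clamp_branchless(x, 0, 10) for x in (-5, 5, 15)])   # [0, 5, 10]
```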