
If the PS5 GPU is only slightly better than a GTX 1080, that's pretty pathetic.

V4skunk

Banned
GameCube was the best technically. While Xbox had faster clocks and more RAM, it was bottlenecked by a slow front side bus and low memory bandwidth. Had Xbox been given eDRAM and a faster bus, it would have been no contest.
I agree, the GC was a little beast. Nothing on Xbox technically compared to the Rogue Leader games, pushing 14-21 million triangles per second in real time.
Metroid Prime 2, Resident Evil 4, and F-Zero GX were also graphically on par with or better than Xbox's best-looking games, like Riddick.
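As a rough sanity check on those numbers (my arithmetic, assuming Rogue Leader's commonly cited 60fps target), the per-frame triangle budget works out like this:

```python
# 14-21 million triangles per second, divided across 60 frames.
for tri_per_sec in (14e6, 21e6):
    print(f"{tri_per_sec / 60:,.0f} triangles per frame at 60fps")
# -> roughly 233,333 to 350,000 triangles per frame
```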
 
These kinds of hot take posts make no sense. Yes, let's expect the latest PC gaming GPU in a console so that MS and Sony have to sell them for $1000+ in order to break even. Or sell them at a $500 loss per unit. Do you people even think before you post this stuff?

This is still several times more powerful than the current-gen consoles.
I don't understand why this thread is still going :messenger_tears_of_joy:
 

pawel86ck

Banned
GameCube was the best technically. While Xbox had faster clocks and more RAM, it was bottlenecked by a slow front side bus and low memory bandwidth. Had Xbox been given eDRAM and a faster bus, it would have been no contest.
Have you seen Splinter Cell 1-4 on GC? Those games had to be heavily downgraded just to run on GC. Yes, GC had faster RAM, but still 50% less than Xbox, and on top of that Xbox had an HDD, so developers could cache everything and build even bigger levels. Xbox also had a more advanced GPU (pixel shaders, vertex shaders, shadow buffers), so developers were using shaders and shadow buffers pretty much everywhere.

It's not that you need pixel shaders to render amazing water reflections (or shadow buffers to render shadows), because GC could do it in software, but you need hardware to do it fast enough. Trying to recreate shaders on GC was expensive: those effects eat CPU processing power that could be used for other things, precisely because shaders are instructions executed on the GPU rather than the CPU. On top of that, effects like bump mapping require a bump-map texture for every real texture you want to apply them to, so you need more RAM for that. Because of that, developers very rarely used bump mapping and similar effects on GC. Take Luigi's Mansion, for example: bump mapping appears in only one place in the game, a brick texture.
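To make the cost concrete, here's a minimal sketch (my illustration, not actual GC or Xbox code) of the per-pixel work a bump-mapped, lit pixel implies: perturb the surface normal by the bump map's slope, then take a lighting dot product. A pixel shader does this on the GPU; doing it "in software" means the CPU repeats it for every affected pixel, every frame:

```python
import math

def shade_pixel(base_color, bump_dx, bump_dy, light_dir):
    """Lambertian shading with a normal perturbed by bump-map gradients."""
    # Perturb a flat surface normal (0, 0, 1) by the bump map's slope.
    nx, ny, nz = -bump_dx, -bump_dy, 1.0
    inv_len = 1.0 / math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx * inv_len, ny * inv_len, nz * inv_len
    # Classic diffuse term: max(0, N . L).
    ndotl = max(0.0, nx * light_dir[0] + ny * light_dir[1] + nz * light_dir[2])
    return tuple(c * ndotl for c in base_color)

# 640x480 pixels at 30fps would be ~9.2 million calls like this per
# second if every pixel were bump mapped on the CPU.
print(shade_pixel((1.0, 0.8, 0.6), 0.3, -0.1, (0.0, 0.0, 1.0)))
```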

The only game that used bump mapping on many textures was Star Wars Rogue Leader, but that was a space shooter with 2D background scenery and small levels with simple ground textures, so they could spend more RAM on the ships (bump mapping, self-shadowing).

Xbox also had two vertex shader units, which effectively doubled its triangle throughput.

Xbox was the most capable sixth-gen console, not GC. All you need to do is look at Xbox games like Splinter Cell 1-4, Doom 3, and Riddick, which looked like early PS3/X360 games.
 
I agree, the GC was a little beast. Nothing on Xbox technically compared to the Rogue Leader games, pushing 14-21 million triangles per second in real time.
Metroid Prime 2, Resident Evil 4, and F-Zero GX were also graphically on par with or better than Xbox's best-looking games, like Riddick.
Riddick, if you'll notice, has low poly counts. The Cube was a poly-pushing monster, and the difference is sometimes more extreme in cases where normal maps are used in Xbox titles; they weren't a good replacement for polys in most cases, not until Xbox 360 anyway.

Driving games like RalliSport Challenge 2 are where Xbox shined the most: normal maps could be used for track detail, and the cars were superbly modeled, at 60fps no less.
 
Have you seen Splinter Cell 1-4 on GC? Those games had to be heavily downgraded just to run on GC. Yes, GC had faster RAM, but still 50% less than Xbox, and on top of that Xbox had an HDD, so developers could cache everything and build even bigger levels. Xbox also had a more advanced GPU (pixel shaders, vertex shaders, shadow buffers), so developers were using shaders and shadow buffers pretty much everywhere.

It's not that you need pixel shaders to render amazing water reflections (or shadow buffers to render shadows), because GC could do it in software, but you need hardware to do it fast enough. Trying to recreate shaders on GC was expensive: those effects eat CPU processing power that could be used for other things, precisely because shaders are instructions executed on the GPU rather than the CPU. On top of that, effects like bump mapping require a bump-map texture for every real texture you want to apply them to, so you need more RAM for that. Because of that, developers very rarely used bump mapping and similar effects on GC. Take Luigi's Mansion, for example: bump mapping appears in only one place in the game, a brick texture.

The only game that used bump mapping on many textures was Star Wars Rogue Leader, but that was a space shooter with 2D background scenery and small levels with simple ground textures, so they could spend more RAM on the ships (bump mapping, self-shadowing).

Xbox also had two vertex shader units, which effectively doubled its triangle throughput.

Xbox was the most capable sixth-gen console, not GC. All you need to do is look at Xbox games like Splinter Cell 1-4, Doom 3, and Riddick, which looked like early PS3/X360 games.
Chaos Theory is also low poly; RE4 shits all over it.

The only thing the Cube couldn't do was normal maps, but I'd say nothing of value was lost there. Doom 3 is hilariously low poly; Riddick is better but still significantly below Prime 1, and at less than half the frame rate. And stencil shadows, but again those came at a heavy cost on Xbox, and you could fake them nicely on Cube; even Silent Hill 2 on PS2 had an implementation.



Xbox could have higher-res textures than Cube thanks to its RAM, but honestly textures weren't the Cube's weakness either. One more area where Xbox beat Cube, though, is geometry deformation: Burnout 3 wasn't on Cube because the Cube's GPU wasn't well suited to geometry deformation, so the CPU had to help out, and in doing so there just weren't enough CPU resources left for everything else.

Xbox had a monstrous chipset, but the team didn't spend a lot of time making sure everything could perform optimally. Again, not only in bandwidth but in CPU-to-GPU communication, the Cube was simply faster.
 

Black_Stride

do not tempt fate do not contrain Wonder Woman's thighs do not do not
Wait, the PS5's GPU will be more powerful than the GPUs in like 90% of Steam gamers' PCs?

I'm sold as shit considering what the PS4 can do with its, by now, wack-ass GPU.

I'm not even worried.

Hell, with optimization my 1070 probably won't be able to match console level next gen.
FUCK!

The next RTX cards need to drop ASAP. And no, I'm not talking about the Supers.
 

daveonezero

Banned
These kinds of hot take posts make no sense. Yes, let's expect the latest PC gaming GPU in a console so that MS and Sony have to sell them for $1000+ in order to break even. Or sell them at a $500 loss per unit. Do you people even think before you post this stuff?

This is still several times more powerful than the current-gen consoles.
Exactly. Hardware is finalized long before the console is released, so it has to be locked in a year or so beforehand. Thinking it should have the latest GPU is idiotic.

OP should just be building PCs instead of complaining.
 

gypsygib

Member
Well, considering the only better cards are a $700 2080/Radeon VII or a $1200 2080 Ti, I'd say 1080 performance isn't bad for a console aiming at $400-$500.
 

Pimpbaa

Member
Preference is another thing entirely.

I too sometimes sacrifice framerate for eye candy; that's the beauty of PC.

But in an ideal scenario, every game developer would choose both graphics and framerate.

And each and every game would benefit from a framerate boost, no exceptions.

If a game can retain high-quality graphics and hit 60fps or higher, then yeah, that is unquestionably better. But in the context of current-gen consoles, a 60fps target can come with a significant cost to graphics quality. There are exceptions, however, like Doom 2016 on consoles. And I'm sure Doom Eternal will push the envelope of what can be done at 60fps on current consoles even further. But not many developers can come close to id. Next-gen consoles should help the situation a great deal, with them finally getting good CPUs.
 

Naglafar

Member
If you consider how nice the current consoles look with a 2012 tablet-class SoC, this is a huge upgrade. 4K 60 with checkerboarding is in reach, maybe native in some games.
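For reference, here's what that checkerboard trade-off means in raw pixel counts (simple arithmetic; checkerboard rendering shades roughly half the samples each frame and reconstructs the rest):

```python
native_4k = 3840 * 2160
print(native_4k)       # 8,294,400 pixels at native 4K
print(native_4k // 2)  # ~4,147,200 shaded per frame with checkerboarding
```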
 

pawel86ck

Banned
Chaos Theory is also low poly; RE4 shits all over it.

The only thing the Cube couldn't do was normal maps, but I'd say nothing of value was lost there. Doom 3 is hilariously low poly; Riddick is better but still significantly below Prime 1, and at less than half the frame rate. And stencil shadows, but again those came at a heavy cost on Xbox, and you could fake them nicely on Cube; even Silent Hill 2 on PS2 had an implementation.



Xbox could have higher-res textures than Cube thanks to its RAM, but honestly textures weren't the Cube's weakness either. One more area where Xbox beat Cube, though, is geometry deformation: Burnout 3 wasn't on Cube because the Cube's GPU wasn't well suited to geometry deformation, so the CPU had to help out, and in doing so there just weren't enough CPU resources left for everything else.

Xbox had a monstrous chipset, but the team didn't spend a lot of time making sure everything could perform optimally. Again, not only in bandwidth but in CPU-to-GPU communication, the Cube was simply faster.
RE4 was ported to PS2 with very good results. That game would easily run on Xbox. In fact, if RE4 had been made with Xbox in mind, I have no doubt the game would have looked even better than the GC version. Levels would be bigger and textures more detailed thanks to more RAM and the HDD (like in the Splinter Cell games). Shaders could also be used to improve the water surface, lighting effects, and textures, not to mention dynamic shadows could be used everywhere thanks to shadow buffers.

Xbox games were shader-focused because shaders can take a flat, unrealistic-looking scene and turn it into something realistic-looking. Also keep in mind that with bump mapping, developers don't need to rely on polygons as much to create detailed scenes (that's why bump mapping was invented). Games like Splinter Cell 3 or Doom 3 both looked like next-gen games to me; IMO GC games like RE4 or Metroid Prime 1-2 looked flat in comparison. I played Metroid Prime 2 recently on the Dolphin emulator and there's nothing impressive there, whereas I can run games like Splinter Cell 3 and even today I'm impressed.
 

Mattyp

Gold Member
Wait, the PS5's GPU will be more powerful than the GPUs in like 90% of Steam gamers' PCs?

This has always been the most amusing part of PC gamers shitting on consoles: there's a large chance their PC won't come close to next gen at launch, and they've still never experienced 4K either, which consoles have been doing for some time now.

I won't comment on this GC vs Xbox argument about polygons, but games on the OG Xbox always looked vastly better to my eyes. Even the GTA3/VC bump over my PS2 editions was massive.


 

JohnnyFootball

GerAlt-Right. Ciriously.
I agree, the GC was a little beast. Nothing on Xbox technically compared to the Rogue Leader games, pushing 14-21 million triangles per second in real time.
Metroid Prime 2, Resident Evil 4, and F-Zero GX were also graphically on par with or better than Xbox's best-looking games, like Riddick.
Uh, no. Nintendo did the best they could with what they had, and those titles looked absolutely outstanding on GameCube, but I'm sorry, they have absolutely nothing on Riddick or any of the best-looking games on Xbox.
 

tkscz

Member
The 1080 was released 3 years ago, in 2016. While it was $500, that was still 3 damn years ago. That would make the console just like last gen: outdated on release day. I'm not expecting it to be as good as a current high-end PC GPU, but giving us hardware that's maxed out on day one doesn't leave much of a good taste in my mouth.

I'm just going by some of the leaks here, which might not be true. 1080 performance is nothing to write home about. 1080 Ti performance in 2020? Yeah, that would be 4 years after those cards came out; that's pretty reasonable, isn't it?

Hell, even the PS4 wasn't worse than a 2009 GPU.

I thought we finally learned you can't compare console hardware directly to a PC "equivalent".

1080 performance is nothing to write home about in a PC. In a console, it would run absolute circles around the PS4 Pro and Xbox One X.
 

sircaw

Banned
I bought a PS4 Pro last year and have been astounded by the graphics of Horizon Zero Dawn, God of War, and Ratchet & Clank, to name a few. If this is what can be achieved on this type of hardware, I can only imagine what can be obtained on a 1080+, factoring in that sweet new CPU, SSD, and more RAM.

I just don't understand how someone could call 1080+ power in a console for probably around £450 pathetic.
In my opinion it's breathtaking for the price.
 

JohnnyFootball

GerAlt-Right. Ciriously.
I thought we finally learned you can't compare console hardware directly to a PC "equivalent".

1080 performance is nothing to write home about in a PC. In a console, it would run absolute circles around the PS4 Pro and Xbox One X.
THANK YOU.

It's getting really annoying that people have all of a sudden forgotten this.

The new argument is "well, these GPUs are so close to their PC counterparts that the performance should be expected to be similar".

Wrong. So dead wrong. It cannot be overstated how much of an advantage it is to be able to code to the metal for ONE configuration with known quantities of RAM/shaders/CPU speed, etc.

I'll disagree slightly on your 1080 comment, as it is still a very good GPU for those that got one before the mining craze.
 

pawel86ck

Banned
THANK YOU.

It's getting really annoying that people have all of a sudden forgotten this.

The new argument is "well, these GPUs are so close to their PC counterparts that the performance should be expected to be similar".

Wrong. So dead wrong. It cannot be overstated how much of an advantage it is to be able to code to the metal for ONE configuration with known quantities of RAM/shaders/CPU speed, etc.

I'll disagree slightly on your 1080 comment, as it is still a very good GPU for those that got one before the mining craze.
True. GTA5 ran on the PS3's GPU (the RSX, much slower than a 7800 GTX) and 512MB of RAM, while the same game is totally unplayable on an 8800 GTX/Ultra with 2GB of RAM.

A more recent example is Metro Exodus. A Radeon 7870 (a slightly faster GPU than the PS4's) runs Metro Exodus at 14-20fps (low settings, 1080p), while on PS4 the game runs at higher settings and 30fps.
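Taking the midpoint of that PC range, the implied optimization gap is roughly this (my arithmetic, and it understates the real gap, since the PS4 also runs higher settings):

```python
pc_fps_mid = (14 + 20) / 2  # Radeon 7870, low settings, 1080p
print(30 / pc_fps_mid)      # ~1.76x the frame rate on nominally similar hardware
```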

On consoles, developers would be able to push graphics fidelity on a GTX 1080-equivalent GPU to really absurd levels compared to PS4, especially in 30fps and 4K-checkerboard games. The problem is, right now we still don't know if Gonzalo is real PS5 hardware... for now people only assume Gonzalo = PS5.
 

Geki-D

Banned
This has always been the most amusing part of PC gamers shitting on consoles: there's a large chance their PC won't come close to next gen at launch, and they've still never experienced 4K either, which consoles have been doing for some time now.

I won't comment on this GC vs Xbox argument about polygons, but games on the OG Xbox always looked vastly better to my eyes. Even the GTA3/VC bump over my PS2 editions was massive.


Nah, man. Every PC gamer is on the frontier of cutting-edge technology. Literally 2.95% of Steam users have a 1080 (which we now know is a trash-tier, console-peasant card) and 0.75% have a 2080. If you were a true PC übermensch you'd know the joy of having the latest tech automatically teleported straight into your PC upon release for the simple price of "just cheaper than consoles".
 

Naglafar

Member
GameCube was the best technically. While Xbox had faster clocks and more RAM, it was bottlenecked by a slow front side bus and low memory bandwidth. Had Xbox been given eDRAM and a faster bus, it would have been no contest.

I do love 20-year-old console wars. Your statement above simply isn't true. The GC was better than the PS2 for sure and was very impressive at the time, but the Xbox's GPU supported all kinds of advanced features that just didn't exist on other consoles, or even on most PC video cards at the time. As for the FSB itself, the Cube ran at 1.3 GB/s of bandwidth and the Xbox had 1.06 GB/s, BUT the Xbox still had more RAM, a 250 MHz faster CPU, and that advanced GPU, which still put it ahead of the pack that gen.
 
RE4 was ported to PS2 with very good results. That game would easily run on Xbox. In fact, if RE4 had been made with Xbox in mind, I have no doubt the game would have looked even better than the GC version. Levels would be bigger and textures more detailed thanks to more RAM and the HDD (like in the Splinter Cell games). Shaders could also be used to improve the water surface, lighting effects, and textures, not to mention dynamic shadows could be used everywhere thanks to shadow buffers.

Xbox games were shader-focused because shaders can take a flat, unrealistic-looking scene and turn it into something realistic-looking. Also keep in mind that with bump mapping, developers don't need to rely on polygons as much to create detailed scenes (that's why bump mapping was invented). Games like Splinter Cell 3 or Doom 3 both looked like next-gen games to me; IMO GC games like RE4 or Metroid Prime 1-2 looked flat in comparison. I played Metroid Prime 2 recently on the Dolphin emulator and there's nothing impressive there, whereas I can run games like Splinter Cell 3 and even today I'm impressed.
The PS2 version of RE4 looks like ass: greatly reduced polys, stripped lighting, reduced foliage, etc.

The GC could do anything the Xbox could through its TEV units, sans normal maps, which are not a suitable replacement for geometry.

But, you do you.
 
I do love 20-year-old console wars. Your statement above simply isn't true. The GC was better than the PS2 for sure and was very impressive at the time, but the Xbox's GPU supported all kinds of advanced features that just didn't exist on other consoles, or even on most PC video cards at the time. As for the FSB itself, the Cube ran at 1.3 GB/s of bandwidth and the Xbox had 1.06 GB/s, BUT the Xbox still had more RAM, a 250 MHz faster CPU, and that advanced GPU, which still put it ahead of the pack that gen.
All your numbers are wrong. The Xbox GPU was 233MHz.

A front side bus isn't measured in GB/s. It's 133MHz for the Xbox and 162MHz for the Cube. In other words, the Cube's GPU could transfer data as fast as it needed, while the Xbox would have needed a bus about 100MHz faster to avoid bottlenecking.
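For what it's worth, those MHz figures and the GB/s figures quoted upthread may describe the same thing. A minimal sketch, assuming a 64-bit bus transferring once per clock:

```python
def fsb_bandwidth_gb_s(clock_mhz, bus_width_bits=64):
    # bytes/s = clock * (width in bytes); divide by 1e9 for GB/s.
    return clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Xbox:     {fsb_bandwidth_gb_s(133):.2f} GB/s")  # ~1.06 GB/s
print(f"GameCube: {fsb_bandwidth_gb_s(162):.2f} GB/s")  # ~1.30 GB/s
```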
 

Journey

Banned
GameCube was the best technically. While Xbox had faster clocks and more RAM, it was bottlenecked by a slow front side bus and low memory bandwidth. Had Xbox been given eDRAM and a faster bus, it would have been no contest.


That is factually incorrect. Xbox had more RAM, faster-clocked RAM, and more memory bandwidth. Just like the Xbox One, the GameCube used embedded DRAM to make up for its slower bus, but, just as ESRAM doesn't make the Xbox One better than the PS4, it's merely a band-aid that would never make up for the difference, especially when you only have 24MB of main RAM vs 64MB, almost 3 times as much.

The Xbox GPU, codenamed NV2A, was based on the GeForce 3 architecture, the first programmable-shader GPU and the model of programmability we still use today. The GameCube GPU, on the other hand, was based on a design by ArtX codenamed "Flipper", which had little input from ATI by the time that company took over. One thing is certain: it did NOT support programmable shaders and was a far step behind a modern architecture, not to mention it was clocked lower, only 162 MHz vs 233 MHz; a near-50% increase on an efficient architecture meant much higher FP and real-world performance. Xbox also had a standard HDD, something we see as the norm today. Halo instantly loading sections of a map was glorious, and games like Splinter Cell Chaos Theory were practically a generational leap on Xbox over the GameCube, which was also gimped by its low-capacity disc format compared to DVD: 1.5GB vs 8.5GB.
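Checking the clock claim with plain arithmetic (clocks only; the architectural differences argued about in this thread matter far more than raw MHz):

```python
print(f"{(233 - 162) / 162:.1%}")  # ~43.8% higher clock for NV2A vs Flipper
```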

 

Naglafar

Member
All your numbers are wrong. The Xbox GPU was 233MHz.

A front side bus isn't measured in GB/s. It's 133MHz for the Xbox and 162MHz for the Cube. In other words, the Cube's GPU could transfer data as fast as it needed, while the Xbox would have needed a bus about 100MHz faster to avoid bottlenecking.

Total bandwidth is a better measurement than raw MHz, especially since the designs of the processors are so different. And nowhere did I mention the speed of the GPU, just its features.
 

Journey

Banned
It's crazy to compare. The difference was not only in the GPU's shading and lighting capabilities; it was also levels ahead in geometry.

Take this complex scene from Splinter Cell. Xbox is able to approximate the PC version best, while PS2 and GameCube had to remove a major portion of the geometry in this complex room with so many shapes, and it's even more demanding because each individual blade casts a shadow.

[image: splintercell_040703_in2.jpg]
 

Romulus

Member
That is factually incorrect. Xbox had more RAM, faster-clocked RAM, and more memory bandwidth. Just like the Xbox One, the GameCube used embedded DRAM to make up for its slower bus, but, just as ESRAM doesn't make the Xbox One better than the PS4, it's merely a band-aid that would never make up for the difference, especially when you only have 24MB of main RAM vs 64MB, almost 3 times as much.

The Xbox GPU, codenamed NV2A, was based on the GeForce 3 architecture, the first programmable-shader GPU and the model of programmability we still use today. The GameCube GPU, on the other hand, was based on a design by ArtX codenamed "Flipper", which had little input from ATI by the time that company took over. One thing is certain: it did NOT support programmable shaders and was a far step behind a modern architecture, not to mention it was clocked lower, only 162 MHz vs 233 MHz; a near-50% increase on an efficient architecture meant much higher FP and real-world performance. Xbox also had a standard HDD, something we see as the norm today. Halo instantly loading sections of a map was glorious, and games like Splinter Cell Chaos Theory were practically a generational leap on Xbox over the GameCube, which was also gimped by its low-capacity disc format compared to DVD: 1.5GB vs 8.5GB.




Hulk Ultimate Destruction was one of the best examples, and technically one of the most impressive games of that generation: destruction, an open world, tons of enemies on screen. It ran at 720p on the original Xbox (compared to 480p on the other consoles) along with better framerates than the GC version. A massive difference, and 720p was crazy back then for an open-world game with destruction, really for any game.
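The pixel math behind that 720p claim, assuming standard 1280x720 and 640x480 frame sizes (I haven't verified the game's exact output resolutions):

```python
hd, sd = 1280 * 720, 640 * 480
print(hd, sd)   # 921,600 vs 307,200 pixels
print(hd / sd)  # 3.0x the pixels per frame
```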

And then you had games like Doom 3, Half-Life 2, and Republic Commando that weren't even attempted on any console outside of Xbox.
 
My guess is they will aim for the lowest-priced GPU that can do 4K resolution justice, just like the PS4 was a pretty solid 1080p system. I doubt they are aiming to make a balls-out, as-good-as-possible system, because that would obviously be too expensive for good adoption.

Keep in mind more graphical horsepower means more expensive games to develop as well. They have to draw the line somewhere.
 
I have to say, if both the Xbox and the PS5 are like this, I'm also kinda disappointed. Isn't the Xbox One X GPU close to that already?
 

JohnnyFootball

GerAlt-Right. Ciriously.
That is factually incorrect. Xbox had more RAM, faster-clocked RAM, and more memory bandwidth. Just like the Xbox One, the GameCube used embedded DRAM to make up for its slower bus, but, just as ESRAM doesn't make the Xbox One better than the PS4, it's merely a band-aid that would never make up for the difference, especially when you only have 24MB of main RAM vs 64MB, almost 3 times as much.

The Xbox GPU, codenamed NV2A, was based on the GeForce 3 architecture, the first programmable-shader GPU and the model of programmability we still use today. The GameCube GPU, on the other hand, was based on a design by ArtX codenamed "Flipper", which had little input from ATI by the time that company took over. One thing is certain: it did NOT support programmable shaders and was a far step behind a modern architecture, not to mention it was clocked lower, only 162 MHz vs 233 MHz; a near-50% increase on an efficient architecture meant much higher FP and real-world performance. Xbox also had a standard HDD, something we see as the norm today. Halo instantly loading sections of a map was glorious, and games like Splinter Cell Chaos Theory were practically a generational leap on Xbox over the GameCube, which was also gimped by its low-capacity disc format compared to DVD: 1.5GB vs 8.5GB.


It's strange how much people underestimate the power of the original Xbox. Having a GeForce 3 in the Xbox was the equivalent of having a 2080 (maybe Ti) in consoles right now.

I had to laugh at previous posts suggesting that Metroid Prime looked better than Chronicles of Riddick.
 

Romulus

Member
I have to say, if both the Xbox and the PS5 are like this, I'm also kinda disappointed. Isn't the Xbox One X GPU close to that already?

Depends on your definition of close, and we don't have the final numbers yet. The X1X is a powerful console in terms of GPU power, but it really falls apart on the CPU side.
 

Dontero

Banned
I have to say, if both the Xbox and the PS5 are like this, I'm also kinda disappointed. Isn't the Xbox One X GPU close to that already?

Don't look at the Pro versions of consoles as any indicator of next gen.

1. All games had to be made with the base version in mind. The Pro versions were never a significant share of total users, which means that aside from a few insignificant graphical features, the Pro versions could in effect only bump up resolution and some settings.
2. The Pro versions improved the GPU but didn't improve the CPU. The CPU was already the Achilles' heel of the PS4 (much less so for the Xbox One), and with beefier GPUs it's 100% certain those GPUs were heavily bottlenecked by the CPUs. That's why you hardly saw improvements in framerate, and most of the improvement was in resolution, which falls almost entirely on the GPU.

IF the new consoles have 10TF GPUs, you will see a proper jump in graphics quality.
 
That is factually incorrect. Xbox had more RAM, faster-clocked RAM, and more memory bandwidth. Just like the Xbox One, the GameCube used embedded DRAM to make up for its slower bus, but, just as ESRAM doesn't make the Xbox One better than the PS4, it's merely a band-aid that would never make up for the difference, especially when you only have 24MB of main RAM vs 64MB, almost 3 times as much.

The Xbox GPU, codenamed NV2A, was based on the GeForce 3 architecture, the first programmable-shader GPU and the model of programmability we still use today. The GameCube GPU, on the other hand, was based on a design by ArtX codenamed "Flipper", which had little input from ATI by the time that company took over. One thing is certain: it did NOT support programmable shaders and was a far step behind a modern architecture, not to mention it was clocked lower, only 162 MHz vs 233 MHz; a near-50% increase on an efficient architecture meant much higher FP and real-world performance. Xbox also had a standard HDD, something we see as the norm today. Halo instantly loading sections of a map was glorious, and games like Splinter Cell Chaos Theory were practically a generational leap on Xbox over the GameCube, which was also gimped by its low-capacity disc format compared to DVD: 1.5GB vs 8.5GB.


You guys really have no clue. The GameCube had 24MB of main memory, 16MB of additional memory, and then 3MB of eDRAM. The combined bandwidth of the eDRAM alone was over 17GB/s, compared to the Xbox's 6.4. Your PS4-vs-Xbox One comparison is flawed: firstly because ESRAM is still slower than the PS4's main memory, whereas the GameCube's embedded memory was much faster than the Xbox's memory; secondly because ESRAM is less versatile than eDRAM, and in the era of deferred rendering it means less than eDRAM used to. Framebuffers are too large now, but back then it was a different story.
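Using the figures most often cited for Flipper's embedded memory (treat these as assumptions; published numbers vary by source), the "over 17GB/s" claim checks out:

```python
framebuffer_gb_s = 7.6     # 2MB embedded framebuffer (commonly cited)
texture_cache_gb_s = 10.4  # 1MB embedded texture cache (commonly cited)
print(framebuffer_gb_s + texture_cache_gb_s)  # 18.0 GB/s combined
print(18.0 / 6.4)                             # ~2.8x the Xbox's 6.4 GB/s
```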

People are starting from a conclusion and arguing the details second. No point in continuing; it's kinda like politics.
 
It's crazy to compare. The difference was not only in the GPU's shading and lighting capabilities; it was also levels ahead in geometry.

Take this complex scene from Splinter Cell. Xbox is able to approximate the PC version best, while PS2 and GameCube had to remove a major portion of the geometry in this complex room with so many shapes, and it's even more demanding because each individual blade casts a shadow.

[image: splintercell_040703_in2.jpg]
Would you use Bayonetta on PS3 as evidence of the 360's vast superiority? You guys suck at this, lol.
 
For that generation, anyone would agree the OG Xbox was the beast. If those developers had put the same effort into that console, it would have been miles and miles ahead of anything the PS2 could muster. Of course it came out a year later, but so did the PS3 relative to the 360, and their games were pretty much on par.
The Xbox 360 vs PS3 situation is due to the advent of unified shaders, which the PS3 lacked.

In a way, the Xbox has benefited twice from the innovation of brand-new technology that its competitors lacked: during the PS2 era it benefited from Nvidia's pixel and vertex shaders, which the PS2 lacked, and during the 360 era it benefited from ATI's unified shaders, which the PS3 lacked.
When I look at that screenshot, I can't help but praise the effort of the developers, whom people seem to forget; the result should be attributed to them, not to the PS2 hardware, which was the weakest that generation.
The Dreamcast was the weakest.
 

3March

Banned
That is rather subjective. DMC5 needs 60fps, but 60fps does not automatically make a game look better than 30fps. In genres that do not need 60fps, I'll take better-looking graphics over 60fps, particularly when there's proper motion blur.
It sure as hell does for me. Makes games feel WAY better to play. Fuck immersion: I want gameplay.
 

Romulus

Member
Would you use Bayonetta on PS3 as evidence of the 360's vast superiority? You guys suck at this, lol.

I get what you're saying, but the PS3 was known as one of the most complicated consoles to work with of all time, so at that point the discussion has to shift to give the PS3 the benefit of the doubt. IMO the X360 was superior, not vastly, but it's my belief that the only reason certain PS3 games looked good is the same reason PS4 games do: Sony has the best first-party devs by a massive margin, and if all that time and money had been spent on a 360 exclusive, you'd have gotten even better results, IMO. That's another discussion though.
 
I get what you're saying, but the PS3 was known as one of the most complicated consoles to work with of all time, so at that point the discussion has to shift to give the PS3 the benefit of the doubt. IMO the X360 was superior, not vastly, but it's my belief that the only reason certain PS3 games looked good is the same reason PS4 games do: Sony has the best first-party devs by a massive margin, and if all that time and money had been spent on a 360 exclusive, you'd have gotten even better results, IMO. That's another discussion though.
No, I'm just saying you can't use a game tailor-made for one system in a comparison. The 360 actually is the better machine vs the PS3, but Bayo would be a terrible example.

Just like comparing different versions of Splinter Cell.
 
What is the argument? Xbox was faster than any other console out there. Arguing about memory bandwidth as if it mattered to the comparison makes no sense.
You heard it here first, folks: memory bandwidth doesn't matter.

Why aren't we still using GDDR3?
 