> You heard it here first, folks: memory bandwidth doesn't matter.
Tell that to the PlayStation 4 Pro lol
The 1080 was released three years ago, in 2016. Sure, it was $500, but it was still three damn years ago. That would make the console just like last gen: outdated on release day. I'm not expecting it to be as good as a current high-end PC GPU, but giving us hardware that's maxed out on day one doesn't leave too good a taste in my mouth.
I'm just going off some of the leaks here, which might not be true. 1080 performance is nothing to write home about. 1080 Ti performance in 2020? Yeah, that would be four years after these cards came out. That's pretty reasonable, isn't it?
Hell, even the PS4 wasn't worse than a 2009 GPU.
You heard it here first, folks: memory bandwidth doesn't matter.
Why aren't we still using GDDR3, then?
What is the argument? The Xbox was faster than any other console out there. Arguing about memory bandwidth as if it mattered to the comparison makes no sense.
> But it is enough, depending on what the game was doing. Xbox was factually bottlenecked when processing alpha transparencies, for example. Its FSB speed limited poly counts and frame rates.
Memory bandwidth does matter, but if I could choose to play Nintendo games built from the ground up for original Xbox hardware vs. GameCube hardware, I'd choose Metroid Prime etc. on Xbox every time. Any game from that generation I'd choose on Xbox. The memory advantage just isn't enough to offset a massive GPU and CPU disparity. It's just too much of a gulf.
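The alpha-transparency bottleneck mentioned above is really just fill-rate math: every translucent layer forces a read-modify-write on the framebuffer, multiplying bandwidth demand by the overdraw. A rough back-of-envelope sketch (the resolution, overdraw factor, and pixel format here are illustrative assumptions, not measured specs):

```python
# Rough, illustrative estimate of framebuffer bandwidth eaten by alpha
# blending. All input numbers are assumptions for the sketch, not spec values.

def blend_bandwidth_gb_s(width, height, overdraw, bytes_per_pixel, fps):
    """Alpha blending reads the destination pixel and writes it back,
    so each blended layer costs roughly 2x bytes_per_pixel per pixel."""
    bytes_per_frame = width * height * overdraw * bytes_per_pixel * 2
    return bytes_per_frame * fps / 1e9

# 640x480, four full-screen layers of transparency, 32-bit pixels, 60 fps
demand = blend_bandwidth_gb_s(640, 480, 4, 4, 60)
print(f"{demand:.2f} GB/s just for blending")  # → 0.59 GB/s
```

Numbers like that make it clear why an embedded framebuffer with very low latency (as on Flipper) helps heavy transparency, while a console that shares one memory bus between CPU, GPU, and framebuffer has to pay for every blended layer out of the same pool.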
Obviously I'm not saying GameCube wins in every category, but the overall advantage in poly counts and effects thanks to the fast framebuffers and texture cache means the exclusives just look more modern.
Sam Fisher's pointy arms next to Leon S. Kennedy, and the octagon modeling in Doom 3, just don't hold up. Other titles like Gaiden and Orta absolutely look great, but the usual Xbox suspects, not so much.
> On the games that use normal maps on Xbox, there absolutely is a polygon deficit in favor of GameCube.
If RE4 ran on the PS2 to the extent it did, no doubt in my mind a purpose-built Xbox port would best them both. The poly counts were never confirmed by anyone for how many they actually pushed in a given game compared to the next, just speculation about which game supposedly pushed the most. Nintendo said one thing, devs said another. Xbox said one thing, and their devs said another.
> It's strange how much people underestimate the power of the original Xbox. Having a GeForce 3 in the Xbox was the equivalent of having a 2080 (maybe a Ti) in consoles right now.
You are overhyping the OG Xbox. It is clear you never owned a GC.
I had to laugh at previous posts suggesting that Metroid Prime looked better than Chronicles of Riddick.
> You are overhyping the OG Xbox. It is clear you never owned a GC.
As for the CPUs, it turns out they deliver the exact same FLOPS per cycle. GameCube's advantage is from the faster FSB and possibly the extra L2 cache, although the Xbox's L2 cache is more associative: 8-way vs. the Cube's 2-way.
Xbox only had a slight advantage in DirectX and unified GPU tech, with normal-mapped shiny-wall FX; those nice effects were part of the DirectX library, quick and simple to add to a game. GC exclusives were pushing 14-20 million polygons a second with custom effects and nicer textures.
You guys comparing the Xbox's Intel CPU to the GC's IBM CPU are also delusional. Comparing apples to oranges, ffs.
> PS2 version of RE4 looks like ass. Greatly reduced polys, stripped lighting, reduced foliage, etc.
According to you, GameCube could do anything Xbox could, and who knows, maybe it's true, but the question is how efficiently GC hardware could emulate Xbox effects without hardware acceleration for certain effects. GC had no pixel and vertex shaders or shadow buffers!
GC could do anything Xbox can through its TEV units, sans normal maps, which are not a suitable replacement for geometry anyway.
But, you do you.
With regards to RE4, I'm sure the alpha effects would probably be reduced on Xbox. At points there is a LOT of fog on screen. Given that a port to Xbox probably wouldn't include normal maps, and there's only per-vertex lighting, it's possible Xbox could match the Cube's poly counts. One thing Xbox could have is higher-res textures.
It's pure speculation though. Mikami made it sound like a port even to Xbox wouldn't be ideal.
> Nearly every port on Xbox was superior to GameCube's, especially more demanding games. The polygon stuff is all paper-tiger talk from that generation; no hard stats were proven either way. There's no metric at which the GameCube was better at effects or polygon count. Whatever Mikami said, there was a PS2 version, and again, it holds up to the GameCube in many aspects. Xbox would definitely further that with a ground-up port.
The PS2 version is pretty damn ugly. You're free to believe that about polygon counts; I'm just offering my unbiased observation. It's quite obvious in the modeling of games like Doom and Riddick there's a deficit, but I won't die on this hill.
> According to you, GameCube could do anything Xbox could, and who knows, maybe it's true, but the question is how efficiently GC hardware could emulate Xbox effects without hardware acceleration for certain effects. GC had no pixel and vertex shaders or shadow buffers!
It doesn't work like this! Both GameCube and Xbox had 4 pixel pipelines. Xbox could not do effects cheaper than GameCube just because of its shader feature set. The programmable shaders are simply a toolset; they don't magically make Xbox output more than the Cube, or vice versa. Funny you mention Star Fox; another early game that pretty much tells you where the GameCube stands technically.
These days ray tracing is a big thing, and a GPU like the 1080 Ti can run ray tracing in software, but can it run RT in software as fast as a 2080 Ti with hardware RT support? Hell no! Just run Quake 2 RTX and you will see the same fps on a 2080 Ti at 1440p as on a 1080 Ti at 480p! So theoretically both GPUs support ray tracing, but only the 2080 Ti with HW RT will run it with decent results.
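The Quake 2 RTX comparison above is really a claim about pixel throughput: if the 2080 Ti holds the same frame rate at 1440p that the 1080 Ti manages at 480p, the implied speedup is just the ratio of pixel counts. A quick sanity check of that arithmetic (taking 1440p as 2560x1440 and 480p as 640x480):

```python
# Sanity-check the claimed gap: same fps at 1440p (hardware RT)
# vs 480p (software RT) implies a speedup equal to the pixel ratio.
px_1440p = 2560 * 1440   # 3,686,400 pixels per frame
px_480p  = 640 * 480     #   307,200 pixels per frame

speedup = px_1440p / px_480p
print(f"Implied hardware-RT speedup: {speedup:.0f}x")  # → 12x
```

So the claim, if the frame rates really match, amounts to a roughly 12x advantage for dedicated RT hardware on this workload.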
GameCube developers very rarely used effects comparable to Xbox in their games. Besides Star Wars Rogue Leader and maybe Star Fox, even simple bump mapping was almost nonexistent in GC games, not to mention other effects. I believe the problem was that GC developers had to do everything in software, with a much higher performance penalty, and besides that they were also RAM-limited (Xbox had 50% more RAM and an HDD on top of that). On Xbox, pretty much all exclusive games (and later on, multiplatform games as well) regularly used very impressive texture and lighting effects (thanks to shaders) and high-quality dynamic shadows (thanks to shadow buffers). GC was good at pushing polygons, but without Xbox-like effects these GC games ended up flat. There's no way games like Splinter Cell 3, Doom 3, Half-Life 2, PGR2, Riddick, Far Cry or even Halo 2 could be ported to GC with decent results.
> I think the GC version is ugly too. Sure, the PS2 version is worse, but most of it is clarity and low-res textures, which the Xbox would clean up to an even further degree than the GC. All those games were deficient in polygons back then. I'm not wrapping myself in some agenda-filled belief here; there's literally no proof of those polygon metrics. You're claiming these Xbox shooters are polygon-deficient, but what about something built from the ground up like Metroid? That's no polygon stunner by any means. Actually, there's not a single GC title I can think of where it's like, wow, that's pushing some polygons. I'm being objective. Maybe Rogue Squadron, but it's so hard to tell because everything looks so simple. I mean, this is a first-party title and I loved it, but in no way does it stand next to top-tier Xbox shooters visually, even in terms of poly count imo. That's just some bizarre way of muddying the waters, because it can't be measured, it's only an opinion in which I can't prove you wrong or likewise. Not saying it's intentional, but you see my point.
You sound triggered lol.
> The PS2 version is pretty damn ugly. You're free to believe that about polygon counts; I'm just offering my unbiased observation. It's quite obvious in the modeling of games like Doom and Riddick there's a deficit, but I won't die on this hill.
You want to tell people here GC could render shadows as fast without shadow buffers? The TEVs aren't quite as flexible as you think, and in terms of features Flipper was more comparable to the GeForce 2 and Radeon 7500 (DX7 cards).
Literally anything besides normal maps and stencil shadows can be done on the Cube, at similar cost.
> As for geometry performance, the XGPU is absolutely loaded for bear. Although its architecture in many ways resembles that of the GeForce 3, the XGPU is actually more closely modeled on a next-generation 3D core from nVidia (GeForce 4). One of the very noteworthy features is the addition of a second vertex pipeline, which rockets its geometry throughput up to 116.5 million triangles/sec.
> Other estimates put Flipper's geometry rate somewhat higher at 20-30 million polygons/second, which is still well below Xbox's 116.5 million polygons/second.
20-30 million triangles/sec on GameCube vs. 116.5 million triangles/sec on Xbox.
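Peak figures like these are per-second rates; dividing by a frame rate turns them into per-frame budgets, which makes the gap easier to picture. A quick conversion, assuming the quoted theoretical peaks (which no real game of that era sustained):

```python
# Convert the quoted peak triangle rates into per-frame budgets at 60 fps.
# These are theoretical/marketing peaks, not what shipped games achieved.
FPS = 60
xbox_peak = 116.5e6           # triangles/sec, quoted above
cube_peak = (20e6, 30e6)      # triangles/sec, the range quoted above

xbox_frame = xbox_peak / FPS / 1e6
cube_lo, cube_hi = (p / FPS / 1e6 for p in cube_peak)
print(f"Xbox:     {xbox_frame:.2f}M triangles/frame")
print(f"GameCube: {cube_lo:.2f}-{cube_hi:.2f}M triangles/frame")
```

On paper that's roughly 1.94M triangles per 60 fps frame for Xbox versus 0.33-0.50M for GameCube, though peak rasterizer numbers say nothing about lighting, texturing, or effects applied per triangle.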
> Literally anything besides normal maps and stencil shadows can be done on the Cube, at similar cost.
And yet shadows and texture effects in the Splinter Cell games were extremely downgraded. These games looked really ugly on GC. If the GC were as powerful as you're saying, developers would have used that power in their games and matched what Xbox was doing, but it wasn't like that.
> It's not slightly better than a 1080, it's closer to a 1080 Ti, which is an 11+ TF card and also has RT support. PS5 will have RT, surely won't be 8.8 TF like the 1080, and the 1080 doesn't support RT.
Yep, rumors say around 11 TF, and RT support is confirmed. So it seems it's going to be more 1080 Ti than 1080. And looking at what the PS4 did with games like Uncharted 4 or Horizon, PS5 exclusives are going to look stunning, way above current PC games.
It finally devolved into screenshot wars and metrics claiming Xbox could push 4x the polys of the Cube xD
> You want to tell people here GC could render shadows as fast without shadow buffers? The TEVs aren't quite as flexible as you think, and in terms of features Flipper was more comparable to the GeForce 2 and Radeon 7500 (DX7 cards).
A thread back from 2002.
You want to talk about polygons? The Xbox GPU was a polygon beast; it had 2 texture units per pixel pipe, unlike the GC, among other important features.
Here's a very detailed Xbox vs. GC hardware analysis: "GameCube vs. Xbox: Part Deux" (Page 13 of 18, ExtremeTech, www.extremetech.com).
[screenshots: RE4 on GameCube]
[screenshots: Xbox games]
> The PS2 version is 512x480 or lower, the graphics quality is worse than PC (lowest texture quality or worse), and the framerate is locked at 30 fps.
You can't run NFS Underground on a PC with 32 MB even if you turn the world upside down.
God damn, you guys want Sony to repeat the PS3, don't you? A console over $499 is suicide. I'd say $399 is the sweet spot. And for $399 you're not going to get the latest and greatest PC hardware. It just isn't happening.
Not to mention it can be financial suicide.
> There are so many levels of dumb in the OP's post. By now you'd think people would get the clue about consoles vs. PC. A game coded to the metal that can take full advantage of a GPU is going to give several generations better performance than the PC counterpart.
Small problem... console games aren't bare metal anymore. Both the Xbox One and the PS4 use APIs. The Xbox One uses DirectX 11/12 and the PS4 uses some fucky Sony proprietary thing.
> Small problem... console games aren't bare metal anymore. Both the Xbox One and the PS4 use APIs. The Xbox One uses DirectX 11/12 and the PS4 uses some fucky Sony proprietary thing.
APIs let you code to the metal. An API is just a communication layer, yet you can still communicate that you want your code to be at metal level, or at least you should be able to. I assume console APIs can do that.
> APIs let you code to the metal. An API is just a communication layer, yet you can still communicate that you want your code to be at metal level, or at least you should be able to. I assume console APIs can do that.
At that point you're basically bypassing the API and defeating the point of it existing... now... can you? Sure. Technically you can do the same thing on PC. It's a non sequitur though, because nobody bothers to do that.
> At that point you're basically bypassing the API and defeating the point of it existing... now... can you? Sure. Technically you can do the same thing on PC. It's a non sequitur though, because nobody bothers to do that.
It depends. There are things it's worth doing at the metal and things that aren't.
> Small problem... console games aren't bare metal anymore. Both the Xbox One and the PS4 use APIs. The Xbox One uses DirectX 11/12 and the PS4 uses some fucky Sony proprietary thing.
It's still going to give you far, far better performance than the GPU equivalent on PC, since that API only has to be optimized to communicate with one set of hardware.
> It's still going to give you far, far better performance than the GPU equivalent on PC, since that API only has to be optimized to communicate with one set of hardware.
Except... that isn't how APIs work. APIs are designed to be hardware agnostic. The sequence goes something like this:
I think we're ready for $499, honestly, in 2020. $399 is just less and less money to work with to break even or make a small profit. I don't see how a 9 TF GPU, Zen 2 CPU, GDDR6, an SSD, and a 4K Blu-ray drive will make much profit at $499, and I lowballed the GPU. $399 would be an absolute nightmare. If it's $399, expect massive cuts we didn't see coming. If the X1X can sell well at $499 with crap exclusives and an ancient tablet CPU, the PS5 will be just fine imo.
> Except... that isn't how APIs work. APIs are designed to be hardware agnostic. The sequence goes something like this:
That's like saying a player who barely understands StarCraft and a professional StarCraft player can play at the same level just because the gameplay is skill-agnostic.
Game -> API -> GPU Driver -> GPU
Now it might not be perfect, I'm no expert, but the point is that the API provides a standardized layer the game can use to communicate with the GPU, and the driver is the hardware- and vendor-specific part of the sequence that sits between the API and the GPU. This is how the Xbox One X is able to just run Xbox One games without any per-game patches despite having a different GPU. This is also how PC games from 15 years ago generally work without issue when you try to play them on an RTX 2080 Ti (at least graphically... sometimes they don't like running on newer versions of Windows, but that's a totally different issue).
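The layering described above (Game -> API -> GPU Driver -> GPU) can be sketched as an interface the game codes against, with vendor-specific drivers plugged in behind it. A toy illustration only: all class and method names here are invented for the example, and the CU counts are just the commonly cited shader-core difference between the two consoles:

```python
# Toy model of the Game -> API -> Driver -> GPU layering described above.
# Names are invented for illustration; this mirrors the structure, not any
# real graphics API.
from abc import ABC, abstractmethod

class GpuDriver(ABC):
    """Hardware/vendor-specific layer: knows how to talk to one GPU."""
    @abstractmethod
    def submit(self, commands): ...

class XboxOneDriver(GpuDriver):
    def submit(self, commands):
        return f"GCN(12 CU) executed {len(commands)} commands"

class XboxOneXDriver(GpuDriver):
    def submit(self, commands):
        return f"GCN(40 CU) executed {len(commands)} commands"

class GraphicsAPI:
    """Hardware-agnostic layer the game talks to."""
    def __init__(self, driver: GpuDriver):
        self.driver = driver
    def draw(self, meshes):
        commands = [("draw", m) for m in meshes]   # API translates calls...
        return self.driver.submit(commands)        # ...driver talks to the GPU

# The same "game code" runs unmodified on both consoles:
def game_frame(api):
    return api.draw(["player", "level", "skybox"])

print(game_frame(GraphicsAPI(XboxOneDriver())))   # GCN(12 CU) executed 3 commands
print(game_frame(GraphicsAPI(XboxOneXDriver())))  # GCN(40 CU) executed 3 commands
```

The point of the sketch: `game_frame` never mentions the hardware, so swapping the driver underneath it is exactly how unpatched base games keep working on revised hardware.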
> That's like saying a player who barely understands StarCraft and a professional StarCraft player can play at the same level just because the gameplay is skill-agnostic.
That's how APIs work. That's the entire point of their existence: to ensure compatibility at the cost of maximum performance. Console games can't go bare metal anymore. Not unless devs are going to patch games in perpetuity. A game that runs on bare metal on the Xbox One won't work on the Xbox One X, for example. Next gen wouldn't be compatible with any games running on bare metal either.
Knowing your target hardware is crucial when you have to make choices in how and where to use your resources, or how to allocate them. And on PC you don't have target hardware, just brute force.
They aren't as optimized anymore because they use the same parts as PCs. Back in the day you had the PS2 CPU running at 300 MHz, but it was a better gaming machine than a 1 GHz Pentium 4 box because it wasn't the same part; it was purpose-built for games.
Nowadays that's not the case.
> That's how APIs work. That's the entire point of their existence: to ensure compatibility at the cost of maximum performance. Console games can't go bare metal anymore. Not unless devs are going to patch games in perpetuity. A game that runs on bare metal on the Xbox One won't work on the Xbox One X, for example. Next gen wouldn't be compatible with any games running on bare metal either.
Why are you talking as if a game couldn't have multiple source codes for different machines?
> Why are you talking as if a game couldn't have multiple source codes for different machines?
You can have multiple different builds for the different machines... but if they were doing that, unpatched base Xbox One games would not work on the Xbox One X...
APIs don't make the same code run on different machines by magic. That's the work of the software suite used to develop the game.
Like you said, the API is a layer between code and the hardware drivers, but it is not a layer between code and all existing hardware.